PkSLMNM: Pakistan Sign Language Manual and Non-Manual Gestures Dataset

Published: 26 May 2022 | Version 1 | DOI: 10.17632/m3m9924p3v.1
Contributor:
Sameena Javaid

Description

Sign language is a non-verbal form of communication used by people with hearing and speech impairments. Signers also use facial actions to provide sign language prosody, similar to intonation in spoken languages. Sign Language Recognition (SLR) typically relies on hand signs alone; however, facial expressions and body language also play an important role in communication and have not been analyzed to their fullest potential. This dataset comprises manual (hand signs) and non-manual (facial expressions and body movements) gestures of Pakistan Sign Language (PSL). It contains videos of 7 basic affective expressions performed by 100 healthy individuals, provided in the easily accessible .MP4 format, and can be used to train and test robust models for real-time video applications. The data can also support facial feature detection, classification of subjects by gender and age, and analysis of an individual's interest and emotional state.
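Since the dataset labels each clip by expression and subject, a typical first step is mapping file names to those labels before loading the videos. The naming scheme below is purely hypothetical (the actual convention is not described on this page); a minimal sketch, assuming names like `subject_042_happy.mp4`:

```python
import re
from typing import NamedTuple

class Clip(NamedTuple):
    subject: int      # numeric subject ID (1..100 in this dataset)
    expression: str   # one of the 7 affective expression labels

# Hypothetical filename pattern -- adjust to the dataset's real convention.
PATTERN = re.compile(r"subject_(\d+)_([a-z]+)\.mp4$")

def parse_clip_name(name: str) -> Clip:
    """Extract subject ID and expression label from a clip filename."""
    m = PATTERN.search(name)
    if m is None:
        raise ValueError(f"unrecognized clip name: {name!r}")
    return Clip(subject=int(m.group(1)), expression=m.group(2))

clip = parse_clip_name("subject_042_happy.mp4")
```

Once labels are parsed this way, the .MP4 files themselves can be decoded frame by frame with any standard video library (e.g. OpenCV's `cv2.VideoCapture`) to build training batches.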

Institutions

Bahria University - Karachi Campus

Categories

Computer Vision, Machine Learning, Video Processing, Sign Language, Deep Learning
