PkSLMNM: Pakistan Sign Language Manual and Non-Manual Gestures Dataset
Sign language is a non-verbal form of communication used by people with hearing and speech impairments. Signers also use facial actions to convey sign language prosody, analogous to intonation in spoken languages. Sign Language Recognition (SLR) typically relies on hand signs; however, facial expressions and body language also play an important role in communication and have not been analyzed to their full potential. In this paper, we present a dataset comprising manual (hand signs) and non-manual (facial expressions and body movements) gestures of Pakistan Sign Language (PSL). It contains videos of 7 basic affective expressions performed by 100 healthy individuals, provided in the easily accessible .MP4 format, and can be used to train and test robust models for real-time video-based applications. The data can also support facial feature detection, classification of subjects by gender and age, and inference of an individual's interest and emotional state.