A video dataset of agricultural words in Indian Sign Language

Published: 19 November 2021 | Version 1 | DOI: 10.17632/6v53kfxf46.1
Adithya Venugopalan


The dataset contains videos of dynamic hand gestures for agricultural words in Indian Sign Language (ISL). It was used in a study on recognizing the ISL words commonly used by deaf farmers. A hybrid deep learning model based on a convolutional long short-term memory (LSTM) network was employed for gesture classification, attaining an average classification accuracy of 76.21% on this dataset of ISL words from the agricultural domain. The work is published in the journal Expert Systems with Applications: https://www.sciencedirect.com/science/article/abs/pii/S0957417421010009.
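The published model is described in the linked paper; as an illustration only, the core idea of a convolutional LSTM is that the gate activations are computed by convolutions over each video frame and the hidden state, so spatial structure is preserved across time steps. The sketch below is a minimal single-channel, single-step NumPy version with hypothetical 3x3 kernels, not the authors' implementation.

```python
import numpy as np

def conv_same(x, k):
    """2-D 'same' convolution of a single-channel map x with kernel k."""
    H, W = x.shape
    kh, kw = k.shape
    xp = np.pad(x, ((kh // 2, kh // 2), (kw // 2, kw // 2)))
    out = np.zeros_like(x)
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * k)
    return out

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def convlstm_step(x, h, c, W):
    """One ConvLSTM time step on a single-channel H x W frame.

    Unlike a fully connected LSTM, every gate is a convolution over the
    input frame x and the previous hidden state h, so h and c keep the
    spatial layout of the frame. W is a dict of kernels (hypothetical
    names): W['xi'], W['hi'], ... for the input (i), forget (f),
    output (o) and candidate (g) gates.
    """
    i = sigmoid(conv_same(x, W['xi']) + conv_same(h, W['hi']))  # input gate
    f = sigmoid(conv_same(x, W['xf']) + conv_same(h, W['hf']))  # forget gate
    o = sigmoid(conv_same(x, W['xo']) + conv_same(h, W['ho']))  # output gate
    g = np.tanh(conv_same(x, W['xg']) + conv_same(h, W['hg']))  # candidate
    c_next = f * c + i * g            # convolutional cell-state update
    h_next = o * np.tanh(c_next)      # new hidden state, same H x W shape
    return h_next, c_next

# Toy usage: run the cell over a 5-frame clip of 8x8 frames.
rng = np.random.default_rng(0)
gates = ['xi', 'hi', 'xf', 'hf', 'xo', 'ho', 'xg', 'hg']
W = {name: 0.1 * rng.standard_normal((3, 3)) for name in gates}
h = np.zeros((8, 8))
c = np.zeros((8, 8))
for _ in range(5):
    frame = rng.standard_normal((8, 8))
    h, c = convlstm_step(frame, h, c, W)
```

For classification, the final hidden state would typically be pooled and passed to a softmax layer over the word classes; the paper's full architecture and training setup are given in the linked publication.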



Central University of Kerala


Computer Vision, Sign Language, Video, Gesture Recognition