American Sign Language (ASL) Fingerspelling dataset for Myo Sensor

Published: 16 October 2018 | Version 1 | DOI: 10.17632/dbymbhhpk9.1
Contributor:
Prajwal Paudyal

Description

This is the dataset used in the publications below. Please cite these publications if you use this dataset; the associated work was published at ACM IUI 2016 and 2017.

@inproceedings{paudyal2016sceptre,
  title={Sceptre: a pervasive, non-invasive, and programmable gesture recognition technology},
  author={Paudyal, Prajwal and Banerjee, Ayan and Gupta, Sandeep KS},
  booktitle={Proceedings of the 21st International Conference on Intelligent User Interfaces},
  pages={282--293},
  year={2016},
  organization={ACM}
}

@inproceedings{paudyal2017dyfav,
  title={Dyfav: Dynamic feature selection and voting for real-time recognition of fingerspelled alphabet using wearables},
  author={Paudyal, Prajwal and Lee, Junghyo and Banerjee, Ayan and Gupta, Sandeep KS},
  booktitle={Proceedings of the 22nd International Conference on Intelligent User Interfaces},
  pages={457--467},
  year={2017},
  organization={ACM}
}

Nine users wore the Myo armband, and data was collected for 5 seconds for each letter of the alphabet. The first 8 columns contain data for the 8 EMG pods, the next 3 are for the accelerometer, the next 3 are for the gyroscope, and the final 3 are for orientation (roll, pitch, and yaw).
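A minimal sketch of loading one recording is shown below, assuming the data is stored as a comma-separated file with the 17-column layout described above; the file name used here is hypothetical and not part of the dataset description.

import numpy as np

# Load one recording (hypothetical file name), assuming CSV with 17 columns
# ordered as described: 8 EMG pods, 3 accelerometer, 3 gyroscope, 3 orientation.
data = np.loadtxt("user1_letter_A.csv", delimiter=",")

emg = data[:, 0:8]            # EMG pods 1-8
accel = data[:, 8:11]         # accelerometer x, y, z
gyro = data[:, 11:14]         # gyroscope x, y, z
orientation = data[:, 14:17]  # roll, pitch, yaw

print(emg.shape, accel.shape, gyro.shape, orientation.shape)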

Steps to reproduce

Wear the Myo armband on the dominant hand and collect data for 5 seconds for each letter of the alphabet.
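A rough sketch of this collection protocol is given below. The read_sample helper is purely a placeholder for whatever Myo SDK or streaming callback is used to obtain one 17-value reading; it is not part of the original procedure.

import string
import time

SAMPLE_WINDOW_S = 5.0  # the description states 5 seconds of data per letter

def read_sample():
    """Placeholder for one Myo reading (8 EMG + 3 accel + 3 gyro + roll/pitch/yaw).
    Replace with the actual Myo streaming call; this function is illustrative only."""
    raise NotImplementedError

def collect_letter():
    """Record raw samples for one fingerspelled letter over a 5-second window."""
    samples = []
    start = time.time()
    while time.time() - start < SAMPLE_WINDOW_S:
        samples.append(read_sample())
    return samples

recordings = {}
for letter in string.ascii_uppercase:
    input(f"Sign the letter '{letter}' and press Enter to start recording...")
    recordings[letter] = collect_letter()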

Institutions

Arizona State University

Categories

Activity Recognition, American Sign Language

Licence