OpenSign - Kinect V2 Hand Gesture Data - American Sign Language

Published: 30 May 2019 | Version 1 | DOI: 10.17632/b9pf4k4vjj.1

Description of this data

The attached file is a sample video showing 10 volunteers performing 10 static gestures from American Sign Language. The full dataset contains RGB images in PNG format and registered depth images in raw binary (.bin) format. The letters/numbers taken from American Sign Language are A, F, D, L, 7, 5, 2, W, Y, and None.
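The registered depth frames are stored as raw binary files, so they need to be decoded manually. A minimal loading sketch is shown below, assuming each .bin file holds a single Kinect V2 depth frame as little-endian 16-bit unsigned integers at the sensor's native 512×424 resolution; the exact per-file layout (resolution, byte order, single vs. multiple frames) is an assumption, not documented on this page, and should be checked against the dataset's accompanying publication.

```python
import numpy as np

# Kinect V2 native depth resolution (assumed layout for the .bin files).
DEPTH_WIDTH, DEPTH_HEIGHT = 512, 424


def load_depth_bin(path):
    """Load one registered depth frame from a raw .bin file.

    Assumes little-endian uint16 depth values (millimeters),
    stored row-major at 512x424 pixels.
    """
    depth = np.fromfile(path, dtype="<u2")
    expected = DEPTH_WIDTH * DEPTH_HEIGHT
    if depth.size != expected:
        raise ValueError(
            f"unexpected pixel count {depth.size}, expected {expected}"
        )
    return depth.reshape(DEPTH_HEIGHT, DEPTH_WIDTH)
```

The matching RGB frames are ordinary PNGs and can be read with any standard image library (e.g. Pillow or OpenCV).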

Experiment data files

This data is associated with the following publication:

A real-time human-robot interaction framework with robust background invariant hand gesture detection

Published in: Robotics and Computer Integrated Manufacturing

Latest version

  • Version 1 (published 2019-05-30, DOI: 10.17632/b9pf4k4vjj.1)

    Cite this dataset

    Mazhar, Osama (2019), “OpenSign - Kinect V2 Hand Gesture Data - American Sign Language”, Mendeley Data, v1




Computer Vision, Gesture, Convolutional Neural Network, Deep Learning


CC BY 4.0

The files associated with this dataset are licensed under a Creative Commons Attribution 4.0 International license.

What does this mean?
You can share, copy and modify this dataset so long as you give appropriate credit, provide a link to the CC BY license, and indicate if changes were made, but you may not do so in a way that suggests the rights holder has endorsed you or your use of the dataset. Note that further permission may be required for any content within the dataset that is identified as belonging to a third party.