A dataset of egocentric and exocentric view hands in interactive scenes

Published: 1 July 2024| Version 1 | DOI: 10.17632/bxr7kx84y6.1
Mohd Shahrizal Sunar


The dataset provides hand data from the egocentric view (first-person view) and the exocentric view (third-person view). It contains 49,352 frame images extracted from original videos captured with an iPhone, with the egocentric and exocentric views recorded synchronously. The data was acquired in real-world settings under natural light, white light, yellow light, and dim light. The recordings feature two, three, or four persons participating in interactive activities such as poker, checkers, and dice. The dataset covers a wide variety of hand gestures, including challenging cases such as motion blur, severe deformation, sharp shadows, and extremely dim light. The dataset principally provides original (raw) data. Researchers can process it into training, validation, and test sets for supervised, semi-supervised, unsupervised, and self-supervised deep learning models in static or real-time interaction scenarios, according to their research requirements.
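Since the dataset ships only raw frames, a split must be derived by the researcher. Below is a minimal sketch of one way to do this; the filename pattern is hypothetical (the actual file layout is not specified here), and the 80/10/10 ratios are an assumption, not part of the dataset.

```python
import random

def split_frames(frame_paths, ratios=(0.8, 0.1, 0.1), seed=42):
    """Shuffle frame paths deterministically and split into
    train/validation/test subsets by the given ratios."""
    assert abs(sum(ratios) - 1.0) < 1e-9, "ratios must sum to 1"
    paths = list(frame_paths)
    random.Random(seed).shuffle(paths)  # fixed seed -> reproducible split
    n = len(paths)
    n_train = int(n * ratios[0])
    n_val = int(n * ratios[1])
    return (paths[:n_train],
            paths[n_train:n_train + n_val],
            paths[n_train + n_val:])

# Hypothetical filenames, one per frame (49,352 frames in the dataset).
frames = [f"ego_{i:05d}.jpg" for i in range(49352)]
train, val, test = split_frames(frames)
print(len(train), len(val), len(test))  # 39481 4935 4936
```

Note that a random per-frame split mixes frames from the same video across subsets; for real-time interaction studies, splitting by recording session instead would avoid leakage between train and test.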



Universiti Teknologi Malaysia


Artificial Intelligence, Human-Computer Interaction, Deep Learning