Exocentric view hand interaction dataset

Published: 12 June 2025 | Version 1 | DOI: 10.17632/9d37m3nfgm.1
Contributors:

Description

The dataset provides exocentric-view (third-person) hand interaction data captured with a Google Pixel 6a smartphone. It contains 7,221 frame images in PNG format, extracted from original videos recorded under real-world conditions. Recordings were made in a variety of lighting environments, including natural light, white light, yellow light, and dim light. The dataset features groups of two, three, and four persons engaged in interactive gameplay of Ludo, Poker, and Snakes and Ladders. It includes diverse hand gestures with challenging cases such as motion blur, severe deformation, sharp shadows, and extremely dim lighting. The data are original and unprocessed, making them suitable for training, validation, and testing of supervised, semi-supervised, unsupervised, and self-supervised deep learning models for both static analysis and real-time interaction scenarios.

Dataset collection:
• Source location: Rampura, Dhaka, Bangladesh.
• Capture method: Google Pixel 6a smartphone camera.
• Anonymization: All data were rigorously anonymized to maintain confidentiality and privacy.
• Consent: All participants provided informed consent for research use.

Dataset composition:
• Total participants: 4 males aged 20 years or older.
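As a rough illustration of how the PNG frames could be consumed in a deep learning pipeline, the sketch below loads the images into a minimal PyTorch dataset and splits them for training, validation, and testing. The folder name (exocentric_frames/), the flat *.png layout, and the 70/15/15 split are assumptions for illustration only; adapt them to the actual structure of the downloaded files.

# Minimal sketch; assumes frames sit in a flat "exocentric_frames/" folder as *.png files.
from pathlib import Path

import torch
from torch.utils.data import Dataset, DataLoader, random_split
from torchvision import transforms
from PIL import Image


class ExocentricFrames(Dataset):
    """Loads PNG frames as tensors; labels are omitted because the raw data are unannotated."""

    def __init__(self, root: str = "exocentric_frames", image_size: int = 224):
        self.paths = sorted(Path(root).glob("*.png"))
        self.transform = transforms.Compose([
            transforms.Resize((image_size, image_size)),
            transforms.ToTensor(),
        ])

    def __len__(self) -> int:
        return len(self.paths)

    def __getitem__(self, idx: int) -> torch.Tensor:
        image = Image.open(self.paths[idx]).convert("RGB")
        return self.transform(image)


if __name__ == "__main__":
    frames = ExocentricFrames()
    # Example 70/15/15 split of the 7,221 frames into train/validation/test subsets.
    n_train = int(0.7 * len(frames))
    n_val = int(0.15 * len(frames))
    n_test = len(frames) - n_train - n_val
    train_set, val_set, test_set = random_split(frames, [n_train, n_val, n_test])
    loader = DataLoader(train_set, batch_size=32, shuffle=True)
    print(len(train_set), len(val_set), len(test_set))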

Files

Institutions

American International University-Bangladesh

Categories

Artificial Intelligence, Human-Computer Interaction, Deep Learning

Licence