InGesture Dataset

Published: 22 July 2025 | Version 5 | DOI: 10.17632/fdxst56tcj.5
Contributors:
Pedro Daniel Gohl

Description

The InGesture dataset provides high-resolution (200 Hz) inertial sensor data for hand gesture recognition, focusing on the challenge of distinguishing fluid intake from seven other kinematically similar gestures. Data were collected from 50 participants (34 male, 16 female, aged 18-67) across 65 recording sessions. An inertial sensor (WT901BLECL5 IMU) was placed on each participant's dominant wrist, and gestures were annotated in real time with a synchronized mobile app to ensure high temporal accuracy. The full dataset contains labeled gesture instances. To facilitate use, the data are provided in two complementary formats:

1. Continuous Recordings: CSV files containing the full recording of each session (~10 minutes), suited to segmentation and sequence modeling tasks. Columns: timestamp, accX/Y/Z (accelerometer), asX/Y/Z (gyroscope), and label (gesture code).

2. Segmented Gestures: individual CSV files, each containing a single pre-segmented gesture instance, ready for use in classification models. Columns: timestamp, x/y/z (accelerometer), gx/gy/gz (gyroscope). The filename indicates the gesture (e.g., fluid_intake_1_800.csv).

Gesture Classes (Labels):
0: Free Condition / Other
1: Fluid Intake
2: Answering Phone
3: Scratching Head
4: Passing Hand over Face
5: Adjusting Glasses
6: Holding Chin
7: Stretching with Hands behind Neck

Additional Resources: to accelerate analysis, the repository includes two Jupyter notebooks with Python code for loading, processing, visualizing, and classifying the data, and a detailed metadata file with participant demographics (age, sex, height, weight) and session details (e.g., container type used), allowing for robust, stratified analysis.
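As a quick start, the lines below are a minimal Python sketch (pandas assumed installed) that loads one file of each format. The subject_1 folder and the continuous-recording file name raw_session.csv are hypothetical placeholders; fluid_intake_1_800.csv is the example name given above.

import pandas as pd

# Paths are assumptions; adjust them to wherever the dataset was extracted.
CONTINUOUS_CSV = "./InGesture/subject_1/raw_session.csv"        # hypothetical file name
SEGMENTED_CSV = "./InGesture/subject_1/fluid_intake_1_800.csv"  # example name from the description

# Gesture codes as listed above.
LABELS = {
    0: "Free Condition / Other",
    1: "Fluid Intake",
    2: "Answering Phone",
    3: "Scratching Head",
    4: "Passing Hand over Face",
    5: "Adjusting Glasses",
    6: "Holding Chin",
    7: "Stretching with Hands behind Neck",
}

# Continuous recording: timestamp, accX/Y/Z, asX/Y/Z, label.
session = pd.read_csv(CONTINUOUS_CSV)
print(session["label"].map(LABELS).value_counts())

# Segmented gesture: timestamp, x/y/z (accelerometer), gx/gy/gz (gyroscope).
gesture = pd.read_csv(SEGMENTED_CSV)
duration_s = len(gesture) / 200.0  # samples divided by the 200 Hz sampling rate
print(f"Gesture length: {len(gesture)} samples ({duration_s:.2f} s)")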

Files

Steps to reproduce

All scripts are in dataset_utils.ipynb (Python notebook).

Set base_dir to the root of your dataset directory (the folder containing your subject subfolders), for example: base_dir = "./InGesture/"

Compute summary statistics (number of gestures per subject, raw file counts, total samples, etc.). In the final cells you can call:
• summarize_dataset(base_dir) – prints a table with counts per subject
• subject = os.path.join(base_dir, "subject_1") – targets Subject01 (change to another folder as needed)
• plot_gesture(subject, 'fluid_intake') – plots every "fluid intake" occurrence for that subject
• plot_raw_data(subject) – shows the full continuous recording with a colored background per label

DEPENDENCIES
Make sure you have pandas, numpy, and matplotlib installed. For example, in your shell run: pip install pandas numpy matplotlib

SUMMARY STATISTICS
The summarize_dataset function walks through each subject folder, counts how many files exist for each gesture name and how many raw recordings there are, and totals up all samples. It returns a DataFrame you can view or export.

PLOTTING FUNCTIONS
• plot_gesture(subject_folder, gesture_name): looks for files named gesture_name__*.csv and plots x/y/z and gx/gy/gz vs. sample index at 200 Hz.
• plot_raw_data(subject_folder): loads each full-length CSV (those without "__"), splits the recording by the label column into contiguous segments, shades the background by label, and plots accX/Y/Z and asX/Y/Z vs. sample index.

EXAMPLE USAGE
– Print the overall summary: summarize_dataset(base_dir)
– Plot each gesture for Subject01:
plot_gesture(subject, 'adjusting_glasses')
plot_gesture(subject, 'answering_phone')
plot_gesture(subject, 'fluid_intake')
…
– Plot the raw continuous recording (with label shading): plot_raw_data(subject)
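For readers who want a standalone script rather than the notebook, the following is a minimal sketch of a plot_gesture-style helper. It is not the notebook's implementation; the subject_1 folder name and the double-underscore file pattern are assumptions taken from the description above.

import glob
import os

import matplotlib.pyplot as plt
import pandas as pd

def plot_gesture_simple(subject_folder, gesture_name, fs=200):
    """Plot accelerometer and gyroscope traces for every segmented instance
    of one gesture. Simplified stand-in for the notebook's plot_gesture;
    column names and the gesture_name__*.csv pattern follow the description."""
    pattern = os.path.join(subject_folder, f"{gesture_name}__*.csv")
    for path in sorted(glob.glob(pattern)):
        df = pd.read_csv(path)
        t = df.index / fs  # sample index -> seconds at 200 Hz
        fig, (ax_acc, ax_gyr) = plt.subplots(2, 1, sharex=True)
        ax_acc.plot(t, df[["x", "y", "z"]])
        ax_acc.set_ylabel("accel")
        ax_gyr.plot(t, df[["gx", "gy", "gz"]])
        ax_gyr.set_ylabel("gyro")
        ax_gyr.set_xlabel("time (s)")
        fig.suptitle(os.path.basename(path))
        plt.show()

subject = os.path.join("./InGesture/", "subject_1")
plot_gesture_simple(subject, "fluid_intake")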

Institutions

Universidade Federal do Amazonas

Categories

Gesture Recognition

Funding

Fundação de Amparo à Pesquisa do Estado do Amazonas

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior

Licence