A multi-sensory dataset for the activities of daily living

Published: 10 September 2020 | Version 2 | DOI: 10.17632/wjpbtgdyzm.2
Contributors:
Alessandro Carfì

Description

The dataset contains multiple instances of 9 actions related to the Activities of Daily Living (ADL), namely: Walk, Sit Down, Stand Up, Open Door, Close Door, Pour Water, Drink Glass, Brush Teeth and Clean Table. Each of the 10 volunteers performed each activity at least 14 times, with the notable exception of the walking activity, which was performed 40 times, in different sequences and alternating the hand used.

For each volunteer, the dataset contains 7 CSV files, i.e., one file for each of the 6 IMU sensors worn by the volunteer on different body parts, as described in Figure 1, namely the left lower arm (lla.csv), left upper arm (lua.csv), right lower arm (rla.csv), right upper arm (rua.csv) and right thigh (rt.csv), plus one annotation file. Each sensor file contains the overall sequence recorded during the experiment. The first column contains the label "qags", indicating the type of recorded data (quaternions, acceleration, angular velocity). The next column is the time stamp in milliseconds elapsed since 00:00:00.000 AM (with a 30 ms sampling time). The next four columns are the quaternion components (with a resolution of 0.0001). The following three columns contain the accelerations along the x, y and z axes (with a 0.1 mG resolution), and the last three columns contain the angular velocities about the x, y and z axes (with a 0.01 dps resolution).

The last CSV file (annotation.csv) contains the data labelling. Its first two columns contain the time, both in the format hh.mm.ss.000 (current day time) and in milliseconds elapsed since 00:00:00.000 AM. All the remaining columns are organised in pairs, where the first element indicates the scope of the labelling and the second indicates whether the labelled activity is starting or ending. The annotation file uses four different labelling scopes.
• “BothArms”: all instances of each activity are labelled, independently of which arm was used;
• “RightArm”: labels the activity instances performed with the right arm only, as well as the instances of Walk, Sit Down and Stand Up;
• “LeftArm”: labels the activity instances performed with the left arm only, as well as the instances of Walk, Sit Down and Stand Up;
• “Locomotion”: labels only the instances of Walk, Sit Down and Stand Up.

Finally, the last two columns report a session ID and whether the session starts or ends. There are four different sessions, characterised by the order in which the activities are performed and by the arm used (see also Table 3). The videos recorded during the experiments were used only for labelling purposes and are not published. Together with the dataset, we provide a MATLAB script, TimeStampExtraction.m, that extracts from the annotation and data files, for each volunteer and for each sensor, the time stamps associated with the start and end of each ADL.
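To illustrate the sensor-file layout described above, here is a minimal Python sketch for parsing one row of a sensor CSV. The sample row is invented for illustration, and it is an assumption that values are stored as raw integer counts at the stated resolutions (0.0001 for quaternions, 0.1 mG for acceleration, 0.01 dps for angular velocity); adjust the scale factors if your files already contain scaled values.

```python
import csv
from io import StringIO

# Invented sample row following the 11-column layout described above:
# label, timestamp (ms), 4 quaternion, 3 acceleration, 3 angular-velocity values.
sample = "qags,36902000,7071,0,7071,0,12,-9810,34,5,-2,1\n"

def parse_row(row):
    """Convert one sensor row into named, scaled fields."""
    label = row[0]
    t_ms = int(row[1])                               # ms since 00:00:00.000
    quat = [int(v) * 0.0001 for v in row[2:6]]       # 0.0001 resolution
    accel_mg = [int(v) * 0.1 for v in row[6:9]]      # 0.1 mG resolution
    gyro_dps = [int(v) * 0.01 for v in row[9:12]]    # 0.01 dps resolution
    return label, t_ms, quat, accel_mg, gyro_dps

for row in csv.reader(StringIO(sample)):
    print(parse_row(row))
```

The same per-row logic applies to each of the sensor files (lla.csv, lua.csv, rla.csv, rua.csv, rt.csv), since they share one format.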
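For readers not using MATLAB, the core of what TimeStampExtraction.m does, pairing each "start" annotation with its matching "end", can be sketched in Python. The annotation rows, label strings and column positions below are invented for illustration based on the description above, not taken from the actual file.

```python
import csv
from io import StringIO

# Invented annotation rows: day time, elapsed ms, scope label, start/end flag.
annotation = """\
10.15.02.000,36902000,Walk,start
10.15.09.500,36909500,Walk,end
10.15.12.000,36912000,Sit Down,start
10.15.14.250,36914250,Sit Down,end
"""

def extract_intervals(text, label_col=2, flag_col=3):
    """Pair each 'start' row with the next 'end' row carrying the same label."""
    open_starts, intervals = {}, []
    for row in csv.reader(StringIO(text)):
        t_ms, label, flag = int(row[1]), row[label_col], row[flag_col]
        if flag == "start":
            open_starts[label] = t_ms
        elif flag == "end" and label in open_starts:
            intervals.append((label, open_starts.pop(label), t_ms))
    return intervals

print(extract_intervals(annotation))
```

The resulting (label, start_ms, end_ms) tuples can then be matched against the millisecond time stamps in the sensor files to segment each ADL instance.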

Files

Steps to reproduce

For further details about the dataset collection procedure, please refer to and cite the article "A multi-sensory dataset for the activities of daily living", published in Data in Brief (https://www.sciencedirect.com/science/article/pii/S2352340920310167).

Categories

Activity Recognition, Human Movement Studies, Activity of Daily Living Scale

Licence