Fidgety Philip and the Suggested Clinical Immobilization Test: Annotation Dataset & HMovements Automated Movement Detection Algorithm
Description
ANNOTATION DATASET

To facilitate ‘structured behavioral observations’ in clinical practice, we are interested in analyzing movement patterns, mainly voluntary and involuntary motions, during sitting. We developed a clinical protocol for analyzing characteristics of voluntary movements during sitting, with the goal of capturing disorder-specific movement patterns and contributing to the phenotypic characterization of H-behaviors. To enhance our understanding, we conducted this phenotyping exercise independent of discipline-related boundaries and applied a pictogram-guided phenotyping language (PG-PL).

Phase 1, Step 1: Research assistants (RAs) naïve to the annotation/analysis concept were trained. They then annotated three original Fidgety Philip cartoons and were instructed to ‘describe, but not interpret’. The dataset contains the RAs' annotations of the cartoons.

Phase 1, Step 2: The RAs then annotated snapshots from suggested clinical immobilization tests (SCIT) to work out the distinction between ‘interpretive’ and ‘neutral, non-interpretive’ PG-PL descriptions, first with free-hand and then with PG-PL annotations. The dataset contains the RAs' free-hand and pictogram annotations of 12 SCIT snapshots.

Phase 2: The goal of this phase was to apply the PG-PL to SCIT videos and to develop the first machine-learning algorithm for automated movement detection. The dataset contains the RAs' pictogram annotations of 1-minute-long SCIT video clips. The data are available in raw and processed formats.

HMOVEMENTS: AUTOMATED MOVEMENT DETECTION ALGORITHM

The automated movement detection algorithm is also called "HMovements" and is available for download within this dataset. See the User Guide for instructions on using the algorithm. Note that the algorithm cannot be used on a Mac.
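The HMovements algorithm itself is distributed as a download with this dataset, but the general idea of automated movement detection in video can be illustrated with a minimal frame-differencing sketch. Everything below (the function names `movement_score` and `detect_movement`, and the threshold values) is a hypothetical illustration under simplifying assumptions, not the HMovements implementation:

```python
import numpy as np

def movement_score(prev_frame: np.ndarray, frame: np.ndarray,
                   threshold: int = 25) -> float:
    """Fraction of pixels whose grayscale intensity changed by more than `threshold`."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    return float((diff > threshold).mean())

def detect_movement(frames: list, threshold: int = 25,
                    min_fraction: float = 0.01) -> list:
    """Flag each consecutive frame pair as movement when enough pixels changed."""
    return [
        movement_score(a, b, threshold) > min_fraction
        for a, b in zip(frames, frames[1:])
    ]

# Synthetic demo: two identical frames, then a frame with a large changed region
# (standing in for a limb movement during the SCIT).
still = np.zeros((64, 64), dtype=np.uint8)
moved = still.copy()
moved[10:30, 10:30] = 255
print(detect_movement([still, still.copy(), moved]))  # → [False, True]
```

A real pipeline would operate on grayscale frames decoded from the SCIT video clips and would typically smooth or aggregate these per-frame scores over time before comparing them against the RAs' pictogram annotations.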