Trajectory dataset for Synchronous vs. Non-Synchronous Imitation: Using Dance to Explore Interpersonal Coordination During Observational Learning

Published: 25-02-2021 | Version 1 | DOI: 10.17632/c97n6sbwzr.1
Contributors:
Cassandra Crone,
Lillian Rigoli,
Gaurav Patil,
Sarah Pini,
John Sutton,
Rachel Kallen,
Michael Richardson

Description

The dataset comprises 18-dimensional trajectories describing the whole-body motion of 22 Synchronous and 20 Non-synchronous participants. The data are sampled at 100 Hz and pre-processed such that the relative positions of the joint centres were calculated using the pelvic-centre coordinates as a reference, controlling both for whole-body motion and for the relative position of the participants in space.

Header for all .csv files:
RightAnkle.X, RightAnkle.Y, RightAnkle.Z, LeftAnkle.X, LeftAnkle.Y, LeftAnkle.Z, RightShoulder.X, RightShoulder.Y, RightShoulder.Z, RightWrist.X, RightWrist.Y, RightWrist.Z, LeftShoulder.X, LeftShoulder.Y, LeftShoulder.Z, LeftWrist.X, LeftWrist.Y, LeftWrist.Z

File naming convention:
leader<n>.csv - nth trial of the expert confederate
follower<n>.csv - nth trial of the participant

This dataset is an addendum to the paper: Cassandra L. Crone, Lillian M. Rigoli, Gaurav Patil, Sarah Pini, John Sutton, Rachel W. Kallen, Michael J. Richardson, Synchronous vs. non-synchronous imitation: Using dance to explore interpersonal coordination during observational learning, Human Movement Science, Volume 76, 2021, 102776, ISSN 0167-9457, https://doi.org/10.1016/j.humov.2021.102776. For more details, please refer to the paper or contact gaurav.patil@mq.edu.au.
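As a minimal sketch of how a trial file can be read, the Python loader below assumes only the 18-column header and 100 Hz sampling rate described above. The function name `load_trial` is our own, and a synthetic in-memory stand-in is used here in place of a real file such as leader1.csv, purely to demonstrate the expected layout.

```python
import io
import numpy as np
import pandas as pd

# Column header shared by all .csv files in the dataset:
# 6 joint centres x 3 axes = 18 dimensions per sample.
COLUMNS = [
    f"{joint}.{axis}"
    for joint in ("RightAnkle", "LeftAnkle", "RightShoulder",
                  "RightWrist", "LeftShoulder", "LeftWrist")
    for axis in ("X", "Y", "Z")
]

FS = 100  # sampling rate in Hz

def load_trial(path_or_buffer):
    """Load one trial; return (time vector in s, trajectory of shape (n, 18))."""
    df = pd.read_csv(path_or_buffer)
    assert list(df.columns) == COLUMNS, "unexpected header layout"
    traj = df.to_numpy()           # positions relative to the pelvic centre
    t = np.arange(len(traj)) / FS  # time in seconds
    return t, traj

# Synthetic stand-in for a file like leader1.csv (one second of zeros),
# used here only to illustrate the expected file layout.
fake_csv = io.StringIO(
    ",".join(COLUMNS) + "\n" +
    "\n".join(",".join(["0.0"] * 18) for _ in range(FS))
)
t, traj = load_trial(fake_csv)
print(traj.shape)  # (100, 18)
```

The same loader applies unchanged to leader and follower files, since both share the header above.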

Steps to reproduce

The results of the study can be reproduced from the provided data by running a Multidimensional Cross-Recurrence Quantification Analysis (MdCRQA; Wallot, 2018) using the code available at https://github.com/Wallot/MdRQA.
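The full analysis is implemented in the code linked above. Purely as a language-neutral illustration of the core idea behind cross-recurrence, the Python sketch below computes a cross-recurrence matrix and its recurrence rate for two trajectories of equal dimensionality; the function name `mdcrqa_rr`, the radius value, and the toy data are our own assumptions, not part of Wallot's implementation.

```python
import numpy as np

def mdcrqa_rr(x, y, radius):
    """Cross-recurrence rate for two (n_samples, n_dims) trajectories.

    A point (i, j) is recurrent when the Euclidean distance between
    sample i of x and sample j of y falls below the radius threshold.
    """
    # Pairwise distances between every sample of x and every sample of y.
    d = np.linalg.norm(x[:, None, :] - y[None, :, :], axis=-1)
    cr = d < radius   # boolean cross-recurrence matrix
    return cr.mean()  # recurrence rate: fraction of recurrent points

# Toy demonstration: an 18-D random walk and a slightly perturbed copy.
rng = np.random.default_rng(0)
base = rng.standard_normal((200, 18)).cumsum(axis=0)
rr = mdcrqa_rr(base, base + 0.01 * rng.standard_normal((200, 18)), radius=1.0)
print(0.0 <= rr <= 1.0)  # True
```

In practice the leader trajectory would be passed as `x` and the matched follower trajectory as `y`, after any embedding and normalisation steps specified in the paper.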