Perceptual encoding of emotions in interactive bodily expressions

Published: 6 October 2023 | Version 3 | DOI: 10.17632/5y3fgf4j9r.3
Contributors:
Nick Taubert, Martin Giese

Description

For social species such as primates or dogs, the perceptual analysis of social interactions is an essential survival skill that emerges early in development [1,2]. Although emotional behavior in real life is predominantly based on interactions between conspecifics, research on the perception of emotional body expressions has focused primarily on single individuals [3-5]. Previous studies using point-light or video stimuli of interacting people suggest an influence of social context on the perception and neural encoding of interacting bodies, but it remains entirely unknown how the emotions of multiple interacting agents are perceptually integrated [5-8]. Exploiting methods from computer animation, we studied this question by creating scenes with two interacting avatars whose emotional styles were controlled independently. Although participants were instructed to report the emotional style of a single agent, we found a systematic influence of the emotion expressed by the other agent, reflecting the ecological relevance of the emotions within the social interaction context. The emotional styles of interacting individuals are thus jointly encoded.

The dataset consists of animated stimuli based on motion capture data of two-person emotional interactions, recorded with a VICON MX system using eight cameras at 120 Hz. Each actor wore 41 reflective markers. The recorded trajectories were processed and retargeted onto a custom unisex avatar model using the commercial software Autodesk MotionBuilder. Animations were created and edited in the 3D animation software Autodesk 3ds Max, and movies were rendered from these stimuli at a frame rate of 30 fps.
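As a rough illustration of the rate relationship stated above (120 Hz capture rendered to 30 fps video), the Python sketch below decimates marker trajectories by keeping every fourth sample. The array shape, the function name, and the plain decimation strategy are assumptions for illustration, not the authors' actual processing pipeline.

import numpy as np

MOCAP_RATE_HZ = 120   # VICON capture rate stated in the description
VIDEO_RATE_FPS = 30   # rendering frame rate stated in the description

def downsample_trajectories(markers: np.ndarray) -> np.ndarray:
    """Downsample marker trajectories from 120 Hz to 30 fps.

    `markers` is assumed (hypothetically) to have shape
    (n_frames, n_markers, 3), e.g. (T, 41, 3) for the 41 markers
    per actor mentioned above. Because 120 Hz is an integer
    multiple of 30 fps, simple decimation (keeping every 4th
    sample) preserves timing exactly.
    """
    step = MOCAP_RATE_HZ // VIDEO_RATE_FPS  # = 4
    return markers[::step]

# Example with synthetic data standing in for one actor's recording:
trajectories = np.random.rand(480, 41, 3)       # 4 s of capture at 120 Hz
frames = downsample_trajectories(trajectories)  # shape (120, 41, 3)
print(frames.shape)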

Categories

Motion Capture, Video
