Sex differences in vocal learning ability in songbirds are linked with differences in flexible rhythm pattern perception - Supporting Files

Published: 7 September 2022 | Version 2 | DOI: 10.17632/2r29x6gr7w.2
Contributors:
Andrew Rouse

Description

Humans readily recognize a familiar rhythmic pattern, such as isochrony (equal timing between events), across a wide range of rates. This ability reflects a facility with perceiving the relative timing of events, not just absolute interval durations. Several lines of evidence suggest that this ability is supported by precise temporal predictions arising from forebrain auditory-motor interactions. We have shown previously that male zebra finches, which possess specialized auditory-motor networks and communicate with rhythmically patterned sequences, share our ability to recognize isochrony independent of rate. To test the hypothesis that flexible rhythm pattern perception is linked to vocal learning, we ask whether female zebra finches, which do not learn to sing, can also recognize global temporal patterns. We find that non-singing females can flexibly recognize isochrony but perform slightly worse than males on average. These findings are consistent with recent work showing that, while females have reduced forebrain song regions, the overall network connectivity of vocal premotor regions is similar to that in males and supports predictions of upcoming events. Comparative studies of male and female songbirds thus offer an opportunity to study how individual differences in auditory-motor connectivity influence perception of relative timing, a hallmark of human music perception.

This dataset contains trial data, experiment notes, and analysis code for the experiment:
- 2022 Finch Regularity.xlsx contains the trial data from the experiment
- birdData.xlsx is a spreadsheet of data on the subjects
- See readme.txt for descriptions of the data files
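For illustration only, here is a minimal Python sketch of what rate-flexible isochrony recognition is about: the relative timing pattern (all intervals equal) is identical while the absolute interval durations change with rate. The helper name is hypothetical; this is not the authors' stimulus-generation code, and the actual stimuli are linked under Steps to reproduce.

```python
import numpy as np

def isochronous_onsets(rate_hz, n_events):
    """Onset times (s) of an isochronous sequence: all inter-onset intervals equal."""
    return np.arange(n_events) / rate_hz

# The same isochronous pattern at two rates: relative timing (equal intervals)
# is preserved even though the absolute interval durations differ.
slow = isochronous_onsets(rate_hz=4.0, n_events=8)  # 250 ms between events
fast = isochronous_onsets(rate_hz=8.0, n_events=8)  # 125 ms between events
```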

Files

Steps to reproduce

For the analysis script, the birdTable.xlsx file is used for the subjectFile parameter.
The code for running the experiment is available on GitHub: https://github.com/arouse01/pyoperant
Stimuli and schematics are available here: http://dx.doi.org/10.17632/fw5f2vrf4k.2
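As a convenience, a minimal sketch of loading the two spreadsheets with pandas, assuming a Python environment with pandas and openpyxl installed; the column layouts are documented in readme.txt, and the print calls below are placeholders rather than part of the published analysis.

```python
import pandas as pd

# birdTable.xlsx is the file passed as the analysis script's subjectFile
# parameter; "2022 Finch Regularity.xlsx" holds the trial-level data.
subjects = pd.read_excel("birdTable.xlsx")
trials = pd.read_excel("2022 Finch Regularity.xlsx")

print(subjects.head())  # one row per bird (see readme.txt for columns)
print(trials.head())    # trial data from the operant experiment
```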

Institutions

Department of Biology, Tufts University

Categories

Animal Cognition, Animal Learning, Animal Models, Operant Conditioning, Animal Perception, Vocal Communication

Licence