Musical Emotion Evaluation Test - Raw data - Music emotion items - Musicians and nonmusicians
Description
Raw data comprising responses from 200 participants (100 musicians and 100 non-musicians) on perceived emotion, together with valence and arousal ratings, for 116 musical stimuli. The dataset also includes responses to the Beck Anxiety Inventory (BAI), the Beck Depression Inventory (BDI), and a musical background questionnaire, as well as the musical stimuli selected and validated for the Musical Emotion Evaluation Test (MEET), supporting transparency, replication, and reuse.
Files
Steps to reproduce
The study involved 200 participants aged 18 to 40 years: 100 musicians and 100 non-musicians. A total of 116 musical stimuli were used, composed specifically to evoke happiness, fear/anger, sadness, and serenity, with 29 stimuli per emotion. Participants completed the task on a digital platform, listening to the 116 composed stimuli and indicating, for each one, which of the four emotions (happiness, fear/anger, sadness, or serenity) they perceived, as well as their levels of valence and arousal on the Self-Assessment Manikin (SAM) self-report scale. In addition to the participants' emotional responses, we assessed depression and anxiety as control variables using the Beck Depression Inventory (BDI) and the Beck Anxiety Inventory (BAI), and we collected musical background information with a questionnaire designed specifically for this research. Reproduction of the study is supported by the availability of both the raw data and the validated musical stimuli (MEET) included in this dataset.
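As an illustrative sketch only, the Python code below shows one way the raw responses could be summarized after download. The file name (meet_raw_data.csv) and column names (group, stimulus_emotion, perceived_emotion, valence, arousal) are assumptions for illustration and should be replaced with the actual names used in the dataset files.

import pandas as pd

# Hypothetical file and column names; adjust to match the actual dataset files.
df = pd.read_csv("meet_raw_data.csv")

# Proportion of trials in which the perceived emotion matched the intended emotion,
# split by group (musicians vs. non-musicians).
df["correct"] = df["perceived_emotion"] == df["stimulus_emotion"]
print(df.groupby("group")["correct"].mean())

# Mean SAM valence and arousal ratings per intended emotion category.
print(df.groupby("stimulus_emotion")[["valence", "arousal"]].mean())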
Institutions
Categories
Funders
- Programa de Bolsas de Produtividade em Pesquisa - UEMG (Grant ID: PQ/UEMG - Edital 10/2022)