DATA_HARMCOMP

Published: 23 January 2023 | Version 1 | DOI: 10.17632/xtgdr9283r.1
Contributor:
Lorena Mihelac

Description

DATA_HARMCOMP_1 We contribute to the longstanding challenge of explaining a listener's acceptability of a particular piece of music, focusing on harmony, one of the crucial dimensions of music and one of the least examined in this context. We propose three measures of harmonic complexity: (i) complexity based on the usage of the basic tonal functions and parallels in the harmonic progression, (ii) the entropies of unigrams and bigrams in the sequence of chords, and (iii) the regularity of the harmonic progression. Additionally, we propose four measures of the acceptability of musical pieces (perceptual variables): difficulty, pleasantness, recognition, and repeatability. These measures have been evaluated for each musical example in our dataset, which consists of 160 carefully selected musical excerpts from different musical styles. The first and third complexity measures and the musical style of the excerpts were determined by the first author using criteria described in the article, while the entropies were computed with Shannon's formula after the harmonic progression was determined. The four perceptual variables were obtained from a group of 21 participants, taking their mean values as the final scores. A statistical analysis of this dataset shows that all the measures of complexity are consistent and, together with the musical style, are important features in explaining musical acceptability. These relations were further elaborated by regression tree analysis for difficulty and pleasantness, after unigram entropy was eliminated due to its high correlation with bigram entropy. The results offer reasonable interpretations and also illuminate the relative importance of the predictor variables; in particular, the regularity of the harmonic progression is in both cases the most important predictor.
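As an illustration, the unigram and bigram entropies described above can be computed with Shannon's formula directly from a chord progression. This is a minimal sketch, not the authors' code; the function names and the example progression (Roman numerals) are hypothetical.

```python
from collections import Counter
from math import log2

def shannon_entropy(symbols):
    """Shannon entropy H = -sum(p * log2(p)) over the empirical distribution."""
    counts = Counter(symbols)
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

def unigram_bigram_entropy(chords):
    """Entropies of single chords (unigrams) and adjacent chord pairs (bigrams)."""
    bigrams = list(zip(chords, chords[1:]))
    return shannon_entropy(chords), shannon_entropy(bigrams)

# Hypothetical chord progression in Roman-numeral notation
progression = ["I", "IV", "V", "I", "vi", "IV", "V", "I"]
h_uni, h_bi = unigram_bigram_entropy(progression)
```

Higher entropy indicates a less predictable distribution of chords (or chord transitions) and thus, in this framework, a more complex harmonic progression.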
Link to the research work: https://doi.org/10.1145/3375014

DATA_HARMCOMP_2 In this study, we replace the human experts involved in the detection of (ir)regularity with artificial intelligence algorithms. We evaluated eight variables measuring entropy and information content, which can be computed for each musical piece using the computational model IDyOM. The algorithm was tested on 160 musical excerpts. A preliminary statistical analysis indicated that three of the eight variables were significant predictors of regularity (Ecpitch, ICcpintfref, and Ecpintfref). Additionally, we observed linear separation between regular and irregular excerpts; therefore, we employed support vector machine (SVM) and artificial neural network (ANN) algorithms, with a linear kernel and a linear activation function, respectively, to predict regularity. The final algorithms were capable of predicting regularity with an accuracy ranging from 89% for the ANN algorithm using only the most significant predictor to 100% for the ANN algorithm using all eight prediction variables.

Link to the research article: https://doi.org/10.34768/amcs-2020-0056
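A minimal sketch of the linear-kernel SVM setup described above, using scikit-learn; this is not the study's actual pipeline. The feature matrix here is synthetic stand-in data, whereas the study used the eight IDyOM-derived entropy and information-content predictors.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 160  # one row per musical excerpt, as in the dataset

# Synthetic stand-ins for the eight IDyOM-derived predictors
X = rng.normal(size=(n, 8))

# Linearly separable labels (1 = regular, 0 = irregular), mirroring the
# linear separation between regular and irregular excerpts reported above
y = (X @ rng.normal(size=8) > 0).astype(int)

# Linear-kernel SVM; a large C approximates a hard-margin classifier
clf = SVC(kernel="linear", C=1e6).fit(X, y)
accuracy = clf.score(X, y)
```

Because the labels are constructed to be linearly separable, a linear kernel suffices; on real data the choice of kernel should follow from exploratory analysis, as it did in the study.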

Categories

Artificial Intelligence in Music, Content-Based Music Processing, Musicology

License