Modeling Continuous Emotional Appraisals of Music Using System Identification
The goal of this project is to apply system identification techniques to model people's perception of emotion in music as a function of time. Emotional appraisals of six selections of classical music are measured from volunteers, who continuously quantify emotion along the dimensions of valence and arousal. In parallel, features that communicate emotion are extracted from the music as a function of time. By treating the features as inputs to a system and the emotional appraisals as its outputs, linear models of the emotional appraisals are created. The models are validated by predicting a listener's emotional appraisals of a musical selection (song) unfamiliar to the system. The results of this project show that system identification improves on previous per-song models by allowing a single model to generalize emotional appraisals across a genre of music. The average <i>R</i>² statistic of the best model structure in this project is 7.7% for valence and 75.1% for arousal, which is comparable to the <i>R</i>² statistics for models of individual songs.
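The input-output modeling described above can be sketched as follows. This is a minimal illustration, not the project's actual implementation: it assumes a linear ARX (autoregressive with exogenous inputs) model structure, synthetic placeholder data in place of real musical features and listener ratings, and a least-squares fit evaluated with the <i>R</i>² statistic mentioned in the abstract.

```python
import numpy as np

# Hypothetical sketch: treat time-varying musical features as inputs u(t)
# and a listener's continuous rating (e.g. arousal) as the output y(t),
# then fit a linear ARX model
#   y(t) = a1*y(t-1) + ... + a_na*y(t-na) + b1'u(t-1) + ... + b_nb'u(t-nb)
# by least squares. All data below are synthetic placeholders.

rng = np.random.default_rng(0)

def build_regressors(y, u, na=2, nb=2):
    """Stack past outputs and past inputs into an ARX regressor matrix."""
    lag = max(na, nb)
    rows = []
    for t in range(lag, len(y)):
        past_y = y[t - na:t][::-1]            # y(t-1), ..., y(t-na)
        past_u = u[t - nb:t][::-1].ravel()    # u(t-1), ..., u(t-nb)
        rows.append(np.concatenate([past_y, past_u]))
    return np.array(rows), y[lag:]

# Synthetic "features" (two channels, e.g. loudness and tempo) and a
# rating trace generated by a known linear system.
u = rng.normal(size=(300, 2))
y = np.zeros(300)
for t in range(2, 300):
    y[t] = 0.6 * y[t-1] - 0.2 * y[t-2] + 0.5 * u[t-1, 0] + 0.3 * u[t-1, 1]

X, target = build_regressors(y, u)
theta, *_ = np.linalg.lstsq(X, target, rcond=None)

# Goodness of fit on the (noise-free, synthetic) training data.
pred = X @ theta
r2 = 1 - np.sum((target - pred) ** 2) / np.sum((target - target.mean()) ** 2)
print(f"R^2 = {100 * r2:.1f}%")
```

In practice the model would be fit on some songs and validated by simulating its response to a held-out song's features, as the abstract describes; the <i>R</i>² would then be computed between the simulated and measured appraisals.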