Description
Advances in computational processing have made big data analysis in fields like music information retrieval (MIR) possible. Through MIR techniques, researchers have been able to study information on a song, its musical parameters, the metadata generated by the song's listeners, and contextual data regarding the artists and listeners (Schedl, 2014). MIR research techniques have been applied within the field of music and emotions research to help analyze the correlations between musical information and emotional response. By pairing methods from music and emotions research with the analysis of musical features extracted through MIR, researchers have developed predictive models for the emotions within a musical piece. This research has increased our understanding of how certain musical features, such as pitch, timbre, rhythm, dynamics, mel-frequency cepstral coefficients (MFCCs), and others, correlate with the emotions evoked by music (Lartillot, 2008; Schedl, 2014). This understanding has enabled researchers to generate predictive models of emotion within music based on listeners' emotional responses to it. However, robust models that account for a user's individualized emotional experience and the semantic nuances of emotional categorization have eluded the research community (London, 2001). To address these two main issues, more advanced analytical methods have been employed. In this article we will look at two of these methods, machine learning algorithms and deep learning techniques, and discuss the effect they have had on music and emotions research (Murthy, 2018). Current trends within MIR research, namely the application of support vector machines and neural networks, will also be assessed to explain how these methods help address the two main issues within music and emotion research. Finally, future research within machine and deep learning will be postulated to show how individualized models may be developed from a single user's, or a pool of users', listening libraries, and how the development of semi-supervised classification models that categorize by cluster rather than by nominal label may help address the nuances of emotional categorization.
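
As an illustration of the feature-extraction-plus-classifier pipeline surveyed above, the following is a minimal sketch, assuming the Python libraries librosa and scikit-learn, in which MFCCs summarize each track and a support vector machine predicts an emotion label. The file names and emotion labels are hypothetical placeholders, not data from the thesis.

# Minimal sketch: MFCC features from audio (librosa) feeding an SVM
# emotion classifier (scikit-learn). Illustrative only; the corpus below
# is a hypothetical placeholder.
import numpy as np
import librosa
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

def mfcc_features(path, n_mfcc=13):
    """Summarize a track as the mean and std of its MFCCs over time."""
    y, sr = librosa.load(path, sr=22050, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

# Hypothetical labelled corpus: audio files paired with emotion categories.
tracks = ["track_01.wav", "track_02.wav", "track_03.wav", "track_04.wav"]
emotions = ["happy", "sad", "happy", "calm"]

X = np.vstack([mfcc_features(p) for p in tracks])
X_train, X_test, y_train, y_test = train_test_split(
    X, emotions, test_size=0.25, random_state=0)

# SVM with standardized features, as in the SVM-based emotion-prediction
# work the abstract surveys.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
model.fit(X_train, y_train)
print(model.score(X_test, y_test))

For the cluster-based, semi-supervised categorization raised at the end of the abstract, the SVC step could in principle be replaced by a clustering or label-propagation model; that substitution is only suggested here as an assumption, not drawn from the source.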

Restrictions Statement

Barrett Honors College theses and creative projects are restricted to ASU community members.

Details

Title
  • Determining Emotive Correlates in Music through Music Information Retrieval and Artificial Intelligence
Contributors
Date Created
2018-12
Resource Type
  • Text