A multi-genre model for music emotion recognition using linear regressors

Darryl Griffiths, Stuart Cunningham*, Jonathan Rex Weinel, Richard Picking

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review


Abstract

Making the link between human emotion and music is challenging. Our aim was to produce an efficient system that emotionally rates songs from multiple genres. To achieve this, we employed a series of online self-report studies, utilising Russell's circumplex model. The first study (n = 44) identified audio features that map to arousal and valence for 20 songs. From this, we constructed a set of linear regressors. The second study (n = 158) measured the efficacy of our system, utilising 40 new songs to create a ground truth. Results show our approach may be effective at emotionally rating music, particularly in the prediction of valence.
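
The article itself does not include code. As a rough illustration of the kind of approach the abstract describes, the sketch below fits one linear regressor per emotion dimension (valence and arousal) from per-song audio features, assuming scikit-learn; the feature names and synthetic data are illustrative placeholders, not the features or listener ratings used in the study.

```python
# Minimal sketch: separate linear regressors for valence and arousal.
# Features and ratings below are synthetic placeholders, not data from the paper.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)

# Hypothetical per-song audio features (e.g. tempo, spectral centroid, RMS energy).
X = rng.normal(size=(60, 3))

# Hypothetical mean listener ratings on Russell's circumplex dimensions.
valence = X @ np.array([0.6, -0.2, 0.1]) + rng.normal(scale=0.3, size=60)
arousal = X @ np.array([0.1, 0.5, 0.4]) + rng.normal(scale=0.3, size=60)

X_train, X_test, v_train, v_test, a_train, a_test = train_test_split(
    X, valence, arousal, test_size=0.25, random_state=0
)

# One linear regressor per emotion dimension.
valence_model = LinearRegression().fit(X_train, v_train)
arousal_model = LinearRegression().fit(X_train, a_train)

print("valence R^2:", r2_score(v_test, valence_model.predict(X_test)))
print("arousal R^2:", r2_score(a_test, arousal_model.predict(X_test)))
```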

Original language: English
Journal: Journal of New Music Research
Volume: 50
Issue number: 4
Pages (from-to): 355-372
Number of pages: 18
ISSN: 0929-8215
DOIs
Publication status: Published - 2021
Externally published: Yes

Keywords

  • Arousal
  • MER
  • emotion
  • music
  • perception
  • regression
  • valence
