Modelling novice and expert listeners’ ability to detect changes in short melodies

Kathrine Agres, David Meredith

Publication: Conference abstract (conference contribution without publisher/journal) · Research · peer-reviewed



Although musical memory and melodic change detection (ChDet) have received significant attention in the literature, the relative influence of different parameters on ChDet performance is not well understood. To explore this, we developed a computational model, based on the salience of notes arising from tonal and rhythmic features, to simulate the results of two previous experiments (Agres, 2018). In these experiments, listeners (either professional musicians or non-musicians) indicated whether a standard and a comparison melody were the same or different (different melodies contained one changed tone). We modelled the results of these two ChDet experiments by systematically varying the relative weights (i.e., relative contributions to the model) of several rhythmic and tonal features. By discovering which feature combinations best account for listeners' ChDet performance, we aim to better understand the cognitive mechanisms underlying short-term memory for melodies. Our model predicts the likelihood of a change being detected in a pair of melodies. The model quantifies the salience of each note as a weighted sum of normalized values representing its duration, metrical strength, tonal stability and tonal instability. Tonal stability, t, is quantified as the normalized tonal hierarchy rating (Krumhansl, 1990); tonal instability is defined as 1 − t; and metrical salience is based on the note's metrical strength within a metrical hierarchy. We compute which combination of weighted salience features best predicts listeners' ChDet performance, obtaining separate optimal relative feature weightings for musicians and non-musicians. Seventy-two melodies were used: 36 stylistic melodies conforming to Western musical norms, 18 non-stylistic melodies containing unusual melodic leaps or implied harmonies, and 18 random melodies containing tones randomly selected from a diatonic scale. Each stylistic and non-stylistic melody could contain at most one non-diatonic tone.
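The per-note salience computation described above can be sketched as follows. This is a minimal illustration of the weighted-sum formulation, not the authors' implementation: the function name and the example weights are assumptions, and the optimised weightings reported in the study are not reproduced here.

```python
# Sketch of the note-salience model: salience as a weighted sum of
# normalized duration, metrical strength, tonal stability (t), and
# tonal instability (1 - t). Weights here are illustrative placeholders.

def note_salience(duration, metrical_strength, tonal_stability, weights):
    """Return the salience of one note.

    duration, metrical_strength, tonal_stability: normalized to [0, 1]
    (tonal_stability is the normalized tonal hierarchy rating).
    weights: (w_duration, w_metrical, w_stability, w_instability).
    """
    w_dur, w_met, w_stab, w_instab = weights
    tonal_instability = 1.0 - tonal_stability
    return (w_dur * duration
            + w_met * metrical_strength
            + w_stab * tonal_stability
            + w_instab * tonal_instability)

# Example: a half-beat note on a strong metrical position with high
# tonal stability, scored under (hypothetical) equal weights.
example = note_salience(0.5, 1.0, 0.8, (0.25, 0.25, 0.25, 0.25))
```

In the study, separate weight vectors were fitted for musicians and non-musicians, and the best-fitting combination of weighted features was taken to predict ChDet performance.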
Tonal instability in the comparison melody was the most reliable predictor of ChDet performance for both musicians and non-musicians (cf. temporal asymmetry effects in Krumhansl, 1990). Note duration and tonal stability in the first melody also had an effect, as did tonal instability in the first melody (for musicians only). We were able to model non-musicians' performance with a correlation of 0.51 (Exp 1) and 0.55 (Exp 2), and musicians' performance with a correlation of 0.58 (Exp 1) and 0.65 (Exp 2). All correlations were significant at p < 0.001. By using a linear combination of rhythmic and tonal features, we modelled listeners' ability to detect tone changes in pairs of melodies. Our results indicate that an important factor for both musicians and non-musicians is the tonal instability of notes in the second melody presented; other contributing factors were note duration, tonal stability and tonal instability in the first melody. This approach allowed us to discover which features are most likely to drive listeners' change detection performance in music.

References:
Agres, K. (2018). Change detection and schematic processing in music. Psychology of Music. DOI: 10.1177/0305735617751249.
Krumhansl, C. L. (1990). Cognitive Foundations of Musical Pitch. New York, NY: Oxford University Press.
Number of pages: 1
Status: Published - 2018
Event: 15th International Conference on Music Perception and Cognition (ICMPC15) & 10th Triennial Conference of the European Society for the Cognitive Sciences of Music (ESCOM10) - University of Graz, Graz, Austria
Duration: 23 Jul 2018 - 28 Jul 2018
Conference number: 15



