Emotion rendering in auditory simulations of imagined walking styles

Luca Turchet, Antonio Rodá

Research output: Contribution to journal › Journal article › peer-review

9 Citations (Scopus)

Abstract

This paper investigated how different emotional states of a walker can be rendered and recognized by means of footstep sound synthesis algorithms. In a first experiment, participants were asked to render five emotions (aggressive, happy, neutral, sad, and tender), according to imagined walking scenarios, by manipulating the parameters of synthetic footstep sounds simulating various combinations of surface materials and shoe types. The results allowed the identification, for the involved emotions and sound conditions, of the mean values and ranges of variation of two parameters: sound level and temporal distance between consecutive steps. These results accorded with those reported in previous studies on real walking, suggesting that the expression of emotions in walking is independent of whether the motor activity is real or imagined. In a second experiment, participants were asked to identify the emotions portrayed by walking sounds synthesized by setting the synthesis engine parameters to the mean values found in the first experiment. Results showed that the involved algorithms conveyed the emotional information at a level comparable with previous studies. Both experiments involved musicians and non-musicians, and a similar general trend was found between the two groups.
Original language: English
Journal: IEEE Transactions on Affective Computing
Volume: PP
Issue number: 99
ISSN: 2371-9850
Publication status: Published - 2016
