The impact of an accurate vertical localization with HRTFs on short explorations of immersive virtual reality scenarios

Publication: Contribution to book/anthology/report/conference proceeding › Conference article in proceedings › Research › peer-reviewed

Abstract

Achieving a full 3D auditory experience with head-related transfer functions (HRTFs) is still one of the main challenges of spatial audio rendering. HRTFs capture the listener's acoustic effects and personal perception, allowing immersion in virtual reality (VR) applications. This paper investigates the connection between listener sensitivity to vertical localization cues and experienced presence, spatial audio quality, and attention. Two VR experiments with a head-mounted display (HMD) and an animated visual avatar are proposed: (i) a screening test that evaluates the participants' localization performance with HRTFs for a non-visible spatialized audio source, and (ii) a 2-minute free exploration of a VR scene with five audiovisual sources under both non-spatialized (2D stereo panning) and spatialized (free-field HRTF rendering) listening conditions. The screening test allows a distinction between good and bad localizers. The second experiment shows that the different audio rendering methods introduce no bias in the quality of experience (QoE); more interestingly, good localizers perceive a lower audio latency and are less involved in the visual aspects.
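The two listening conditions compared in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: constant-power amplitude panning stands in for the non-spatialized (2D stereo panning) condition, and convolution with a pair of head-related impulse responses (HRIRs, the time-domain form of HRTFs) stands in for the free-field HRTF condition. The function names and the HRIR arrays are assumptions for illustration only.

```python
import numpy as np

def stereo_pan(mono, azimuth_deg):
    """Constant-power 2D stereo panning of a mono signal.
    azimuth_deg in [-90, 90]: -90 = hard left, +90 = hard right."""
    # Map azimuth to a pan angle in [0, 90] degrees.
    theta = np.deg2rad((azimuth_deg + 90.0) / 2.0)
    left = np.cos(theta) * mono
    right = np.sin(theta) * mono
    return np.stack([left, right])  # shape (2, n_samples)

def hrtf_render(mono, hrir_left, hrir_right):
    """Free-field binaural rendering: convolve the mono signal with
    the measured impulse response (HRIR) for each ear."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right])
```

Unlike stereo panning, which only shifts interaural level, the HRIR convolution imprints the direction-dependent spectral cues (pinna and torso filtering) on which vertical localization relies.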

Details

Original language: English
Title: Proc. 17th IEEE International Symposium on Mixed and Augmented Reality (ISMAR)
Publisher: IEEE Computer Society Press
Publication date: Oct. 2018
Pages: 1-8
ISBN (print): 978-1-5386-7460-4
DOI
Status: Published - Oct. 2018
Publication type: Research
Peer reviewed: Yes
Event: 17th IEEE/ACM Int. Symposium on Mixed and Augmented Reality (ISMAR) - Munich, Germany
Duration: 16 Oct. 2018 - 20 Oct. 2018

Conference

Conference: 17th IEEE/ACM Int. Symposium on Mixed and Augmented Reality (ISMAR)
Country: Germany
City: Munich
Period: 16/10/2018 - 20/10/2018