Hearing with Eyes in Virtual Reality

Publication: Contribution to book/anthology/report/conference proceeding › Article in proceeding › Research › peer review

Abstract

Sound and light signal propagation have similar physical properties: both exhibit reflection, refraction and attenuation. This behavioral synergy between sound and light provides inspiration for creating an audio-visual echolocation system, where light could be mapped to the sound signal, visually representing the auralization of the virtual environment (VE). Some mammals, e.g. bats, navigate in dark environments using sound, which is the result of a long evolutionary process. Humans are less successful when it comes to echolocation, as they have only one lifetime to learn it. To the authors' knowledge, it remains to be seen whether sound propagation and its visualization have been implemented in an artistic way that is perceptually pleasant and used for navigation in the VE. Therefore, the core novelty of this research is navigation with a visualized echolocation signal, using a cognitive mental-mapping activity in the VE. Although test results show that sound-signal visualization was helpful for navigation in the unlit VE, it was significantly easier to navigate in a fully illuminated scene, which might indicate behavioral similarities when navigating with echolocation in the VE. Nonetheless, it allowed test participants to create cognitive mental maps of the virtual cave.
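The record does not include the authors' implementation. Purely as an illustration of the kind of mapping the abstract describes (echoes of a sound pulse rendered as light), the following toy model is a sketch under our own assumptions: free-field inverse-square attenuation of the echo, and a hypothetical gain constant for converting echo intensity to a normalized light level.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C


def echo_delay(distance_m: float) -> float:
    """Round-trip time for an echolocation pulse to a surface and back."""
    return 2.0 * distance_m / SPEED_OF_SOUND


def echo_intensity(distance_m: float, source_level: float = 1.0) -> float:
    """Inverse-square attenuation over the round trip (free-field model)."""
    return source_level / (2.0 * distance_m) ** 2


def light_from_echo(distance_m: float, gain: float = 10.0) -> float:
    """Map echo intensity to a [0, 1] light level for rendering a surface."""
    return min(1.0, gain * echo_intensity(distance_m))


if __name__ == "__main__":
    for d in (1.0, 5.0, 20.0):
        print(f"{d:5.1f} m  delay {echo_delay(d) * 1000:6.1f} ms  "
              f"light {light_from_echo(d):.4f}")
```

Nearby surfaces thus light up sooner and brighter than distant ones, which is the cue a navigator in an unlit VE would rely on; the actual system presumably models reflection and refraction as well.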
Original language: English
Title: Hearing with Eyes in Virtual Reality
Publisher: IEEE
Status: Accepted/In press - 20 Feb 2019


Cite this

Palamas, G., & Andreasen, A. (Accepted/In press). Hearing with Eyes in Virtual Reality. In Hearing with Eyes in Virtual Reality. IEEE.
Palamas, George ; Andreasen, Anastassia. / Hearing with Eyes in Virtual Reality. Hearing with Eyes in Virtual Reality. IEEE, 2019.
@inproceedings{5aacea29102045af96eacc8010525041,
title = "Hearing with Eyes in Virtual Reality",
abstract = "Sound and light signal propagation have similar physical properties: both exhibit reflection, refraction and attenuation. This behavioral synergy between sound and light provides inspiration for creating an audio-visual echolocation system, where light could be mapped to the sound signal, visually representing the auralization of the virtual environment (VE). Some mammals, e.g. bats, navigate in dark environments using sound, which is the result of a long evolutionary process. Humans are less successful when it comes to echolocation, as they have only one lifetime to learn it. To the authors' knowledge, it remains to be seen whether sound propagation and its visualization have been implemented in an artistic way that is perceptually pleasant and used for navigation in the VE. Therefore, the core novelty of this research is navigation with a visualized echolocation signal, using a cognitive mental-mapping activity in the VE. Although test results show that sound-signal visualization was helpful for navigation in the unlit VE, it was significantly easier to navigate in a fully illuminated scene, which might indicate behavioral similarities when navigating with echolocation in the VE. Nonetheless, it allowed test participants to create cognitive mental maps of the virtual cave.",
keywords = "Human Centered Design, Human Centered Computing, Visualization design and evaluation methods, Visualization techniques, Sound and Music Computing – modeling systems., Virtual Reality (VR), phenomenological method",
author = "George Palamas and Anastassia Andreasen",
year = "2019",
month = "2",
day = "20",
language = "English",
booktitle = "Hearing with Eyes in Virtual Reality",
publisher = "IEEE",
address = "United States",

}

Palamas, G & Andreasen, A 2019, Hearing with Eyes in Virtual Reality. in Hearing with Eyes in Virtual Reality. IEEE.

Hearing with Eyes in Virtual Reality. / Palamas, George; Andreasen, Anastassia.

Hearing with Eyes in Virtual Reality. IEEE, 2019.


TY - GEN

T1 - Hearing with Eyes in Virtual Reality

AU - Palamas, George

AU - Andreasen, Anastassia

PY - 2019/2/20

Y1 - 2019/2/20

N2 - Sound and light signal propagation have similar physical properties: both exhibit reflection, refraction and attenuation. This behavioral synergy between sound and light provides inspiration for creating an audio-visual echolocation system, where light could be mapped to the sound signal, visually representing the auralization of the virtual environment (VE). Some mammals, e.g. bats, navigate in dark environments using sound, which is the result of a long evolutionary process. Humans are less successful when it comes to echolocation, as they have only one lifetime to learn it. To the authors' knowledge, it remains to be seen whether sound propagation and its visualization have been implemented in an artistic way that is perceptually pleasant and used for navigation in the VE. Therefore, the core novelty of this research is navigation with a visualized echolocation signal, using a cognitive mental-mapping activity in the VE. Although test results show that sound-signal visualization was helpful for navigation in the unlit VE, it was significantly easier to navigate in a fully illuminated scene, which might indicate behavioral similarities when navigating with echolocation in the VE. Nonetheless, it allowed test participants to create cognitive mental maps of the virtual cave.

AB - Sound and light signal propagation have similar physical properties: both exhibit reflection, refraction and attenuation. This behavioral synergy between sound and light provides inspiration for creating an audio-visual echolocation system, where light could be mapped to the sound signal, visually representing the auralization of the virtual environment (VE). Some mammals, e.g. bats, navigate in dark environments using sound, which is the result of a long evolutionary process. Humans are less successful when it comes to echolocation, as they have only one lifetime to learn it. To the authors' knowledge, it remains to be seen whether sound propagation and its visualization have been implemented in an artistic way that is perceptually pleasant and used for navigation in the VE. Therefore, the core novelty of this research is navigation with a visualized echolocation signal, using a cognitive mental-mapping activity in the VE. Although test results show that sound-signal visualization was helpful for navigation in the unlit VE, it was significantly easier to navigate in a fully illuminated scene, which might indicate behavioral similarities when navigating with echolocation in the VE. Nonetheless, it allowed test participants to create cognitive mental maps of the virtual cave.

KW - Human Centered Design

KW - Human Centered Computing

KW - Visualization design and evaluation methods

KW - Visualization techniques

KW - Sound and Music Computing – modeling systems.

KW - Virtual Reality (VR)

KW - phenomenological method

M3 - Article in proceeding

BT - Hearing with Eyes in Virtual Reality

PB - IEEE

ER -

Palamas G, Andreasen A. Hearing with Eyes in Virtual Reality. In Hearing with Eyes in Virtual Reality. IEEE. 2019