Mobile AR in and Out: Towards Delay-Based Modeling of Acoustic Scenes

Research output: Contribution to book/anthology/report/conference proceeding › Article in proceeding › Research › peer-review

6 Citations (Scopus)

Abstract

We have previously presented an augmented reality (AR) audio application in which scattering delay networks efficiently generate and organize a reverberator based on room geometry scanned by an AR device. The application allowed for real-time processing and updating of reflection path geometry and provided a proof of concept for plausible audio-spatial registration of a virtual object in real environments. Here we present our ongoing work that aims to extend the simulation to outdoor scenes by using the Waveguide Web instead of the original formulation based on scattering delay networks. The current implementation is computationally more demanding, but it has the potential to provide more accurate second-order reflections and, therefore, better registration of audio-visual AR scenes.
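
The delay-based rendering the abstract refers to can be pictured as a tapped delay line whose tap delays and gains follow from reflection path lengths in the scanned geometry. The Python sketch below illustrates only that basic idea; the path lengths, the single broadband absorption value, and the function name render_reflections are assumptions made for the example, not the scattering delay network or Waveguide Web structures used in the paper, which additionally recirculate energy between scattering nodes.

import numpy as np

FS = 48000   # sample rate in Hz (assumed for the example)
C = 343.0    # speed of sound in m/s

def render_reflections(dry, path_lengths_m, absorption=0.3):
    """Mix delayed, attenuated copies of a dry mono signal, one per reflection path."""
    taps = []
    for length in path_lengths_m:
        delay = int(round(length / C * FS))            # propagation delay in samples
        gain = (1.0 - absorption) / max(length, 1.0)   # wall loss plus 1/r spreading
        taps.append((delay, gain))
    out = np.zeros(len(dry) + max(d for d, _ in taps))
    for delay, gain in taps:
        out[delay:delay + len(dry)] += gain * dry      # one tap per reflection path
    return out

# Three hypothetical reflection path lengths in metres.
wet = render_reflections(np.random.randn(FS), [3.0, 7.5, 9.2])

In a real-time AR setting such as the one described, the tap delays and gains would be recomputed whenever the device updates the reflection path geometry.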

Original language: English
Title of host publication: 25th IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2018 - Proceedings
Editors: Frank Steinicke, Bruce Thomas, Kiyoshi Kiyokawa, Greg Welch
Number of pages: 2
Publisher: IEEE
Publication date: 24 Aug 2018
Pages: 543-544
Article number: 8446230
ISBN (Print): 978-1-5386-3366-3
ISBN (Electronic): 978-1-5386-3365-6
DOIs
Publication status: Published - 24 Aug 2018
Event: 25th IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2018 - Reutlingen, Germany
Duration: 18 Mar 2018 – 22 Mar 2018

Conference

Conference: 25th IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2018
Country/Territory: Germany
City: Reutlingen
Period: 18/03/2018 – 22/03/2018
Sponsor: ART, Digital Projection, et al., Haption, MiddleVR, VICON

Keywords

  • Human-centered computing → Mixed / augmented reality
  • Human-centered computing → Auditory feedback
