Sound Synthesis and Evaluation of Interactive Footsteps for Virtual Reality Applications

Research output: Contribution to journal › Conference article in Journal › Research › peer-review

34 Citations (Scopus)
703 Downloads (Pure)

Abstract

A system to synthesize in real time the sound of footsteps on different materials is presented. The system is based on microphones, which allow users to interact with their own footwear. This solution distinguishes our system from previous efforts that require specific shoes enhanced with sensors. The microphones detect real footstep sounds from users, from which the ground reaction force (GRF) is estimated. This GRF is used to control a sound synthesis engine based on physical models. Evaluations of the system in terms of sound validity and fidelity of interaction are described.
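The control chain outlined in the abstract (microphone capture, footstep detection, GRF estimation, control of a physical-model synthesis engine) can be illustrated with a minimal sketch. The Python/NumPy example below is a hypothetical illustration, not the authors' implementation: the envelope-follower GRF estimate, the threshold-based onset detector, and all parameter values are assumptions made for demonstration only.

import numpy as np

def grf_envelope(mic_signal, sample_rate, smooth_ms=10.0):
    """Estimate a GRF-like control envelope from a microphone signal.
    Hypothetical estimator: full-wave rectification followed by a
    one-pole low-pass smoother; the paper does not detail its method."""
    alpha = np.exp(-1.0 / (sample_rate * smooth_ms / 1000.0))
    env = np.zeros_like(mic_signal, dtype=float)
    level = 0.0
    for i, x in enumerate(np.abs(mic_signal)):
        level = alpha * level + (1.0 - alpha) * x
        env[i] = level
    return env

def detect_footsteps(envelope, threshold=0.05, refractory=2000):
    """Return sample indices where the envelope first crosses `threshold`,
    ignoring re-triggers within a refractory window (values are assumed)."""
    onsets, last = [], -refractory
    for i in range(1, len(envelope)):
        if envelope[i] >= threshold > envelope[i - 1] and i - last >= refractory:
            onsets.append(i)
            last = i
    return onsets

if __name__ == "__main__":
    sr = 44100
    t = np.arange(sr) / sr
    # Synthetic decaying noise burst standing in for a real microphone capture.
    mic = np.exp(-40.0 * t) * np.random.randn(sr) * (t < 0.2)
    env = grf_envelope(mic, sr)
    for onset in detect_footsteps(env):
        force = env[onset]
        print(f"footstep at {onset / sr:.3f} s, estimated force (a.u.) = {force:.3f}")
        # A real system would pass `force` to the physical-model synthesis
        # engine here, e.g. as the excitation amplitude of a ground-material model.

In a real-time setting this processing would run on short audio blocks and stream the estimated force to the synthesis engine continuously, rather than being computed offline as in this sketch.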
Original language: English
Journal: IEEE Virtual Reality Annual International Symposium
Volume: 1
Pages (from-to): 147-153
Number of pages: 7
ISSN: 1087-8270
DOI: 10.1109/VR.2010.5444796
Publication status: Published - 2010
Event: IEEE Virtual Reality 2010 - Boston, United States
Duration: 20 Mar 2010 - 26 Mar 2010
Conference number: 17

Conference

Conference: IEEE Virtual Reality 2010
Number: 17
Country: United States
City: Boston
Period: 20/03/2010 - 26/03/2010

Fingerprint

Virtual reality
Acoustic waves
Microphones
Engines
Sensors

Bibliographical note

This paper won the Best Paper Award at the IEEE Virtual Reality 2010 conference.

Keywords

  • Sound Synthesis
  • Physical model
  • Footstep Sound
  • Auditory Perception

Cite this

@inproceedings{307efa703d4511dfb167000ea68e967b,
title = "Sound Synthesis and Evaluation of Interactive Footsteps for Virtual Reality Applications",
abstract = "A system to synthesize in real time the sound of footsteps on different materials is presented. The system is based on microphones, which allow users to interact with their own footwear. This solution distinguishes our system from previous efforts that require specific shoes enhanced with sensors. The microphones detect real footstep sounds from users, from which the ground reaction force (GRF) is estimated. This GRF is used to control a sound synthesis engine based on physical models. Evaluations of the system in terms of sound validity and fidelity of interaction are described.",
keywords = "Sound Synthesis, Physical model, Footstep Sound, Auditory Perception",
author = "Rolf Nordahl and Stefania Serafin and Luca Turchet",
note = "This paper won the {"}Best Paper Award{"} at the conference {"}IEEE Virtual Reality 2010{"}",
year = "2010",
doi = "10.1109/VR.2010.5444796",
language = "English",
volume = "1",
pages = "147--153",
journal = "IEEE Virtual Reality Annual International Symposium",
issn = "1087-8270",
publisher = "IEEE",

}

Sound Synthesis and Evaluation of Interactive Footsteps for Virtual Reality Applications. / Nordahl, Rolf; Serafin, Stefania; Turchet, Luca.

In: IEEE Virtual Reality Annual International Symposium, Vol. 1, 2010, p. 147-153.

Research output: Contribution to journal › Conference article in Journal › Research › peer-review

TY - GEN

T1 - Sound Synthesis and Evaluation of Interactive Footsteps for Virtual Reality Applications

AU - Nordahl, Rolf

AU - Serafin, Stefania

AU - Turchet, Luca

N1 - This paper won the "Best Paper Award" at the conference "IEEE Virtual Reality 2010"

PY - 2010

Y1 - 2010

N2 - A system to synthesize in real time the sound of footsteps on different materials is presented. The system is based on microphones, which allow users to interact with their own footwear. This solution distinguishes our system from previous efforts that require specific shoes enhanced with sensors. The microphones detect real footstep sounds from users, from which the ground reaction force (GRF) is estimated. This GRF is used to control a sound synthesis engine based on physical models. Evaluations of the system in terms of sound validity and fidelity of interaction are described.

AB - A system to synthesize in real time the sound of footsteps on different materials is presented. The system is based on microphones, which allow users to interact with their own footwear. This solution distinguishes our system from previous efforts that require specific shoes enhanced with sensors. The microphones detect real footstep sounds from users, from which the ground reaction force (GRF) is estimated. This GRF is used to control a sound synthesis engine based on physical models. Evaluations of the system in terms of sound validity and fidelity of interaction are described.

KW - Sound Synthesis

KW - Physical model

KW - Footstep Sound

KW - Auditory Perception

U2 - 10.1109/VR.2010.5444796

DO - 10.1109/VR.2010.5444796

M3 - Conference article in Journal

VL - 1

SP - 147

EP - 153

JO - IEEE Virtual Reality Annual International Symposium

JF - IEEE Virtual Reality Annual International Symposium

SN - 1087-8270

ER -