How Well Can Driverless Vehicles Hear? An Introduction to Auditory Perception for Autonomous and Smart Vehicles

Letizia Marchegiani, Xenofon Fafoutis

Research output: Contribution to journal › Journal article › Research › peer-review


Abstract

From sirens to car horns, the urban environment is full of sounds designed to direct drivers' attention to conditions that require special care. Microphone-equipped autonomous vehicles can also use these acoustic cues to increase safety and performance. This article explores auditory perception in the context of autonomous driving and smart vehicles in general, examining the potential of exploiting acoustic cues in driverless vehicle technology. Through a journey across the literature, we discuss various applications of auditory perception in driverless vehicles, ranging from the identification and localization of external acoustic objects to leveraging ego noise for motion estimation and engine fault detection. In addition to solutions already proposed in the literature, we point out directions for further investigation, focusing in particular on parallel studies in acoustics and audio signal processing that show potential for improving the performance of driverless cars.

Original language: English
Journal: IEEE Intelligent Transportation Systems Magazine
Volume: 14
Issue number: 3
Pages (from-to): 92-105
Number of pages: 14
ISSN: 1939-1390
DOIs
Publication status: Published - 2022

Keywords

  • Acoustics
  • Automobiles
  • Autonomous vehicles
  • Hidden Markov models
  • Location awareness
  • Neural networks
  • Sensors

