Sensor Data Fusion

Alfredo Plascencia, Petr Stepán

Research output: Contribution to conference proceedings › Article in proceedings › Research

Abstract

The main contribution of this paper is a sensor fusion approach to scene environment mapping as part of a Sensor Data Fusion (SDF) architecture. The approach combines readings from a sonar array with stereo vision. Sonar readings are interpreted using probability density functions over the occupied and empty regions, while Scale Invariant Feature Transform (SIFT) feature descriptors are interpreted using Gaussian probabilistic error models. Occupancy grids are proposed for representing the sensor readings, and a Bayesian estimation approach is applied to update the uncertainty grids of both the sonar array and the SIFT descriptors. The sensor fusion yields a significant reduction in the uncertainty of the occupancy grid compared to the individual sensor readings.
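The Bayesian occupancy-grid update described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes each sensor's inverse model has already produced a per-cell occupancy probability, and fuses them in log-odds form, the standard way to apply Bayes' rule independently per grid cell.

```python
import numpy as np

def log_odds(p):
    """Convert occupancy probability to log-odds."""
    return np.log(p / (1.0 - p))

def prob(l):
    """Convert log-odds back to occupancy probability."""
    return 1.0 - 1.0 / (1.0 + np.exp(l))

def fuse(prior, *sensor_probs):
    """Bayesian fusion over a grid: add each sensor's log-odds evidence.

    Subtracting log_odds(0.5) removes the uninformative prior baked
    into each sensor's inverse-model probability.
    """
    l = log_odds(prior)
    for p in sensor_probs:
        l = l + log_odds(p) - log_odds(0.5)
    return prob(l)

# Hypothetical 1-D slice of an occupancy grid, prior 0.5 everywhere.
prior = np.full(5, 0.5)
sonar = np.array([0.7, 0.7, 0.5, 0.3, 0.3])  # sonar inverse-model output
sift  = np.array([0.8, 0.6, 0.5, 0.4, 0.2])  # SIFT/stereo inverse-model output
fused = fuse(prior, sonar, sift)
print(np.round(fused, 3))
```

Where both sensors agree a cell is occupied, the fused probability exceeds either individual reading (and likewise drops below both where they agree it is empty), which is the uncertainty reduction the abstract reports.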
Original language: English
Title of host publication: Proceedings of the IEEE Systems, Man and Cybernetics Society
Number of pages: 6
Publisher: Electrical Engineering/Electronics, Computer, Communications and Information Technology Association
Publication date: 2006
Pages: 20-25
Publication status: Published - 2006
Event: Conference on Advances in Cybernetics Systems - Sheffield, United Kingdom
Duration: 7 Sept 2006 - 8 Sept 2006
Conference number: 5

Conference

Conference: Conference on Advances in Cybernetics Systems
Number: 5
Country/Territory: United Kingdom
City: Sheffield
Period: 07/09/2006 - 08/09/2006
