Sensor Fusion - Sonar and Stereo Vision, Using Occupancy Grids and SIFT

Alfredo Plascencia, Jan Dimon Bendtsen

Research output: Contribution to book/anthology/report/conference proceeding › Article in proceeding › Research › peer-review

Abstract

The main contribution of this paper is a sensor fusion approach to scene environment mapping as part of an SDF (Sensor Data Fusion) architecture. The approach combines sonar and stereo vision readings. Sonar readings are interpreted using probability density functions over the occupied and empty regions. SIFT (Scale Invariant Feature Transform) feature descriptors are interpreted using Gaussian probabilistic error models. Occupancy grids are proposed for representing both the sonar readings and the feature descriptor readings. A Bayesian estimation approach is applied to update the sonar and SIFT descriptor uncertainty grids. The sensor fusion yields a significant reduction in the uncertainty of the occupancy grid compared to the individual sensor readings.
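The Bayesian grid update described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation; the per-cell occupancy probabilities for the sonar and stereo/SIFT models below are invented for illustration. The standard log-odds form is used, under which independent sensor readings combine additively:

```python
import numpy as np

# Hypothetical illustration of Bayesian occupancy-grid fusion: each sensor
# supplies P(occupied | reading) per cell; beliefs are accumulated in
# log-odds form so successive readings combine additively.

def logodds(p):
    return np.log(p / (1.0 - p))

def update_grid(grid_logodds, p_occ_given_reading):
    """One Bayesian update of a log-odds occupancy grid with a sensor reading."""
    return grid_logodds + logodds(p_occ_given_reading)

def to_prob(grid_logodds):
    return 1.0 / (1.0 + np.exp(-grid_logodds))

# Uninformative prior P(occupied) = 0.5 everywhere, i.e. log-odds 0.
grid = np.zeros((4, 4))

# Assumed (not from the paper): sonar reports occupancy p = 0.7 in one cell,
# the stereo/SIFT model reports p = 0.8 in the same cell; 0.5 elsewhere.
sonar = np.full((4, 4), 0.5)
sonar[1, 2] = 0.7
stereo = np.full((4, 4), 0.5)
stereo[1, 2] = 0.8

grid = update_grid(grid, sonar)
grid = update_grid(grid, stereo)

p = to_prob(grid)
# Where the sensors agree, the fused belief exceeds either reading alone:
# odds = (0.7/0.3) * (0.8/0.2) = 28/3, so p = 28/31 ≈ 0.903.
print(round(p[1, 2], 3))  # → 0.903
```

The agreeing cell illustrates the paper's headline result in miniature: fusing two uncertain readings sharpens the occupancy estimate beyond what either sensor provides on its own, while uninformative cells remain at the 0.5 prior.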
Original language: English
Title of host publication: Proceedings of the IEEE International Conference on Dynamics, Instrumentation and Control (CDIC'06)
Number of pages: 12
Publication date: 2006
Pages: 303-314
Publication status: Published - 2006
Event: 2006 International Conference on Dynamics, Instrumentation and Control - Queretaro, Mexico
Duration: 13 Aug 2006 – 16 Aug 2006

Conference

Conference: 2006 International Conference on Dynamics, Instrumentation and Control
Country/Territory: Mexico
City: Queretaro
Period: 13/08/2006 – 16/08/2006
