A biologically inspired scale-space for illumination invariant feature detection

Vassilios Vonikakis, Dimitrios Chrysostomou, Rigas Kouskouridas, Antonios Gasteratos

Research output: Contribution to journal › Journal article › Research › peer-review

40 Citations (Scopus)

Abstract

This paper presents a new illumination invariant operator, combining the nonlinear characteristics of biological center-surround cells with the classic difference of Gaussians operator. It specifically targets underexposed image regions, exhibiting increased sensitivity to low contrast while not affecting performance in correctly exposed ones. The proposed operator can be used to create a scale-space, which in turn can form part of a SIFT-based detector module. The main advantage of this illumination invariant scale-space is that, using just one global threshold, keypoints can be detected in both dark and bright image regions. In order to evaluate the degree of illumination invariance that the proposed operator, as well as other existing operators, exhibit, a new benchmark dataset is introduced. It features a greater variety of imaging conditions than existing databases, containing real scenes under various degrees and combinations of uniform and non-uniform illumination. Experimental results show that the proposed detector extracts a greater number of features, with a high level of repeatability, compared to other approaches, for both uniform and non-uniform illumination. This, along with its simple implementation, renders the proposed feature detector particularly appropriate for outdoor vision systems working in environments under uncontrolled illumination conditions.
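The abstract describes a nonlinear, center-surround-modulated difference of Gaussians whose response stays comparable across dark and bright regions, so a single global threshold suffices for keypoint detection. Below is a minimal sketch of that general idea, assuming a shunting-style normalization by the surround response; the specific nonlinearity, constants, and parameter names are illustrative assumptions, not the exact operator defined in the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def nonlinear_dog(image, sigma_center, k=1.6, a=0.1):
    """Center-surround (DoG) response divided by the local surround luminance,
    so low-contrast structure in underexposed regions yields a response
    comparable to well-exposed regions. `a` avoids division by zero and sets
    the strength of the boost (assumed parameter, not from the paper)."""
    img = image.astype(np.float64)
    img = img / (img.max() + 1e-12)                     # normalize to [0, 1]
    center = gaussian_filter(img, sigma_center)         # center Gaussian
    surround = gaussian_filter(img, k * sigma_center)   # surround Gaussian
    dog = center - surround                             # classic DoG band-pass
    return dog / (a + surround)                         # gain grows where the surround is dark

def scale_space(image, sigma0=1.6, levels=5, step=2 ** 0.5):
    """Stack of responses at geometrically spaced scales; keypoints could then be
    taken as local extrema above a single global threshold, as in SIFT."""
    return np.stack([nonlinear_dog(image, sigma0 * step ** i) for i in range(levels)])

# Example usage (threshold value is an assumption for illustration):
# responses = scale_space(gray_image)
# candidates = np.abs(responses) > 0.05   # one global threshold for dark and bright regions
```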
Original language: English
Journal: Measurement Science and Technology
Volume: 24
Issue number: 7
Pages (from-to): 24 - 37
Number of pages: 13
ISSN: 0957-0233
DOIs
Publication status: Published - 12 Feb 2013
Externally published: Yes
