An alternative to scale-space representation for extracting local features in image recognition

Hans Jørgen Andersen, Phuong Giang Nguyen

Publication: Contribution to book/anthology/report/conference proceeding › Conference article in proceeding › Research › peer review

1 Citation (Scopus)
620 Downloads (Pure)

Abstract

In image recognition, the common approach for extracting local features using a scale-space representation usually has three main steps: first, interest points are extracted at different scales; next, for a patch around each interest point the dominant orientation is estimated and the rotation is compensated; finally, a descriptor is computed for the resulting patch (i.e. the feature of the patch). To avoid the memory- and computation-intensive process of constructing the scale-space, we use a method that requires no scale-space. This is done by dividing the given image into a number of triangles whose sizes depend on the image content at the location of each triangle. In this paper, we demonstrate that by rotating the interest regions at the triangles it is possible, in grey-scale images, to achieve a recognition precision comparable to that of MOPS. The proposed method is tested on two data sets of buildings.
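As a rough illustration of the idea of replacing the scale-space pyramid with a content-driven triangulation, the following Python/OpenCV sketch subdivides an image into triangles until each triangle is locally homogeneous, and then computes a rotation-compensated patch descriptor at each triangle centroid. This is a minimal sketch under assumed parameters, not the algorithm from the paper: the filename building.png and the values of var_thresh, min_area, and patch_size are illustrative placeholders, and the subdivision and orientation rules shown here are only stand-ins for the paper's content-dependent triangulation.

```python
# Minimal sketch (not the authors' exact algorithm): content-adaptive
# triangulation followed by rotation-compensated patch descriptors.
# Assumes OpenCV (cv2) and NumPy; thresholds and patch size are illustrative.
import cv2
import numpy as np

def subdivide(tri, img, var_thresh=200.0, min_area=64.0):
    """Recursively split a triangle while its pixel variance stays high."""
    pts = np.array(tri, dtype=np.int32)
    mask = np.zeros(img.shape, dtype=np.uint8)
    cv2.fillConvexPoly(mask, pts, 255)
    vals = img[mask == 255]
    area = cv2.contourArea(pts.astype(np.float32))
    if vals.size == 0 or area < min_area or vals.var() < var_thresh:
        return [tri]  # triangle is homogeneous or small enough: keep it
    a, b, c = [np.array(p, dtype=np.float32) for p in tri]
    ab, bc, ca = (a + b) / 2, (b + c) / 2, (c + a) / 2
    out = []
    for t in [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]:
        out += subdivide([tuple(p) for p in t], img, var_thresh, min_area)
    return out

def describe(img, gx, gy, tri, patch_size=16):
    """Rotation-compensated, normalised intensity patch at the triangle centroid."""
    cx, cy = np.mean(np.array(tri, dtype=np.float32), axis=0)
    x, y = int(round(cx)), int(round(cy))
    angle = float(np.degrees(np.arctan2(gy[y, x], gx[y, x])))  # local orientation
    rot = cv2.getRotationMatrix2D((float(cx), float(cy)), angle, 1.0)
    rotated = cv2.warpAffine(img, rot, (img.shape[1], img.shape[0]))
    patch = cv2.getRectSubPix(rotated, (patch_size, patch_size), (float(cx), float(cy)))
    patch = patch.astype(np.float32)
    return ((patch - patch.mean()) / (patch.std() + 1e-6)).ravel()

img = cv2.imread("building.png", cv2.IMREAD_GRAYSCALE)  # placeholder filename
h, w = img.shape
gx = cv2.Sobel(img, cv2.CV_32F, 1, 0)  # image gradients, computed once
gy = cv2.Sobel(img, cv2.CV_32F, 0, 1)
roots = [[(0, 0), (w - 1, 0), (0, h - 1)],
         [(w - 1, 0), (w - 1, h - 1), (0, h - 1)]]
triangles = [t for root in roots for t in subdivide(root, img)]
descriptors = np.array([describe(img, gx, gy, t) for t in triangles])
```

The resulting descriptors could then be matched between images (for example with nearest-neighbour search), in the same way scale-space descriptors such as MOPS are matched; the point of the sketch is only that the triangle sizes, and hence the interest regions, are driven by local image content rather than by an explicit scale-space pyramid.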
Original language: English
Title: International Conference on Computer Vision Theory and Applications
Editors: Gabriela Csurka, Jose Braz
Number of pages: 5
Volume: 1
Publisher: Institute for Systems and Technologies of Information, Control and Communication
Publication date: 24 Feb 2012
Pages: 341-345
ISBN (Print): 978-989-8565-03-7
Status: Published - 24 Feb 2012
Event: International Conference on Computer Vision Theory and Applications - Rome, Italy
Duration: 24 Feb 2012 → 26 Feb 2012

Conference

Conference: International Conference on Computer Vision Theory and Applications
Country/Territory: Italy
City: Rome
Period: 24/02/2012 → 26/02/2012
