A Deep Neural Network Sensor for Visual Servoing in 3D Spaces

Publication: Contribution to journal › Journal article › Research › peer review

10 Citations (Scopus)
75 Downloads (Pure)

Abstract

This paper describes a novel stereo vision sensor based on deep neural networks that can be used to produce a feedback signal for visual servoing in unmanned aerial vehicles such as drones.
Two deep convolutional neural networks attached to the stereo camera on the drone are trained to detect wind turbines in images, and stereo triangulation is used to calculate the distance from a wind turbine to the drone. Our experimental results show that the sensor produces data accurate enough to be used for servoing, even in the presence of noise generated when the drone is not completely stable. Our results also show that appropriate filtering of the signals is needed and that, to produce correct results, it is important to keep the wind turbine within the field of view of both cameras, so that both deep neural networks can detect it.
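The abstract's core geometric step is standard stereo triangulation: once each network localizes the turbine in the left and right images, the horizontal disparity between the two detections gives the depth. The sketch below is a minimal illustration of that idea, not the authors' implementation; the function names, the assumption of a rectified stereo pair with known focal length and baseline, and the simple moving-average filter (standing in for the "appropriate filtering" the paper mentions) are all hypothetical.

```python
from typing import List, Optional

def stereo_distance(x_left: float, x_right: float,
                    focal_px: float, baseline_m: float) -> Optional[float]:
    """Depth from a rectified stereo pair: Z = f * B / d.

    x_left, x_right: pixel column of the turbine detection in each image
    focal_px: focal length in pixels, baseline_m: camera separation in metres.
    (Illustrative sketch; not the paper's actual pipeline.)
    """
    disparity = x_left - x_right
    if disparity <= 0:
        # The turbine must be detected by both cameras and lie in front of
        # them; otherwise no valid depth can be triangulated.
        return None
    return focal_px * baseline_m / disparity


def filtered_distance(samples: List[float], window: int = 5) -> float:
    """Simple moving-average low-pass filter, a placeholder for the signal
    filtering needed when the drone is not completely stable."""
    recent = samples[-window:]
    return sum(recent) / len(recent)


# Example usage with made-up numbers:
# z = stereo_distance(x_left=640.0, x_right=598.0, focal_px=1000.0, baseline_m=0.3)
```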
Original language: English
Article number: 1437
Journal: Sensors
Volume: 20
Issue number: 5
Number of pages: 18
ISSN: 1424-8220
DOI
Status: Published - Mar 2020
