UAV Visual Servoing Navigation in Sparsely Populated Environments

Research output: Contribution to book/anthology/report/conference proceeding › Article in proceeding › Research › peer-review


Abstract

This paper presents a novel approach based on deep neural networks for autonomous navigation of a quad-copter UAV in a sparsely populated environment. Images from the video camera mounted on the UAV are split into sub-images, emulating a compound eye, and processed by deep neural networks that estimate the probability that each sub-image contains a wind turbine. These probabilities are then used as inputs to a visual servoing system that controls the drone's navigation movements. Our experiments show that the approach produces relatively stable UAV movements, allowing the drone to find and navigate autonomously towards a wind turbine.
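
The abstract outlines a pipeline of image splitting, per-tile deep-network classification, and visual-servoing control. The snippet below is a minimal illustrative sketch of that pipeline, not the authors' implementation: the 3x3 grid, the stub classifier dummy_turbine_probability (standing in for the trained deep network), and the proportional yaw/forward rule in servo_command are all assumptions introduced purely for illustration.

    # Minimal sketch of the split-classify-servo pipeline described in the abstract.
    # Grid size, classifier, and control gains are assumptions, not the paper's values.
    import numpy as np

    def split_into_tiles(frame: np.ndarray, rows: int = 3, cols: int = 3):
        """Split an HxWxC frame into rows*cols tiles, emulating a compound eye."""
        h, w = frame.shape[:2]
        th, tw = h // rows, w // cols
        return [frame[r*th:(r+1)*th, c*tw:(c+1)*tw]
                for r in range(rows) for c in range(cols)]

    def dummy_turbine_probability(tile: np.ndarray) -> float:
        """Stand-in for the deep-network classifier: P(tile contains a wind turbine).
        Replaced by a stub so the sketch runs without a trained model."""
        return float(tile.mean() / 255.0)

    def servo_command(probs, cols: int = 3, k_yaw: float = 0.5, k_fwd: float = 0.3):
        """Map per-tile probabilities to a (yaw_rate, forward_speed) command:
        yaw toward the column with the strongest response, advance when confident."""
        grid = np.asarray(probs).reshape(-1, cols)
        col_scores = grid.sum(axis=0)
        # Horizontal offset of the best column from the image centre, in [-1, 1].
        offset = (np.argmax(col_scores) - (cols - 1) / 2) / ((cols - 1) / 2)
        confidence = float(grid.max())
        return k_yaw * offset, k_fwd * confidence

    if __name__ == "__main__":
        frame = (np.random.rand(480, 640, 3) * 255).astype(np.uint8)  # stand-in camera frame
        tiles = split_into_tiles(frame)
        probs = [dummy_turbine_probability(t) for t in tiles]
        yaw, fwd = servo_command(probs)
        print(f"yaw_rate={yaw:+.2f}, forward_speed={fwd:.2f}")

In the paper the per-tile probabilities come from trained deep networks rather than a stub, and the servoing law governs the actual quad-copter controller; the sketch only shows how tile-wise detection scores can be reduced to a steering command.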
Original language: English
Title of host publication: 15th European Workshop on Advanced Control and Diagnosis
Number of pages: 15
Publisher: Springer
Publication date: 17 Jun 2022
Pages: 1257-1272
ISBN (Electronic): 978-3-030-85318-1
Publication status: Published - 17 Jun 2022
Event: 15th European Workshop on Advanced Control and Diagnosis - Bologna, Italy
Duration: 21 Nov 2019 - 22 Nov 2019
https://eventi.unibo.it/acd2019

Conference

Conference: 15th European Workshop on Advanced Control and Diagnosis
Country/Territory: Italy
City: Bologna
Period: 21/11/2019 - 22/11/2019
Internet address: https://eventi.unibo.it/acd2019
Series: Lecture Notes in Control and Information Sciences - Proceedings
ISSN: 2522-5383

Keywords

  • Quad-copter UAV
  • Control
  • Visual Servoing
  • Compound Eye
  • Artificial Intelligence
  • Vision
  • Machine Learning
  • Robotics
  • Deep Learning
