Affordance segmentation using tiny networks for sensing systems in wearable robotic devices

Edoardo Ragusa, Strahinja Dosen, Rodolfo Zunino, Paolo Gastaldo

Publication: Contribution to journal › Journal article › Research › peer review


Abstract

Affordance segmentation is used to split object images into parts according to the possible interactions, usually to drive safe robotic grasping. Most approaches to affordance segmentation are computationally demanding; this hinders their integration into wearable robots, whose compact structure typically offers limited processing power. This article describes a design strategy for tiny deep neural networks (DNNs) that can accomplish affordance segmentation and deploy effectively on microcontroller-like processing units. This is attained through a specialized, hardware-aware neural architecture search (HW-NAS). The method was validated by assessing the performance of several tiny networks, at different levels of complexity, on three benchmark datasets. The outcome measure was the accuracy of the generated affordance maps and the associated spatial object descriptors (orientation, center of mass, and size). The experimental results confirmed that the proposed method compared satisfactorily with state-of-the-art approaches while allowing a considerable reduction in both network complexity and inference time. The proposed networks can, therefore, support the development of a teleceptive sensing system to improve the semiautomatic control of wearable robots for assisting grasping.
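The abstract evaluates the generated affordance maps through spatial object descriptors (orientation, center of mass, and size). As a minimal sketch of how such descriptors can be derived from a binary affordance mask, the snippet below uses standard image moments: the center of mass is the mean pixel coordinate, the orientation follows from the second-order central moments, and the size is the pixel count. The function name `spatial_descriptors` and these moment-based definitions are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def spatial_descriptors(mask: np.ndarray):
    """Derive simple spatial descriptors (center of mass, orientation, size)
    from a binary affordance mask. Illustrative only; the paper's exact
    descriptor definitions may differ."""
    ys, xs = np.nonzero(mask)           # pixel coordinates of the affordance region
    if xs.size == 0:
        return None                     # empty mask: no descriptors

    size = xs.size                      # area in pixels
    cx, cy = xs.mean(), ys.mean()       # center of mass

    # Second-order central moments give the dominant orientation of the region.
    mu20 = ((xs - cx) ** 2).mean()
    mu02 = ((ys - cy) ** 2).mean()
    mu11 = ((xs - cx) * (ys - cy)).mean()
    orientation = 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)  # radians

    return {"center": (cx, cy), "orientation": orientation, "size": size}

# Example: a synthetic, vertically elongated mask.
mask = np.zeros((64, 64), dtype=np.uint8)
mask[20:44, 28:36] = 1
print(spatial_descriptors(mask))
```

The moment-based orientation is equivalent to the principal axis of the pixel-coordinate covariance, which is a common, lightweight choice when such descriptors must be computed on a microcontroller-class device.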

Original language: English
Article number: 10235885
Journal: IEEE Sensors Journal
Volume: 23
Issue number: 19
Pages (from-to): 23916-23926
Number of pages: 11
ISSN: 2379-9153
DOI
Status: Published - 1 Oct. 2023

