Affordance segmentation using tiny networks for sensing systems in wearable robotic devices

Edoardo Ragusa, Strahinja Dosen, Rodolfo Zunino, Paolo Gastaldo

Research output: Contribution to journal › Journal article › Research › peer-review


Abstract

Affordance segmentation is used to split object images into parts according to the possible interactions, usually to drive safe robotic grasping. Most approaches to affordance segmentation are computationally demanding; this hinders their integration into wearable robots, whose compact structure typically offers limited processing power. This article describes a design strategy for tiny deep neural networks (DNNs) that can accomplish affordance segmentation and deploy effectively on microcontroller-like processing units. This is attained by specialized, hardware-aware neural architecture search (HW-NAS). The method was validated by assessing the performance of several tiny networks, at different levels of complexity, on three benchmark datasets. The outcome measure was the accuracy of the generated affordance maps and the associated spatial object descriptors (orientation, center of mass, and size). The experimental results confirmed that the proposed method compared satisfactorily with state-of-the-art approaches while allowing a considerable reduction in both network complexity and inference time. The proposed networks can, therefore, support the development of a teleceptive sensing system to improve the semiautomatic control of wearable robots for assisting grasping.
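The abstract mentions deriving spatial object descriptors (orientation, center of mass, and size) from the generated affordance maps. As an illustration only, and not the paper's actual method, such descriptors can be computed from a binary affordance mask with standard image moments; the function and example below are a minimal sketch, assuming a NumPy boolean mask where `True` marks pixels of one affordance class.

```python
import numpy as np

def mask_descriptors(mask):
    """Compute spatial descriptors from a binary affordance mask.

    Returns ((cx, cy), orientation_rad, size_px). These are textbook
    image-moment definitions, given for illustration; the paper may
    define its descriptors differently.
    """
    ys, xs = np.nonzero(mask)
    size = xs.size                   # region area in pixels
    cx, cy = xs.mean(), ys.mean()    # center of mass (pixel coordinates)
    # Orientation of the principal axis from second-order central moments.
    mu20 = ((xs - cx) ** 2).mean()
    mu02 = ((ys - cy) ** 2).mean()
    mu11 = ((xs - cx) * (ys - cy)).mean()
    theta = 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)
    return (cx, cy), theta, size

# Example: a 3x5 horizontal bar of "graspable" pixels in a 10x10 map.
mask = np.zeros((10, 10), dtype=bool)
mask[4:7, 2:7] = True
(cx, cy), theta, size = mask_descriptors(mask)
# Horizontal bar: center (4.0, 5.0), orientation 0 rad, area 15 px.
```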

Original language: English
Article number: 10235885
Journal: IEEE Sensors Journal
Volume: 23
Issue number: 19
Pages (from-to): 23916-23926
Number of pages: 11
ISSN: 2379-9153
DOIs
Publication status: Published - 1 Oct 2023

Keywords

  • Affordance segmentation
  • embedded systems
  • grasping
  • microcontrollers
  • tiny convolutional neural networks (CNNs)
  • wearable robots
