Analysis of Facial Features for Trust Evaluation in Industrial Human-Robot Collaboration

Abstract

The advent of Industry 5.0 marks a significant transition towards a collaborative partnership between humans and robots, leveraging their respective capabilities to enhance the manufacturing process. This increased cooperation necessitates a secure environment and, in this context, trust becomes a pivotal factor influencing the quality of human-robot interactions. To ensure safety and workload balance, a reliable and timely measure of trust in robots is essential. This study explores the use of facial features to identify potential correlations with human trust levels. To this end, a chemical industry scenario was developed in which a cobot assisted the human operator by handing over a beaker and pouring chemicals. The analysis employed Deep Learning models, specifically Convolutional Neural Networks (CNNs), to model the relationship between facial expressions and trust levels. The investigation yielded an accuracy of 78.61% for the handover task and 73.35% for the pouring task. Nevertheless, the findings highlight the importance of implementing sensor fusion algorithms to improve the accuracy and robustness of trust evaluation towards robots.
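As a rough illustration of the kind of model the abstract describes, the sketch below shows a minimal CNN classifier that maps cropped face images to discrete trust levels. The framework (PyTorch), architecture, 96x96 input resolution, and binary low/high trust labelling are assumptions made for illustration only; the record does not report these details.

```python
# Minimal sketch (not the authors' code): a small CNN mapping face crops to a
# binary trust label. Layer sizes and input resolution are illustrative assumptions.
import torch
import torch.nn as nn


class TrustCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: batch of face crops, shape (N, 3, 96, 96) under the assumed input size
        return self.classifier(self.features(x).flatten(1))


if __name__ == "__main__":
    model = TrustCNN()
    dummy_faces = torch.randn(8, 3, 96, 96)  # random stand-in for cropped face images
    logits = model(dummy_faces)              # (8, 2) trust-class scores
    print(logits.shape)
```

In practice such a classifier would be trained separately per task (handover, pouring) or with a task indicator as an extra input, and its outputs could later be fused with other sensor modalities, in line with the sensor fusion direction the abstract points to.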
Original language: English
Title of host publication: 20th IEEE International Conference on Advanced Robotics and Its Social Impacts (ARSO 2024)
Publisher: IEEE
Publication status: Accepted/In press - 2024
Event: 20th IEEE International Conference on Advanced Robotics and Its Social Impacts (ARSO 2024) - Hong Kong, China
Duration: 20 May 2024 - 22 May 2024

Conference

Conference: 20th IEEE International Conference on Advanced Robotics and Its Social Impacts (ARSO 2024)
Country/Territory: China
City: Hong Kong
Period: 20/05/2024 - 22/05/2024

Keywords

  • Human-Robot Collaboration
  • Human-Robot Interaction
  • Trust in Human-Robot Collaboration
