TY - GEN
T1 - Explainable robotics in human-robot interactions
AU - Setchi, Rossitza
AU - Dehkordi, Maryam Banitalebi
AU - Siraj Khan, Juwairiya
PY - 2020
Y1 - 2020
AB - This paper introduces a new research area called Explainable Robotics, which studies explainability in the context of human-robot interactions. The focus is on developing novel computational models, methods and algorithms for generating explanations that allow robots to operate at different levels of autonomy and communicate with humans in a trustworthy and human-friendly way. Individuals may need explanations during human-robot interactions for different reasons, which depend heavily on the context and the human users involved. The research challenge is therefore identifying what needs to be explained at each level of autonomy and how it should be explained to different individuals. The paper presents the case for Explainable Robotics using a scenario involving technology-assisted medical care for elderly patients with dementia. The paper highlights the main research challenges of Explainable Robotics. The first challenge is the need for new algorithms for generating explanations that use past experiences, analogies and real-time data to adapt to particular audiences and purposes. The second challenge is developing novel computational models of situational and learned trust, together with new algorithms for the real-time sensing of trust. Finally, more research is needed to understand whether trust can be used as a control variable in Explainable Robotics.
KW - AI
KW - Explainable AI
KW - Explainable robotics
KW - Explanation
KW - Reasoning
KW - Robotics
U2 - 10.1016/j.procs.2020.09.198
DO - 10.1016/j.procs.2020.09.198
M3 - Conference article in Journal
SN - 1877-0509
VL - 176
SP - 3057
EP - 3066
JO - Procedia Computer Science
JF - Procedia Computer Science
ER -