Recognition and Prediction of Human Actions for Safe Human-Robot Collaboration

Publication: Conference contribution without publisher/journal › Conference abstract for conference › Research

Abstract

Collaborative industrial robots are creating new opportunities for collaboration between humans and robots in shared workspaces. For such collaboration to be efficient, robots - as well as humans - need to understand the other's intentions and currently ongoing actions. In this work, we propose a method for learning, classifying, and predicting actions taken by a human. Our proposed method is based on the human skeleton model from a Kinect. To demonstrate our approach, we chose a typical pick-and-place scenario; therefore, only the arms and upper body are considered, 15 joints in total. Based on the trajectories of these joints, different classes of motion are separated using partitioning around medoids (PAM). Subsequently, an SVM is trained on these classes to form a library of human motions. The approach allows run-time detection of when a new motion has been initiated, early prediction of the motion class based on the first fraction of the movement, and classification of the motion when it has ended. Additionally, the approach makes it possible to expand the library of human motions at run-time by detecting previously unknown motion types.
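
The pipeline described in the abstract maps fairly directly onto off-the-shelf tools. Below is a minimal, hypothetical Python sketch of that idea: PAM (k-medoids) clustering of flattened, fixed-length joint trajectories followed by SVM training on the discovered classes. The data layout, cluster count, zero-padding strategy for early prediction, and library choices (scikit-learn-extra's KMedoids, scikit-learn's SVC) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn_extra.cluster import KMedoids  # PAM-style k-medoids (scikit-learn-extra)
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder data: 200 recorded motions, each resampled to 30 frames of
# 15 joints with (x, y, z) coordinates and flattened to one feature vector.
# Real input would come from Kinect skeleton tracking.
n_motions, n_frames, n_joints = 200, 30, 15
X = rng.normal(size=(n_motions, n_frames * n_joints * 3))

# 1) Separate motion classes without labels using PAM (k-medoids).
#    The cluster count of 4 is an arbitrary assumption for this sketch.
pam = KMedoids(n_clusters=4, method="pam", random_state=0).fit(X)
labels = pam.labels_

# 2) Train an SVM on the discovered classes to form the motion library.
svm = SVC(kernel="rbf", probability=True).fit(X, labels)

# 3) Early prediction: classify from the first third of a new movement by
#    zero-padding the unobserved remainder (one simple choice among many).
partial = rng.normal(size=(n_frames // 3) * n_joints * 3)
padded = np.zeros(X.shape[1])
padded[: partial.size] = partial
print("Predicted motion class:", svm.predict(padded[None, :])[0])
```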
Original language: English
Publication date: 2017
Status: Published - 2017
Keywords: Robot vision, Machine Learning
Event: 6th Aalborg U Robotics Workshop - Aalborg, Denmark
Duration: 27 Nov 2017 → 27 Nov 2017

Conference

Conference: 6th Aalborg U Robotics Workshop
Country: Denmark
City: Aalborg
Period: 27/11/2017 → 27/11/2017

Fingerprint

Robots
Industrial robots
Demonstrations
Trajectories

Cite this

Andersen, R. S., Bøgh, S., & Ceballos, I. (2017). Recognition and Prediction of Human Actions for Safe Human-Robot Collaboration. Abstract from 6th Aalborg U Robotics Workshop, Aalborg, Denmark.