Abstract
Delegating tasks from a human to a robot requires an efficient and easy-to-use communication pipeline between them, especially when inexperienced users are involved. This work presents a robotic system that bridges this communication gap by exploiting 3D sensing for gesture recognition and real-time object segmentation. We visually extract an unknown object indicated by a human through a pointing gesture, thereby communicating the object of interest to the robot so that it can perform a certain task. The robot uses RGB-D sensors to observe the human and find the 3D point indicated by the pointing gesture. This point is used to initialize a fast, fixation-based object segmentation algorithm, thus inferring the outline of the whole object. A series of experiments with different objects and pointing gestures shows that the recognition of the gesture, the extraction of the pointing direction in 3D, and the object segmentation all perform robustly. The discussed system can provide the first step towards more complex tasks, such as object recognition, grasping, or learning by demonstration, with obvious value in both industrial and domestic settings.
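The core geometric step described above, recovering the 3D point indicated by a pointing gesture from an RGB-D point cloud, can be illustrated with a minimal sketch. This is not the paper's implementation; it assumes the pointing ray is already estimated (e.g. from hand and elbow positions) and simply selects the cloud point closest to that ray, which could then seed the fixation-based segmentation:

```python
import numpy as np

def indicated_point(cloud, origin, direction):
    """Return the point in `cloud` closest to the pointing ray.

    cloud:     (N, 3) array of 3D points from the RGB-D sensor.
    origin:    3D start of the pointing ray (e.g. the hand position).
    direction: 3D pointing direction (need not be normalized).
    """
    d = direction / np.linalg.norm(direction)
    v = cloud - origin                 # vectors from the hand to each point
    t = v @ d                          # signed projection onto the ray
    t = np.maximum(t, 0.0)             # ignore points behind the hand
    perp = v - np.outer(t, d)          # offset perpendicular to the ray
    dist = np.linalg.norm(perp, axis=1)
    return cloud[np.argmin(dist)]
```

In practice one would restrict the search to points within a distance threshold of the ray and use the winner as the fixation point for segmentation; the nearest-to-ray rule here is only the simplest reasonable choice.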
Original language | English |
---|---|
Title | Advances in Autonomous Robotics Systems : 15th Annual Conference, TAROS 2014, Birmingham, UK, September 1-3, 2014. Proceedings |
Number of pages | 12 |
Publisher | Springer |
Publication date | 1 Jan. 2014 |
Pages | 209-220 |
ISBN (Print) | 978-3-319-10400-3 |
ISBN (Electronic) | 978-3-319-10401-0 |
DOI | |
Status | Published - 1 Jan. 2014 |
Event | Towards Autonomous Robotic Systems - Birmingham, United Kingdom. Duration: 1 Sep. 2014 → 3 Sep. 2014. Conference number: 15 |
Conference
Conference | Towards Autonomous Robotic Systems |
---|---|
Number | 15 |
Country/Territory | United Kingdom |
City | Birmingham |
Period | 01/09/2014 → 03/09/2014 |
Name | Lecture Notes in Computer Science |
---|---|
Volume | 8717 |
ISSN | 0302-9743 |