Abstract
One challenge in robotics is to develop motion planning tools that
enable mobile robots to move safely and predictably in public
spaces with people. Navigating public spaces requires a high
degree of information about context and environment, which can
be partly identified by understanding how people move. Motivated
by research in dance and robotics, we developed an exploratory
study of improvisational movement for dance performance. Dance
forms have recognizable styles and specific interaction patterns
(turn-taking, tempo changes, etc.) that reveal important information
about behavior and context. Following extensive iterations with
expert dancers, we developed a sequence of basic motion
algorithms based on improvisation exercises to generate three
unique, original performances between a robot and human
performers trained in various dance styles. We developed a novel
method for tracking dancers in real time using inputs to generate
choreography for non-anthropomorphic robots. Although the
motion algorithms were identical, the individual dancers generated
vastly different performances and elicited unexpected motions and
choreographies from the robot. We summarize our study and
identify some challenges of devising performances between robots
and humans, and outline future work to experiment with more
advanced algorithms.
Original language | English |
---|---|
Title | MOCO '19: Proceedings of the 6th International Conference on Movement and Computing |
Number of pages | 11 |
Publisher | Association for Computing Machinery |
Publication date | Oct. 2019 |
Article number | 7 |
ISBN (electronic) | 978-1-4503-7654-9 |
DOI | |
Status | Published - Oct. 2019 |
Event | 6th International Conference on Movement and Computing, Arizona State University, Tempe, USA. Duration: 10 Oct. 2019 → 12 Oct. 2019. https://www.movementcomputing.org/ |
Conference

Conference | 6th International Conference on Movement and Computing |
---|---|
Location | Arizona State University |
Country/Territory | USA |
City | Tempe |
Period | 10/10/2019 → 12/10/2019 |
Internet address |
Fingerprint

Dive into the research topics of 'Tonight We Improvise! Real-time tracking for human-robot improvisational dance'. Together they form a unique fingerprint.
Projects

- 1 Ongoing
- RAPP Lab: Robots, Art, People and Performance
  Jochum, E., Heath, D. & Vlachos, E.
  01/11/2016 → …
  Projects: Project › Research