Tonight We Improvise! Real-time tracking for human-robot improvisational dance

Elizabeth Jochum, Jeroen Derks

Publication: Contribution to book/anthology/report/conference proceeding › Conference article in proceeding › Research › peer review

2 Citations (Scopus)


One challenge in robotics is to develop motion planning tools that
enable mobile robots to move safely and predictably in public
spaces with people. Navigating public spaces requires a high
degree of information about context and environment, which can
be partly identified by understanding how people move. Motivated
by research in dance and robotics, we developed an exploratory
study of improvisational movement for dance performance. Dance
forms have recognizable styles and specific interaction patterns
(turn-taking, tempo changes, etc.) that reveal important information
about behavior and context. Following extensive iterations with
expert dancers, we developed a sequence of basic motion
algorithms based on improvisation exercises to generate three
unique, original performances between a robot and human
performers trained in various dance styles. We developed a novel
method for tracking dancers in real time using inputs to generate
choreography for non-anthropomorphic robots. Although the
motion algorithms were identical, the individual dancers generated
vastly different performances and elicited unexpected motions and
choreographies from the robot. We summarize our study and
identify some challenges of devising performances between robots
and humans, and outline future work to experiment with more
advanced algorithms.
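The abstract describes mapping real-time tracking of a dancer to generated choreography for a non-anthropomorphic robot. As an illustrative sketch only (not the authors' implementation; the mirroring rule, `mirror_gain`, and `max_speed` are all assumptions), one simple improvisation rule would have the robot "answer" a tracked dancer position by moving toward a mirrored point, with speed clamped for safe, predictable motion in shared space:

```python
# Hypothetical sketch: turn a tracked 2D dancer position into a robot
# velocity command. Names and parameters are illustrative assumptions,
# not taken from the paper.
from dataclasses import dataclass


@dataclass
class Pose2D:
    x: float
    y: float


def mirror_command(dancer: Pose2D, robot: Pose2D,
                   mirror_gain: float = 0.5,
                   max_speed: float = 1.0) -> tuple[float, float]:
    """Drive the robot toward the dancer's position mirrored across the
    stage origin -- a simple call-and-response improvisation rule."""
    target = Pose2D(-dancer.x * mirror_gain, -dancer.y * mirror_gain)
    vx, vy = target.x - robot.x, target.y - robot.y
    speed = (vx ** 2 + vy ** 2) ** 0.5
    if speed > max_speed:  # clamp so the robot moves safely and predictably
        vx, vy = vx * max_speed / speed, vy * max_speed / speed
    return vx, vy
```

Because the command depends on both the live tracking input and the robot's own state, identical rules can yield very different trajectories for different dancers, consistent with the variation the study reports.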
Title: MOCO '19: Proceedings of the 6th International Conference on Movement and Computing
Number of pages: 11
Publisher: Association for Computing Machinery
Publication date: Oct 2019
ISBN (electronic): 978-1-4503-7654-9
Status: Published - Oct 2019
Event: 6th International Conference on Movement and Computing - Arizona State University, Tempe, USA
Duration: 10 Oct 2019 - 12 Oct 2019



