Tonight We Improvise! Real-time tracking for human-robot improvisational dance

Elizabeth Jochum, Jeroen Derks

Research output: Contribution to book/anthology/report/conference proceeding › Article in proceeding › Research › peer-review

3 Citations (Scopus)


One challenge in robotics is to develop motion planning tools that
enable mobile robots to move safely and predictably in public
spaces with people. Navigating public spaces requires a high
degree of information about context and environment, which can
be partly identified by understanding how people move. Motivated
by research in dance and robotics, we developed an exploratory
study of improvisational movement for dance performance. Dance
forms have recognizable styles and specific interaction patterns
(turn-taking, tempo changes, etc.) that reveal important information
about behavior and context. Following extensive iterations with
expert dancers, we developed a sequence of basic motion
algorithms based on improvisation exercises to generate three
unique, original performances between a robot and human
performers trained in various dance styles. We developed a novel
method for tracking dancers in real time using inputs to generate
choreography for non-anthropomorphic robots. Although the
motion algorithms were identical, the individual dancers generated
vastly different performances and elicited unexpected motions and
choreographies from the robot. We summarize our study and
identify some challenges of devising performances between robots
and humans, and outline future work to experiment with more
advanced algorithms.
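
The paper does not publish its motion algorithms, but the idea of turning real-time dancer tracking into choreography for a non-anthropomorphic robot can be sketched with a simple mirroring rule. Everything below is illustrative: the function name, the stage-centered mirroring rule, and the `gain`/`max_speed` parameters are assumptions, not the authors' method.

```python
import math

def mirror_step(dancer_xy, robot_xy, gain=0.5, max_speed=0.3):
    """Map a tracked dancer position to a robot velocity command.

    Mirrors the dancer's position across the stage center (0, 0),
    a stand-in for one improvisation rule. All names and parameters
    here are hypothetical, not taken from the paper.
    """
    # The robot's goal is the dancer's position reflected through the origin.
    target = (-dancer_xy[0], -dancer_xy[1])
    dx = target[0] - robot_xy[0]
    dy = target[1] - robot_xy[1]
    dist = math.hypot(dx, dy)
    if dist < 1e-6:
        # Already at the mirrored position: stay still.
        return (0.0, 0.0)
    # Proportional control, capped so the robot moves predictably near people.
    speed = min(gain * dist, max_speed)
    return (speed * dx / dist, speed * dy / dist)
```

Because the command depends only on the live tracked position, two dancers running the identical rule would produce different robot trajectories, which matches the paper's observation that the same algorithms yielded vastly different performances.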
Original language: English
Title of host publication: MOCO '19: Proceedings of the 6th International Conference on Movement and Computing
Number of pages: 11
Publisher: Association for Computing Machinery
Publication date: Oct 2019
Article number: 7
ISBN (Electronic): 978-1-4503-7654-9
Publication status: Published - Oct 2019
Event: 6th International Conference on Movement and Computing - Arizona State University, Tempe, United States
Duration: 10 Oct 2019 - 12 Oct 2019


Conference: 6th International Conference on Movement and Computing
Location: Arizona State University
Country: United States


  • Human-robot interaction
  • Dance
  • Improvisation


