Tonight We Improvise! Real-time tracking for human-robot improvisational dance

Elizabeth Jochum, Jeroen Derks

Publication: Contribution to book/anthology/report/conference proceeding › Conference article in proceeding › Research › peer review

Abstract

One challenge in robotics is to develop motion planning tools that enable mobile robots to move safely and predictably in public spaces with people. Navigating public spaces requires a high degree of information about context and environment, which can be partly identified by understanding how people move. Motivated by research in dance and robotics, we developed an exploratory study of improvisational movement for dance performance. Dance forms have recognizable styles and specific interaction patterns (turn-taking, tempo changes, etc.) that reveal important information about behavior and context. Following extensive iterations with expert dancers, we developed a sequence of basic motion algorithms based on improvisation exercises to generate three unique, original performances between a robot and human performers trained in various dance styles. We developed a novel method for tracking dancers in real time using inputs to generate choreography for non-anthropomorphic robots. Although the motion algorithms were identical, the individual dancers generated vastly different performances and elicited unexpected motions and choreographies from the robot. We summarize our study and identify some challenges of devising performances between robots and humans, and outline future work to experiment with more advanced algorithms.
Original language: English
Title: International Conference on Movement and Computing (MOCO) 2019
Status: Submitted - 2019

Fingerprint

Robots
Robotics
Motion planning
Mobile robots
Experiments

Cite this

Jochum, E., & Derks, J. (2019). Tonight We Improvise! Real-time tracking for human-robot improvisational dance. Manuscript submitted for publication. In International Conference on Movement and Computing (MOCO) 2019.
Jochum, Elizabeth ; Derks, Jeroen. / Tonight We Improvise! Real-time tracking for human-robot improvisational dance. International Conference on Movement and Computing (MOCO) 2019. 2019.
@inproceedings{04715b8c90b74d9da9bf2a1e91c79784,
title = "Tonight We Improvise! Real-time tracking for human-robot improvisational dance",
abstract = "One challenge in robotics is to develop motion planning tools that enable mobile robots to move safely and predictably in public spaces with people. Navigating public spaces requires a high degree of information about context and environment, which can be partly identified by understanding how people move. Motivated by research in dance and robotics, we developed an exploratory study of improvisational movement for dance performance. Dance forms have recognizable styles and specific interaction patterns (turn-taking, tempo changes, etc.) that reveal important information about behavior and context. Following extensive iterations with expert dancers, we developed a sequence of basic motion algorithms based on improvisation exercises to generate three unique, original performances between a robot and human performers trained in various dance styles. We developed a novel method for tracking dancers in real time using inputs to generate choreography for non-anthropomorphic robots. Although the motion algorithms were identical, the individual dancers generated vastly different performances and elicited unexpected motions and choreographies from the robot. We summarize our study and identify some challenges of devising performances between robots and humans, and outline future work to experiment with more advanced algorithms.",
keywords = "human robot interaction, Dance, Improvisation",
author = "Elizabeth Jochum and Jeroen Derks",
year = "2019",
language = "English",
booktitle = "International Conference on Movement and Computing (MOCO) 2019",

}

Jochum, E & Derks, J 2019, Tonight We Improvise! Real-time tracking for human-robot improvisational dance. in International Conference on Movement and Computing (MOCO) 2019.

Tonight We Improvise! Real-time tracking for human-robot improvisational dance. / Jochum, Elizabeth; Derks, Jeroen.

International Conference on Movement and Computing (MOCO) 2019. 2019.


TY - GEN

T1 - Tonight We Improvise! Real-time tracking for human-robot improvisational dance

AU - Jochum, Elizabeth

AU - Derks, Jeroen

PY - 2019

Y1 - 2019

N2 - One challenge in robotics is to develop motion planning tools that enable mobile robots to move safely and predictably in public spaces with people. Navigating public spaces requires a high degree of information about context and environment, which can be partly identified by understanding how people move. Motivated by research in dance and robotics, we developed an exploratory study of improvisational movement for dance performance. Dance forms have recognizable styles and specific interaction patterns (turn-taking, tempo changes, etc.) that reveal important information about behavior and context. Following extensive iterations with expert dancers, we developed a sequence of basic motion algorithms based on improvisation exercises to generate three unique, original performances between a robot and human performers trained in various dance styles. We developed a novel method for tracking dancers in real time using inputs to generate choreography for non-anthropomorphic robots. Although the motion algorithms were identical, the individual dancers generated vastly different performances and elicited unexpected motions and choreographies from the robot. We summarize our study and identify some challenges of devising performances between robots and humans, and outline future work to experiment with more advanced algorithms.

AB - One challenge in robotics is to develop motion planning tools that enable mobile robots to move safely and predictably in public spaces with people. Navigating public spaces requires a high degree of information about context and environment, which can be partly identified by understanding how people move. Motivated by research in dance and robotics, we developed an exploratory study of improvisational movement for dance performance. Dance forms have recognizable styles and specific interaction patterns (turn-taking, tempo changes, etc.) that reveal important information about behavior and context. Following extensive iterations with expert dancers, we developed a sequence of basic motion algorithms based on improvisation exercises to generate three unique, original performances between a robot and human performers trained in various dance styles. We developed a novel method for tracking dancers in real time using inputs to generate choreography for non-anthropomorphic robots. Although the motion algorithms were identical, the individual dancers generated vastly different performances and elicited unexpected motions and choreographies from the robot. We summarize our study and identify some challenges of devising performances between robots and humans, and outline future work to experiment with more advanced algorithms.

KW - human robot interaction

KW - Dance

KW - Improvisation

M3 - Article in proceeding

BT - International Conference on Movement and Computing (MOCO) 2019

ER -

Jochum E, Derks J. Tonight We Improvise! Real-time tracking for human-robot improvisational dance. In International Conference on Movement and Computing (MOCO) 2019. 2019.