Multimodal Detection of Music Performances for Intelligent Emotion Based Lighting

Esben Oxholm Skjødt Bonde, Ellen Kathrine Hansen, Georgios Triantafyllidis

Research output: Contribution to book/anthology/report/conference proceeding › Article in proceeding › Research › peer-review

Abstract

Playing music is about conveying emotions, and the lighting at a concert can help do that. However, new and unknown bands that play at smaller venues, and bands without the budget to hire a dedicated light technician, miss out on lighting that would help them convey the emotions of what they play. This paper investigates whether it is possible to develop an intelligent system that, through multimodal input, detects the intended emotions of the music being played and adjusts the lighting accordingly in real time. A concept for such an intelligent lighting system is developed and described. Based on existing research on music and emotion, as well as on the body movements musicians use to convey emotion, a set of cues is defined: amount, speed, fluency and regularity for the visual modality, and level, tempo, articulation and timbre for the auditory modality. Using a microphone and a Kinect camera to capture these cues, the system detects the intended emotion of what is being played. Specific lighting designs are then developed to support specific emotions, and the system can switch between and alter the lighting designs based on the incoming cues. The results suggest that the intelligent emotion-based lighting system has an advantage over purely beat-synced lighting, and it is concluded that the idea is worth exploring further.
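
To make the described pipeline more concrete, the sketch below shows, in Python, one way such a system could map normalized auditory and visual cues to an intended emotion and then to a lighting design. This is an illustrative sketch only, not the authors' implementation: the emotion labels (happy, angry, sad, tender), the thresholds, and the lighting parameters are assumptions chosen for demonstration, and a real system would extract the cue values from the microphone and Kinect streams rather than receive them as numbers.

# Illustrative sketch (not the authors' code): mapping normalized performance
# cues to an intended emotion and a matching lighting design. Cue names follow
# the abstract; emotion labels, thresholds and lighting values are assumptions.

from dataclasses import dataclass

@dataclass
class Cues:
    # Auditory cues (0.0 = low, 1.0 = high), e.g. derived from a microphone signal.
    level: float
    tempo: float
    articulation: float   # staccato (high) vs. legato (low)
    timbre: float         # bright (high) vs. soft (low)
    # Visual cues (0.0 = low, 1.0 = high), e.g. derived from Kinect skeleton data.
    amount: float          # amount of body movement
    speed: float
    fluency: float
    regularity: float

def detect_emotion(c: Cues) -> str:
    """Simple rule-based classifier over the eight cues: high-energy, fluent
    performances map to 'happy', high-energy jerky ones to 'angry', and
    low-energy performances to 'tender' or 'sad'."""
    energy = (c.level + c.tempo + c.amount + c.speed) / 4.0
    smoothness = (c.fluency + (1.0 - c.articulation)) / 2.0
    if energy > 0.5:
        return "happy" if smoothness > 0.5 else "angry"
    return "tender" if smoothness > 0.5 else "sad"

# Hypothetical lighting designs: colour in RGB, overall intensity,
# and how quickly the design is allowed to change.
LIGHTING_DESIGNS = {
    "happy":  {"colour": (255, 200, 80),  "intensity": 0.9, "change_rate": "fast"},
    "angry":  {"colour": (255, 30, 30),   "intensity": 1.0, "change_rate": "abrupt"},
    "sad":    {"colour": (40, 60, 160),   "intensity": 0.4, "change_rate": "slow"},
    "tender": {"colour": (255, 180, 200), "intensity": 0.5, "change_rate": "slow"},
}

def lighting_for(cues: Cues) -> dict:
    """Return the lighting design for the emotion detected from the cues."""
    return LIGHTING_DESIGNS[detect_emotion(cues)]

if __name__ == "__main__":
    # A loud, fast, fluent performance with lots of movement -> 'happy' design.
    performance = Cues(level=0.8, tempo=0.9, articulation=0.3, timbre=0.7,
                       amount=0.8, speed=0.9, fluency=0.8, regularity=0.6)
    print(detect_emotion(performance), lighting_for(performance))

The rule-based mapping is only a placeholder for whatever cue-extraction and classification approach the paper itself uses; the sketch merely illustrates how the eight named cues could funnel into a single lighting decision.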
Original language: English
Title of host publication: Interactivity, Game Creation, Design, Learning, and Innovation: 5th International Conference, ArtsIT 2016, and First International Conference, DLI 2016, Esbjerg, Denmark, May 2–3, 2016, Proceedings
Publisher: Springer
Publication date: 2017
Pages: 212-219
ISBN (Print): 978-3-319-55833-2
ISBN (Electronic): 978-3-319-55834-9
DOIs: 10.1007/978-3-319-55834-9_25
Publication status: Published - 2017
Event: 5th EAI International Conference: ArtsIT, Interactivity & Game Creation - Esbjerg, Denmark
Duration: 2 May 2016 - 3 May 2016

Conference

Conference: 5th EAI International Conference: ArtsIT, Interactivity & Game Creation
Country: Denmark
City: Esbjerg
Period: 02/05/2016 - 03/05/2016
Series: Lecture Notes of the Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering, LNICST
Volume: 196
ISSN: 1867-8211

Fingerprint

Lighting
Conveying
Intelligent systems
Microphones
Cameras

Cite this

Bonde, E. O. S., Hansen, E. K., & Triantafyllidis, G. (2017). Multimodal Detection of Music Performances for Intelligent Emotion Based Lighting. In Interactivity, Game Creation, Design, Learning, and Innovation: 5th International Conference, ArtsIT 2016, and First International Conference, DLI 2016, Esbjerg, Denmark, May 2–3, 2016, Proceedings (pp. 212-219). Springer. (Lecture Notes of the Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering, LNICST; Vol. 196). https://doi.org/10.1007/978-3-319-55834-9_25