A Gaze-Driven Digital Interface for Musical Expression Based on Real-time Physical Modelling Synthesis

Research output: Contribution to book/anthology/report/conference proceeding › Article in proceeding › Research › peer-review


Abstract

Individuals with severely limited physical function, such as those living with Amyotrophic Lateral Sclerosis (ALS), are often unable to engage in conventional music-making activities, as their bodily capabilities may be restricted to eye movements. The rise of modern eye-tracking cameras has led to the development of augmented digital interfaces that allow these individuals to compose music using only their gaze. This paper presents a gaze-controlled digital interface for musical expression and performance built around a real-time physical model of a xylophone. The interface was designed to work with a basic Tobii eye tracker, and a scalable, open-source framework was built using the JUCE programming environment. A usability evaluation was carried out with nine convenience-sampled participants. Whilst the interface was found to be a feasible means of gaze-driven music performance, our qualitative results indicate that its utility could be enhanced by expanding the possibilities for expressive control over the physical model. Potential improvements include a more robust gaze calibration method, as well as a redesigned graphical interface with more expressive possibilities. Overall, we see this work as a step towards accessible and inclusive musical performance interfaces for those with major physical limitations.
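To give a flavour of the kind of real-time physical modelling the abstract refers to, the sketch below renders a struck bar as a sum of exponentially decaying sinusoids (modal synthesis). This is a minimal illustration, not the paper's actual model: the modal ratios (approximately 1 : 3.93 : 9.54 for an ideal free bar), strike amplitudes, and decay times are all assumed for demonstration.

```cpp
#include <cmath>
#include <vector>

// Minimal modal-synthesis sketch of a struck xylophone-like bar.
// All parameters below are illustrative assumptions.
std::vector<float> strikeBar(float f0, float sampleRate, int numSamples)
{
    // Modal frequency ratios of an ideal free bar, with per-mode
    // strike amplitudes and exponential decay times in seconds.
    const float ratios[3] = { 1.0f, 3.93f, 9.54f };
    const float amps[3]   = { 1.0f, 0.5f,  0.25f };
    const float decays[3] = { 0.6f, 0.3f,  0.15f };
    const float pi = 3.14159265358979f;

    std::vector<float> out(numSamples, 0.0f);
    for (int m = 0; m < 3; ++m)
    {
        const float w = 2.0f * pi * f0 * ratios[m] / sampleRate;
        for (int n = 0; n < numSamples; ++n)
            out[n] += amps[m]
                      * std::exp(-float(n) / (decays[m] * sampleRate))
                      * std::sin(w * float(n));
    }
    return out;
}
```

In a JUCE-based framework such as the one described, a per-sample synthesis loop of this kind would typically run inside the audio callback, with a gaze event (e.g. dwell on a bar) triggering the strike.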
Original language: English
Title of host publication: Proceedings of the 19th Sound and Music Computing Conference, June 5-12th, 2022, Saint-Étienne (France): SMC/JIM/IFC 2022
Editors: Romain Michon, Laurent Pottier, Yann Orlarey
Number of pages: 8
Publisher: Sound and Music Computing Network
Publication date: 2022
Pages: 461-468
ISBN (Electronic): 978-2-9584126-0-9
Publication status: Published - 2022
Event: 19th Sound and Music Computing Conference, SMC 2022 - Saint-Étienne, France
Duration: 5 Jun 2022 – 12 Jun 2022

Conference

Conference: 19th Sound and Music Computing Conference, SMC 2022
Country/Territory: France
City: Saint-Étienne
Period: 05/06/2022 – 12/06/2022
Series: Proceedings of the Sound and Music Computing Conference
ISSN: 2518-3672
