A Gaze-Driven Digital Interface for Musical Expression Based on Real-time Physical Modelling Synthesis

Publication: Contribution to book/anthology/report/conference proceeding › Conference article in proceedings › Research › Peer-reviewed



Individuals with severely limited physical function, such as those with Amyotrophic Lateral Sclerosis (ALS), are unable to engage in conventional music-making activities, as their bodily capabilities are often limited to eye movements. The rise of modern eye-tracking cameras has led to the development of augmented digital interfaces that allow these individuals to compose music using only their gaze. This paper presents a gaze-controlled digital interface for musical expression and performance using a real-time physical model of a xylophone. The interface was designed to work with a basic Tobii eye-tracker, and a scalable, open-source framework was built using the JUCE programming environment. A usability evaluation was carried out with nine convenience-sampled participants. Whilst the interface was found to be a feasible means for gaze-driven music performance, our qualitative results indicate that the utility of the interface can be enhanced by expanding the possibilities for expressive control over the physical model. Potential improvements include a more robust gaze calibration method, as well as a redesigned graphical interface with more expressive possibilities. Overall, we see this work as a step towards accessible and inclusive musical performance interfaces for those with major physical limitations.
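The abstract does not disclose how the xylophone model is synthesised, but a common real-time physical modelling technique for struck bars is modal synthesis: each note is rendered as a sum of exponentially decaying sinusoids at the bar's modal frequencies. The sketch below illustrates that idea in plain C++ (the interface itself is built with JUCE, but no JUCE APIs are used here); the mode ratios, decay times, and the `ModalBar` type are all hypothetical, not taken from the paper.

```cpp
#include <cmath>
#include <vector>

// Minimal modal-synthesis sketch of a struck xylophone bar.
// Frequency ratios of roughly 1 : 3.93 : 9.54 are typical for a
// uniform free bar; all tunings here are illustrative assumptions.
struct ModalBar {
    double sampleRate = 48000.0;
    std::vector<double> freqs;   // modal frequencies (Hz)
    std::vector<double> decays;  // per-mode decay times (s)
    std::vector<double> amps;    // per-mode strike amplitudes
    double t = 0.0;              // time elapsed since the strike (s)

    // Excite the bar at fundamental f0 with a given strike velocity.
    void strike(double f0, double velocity) {
        freqs  = { f0, f0 * 3.93, f0 * 9.54 };
        decays = { 0.8, 0.4, 0.2 };             // higher modes die faster
        amps   = { velocity, 0.5 * velocity, 0.25 * velocity };
        t = 0.0;
    }

    // Render one audio sample: a sum of decaying sinusoids.
    double nextSample() {
        const double kTwoPi = 6.283185307179586;
        double out = 0.0;
        for (std::size_t m = 0; m < freqs.size(); ++m)
            out += amps[m] * std::exp(-t / decays[m])
                           * std::sin(kTwoPi * freqs[m] * t);
        t += 1.0 / sampleRate;
        return out;
    }
};
```

In a gaze-driven setup such as the one described, a dwell or blink event on an on-screen bar would call `strike()` with that bar's pitch, and the audio callback would pull samples via `nextSample()`.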
Title: Proceedings of the 19th Sound and Music Computing Conference, June 5-12th, 2022, Saint-Étienne (France): SMC/JIM/IFC 2022
Editors: Romain Michon, Laurent Pottier, Yann Orlarey
Number of pages: 8
Publisher: Sound and Music Computing Network
ISBN (electronic): 978-2-9584126-0-9
Status: Published - 2022
Event: 19th Sound and Music Computing Conference, SMC 2022 - Saint-Étienne, France
Duration: 5 Jun 2022 - 12 Jun 2022


Conference: 19th Sound and Music Computing Conference, SMC 2022
Series: Proceedings of the Sound and Music Computing Conference

