TY - GEN
T1 - A Gaze-Driven Digital Interface for Musical Expression Based on Real-time Physical Modelling Synthesis
AU - Kandpal, Devansh
AU - Kantan, Prithvi Ravi
AU - Serafin, Stefania
PY - 2022
Y1 - 2022
N2 - Individuals with severely limited physical function, such as Amyotrophic Lateral Sclerosis (ALS) sufferers, are unable to engage in conventional music-making activities, as their bodily capabilities are often limited to eye movements. The rise of modern eye-tracking cameras has led to the development of augmented digital interfaces that allow these individuals to compose music using only their gaze. This paper presents a gaze-controlled digital interface for musical expression and performance using a real-time physical model of a xylophone. The interface was designed to work with a basic Tobii eye-tracker, and a scalable, open-source framework was built using the JUCE programming environment. A usability evaluation was carried out with nine convenience-sampled participants. Whilst the interface was found to be a feasible means for gaze-driven music performance, our qualitative results indicate that its utility can be enhanced by expanding the possibilities for expressive control over the physical model. Potential improvements include a more robust gaze calibration method, as well as a redesigned graphical interface with more expressive possibilities. Overall, we see this work as a step towards accessible and inclusive musical performance interfaces for those with major physical limitations.
UR - http://www.scopus.com/inward/record.url?scp=85137733774&partnerID=8YFLogxK
U2 - 10.5281/zenodo.6798248
DO - 10.5281/zenodo.6798248
M3 - Article in proceeding
T3 - Proceedings of the Sound and Music Computing Conference
SP - 461
EP - 468
BT - Proceedings of the 19th Sound and Music Computing Conference, June 5-12th, 2022, Saint-Étienne (France)
A2 - Michon, Romain
A2 - Pottier, Laurent
A2 - Orlarey, Yann
PB - Sound and Music Computing Network
T2 - 19th Sound and Music Computing Conference, SMC 2022
Y2 - 5 June 2022 through 12 June 2022
ER -