Musical instruments and musical user interfaces provide rich input and feedback through largely tangible interactions, resulting in complex behavior. However, publications on novel interfaces often lack the required detail, owing to this complexity, a focus on one specific part of the interface, and the absence of a common template or structure for describing such interactions. Drawing on and synthesizing models from interaction design and music making, we propose a way of modeling musical interfaces, providing a scheme and visual language to describe, design, analyze, and compare interfaces for music making. To illustrate its capabilities, we apply the proposed model to a range of assistive musical instruments, which often draw on multi-modal input and output, resulting in complex designs and correspondingly complex descriptions.
Title of host publication: Proceedings of the International Conference on New Interfaces for Musical Expression, Copenhagen, Denmark, 2017
Publication status: Published - 2017
Event: New Interfaces for Musical Expression 2017 - AAU Sydhavn, Copenhagen, Denmark
Duration: 14 May 2017 → 18 May 2017
Conference: New Interfaces for Musical Expression 2017
Period: 14/05/2017 → 18/05/2017