Abstract
We present a generic, structured model for the design and evaluation of musical interfaces. The model is development-oriented and grounded in the fundamental function of musical interfaces: coordinating human action and perception for musical expression, subject to human capabilities and skills. To illustrate the particulars of the model and show it in operation, we review the previous design and evaluation phase of iPalmas, our testbed for exploring rhythmic interaction. Our findings inform the current design phase of the iPalmas visual and auditory displays, where we build on what resonated with the test users and explore further possibilities suggested by the evaluation results.
| Original language | English |
|---|---|
| Book series | Proceedings of the International Conference on New Interfaces for Musical Expression |
| Pages (from-to) | 477-480 |
| Number of pages | 4 |
| ISSN | 2220-4792 |
| Publication status | Published - 2011 |
| Event | International Conference on New Interfaces for Musical Expression, NIME 2011 - Oslo, Norway. Duration: 30 May 2011 → 1 Jun 2011 |
Conference
| Conference | International Conference on New Interfaces for Musical Expression, NIME 2011 |
|---|---|
| Country/Territory | Norway |
| City | Oslo |
| Period | 30/05/2011 → 01/06/2011 |
Bibliographical note
Publisher Copyright: © 2020, Steering Committee of the International Conference on New Interfaces for Musical Expression.
Keywords
- Multimodal displays
- Rhythmic interaction
- Sonification
- UML