Abstract
This paper presents an audio-based tennis simulation game for mobile devices that uses motion input and non-verbal audio feedback as its exclusive means of interaction. Players must listen carefully to auditory cues, such as racquet hits and ball bounces, and rhythmically synchronize their movements to keep the ball in play. Because the game does not rely on visual feedback at all, the display can be ignored and the device can be swung freely as a full-fledged motion-based controller. The game aims to be entertaining but also effective for educational purposes, such as ear training or improving the sense of timing, and to be enjoyable by both visually impaired and sighted users.
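The core interaction described above, swinging in time with auditory cues to keep a virtual ball in play, can be illustrated with a minimal sketch. This is not code from the paper: the timing-window size and function names below are assumptions chosen purely for illustration.

```python
# Hypothetical sketch of rhythmic hit detection for an audio-only tennis game.
# A swing counts as a successful return only if it falls within a tolerance
# window around the ball's (audibly cued) arrival time.

HIT_WINDOW_S = 0.15  # assumed tolerance in seconds, not taken from the paper


def is_hit(swing_time: float, arrival_time: float,
           window: float = HIT_WINDOW_S) -> bool:
    """Return True if the swing lands within the timing window."""
    return abs(swing_time - arrival_time) <= window


def rally_length(swing_times, arrival_times):
    """Count consecutive successful returns before the first mistimed swing."""
    streak = 0
    for swing, arrival in zip(swing_times, arrival_times):
        if not is_hit(swing, arrival):
            break
        streak += 1
    return streak
```

In an actual implementation, `swing_time` would come from a peak in the device's accelerometer signal and `arrival_time` from the game's internal ball trajectory, with the audio cues (bounce, hit) scheduled so players can anticipate the window by ear alone.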
| Original language | English |
| --- | --- |
| Title of host publication | New Interfaces for Musical Expression (NIME) |
| Number of pages | 2 |
| Publisher | NIME |
| Publication date | 2013 |
| Pages | 200-201 |
| Publication status | Published - 2013 |
| Event | 12th International Conference on New Interfaces for Musical Expression - Ann Arbor, Michigan, United States. Duration: 21 May 2012 → 23 May 2012. http://aimlab.kaist.ac.kr/nime2013/ |
Conference
| Conference | 12th International Conference on New Interfaces for Musical Expression |
| --- | --- |
| Country/Territory | United States |
| City | Ann Arbor, Michigan |
| Period | 21/05/2012 → 23/05/2012 |
| Internet address | http://aimlab.kaist.ac.kr/nime2013/ |
Keywords
- Audio game
- Mobile devices
- Sonic interaction design
- Rhythmic interaction
- Motion-based