We present a report covering our preliminary research on the control of spatial sound sources in wavefield synthesis through gesture-based interfaces.
After a short general introduction on spatial sound and a few basic concepts of wavefield synthesis, we present a graphical application called spAAce which lets users control real-time movements of sound sources by drawing trajectories on a screen. The first prototype of this application was developed bound to WFSCollider, an open-source software package based on SuperCollider that lets users control wavefield synthesis. The spAAce application has been implemented in Processing, a programming language for sketches and prototypes within the context of the visual arts, and communicates with WFSCollider through the Open Sound Control (OSC) protocol. This application aims to create a new way of interaction for live performance of spatial composition and live electronics.
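The abstract does not spell out WFSCollider's OSC namespace, so the address pattern below (`/src/1/xy`) is a hypothetical placeholder. As a minimal sketch of what the Processing-to-WFSCollider link involves, here is a standard-library Python encoder for the kind of OSC message such a trajectory interface would emit: an address string and float32 arguments, each padded to 4-byte boundaries per the OSC 1.0 encoding rules.

```python
import socket
import struct

def osc_message(address, *args):
    """Encode a minimal OSC message: address pattern + float32 arguments."""
    def pad(b):
        # OSC strings are null-terminated, then padded to a 4-byte boundary.
        return b + b"\x00" * (4 - len(b) % 4)
    msg = pad(address.encode("ascii"))
    msg += pad(("," + "f" * len(args)).encode("ascii"))  # type-tag string, e.g. ",ff"
    for a in args:
        msg += struct.pack(">f", float(a))  # big-endian float32
    return msg

def send_position(sock, host, port, x, y):
    """Send one source-position update over UDP (address is a placeholder)."""
    sock.sendto(osc_message("/src/1/xy", x, y), (host, port))
```

In practice one would call `send_position` on every mouse-drag event, turning a drawn trajectory into a stream of OSC position updates; libraries such as oscP5 (Processing) or python-osc do this encoding for you.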
In a subsequent section we present an auditory game in which players can walk freely inside a virtual acoustic environment (a room in a commercial ship) while being exposed to the presence of several “enemies”, which the player needs to localise and eliminate by using a Nintendo WiiMote game controller to “throw” sounding objects towards them. The aim of this project was to create a gestural interface for a game based on auditory cues only, and to investigate how convolution reverberation affects people’s perception of distance in a wavefield synthesis setup.
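One reason convolution reverberation matters for distance perception: the direct-path level falls roughly with 1/distance while the reverberant field stays nearly constant, so the direct-to-reverberant ratio acts as a distance cue. The sketch below illustrates that idea only; it is not the paper's implementation, and the 1/distance gain model and impulse-response mixing are assumptions made for illustration.

```python
def convolve(signal, ir):
    """Plain discrete convolution of a dry signal with an impulse response."""
    out = [0.0] * (len(signal) + len(ir) - 1)
    for i, s in enumerate(signal):
        for j, h in enumerate(ir):
            out[i + j] += s * h
    return out

def at_distance(dry, ir, distance):
    """Mix the direct sound (attenuated by 1/distance) with convolution reverb.

    The reverberant tail is left at constant level, so larger distances
    lower the direct-to-reverberant ratio -- the perceptual distance cue.
    """
    wet = convolve(dry, ir)
    direct_gain = 1.0 / max(distance, 1.0)
    return [direct_gain * (dry[n] if n < len(dry) else 0.0) + wet[n]
            for n in range(len(wet))]
```

A real-time system would do this with partitioned FFT convolution against measured room impulse responses, but the direct/reverberant weighting is the same in principle.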
Title of host publication: Sound And Music Computing Conference Proceedings 2016
Publisher: Zentrum für Mikrotonale Musik und Multimediale Komposition (ZM4)
Publication date: Aug 2016
Publication status: Published - Aug 2016
Event: 13th Sound and Music Computing Conference (SMC 2016) - Hamburg, Germany
Duration: 31 Aug 2016 → 3 Sept 2016