Abstract
A multisensory virtual environment has been designed, aiming at recreating a realistic interaction with a set of vibrating strings. Haptic, auditory and visual cues progressively instantiate the environment: force and tactile feedback are provided by a robotic arm that conveys the string's reaction and surface properties, and furthermore defines the physical touchpoint in the form of a virtual plectrum embodied by the arm stylus. Auditory feedback is synthesized in real time from the contacts of this plectrum against the strings, reproducing guitar sounds. A simple visual scenario contextualizes the plectrum in action along with the vibrating strings. Notes and chords are selected using a keyboard controller, so that one hand shapes the melody while the other plucks the virtual strings. These components have been integrated within the Unity3D game development environment and run together on a PC. As reported by a group of users testing a monophonic Keytar prototype without keyboard control, the most significant contribution to the realism of the strings comes from the haptic feedback, in particular from the textural nuances the robotic arm synthesizes while reproducing the physical attributes of a metal surface. Their opinion thus argues for the importance of factors other than auditory feedback in the design of new musical interfaces.
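The abstract does not detail the string-synthesis model, so as a hedged illustration only: plucked-string guitar tones of the kind described are commonly generated with the classic Karplus-Strong algorithm, sketched below in Python. The function name and parameters are our own; the actual Keytar system runs inside Unity3D and may well use a different physical model.

```python
import random
from collections import deque

def karplus_strong(freq_hz, duration_s, sample_rate=44100):
    """Synthesize one plucked-string tone (Karplus-Strong, 1983).

    A delay line seeded with noise models the pluck; repeatedly
    averaging adjacent samples acts as a lowpass feedback filter,
    damping high partials the way a real vibrating string does.
    """
    n = int(sample_rate / freq_hz)  # delay-line length sets the pitch
    buf = deque(random.uniform(-1.0, 1.0) for _ in range(n))  # the "pluck"
    out = []
    for _ in range(int(duration_s * sample_rate)):
        out.append(buf[0])
        # two-point average = lowpass; 0.996 adds a little extra decay
        avg = 0.996 * 0.5 * (buf[0] + buf[1])
        buf.popleft()
        buf.append(avg)
    return out
```

In an interactive setting like the one described, such a function would be triggered on each detected plectrum-string contact, with the keyboard controller selecting `freq_hz`.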
Original language | English |
---|---|
Title of host publication | Proceedings of the 22nd International Conference on Digital Audio Effects, DAFx 2019 |
Number of pages | 6 |
Publisher | DMT Lab, Birmingham City University |
Publication date | 2019 |
Pages | 47-52 |
Publication status | Published - 2019 |
Event | 22nd International Conference on Digital Audio Effects, DAFx 2019 - Birmingham, United Kingdom, 2 Sept 2019 → 6 Sept 2019 |
Conference
Conference | 22nd International Conference on Digital Audio Effects, DAFx 2019 |
---|---|
Country/Territory | United Kingdom |
City | Birmingham |
Period | 02/09/2019 → 06/09/2019 |
Bibliographical note
Funding Information: This work is supported by NordForsk's Nordic University Hub Nordic Sound and Music Computing Network NordicSMC, project number 86892.
Publisher Copyright:
© 2019 Federico Fontana et al.
Fingerprint
Dive into the research topics of 'Keytar: Melodic control of multisensory feedback from virtual strings'. Together they form a unique fingerprint.
Projects
- NordicSMC: Nordic Sound and Music Computing Network
Serafin, S. (Project Coordinator), Dahl, S. (Project Participant), Willemsen, S. (Project Participant), Kantan, P. R. (Project Participant) & Paisa, R. (Project Participant)
01/01/2018 → …
Project: Research