Keytar: Melodic control of multisensory feedback from virtual strings

Federico Fontana, Andrea Passalenti, Stefania Serafin, Razvan Paisa

Research output: Contribution to book/anthology/report/conference proceeding › Article in proceeding › Research › peer-review

1 Citation (Scopus)
48 Downloads (Pure)

Abstract

A multisensory virtual environment has been designed with the aim of recreating a realistic interaction with a set of vibrating strings. Haptic, auditory and visual cues progressively instantiate the environment: force and tactile feedback are provided by a robotic arm that renders the string reaction and surface properties, and whose stylus defines the physical touchpoint in the form of a virtual plectrum. Auditory feedback is synthesized in real time from the contacts of this plectrum with the strings, reproducing guitar sounds. A simple visual scenario contextualizes the plectrum in action along with the vibrating strings. Notes and chords are selected with a keyboard controller, so that one hand shapes a melody while the other plucks the virtual strings. These components have been integrated within the Unity3D game development environment and run together on a PC. As reported by a group of users who tested a monophonic Keytar prototype without keyboard control, the most significant contribution to the realism of the strings comes from the haptic feedback, in particular from the textural nuances the robotic arm synthesizes while reproducing the physical attributes of a metal surface. Their opinion thus argues for the importance of factors other than auditory feedback in the design of new musical interfaces.
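The abstract does not disclose which synthesis method produces the guitar sounds triggered by plectrum-string contact; as a purely illustrative sketch, the snippet below uses the well-known Karplus-Strong plucked-string algorithm to show how such a contact event could map to an audio buffer. The function name, parameters, and the E2 example note are hypothetical and not taken from the paper.

```python
# Illustrative sketch only: the paper does not specify its synthesis algorithm.
# Karplus-Strong is used here as a stand-in for the guitar sound produced when
# the virtual plectrum contacts a string.
import numpy as np

def pluck(frequency_hz, duration_s=1.0, sample_rate=44100, damping=0.996):
    """Return a mono buffer approximating a plucked string at frequency_hz."""
    n_samples = int(duration_s * sample_rate)
    delay = int(round(sample_rate / frequency_hz))   # delay-line length sets the pitch
    line = np.random.uniform(-1.0, 1.0, delay)       # initial noise burst models the pluck
    out = np.empty(n_samples)
    for i in range(n_samples):
        out[i] = line[i % delay]
        # Averaging filter in the feedback loop: damps and darkens the tone over time.
        line[i % delay] = damping * 0.5 * (line[i % delay] + line[(i + 1) % delay])
    return out

# Hypothetical contact event: the keyboard selects the note, plectrum contact triggers the pluck.
if __name__ == "__main__":
    e2_string = pluck(82.41)   # low E string of a guitar
    print(e2_string[:5])
```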

Original language: English
Title of host publication: Proceedings of the 22nd International Conference on Digital Audio Effects, DAFx 2019
Number of pages: 6
Publisher: DMT Lab, Birmingham City University
Publication date: 2019
Pages: 47-52
Publication status: Published - 2019
Event: 22nd International Conference on Digital Audio Effects, DAFx 2019 - Birmingham, United Kingdom
Duration: 2 Sept 2019 → 6 Sept 2019

Conference

Conference: 22nd International Conference on Digital Audio Effects, DAFx 2019
Country/Territory: United Kingdom
City: Birmingham
Period: 02/09/2019 → 06/09/2019

Bibliographical note

Funding Information:
This work is supported by NordForsk’s Nordic University Hub Nordic Sound and Music Computing Network NordicSMC, project number 86892.

Publisher Copyright:
© 2019 Federico Fontana et al.
