Brain-computer interfaces: A unique window into the hearing soul

Matthias S. Treder, Daniel Miklody, Benjamin Blankertz, Hendrik Purwins

Publication: Conference paper without publisher/journal · Research · peer-reviewed


Auditory perception is a complex cognitive process that involves information from the cochlea travelling via the auditory nerve to subcortical areas and finally the auditory cortices, where the auditory 'experience' is believed to arise. When subjects report on their auditory experience in an experiment, overt behavior (questionnaire, button press, etc.) is typically used.
However, overt behavior is the result of a cascade of cognitive evaluations, strategies, and motor actions following up on or concurrent with the perceptual experience. Obviously, overt responses are susceptible to various forms of perceptual and cognitive biases. Furthermore, subjects can only report on stimuli if they have a clear percept of them.
On the other hand, the electroencephalogram (EEG), the electrical brain activity measured with electrodes on the scalp, is a more direct measure. It allows us to tap into the ongoing neural auditory processing stream. In particular, it can tap brain processes that are pre-conscious or even unconscious, such as the earliest brain responses to sound stimuli in primary auditory cortex. In a series of studies, we used a machine learning approach to show that the EEG can accurately reflect the subjects' attentional state and phasic brain responses at different stages of neural perceptual processing.
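The decoding approach described above can be illustrated with a minimal sketch. The data are synthetic stand-ins for EEG epochs, and the classifier (shrinkage LDA, a common choice in EEG decoding) is an assumption — the abstract does not name the specific model used:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for EEG epochs: 200 trials of 32 channels x 50 time points,
# flattened, with a small class-dependent shift mimicking an attention-modulated
# ERP component on a subset of features.
n_trials, n_features = 200, 32 * 50
labels = rng.integers(0, 2, n_trials)          # 0 = unattended, 1 = attended
X = rng.standard_normal((n_trials, n_features))
X[labels == 1, :100] += 0.5                    # attended trials carry extra signal

# Shrinkage-regularized LDA handles the high feature-to-trial ratio typical of EEG.
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
scores = cross_val_score(clf, X, labels, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.2f}")
```

Cross-validated accuracy above chance (0.5) indicates that the attentional state is decodable from the single-trial features.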
Firstly, in an experiment using polyphonic music, we built a classifier that can accurately detect which instrument in a polyphonic piece of music a subject is paying attention to using the brain signals alone. Brain responses corresponding to attended and unattended instruments clearly differed (see Figure). Secondly, using temporal-kernel canonical correlation analysis (tkCCA), we were able to match the derivative of the power of the audio signals with the brain time series on the same experimental data, suggesting that the ongoing EEG in auditory cortices embeds the physical structure of the audio signal. Thirdly, in a study on audio quality assessment, we showed that the brain shows a graded response to auditory stimuli corrupted with different amounts of noise, giving rise to a 'brain quality measure'. We were able to show that for stimuli close to the perceptual threshold, there was sometimes a discrepancy between overt responses and brain responses, shedding light on subjects using different response criteria (e.g., more liberal or more conservative).
To conclude, brain-computer interfaces (BCIs) open up a new window into auditory perceptual experience. Although the application to cochlear implant (CI) patients is challenging due to electrical artefacts produced by the implant, we believe that this technology has the potential to contribute to the quality assessment and enhancement of cochlear implants in end users.
Status: Published - 2017
Event: The Music & Cochlear Implants Symposium - Eriksholm Research Centre, Snekkersten, Denmark
Duration: 13 Oct 2016 - 14 Oct 2016


