Decoding covert speech for intuitive control of brain-computer interfaces based on single-trial EEG: a feasibility study

Lea Tøttrup, Kasper Leerskov, Johannes Thorling Hadsund, Ernest Nlandu Kamavuako, Rasmus Leck Kæseler, Mads Jochumsen

Research output: Contribution to book/anthology/report/conference proceeding › Article in proceeding › Research › peer-review

Abstract

For individuals with severe motor deficiencies, controlling external devices such as robotic arms or wheelchairs can be challenging, as many devices require some degree of motor control to be operated, e.g. when controlled using a joystick. A brain-computer interface (BCI) relies only on signals from the brain and may be used as a controller instead of muscles. Motor imagery (MI) has been used in many studies as a control signal for BCIs. However, MI may not be suitable for all control purposes, and some users cannot achieve BCI control with MI. In this study, the aim was to investigate the feasibility of decoding covert speech from single-trial EEG and to compare and combine it with MI. In seven healthy subjects, EEG was recorded with twenty-five channels during six different actions: speaking three words (both covert and overt speech), two arm movements (both motor imagery and execution), and one idle class. Temporal and spectral features were derived from the epochs and classified with a random forest classifier. The average classification accuracy was 67 ± 9% and 75 ± 7% for covert and overt speech, respectively; this was 5–10% lower than the movement classification. The performance of the combined movement-speech decoder was 61 ± 9% and 67 ± 7% (covert and overt, respectively), but it makes more classes available for control. The possibility of using covert speech for controlling a BCI was outlined; this is a step towards a multimodal BCI system with improved usability.
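The abstract describes a pipeline of temporal and spectral feature extraction from EEG epochs followed by random forest classification. The sketch below is a minimal, hypothetical illustration of that kind of pipeline (not the authors' code): the sampling rate, epoch length, feature choices, and synthetic data are all assumptions for demonstration only.

```python
# Hypothetical sketch of an EEG classification pipeline of the kind
# described in the abstract; all parameters and data are illustrative.
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
fs = 250                                    # assumed sampling rate (Hz)
n_trials, n_channels = 120, 25              # 25 channels, as in the study
n_samples = 2 * fs                          # assumed 2-s epochs

# Synthetic stand-in for single-trial EEG epochs and class labels
X_raw = rng.standard_normal((n_trials, n_channels, n_samples))
y = rng.integers(0, 6, n_trials)            # six classes: 3 words, 2 movements, idle

def extract_features(epoch):
    """Per-channel temporal and spectral features (illustrative choices)."""
    feats = []
    for ch in epoch:
        # Temporal: mean absolute value and variance
        feats += [np.mean(np.abs(ch)), np.var(ch)]
        # Spectral: mean band power in canonical EEG bands via Welch's method
        f, pxx = welch(ch, fs=fs, nperseg=fs)
        for lo, hi in [(4, 8), (8, 13), (13, 30)]:  # theta, alpha, beta
            feats.append(pxx[(f >= lo) & (f < hi)].mean())
    return feats

X = np.array([extract_features(ep) for ep in X_raw])
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

On the synthetic noise above the accuracy hovers around chance (1/6 for six classes); with real EEG, class-discriminative structure in the temporal and spectral features is what lifts accuracy to the levels reported in the abstract.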

Original language: English
Title of host publication: 2019 IEEE 16th International Conference on Rehabilitation Robotics (ICORR)
Number of pages: 5
Publisher: IEEE
Publication date: Jun 2019
Pages: 689-693
Article number: 8779499
ISBN (Electronic): 978-1-7281-2755-2
DOI: 10.1109/ICORR.2019.8779499
Publication status: Published - Jun 2019
Event: International Conference on Rehabilitation Robotics 2019 (ICORR 2019) - Toronto, Canada
Duration: 24 Jun 2019 – 28 Jun 2019

Conference

Conference: International Conference on Rehabilitation Robotics 2019 (ICORR 2019)
Country: Canada
City: Toronto
Period: 24/06/2019 – 28/06/2019
Series: IEEE International Conference on Rehabilitation Robotics. Proceedings
Volume: 16
ISSN: 1945-7898

    Fingerprint

Cite this

Tøttrup, L., Leerskov, K., Hadsund, J. T., Kamavuako, E. N., Kæseler, R. L., & Jochumsen, M. (2019). Decoding covert speech for intuitive control of brain-computer interfaces based on single-trial EEG: a feasibility study. In 2019 IEEE 16th International Conference on Rehabilitation Robotics (ICORR) (pp. 689-693). [8779499] IEEE. IEEE International Conference on Rehabilitation Robotics. Proceedings, Vol. 16. https://doi.org/10.1109/ICORR.2019.8779499