Decoding covert speech for intuitive control of brain-computer interfaces based on single-trial EEG: a feasibility study

Lea Tøttrup, Kasper Leerskov, Johannes Thorling Hadsund, Ernest Nlandu Kamavuako, Rasmus Leck Kæseler, Mads Jochumsen

Research output: Contribution to book/anthology/report/conference proceeding › Article in proceeding › Research › peer review

Abstract

For individuals with severe motor deficiencies, controlling external devices such as robotic arms or wheelchairs can be challenging, as many devices require some degree of motor control to be operated, e.g., when controlled using a joystick. A brain-computer interface (BCI) relies only on signals from the brain and may be used as a controller instead of muscles. Motor imagery (MI) has been used in many studies as a control signal for BCIs. However, MI may not be suitable for all control purposes, and several people cannot obtain BCI control with MI. In this study, the aim was to investigate the feasibility of decoding covert speech from single-trial EEG and to compare and combine it with MI. In seven healthy subjects, EEG was recorded with twenty-five channels during six different actions: speaking three words (both covert and overt speech), two arm movements (both motor imagery and execution), and one idle class. Temporal and spectral features were derived from the epochs and classified with a random forest classifier. The average classification accuracy was 67 ± 9% and 75 ± 7% for covert and overt speech, respectively; this was 5-10% lower than the movement classification. The performance of the combined movement-speech decoder was 61 ± 9% and 67 ± 7% (covert and overt), but the combined decoder makes more classes available for control. The possibility of using covert speech for controlling a BCI was outlined; this is a step towards a multimodal BCI system for improved usability.
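The pipeline described in the abstract (temporal and spectral features per epoch, classified with a random forest) can be sketched as follows. This is a minimal illustration, not the authors' code: the sampling rate, epoch length, frequency bands, feature choices, and the synthetic data are all assumptions; only the channel count (25) and the six classes come from the abstract.

```python
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
fs = 250                                      # sampling rate in Hz (assumed)
n_epochs, n_channels, n_samples = 120, 25, 2 * fs   # 2-s epochs (assumed)
bands = [(4, 8), (8, 13), (13, 30), (30, 45)]       # theta/alpha/beta/gamma (assumed)

# Synthetic stand-in for single-trial EEG epochs:
# 6 classes = 3 words + 2 arm movements + 1 idle class
X_raw = rng.standard_normal((n_epochs, n_channels, n_samples))
y = rng.integers(0, 6, size=n_epochs)

def features(epoch):
    """Temporal (per-channel variance) and spectral (band power) features."""
    f, psd = welch(epoch, fs=fs, nperseg=fs, axis=-1)
    band_power = [psd[:, (f >= lo) & (f < hi)].mean(axis=-1) for lo, hi in bands]
    return np.concatenate([epoch.var(axis=-1)] + band_power)

X = np.array([features(ep) for ep in X_raw])  # (120 epochs, 125 features)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)     # ~1/6 chance level on random data
```

On real EEG the same cross-validated accuracy estimate would be compared per condition (covert speech, overt speech, MI, execution), as the paper does; on this synthetic data it only demonstrates that the pipeline runs end to end.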
Original language: English
Title: 2019 International 16th Conference on Rehabilitation Robotics (ICORR)
Number of pages: 5
Publisher: IEEE
Publication date: Jun 2019
Pages: 689-693
ISBN (electronic): 978-1-7281-2755-2
DOI: 10.1109/ICORR.2019.8779499
Status: Published - Jun 2019
Event: International Conference on Rehabilitation Robotics 2019 (ICORR 2019) - Toronto, Canada
Duration: 24 Jun 2019 - 28 Jun 2019

Conference

Conference: International Conference on Rehabilitation Robotics 2019 (ICORR 2019)
Country: Canada
City: Toronto
Period: 24/06/2019 - 28/06/2019
Series: IEEE International Conference on Rehabilitation Robotics. Proceedings
Volume: 16
ISSN: 1945-7898

Fingerprint

Brain-Computer Interfaces
Imagery (Psychotherapy)
Feasibility Studies
Electroencephalography
Equipment and Supplies
Wheelchairs
Computer Systems
Robotics
Healthy Volunteers
Muscles
Brain

Cite this

Tøttrup, L., Leerskov, K., Hadsund, J. T., Kamavuako, E. N., Kæseler, R. L., & Jochumsen, M. (2019). Decoding covert speech for intuitive control of brain-computer interfaces based on single-trial EEG: a feasibility study. In 2019 International 16th Conference on Rehabilitation Robotics (ICORR) (pp. 689-693). IEEE. IEEE International Conference on Rehabilitation Robotics. Proceedings, Vol. 16. https://doi.org/10.1109/ICORR.2019.8779499
Tøttrup, Lea; Leerskov, Kasper; Hadsund, Johannes Thorling; Kamavuako, Ernest Nlandu; Kæseler, Rasmus Leck; Jochumsen, Mads. / Decoding covert speech for intuitive control of brain-computer interfaces based on single-trial EEG: a feasibility study. 2019 International 16th Conference on Rehabilitation Robotics (ICORR). IEEE, 2019. pp. 689-693 (IEEE International Conference on Rehabilitation Robotics. Proceedings, Vol. 16).
@inproceedings{e2d6c6df33bb4706aa4f6175afe8561d,
title = "Decoding covert speech for intuitive control of brain-computer interfaces based on single-trial EEG: a feasibility study",
abstract = "For individuals with severe motor deficiencies, controlling external devices such as robotic arms or wheelchairs can be challenging, as many devices require some degree of motor control to be operated, e.g. when controlled using a joystick. A brain-computer interface (BCI) relies only on signals from the brain and may be used as a controller instead of muscles. Motor imagery (MI) has been used in many studies as a control signal for BCIs. However, MI may not be suitable for all control purposes, and several people cannot obtain BCI control with MI. In this study, the aim was to investigate the feasibility of decoding covert speech from single-trial EEG and compare and combine it with MI. In seven healthy subjects, EEG was recorded with twenty-five channels during six different actions: Speaking three words (both covert and overt speech), two arm movements (both motor imagery and execution), and one idle class. Temporal and spectral features were derived from the epochs and classified with a random forest classifier. The average classification accuracy was 67 ± 9 {\%} and 75 ± 7 {\%} for covert and overt speech, respectively; this was 5-10 {\%} lower than the movement classification. The performance of the combined movement-speech decoder was 61 ± 9 {\%} and 67 ± 7 {\%} (covert and overt), but it is possible to have more classes available for control. The possibility of using covert speech for controlling a BCI was outlined; this is a step towards a multimodal BCI system for improved usability.",
author = "Lea T{\o}ttrup and Kasper Leerskov and Hadsund, {Johannes Thorling} and Kamavuako, {Ernest Nlandu} and K{\ae}seler, {Rasmus Leck} and Mads Jochumsen",
year = "2019",
month = jun,
doi = "10.1109/ICORR.2019.8779499",
language = "English",
series = "I E E E International Conference on Rehabilitation Robotics. Proceedings",
publisher = "IEEE",
pages = "689--693",
booktitle = "2019 International 16th Conference on Rehabilitation Robotics (ICORR)",
address = "United States",
isbn = "978-1-7281-2755-2",
}

Tøttrup, L, Leerskov, K, Hadsund, JT, Kamavuako, EN, Kæseler, RL & Jochumsen, M 2019, Decoding covert speech for intuitive control of brain-computer interfaces based on single-trial EEG: a feasibility study. In 2019 International 16th Conference on Rehabilitation Robotics (ICORR). IEEE, IEEE International Conference on Rehabilitation Robotics. Proceedings, Vol. 16, pp. 689-693, International Conference on Rehabilitation Robotics 2019 (ICORR 2019), Toronto, Canada, 24/06/2019. https://doi.org/10.1109/ICORR.2019.8779499

Decoding covert speech for intuitive control of brain-computer interfaces based on single-trial EEG: a feasibility study. / Tøttrup, Lea; Leerskov, Kasper; Hadsund, Johannes Thorling; Kamavuako, Ernest Nlandu; Kæseler, Rasmus Leck; Jochumsen, Mads.

2019 International 16th Conference on Rehabilitation Robotics (ICORR). IEEE, 2019. pp. 689-693 (IEEE International Conference on Rehabilitation Robotics. Proceedings, Vol. 16).


TY - GEN

T1 - Decoding covert speech for intuitive control of brain-computer interfaces based on single-trial EEG

T2 - a feasibility study

AU - Tøttrup, Lea

AU - Leerskov, Kasper

AU - Hadsund, Johannes Thorling

AU - Kamavuako, Ernest Nlandu

AU - Kæseler, Rasmus Leck

AU - Jochumsen, Mads

PY - 2019/6

Y1 - 2019/6

N2 - For individuals with severe motor deficiencies, controlling external devices such as robotic arms or wheelchairs can be challenging, as many devices require some degree of motor control to be operated, e.g. when controlled using a joystick. A brain-computer interface (BCI) relies only on signals from the brain and may be used as a controller instead of muscles. Motor imagery (MI) has been used in many studies as a control signal for BCIs. However, MI may not be suitable for all control purposes, and several people cannot obtain BCI control with MI. In this study, the aim was to investigate the feasibility of decoding covert speech from single-trial EEG and compare and combine it with MI. In seven healthy subjects, EEG was recorded with twenty-five channels during six different actions: Speaking three words (both covert and overt speech), two arm movements (both motor imagery and execution), and one idle class. Temporal and spectral features were derived from the epochs and classified with a random forest classifier. The average classification accuracy was 67 ± 9 % and 75 ± 7 % for covert and overt speech, respectively; this was 5-10 % lower than the movement classification. The performance of the combined movement-speech decoder was 61 ± 9 % and 67 ± 7 % (covert and overt), but it is possible to have more classes available for control. The possibility of using covert speech for controlling a BCI was outlined; this is a step towards a multimodal BCI system for improved usability.

AB - For individuals with severe motor deficiencies, controlling external devices such as robotic arms or wheelchairs can be challenging, as many devices require some degree of motor control to be operated, e.g. when controlled using a joystick. A brain-computer interface (BCI) relies only on signals from the brain and may be used as a controller instead of muscles. Motor imagery (MI) has been used in many studies as a control signal for BCIs. However, MI may not be suitable for all control purposes, and several people cannot obtain BCI control with MI. In this study, the aim was to investigate the feasibility of decoding covert speech from single-trial EEG and compare and combine it with MI. In seven healthy subjects, EEG was recorded with twenty-five channels during six different actions: Speaking three words (both covert and overt speech), two arm movements (both motor imagery and execution), and one idle class. Temporal and spectral features were derived from the epochs and classified with a random forest classifier. The average classification accuracy was 67 ± 9 % and 75 ± 7 % for covert and overt speech, respectively; this was 5-10 % lower than the movement classification. The performance of the combined movement-speech decoder was 61 ± 9 % and 67 ± 7 % (covert and overt), but it is possible to have more classes available for control. The possibility of using covert speech for controlling a BCI was outlined; this is a step towards a multimodal BCI system for improved usability.

U2 - 10.1109/ICORR.2019.8779499

DO - 10.1109/ICORR.2019.8779499

M3 - Article in proceeding

T3 - I E E E International Conference on Rehabilitation Robotics. Proceedings

SP - 689

EP - 693

BT - 2019 International 16th Conference on Rehabilitation Robotics (ICORR)

PB - IEEE

ER -

Tøttrup L, Leerskov K, Hadsund JT, Kamavuako EN, Kæseler RL, Jochumsen M. Decoding covert speech for intuitive control of brain-computer interfaces based on single-trial EEG: a feasibility study. In 2019 International 16th Conference on Rehabilitation Robotics (ICORR). IEEE. 2019. pp. 689-693. (IEEE International Conference on Rehabilitation Robotics. Proceedings, Vol. 16). https://doi.org/10.1109/ICORR.2019.8779499