TONGUE-BRAIN COMPUTER INTERFACE FOR ROBOTIC CONTROL

Research output: Contribution to conference without publisher/journal › Conference abstract for conference › Research

Abstract

Amyotrophic lateral sclerosis progressively leads to severe motor impairments of the upper and lower extremities and eventually ends in a locked-in state. With no option of using a joystick, the tongue and eventually the brain are the only remaining options for controlling assistive robots. The iTongue (an intraoral 18-target keyboard operated with the tongue) allows severely motor-impaired patients to control assistive robots. However, such robotic control requires many targets/inputs, which in turn demand good tongue movability that is gradually lost. A tongue-brain hybrid computer interface can reduce the number of targets by introducing control modes that are selected using steady-state visual evoked potentials (brain-computer interface, BCI). Fig. 1 shows the control paradigm, which uses four control modes with four unique control targets. It was tested in a case study in which one healthy subject controlled a robotic arm to pick up a water bottle, move it to a glass, and finally pour water into the glass. The hybrid and the full iTongue system were each tested three times in randomized order. The average completion time for the full iTongue system was 63±5.3 seconds but included one failure (the bottle was tipped over). The hybrid system had a completion time of 94.2±3.5 seconds with no failures.
While the performance of the hybrid system is lower, it may still be preferable for patients with limited tongue movability.
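The mode-switching idea above can be sketched in a few lines. This is a minimal illustration with hypothetical names (the actual mode names and command mappings of the iTongue/BCI system are not given in the abstract): an SSVEP classifier selects one of four control modes, and four tongue targets then issue commands within the active mode, so 4 modes × 4 targets cover 16 commands without needing 16 tongue targets.

```python
# Hypothetical sketch of a four-mode, four-target hybrid control paradigm.
# Mode names and command labels are illustrative, not from the paper.
MODES = ["translate_xy", "translate_z", "rotate", "gripper"]

# Each of the four tongue targets (0-3) maps to one command in the active mode.
COMMANDS = {
    "translate_xy": ["forward", "back", "left", "right"],
    "translate_z":  ["up", "down", "fast", "slow"],
    "rotate":       ["pitch+", "pitch-", "yaw+", "yaw-"],
    "gripper":      ["open", "close", "grip_soft", "grip_firm"],
}

def select_mode(ssvep_class: int) -> str:
    """Map a decoded SSVEP class (0-3) to a control mode."""
    return MODES[ssvep_class]

def issue_command(mode: str, tongue_target: int) -> str:
    """Map one of four tongue targets to a command in the active mode."""
    return COMMANDS[mode][tongue_target]

# Example: the user gazes at the flicker for mode 0, then activates tongue target 2.
mode = select_mode(0)
cmd = issue_command(mode, 2)
```

The point of the design is that the tongue interface only ever needs four reliable targets, while the slower BCI channel is used sparingly, only when switching modes.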
Original language: English
Publication date: 9 Oct 2019
Number of pages: 1
Publication status: Published - 9 Oct 2019

Keywords

  • Brain computer interface
  • Tongue computer interface
  • Assistive robots
  • Hybrid computer interface

Cite this


Kæseler, Rasmus Leck; Jochumsen, Mads; Struijk, Lotte N. S. Andreasen. TONGUE-BRAIN COMPUTER INTERFACE FOR ROBOTIC CONTROL. 2019. 1 p.
