Abstract

People with complete tetraplegia, who are paralyzed from the neck down, require alternative computer and robot interfaces based on their remaining functional abilities. The inductive tongue-computer interface (ITCI) consists of two small touchpads that are manipulated by the tongue and can provide individuals with tetraplegia the possibility to control computers, robots, and other electronic devices. Tongue gestures can be used to expand the vocabulary of the ITCI and provide more control commands. In this study, a tongue gesture recognition method was introduced for the first time that can identify a set of six gestures with 94.3% accuracy and a set of 23 gestures with 72.3% accuracy. Furthermore, a robot interface was developed using the tongue gestures, and it was compared with a previously tested method based on soft buttons on the ITCI. Four able-bodied participants performed a water-pouring task with the two control methods, first with visual feedback from the interface presented on a screen and then without the visual feedback. Task completion times for the gesture-based control and the button-based control did not differ. However, gesture-based control provides more control commands that can be used for interfacing with other systems.
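The abstract does not describe the recognition algorithm itself. The following is a minimal illustrative sketch only, assuming each tongue gesture is recorded as a fixed-length trace of (x, y) activation coordinates on the touchpads and classified with a k-nearest-neighbour classifier; the gesture shapes, feature extraction, and classifier are hypothetical choices for illustration, not the authors' method.

```python
# Illustrative sketch (not the paper's method): classify tongue gestures
# represented as fixed-length traces of (x, y) touchpad activations.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

def synthetic_gesture(kind, n_points=20):
    """Generate a noisy trace for a hypothetical gesture class."""
    t = np.linspace(0.0, 1.0, n_points)
    if kind == "swipe_right":
        x, y = t, np.zeros_like(t)
    elif kind == "swipe_up":
        x, y = np.zeros_like(t), t
    else:  # "circle"
        x, y = np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)
    trace = np.stack([x, y], axis=1) + rng.normal(scale=0.05, size=(n_points, 2))
    return trace.ravel()  # flatten to a fixed-length feature vector

classes = ["swipe_right", "swipe_up", "circle"]
X = np.array([synthetic_gesture(c) for c in classes for _ in range(50)])
y = np.array([c for c in classes for _ in range(50)])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
print(f"Held-out accuracy: {clf.score(X_test, y_test):.3f}")
```

In practice, each recognized gesture class would be mapped to a discrete control command (e.g., a robot motion primitive), which is how a gesture vocabulary can offer more commands than a fixed set of soft buttons.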
Original language: English
Title: BIBE 2020; International Conference on Biological Information and Biomedical Engineering
Publisher: IEEE Computer Society Press
Status: Accepted/In press - 11 Aug 2020


  • Citation formats

    Mohammadi, M., Knoche, H., Bentsen, B., Gaihede, M., & Struijk, L. N. S. A. (Accepted/In press). A pilot study on a novel gesture-based tongue interface for robot and computer control. In BIBE 2020; International Conference on Biological Information and Biomedical Engineering. IEEE Computer Society Press.