Non-invasive brain-computer interfaces based on electroencephalography (EEG) signals promise a convenient way for humans to communicate with, and even control, the outside world using intentions alone. Herein, we propose to analyze EEG signals with a fuzzy integral whose fuzzy measures are optimized by deep reinforcement learning, aggregating two aspects of the information contained in EEG signals, namely local spatio-temporal and global temporal information, and demonstrate its benefits in EEG-based human intention recognition tasks. The EEG signals are first transformed into a 3D format that preserves both their topological and temporal structures; distinctive local spatio-temporal features are then extracted by a 3D-CNN, and global temporal features by an RNN. Next, a fuzzy integral with respect to the fuzzy measures optimized by deep reinforcement learning is used to integrate the two kinds of extracted information and make a final decision. The proposed approach retains the topological and temporal structures of EEG signals and merges them in a more efficient way. Experiments on a public EEG-based movement intention dataset demonstrate the effectiveness and superior performance of the proposed method.
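The abstract does not give implementation details of the fusion step. As a rough illustration only, the aggregation of the two feature extractors' outputs can be sketched as a discrete Choquet fuzzy integral over per-source confidence scores; the source names (`cnn`, `rnn`), the example measure values, and the function below are all hypothetical, and in the paper the fuzzy measures would be learned via deep reinforcement learning rather than fixed by hand.

```python
def choquet_integral(scores, measure):
    """Discrete Choquet integral of per-source scores with respect to a
    fuzzy measure.

    scores  -- dict mapping source name -> confidence in [0, 1]
    measure -- dict mapping frozenset of source names -> measure value,
               with measure[frozenset(all sources)] == 1.0
    """
    # Visit sources in ascending order of their scores.
    items = sorted(scores.items(), key=lambda kv: kv[1])
    total, prev = 0.0, 0.0
    remaining = frozenset(scores)  # sources whose score is still >= current level
    for src, val in items:
        total += (val - prev) * measure[remaining]
        prev = val
        remaining = remaining - {src}
    return total


# Hypothetical confidences for one class from the two feature extractors,
# and a hand-picked (not learned) fuzzy measure for illustration.
scores = {"cnn": 0.8, "rnn": 0.6}
measure = {
    frozenset({"cnn", "rnn"}): 1.0,  # measure of the full set must be 1
    frozenset({"cnn"}): 0.7,
    frozenset({"rnn"}): 0.4,
}
fused = choquet_integral(scores, measure)  # 0.6*1.0 + (0.8-0.6)*0.7 = 0.74
```

In a full classifier, one such fused score would be computed per class and the final decision taken as the argmax; here the measure is fixed, whereas the paper optimizes it with deep reinforcement learning.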
|Title of host publication||Advances in Knowledge Discovery and Data Mining - 22nd Pacific-Asia Conference, PAKDD 2018, Proceedings|
|Editors||Dinh Phung, Geoffrey I. Webb, Bao Ho, Vincent S. Tseng, Mohadeseh Ganji, Lida Rashidi|
|Number of pages||13|
|Publication status||Published - 2018|
|Event||22nd Pacific-Asia Conference on Advances in Knowledge Discovery and Data Mining, PAKDD 2018 - Melbourne, Australia|
Duration: 3 Jun 2018 → 6 Jun 2018
|Conference||22nd Pacific-Asia Conference on Advances in Knowledge Discovery and Data Mining, PAKDD 2018|
|Period||03/06/2018 → 06/06/2018|
|Sponsor||Deakin University as the host institution, Trusting Social, University of Melbourne|
|Series||Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)|
|Bibliographical note||Publisher Copyright: © Springer International Publishing AG, part of Springer Nature 2018.|