Parallel Spatial-Temporal Self-Attention CNN-Based Motor Imagery Classification for BCI
Liu, Xiuling (3,4); Shen, Yonglong (3,4); Liu, Jing (1,2,4); Yang, Jianli (3,4); Xiong, Peng (3,4); Lin, Feng (5)
Journal | FRONTIERS IN NEUROSCIENCE
Date | 2020-12-11
Volume | 14
Pages | 12
Keywords | motor imagery; EEG; BCI; spatial-temporal self-attention; deep learning
DOI | 10.3389/fnins.2020.587520 |
Abstract | Motor imagery (MI) electroencephalography (EEG) classification is an important part of the brain-computer interface (BCI), allowing people with mobility problems to communicate with the outside world via assistive devices. However, EEG decoding is a challenging task because of its complexity, dynamic nature, and low signal-to-noise ratio. Designing an end-to-end framework that fully extracts the high-level features of EEG signals remains a challenge. In this study, we present a parallel spatial-temporal self-attention-based convolutional neural network for four-class MI EEG signal classification. This study is the first to define a new spatial-temporal representation of raw EEG signals that uses the self-attention mechanism to extract distinguishable spatial-temporal features. Specifically, we use the spatial self-attention module to capture the spatial dependencies between the channels of MI EEG signals. This module updates each channel by aggregating features over all channels with a weighted summation, thus improving the classification accuracy and eliminating the artifacts caused by manual channel selection. Furthermore, the temporal self-attention module encodes the global temporal information into features for each sampling time step, so that the high-level temporal features of the MI EEG signals can be extracted in the time domain. Quantitative analysis shows that our method outperforms state-of-the-art methods for intra-subject and inter-subject classification, demonstrating its robustness and effectiveness. In terms of qualitative analysis, we perform a visual inspection of the new spatial-temporal representation estimated from the learned architecture. Finally, the proposed method is employed to realize control of drones based on EEG signals, verifying its feasibility in real-time applications.
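The spatial self-attention step described in the abstract — updating each EEG channel as a weighted summation of features aggregated over all channels — can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the function names and the scaled dot-product similarity used for the attention weights are assumptions.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def spatial_self_attention(X):
    """X: list of C channel feature vectors, each of length T.
    Each channel i is replaced by a weighted sum over all channels,
    with weights given by a softmax over scaled dot-product
    similarities between channel i and every channel j."""
    C, T = len(X), len(X[0])
    scale = math.sqrt(T)  # scale scores by sqrt of feature length
    out = []
    for i in range(C):
        scores = [sum(a * b for a, b in zip(X[i], X[j])) / scale
                  for j in range(C)]
        w = softmax(scores)  # attention weights sum to 1
        out.append([sum(w[j] * X[j][t] for j in range(C))
                    for t in range(T)])
    return out
```

Because every output channel is a convex combination of all input channels, no channel has to be discarded by hand, which matches the abstract's point about avoiding manual channel selection.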
Funding | National Natural Science Foundation of China [61802109]; National Natural Science Foundation of China [61673158]; National Natural Science Foundation of China [61703133]; Natural Science Foundation of Hebei Province [F2020205006]; Natural Science Foundation of Hebei Province [F2018201070]; Top Youth Talents of Science and Technology Research Project in Hebei Province [BJ2020059]; Youth Talent Support Program of Hebei Province [BJ2019044]; Science Foundation of Hebei Normal University [L2018K02]
WOS Research Area | Neurosciences & Neurology
Language | English
Publisher | FRONTIERS MEDIA SA
WOS Accession Number | WOS:000601597400001
Content Type | Journal Article
Source URL | [http://119.78.100.204/handle/2XEOYT63/16571]
Collection | Institute of Computing Technology, Chinese Academy of Sciences
Corresponding Author | Liu, Jing
Affiliations | 1. Hebei Normal Univ, Coll Comp & Cyber Secur, Shijiazhuang, Hebei, Peoples R China; 2. Chinese Acad Sci, Inst Comp Technol, Beijing Key Lab Mobile Comp & Pervas Device, Beijing, Peoples R China; 3. Hebei Univ, Coll Elect Informat Engn, Baoding, Peoples R China; 4. Hebei Univ, Key Lab Digital Med Engn Hebei Prov, Baoding, Peoples R China; 5. Nanyang Technol Univ, Sch Comp Sci & Engn, Singapore, Singapore
Recommended Citation (GB/T 7714) | Liu, Xiuling, Shen, Yonglong, Liu, Jing, et al. Parallel Spatial-Temporal Self-Attention CNN-Based Motor Imagery Classification for BCI[J]. FRONTIERS IN NEUROSCIENCE, 2020, 14: 12.
APA | Liu, Xiuling, Shen, Yonglong, Liu, Jing, Yang, Jianli, Xiong, Peng, & Lin, Feng. (2020). Parallel Spatial-Temporal Self-Attention CNN-Based Motor Imagery Classification for BCI. FRONTIERS IN NEUROSCIENCE, 14, 12.
MLA | Liu, Xiuling, et al. "Parallel Spatial-Temporal Self-Attention CNN-Based Motor Imagery Classification for BCI". FRONTIERS IN NEUROSCIENCE 14 (2020): 12.
Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.