A Hybrid Brain-Computer Interface Using Motor Imagery and SSVEP Based on Convolutional Neural Network
Wenwei Luo, Wanguang Yin, Quanying Liu, Youzhi Qu
Pub Date: 2023-10-02  DOI: 10.1080/27706710.2023.2258938
Brain-Apparatus Communication: A Journal of Bacomics
The key to electroencephalography (EEG)-based brain-computer interfaces (BCIs) lies in neural decoding, and decoding accuracy can be improved with hybrid BCI paradigms, that is, by fusing multiple paradigms. However, hybrid BCIs usually require a separate processing pipeline for the EEG signals of each paradigm, which greatly reduces the efficiency of EEG feature extraction and the generalizability of the model. Here, we propose a hybrid brain-computer interface based on a two-stream convolutional neural network (TSCNN) that combines the steady-state visual evoked potential (SSVEP) and motor imagery (MI) paradigms. TSCNN automatically learns to extract EEG features for both paradigms during training and, on the test data, improves decoding accuracy by 25.4% over the MI mode and by 2.6% over the SSVEP mode. Moreover, the versatility of TSCNN is verified by its solid performance in both single-mode (70.2% for MI, 93.0% for SSVEP) and hybrid-mode scenarios (95.6% for MI-SSVEP). Our work will facilitate real-world applications of EEG-based BCI systems.
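For readers who want a concrete picture of what a two-stream network for hybrid MI + SSVEP decoding can look like, the sketch below is a minimal, generic PyTorch example. It is not the authors' TSCNN: the channel count, window length, kernel sizes, pooling and fusion layer are assumptions chosen only for illustration of the "two parallel feature streams, one shared classifier" idea.

```python
# Illustrative sketch only: a generic two-stream CNN for hybrid MI + SSVEP EEG
# decoding, NOT the TSCNN from the paper. All layer sizes are assumptions.
import torch
import torch.nn as nn

class EEGStream(nn.Module):
    """One convolutional stream: a temporal convolution followed by a spatial
    (across-channel) convolution, a common pattern in EEG CNNs."""
    def __init__(self, n_channels=32, n_filters=16, kernel_len=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, n_filters, kernel_size=(1, kernel_len), padding=(0, kernel_len // 2)),
            nn.BatchNorm2d(n_filters),
            nn.ELU(),
            nn.Conv2d(n_filters, n_filters, kernel_size=(n_channels, 1)),  # spatial filtering
            nn.BatchNorm2d(n_filters),
            nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 8)),
            nn.Flatten(),
        )

    def forward(self, x):  # x: (batch, 1, channels, time)
        return self.net(x)

class TwoStreamEEGNet(nn.Module):
    """Two parallel streams (e.g. one tuned to MI, one to SSVEP) whose features
    are concatenated before a shared linear classifier."""
    def __init__(self, n_channels=32, n_times=512, n_classes=4):
        super().__init__()
        self.mi_stream = EEGStream(n_channels)
        self.ssvep_stream = EEGStream(n_channels)
        feat_dim = self._feature_dim(n_channels, n_times)
        self.classifier = nn.Linear(2 * feat_dim, n_classes)

    def _feature_dim(self, n_channels, n_times):
        # Run a dummy trial through one stream to infer the flattened feature size.
        with torch.no_grad():
            dummy = torch.zeros(1, 1, n_channels, n_times)
            return self.mi_stream(dummy).shape[1]

    def forward(self, x):
        f = torch.cat([self.mi_stream(x), self.ssvep_stream(x)], dim=1)
        return self.classifier(f)

# Example forward pass on a random batch of 8 trials (32 channels, 512 samples).
model = TwoStreamEEGNet()
logits = model(torch.randn(8, 1, 32, 512))
print(logits.shape)  # torch.Size([8, 4])
```

The same network handles single-mode and hybrid-mode input, since both streams always see the full trial; only the information present in the signal differs.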
{"title":"A Hybrid Brain-Computer Interface Using Motor Imagery and SSVEP Based on Convolutional Neural Network","authors":"Wenwei Luo, Wanguang Yin, Quanying Liu, Youzhi Qu","doi":"10.1080/27706710.2023.2258938","DOIUrl":"https://doi.org/10.1080/27706710.2023.2258938","url":null,"abstract":"The key to electroencephalography (EEG)-based brain-computer interface (BCI) lies in neural decoding, and its accuracy can be improved by using hybrid BCI paradigms, that is, fusing multiple paradigms. However, hybrid BCIs usually require separate processing processes for EEG signals in each paradigm, which greatly reduces the efficiency of EEG feature extraction and the generalizability of the model. Here, we propose a two-stream convolutional neural network (TSCNN) based hybrid brain-computer interface. It combines steady-state visual evoked potential (SSVEP) and motor imagery (MI) paradigms. TSCNN automatically learns to extract EEG features in the two paradigms in the training process, and improves the decoding accuracy by 25.4% compared with the MI mode, and 2.6% compared with SSVEP mode in the test data. Moreover, the versatility of TSCNN is verified as it provides considerable performance in both single-mode (70.2% for MI, 93.0% for SSVEP) and hybrid-mode scenarios (95.6% for MI-SSVEP hybrid). Our work will facilitate the real-world applications of EEG-based BCI systems.","PeriodicalId":497306,"journal":{"name":"Brain-Apparatus Communication A Journal of Bacomics","volume":"231 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135790338","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A novel brain inception neural network model using EEG graphic structure for emotion recognition
Weijie Huang, Xiaohui Gao, Guanyi Zhao, Yumeng Han, Jiangyu Han, Hao Tang, Zhengyu Wang, Cunbo Li, Yin Tian, Peiyang Li
Pub Date: 2023-06-16  DOI: 10.1080/27706710.2023.2222159
Brain-Apparatus Communication: A Journal of Bacomics
Purpose: EEG-based analysis of emotion is of great significance for the diagnosis of psychological disorders and for brain-computer interface (BCI) applications. However, applications of EEG brain networks to emotion classification are rarely reported, and the accuracy of emotion recognition in cross-subject tasks remains a challenge. This paper therefore designs a domain-invariant model for EEG-network-based emotion identification. Methods: A novel brain-inception-network-based deep learning model is proposed to extract discriminative graph features from EEG brain networks. To verify its efficiency, we compared the proposed method with several commonly used methods and with three types of brain networks. In addition, we compared the performance of the EEG brain network against the EEG energy distribution for emotion recognition. Results: One public EEG-based emotion dataset (SEED) was used, and the classification accuracy of leave-one-subject-out cross-validation was adopted as the comparison metric. The results show that the proposed method outperforms the other methods considered in this paper. Conclusion: The proposed method captures discriminative structural features from the EEG network, which improves the emotion classification performance of brain neural networks.
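The leave-one-subject-out (LOSO) cross-validation protocol mentioned in the abstract can be summarized with the minimal sketch below. It uses scikit-learn's LeaveOneGroupOut on synthetic data with a simple SVM stand-in; the subject/trial/feature counts and the classifier are placeholders, not the paper's brain-inception graph model.

```python
# Illustrative sketch of leave-one-subject-out (LOSO) cross-subject evaluation.
# The data and the SVM classifier are placeholders; the paper's own model is a
# brain-inception graph network, which is not reproduced here.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_subjects, trials_per_subject, n_features = 15, 60, 128          # assumed sizes
X = rng.standard_normal((n_subjects * trials_per_subject, n_features))
y = rng.integers(0, 3, size=n_subjects * trials_per_subject)      # 3 emotion classes (as in SEED)
groups = np.repeat(np.arange(n_subjects), trials_per_subject)     # subject ID for each trial

logo = LeaveOneGroupOut()
accs = []
for train_idx, test_idx in logo.split(X, y, groups):
    # Train on all subjects except one, test on the held-out subject.
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    clf.fit(X[train_idx], y[train_idx])
    accs.append(clf.score(X[test_idx], y[test_idx]))

print(f"LOSO mean accuracy: {np.mean(accs):.3f} +/- {np.std(accs):.3f}")
```

Because every test fold contains only trials from a subject never seen in training, LOSO accuracy directly measures the cross-subject generalization that the paper targets.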
{"title":"A novel brain inception neural network model using EEG graphic structure for emotion recognition","authors":"Weijie Huang, Xiaohui Gao, Guanyi Zhao, Yumeng Han, Jiangyu Han, Hao Tang, Zhengyu Wang, Cunbo Li, Yin Tian, Peiyang Li","doi":"10.1080/27706710.2023.2222159","DOIUrl":"https://doi.org/10.1080/27706710.2023.2222159","url":null,"abstract":"Purpose EEG analysis of emotions is greatly significant for the diagnosis of psychological diseases and brain-computer interface (BCI) applications. However, the applications of EEG brain neural network for emotion classification are rarely reported and the accuracy of emotion recognition for cross-subject tasks remains a challenge. Thus, this paper proposes to design a domain invariant model for EEG-network based emotion identification.Methods A novel brain-inception-network based deep learning model is proposed to extract discriminative graph features from EEG brain networks. To verify its efficiency, we compared our proposed method with some commonly used methods and three types of brain networks. In addition, we also compared the performance difference between the EEG brain network and EEG energy distribution for emotion recognition.Result One public EEG-based emotion dataset (SEED) was utilized in this paper, and the classification accuracy of leave-one-subject-out cross-validation was adopted as the comparison index. The classification results show that the performance of the proposed method is superior to those of the other methods mentioned in this paper.Conclusion The proposed method can capture discriminative structural features from the EEG network, which improves the emotion classification performance of brain neural networks.","PeriodicalId":497306,"journal":{"name":"Brain-Apparatus Communication A Journal of Bacomics","volume":"31 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-06-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"135525267","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}