
Latest publications in Brain-Apparatus Communication: A Journal of Bacomics

A Hybrid Brain-Computer Interface Using Motor Imagery and SSVEP Based on Convolutional Neural Network
Pub Date: 2023-10-02 | DOI: 10.1080/27706710.2023.2258938
Wenwei Luo, Wanguang Yin, Quanying Liu, Youzhi Qu
The key to electroencephalography (EEG)-based brain-computer interfaces (BCIs) lies in neural decoding, and decoding accuracy can be improved with hybrid BCI paradigms, i.e., by fusing multiple paradigms. However, hybrid BCIs usually require a separate processing pipeline for the EEG signals of each paradigm, which greatly reduces the efficiency of EEG feature extraction and the generalizability of the model. Here, we propose a hybrid brain-computer interface based on a two-stream convolutional neural network (TSCNN) that combines the steady-state visual evoked potential (SSVEP) and motor imagery (MI) paradigms. TSCNN automatically learns to extract EEG features from both paradigms during training, and on the test data it improves decoding accuracy by 25.4% over the MI mode and by 2.6% over the SSVEP mode. Moreover, the versatility of TSCNN is verified by its considerable performance in both single-mode (70.2% for MI, 93.0% for SSVEP) and hybrid-mode scenarios (95.6% for MI-SSVEP). Our work will facilitate real-world applications of EEG-based BCI systems.
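The abstract reports accuracy figures but not the network details. As a rough sketch of the two-stream idea it describes (one convolutional branch per paradigm, with the branch features fused before a shared classifier), the PyTorch snippet below may help; the channel count, window length, kernel sizes, and fusion by concatenation are illustrative assumptions, not the published TSCNN configuration.

```python
# Illustrative sketch only: a generic two-stream CNN for fusing MI and SSVEP EEG
# windows, loosely following the idea described in the abstract. All shapes and
# hyperparameters are assumptions, not the published TSCNN architecture.
import torch
import torch.nn as nn


class EEGBranch(nn.Module):
    """One convolutional stream over a (channels x time) EEG window."""

    def __init__(self, n_channels=32, n_samples=500, feat_dim=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=(1, 25), padding=(0, 12)),  # temporal filtering
            nn.Conv2d(16, 32, kernel_size=(n_channels, 1)),          # spatial filtering
            nn.BatchNorm2d(32),
            nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 10)),
            nn.Flatten(),
        )
        with torch.no_grad():  # infer the flattened size from a dummy window
            n_flat = self.conv(torch.zeros(1, 1, n_channels, n_samples)).shape[1]
        self.proj = nn.Linear(n_flat, feat_dim)

    def forward(self, x):                 # x: (batch, 1, channels, samples)
        return self.proj(self.conv(x))    # -> (batch, feat_dim)


class TwoStreamEEGNet(nn.Module):
    """Fuse an MI stream and an SSVEP stream by concatenation, then classify."""

    def __init__(self, n_classes=4, feat_dim=64, **branch_kwargs):
        super().__init__()
        self.mi_stream = EEGBranch(feat_dim=feat_dim, **branch_kwargs)
        self.ssvep_stream = EEGBranch(feat_dim=feat_dim, **branch_kwargs)
        self.classifier = nn.Linear(2 * feat_dim, n_classes)

    def forward(self, x_mi, x_ssvep):
        fused = torch.cat([self.mi_stream(x_mi), self.ssvep_stream(x_ssvep)], dim=1)
        return self.classifier(fused)


# Example forward pass on random data with the assumed shapes.
model = TwoStreamEEGNet(n_classes=4)
x_mi = torch.randn(8, 1, 32, 500)       # 8 trials, 32 channels, 500 time samples
x_ssvep = torch.randn(8, 1, 32, 500)
logits = model(x_mi, x_ssvep)           # -> (8, 4)
```

Because both branches feed a single classifier and loss, the two paradigms are learned in one training pass, which is the efficiency argument the abstract makes against separate per-paradigm processing pipelines.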
A novel brain inception neural network model using EEG graphic structure for emotion recognition
Pub Date: 2023-06-16 | DOI: 10.1080/27706710.2023.2222159
Weijie Huang, Xiaohui Gao, Guanyi Zhao, Yumeng Han, Jiangyu Han, Hao Tang, Zhengyu Wang, Cunbo Li, Yin Tian, Peiyang Li
Purpose: EEG analysis of emotions is of great significance for the diagnosis of psychological disorders and for brain-computer interface (BCI) applications. However, applications of EEG brain networks to emotion classification are rarely reported, and the accuracy of emotion recognition in cross-subject tasks remains a challenge. This paper therefore proposes a domain-invariant model for EEG-network-based emotion identification.
Methods: A novel brain-inception-network deep learning model is proposed to extract discriminative graph features from EEG brain networks. To verify its efficiency, we compared the proposed method with several commonly used methods and with three types of brain networks. In addition, we compared the performance of the EEG brain network against the EEG energy distribution for emotion recognition.
Results: One public EEG-based emotion dataset (SEED) was used, and the classification accuracy under leave-one-subject-out cross-validation was adopted as the comparison index. The classification results show that the proposed method outperforms the other methods considered in this paper.
Conclusion: The proposed method captures discriminative structural features from the EEG network, which improves the emotion classification performance of brain neural networks.
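The evaluation protocol named in the abstract, leave-one-subject-out (LOSO) cross-validation, can be sketched independently of the model. A minimal scikit-learn version follows; the random features, the three emotion classes, the subject and trial counts, and the logistic-regression stand-in are placeholders, since the paper's brain-inception network and EEG graph features are not reproduced here.

```python
# Minimal sketch of leave-one-subject-out (LOSO) cross-validation, the comparison
# protocol named in the abstract. The random features and the logistic-regression
# stand-in are placeholders for EEG graph features and the brain-inception network.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_subjects, trials_per_subject, n_features = 15, 60, 128   # assumed sizes
X = rng.standard_normal((n_subjects * trials_per_subject, n_features))
y = rng.integers(0, 3, size=len(X))                        # 3 emotion classes, as in SEED
subjects = np.repeat(np.arange(n_subjects), trials_per_subject)

logo = LeaveOneGroupOut()
accuracies = []
for train_idx, test_idx in logo.split(X, y, groups=subjects):
    clf = LogisticRegression(max_iter=1000)        # stand-in for the actual model
    clf.fit(X[train_idx], y[train_idx])            # train on all but one subject
    y_pred = clf.predict(X[test_idx])              # test on the held-out subject
    accuracies.append(accuracy_score(y[test_idx], y_pred))

print(f"LOSO accuracy: {np.mean(accuracies):.3f} ± {np.std(accuracies):.3f}")
```

Grouping the split by subject is what makes the reported accuracy a cross-subject figure: each test fold contains only trials from a subject the classifier never saw during training.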