Filter banks guided correlational convolutional neural network for SSVEPs based BCI classification.

Xin Wen, Shuting Jia, Dan Han, Yanqing Dong, Chengxin Gao, Ruochen Cao, Yanrong Hao, Yuxiang Guo, Rui Cao
{"title":"Filter banks guided correlational convolutional neural network for SSVEPs based BCI classification.","authors":"Xin Wen, Shuting Jia, Dan Han, Yanqing Dong, Chengxin Gao, Ruochen Cao, Yanrong Hao, Yuxiang Guo, Rui Cao","doi":"10.1088/1741-2552/ad7f89","DOIUrl":null,"url":null,"abstract":"<p><p><i>Objective.</i>In the field of steady-state visual evoked potential brain computer interfaces (SSVEP-BCIs) research, convolutional neural networks (CNNs) have gradually been proved to be an effective method. Whereas, majority works apply the frequency domain characteristics in long time window to train the network, thus lead to insufficient performance of those networks in short time window. Furthermore, only the frequency domain information for classification lacks of other task-related information.<i>Approach.</i>To address these issues, we propose a time-frequency domain generalized filter-bank convolutional neural network (FBCNN-G) to improve the SSVEP-BCIs classification performance. The network integrates multiple frequency information of electroencephalogram (EEG) with template and predefined prior of sine-cosine signals to perform feature extraction, which contains correlation analyses in both template and signal aspects. Then the classification is performed at the end of the network. In addition, the method proposes the use of filter banks divided into specific frequency bands as pre-filters in the network to fully consider the fundamental and harmonic frequency characteristics of the signal.<i>Main results.</i>The proposed FBCNN-G model is compared with other methods on the public dataset Benchmark. The results manifest that this model has higher accuracy of character recognition accuracy and information transfer rates in several time windows. Particularly, in the 0.2 s time window, the mean accuracy of the proposed method reaches62.02%±5.12%, indicating its superior performance.<i>Significance.</i>The proposed FBCNN-G model is critical for the exploitation of SSVEP-BCIs character recognition models.</p>","PeriodicalId":94096,"journal":{"name":"Journal of neural engineering","volume":" ","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-10-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of neural engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1088/1741-2552/ad7f89","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Objective. In steady-state visual evoked potential brain-computer interface (SSVEP-BCI) research, convolutional neural networks (CNNs) have gradually proved to be an effective method. However, most works train the network on frequency-domain characteristics extracted from long time windows, which leads to insufficient performance in short time windows. Furthermore, using only frequency-domain information for classification neglects other task-related information. Approach. To address these issues, we propose a time-frequency domain generalized filter-bank convolutional neural network (FBCNN-G) to improve SSVEP-BCI classification performance. The network integrates multi-band frequency information from the electroencephalogram (EEG) with template signals and a predefined sine-cosine prior to perform feature extraction, including correlation analyses with both the templates and the reference signals. Classification is then performed at the end of the network. In addition, the method uses filter banks divided into specific frequency bands as pre-filters in the network, so that the fundamental and harmonic frequency characteristics of the signal are fully considered. Main results. The proposed FBCNN-G model is compared with other methods on the public Benchmark dataset. The results show that the model achieves higher character recognition accuracy and information transfer rates across several time windows. In particular, in the 0.2 s time window, the mean accuracy of the proposed method reaches 62.02% ± 5.12%, indicating its superior performance. Significance. The proposed FBCNN-G model is valuable for the development of SSVEP-BCI character recognition models.
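For readers unfamiliar with the two ingredients named in the Approach section, the sketch below illustrates how band-pass filter-bank pre-filtering and sine-cosine reference signals are commonly constructed in SSVEP pipelines. It is not the authors' FBCNN-G implementation; the band edges, harmonic count, and sampling rate are assumptions for illustration only.

```python
# Illustrative sketch (not the authors' FBCNN-G code): filter-bank pre-filtering
# and sine-cosine reference construction, as commonly used in SSVEP analysis.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250            # sampling rate in Hz (assumed; the Benchmark dataset is sampled at 250 Hz)
N_HARMONICS = 5     # number of harmonics in the sine-cosine references (assumed)

def filter_bank(eeg, bands=((8, 90), (16, 90), (24, 90))):
    """Apply a bank of band-pass pre-filters to EEG of shape (channels, samples).

    Returns an array of shape (n_bands, channels, samples). The band edges are
    assumptions, chosen so that successive sub-bands retain the fundamental and
    higher harmonics of typical SSVEP stimulation frequencies.
    """
    out = []
    for low, high in bands:
        b, a = butter(4, [low / (FS / 2), high / (FS / 2)], btype="bandpass")
        out.append(filtfilt(b, a, eeg, axis=-1))
    return np.stack(out)

def sine_cosine_reference(freq, n_samples, n_harmonics=N_HARMONICS):
    """Build the predefined sine-cosine reference for one stimulation frequency.

    Returns an array of shape (2 * n_harmonics, n_samples): one sine and one
    cosine row per harmonic, as in CCA-style SSVEP correlation analysis.
    """
    t = np.arange(n_samples) / FS
    rows = []
    for k in range(1, n_harmonics + 1):
        rows.append(np.sin(2 * np.pi * k * freq * t))
        rows.append(np.cos(2 * np.pi * k * freq * t))
    return np.stack(rows)

# Example: pre-filter a 0.2 s, 9-channel segment and build an 8 Hz reference.
if __name__ == "__main__":
    segment = np.random.randn(9, int(0.2 * FS))
    sub_bands = filter_bank(segment)                          # (3, 9, 50)
    ref_8hz = sine_cosine_reference(8.0, segment.shape[-1])   # (10, 50)
    print(sub_bands.shape, ref_8hz.shape)
```

In a pipeline of this kind, each sub-band output and each frequency's reference would be passed to the correlation and convolutional stages; the details of those stages in FBCNN-G are described in the full paper.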
