Benchmarking brain-computer interface algorithms: Riemannian approaches vs convolutional neural networks.

Manuel Eder, Jiachen Xu, Moritz Grosse-Wentrup
{"title":"Benchmarking brain-computer interface algorithms: Riemannian approaches vs convolutional neural networks.","authors":"Manuel Eder, Jiachen Xu, Moritz Grosse-Wentrup","doi":"10.1088/1741-2552/ad6793","DOIUrl":null,"url":null,"abstract":"<p><p><i>Objective.</i>To date, a comprehensive comparison of Riemannian decoding methods with deep convolutional neural networks for EEG-based brain-computer interfaces remains absent from published work. We address this research gap by using MOABB, The Mother Of All BCI Benchmarks, to compare novel convolutional neural networks to state-of-the-art Riemannian approaches across a broad range of EEG datasets, including motor imagery, P300, and steady-state visual evoked potentials paradigms.<i>Approach.</i>We systematically evaluated the performance of convolutional neural networks, specifically EEGNet, shallow ConvNet, and deep ConvNet, against well-established Riemannian decoding methods using MOABB processing pipelines. This evaluation included within-session, cross-session, and cross-subject methods, to provide a practical analysis of model effectiveness and to find an overall solution that performs well across different experimental settings.<i>Main results.</i>We find no significant differences in decoding performance between convolutional neural networks and Riemannian methods for within-session, cross-session, and cross-subject analyses.<i>Significance.</i>The results show that, when using traditional Brain-Computer Interface paradigms, the choice between CNNs and Riemannian methods may not heavily impact decoding performances in many experimental settings. 
These findings provide researchers with flexibility in choosing decoding approaches based on factors such as ease of implementation, computational efficiency or individual preferences.</p>","PeriodicalId":94096,"journal":{"name":"Journal of neural engineering","volume":" ","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-08-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of neural engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1088/1741-2552/ad6793","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Objective. To date, a comprehensive comparison of Riemannian decoding methods with deep convolutional neural networks for EEG-based brain-computer interfaces has been absent from the published literature. We address this gap by using MOABB, the Mother Of All BCI Benchmarks, to compare convolutional neural networks with state-of-the-art Riemannian approaches across a broad range of EEG datasets covering the motor imagery, P300, and steady-state visual evoked potential paradigms.

Approach. We systematically evaluated the performance of convolutional neural networks (specifically EEGNet, shallow ConvNet, and deep ConvNet) against well-established Riemannian decoding methods using MOABB processing pipelines. The evaluation covered within-session, cross-session, and cross-subject settings, providing a practical analysis of model effectiveness and identifying an overall solution that performs well across different experimental conditions.

Main results. We find no significant differences in decoding performance between convolutional neural networks and Riemannian methods in within-session, cross-session, or cross-subject analyses.

Significance. The results show that, for traditional brain-computer interface paradigms, the choice between CNNs and Riemannian methods may not strongly affect decoding performance in many experimental settings. These findings give researchers the flexibility to choose a decoding approach based on factors such as ease of implementation, computational efficiency, or individual preference.
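The Riemannian pipelines benchmarked here typically estimate a covariance matrix per EEG trial and project it into the tangent space of the manifold of symmetric positive-definite matrices, where an ordinary linear classifier can be applied. The following is a minimal, self-contained sketch of that feature-extraction step using NumPy only; note that production pipelines (e.g. pyriemann) use the Riemannian geometric mean as the reference point, whereas this sketch uses the arithmetic mean for simplicity, and all data here are synthetic.

```python
import numpy as np

def spd_logm(C):
    # Matrix logarithm of a symmetric positive-definite matrix via eigendecomposition.
    w, V = np.linalg.eigh(C)
    return V @ np.diag(np.log(w)) @ V.T

def invsqrtm(C):
    # Inverse matrix square root of a symmetric positive-definite matrix.
    w, V = np.linalg.eigh(C)
    return V @ np.diag(w ** -0.5) @ V.T

def tangent_space(covs, C_ref):
    # Project SPD covariance matrices to the tangent space at the reference point C_ref.
    P = invsqrtm(C_ref)
    feats = []
    for C in covs:
        S = spd_logm(P @ C @ P)  # whiten by the reference, then take the log map
        # Vectorize the upper triangle; off-diagonal terms are scaled by sqrt(2)
        # so that the Euclidean norm of the vector matches the matrix norm.
        idx = np.triu_indices_from(S)
        scale = np.where(idx[0] == idx[1], 1.0, np.sqrt(2.0))
        feats.append(scale * S[idx])
    return np.array(feats)

# Synthetic EEG-like trials: (n_trials, n_channels, n_samples).
rng = np.random.default_rng(0)
X = rng.standard_normal((10, 4, 128))
covs = np.array([x @ x.T / x.shape[1] for x in X])  # sample covariance per trial
C_ref = covs.mean(axis=0)  # arithmetic-mean reference (geometric mean in practice)
F = tangent_space(covs, C_ref)
print(F.shape)  # (10, 10): n_trials x n_channels*(n_channels+1)/2
```

The resulting feature matrix F would then be fed to a standard classifier such as logistic regression, which is the usual final stage of a tangent-space pipeline.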

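The cross-subject setting described in the abstract trains a decoder on data from some subjects and tests it on held-out subjects. A minimal illustration of such a leave-one-subject-out split is sketched below with hypothetical subject labels; in practice MOABB's evaluation classes manage these splits over real datasets.

```python
import numpy as np

def leave_one_subject_out(subject_ids):
    """Yield (held-out subject, train indices, test indices) for cross-subject evaluation."""
    subject_ids = np.asarray(subject_ids)
    for s in np.unique(subject_ids):
        test = np.flatnonzero(subject_ids == s)   # all trials from the held-out subject
        train = np.flatnonzero(subject_ids != s)  # trials from every other subject
        yield s, train, test

# Hypothetical trial-to-subject assignment: 3 subjects, 4 trials each.
subjects = np.repeat([1, 2, 3], 4)
splits = list(leave_one_subject_out(subjects))
print(len(splits))  # 3 folds, one per held-out subject
```

Each fold fits the full pipeline (CNN or Riemannian) on the training indices and scores it on the held-out subject, so the reported performance reflects generalization to unseen individuals.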