Motion Cognitive Decoding of Cross-Subject Motor Imagery Guided on Different Visual Stimulus Materials.

Journal of Integrative Neuroscience · Impact Factor 2.7 · JCR Q3 (Neurosciences) · CAS Region 4 (Medicine) · Published 2024-12-19 · DOI: 10.31083/j.jin2312218
Tian-Jian Luo, Jing Li, Rui Li, Xiang Zhang, Shen-Rui Wu, Hua Peng
Journal of Integrative Neuroscience, 23(12): 218 (2024). Citations: 0.

Abstract

Background: Motor imagery (MI) plays an important role in brain-computer interfaces, especially in evoking event-related desynchronization and synchronization (ERD/S) rhythms in electroencephalogram (EEG) signals. However, the way a given subject performs an MI task is subjective, making it difficult to determine how an individual actually carries out the task and resulting in substantial inter-individual variation in EEG responses during motion cognitive decoding.

Methods: To explore this issue, we designed three visual stimuli (arrow, human, and robot), each used to present three MI tasks (left arm, right arm, and feet), and evaluated differences in brain responses in terms of ERD/S rhythms. To compare subject-specific variation across the three visual stimuli, a novel cross-subject MI-EEG classification method was proposed. The method applies covariance matrix centroid alignment to preprocess the EEG samples, followed by model-agnostic meta-learning for cross-subject MI-EEG classification.
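
The covariance-centroid alignment step described above can be sketched as follows. This is a minimal illustration in the spirit of Euclidean-style alignment, whitening each subject's trials by the inverse square root of their mean (centroid) spatial covariance so that the average covariance becomes the identity across subjects; the function name, array shapes, and eigendecomposition route are assumptions, not the authors' exact implementation.

```python
import numpy as np

def centroid_align(trials):
    """Whiten a subject's EEG trials with the inverse square root of the
    mean (centroid) covariance matrix. A sketch of covariance centroid
    alignment, assuming trials has shape (n_trials, n_channels, n_samples).
    """
    # Per-trial spatial covariance, averaged into the subject's centroid
    covs = np.array([x @ x.T / x.shape[1] for x in trials])
    R = covs.mean(axis=0)
    # Inverse matrix square root via eigendecomposition (R is symmetric PD)
    vals, vecs = np.linalg.eigh(R)
    R_inv_sqrt = vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T
    return np.array([R_inv_sqrt @ x for x in trials])
```

After this step the centroid covariance of every subject equals the identity matrix, which reduces the subject-specific covariance shift before the cross-subject classifier is trained.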

Results and conclusion: The experimental results showed that the robot stimulus material outperformed the arrow and human stimulus materials, with an optimal cross-subject motion cognitive decoding accuracy of 79.04%. Moreover, the proposed method yielded robust cross-subject MI-EEG decoding, outperforming conventional methods on the collected EEG signals.
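
The model-agnostic meta-learning stage named in the Methods can be illustrated with a minimal first-order MAML-style update on a logistic-regression learner, where each "task" stands for one subject's support/query split. This is a hedged sketch: the learner, task structure, and hyperparameters are assumptions for illustration, not the paper's network or code.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def grad_logreg(w, X, y):
    # Gradient of the mean logistic loss for weights w on data (X, y)
    return X.T @ (sigmoid(X @ w) - y) / len(y)

def fomaml_step(w, tasks, inner_lr=0.1, outer_lr=0.05, inner_steps=3):
    """One first-order MAML meta-update: adapt to each task's support set,
    then update the shared weights with the gradient measured on the query
    set at the adapted parameters. Each task is assumed to be a
    ((X_sup, y_sup), (X_qry, y_qry)) pair of feature/label arrays.
    """
    meta_grad = np.zeros_like(w)
    for (X_s, y_s), (X_q, y_q) in tasks:
        w_i = w.copy()
        for _ in range(inner_steps):              # inner-loop adaptation
            w_i -= inner_lr * grad_logreg(w_i, X_s, y_s)
        meta_grad += grad_logreg(w_i, X_q, y_q)   # query-set gradient
    return w - outer_lr * meta_grad / len(tasks)
```

At test time, the meta-trained weights are adapted with a few inner-loop steps on a new subject's support trials before classifying that subject's remaining trials, which is what makes the scheme cross-subject.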

Journal metrics: CiteScore 2.80 · Self-citation rate 5.60% · Articles per year: 173 · Review time: 2 months
About the journal: JIN is an international peer-reviewed, open access journal. JIN publishes leading-edge research at the interface of theoretical and experimental neuroscience, focusing across hierarchical levels of brain organization to better understand how diverse functions are integrated. We encourage submissions from scientists of all specialties that relate to brain functioning.
Latest articles in this journal:
- Appropriately Designed Studies are Needed to Assess Whether Transcranial Direct Current Stimulation Actually Improves Gait Performance in Older People. Response to "Considerations for Transcranial Direct Current Stimulation to Improve Gait Performance in the Elderly".
- Integrating Multimodal Neuroimaging and Physical-Health Markers for Autism Spectrum Disorder in the ABCD Study.
- Physiological Regulatory Mechanisms of Neurovascular Coupling and the Role of Its Dysfunction in Neurological Diseases.
- Advances and Challenges in the Use of Spinal Cord Organoids in ALS.