Temporal attention fusion network with custom loss function for EEG-fNIRS classification.

Chayut Bunterngchit, Jiaxing Wang, Jianqiang Su, Yihan Wang, Shiqi Liu, Zeng-Guang Hou
{"title":"Temporal attention fusion network with custom loss function for EEG-fNIRS classification.","authors":"Chayut Bunterngchit, Jiaxing Wang, Jianqiang Su, Yihan Wang, Shiqi Liu, Zeng-Guang Hou","doi":"10.1088/1741-2552/ad8e86","DOIUrl":null,"url":null,"abstract":"<p><p><i>Objective.</i>Methods that can detect brain activities accurately are crucial owing to the increasing prevalence of neurological disorders. In this context, a combination of electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS) offers a powerful approach to understanding normal and pathological brain functions, thereby overcoming the limitations of each modality, such as susceptibility to artifacts of EEG and limited temporal resolution of fNIRS. However, challenges such as class imbalance and inter-class variability within multisubject data hinder their full potential.<i>Approach.</i>To address this issue, we propose a novel temporal attention fusion network (TAFN) with a custom loss function. The TAFN model incorporates attention mechanisms to its long short-term memory and temporal convolutional layers to accurately capture spatial and temporal dependencies in the EEG-fNIRS data. The custom loss function combines class weights and asymmetric loss terms to ensure the precise classification of cognitive and motor intentions, along with addressing class imbalance issues.<i>Main results.</i>Rigorous testing demonstrated the exceptional cross-subject accuracy of the TAFN, exceeding 99% for cognitive tasks and 97% for motor imagery (MI) tasks. Additionally, the ability of the model to detect subtle differences in epilepsy was analyzed using scalp topography in MI tasks.<i>Significance.</i>This study presents a technique that outperforms traditional methods for detecting high-precision brain activity with subtle differences in the associated patterns. This makes it a promising tool for applications such as epilepsy and seizure detection, in which discerning subtle pattern differences is of paramount importance.</p>","PeriodicalId":94096,"journal":{"name":"Journal of neural engineering","volume":" ","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-11-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of neural engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1088/1741-2552/ad8e86","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Objective. Methods that can detect brain activity accurately are crucial given the increasing prevalence of neurological disorders. In this context, combining electroencephalography (EEG) with functional near-infrared spectroscopy (fNIRS) offers a powerful approach to understanding normal and pathological brain function, overcoming the limitations of each modality, such as the susceptibility of EEG to artifacts and the limited temporal resolution of fNIRS. However, challenges such as class imbalance and inter-class variability within multi-subject data hinder the full potential of these modalities.

Approach. To address these issues, we propose a novel temporal attention fusion network (TAFN) with a custom loss function. The TAFN model incorporates attention mechanisms into its long short-term memory and temporal convolutional layers to accurately capture the spatial and temporal dependencies in EEG-fNIRS data. The custom loss function combines class weights with asymmetric loss terms to classify cognitive and motor intentions precisely while addressing class imbalance.

Main results. Rigorous testing demonstrated exceptional cross-subject accuracy for the TAFN, exceeding 99% for cognitive tasks and 97% for motor imagery (MI) tasks. In addition, the model's ability to detect subtle differences relevant to epilepsy was analyzed using scalp topography in the MI tasks.

Significance. This study presents a technique that outperforms traditional methods at detecting brain activity with high precision when the associated patterns differ only subtly. This makes it a promising tool for applications such as epilepsy and seizure detection, where discerning subtle pattern differences is of paramount importance.
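To make the dual-modality architecture described in the Approach concrete, the following is a minimal PyTorch sketch of an attention-augmented two-branch EEG-fNIRS classifier: an LSTM branch for EEG, a temporal-convolution branch for fNIRS, temporal attention pooling over each branch, and late fusion by concatenation. This is an illustration only; the layer sizes, the single-score attention pooling (`TemporalAttentionPool`), and the fusion scheme are assumptions and may differ from the authors' exact TAFN design.

```python
# Hypothetical sketch of a TAFN-style dual-branch EEG-fNIRS classifier.
# All hyperparameters and the fusion scheme are illustrative assumptions.
import torch
import torch.nn as nn


class TemporalAttentionPool(nn.Module):
    """Collapse a (batch, time, features) sequence into one vector by attending over time."""
    def __init__(self, dim):
        super().__init__()
        self.score = nn.Linear(dim, 1)

    def forward(self, x):                        # x: (B, T, D)
        weights = self.score(x).softmax(dim=1)   # (B, T, 1) attention over the time axis
        return (weights * x).sum(dim=1)          # (B, D)


class DualBranchEEGfNIRSNet(nn.Module):
    def __init__(self, eeg_channels, fnirs_channels, hidden=64, num_classes=2):
        super().__init__()
        # EEG branch: LSTM captures fast temporal dynamics.
        self.eeg_lstm = nn.LSTM(eeg_channels, hidden, batch_first=True)
        # fNIRS branch: temporal convolutions capture slow hemodynamic trends.
        self.fnirs_tcn = nn.Sequential(
            nn.Conv1d(fnirs_channels, hidden, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=5, padding=4, dilation=2),
            nn.ReLU(),
        )
        self.eeg_attn = TemporalAttentionPool(hidden)
        self.fnirs_attn = TemporalAttentionPool(hidden)
        self.classifier = nn.Linear(2 * hidden, num_classes)

    def forward(self, eeg, fnirs):
        # eeg: (B, T_eeg, C_eeg), fnirs: (B, T_fnirs, C_fnirs)
        eeg_seq, _ = self.eeg_lstm(eeg)                    # (B, T_eeg, H)
        fnirs_seq = self.fnirs_tcn(fnirs.transpose(1, 2))  # (B, H, T_fnirs)
        fnirs_seq = fnirs_seq.transpose(1, 2)              # (B, T_fnirs, H)
        fused = torch.cat([self.eeg_attn(eeg_seq),
                           self.fnirs_attn(fnirs_seq)], dim=-1)
        return self.classifier(fused)                      # (B, num_classes)
```

Because the two modalities are sampled at different rates, each branch attends over its own time axis before fusion, so the sequences never need to be aligned sample-by-sample.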
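The custom loss is described as combining class weights with asymmetric loss terms. Below is a hedged sketch of one such loss in PyTorch: a class-weighted cross-entropy with focal-style asymmetric focusing, where easy negatives are down-weighted more aggressively than positives. The focusing exponents `gamma_pos`/`gamma_neg` and the class-weight scheme are assumptions for illustration, not the paper's exact formulation.

```python
# Hypothetical class-weighted asymmetric loss; formulation details are assumptions.
import torch
import torch.nn.functional as F


class WeightedAsymmetricLoss(torch.nn.Module):
    def __init__(self, class_weights, gamma_pos=0.0, gamma_neg=2.0, eps=1e-8):
        super().__init__()
        # Per-class weights counteract class imbalance (e.g. inverse class frequency).
        self.register_buffer("class_weights",
                             torch.as_tensor(class_weights, dtype=torch.float))
        self.gamma_pos = gamma_pos  # focusing exponent for the target class
        self.gamma_neg = gamma_neg  # stronger focusing for non-target classes
        self.eps = eps

    def forward(self, logits, targets):
        # logits: (batch, num_classes); targets: (batch,) integer labels
        probs = logits.softmax(dim=-1)
        one_hot = F.one_hot(targets, num_classes=logits.size(-1)).float()

        # Asymmetric focal terms: well-classified negatives contribute little,
        # so the gradient concentrates on the harder, minority-class examples.
        pos_term = one_hot * (1 - probs).pow(self.gamma_pos) \
            * torch.log(probs.clamp(min=self.eps))
        neg_term = (1 - one_hot) * probs.pow(self.gamma_neg) \
            * torch.log((1 - probs).clamp(min=self.eps))

        loss = -(pos_term + neg_term) * self.class_weights  # broadcast over classes
        return loss.sum(dim=-1).mean()
```

In practice, `class_weights` could be set to the inverse class frequencies of the training split, so that minority classes contribute proportionally more to the gradient, which is one plausible reading of how the abstract's "class weights" address imbalance.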
