ABR-Attention: An Attention-Based Model for Precisely Localizing Auditory Brainstem Response

IEEE Transactions on Neural Systems and Rehabilitation Engineering · IF 4.8 · CAS Region 2 (Medicine) · JCR Q2 (Engineering, Biomedical) · Pub Date: 2024-08-19 · DOI: 10.1109/TNSRE.2024.3445936
Junyu Ji;Xin Wang;Xiaobei Jing;Mingxing Zhu;Hongguang Pan;Desheng Jia;Chunrui Zhao;Xu Yong;Yangjie Xu;Guoru Zhao;Poly Z.H. Sun;Guanglin Li;Shixiong Chen
Volume 32, pages 3179-3188. Full text: https://ieeexplore.ieee.org/document/10639446/

Abstract

The Auditory Brainstem Response (ABR) is an evoked potential generated by the brainstem's neural centers in response to sound stimuli. Clinically, characteristic waves extracted from the ABR, especially the Wave V latency, can objectively indicate hearing loss and support disease diagnosis. Several methods have been developed to extract these characteristic waves, but most are time-consuming and rely on substantial manual effort from clinicians. Automated extraction methods have been developed to reduce this workload, yet they still have limitations. This study introduces a novel deep learning network for automatic extraction of the Wave V latency, named ABR-Attention. The ABR-Attention model comprises a self-attention module, a first- and second-derivative attention module, and a regressor module. Experiments evaluate accuracy under 10-fold cross-validation, performance at different sound pressure levels (SPLs), the effect of different error scales, and ablations. ABR-Attention proves effective at extracting the Wave V latency of the ABR, with an overall accuracy of 96.76 ± 0.41% at an error scale of 0.1 ms, and provides a new solution for the objective localization of ABR characteristic waves.
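The abstract does not detail the model internals, but two of its ingredients can be illustrated concretely: the first- and second-derivative features that feed the derivative-attention module, and the error-scale accuracy metric (a latency prediction counts as correct if it falls within ±0.1 ms of the annotated Wave V latency). The sketch below is purely illustrative — all function names are hypothetical, and it uses NumPy rather than the authors' deep-learning implementation:

```python
import numpy as np

def derivative_features(waveform, dt_ms):
    """First and second derivatives of a sampled ABR waveform
    (hypothetical stand-ins for the derivative-attention inputs)."""
    d1 = np.gradient(waveform, dt_ms)   # dV/dt, central differences
    d2 = np.gradient(d1, dt_ms)         # d2V/dt2
    return d1, d2

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_peak_estimate(waveform, t_ms, temperature=0.5):
    """Toy attention pooling: softmax the amplitudes into weights and
    return the weighted mean time as a crude latency estimate."""
    w = softmax(waveform / temperature)
    return float(np.sum(w * t_ms))

def accuracy_at_error_scale(pred_ms, true_ms, scale_ms=0.1):
    """Fraction of predictions within ±scale_ms of the annotation,
    matching the paper's notion of accuracy at a 0.1 ms error scale."""
    pred, true = np.asarray(pred_ms), np.asarray(true_ms)
    return float(np.mean(np.abs(pred - true) <= scale_ms))
```

For example, with predicted latencies [5.6, 5.75, 6.0] ms against annotations [5.62, 5.7, 6.2] ms, the first two fall within 0.1 ms and the third does not, so the accuracy at a 0.1 ms error scale is 2/3.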
Source journal metrics
CiteScore: 8.60
Self-citation rate: 8.20%
Articles published per year: 479
Review time: 6-12 weeks
Journal scope: Rehabilitative and neural aspects of biomedical engineering, including functional electrical stimulation, acoustic dynamics, human performance measurement and analysis, nerve stimulation, electromyography, motor control and stimulation, and hardware and software applications for rehabilitation engineering and assistive devices.
Latest articles in this journal:
- Semi-Autonomous Continuous Robotic Arm Control Using an Augmented Reality Brain-Computer Interface
- Low-Intensity Focused Ultrasound Stimulation on Fingertip Can Evoke Fine Tactile Sensations and Different Local Hemodynamic Responses
- The Neural Basis of the Effect of Transcutaneous Auricular Vagus Nerve Stimulation on Emotion Regulation Related Brain Regions: An rs-fMRI Study
- An Asynchronous Training-free SSVEP-BCI Detection Algorithm for Non-Equal Prior Probability Scenarios
- A Swing-Assist Controller for Enhancing Knee Flexion in a Semi-Powered Transfemoral Prosthesis