AF Identification From Time–Frequency Analysis of ECG Signal Using Deep Neural Networks

IEEE Sensors Letters (Q3, Engineering, Electrical & Electronic; IF 2.2) · Pub Date: 2024-07-29 · DOI: 10.1109/LSENS.2024.3435009
Thivya Anbalagan, Malaya Kumar Nath
Citations: 0

Abstract

Atrial fibrillation (AF) is a life-threatening cardiac abnormality with high prevalence; without oral anticoagulation it carries an increased risk of stroke and systemic embolism, and ultimately of morbidity and mortality. Detecting AF from electrocardiogram (ECG) recordings is challenging because of its complex characteristics, and manual inspection of the ECG is tedious, time consuming, and error prone. This letter proposes a novel approach for identifying AF in the presence of noise and other beat types by applying deep neural networks (DNNs) to 2-D patterns obtained from 1-D preprocessed ECG recordings through several time–frequency analysis methods: the short-time Fourier transform (STFT), the Chirplet transform, the Stockwell transform, and the Poincaré plot. These representations capture the variations that AF induces in the ECG. First, the patterns are classified by pretrained DNN models. ResNet18 attained the highest accuracy, 90.56%, on Chirplet-transform patterns from the PAF Prediction Challenge database, while ResNet50 achieved 89.72% on the same patterns. Based on accuracy and the number of DNN parameters, an ensembled network is then designed to improve AF classification; ensembling ShuffleNet and AlexNet on Stockwell-transform patterns achieved the highest accuracy of 93.70%. The approach is further evaluated on the PhysioNet/CinC 2017 Challenge database, which contains four classes (AF, normal, other rhythms, and noise).
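The first stage of the pipeline described above, converting a 1-D ECG signal into a 2-D time–frequency pattern, can be sketched with the STFT. This is a minimal illustration, not the paper's actual preprocessing: the synthetic signal, sampling rate, and window parameters below are assumptions chosen only to show the shape of the transformation; a real pipeline would use ECG records from the PAF database and would resize the pattern to the CNN's input size.

```python
import numpy as np
from scipy.signal import stft

# Synthetic "ECG-like" test signal: 10 s at a hypothetical 250 Hz sampling
# rate (the challenge databases use their own rates and real recordings).
fs = 250
t = np.arange(0, 10, 1 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 8.0 * t)

# Short-time Fourier transform -> complex time-frequency matrix.
# nperseg/noverlap are illustrative choices, not the paper's settings.
f, frames, Z = stft(ecg, fs=fs, nperseg=128, noverlap=96)

# Log-magnitude pattern normalized to [0, 1]: a 2-D image that a
# pretrained CNN (ResNet18, ShuffleNet, ...) could take as input.
pattern = np.log1p(np.abs(Z))
pattern = (pattern - pattern.min()) / (pattern.max() - pattern.min())

print(pattern.shape)  # (frequency bins, time frames); 65 bins for nperseg=128
```

The Chirplet and Stockwell transforms named in the abstract would slot into the same place as `stft` here, each producing its own 2-D pattern from the same 1-D input.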
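The abstract reports an ensembled network built from ShuffleNet and AlexNet but does not specify the fusion rule. One common choice for such late fusion, averaging the class probabilities of the member models, is sketched below; the logits, class layout, and the averaging rule itself are assumptions for illustration, not the letter's confirmed design.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical logits from two backbones (e.g. ShuffleNet and AlexNet)
# over two classes [AF, non-AF]; the values are made up for illustration.
logits_a = np.array([[2.0, 0.5]])   # model A leans toward AF
logits_b = np.array([[0.2, 1.0]])   # model B leans toward non-AF

# Late fusion by averaging class probabilities.
probs = 0.5 * (softmax(logits_a) + softmax(logits_b))
pred = int(probs.argmax(axis=-1)[0])
print(pred)  # 0 -> the averaged probabilities favor the AF class
```

Because model A is more confident than model B, the averaged distribution still favors class 0; weighted averaging or majority voting are equally plausible alternatives.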
Journal

IEEE Sensors Letters (Engineering: Electrical and Electronic Engineering)
CiteScore: 3.50
Self-citation rate: 7.10%
Articles per year: 194