HNSleepNet: A Novel Hybrid Neural Network for Home Health-Care Automatic Sleep Staging with Raw Single-Channel EEG

Weiwei Chen, Yun Yang, Po Yang
DOI: 10.1109/INDIN45582.2020.9442118
Venue: 2020 IEEE 18th International Conference on Industrial Informatics (INDIN)
Published: 2020-07-20
URL: https://doi.org/10.1109/INDIN45582.2020.9442118
Citations: 1

Abstract

Proper scoring of sleep stages can offer more intuitive clinical information for assessing sleep health and improving the diagnosis of sleep disorders in smart home health care. It usually depends on accurate analysis of the collected physiological signals, especially the raw sleep electroencephalogram (EEG). Most currently available methods rely on pre-processing or handcrafted features that require prior knowledge and preliminary analysis from sleep experts, and only a few take full advantage of temporal information such as the inter-epoch dependency or the transition rules among stages, which is more effective for distinguishing the sleep stages. To address this, we propose a novel hybrid neural network named HNSleepNet. It uses a two-branch CNN with multi-scale convolution kernels to automatically capture time-invariant features from adjacent sleep EEG epochs in both the time and frequency domains, and attention-based residual encoder-decoder LSTM layers to learn the inter-epoch dependency and transition rules at the sequence-wise level. After a two-step training procedure, HNSleepNet performs sequence-to-sequence automatic sleep staging on a raw single-channel EEG in an end-to-end way. As the experimental results demonstrate, it achieves better overall accuracy and macro F1-score (MASS: 88%, 0.85; Sleep-EDF: 87%-80%, 0.79-0.74) than state-of-the-art approaches on various single channels (F4-EOG (Left), Fpz-Cz, and Pz-Oz) in two public datasets with different scoring standards (AASM and R&K). We hope this progress proves clinically valuable in promoting home sleep studies on various home health-care devices.
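The multi-scale idea behind the two-branch CNN can be illustrated with a minimal sketch: two 1D convolution branches with different kernel sizes process the same raw EEG epoch, so that a short kernel emphasizes fine temporal detail while a long kernel captures coarser, frequency-like structure. This is not the paper's implementation; the kernel sizes, strides, and random weights below are hypothetical stand-ins for the learned filters of the real network.

```python
import numpy as np

def conv1d(signal, kernel, stride):
    """Valid-mode strided 1D convolution over a 1D signal."""
    out_len = (len(signal) - len(kernel)) // stride + 1
    return np.array([np.dot(signal[i * stride:i * stride + len(kernel)], kernel)
                     for i in range(out_len)])

def two_branch_features(epoch, rng):
    """Extract features from one 30-s EEG epoch at two kernel scales.

    The small kernel (short receptive field) targets fine temporal
    detail; the large kernel (long receptive field) approximates
    coarser, frequency-like structure. Weights are random stand-ins
    for learned filters.
    """
    small = rng.standard_normal(50)    # hypothetical fine-scale kernel
    large = rng.standard_normal(400)   # hypothetical coarse-scale kernel
    fine = conv1d(epoch, small, stride=6)
    coarse = conv1d(epoch, large, stride=50)
    # The real network would pool and stack many such filter outputs;
    # here we simply concatenate the two branches into one feature vector.
    return np.concatenate([fine, coarse])

rng = np.random.default_rng(0)
epoch = rng.standard_normal(3000)      # 30 s at a 100 Hz sampling rate
feats = two_branch_features(epoch, rng)
print(feats.shape)                     # (492 + 53,) = (545,)
```

In the full model, feature vectors from adjacent epochs would then feed the attention-based residual encoder-decoder LSTM layers, which classify a whole sequence of epochs at once rather than each epoch in isolation.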