ST-USleepNet: A Spatial-Temporal Coupling Prominence Network for Multi-Channel Sleep Staging

Jingying Ma, Qika Lin, Ziyu Jia, Mengling Feng
{"title":"ST-USleepNet:用于多通道睡眠分期的时空耦合突出网络","authors":"Jingying Ma, Qika Lin, Ziyu Jia, Mengling Feng","doi":"arxiv-2408.11884","DOIUrl":null,"url":null,"abstract":"Sleep staging is critical for assessing sleep quality and diagnosing\ndisorders. Recent advancements in artificial intelligence have driven the\ndevelopment of automated sleep staging models, which still face two significant\nchallenges. 1) Simultaneously extracting prominent temporal and spatial sleep\nfeatures from multi-channel raw signals, including characteristic sleep\nwaveforms and salient spatial brain networks. 2) Capturing the spatial-temporal\ncoupling patterns essential for accurate sleep staging. To address these\nchallenges, we propose a novel framework named ST-USleepNet, comprising a\nspatial-temporal graph construction module (ST) and a U-shaped sleep network\n(USleepNet). The ST module converts raw signals into a spatial-temporal graph\nto model spatial-temporal couplings. The USleepNet utilizes a U-shaped\nstructure originally designed for image segmentation. Similar to how image\nsegmentation isolates significant targets, when applied to both raw sleep\nsignals and ST module-generated graph data, USleepNet segments these inputs to\nextract prominent temporal and spatial sleep features simultaneously. Testing\non three datasets demonstrates that ST-USleepNet outperforms existing\nbaselines, and model visualizations confirm its efficacy in extracting\nprominent sleep features and temporal-spatial coupling patterns across various\nsleep stages. The code is available at:\nhttps://github.com/Majy-Yuji/ST-USleepNet.git.","PeriodicalId":501517,"journal":{"name":"arXiv - QuanBio - Neurons and Cognition","volume":"46 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-08-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"ST-USleepNet: A Spatial-Temporal Coupling Prominence Network for Multi-Channel Sleep Staging\",\"authors\":\"Jingying Ma, Qika Lin, Ziyu Jia, Mengling Feng\",\"doi\":\"arxiv-2408.11884\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Sleep staging is critical for assessing sleep quality and diagnosing\\ndisorders. Recent advancements in artificial intelligence have driven the\\ndevelopment of automated sleep staging models, which still face two significant\\nchallenges. 1) Simultaneously extracting prominent temporal and spatial sleep\\nfeatures from multi-channel raw signals, including characteristic sleep\\nwaveforms and salient spatial brain networks. 2) Capturing the spatial-temporal\\ncoupling patterns essential for accurate sleep staging. To address these\\nchallenges, we propose a novel framework named ST-USleepNet, comprising a\\nspatial-temporal graph construction module (ST) and a U-shaped sleep network\\n(USleepNet). The ST module converts raw signals into a spatial-temporal graph\\nto model spatial-temporal couplings. The USleepNet utilizes a U-shaped\\nstructure originally designed for image segmentation. Similar to how image\\nsegmentation isolates significant targets, when applied to both raw sleep\\nsignals and ST module-generated graph data, USleepNet segments these inputs to\\nextract prominent temporal and spatial sleep features simultaneously. 
Testing\\non three datasets demonstrates that ST-USleepNet outperforms existing\\nbaselines, and model visualizations confirm its efficacy in extracting\\nprominent sleep features and temporal-spatial coupling patterns across various\\nsleep stages. The code is available at:\\nhttps://github.com/Majy-Yuji/ST-USleepNet.git.\",\"PeriodicalId\":501517,\"journal\":{\"name\":\"arXiv - QuanBio - Neurons and Cognition\",\"volume\":\"46 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-08-21\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - QuanBio - Neurons and Cognition\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2408.11884\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - QuanBio - Neurons and Cognition","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2408.11884","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

Sleep staging is critical for assessing sleep quality and diagnosing disorders. Recent advancements in artificial intelligence have driven the development of automated sleep staging models, which still face two significant challenges: 1) simultaneously extracting prominent temporal and spatial sleep features from multi-channel raw signals, including characteristic sleep waveforms and salient spatial brain networks; and 2) capturing the spatial-temporal coupling patterns essential for accurate sleep staging. To address these challenges, we propose a novel framework named ST-USleepNet, comprising a spatial-temporal graph construction module (ST) and a U-shaped sleep network (USleepNet). The ST module converts raw signals into a spatial-temporal graph to model spatial-temporal couplings. USleepNet utilizes a U-shaped structure originally designed for image segmentation. Similar to how image segmentation isolates significant targets, when applied to both raw sleep signals and ST module-generated graph data, USleepNet segments these inputs to extract prominent temporal and spatial sleep features simultaneously. Testing on three datasets demonstrates that ST-USleepNet outperforms existing baselines, and model visualizations confirm its efficacy in extracting prominent sleep features and spatial-temporal coupling patterns across various sleep stages. The code is available at: https://github.com/Majy-Yuji/ST-USleepNet.git.
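The abstract does not spell out how the ST module assembles the spatial-temporal graph. The following is only a minimal illustrative sketch, assuming that each node is one temporal segment of one channel and that edge weights come from pairwise signal correlation; the function name build_st_graph, the segment count, and the correlation-based adjacency are hypothetical choices for illustration, not taken from the released code.

```python
# Hypothetical sketch: build a spatial-temporal graph from one multi-channel
# sleep epoch. Node = one temporal segment of one channel; edge weight =
# absolute Pearson correlation between segments, so edges span both channels
# (spatial coupling) and segments (temporal coupling). The actual ST module
# in the paper may construct the graph differently.
import numpy as np

def build_st_graph(signals: np.ndarray, n_segments: int = 10):
    """signals: (n_channels, n_samples) raw epoch, e.g. (10, 3000) for 30 s at 100 Hz."""
    n_channels, n_samples = signals.shape
    seg_len = n_samples // n_segments
    # Node features: one row per (channel, segment) pair.
    nodes = signals[:, : seg_len * n_segments].reshape(n_channels, n_segments, seg_len)
    nodes = nodes.reshape(n_channels * n_segments, seg_len)
    # Adjacency: absolute correlation between every pair of nodes.
    adj = np.abs(np.corrcoef(nodes))
    np.fill_diagonal(adj, 0.0)  # no self-loops
    return nodes, adj

# Example: 10 channels, 30-second epoch sampled at 100 Hz.
epoch = np.random.randn(10, 3000)
features, adjacency = build_st_graph(epoch)
print(features.shape, adjacency.shape)  # (100, 300) (100, 100)
```

In the full model, such a graph together with the raw signals would then be passed to the U-shaped USleepNet, which the authors describe as segmenting both inputs, in the manner of image segmentation, to surface prominent temporal and spatial sleep features.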