SANS-CNN: An automated machine learning technique for spaceflight associated neuro-ocular syndrome with astronaut imaging data.

IF 4.4 | JCR Q1, Multidisciplinary Sciences | Region 1, Physics and Astronomy | npj Microgravity | Pub Date: 2024-03-28 | DOI: 10.1038/s41526-024-00364-w
Sharif Amit Kamran, Khondker Fariha Hossain, Joshua Ong, Nasif Zaman, Ethan Waisberg, Phani Paladugu, Andrew G Lee, Alireza Tavakkoli
Citations: 0

Abstract


Spaceflight associated neuro-ocular syndrome (SANS) is one of the largest physiologic barriers to spaceflight and requires evaluation and mitigation for future planetary missions. As the spaceflight environment is a clinically limited environment, the purpose of this research is to provide automated, early detection and prognosis of SANS with a machine learning model trained and validated on astronaut SANS optical coherence tomography (OCT) images. In this study, we present "SANS-CNN," a lightweight convolutional neural network (CNN) that incorporates an EfficientNet encoder for detecting SANS from OCT images. We used 6303 OCT B-scan images for training/validation (80%/20% split) and 945 for testing; both the validation and test sets combined terrestrial images with astronaut SANS images. SANS-CNN was validated with SANS images labeled by NASA to evaluate accuracy, specificity, and sensitivity. To evaluate real-world outcomes, two state-of-the-art pre-trained architectures were also employed on this dataset. We used Grad-CAM to visualize activation maps of intermediate layers and test the interpretability of SANS-CNN's predictions. SANS-CNN achieved 84.2% accuracy on the test set, with 85.6% specificity, 82.8% sensitivity, and an 84.1% F1-score. Moreover, SANS-CNN outperforms the two other state-of-the-art pre-trained architectures, ResNet50-v2 and MobileNet-v2, in accuracy by 21.4% and 13.1%, respectively. We also apply two class-activation-map techniques to visualize the critical SANS features perceived by the model. SANS-CNN is a CNN model trained and validated with real astronaut OCT images, enabling fast and efficient prediction of SANS-like conditions for spaceflight missions beyond Earth's orbit, in which clinical and computational resources are extremely limited.
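The four evaluation metrics reported above can all be derived from a single binary confusion matrix. A minimal sketch in Python (the counts below are hypothetical, chosen only so they sum to a 945-image test set; they are not the study's actual results):

```python
def classification_metrics(tp, fp, tn, fn):
    """Compute accuracy, specificity, sensitivity, and F1 from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    specificity = tn / (tn + fp)           # true-negative rate
    sensitivity = tp / (tp + fn)           # true-positive rate (recall)
    precision = tp / (tp + fp)
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    return accuracy, specificity, sensitivity, f1

# Hypothetical counts, for illustration only:
acc, spec, sens, f1 = classification_metrics(tp=400, fp=70, tn=410, fn=65)
print(round(acc, 3), round(spec, 3), round(sens, 3), round(f1, 3))
# prints: 0.857 0.854 0.86 0.856
```

Note that F1 combines precision and sensitivity, so it can differ noticeably from accuracy when the positive and negative classes are imbalanced, which is why the paper reports all four figures.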

Source journal: npj Microgravity (Physics and Astronomy, miscellaneous)
CiteScore: 7.30
Self-citation rate: 7.80%
Articles per year: 50
Review time: 9 weeks
Journal description: A new open access, online-only, multidisciplinary research journal, npj Microgravity is dedicated to publishing the most important scientific advances in the life sciences, physical sciences, and engineering fields that are facilitated by spaceflight and analogue platforms.
Latest articles in this journal:
Dependence of cyanobacterium growth and Mars-specific photobioreactor mass on total pressure, pN2 and pCO2.
Formaldehyde initiates memory and motor impairments under weightlessness condition.
Development and implementation of a simulated microgravity setup for edible cyanobacteria.
Space Analogs and Behavioral Health Performance Research review and recommendations checklist from ESA Topical Team.
Surface tension enables induced pluripotent stem cell culture in commercially available hardware during spaceflight.