RD-Net: Residual-Dense Network for Glaucoma Prediction Using Structural Features of Optic Nerve Head
Preity; Ashish Kumar Bhandari; Akanksha Jha; Syed Shahnawazuddin
IEEE Transactions on Artificial Intelligence, vol. 6, no. 1, pp. 107-117. Published 2024-08-21.
DOI: 10.1109/TAI.2024.3447578 — https://ieeexplore.ieee.org/document/10643102/

Abstract

Glaucoma is known as the silent thief of eyesight. It is associated with internal damage to the optic nerve head (ONH). For early screening, the simplest approach is to analyze subtle variations in structural features such as the cup-to-disc ratio (CDR), the disc damage likelihood scale (DDLS), and the rim widths of the inferior, superior, nasal, and temporal (ISNT) regions of the ONH. This requires accurate segmentation of the optic disc (OD) and optic cup (OC). In this work, we introduce a deep learning framework, called the residual-dense network (RD-Net), for disc and cup segmentation. The structural features are then calculated from the segmentation results. The proposed design differs from the traditional U-Net in that it uses filters of variable sizes and an alternative optimization method throughout the up- and down-sampling stages. The method is a hybrid deep learning model that incorporates dense residual blocks and a squeeze-and-excitation block within the conventional U-Net architecture. Unlike classical approaches that rely primarily on CDR calculation, we first segment the OD and OC using RD-Net and then analyze ISNT and DDLS; only when a suspicious case is detected do we proceed to CDR calculation. In addition to developing an efficient segmentation model, this study employs six distinct data augmentation techniques to enlarge the training set, which in turn leads to better estimation of the model parameters. The model is rigorously trained and tested on four benchmark datasets, namely DRISHTI, RIMONE, ORIGA, and REFUGE, and the structural parameters are subsequently calculated for glaucoma prediction. The average accuracies are observed to be 0.9940 and 0.9894 for OD and OC segmentation, respectively. The extensive experiments presented in this article show that our method outperforms existing state-of-the-art algorithms.
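To make the screening features concrete, the following is a minimal illustrative sketch (not the paper's code) of how a vertical CDR and an ISNT-rule check could be computed from binary OD/OC segmentation masks such as those an RD-Net-style model would output. All function names, mask conventions, and the toy rim-width values are assumptions for illustration only.

```python
import numpy as np

def vertical_cdr(disc: np.ndarray, cup: np.ndarray) -> float:
    """Vertical cup-to-disc ratio: cup height / disc height,
    measured from binary masks (1 = inside the region)."""
    disc_rows = np.flatnonzero(disc.any(axis=1))
    cup_rows = np.flatnonzero(cup.any(axis=1))
    disc_h = disc_rows.max() - disc_rows.min() + 1
    cup_h = (cup_rows.max() - cup_rows.min() + 1) if cup_rows.size else 0
    return cup_h / disc_h

def isnt_rule_holds(rim_widths: dict) -> bool:
    """ISNT rule: rim width should satisfy
    Inferior >= Superior >= Nasal >= Temporal in a healthy eye;
    a violation of this ordering flags a suspicious case."""
    i, s, n, t = (rim_widths[k] for k in ("I", "S", "N", "T"))
    return i >= s >= n >= t

# Toy example: a 100x100 crop with concentric circular disc and cup masks.
yy, xx = np.mgrid[:100, :100]
disc = ((yy - 50) ** 2 + (xx - 50) ** 2 <= 40 ** 2).astype(np.uint8)
cup = ((yy - 50) ** 2 + (xx - 50) ** 2 <= 20 ** 2).astype(np.uint8)

cdr = vertical_cdr(disc, cup)  # roughly 0.5 for these radii
# Hypothetical rim widths where inferior < superior, violating ISNT:
suspicious = not isnt_rule_holds({"I": 1.9, "S": 2.0, "N": 1.5, "T": 1.2})
print(f"vertical CDR = {cdr:.2f}, ISNT violated: {suspicious}")
```

This mirrors the screening order described in the abstract: the ISNT/DDLS check runs first on the segmentation output, and the CDR is computed only for cases it flags as suspicious.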