Convolutional neural networks based transfer learning for diabetic retinopathy fundus image classification

Xiaogang Li, Tiantian Pang, B. Xiong, Weixiang Liu, Ping Liang, Tianfu Wang
{"title":"Convolutional neural networks based transfer learning for diabetic retinopathy fundus image classification","authors":"Xiaogang Li, Tiantian Pang, B. Xiong, Weixiang Liu, Ping Liang, Tianfu Wang","doi":"10.1109/CISP-BMEI.2017.8301998","DOIUrl":null,"url":null,"abstract":"Convolutional Neural Networks (CNNs) have gained remarkable success in computer vision, which is mostly owe to their ability that enables learning rich image representations from large-scale annotated data. In the field of medical image analysis, large amounts of annotated data may be not always available. The number of acquired ground-truth data is sometimes insufficient to train the CNNs without overfitting and convergence issues from scratch. Hence application of the deep CNNs is a challenge in medical imaging domain. However, transfer learning techniques are shown to provide solutions for this challenge. In this paper, our target task is to implement diabetic retinopathy fundus image classification using CNNs based transfer learning. Experiments are performed on 1014 and 1200 fundus images from two publicly available DR1 and MESSIDOR datasets. In order to complete the target task, we carry out experiments using three different methods: 1) fine-tuning all network layers of each of different pre-trained CNN models; 2) fine-tuning a pre-trained CNN model in a layer-wise manner; 3) using pre-trained CNN models to extract features from fundus images, and then training support vector machines using these features. Experimental results show that convolutional neural networks based transfer learning can achieve better classification results in our task with small datasets (target domain), by taking advantage of knowledge learned from other related tasks with larger datasets (source domain). Transfer learning is a promising technique that promotes the use of deep CNNs in medical field with limited amounts of data.","PeriodicalId":6474,"journal":{"name":"2017 10th International Congress on Image and Signal Processing, BioMedical Engineering and Informatics (CISP-BMEI)","volume":"87 1","pages":"1-11"},"PeriodicalIF":0.0000,"publicationDate":"2017-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"116","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 10th International Congress on Image and Signal Processing, BioMedical Engineering and Informatics (CISP-BMEI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CISP-BMEI.2017.8301998","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 116

Abstract

Convolutional Neural Networks (CNNs) have achieved remarkable success in computer vision, owing largely to their ability to learn rich image representations from large-scale annotated data. In the field of medical image analysis, large amounts of annotated data are not always available. The number of acquired ground-truth samples is sometimes insufficient to train CNNs from scratch without overfitting and convergence issues, so applying deep CNNs in the medical imaging domain remains a challenge. Transfer learning techniques have been shown to provide solutions to this challenge. In this paper, our target task is diabetic retinopathy fundus image classification using CNN-based transfer learning. Experiments are performed on 1014 and 1200 fundus images from the two publicly available datasets DR1 and MESSIDOR, respectively. To complete the target task, we carry out experiments using three different methods: 1) fine-tuning all network layers of several different pre-trained CNN models; 2) fine-tuning a pre-trained CNN model in a layer-wise manner; 3) using pre-trained CNN models to extract features from fundus images and then training support vector machines on these features. Experimental results show that CNN-based transfer learning achieves better classification results on our task with small datasets (target domain) by exploiting knowledge learned from related tasks with larger datasets (source domain). Transfer learning is a promising technique for promoting the use of deep CNNs in the medical field where only limited amounts of data are available.
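The abstract does not specify the framework, network, or hyperparameters used; the following is only a minimal sketch of the three transfer-learning strategies it describes, assuming PyTorch/torchvision with scikit-learn, a VGG-16 backbone, binary labels, and hypothetical data loaders.

```python
# Illustrative sketch only -- framework, backbone, class count, and loaders
# are assumptions, not the authors' actual setup.
import torch
import torch.nn as nn
from torchvision import models
from sklearn.svm import SVC

NUM_CLASSES = 2  # assumed labeling, e.g. normal vs. diabetic retinopathy

# Start from ImageNet weights (the larger source domain).
model = models.vgg16(pretrained=True)
model.classifier[6] = nn.Linear(4096, NUM_CLASSES)  # replace the final layer

# Method 1: fine-tune all layers -- leave every parameter trainable.
# Method 2: layer-wise fine-tuning -- freeze the convolutional base and
# progressively unfreeze blocks; the fully frozen starting point is shown here.
for param in model.features.parameters():
    param.requires_grad = False  # skip this loop for method 1

optimizer = torch.optim.SGD(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()

def train_one_epoch(loader):
    """One pass over a loader yielding (fundus image batch, label batch)."""
    model.train()
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

# Method 3: use the pre-trained CNN as a fixed feature extractor and
# train an SVM on the pooled convolutional features.
def extract_features(loader):
    model.eval()
    feats, targets = [], []
    with torch.no_grad():
        for images, labels in loader:
            x = model.avgpool(model.features(images))
            feats.append(torch.flatten(x, 1))
            targets.append(labels)
    return torch.cat(feats).numpy(), torch.cat(targets).numpy()

# X_train, y_train = extract_features(train_loader)   # train_loader is assumed
# svm = SVC(kernel='linear').fit(X_train, y_train)
```

In practice, the choice among the three methods trades off target-domain data size against adaptation capacity: full fine-tuning adapts the most but needs the most labeled fundus images, while fixed features plus an SVM is the least data-hungry option.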