Transfer Learning and Fine-Tuning for Deep Learning-Based Tea Diseases Detection on Small Datasets

A. Ramdan, A. Heryana, Andria Arisal, R. B. S. Kusumo, H. Pardede
DOI: 10.1109/ICRAMET51080.2020.9298575
Venue: 2020 International Conference on Radar, Antenna, Microwave, Electronics, and Telecommunications (ICRAMET)
Published: 2020-11-18
Citations: 8

Abstract

It is well known that training deep learning systems requires a large amount of data. However, data collection is costly, if not impossible. To overcome the limited-data problem, one can take models that have been trained on a large dataset and apply them to a target domain with limited data. In this paper, we use models pre-trained on ImageNet and re-train them on our data to detect tea leaf diseases. The pre-trained models use deep convolutional neural network (DCNN) architectures: VGGNet, ResNet, and Xception. To mitigate the difference between the ImageNet task and ours, we fine-tune the pre-trained models by replacing parts of them with new structures. We evaluate performance under various re-training and fine-tuning schemes. The vanilla pre-trained model serves as the baseline, compared against re-training only the appended structures, partially re-training the pre-trained models, and fully re-training the whole network with the pre-trained weights used as initialization. Our experiments show that applying transfer learning alone on our data may not be effective because our task differs from ImageNet's. Fine-tuning pre-trained DCNN models is found to be effective: it is consistently better than transfer learning alone or partial fine-tuning, and also better than training the model from scratch, i.e., without pre-trained models.
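The re-training schemes compared in the abstract can be sketched in PyTorch. This is a minimal illustration and not the authors' code: the tiny `Sequential` backbone below stands in for a pre-trained VGGNet/ResNet/Xception (whose real weights would come from ImageNet), and the four-class head is a hypothetical example of an appended new structure.

```python
import torch.nn as nn

# Stand-in for a pre-trained DCNN backbone (assumption: in practice this
# would be VGGNet/ResNet/Xception with ImageNet weights).
backbone = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
)
head = nn.Linear(16, 4)  # new structure appended for (assumed) 4 disease classes
model = nn.Sequential(backbone, head)

def set_trainable(module: nn.Module, flag: bool) -> None:
    """Freeze or unfreeze every parameter of `module`."""
    for p in module.parameters():
        p.requires_grad = flag

# Scheme 1 -- transfer learning only: freeze the pre-trained backbone and
# train just the appended head.
set_trainable(backbone, False)
set_trainable(head, True)
trainable = [name for name, p in model.named_parameters() if p.requires_grad]
# Only the head's weight and bias are now trainable.

# Scheme 2 -- full fine-tuning: unfreeze everything, so the pre-trained
# weights act only as the initialization. (Partial fine-tuning would
# instead unfreeze the head plus only the last backbone layers.)
set_trainable(model, True)
fully_trainable = all(p.requires_grad for p in model.parameters())
```

A training loop would then pass only `filter(lambda p: p.requires_grad, model.parameters())` to the optimizer, so frozen layers keep their pre-trained values.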