Evaluation of transfer learning techniques for classifying small surgical dataset

S. Bali, S. S. Tyagi
{"title":"小手术数据集分类的迁移学习技术评价","authors":"S. Bali, S. S. Tyagi","doi":"10.1109/Confluence47617.2020.9058207","DOIUrl":null,"url":null,"abstract":"Deep learning is the key technology used in a large variety of applications such as self-driving cars, image recognition, automatic machine translation, automatic handwriting generation. The success was fueled due to accessibility of huge datasets, GPUs, max pooling. Earlier machine learning techniques employed two phases: features extraction and classification. The performance of such algorithms was highly dependent on how well the features are extracted and that was the major bottleneck of these techniques. Deep learning techniques employ Convolutional Neural Networks (CNNs) with numerous layers of non-linear processing for extracting the features automatically and classification that solves the previous problem. In the real time applications most of the time, either the dataset is unavailable or has less amount of data which makes it difficult to achieve accurate results for classifying the images. CNNs are hard to be trained using the small datasets. Transfer learning has emerged as a very powerful technique where in the knowledge gained from the larger dataset is transferred to the new dataset. Data augmentation and dropout are also powerful techniques that are useful for dealing with small datasets. In this paper, different techniques using the VGG16 pretrained model are compared on the small dataset.","PeriodicalId":180005,"journal":{"name":"2020 10th International Conference on Cloud Computing, Data Science & Engineering (Confluence)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2020-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Evaluation of transfer learning techniques for classifying small surgical dataset\",\"authors\":\"S. Bali, S. S. Tyagi\",\"doi\":\"10.1109/Confluence47617.2020.9058207\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Deep learning is the key technology used in a large variety of applications such as self-driving cars, image recognition, automatic machine translation, automatic handwriting generation. The success was fueled due to accessibility of huge datasets, GPUs, max pooling. Earlier machine learning techniques employed two phases: features extraction and classification. The performance of such algorithms was highly dependent on how well the features are extracted and that was the major bottleneck of these techniques. Deep learning techniques employ Convolutional Neural Networks (CNNs) with numerous layers of non-linear processing for extracting the features automatically and classification that solves the previous problem. In the real time applications most of the time, either the dataset is unavailable or has less amount of data which makes it difficult to achieve accurate results for classifying the images. CNNs are hard to be trained using the small datasets. Transfer learning has emerged as a very powerful technique where in the knowledge gained from the larger dataset is transferred to the new dataset. Data augmentation and dropout are also powerful techniques that are useful for dealing with small datasets. 
In this paper, different techniques using the VGG16 pretrained model are compared on the small dataset.\",\"PeriodicalId\":180005,\"journal\":{\"name\":\"2020 10th International Conference on Cloud Computing, Data Science & Engineering (Confluence)\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2020 10th International Conference on Cloud Computing, Data Science & Engineering (Confluence)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/Confluence47617.2020.9058207\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 10th International Conference on Cloud Computing, Data Science & Engineering (Confluence)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/Confluence47617.2020.9058207","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1

Abstract

Deep learning is the key technology behind a wide variety of applications such as self-driving cars, image recognition, automatic machine translation, and automatic handwriting generation. Its success has been fueled by the availability of huge datasets, GPUs, and techniques such as max pooling. Earlier machine learning techniques involved two phases, feature extraction and classification, and their performance depended heavily on how well the features were extracted, which was the major bottleneck of those approaches. Deep learning techniques instead employ Convolutional Neural Networks (CNNs) with many layers of non-linear processing that extract features automatically and perform the classification, which removes this bottleneck. In real-world applications, however, a suitable dataset is often unavailable or too small, which makes it difficult to classify images accurately, and CNNs are hard to train on small datasets. Transfer learning has emerged as a very powerful technique in which knowledge gained from a larger dataset is transferred to the new dataset. Data augmentation and dropout are also powerful techniques for dealing with small datasets. In this paper, different techniques based on the VGG16 pretrained model are compared on a small dataset.
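The abstract does not describe a concrete implementation, so the following is only a minimal sketch of the kind of pipeline it refers to: a VGG16 convolutional base pretrained on ImageNet with a new classifier head, combined with data augmentation and dropout for a small image dataset. The framework choice (TensorFlow/Keras), the dataset directory, the binary-classification setup, and all hyperparameters are illustrative assumptions, not the authors' configuration.

```python
# Sketch of VGG16-based transfer learning with data augmentation and dropout.
# Not the authors' code; dataset path and hyperparameters are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16
from tensorflow.keras.applications.vgg16 import preprocess_input

IMG_SIZE = (224, 224)

# Pretrained convolutional base, frozen so only the new classifier head trains.
conv_base = VGG16(weights="imagenet", include_top=False,
                  input_shape=IMG_SIZE + (3,))
conv_base.trainable = False

model = models.Sequential([
    layers.Input(shape=IMG_SIZE + (3,)),
    # Data augmentation enlarges the effective training set for small data.
    layers.RandomFlip("horizontal"),
    layers.RandomRotation(0.1),
    conv_base,
    layers.Flatten(),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.5),                      # regularizes the new head
    layers.Dense(1, activation="sigmoid"),    # assumes a binary task
])

model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="binary_crossentropy",
              metrics=["accuracy"])

# Hypothetical layout: surgical_dataset/train/<class_name>/*.jpg
train_ds = tf.keras.utils.image_dataset_from_directory(
    "surgical_dataset/train", image_size=IMG_SIZE, batch_size=16)
# VGG16 expects its own channel-wise preprocessing of the raw pixel values.
train_ds = train_ds.map(lambda x, y: (preprocess_input(x), y))

model.fit(train_ds, epochs=10)
```

Freezing the convolutional base and training only a small dense head is one common way to transfer knowledge learned on ImageNet to a small dataset; unfreezing the last convolutional block for fine-tuning is another variant that a comparison like the paper's could cover.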
Latest articles from this venue
Identification of the most efficient algorithm to find Hamiltonian Path in practical conditions
Segmentation and Detection of Road Region in Aerial Images using Hybrid CNN-Random Field Algorithm
A Novel Approach for Isolation of Sinkhole Attack in Wireless Sensor Networks
Performance Analysis of various Information Platforms for recognizing the quality of Indian Roads
Time Series Data Analysis And Prediction Of CO2 Emissions