Garbage classification based on a cascade neural network

Neural Network World · IF 0.7 · JCR Q4, CAS Tier 4 (Computer Science, Artificial Intelligence) · Published: 2023-01-01 · DOI: 10.14311/nnw.2023.33.007
Xiliang Zhang, Na Zhao, Qinyuan Lv, Zhenyu Ma, Qin Qin, Weifei Gan, Jianfeng Bai, Ling Gan
Cited by: 0

Abstract

Most existing garbage-classification methods rely on transfer learning to achieve acceptable performance, and they address relatively small tasks: the dataset is small or covers only a few categories. Moreover, because of their huge number of parameters, they are hard to deploy on small devices such as a smartphone or a Raspberry Pi, and their generalization capability is limited. For these reasons, a promising cascade approach is proposed. It outperforms transfer learning in large-scale garbage classification while requiring fewer parameters and less training time, making it better suited to practical deployment on a small device. Several commonly used convolutional-neural-network backbones are investigated in this study, and two settings are evaluated: one in which the target domain matches the source domain, and one in which it does not. Results indicate that with ResNet101 as the backbone, the proposed algorithm outperforms other existing approaches. The innovation is that, as far as we know, this study is the first to combine a pre-trained convolutional neural network, used as a feature extractor, with an extreme learning machine to classify garbage. Furthermore, the training time is significantly shorter and the number of trainable parameters significantly smaller.
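The cascade the abstract describes can be sketched in two stages: a pre-trained CNN (e.g. ResNet101 with its classification head removed) maps each image to a feature vector, and an extreme learning machine (ELM) classifies those features. In an ELM, the hidden-layer weights are random and fixed; only the output weights are trained, and they have a closed-form least-squares solution, which is why training is fast and the trainable-parameter count is small. Below is a minimal ELM sketch in numpy; the random matrix `X` stands in for CNN features, and all dimensions and names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_train(X, y, n_hidden=64, n_classes=3):
    """Fit an ELM: random fixed hidden layer, closed-form output weights."""
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights (never trained)
    b = rng.standard_normal(n_hidden)                # random biases (never trained)
    H = np.tanh(X @ W + b)                           # hidden-layer activations
    T = np.eye(n_classes)[y]                         # one-hot targets
    beta = np.linalg.pinv(H) @ T                     # least-squares output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.argmax(np.tanh(X @ W + b) @ beta, axis=1)

# Stand-in for CNN-extracted features: 90 samples, 32 dimensions, 3 classes.
X = rng.standard_normal((90, 32))
y = rng.integers(0, 3, size=90)
W, b, beta = elm_train(X, y)
pred = elm_predict(X, W, b, beta)
```

Because only `beta` is solved for, swapping in real CNN features changes nothing in the classifier itself; the feature dimension simply becomes the CNN's output width (2048 for ResNet101's pooled features).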
Source journal: Neural Network World (Engineering & Technology – Computer Science: Artificial Intelligence)
CiteScore: 1.80
Self-citation rate: 0.00%
Articles published: 0
Review time: 12 months
About the journal: Neural Network World is a bimonthly journal providing the latest developments in the field of informatics, with attention mainly devoted to the problems of: brain science, theory and applications of neural networks (both artificial and natural), fuzzy-neural systems, methods and applications of evolutionary algorithms, methods of parallel and mass-parallel computing, problems of soft computing, and methods of artificial intelligence.
Latest articles in this journal:
- Water quality image classification for aquaculture using deep transfer learning
- Enhanced QOS energy-efficient routing algorithm using deep belief neural network in hybrid falcon-improved ACO nature-inspired optimization in wireless sensor networks
- Vibration analyses of railway systems using proposed neural predictors
- A self-adaptive deep learning-based model to predict cloud workload
- Integration of railway infrastructure topological description elements from the microL2 to the macroN0,L0 level of detail