Metallic product recognition with dual attention and multi-branch residual blocks-based convolutional neural networks

Honggui Han, Qiyu Zhang, Fangyu Li, Yongping Du, Yifan Gu, Yufeng Wu

Circular Economy, Volume 1, Issue 2, December 2022, Article 100014. DOI: 10.1016/j.cec.2022.100014. Available at: https://www.sciencedirect.com/science/article/pii/S2773167722000140
Abstract
Visual recognition technologies based on deep learning are playing an increasingly important role in resource recovery. In metal resource recycling, however, intelligent and accurate recognition of metallic products is still lacking, which seriously hinders the operation of the metal recycling industry chain. In this article, a convolutional neural network with a dual attention mechanism and multi-branch residual blocks is proposed to recognize metallic products with high accuracy. First, a channel-spatial dual attention mechanism is introduced to enhance the model's sensitivity to key features, allowing it to focus on those features even when images of metallic products contain substantial confounding information. Second, a deep convolutional network that uses multi-branch residual blocks as its backbone and embeds the dual-attention module is designed to enable deeper, more effective feature extraction for metallic products with complex characteristics. To evaluate the proposed model, a waste electrical and electronic equipment (WEEE) dataset containing 9266 images in 18 categories and a waste household metal appliance (WHMA) dataset containing 11,757 images in 23 categories are built. The experimental results show that the accuracy reaches 94.31% on WEEE and 95.88% on WHMA, supporting high-accuracy recognition and high-quality recycling.
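The abstract names two architectural ingredients (a channel-spatial dual attention mechanism and multi-branch residual blocks) but the page carries no code. The sketch below is a minimal, hypothetical PyTorch rendering of how such a block is commonly composed: CBAM-style channel-then-spatial attention applied inside a residual block with parallel convolution branches. The branch kernel sizes (1, 3, 5), reduction ratio, and module names are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch, not the paper's code: a CBAM-style channel-spatial
# dual attention module wrapped around a multi-branch residual block.
# Branch kernels, reduction=16, and kernel_size=7 are assumed defaults.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Squeeze global avg- and max-pooled statistics into per-channel weights."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))   # global average pooling branch
        mx = self.mlp(x.amax(dim=(2, 3)))    # global max pooling branch
        w = torch.sigmoid(avg + mx).view(b, c, 1, 1)
        return x * w                          # reweight channels


class SpatialAttention(nn.Module):
    """Compress channels (avg + max) and learn a 2-D spatial attention map."""
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        avg = x.mean(dim=1, keepdim=True)
        mx = x.amax(dim=1, keepdim=True)
        w = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * w                          # reweight spatial positions


class MultiBranchResidualBlock(nn.Module):
    """Parallel conv branches fused by a 1x1 conv, passed through
    channel-then-spatial attention, then added to the skip connection."""
    def __init__(self, channels: int):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv2d(channels, channels, k, padding=k // 2) for k in (1, 3, 5)
        )
        self.fuse = nn.Conv2d(3 * channels, channels, 1)
        self.ca = ChannelAttention(channels)
        self.sa = SpatialAttention()
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.fuse(torch.cat([b(x) for b in self.branches], dim=1))
        out = self.sa(self.ca(out))           # dual attention on fused features
        return self.act(out + x)              # residual connection


if __name__ == "__main__":
    block = MultiBranchResidualBlock(64)
    y = block(torch.randn(2, 64, 56, 56))
    print(y.shape)  # torch.Size([2, 64, 56, 56])
```

Applying channel attention before spatial attention follows the common CBAM ordering; whether the paper uses this exact ordering, branch count, or fusion scheme cannot be confirmed from the abstract alone.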