SOYBEAN PEST IDENTIFICATION BASED ON CONVOLUTIONAL NEURAL NETWORK AND TRANSFER LEARNING

ASEAN Engineering Journal · Pub Date: 2023-02-28 · DOI: 10.11113/aej.v13.18591
Xin MingYuan, A. Weay
{"title":"SOYBEAN PEST IDENTIFICATION BASED ON CONVOLUTIONAL NEURAL NETWORK AND TRANSFER LEARNING","authors":"Xin MingYuan, A. Weay","doi":"10.11113/aej.v13.18591","DOIUrl":null,"url":null,"abstract":"Despite the fact that the ensemble classifier improved classification precision by integrating texture and colour. However, the significant image preparation process is a laborious and time-consuming. Manually depicting constrained feature extraction can result in a semantic void in a picture. These limits cause inaccuracy of agricultural disease identification. Thus, this study proposes a soybean pest detection method based on a hybrid of Transfer learning and pyramidal convolutional neural networks that can identify soybean pests quickly and accurately on small sample sets. Bean borer, soybean poison moth, mite, stink bug, and pod borer photographs are first pre-processed using standard data improvement methods, and then manually categorized into six groups based on pest characteristics. The weight parameters from the VGG16 model trained on the ImageNet image dataset were then transferred to the recognition of soybean pests using the transfer learning method, and the VGG16 convolutional and pooling layers were used as feature extraction layers, while the top layer was redesigned as a pyramidal convolutional layer, an average pooling layer, and a SoftMax output layer, with some of the convolutional layers frozen during training. According to the testing statistics, the model's average test accuracy is 98.23%, and the model size is only 95.4 M. For bean borer, soybean poison moth, mite, skewed night moth, stink bug, and bean pod moth, the model's recognition accuracy is 96.4, 97.78, 98.12, 98.4, 99.56, and 99.16, respectively. The results of the experiments show that the method has a high identification efficiency and a good recognition effect.","PeriodicalId":36749,"journal":{"name":"ASEAN Engineering Journal","volume":" ","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2023-02-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ASEAN Engineering Journal","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.11113/aej.v13.18591","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"Earth and Planetary Sciences","Score":null,"Total":0}
引用次数: 0

Abstract

Although ensemble classifiers can improve classification precision by integrating texture and colour features, the extensive image preparation they require is laborious and time-consuming, and manually designed features are limited and can leave a semantic gap in the image representation. These limitations reduce the accuracy of agricultural pest and disease identification. This study therefore proposes a soybean pest detection method that combines transfer learning with a pyramidal convolutional neural network and can identify soybean pests quickly and accurately on small sample sets. Photographs of bean borer, soybean poison moth, mite, stink bug, and pod borer are first pre-processed using standard data-augmentation methods and then manually categorized into six groups based on pest characteristics. The weight parameters of a VGG16 model trained on the ImageNet dataset are then transferred to soybean pest recognition: the VGG16 convolutional and pooling layers serve as feature extraction layers, while the top of the network is redesigned as a pyramidal convolutional layer, an average pooling layer, and a SoftMax output layer, with some of the convolutional layers frozen during training. In testing, the model's average accuracy is 98.23% and the model size is only 95.4 M. For bean borer, soybean poison moth, mite, skewed night moth, stink bug, and bean pod moth, the recognition accuracies are 96.4%, 97.78%, 98.12%, 98.4%, 99.56%, and 99.16%, respectively. The experimental results show that the method achieves high identification efficiency and good recognition performance.
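The top-layer redesign described in the abstract can be illustrated with a short Keras sketch: VGG16 ImageNet weights act as a (largely frozen) feature extractor, and a new head with a pyramidal convolution block, average pooling, and a six-way SoftMax is trained on top. This is a minimal illustration, not the authors' implementation; the exact pyramidal block structure, the 224×224 input size, and the optimizer and loss settings are assumptions.

```python
# Minimal sketch (not the paper's code): VGG16 transfer learning with a
# pyramidal-convolution head for 6-class soybean pest recognition.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 6            # six pest categories described in the abstract
IMG_SHAPE = (224, 224, 3)  # assumed VGG16 input size

def pyramidal_conv_block(x, filters=128, kernel_sizes=(3, 5, 7)):
    """Parallel convolutions with increasing kernel sizes, concatenated.
    One common form of 'pyramidal' convolution; the paper's exact block may differ."""
    branches = [
        layers.Conv2D(filters, k, padding="same", activation="relu")(x)
        for k in kernel_sizes
    ]
    return layers.Concatenate()(branches)

# VGG16 convolutional/pooling layers as the transferred feature extractor.
base = tf.keras.applications.VGG16(
    include_top=False, weights="imagenet", input_shape=IMG_SHAPE
)
# The paper freezes some convolutional layers; here all of them, for simplicity.
base.trainable = False

inputs = layers.Input(shape=IMG_SHAPE)
x = tf.keras.applications.vgg16.preprocess_input(inputs)
x = base(x, training=False)
x = pyramidal_conv_block(x)              # redesigned top: pyramidal convolutional layer
x = layers.GlobalAveragePooling2D()(x)   # average pooling layer
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)  # SoftMax output layer

model = models.Model(inputs, outputs)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

With this setup, only the new head is updated during training, which is what makes the approach practical on the small pest image sets the abstract describes.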