Weed identification in soybean seedling stage based on UAV images and Faster R-CNN

IF 7.7 · CAS Tier 1 (Agricultural Sciences) · Q1 AGRICULTURE, MULTIDISCIPLINARY · Computers and Electronics in Agriculture · Pub Date: 2024-10-29 · DOI: 10.1016/j.compag.2024.109533
Jian Cui, Xinle Zhang, Jiahuan Zhang, Yongqi Han, Hongfu Ai, Chang Dong, Huanjun Liu
Citations: 0

Abstract

The natural environment in which field soybeans are grown is complex in terms of weed species and distribution, and a wide range of weeds grow intermixed with soybeans, resulting in low weed recognition rates. Weeds compete with soybeans for sunlight, water and nutrients, and if not managed in a timely manner they may impede soybean growth and reduce yield. The seedling stage is the early stage of soybean growth, when the growth status of soybeans and weeds differs greatly, making weeds easier to identify and manage. This paper proposes a field soybean weed recognition method based on low-altitude UAV images and the Faster R-CNN algorithm, using soybean seedling-stage weed data collected at low altitude by UAV equipment. A dataset containing 4000 images of soybeans, weeds and broadleaf weeds was constructed and exported in PASCAL VOC format. First, the classification performance of four backbone feature extraction networks (ResNet50, ResNet101, VGG16 and VGG19) was compared to determine the optimal structure. Second, the aspect-ratio and area distributions of the targets in the dataset were analyzed, a suitable anchor scheme was designed according to the characteristics of the dataset itself, and a detection model was trained to recognize soybean seedling weeds at different weed densities. Finally, the model was compared with four classical object detection algorithms: SSD, YOLOv3, YOLOv4 and YOLOv7. The experiments show that the Faster R-CNN model with VGG16 as the backbone feature extraction network achieves the best recognition accuracy. By analyzing the characteristics of the dataset and optimizing the anchor parameters, the optimized model attains an average recognition accuracy of 88.69% on a single frame with an average recognition time of 310 ms, and can accurately recognize soybean seedlings and weeds at different densities.
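The anchor-design step described above, deriving anchor parameters from the dataset's aspect-ratio and area distributions, can be sketched as follows. This is a minimal illustration, not the authors' procedure: the directory layout and the quantile-based heuristic for picking anchor ratios and scales are assumptions.

```python
# Sketch (assumed, not the authors' code): analyze bounding-box aspect ratios
# and areas in PASCAL VOC annotations to suggest Faster R-CNN anchor parameters.
import glob
import math
import xml.etree.ElementTree as ET

def collect_box_stats(voc_annotation_dir):
    """Gather (aspect_ratio, area) for every object box in VOC XML files."""
    ratios, areas = [], []
    for path in glob.glob(f"{voc_annotation_dir}/*.xml"):
        root = ET.parse(path).getroot()
        for obj in root.iter("object"):
            box = obj.find("bndbox")
            w = int(box.find("xmax").text) - int(box.find("xmin").text)
            h = int(box.find("ymax").text) - int(box.find("ymin").text)
            if w > 0 and h > 0:
                ratios.append(h / w)
                areas.append(w * h)
    return ratios, areas

def suggest_anchors(ratios, areas):
    """Pick anchor aspect ratios and scales from distribution quantiles
    (a simple heuristic; the paper's exact selection rule is not given)."""
    def quantile(xs, q):
        xs = sorted(xs)
        return xs[min(len(xs) - 1, int(q * len(xs)))]
    aspect_ratios = tuple(round(quantile(ratios, q), 2) for q in (0.1, 0.5, 0.9))
    # An anchor "scale" here is the side of a square with the quantile area.
    scales = tuple(round(math.sqrt(quantile(areas, q))) for q in (0.1, 0.5, 0.9))
    return aspect_ratios, scales
```

Tailoring anchors to the observed box statistics, rather than using the generic defaults, is what lets the region proposal network cover small seedling-stage weeds efficiently.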
Compared with mainstream object detection models, the optimized Faster R-CNN improves average accuracy by 6.31% over the SSD model, 5.79% over YOLOv3, 6.8% over YOLOv4, and 2.92% over YOLOv7. The results show that the optimized detection model proposed in this paper is more advantageous and can provide a scientific basis for weed damage monitoring and control at the UAV scale.
Source journal: Computers and Electronics in Agriculture (Engineering & Technology — Computer Science: Interdisciplinary Applications)
CiteScore: 15.30
Self-citation rate: 14.50%
Articles per year: 800
Review time: 62 days
Journal description: Computers and Electronics in Agriculture provides international coverage of advancements in computer hardware, software, electronic instrumentation, and control systems applied to agricultural challenges. Encompassing agronomy, horticulture, forestry, aquaculture, and animal farming, the journal publishes original papers, reviews, and applications notes. It explores the use of computers and electronics in plant or animal agricultural production, covering topics like agricultural soils, water, pests, controlled environments, and waste. The scope extends to on-farm post-harvest operations and relevant technologies, including artificial intelligence, sensors, machine vision, robotics, networking, and simulation modeling. Its companion journal, Smart Agricultural Technology, continues the focus on smart applications in production agriculture.