Field-based multispecies weed and crop detection using ground robots and advanced YOLO models: A data and model-centric approach

*Smart Agricultural Technology* (IF 6.3, Q1 Agricultural Engineering) · Pub Date: 2024-08-16 · DOI: 10.1016/j.atech.2024.100538
Full text: https://www.sciencedirect.com/science/article/pii/S2772375524001436

Abstract

The implementation of a machine-vision system for real-time precision weed management is a crucial step towards the development of smart spraying robotic vehicles. An intelligent machine-vision system, constructed using deep-learning object detection models, has the potential to accurately detect weeds and crops from images. Both data-centric and model-centric approaches to deep-learning model development offer advantages, depending on environmental and non-environmental factors. To assess weed detection performance under real-field conditions for eight crop species, the YOLOv8, YOLOv9, and customized YOLOv9 deep-learning models were trained and evaluated using RGB images collected from four locations (Casselton, Fargo, and Carrington) in the Great Plains region of the U.S. over a two-year period. The experiment encompassed eight crop species—dry bean, canola, corn, field pea, flax, lentil, soybean, and sugar beet—and five weed species—horseweed, kochia, redroot pigweed, common ragweed, and waterhemp. Six YOLOv8 and eight YOLOv9 model variants were trained using annotated weed and crop images gathered from the four sites, as well as a combined dataset from all sites. The YOLOv8 and YOLOv9 models' performance in weed detection was assessed using the mean average precision (mAP50) metric across the five datasets, eight crop species, and five weed species. The weed and crop detection evaluation showed a high overall mAP50 of 86.2 %. The mAP50 values for detection of individual weed and crop species ranged from 80.8 % to 98 %. The results demonstrated that model performance varied by model type (model-centric), location due to environment (data-centric), data size (data-centric), data quality (data-centric), and object size in the image (data-centric). Nevertheless, the customized lightweight YOLOv9 model has the potential to play a significant role in building a real-time machine-vision-based precision weed management system.
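The mAP50 metric reported above is the mean, over classes, of the average precision computed at an IoU threshold of 0.5. The sketch below (illustrative only, not code from the paper) shows how AP50 is computed for a single class from toy detections: detections are sorted by confidence, greedily matched to ground-truth boxes at IoU ≥ 0.5, and the area under the interpolated precision–recall curve is accumulated.

```python
# Illustrative sketch of per-class AP at IoU >= 0.5 (AP50); averaging this
# quantity over classes yields the mAP50 used to compare the YOLO variants.
# Box format is (x1, y1, x2, y2); this is toy code, not the paper's pipeline.

def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def ap50(detections, ground_truths):
    """AP at IoU 0.5 for one class.

    detections: list of (confidence, box); ground_truths: list of boxes.
    """
    dets = sorted(detections, key=lambda d: -d[0])  # highest confidence first
    matched = set()
    tp = fp = 0
    precisions, recalls = [], []
    for conf, box in dets:
        # Greedily match this detection to the best unmatched ground truth.
        best_iou, best_j = 0.0, -1
        for j, gt in enumerate(ground_truths):
            if j in matched:
                continue
            v = iou(box, gt)
            if v > best_iou:
                best_iou, best_j = v, j
        if best_iou >= 0.5:
            matched.add(best_j)
            tp += 1
        else:
            fp += 1
        precisions.append(tp / (tp + fp))
        recalls.append(tp / len(ground_truths))
    # Area under the precision envelope (all-point interpolation).
    envelope = [max(precisions[i:]) for i in range(len(precisions))]
    ap, prev_r = 0.0, 0.0
    for r, p in zip(recalls, envelope):
        ap += (r - prev_r) * p
        prev_r = r
    return ap
```

Practical frameworks compute this per class and per IoU threshold, then report the class mean, so the 86.2 % figure above summarizes detection quality across all thirteen weed and crop classes at IoU 0.5.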
