{"title":"利用地面机器人和先进的 YOLO 模型进行基于田间的多物种杂草和作物探测:以数据和模型为中心的方法","authors":"","doi":"10.1016/j.atech.2024.100538","DOIUrl":null,"url":null,"abstract":"<div><p>The implementation of a machine-vision system for real-time precision weed management is a crucial step towards the development of smart spraying robotic vehicles. The intelligent machine-vision system, constructed using deep learning object detection models, has the potential to accurately detect weeds and crops from images. Both data-centric and model-centric approaches of deep learning model development offer advantages depending on environment and non-environment factors. To assess the performance of weed detection in real-field conditions for eight crop species, the Yolov8, Yolov9, and customized Yolov9 deep learning models were trained and evaluated using RGB images from four locations (Casselton, Fargo, and Carrington) over a two-year period in the Great Plains region of the U.S. The experiment encompassed eight crop species—dry bean, canola, corn, field pea, flax, lentil, soybean, and sugar beet—and five weed species—horseweed, kochia, redroot pigweed, common ragweed, and water hemp. Six Yolov8 and eight Yolov9 model variants were trained using annotated weed and crop images gathered from four different sites including combined dataset from four sites. The Yolov8 and Yolov9 models’ performance in weed detection were assessed based on mean average precision (mAP50) metrics for five datasets, eight crop species, and five weed species. The results of the weed and crop detection evaluation showed high mAP50 values of 86.2 %. The mAP50 values for individual weed and crop species detection ranged from 80.8 % to 98 %. The results demonstrated that the performance of the model varied by model type (model-centric), location due to environment (data-centric), data size (data-centric), data quality (data-centric), and object size in the image (data-centric). 
Nevertheless, the Yolov9 customized lightweight model has the potential to play a significant role in building a real time machine-vision-based precision weed management system.</p></div>","PeriodicalId":74813,"journal":{"name":"Smart agricultural technology","volume":null,"pages":null},"PeriodicalIF":6.3000,"publicationDate":"2024-08-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2772375524001436/pdfft?md5=fb66c49d8d623973c91bee4e32e27d12&pid=1-s2.0-S2772375524001436-main.pdf","citationCount":"0","resultStr":"{\"title\":\"Field-based multispecies weed and crop detection using ground robots and advanced YOLO models: A data and model-centric approach\",\"authors\":\"\",\"doi\":\"10.1016/j.atech.2024.100538\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>The implementation of a machine-vision system for real-time precision weed management is a crucial step towards the development of smart spraying robotic vehicles. The intelligent machine-vision system, constructed using deep learning object detection models, has the potential to accurately detect weeds and crops from images. Both data-centric and model-centric approaches of deep learning model development offer advantages depending on environment and non-environment factors. To assess the performance of weed detection in real-field conditions for eight crop species, the Yolov8, Yolov9, and customized Yolov9 deep learning models were trained and evaluated using RGB images from four locations (Casselton, Fargo, and Carrington) over a two-year period in the Great Plains region of the U.S. The experiment encompassed eight crop species—dry bean, canola, corn, field pea, flax, lentil, soybean, and sugar beet—and five weed species—horseweed, kochia, redroot pigweed, common ragweed, and water hemp. 
Six Yolov8 and eight Yolov9 model variants were trained using annotated weed and crop images gathered from four different sites including combined dataset from four sites. The Yolov8 and Yolov9 models’ performance in weed detection were assessed based on mean average precision (mAP50) metrics for five datasets, eight crop species, and five weed species. The results of the weed and crop detection evaluation showed high mAP50 values of 86.2 %. The mAP50 values for individual weed and crop species detection ranged from 80.8 % to 98 %. The results demonstrated that the performance of the model varied by model type (model-centric), location due to environment (data-centric), data size (data-centric), data quality (data-centric), and object size in the image (data-centric). Nevertheless, the Yolov9 customized lightweight model has the potential to play a significant role in building a real time machine-vision-based precision weed management system.</p></div>\",\"PeriodicalId\":74813,\"journal\":{\"name\":\"Smart agricultural technology\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":6.3000,\"publicationDate\":\"2024-08-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.sciencedirect.com/science/article/pii/S2772375524001436/pdfft?md5=fb66c49d8d623973c91bee4e32e27d12&pid=1-s2.0-S2772375524001436-main.pdf\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Smart agricultural technology\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S2772375524001436\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"AGRICULTURAL ENGINEERING\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Smart agricultural 
technology","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2772375524001436","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AGRICULTURAL ENGINEERING","Score":null,"Total":0}
Field-based multispecies weed and crop detection using ground robots and advanced YOLO models: A data and model-centric approach
The implementation of a machine-vision system for real-time precision weed management is a crucial step towards the development of smart spraying robotic vehicles. An intelligent machine-vision system, constructed using deep learning object detection models, has the potential to accurately detect weeds and crops in images. Both data-centric and model-centric approaches to deep learning model development offer advantages, depending on environmental and non-environmental factors. To assess weed detection performance under real-field conditions for eight crop species, the YOLOv8, YOLOv9, and customized YOLOv9 deep learning models were trained and evaluated using RGB images from four locations (Casselton, Fargo, and Carrington) over a two-year period in the Great Plains region of the U.S. The experiment encompassed eight crop species—dry bean, canola, corn, field pea, flax, lentil, soybean, and sugar beet—and five weed species—horseweed, kochia, redroot pigweed, common ragweed, and waterhemp. Six YOLOv8 and eight YOLOv9 model variants were trained using annotated weed and crop images gathered from the four sites, including a combined dataset from all four sites. The YOLOv8 and YOLOv9 models' weed detection performance was assessed using mean average precision (mAP50) metrics across five datasets, eight crop species, and five weed species. The overall weed and crop detection evaluation yielded a high mAP50 of 86.2 %. The mAP50 values for individual weed and crop species detection ranged from 80.8 % to 98 %. The results demonstrated that model performance varied by model type (model-centric), location due to environment (data-centric), data size (data-centric), data quality (data-centric), and object size in the image (data-centric). Nevertheless, the customized lightweight YOLOv9 model has the potential to play a significant role in building a real-time machine-vision-based precision weed management system.
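The detection models above are compared on mAP50, i.e. average precision at an IoU threshold of 0.50, averaged over classes. As a rough illustration of what that metric measures, the sketch below computes single-class AP50 from a set of confidence-ranked predicted boxes and ground-truth boxes. It is a simplified, hypothetical implementation (greedy matching, no precision-envelope interpolation, one image pool); the study itself would rely on the evaluation built into the YOLO training framework, not on code like this.

```python
def iou(a, b):
    """Intersection-over-union of two [x1, y1, x2, y2] boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def ap50(predictions, ground_truths):
    """Simplified AP at IoU >= 0.50 for one class.

    predictions: list of (confidence, box); ground_truths: list of boxes.
    """
    if not ground_truths:
        return 0.0
    # Rank predictions by confidence, highest first.
    preds = sorted(predictions, key=lambda p: -p[0])
    matched = [False] * len(ground_truths)
    tps = []
    for conf, box in preds:
        # Greedily match each prediction to the best unmatched ground truth
        # with IoU of at least 0.50; otherwise count it as a false positive.
        best, best_iou = -1, 0.5
        for i, gt in enumerate(ground_truths):
            if not matched[i] and iou(box, gt) >= best_iou:
                best, best_iou = i, iou(box, gt)
        if best >= 0:
            matched[best] = True
            tps.append(1)
        else:
            tps.append(0)
    # Step-integrate precision over recall down the ranked list.
    ap, tp, fp, prev_recall = 0.0, 0, 0, 0.0
    for t in tps:
        tp += t
        fp += 1 - t
        recall = tp / len(ground_truths)
        precision = tp / (tp + fp)
        ap += (recall - prev_recall) * precision
        prev_recall = recall
    return ap
```

mAP50 is then the mean of this per-class AP over all thirteen weed and crop classes; the paper's 86.2 % figure is that mean over the full evaluation set.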