Algorithm for Locating Apical Meristematic Tissue of Weeds Based on YOLO Instance Segmentation

Agronomy · Published: 2024-09-18 · DOI: 10.3390/agronomy14092121
Daode Zhang, Rui Lu, Zhe Guo, Zhiyong Yang, Siqi Wang, Xinyu Hu

Abstract

Laser technology can control weeds by irradiating the apical meristematic tissue (AMT) of weeds while they are still seedlings. Successful large-scale implementation of this technique depends on two factors: accurate identification of the AMT and an effective localization algorithm. Accordingly, this study proposes a lightweight weed AMT localization algorithm based on YOLO (You Only Look Once) instance segmentation. The YOLOv8n-seg network is made lighter by adopting the FasterNet lightweight network as its backbone, yielding the F-YOLOv8n-seg model. This modification reduces the number of parameters and the computational cost of the convolution operations, producing a more efficient model. F-YOLOv8n-seg is then combined with a connected domain analysis (CDA) algorithm, yielding the F-YOLOv8n-seg-CDA model, which precisely localizes the AMT of weeds by computing the center-of-mass coordinates of the connected domains in the segmentation mask. The experimental results show that the optimized model significantly outperforms the original: floating-point computation is reduced by 26.7%, to 8.9 GFLOPs, and model size by 38.2%, to 4.2 MB, making it lighter than both YOLOv5s-seg and YOLOv10n-seg. It also achieves high segmentation accuracy, at 97.2%. Tests on five different weed species showed that F-YOLOv8n-seg-CDA generalizes well: the combined detection accuracy across these species was 81%, reaching up to 94% for dicotyledonous weeds, at an average inference speed of 82.9 frames per second.
These results indicate that the algorithm is suitable for real-time detection of apical meristematic tissue across multiple weed species. The experiments also revealed the effect of distinctive variations in weed morphology on locating the AMT: dicotyledonous and monocotyledonous weeds differed significantly in detection performance, with dicotyledonous weeds detected at markedly higher accuracy than monocotyledonous weeds. This finding offers new insights and directions for future research on identifying and locating the AMT of weeds.
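The paper's CDA step reduces each segmented region to a single target point for the laser by taking the center of mass of its connected domain. The abstract does not include code, but the idea can be sketched as follows; the function name, the `min_area` noise threshold, and the use of `scipy.ndimage` are illustrative assumptions, not the authors' implementation:

```python
import numpy as np
from scipy import ndimage


def amt_centroids(mask: np.ndarray, min_area: int = 20) -> list[tuple[float, float]]:
    """Center-of-mass (x, y) pixel coordinates of each connected region
    in a binary segmentation mask, skipping regions below min_area."""
    # Label connected regions; label 0 is background.
    labels, n_regions = ndimage.label(mask)
    points = []
    for region in range(1, n_regions + 1):
        area = int((labels == region).sum())
        if area < min_area:
            continue  # discard small regions as segmentation noise
        # center_of_mass returns (row, col); report as (x, y).
        cy, cx = ndimage.center_of_mass(mask, labels, region)
        points.append((float(cx), float(cy)))
    return points
```

Each returned point would then be the aiming coordinate for the laser actuator; filtering tiny regions first keeps spurious mask fragments from producing false targets.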