{"title":"Safflower picking points localization method during the full harvest period based on SBP-YOLOv8s-seg network","authors":"He Zhang, Yun Ge, Hao Xia, Chao Sun","doi":"10.1016/j.compag.2024.109646","DOIUrl":null,"url":null,"abstract":"<div><div>Visual recognition is crucial for robotic harvesting of safflower filaments in field. However, accurate detection and localization is challenging due to complex backgrounds, leaves and branches shielding, and variable safflower morphology. This study proposes a safflower picking points localization method during the full harvest period based on SBP-YOLOv8s-seg network. The method enhanced the accuracy by improving the performance of the detection and segmentation network and implementing phased localization. Specifically, SBP-YOLOv8s-seg network based on self-calibration was constructed for precise segmentation of safflower filaments and fruit balls. Additionally, different morphological features of safflower during the full harvest period were analyzed. The segmented masks underwent Principal Component Analysis (PCA) computation, region of interest (ROI) extraction, and contour fitting to extract the principal eigenvectors that express information about the filaments. To address the issue of picking position being invisible due to the occlusion of safflower necking, the picking points were determined in conjunction with the positional relationship between filaments and fruit balls. Experimental results demonstrated that the segmentation performance of SBP-YOLOv8s-seg network was superior to other networks, achieving a significant improvement in mean average precision (mAP) compared to YOLOv5s-seg, YOLOv6s-seg, YOLOv7s-seg, and YOLOv8s-seg, with improvements of 5.1 %, 2.3 %, 4.1 %, and 1.3 % respectively. The precision, recall and mAP of SBP-YOLOv8s-seg network in the segmentation task increased from 87.9 %, 79 %, and 84.4 % of YOLOv8s-seg to 89.1 %, 79.7 %, and 85.7 %. The accuracy of blooming safflower and decaying safflower calculated by the proposed method were 93.0 % and 91.9 %, respectively. The overall localization accuracy of safflower picking points was 92.9 %. Field experiments showed that the picking success rate was 90.7 %. This study provides a theoretical basis and data support for visual localization of safflower picking robot in the future.</div></div>","PeriodicalId":50627,"journal":{"name":"Computers and Electronics in Agriculture","volume":"227 ","pages":"Article 109646"},"PeriodicalIF":7.7000,"publicationDate":"2024-11-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computers and Electronics in Agriculture","FirstCategoryId":"97","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0168169924010378","RegionNum":1,"RegionCategory":"农林科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AGRICULTURE, MULTIDISCIPLINARY","Score":null,"Total":0}
Citations: 0
Abstract
Visual recognition is crucial for robotic harvesting of safflower filaments in the field. However, accurate detection and localization are challenging due to complex backgrounds, occlusion by leaves and branches, and variable safflower morphology. This study proposes a safflower picking point localization method for the full harvest period based on the SBP-YOLOv8s-seg network. The method improves accuracy by enhancing the performance of the detection and segmentation network and by implementing phased localization. Specifically, the SBP-YOLOv8s-seg network, based on self-calibration, was constructed for precise segmentation of safflower filaments and fruit balls. Additionally, the different morphological features of safflower during the full harvest period were analyzed. The segmented masks underwent Principal Component Analysis (PCA), region of interest (ROI) extraction, and contour fitting to extract the principal eigenvectors that express information about the filaments. To address the problem of the picking position being invisible because of occlusion caused by safflower necking, the picking points were determined using the positional relationship between filaments and fruit balls. Experimental results demonstrated that the segmentation performance of the SBP-YOLOv8s-seg network was superior to that of other networks, achieving improvements in mean average precision (mAP) of 5.1 %, 2.3 %, 4.1 %, and 1.3 % over YOLOv5s-seg, YOLOv6s-seg, YOLOv7s-seg, and YOLOv8s-seg, respectively. The precision, recall, and mAP of the SBP-YOLOv8s-seg network in the segmentation task increased from 87.9 %, 79.0 %, and 84.4 % for YOLOv8s-seg to 89.1 %, 79.7 %, and 85.7 %. The localization accuracies for blooming safflower and decaying safflower obtained with the proposed method were 93.0 % and 91.9 %, respectively. The overall localization accuracy of safflower picking points was 92.9 %. Field experiments showed a picking success rate of 90.7 %. This study provides a theoretical basis and data support for the visual localization of safflower picking robots in the future.
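The PCA-based orientation step described above can be illustrated with a short sketch. The Python snippet below is a minimal illustration, not the authors' implementation: it assumes a binary filament mask and a fruit-ball centroid produced by an instance-segmentation model (e.g., a YOLOv8-seg variant), estimates the filament's principal axis via PCA of the mask pixel coordinates, and shifts the filament centroid toward the fruit ball to approximate a picking point when the stem junction is occluded. All function names and the pixel offset are hypothetical.

```python
# Illustrative sketch (not the paper's code): PCA principal axis of a binary
# filament mask, then a picking point placed along that axis toward the
# fruit-ball centroid, mimicking the filament/fruit-ball positional rule.
import numpy as np

def principal_axis(mask: np.ndarray):
    """Return centroid and unit principal eigenvector of a binary mask (H, W)."""
    ys, xs = np.nonzero(mask)
    pts = np.stack([xs, ys], axis=1).astype(np.float64)  # (N, 2) as (x, y)
    centroid = pts.mean(axis=0)
    cov = np.cov((pts - centroid).T)                     # 2x2 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)
    axis = eigvecs[:, np.argmax(eigvals)]                # largest-variance direction
    return centroid, axis / np.linalg.norm(axis)

def picking_point(filament_mask: np.ndarray,
                  ball_centroid: np.ndarray,
                  offset: float = 10.0) -> np.ndarray:
    """Shift the filament centroid `offset` pixels along the principal axis,
    oriented toward the fruit ball (a stand-in for the occlusion-handling rule)."""
    f_centroid, axis = principal_axis(filament_mask)
    if np.dot(axis, ball_centroid - f_centroid) < 0:
        axis = -axis                                     # point the axis at the ball
    return f_centroid + offset * axis

if __name__ == "__main__":
    # Toy example: an elongated vertical "filament" above a fruit-ball centroid.
    mask = np.zeros((100, 100), dtype=np.uint8)
    mask[20:60, 48:53] = 1
    ball = np.array([50.0, 75.0])                        # (x, y) of the fruit ball
    print(picking_point(mask, ball))
```

In this toy case the principal axis is vertical, so the picking point lies on the filament midline, displaced toward the fruit ball; in practice the offset and orientation handling would come from the paper's phased localization and contour-fitting steps.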
Journal Introduction:
Computers and Electronics in Agriculture provides international coverage of advancements in computer hardware, software, electronic instrumentation, and control systems applied to agricultural challenges. Encompassing agronomy, horticulture, forestry, aquaculture, and animal farming, the journal publishes original papers, reviews, and application notes. It explores the use of computers and electronics in plant or animal agricultural production, covering topics like agricultural soils, water, pests, controlled environments, and waste. The scope extends to on-farm post-harvest operations and relevant technologies, including artificial intelligence, sensors, machine vision, robotics, networking, and simulation modeling. Its companion journal, Smart Agricultural Technology, continues the focus on smart applications in production agriculture.