{"title":"Three-view cotton flower counting through multi-object tracking and RGB-D imagery","authors":"","doi":"10.1016/j.biosystemseng.2024.08.010","DOIUrl":null,"url":null,"abstract":"<div><p>Monitoring the number of cotton flowers can provide important information for breeders to assess the flowering time and the productivity of genotypes because flowering marks the transition from vegetative growth to reproductive growth and impacts the final yield. Traditional manual counting methods are time-consuming and impractical for large-scale fields. To count cotton flowers efficiently and accurately, a multi-view multi-object tracking approach was proposed by using both RGB and depth images collected by three RGB-D cameras fixed on a ground robotic platform. The tracking-by-detection algorithm was employed to track flowers from three views simultaneously and remove duplicated counting from single views. Specifically, an object detection model (YOLOv8) was trained to detect flowers in RGB images and a deep learning-based optical flow model Recurrent All-pairs Field Transforms (RAFT) was used to estimate motion between two adjacent frames. The intersection over union and distance costs were employed to associate flowers in the tracking algorithm. Additionally, tracked flowers were segmented in RGB images and the depth of each flower was obtained from the corresponding depth image. Those flowers tracked with known depth from two side views were then projected onto the middle image coordinate using camera calibration parameters. Finally, a constrained hierarchy clustering algorithm clustered all flowers in the middle image coordinate to remove duplicated counting from three views. The results showed that the mean average precision of trained YOLOv8x was 96.4%. The counting results of the developed method were highly correlated with those counted manually with a coefficient of determination of 0.92. Besides, the mean absolute percentage error of all 25 testing videos was 6.22%. The predicted cumulative flower number of Pima cotton flowers is higher than that of Acala Maxxa, which is consistent with what breeders have observed. Furthermore, the developed method can also obtain the flower number distributions of different genotypes without laborious manual counting in the field. Overall, the three-view approach provides an efficient and effective approach to count cotton flowers from multiple views. By collecting the video data continuously, this method is beneficial for breeders to dissect genetic mechanisms of flowering time with unprecedented spatial and temporal resolution, also providing a means to discern genetic differences in fecundity, the number of flowers that result in harvestable bolls. 
The code and datasets used in this paper can be accessed on GitHub: <span><span>https://github.com/UGA-BSAIL/Multi-view_flower_counting</span><svg><path></path></svg></span>.</p></div>","PeriodicalId":9173,"journal":{"name":"Biosystems Engineering","volume":null,"pages":null},"PeriodicalIF":4.4000,"publicationDate":"2024-08-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Biosystems Engineering","FirstCategoryId":"97","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1537511024001880","RegionNum":1,"RegionCategory":"农林科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AGRICULTURAL ENGINEERING","Score":null,"Total":0}
Citations: 0
Abstract
Monitoring the number of cotton flowers can provide important information for breeders to assess the flowering time and productivity of genotypes, because flowering marks the transition from vegetative to reproductive growth and impacts the final yield. Traditional manual counting methods are time-consuming and impractical for large-scale fields. To count cotton flowers efficiently and accurately, a multi-view multi-object tracking approach was proposed that uses both RGB and depth images collected by three RGB-D cameras fixed on a ground robotic platform. A tracking-by-detection algorithm was employed to track flowers from three views simultaneously and to remove duplicate counts within each single view. Specifically, an object detection model (YOLOv8) was trained to detect flowers in RGB images, and a deep learning-based optical flow model, Recurrent All-Pairs Field Transforms (RAFT), was used to estimate motion between adjacent frames. Intersection-over-union and distance costs were employed to associate flowers in the tracking algorithm. Additionally, tracked flowers were segmented in the RGB images, and the depth of each flower was obtained from the corresponding depth image. Flowers tracked with known depth in the two side views were then projected into the middle image coordinate system using camera calibration parameters. Finally, a constrained hierarchical clustering algorithm clustered all flowers in the middle image coordinate system to remove duplicate counts across the three views. The results showed that the mean average precision of the trained YOLOv8x model was 96.4%. The counts produced by the developed method were highly correlated with manual counts, with a coefficient of determination of 0.92, and the mean absolute percentage error across all 25 test videos was 6.22%. The predicted cumulative flower number of Pima cotton was higher than that of Acala Maxxa, which is consistent with breeders' observations. Furthermore, the developed method can obtain the flower-number distributions of different genotypes without laborious manual counting in the field. Overall, the three-view approach provides an efficient and effective way to count cotton flowers from multiple views. By collecting video data continuously, this method can help breeders dissect the genetic mechanisms of flowering time with unprecedented spatial and temporal resolution, and it also provides a means to discern genetic differences in fecundity, i.e., the number of flowers that result in harvestable bolls. The code and datasets used in this paper can be accessed on GitHub: https://github.com/UGA-BSAIL/Multi-view_flower_counting.
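To make the tracking-by-detection step concrete, the following minimal Python sketch (not the authors' released code; the Hungarian assignment and the weighting parameter w_iou are assumptions, since the abstract only states that intersection-over-union and distance costs are used) matches flower boxes carried by existing tracks against YOLOv8 detections in the next frame. The motion estimated by RAFT would be applied to shift the track boxes before this matching; that step is omitted here.

import numpy as np
from scipy.optimize import linear_sum_assignment

def iou_matrix(a, b):
    # Pairwise IoU between two arrays of [x1, y1, x2, y2] boxes.
    x1 = np.maximum(a[:, None, 0], b[None, :, 0])
    y1 = np.maximum(a[:, None, 1], b[None, :, 1])
    x2 = np.minimum(a[:, None, 2], b[None, :, 2])
    y2 = np.minimum(a[:, None, 3], b[None, :, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    area_a = (a[:, 2] - a[:, 0]) * (a[:, 3] - a[:, 1])
    area_b = (b[:, 2] - b[:, 0]) * (b[:, 3] - b[:, 1])
    return inter / (area_a[:, None] + area_b[None, :] - inter + 1e-9)

def associate(track_boxes, det_boxes, w_iou=0.5, max_cost=0.8):
    # Combine (1 - IoU) and a normalised centroid distance into one cost matrix,
    # solve the assignment, and reject pairs above max_cost (treated as new flowers).
    if len(track_boxes) == 0 or len(det_boxes) == 0:
        return []
    iou_cost = 1.0 - iou_matrix(track_boxes, det_boxes)
    c_trk = (track_boxes[:, :2] + track_boxes[:, 2:]) / 2.0
    c_det = (det_boxes[:, :2] + det_boxes[:, 2:]) / 2.0
    dist = np.linalg.norm(c_trk[:, None, :] - c_det[None, :, :], axis=2)
    dist_cost = dist / (dist.max() + 1e-9)
    cost = w_iou * iou_cost + (1.0 - w_iou) * dist_cost
    rows, cols = linear_sum_assignment(cost)
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] < max_cost]

Per-frame detections would come from the trained detector, e.g. boxes = YOLO("yolov8x.pt")(frame)[0].boxes.xyxy.cpu().numpy() with the Ultralytics package.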
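The projection of side-view flowers into the middle camera can be sketched as a standard pinhole back-projection followed by a rigid transform. K_side, K_mid, R and t stand for the calibrated intrinsics and the side-to-middle extrinsics; all names here are illustrative rather than taken from the paper.

import numpy as np

def side_to_middle_pixel(u, v, depth, K_side, K_mid, R, t):
    # Back-project the side-view pixel (u, v) with its depth to a 3-D point
    # in the side camera frame (depth and t must share the same unit).
    p_side = depth * (np.linalg.inv(K_side) @ np.array([u, v, 1.0]))
    # Transform into the middle camera frame and project onto its image plane.
    p_mid = R @ p_side + t
    uvw = K_mid @ p_mid
    return uvw[0] / uvw[2], uvw[1] / uvw[2]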
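For the cross-view duplicate removal, a generic agglomerative clustering sketch with SciPy is shown below. The constraint that flowers from the same camera should not be merged is approximated by inflating their pairwise distances, which is an assumed simplification of the constrained hierarchical clustering described in the abstract, and merge_dist is an illustrative pixel threshold.

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist, squareform

def count_unique_flowers(centres, view_ids, merge_dist=40.0):
    # centres: (N, 2) flower centres in the middle image; view_ids: (N,) camera index.
    centres = np.asarray(centres, float)
    view_ids = np.asarray(view_ids)
    if len(centres) == 0:
        return 0
    d = squareform(pdist(centres))
    same_view = view_ids[:, None] == view_ids[None, :]
    d[same_view] = 1e6                  # cannot-link: keep same-view flowers apart
    np.fill_diagonal(d, 0.0)
    z = linkage(squareform(d, checks=False), method="average")
    labels = fcluster(z, t=merge_dist, criterion="distance")
    return int(len(np.unique(labels)))  # one counted flower per cluster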
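The two reported accuracy figures, a coefficient of determination of 0.92 against manual counts and a mean absolute percentage error of 6.22% over the 25 test videos, correspond to the standard formulas sketched below in plain NumPy (assuming non-zero manual counts).

import numpy as np

def r_squared(y_true, y_pred):
    # Coefficient of determination between manual and predicted flower counts.
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def mape(y_true, y_pred):
    # Mean absolute percentage error, in percent.
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))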
About the journal
Biosystems Engineering publishes research in engineering and the physical sciences that represent advances in understanding or modelling of the performance of biological systems for sustainable developments in land use and the environment, agriculture and amenity, bioproduction processes and the food chain. The subject matter of the journal reflects the wide range and interdisciplinary nature of research in engineering for biological systems.