Meng Yang, Chenglong Huang, Zhengda Li, Yang Shao, Jinzhan Yuan, Wanneng Yang, Peng Song
{"title":"Autonomous navigation method based on RGB-D camera for a crop phenotyping robot","authors":"Meng Yang, Chenglong Huang, Zhengda Li, Yang Shao, Jinzhan Yuan, Wanneng Yang, Peng Song","doi":"10.1002/rob.22379","DOIUrl":null,"url":null,"abstract":"<p>Phenotyping robots have the potential to obtain crop phenotypic traits on a large scale with high throughput. Autonomous navigation technology for phenotyping robots can significantly improve the efficiency of phenotypic traits collection. This study developed an autonomous navigation method utilizing an RGB-D camera, specifically designed for phenotyping robots in field environments. The PP-LiteSeg semantic segmentation model was employed due to its real-time and accurate segmentation capabilities, enabling the distinction of crop areas in images captured by the RGB-D camera. Navigation feature points were extracted from these segmented areas, with their three-dimensional coordinates determined from pixel and depth information, facilitating the computation of angle deviation (<i>α</i>) and lateral deviation (<i>d</i>). Fuzzy controllers were designed with <i>α</i> and <i>d</i> as inputs for real-time deviation correction during the walking of phenotyping robot. Additionally, the method includes end-of-row recognition and row spacing calculation, based on both visible and depth data, enabling automatic turning and row transition. The experimental results showed that the adopted PP-LiteSeg semantic segmentation model had a testing accuracy of 95.379% and a mean intersection over union of 90.615%. The robot's navigation demonstrated an average walking deviation of 1.33 cm, with a maximum of 3.82 cm. Additionally, the average error in row spacing measurement was 2.71 cm, while the success rate of row transition at the end of the row was 100%. 
These findings indicate that the proposed method provides effective support for the autonomous operation of phenotyping robots.</p>","PeriodicalId":192,"journal":{"name":"Journal of Field Robotics","volume":"41 8","pages":"2663-2675"},"PeriodicalIF":4.2000,"publicationDate":"2024-06-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1002/rob.22379","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Field Robotics","FirstCategoryId":"94","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1002/rob.22379","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ROBOTICS","Score":null,"Total":0}
Citations: 0
Abstract
Phenotyping robots have the potential to obtain crop phenotypic traits on a large scale with high throughput. Autonomous navigation technology for phenotyping robots can significantly improve the efficiency of phenotypic trait collection. This study developed an autonomous navigation method utilizing an RGB-D camera, specifically designed for phenotyping robots in field environments. The PP-LiteSeg semantic segmentation model was employed due to its real-time and accurate segmentation capabilities, enabling the distinction of crop areas in images captured by the RGB-D camera. Navigation feature points were extracted from these segmented areas, with their three-dimensional coordinates determined from pixel and depth information, facilitating the computation of angle deviation (α) and lateral deviation (d). Fuzzy controllers were designed with α and d as inputs for real-time deviation correction during the walking of the phenotyping robot. Additionally, the method includes end-of-row recognition and row spacing calculation, based on both visible and depth data, enabling automatic turning and row transition. The experimental results showed that the adopted PP-LiteSeg semantic segmentation model had a testing accuracy of 95.379% and a mean intersection over union of 90.615%. The robot's navigation demonstrated an average walking deviation of 1.33 cm, with a maximum of 3.82 cm. Additionally, the average error in row spacing measurement was 2.71 cm, while the success rate of row transition at the end of the row was 100%. These findings indicate that the proposed method provides effective support for the autonomous operation of phenotyping robots.
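The geometric core of the pipeline described above (pixel + depth → 3D feature points → angle deviation α and lateral deviation d) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the camera intrinsics (FX, FY, CX, CY) are assumed placeholder values, the pinhole back-projection is the standard model, and the line fit over ground-plane points is a simple least-squares choice; the paper does not publish these details.

```python
import math

# Assumed camera intrinsics (illustrative placeholders, not from the paper).
FX, FY = 600.0, 600.0   # focal lengths (pixels)
CX, CY = 320.0, 240.0   # principal point (pixels)

def pixel_to_camera(u, v, depth):
    """Back-project pixel (u, v) with depth (m) via the pinhole model."""
    x = (u - CX) * depth / FX   # lateral offset in the camera frame
    y = (v - CY) * depth / FY   # vertical offset
    return x, y, depth          # depth is the forward coordinate z

def deviations(points_3d):
    """Fit x = m*z + c over the feature points and return (alpha, d).

    alpha: angle deviation of the crop-row line from the heading (rad)
    d:     lateral deviation, the fitted line's x-offset at z = 0 (m)
    """
    zs = [p[2] for p in points_3d]
    xs = [p[0] for p in points_3d]
    n = len(points_3d)
    z_mean, x_mean = sum(zs) / n, sum(xs) / n
    m = sum((z - z_mean) * (x - x_mean) for z, x in zip(zs, xs)) \
        / sum((z - z_mean) ** 2 for z in zs)
    c = x_mean - m * z_mean
    return math.atan(m), c
```

For example, feature points lying on a row offset 5 cm to the right and angled by atan(0.1) from the heading would yield d ≈ 0.05 m and α ≈ 0.0997 rad; the fuzzy controller would then take these two values as its inputs for steering correction.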
Journal Introduction:
The Journal of Field Robotics seeks to promote scholarly publications dealing with the fundamentals of robotics in unstructured and dynamic environments.
The Journal focuses on experimental robotics and encourages publication of work that has both theoretical and practical significance.