Tactile Sensing & Visually-Impaired Navigation in Densely Planted Row Crops, for Precision Fertilization by Small UGVs
Adam M. Gronewold, Philip Mulford, Eliana Ray, Laura E. Ray
Computers and Electronics in Agriculture, Vol. 231, Article 110003 (published 2025-02-16)
DOI: 10.1016/j.compag.2025.110003
Citations: 0
Abstract
Navigating outdoor agricultural environments with cameras or ranging sensors is challenging due to sensor occlusion, lighting variability, and dense vegetation, particularly in tightly spaced row crops. These environments present visually similar surfaces, making it difficult for vision-based systems to distinguish between rigid obstacles and flexible, traversable objects like weeds. As plant density increases, the margin of error narrows, limiting the effectiveness of traditional visual sensing. To overcome these challenges, we present a novel tactile-based perception system for autonomous navigation without any form of remote sensing. The system uses a mechanical feeler with rotary encoders to detect and map rigid obstacles, such as corn stalks, while filtering out flexible features like leaves and weeds. Through real-time classification of sensor deflections, the system achieves approximately 97% accuracy in detecting obstacles and global positioning accuracy within 4 cm of a plant's true location. The tactile sensor system, alongside blind-adapted path-planning (A*) and path-following (pure pursuit) algorithms, further allows an unmanned ground vehicle to autonomously navigate cornfields. Prototype sensors and the navigation method were tested in simulation, a controlled real-world environment, and a mature, unmanicured cornfield, demonstrating autonomous runs of more than 100 m in simulated cornfields and more than 30 m in real-world cornfields before intervention was needed. The tactile system overcomes row curvature, planting gaps, dense weeds, and canopy variability without relying on vision or ranging sensors. With additional refinement, visual and tactile sensing modalities may be combined for more reliable obstacle detection and navigation for small robots operating in visually-occluded agricultural environments.
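The abstract does not detail how sensor deflections are classified, but a minimal threshold-based sketch illustrates the idea. The `Deflection` fields, the numeric thresholds, and the angle-plus-persistence heuristic below are illustrative assumptions, not the authors' published classifier:

```python
# Hypothetical sketch: labeling feeler-contact events as rigid obstacles
# (e.g., corn stalks) vs. flexible clutter (leaves, weeds). All thresholds
# and the heuristic itself are assumptions for illustration only.
from dataclasses import dataclass


@dataclass
class Deflection:
    angle_deg: float    # peak feeler deflection reported by the rotary encoder
    duration_s: float   # how long the deflection persisted during contact


def classify_contact(d: Deflection,
                     angle_thresh_deg: float = 15.0,
                     duration_thresh_s: float = 0.2) -> str:
    """Return 'rigid' or 'flexible' for one contact event.

    Assumption: a rigid stalk forces a large, sustained deflection as the
    vehicle drives past it, while a leaf or weed bends out of the feeler's
    path, producing only a small or brief deflection.
    """
    if d.angle_deg >= angle_thresh_deg and d.duration_s >= duration_thresh_s:
        return "rigid"
    return "flexible"


# Example: a sustained 28-degree deflection is treated as a stalk hit,
# which a downstream mapper could convert into a plant-position estimate.
print(classify_contact(Deflection(angle_deg=28.0, duration_s=0.35)))  # -> rigid
print(classify_contact(Deflection(angle_deg=6.0, duration_s=0.05)))   # -> flexible
```

In a full pipeline, the resulting 'rigid' detections would feed the map used by the A* planner, while the pure-pursuit follower tracks the planned path between rows; the code above shows only the per-contact decision step.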
About the Journal
Computers and Electronics in Agriculture provides international coverage of advancements in computer hardware, software, electronic instrumentation, and control systems applied to agricultural challenges. Encompassing agronomy, horticulture, forestry, aquaculture, and animal farming, the journal publishes original papers, reviews, and application notes. It explores the use of computers and electronics in plant or animal agricultural production, covering topics like agricultural soils, water, pests, controlled environments, and waste. The scope extends to on-farm post-harvest operations and relevant technologies, including artificial intelligence, sensors, machine vision, robotics, networking, and simulation modeling. Its companion journal, Smart Agricultural Technology, continues the focus on smart applications in production agriculture.