{"title":"Detection and Navigation of Unmanned Vehicles in Wooded Environments Using Light Detection and Ranging Sensors","authors":"Zhiwei Zhang, Jyun-Yu Jhang, Cheng-Jian Lin","doi":"10.18494/sam4688","DOIUrl":null,"url":null,"abstract":"With the advancement of automatic navigation, navigation control has become an indispensable core technology in the movement of unmanned vehicles. In particular, research on navigation control in outdoor wooded environments, which are more complex, less controlled, and more unpredictable than indoor environments, has received widespread attention. To realize the movement control and obstacle avoidance of unmanned vehicles in unknown environments, in this study, we use light detection and ranging (LiDAR) sensors to sense the surrounding environment. By plane meshing the point cloud reflected from LiDAR, we can instantly establish feasible regions. At the same time, using the artificial potential field algorithm, a stable obstacle avoidance and navigation path is planned for use in an unknown environment. In an actual woods navigation experiment to evaluate our proposed LiDAR detection method, we used an independently developed unmanned vehicle with Ackermann steering geometry. Experimental results indicate that the proposed method can effectively detect obstacles. The accuracy requirement is within 30 cm from the navigation target, and the experimental results show that the average navigation success rate of the proposed method is as high as 85%. The experimental results demonstrate that the system can stably and safely navigate in scenarios with different unknown environments.","PeriodicalId":22154,"journal":{"name":"Sensors and Materials","volume":"44 3","pages":""},"PeriodicalIF":1.0000,"publicationDate":"2023-11-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Sensors and Materials","FirstCategoryId":"88","ListUrlMain":"https://doi.org/10.18494/sam4688","RegionNum":4,"RegionCategory":"材料科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"INSTRUMENTS & INSTRUMENTATION","Score":null,"Total":0}
Abstract
With the advancement of automatic navigation, navigation control has become an indispensable core technology for the movement of unmanned vehicles. In particular, research on navigation control in outdoor wooded environments, which are more complex, less controlled, and more unpredictable than indoor environments, has received widespread attention. To realize movement control and obstacle avoidance of unmanned vehicles in unknown environments, in this study we use light detection and ranging (LiDAR) sensors to perceive the surrounding environment. By applying plane meshing to the point cloud returned by the LiDAR, feasible regions are established in real time. An artificial potential field algorithm then plans a stable obstacle avoidance and navigation path through the unknown environment. To evaluate the proposed LiDAR detection method, we conducted an actual woodland navigation experiment using an independently developed unmanned vehicle with Ackermann steering geometry. The experimental results indicate that the proposed method detects obstacles effectively. With an accuracy requirement of reaching within 30 cm of the navigation target, the proposed method achieves an average navigation success rate of 85%. These results demonstrate that the system can navigate stably and safely across different unknown environments.
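As a rough illustration of the two ingredients named in the abstract, a drivable-region map built from the LiDAR point cloud and an artificial-potential-field (APF) planner, the following minimal Python sketch uses a simple height-threshold occupancy grid in place of the paper's plane-meshing step. Every function name, parameter, and threshold here is an illustrative assumption, not the authors' implementation.

```python
# Minimal sketch: obstacle cells from a LiDAR point cloud plus an
# artificial-potential-field planner. All names and parameters are
# illustrative assumptions, not the paper's implementation.
import numpy as np

def obstacle_cells(points, cell=0.3, ground_z=0.2, dim=40):
    """Return centres (x, y) of grid cells containing above-ground returns.
    points: (N, 3) array of LiDAR x, y, z in the vehicle frame (metres).
    A height threshold stands in for the paper's plane-meshing step."""
    half = dim * cell / 2.0
    occupied = set()
    for x, y, z in points:
        if z < ground_z:                        # low returns: assume ground
            continue
        i, j = int((x + half) // cell), int((y + half) // cell)
        if 0 <= i < dim and 0 <= j < dim:
            occupied.add((i, j))
    return [np.array([i * cell - half + cell / 2,
                      j * cell - half + cell / 2]) for i, j in occupied]

def apf_step(pos, goal, obstacles, k_att=1.0, k_rep=2.0, d0=1.5, step=0.1):
    """One APF update: attraction toward the goal, repulsion from obstacles
    closer than the influence distance d0."""
    force = k_att * (goal - pos)                # attractive component
    for obs in obstacles:
        diff = pos - obs
        d = np.linalg.norm(diff)
        if 1e-6 < d < d0:                       # within influence range
            force += k_rep * (1.0 / d - 1.0 / d0) / d**2 * (diff / d)
    n = np.linalg.norm(force)
    return pos if n < 1e-6 else pos + step * force / n

if __name__ == "__main__":
    # Toy scene: a few returns from one tree trunk between start and goal.
    cloud = np.array([[2.5, 0.4, 0.8], [2.5, 0.5, 1.1], [2.6, 0.4, 0.5],
                      [1.0, -1.0, 0.05]])       # last point is a ground return
    obstacles = obstacle_cells(cloud)
    pos, goal = np.array([0.0, 0.0]), np.array([5.0, 0.0])
    for _ in range(200):
        pos = apf_step(pos, goal, obstacles)
        if np.linalg.norm(goal - pos) < 0.3:    # 30 cm target criterion
            break
    print("final position:", pos)
```

In this toy run the attractive term pulls the vehicle position toward the goal while the repulsive term deflects it around the occupied cell, and the loop stops once the position is within the 30 cm accuracy requirement quoted in the abstract.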
About the journal
Sensors and Materials provides a forum for researchers working in the multidisciplinary field of sensing technology. It publishes contributions describing original experimental and theoretical work aimed at understanding sensing technology, related materials, associated phenomena, and applied systems. Expository review papers and short notes are also acceptable.