Yoshinari Morio, Yuya Hanada, Yuta Sawada, Katsusuke Murakami
{"title":"自动农用车自定位的现场场景识别","authors":"Yoshinari Morio, Yuya Hanada, Yuta Sawada, Katsusuke Murakami","doi":"10.1016/j.eaef.2019.03.001","DOIUrl":null,"url":null,"abstract":"<div><p>In this study, a field scene recognition system was developed to estimate a self-position of a traveling vehicle along a farm road by using an original capture system with three cameras, a vector quantization method to express the features of field scenes, a machine learning based scene recognition algorithm, and a vehicle position estimation algorithm with an original voting method. The potential of our system was demonstrated through five experiments performed over four months. In the experiments, the system could robustly estimate the vehicle position with the accuracy less than 1 m at the processing speed of approximately 2.0 Hz when the vehicle was driven straight along a traveling line on the targeted two types of roads: a surfaced road and an unsurfaced road, at the driving speed of 0.5 m/s. The results demonstrated an applicability of our system to navigate an autonomous agricultural robot vehicle without using GNSS.</p></div>","PeriodicalId":38965,"journal":{"name":"Engineering in Agriculture, Environment and Food","volume":"12 3","pages":"Pages 325-340"},"PeriodicalIF":0.0000,"publicationDate":"2019-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1016/j.eaef.2019.03.001","citationCount":"1","resultStr":"{\"title\":\"Field scene recognition for self-localization of autonomous agricultural vehicle\",\"authors\":\"Yoshinari Morio, Yuya Hanada, Yuta Sawada, Katsusuke Murakami\",\"doi\":\"10.1016/j.eaef.2019.03.001\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>In this study, a field scene recognition system was developed to estimate a self-position of a traveling vehicle along a farm road by using an original capture system with three cameras, a vector quantization method to express the features of field scenes, a machine learning based scene recognition algorithm, and a vehicle position estimation algorithm with an original voting method. The potential of our system was demonstrated through five experiments performed over four months. In the experiments, the system could robustly estimate the vehicle position with the accuracy less than 1 m at the processing speed of approximately 2.0 Hz when the vehicle was driven straight along a traveling line on the targeted two types of roads: a surfaced road and an unsurfaced road, at the driving speed of 0.5 m/s. 
The results demonstrated an applicability of our system to navigate an autonomous agricultural robot vehicle without using GNSS.</p></div>\",\"PeriodicalId\":38965,\"journal\":{\"name\":\"Engineering in Agriculture, Environment and Food\",\"volume\":\"12 3\",\"pages\":\"Pages 325-340\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-07-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://sci-hub-pdf.com/10.1016/j.eaef.2019.03.001\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Engineering in Agriculture, Environment and Food\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S1881836617301015\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"Engineering\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Engineering in Agriculture, Environment and Food","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1881836617301015","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"Engineering","Score":null,"Total":0}
Field scene recognition for self-localization of autonomous agricultural vehicle
In this study, a field scene recognition system was developed to estimate the self-position of a vehicle traveling along a farm road, using an original capture system with three cameras, a vector quantization method to express the features of field scenes, a machine-learning-based scene recognition algorithm, and a vehicle position estimation algorithm with an original voting method. The potential of our system was demonstrated through five experiments performed over four months. In these experiments, the system robustly estimated the vehicle position with an accuracy of less than 1 m at a processing speed of approximately 2.0 Hz when the vehicle was driven straight along a traveling line at 0.5 m/s on the two targeted road types: a surfaced road and an unsurfaced road. The results demonstrated the applicability of our system for navigating an autonomous agricultural robot vehicle without GNSS.
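The abstract names two components concrete enough to illustrate: a vector quantization step that expresses each field scene through codewords, and a voting step that turns scene matches into a position estimate. The sketch below is not the authors' implementation; it is a minimal stand-in that assumes a generic k-means codebook over synthetic local descriptors, a codeword histogram per scene, and a toy voting scheme over stored reference positions. All names (build_codebook, quantize, vote_position, reference_positions) are illustrative.

```python
# Minimal, illustrative sketch only (NOT the authors' implementation):
# vector quantization of scene descriptors into a codeword histogram,
# plus a toy voting step over candidate road positions.
import numpy as np

rng = np.random.default_rng(0)

def build_codebook(features, codebook_size, iters=20):
    """Plain k-means codebook over local feature vectors (a generic
    stand-in for the paper's vector quantization step)."""
    centers = features[rng.choice(len(features), codebook_size, replace=False)]
    for _ in range(iters):
        # assign every feature to its nearest codeword
        dists = np.linalg.norm(features[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for k in range(codebook_size):
            members = features[labels == k]
            if len(members):
                centers[k] = members.mean(axis=0)
    return centers

def quantize(features, codebook):
    """Express one scene as a normalized histogram of codeword counts."""
    dists = np.linalg.norm(features[:, None, :] - codebook[None, :, :], axis=2)
    hist = np.bincount(dists.argmin(axis=1), minlength=len(codebook)).astype(float)
    return hist / hist.sum()

def vote_position(query_hist, reference_hists, reference_positions, top_k=5):
    """Toy voting: the top-k most similar reference scenes each cast a vote
    for their stored road position; the 1 m bin with the most votes wins."""
    sims = reference_hists @ query_hist                        # histogram similarity
    voters = np.argsort(sims)[-top_k:]
    bins = np.round(reference_positions[voters]).astype(int)   # 1 m position bins
    return np.bincount(bins).argmax()

# Synthetic demo data standing in for camera-derived local descriptors.
train_feats = rng.normal(size=(500, 16))
codebook = build_codebook(train_feats, codebook_size=32)
reference_hists = np.stack(
    [quantize(rng.normal(size=(60, 16)), codebook) for _ in range(40)])
reference_positions = np.linspace(0.0, 39.0, 40)  # metres along the farm road
query_hist = quantize(rng.normal(size=(60, 16)), codebook)
print("estimated position (m):",
      vote_position(query_hist, reference_hists, reference_positions))
```

In the paper's setting, the reference scenes would come from images captured by the three-camera rig along the farm road, with each reference tied to a known point on the traveling line; the histogram and voting machinery above only sketches that idea.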
About the journal:
Engineering in Agriculture, Environment and Food (EAEF) is devoted to the advancement and dissemination of scientific and technical knowledge concerning agricultural machinery, tillage, terramechanics, precision farming, agricultural instrumentation, sensors, bio-robotics, systems automation, processing of agricultural products and foods, quality evaluation and food safety, waste treatment and management, environmental control, energy utilization, agricultural systems engineering, bio-informatics, computer simulation, computational mechanics, farm work systems, and mechanized cropping. It is an international English-language e-journal published and distributed by the Asian Agricultural and Biological Engineering Association (AABEA). Authors should submit manuscript files prepared in MS Word through the journal's website. If required, the manuscript must be approved by the author's organization prior to submission. For questions about manuscript submission or about EAEF, contact the society to which you belong.