{"title":"Recognition of Lane Markings in Factories and Self-position Estimation Method Using AR Markers","authors":"Kento Hisanaga, Shiyuan Yang, S. Serikawa","doi":"10.12792/icisip2021.036","DOIUrl":null,"url":null,"abstract":"In recent years, many unmanned transfer robots have been introduced in factories and warehouses. Functions such as self-position estimation are indispensable for freely operating automated guided vehicles. In this research, we estimated the self-position in the factory, proposed a robot control method using lane marking, and verified the measurement accuracy of the system in real time. In this study, the camera image is used to read the AR marker and lane markings to calculate the distance between the camera and lane markings and estimate the self-position. In this study, lane markings and AR markers are photographed horizontally with a camera. The distance is calculated from the position and tilt of the lane markings on the image. When the AR marker is detected, the camera is calibrated to calculate the distance and angle, and the self-position is estimated by comparing it with the actual coordinates. As an experiment to measure the distance to the lane marking, the distance was calculated by gradually bringing the camera closer to the stationary camera with a thick paper with a thickness of 30 mm, which is likened to the lane marking. In the distance calculation, two experiments were conducted with the camera oriented horizontally and diagonally with respect to the lane marking. As an experiment of self-position estimation using AR markers, we created a model like a passage in a factory, placed cameras at multiple points, and measured the error from theoretical values. As a method of expressing the self-position, I assigned the x-axis and y-axis to the model in the actual coordinate system and expressed it in two dimensions. In both experiments, in order to verify the accuracy, 100 continuous data were acquired at each point and the variability of the data was investigated.","PeriodicalId":431446,"journal":{"name":"The Proceedings of The 8th International Conference on Intelligent Systems and Image Processing 2021","volume":"67 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"The Proceedings of The 8th International Conference on Intelligent Systems and Image Processing 2021","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.12792/icisip2021.036","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
In recent years, many unmanned transport robots have been introduced in factories and warehouses. Functions such as self-position estimation are indispensable if automated guided vehicles are to operate freely. In this research, we propose a robot control method that estimates a robot's position in a factory from lane markings, and we verify the measurement accuracy of the system in real time. A camera image is used to detect AR markers and lane markings, calculate the distance between the camera and the lane markings, and estimate the self-position. The lane markings and AR markers are photographed horizontally with the camera, and the distance is calculated from the position and tilt of the lane markings in the image. When an AR marker is detected, the calibrated camera is used to calculate the distance and angle to the marker, and the self-position is estimated by comparing the result with the marker's actual coordinates. In the experiment on measuring the distance to a lane marking, a 30 mm strip of thick paper representing the lane marking was gradually brought closer to a stationary camera and the distance was calculated; two cases were tested, with the camera oriented horizontally and diagonally with respect to the lane marking. In the experiment on self-position estimation using AR markers, we built a model of a factory passage, placed the camera at multiple points, and measured the error from the theoretical values. The self-position is expressed in two dimensions by assigning x- and y-axes to the model in the real-world coordinate system. In both experiments, 100 consecutive measurements were acquired at each point and the variability of the data was investigated in order to verify the accuracy.
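The abstract does not give implementation details for the lane-marking step (estimating distance from the position and tilt of the marking in the image), but a minimal sketch with standard OpenCV edge and line detection conveys the idea. The file name, Canny/Hough thresholds, and the scale constant PIXELS_PER_MM below are illustrative assumptions, not values from the paper.

```python
import cv2
import numpy as np

# Illustrative constants; the paper does not specify these values.
FRAME_PATH = "lane_frame.png"      # hypothetical input image
PIXELS_PER_MM = 2.0                # hypothetical image-to-world scale at the marking

frame = cv2.imread(FRAME_PATH)
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Edge detection followed by a probabilistic Hough transform to find line segments.
edges = cv2.Canny(gray, 50, 150)
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                        minLineLength=100, maxLineGap=10)

if lines is not None:
    # Take the longest segment as the lane-marking candidate.
    x1, y1, x2, y2 = max(lines[:, 0],
                         key=lambda l: np.hypot(l[2] - l[0], l[3] - l[1]))

    # Tilt of the marking relative to the image's horizontal axis.
    tilt_deg = np.degrees(np.arctan2(y2 - y1, x2 - x1))

    # Vertical offset of the segment midpoint from the image centre,
    # converted to millimetres with the assumed scale.
    mid_y = (y1 + y2) / 2.0
    offset_mm = (mid_y - gray.shape[0] / 2.0) / PIXELS_PER_MM

    print(f"tilt: {tilt_deg:.1f} deg, lateral offset: {offset_mm:.1f} mm")
```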
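For the AR-marker step, one common way to obtain the camera-to-marker distance and angle from a calibrated camera is OpenCV's ArUco module. The sketch below assumes that module (legacy cv2.aruco API, as in opencv-contrib before 4.7) together with illustrative intrinsics, dictionary, and marker size; it is not the authors' actual setup.

```python
import cv2
import numpy as np

# Illustrative intrinsics; in practice these come from cv2.calibrateCamera.
camera_matrix = np.array([[800.0,   0.0, 320.0],
                          [  0.0, 800.0, 240.0],
                          [  0.0,   0.0,   1.0]])
dist_coeffs = np.zeros(5)

MARKER_LENGTH_M = 0.10   # assumed marker side length in metres

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

frame = cv2.imread("marker_frame.png")     # hypothetical input image
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Detect markers (legacy API; OpenCV >= 4.7 uses cv2.aruco.ArucoDetector instead).
corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)

if ids is not None:
    # Estimate each marker's pose (rotation and translation) relative to the camera.
    rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
        corners, MARKER_LENGTH_M, camera_matrix, dist_coeffs)

    for marker_id, tvec in zip(ids.flatten(), tvecs):
        distance_m = float(np.linalg.norm(tvec))               # camera-to-marker distance
        angle_deg = float(np.degrees(np.arctan2(tvec[0][0],    # horizontal bearing
                                                tvec[0][2])))
        print(f"marker {marker_id}: {distance_m:.3f} m, {angle_deg:.1f} deg")
```

Given a marker's known position in the factory, the camera-frame pose obtained this way can be transformed into the real-world x-y coordinate system to yield the two-dimensional self-position described in the abstract.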