Recognition of Lane Markings in Factories and Self-position Estimation Method Using AR Markers

Kento Hisanaga, Shiyuan Yang, S. Serikawa
{"title":"Recognition of Lane Markings in Factories and Self-position Estimation Method Using AR Markers","authors":"Kento Hisanaga, Shiyuan Yang, S. Serikawa","doi":"10.12792/icisip2021.036","DOIUrl":null,"url":null,"abstract":"In recent years, many unmanned transfer robots have been introduced in factories and warehouses. Functions such as self-position estimation are indispensable for freely operating automated guided vehicles. In this research, we estimated the self-position in the factory, proposed a robot control method using lane marking, and verified the measurement accuracy of the system in real time. In this study, the camera image is used to read the AR marker and lane markings to calculate the distance between the camera and lane markings and estimate the self-position. In this study, lane markings and AR markers are photographed horizontally with a camera. The distance is calculated from the position and tilt of the lane markings on the image. When the AR marker is detected, the camera is calibrated to calculate the distance and angle, and the self-position is estimated by comparing it with the actual coordinates. As an experiment to measure the distance to the lane marking, the distance was calculated by gradually bringing the camera closer to the stationary camera with a thick paper with a thickness of 30 mm, which is likened to the lane marking. In the distance calculation, two experiments were conducted with the camera oriented horizontally and diagonally with respect to the lane marking. As an experiment of self-position estimation using AR markers, we created a model like a passage in a factory, placed cameras at multiple points, and measured the error from theoretical values. As a method of expressing the self-position, I assigned the x-axis and y-axis to the model in the actual coordinate system and expressed it in two dimensions. In both experiments, in order to verify the accuracy, 100 continuous data were acquired at each point and the variability of the data was investigated.","PeriodicalId":431446,"journal":{"name":"The Proceedings of The 8th International Conference on Intelligent Systems and Image Processing 2021","volume":"67 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"The Proceedings of The 8th International Conference on Intelligent Systems and Image Processing 2021","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.12792/icisip2021.036","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

In recent years, many unmanned transport robots have been introduced in factories and warehouses. Functions such as self-position estimation are indispensable if automated guided vehicles are to operate freely. In this research, we estimated the robot's position inside a factory, proposed a robot control method based on lane markings, and verified the measurement accuracy of the system in real time. The camera image is used to read an AR marker and the lane markings, from which the distance between the camera and the lane markings is calculated and the self-position is estimated. Lane markings and AR markers are photographed with a horizontally oriented camera, and the distance is calculated from the position and tilt of the lane markings in the image. When an AR marker is detected, the camera is calibrated, the distance and angle to the marker are calculated, and the self-position is estimated by comparing the result with the actual coordinates. In the lane-marking distance experiment, a 30 mm strip of thick paper standing in for a lane marking was gradually brought closer to the stationary camera, and the distance was computed at each step; the experiment was run twice, with the camera oriented horizontally and diagonally with respect to the lane marking. In the self-position estimation experiment using AR markers, we built a model of a factory passageway, placed the camera at multiple points, and measured the error relative to theoretical values. To express the self-position, we assigned an x-axis and a y-axis to the model in the actual coordinate system and represented the position in two dimensions. In both experiments, 100 consecutive measurements were acquired at each point and their variability was examined in order to verify the accuracy.
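The paper does not specify an implementation, but the AR-marker step described above (detect the marker, use the calibrated camera to recover distance and angle, then compare with known coordinates) is commonly built on OpenCV's ArUco module. The sketch below is a minimal, hypothetical version of that step only; the dictionary choice, the marker side length MARKER_LENGTH, and the function name estimate_pose_from_marker are assumptions for illustration, not the authors' code.

import cv2
import numpy as np

MARKER_LENGTH = 0.10  # assumed marker side length in metres (hypothetical value)

def estimate_pose_from_marker(frame, camera_matrix, dist_coeffs):
    """Detect one ArUco marker and return its id, distance, and approximate yaw
    relative to the camera. camera_matrix and dist_coeffs come from a prior
    camera calibration, as in the paper's calibration step."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, aruco_dict)
    if ids is None:
        return None

    # Marker corners in the marker's own coordinate frame
    # (ordering matches detectMarkers: TL, TR, BR, BL).
    half = MARKER_LENGTH / 2.0
    obj_pts = np.array([[-half,  half, 0.0],
                        [ half,  half, 0.0],
                        [ half, -half, 0.0],
                        [-half, -half, 0.0]], dtype=np.float32)

    # Pose of the first detected marker relative to the camera.
    img_pts = corners[0].reshape(4, 2).astype(np.float32)
    ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, camera_matrix, dist_coeffs)
    if not ok:
        return None

    distance = float(np.linalg.norm(tvec))   # straight-line camera-to-marker distance
    rot_mat, _ = cv2.Rodrigues(rvec)         # rotation matrix from rotation vector
    # Approximate yaw about the camera's vertical axis under a simple convention.
    yaw = float(np.degrees(np.arctan2(rot_mat[2, 0], rot_mat[0, 0])))
    return int(ids[0][0]), distance, yaw

In the setup described in the abstract, the resulting distance and angle would then be compared with the marker's known position in the factory's x-y coordinate system to obtain the two-dimensional self-position; that mapping step is omitted from the sketch.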