Visual Localization Strategy for Indoor Mobile Robots in the Complex Environment

Xiaohan Lei, Fei Zhang, Junyi Zhou, Weiwei Shang
{"title":"Visual Localization Strategy for Indoor Mobile Robots in the Complex Environment","authors":"Xiaohan Lei, Fei Zhang, Junyi Zhou, Weiwei Shang","doi":"10.1109/ICMA54519.2022.9856360","DOIUrl":null,"url":null,"abstract":"Vision-based mobile positioning technology has a broad application prospect. Still, it is easy to be disturbed by external environmental factors, and the positioning accuracy and robustness in a complex environment are poor. Therefore, this paper designs a high-precision visual positioning strategy for a complex environment via fusing stereo visual odometry and Inertial Measurement Unit (IMU) data. A multi-sensor calibration method is utilized to compensate for the measurement error of IMU and the parameter error of the stereo camera. A multi-sensor data synchronization alignment method based on timestamp is also designed to realize the synchronous acquisition and processing of multi-sensor data. Based on the Unscented Kalman Filter (UKF) algorithm, we implement a nonlinear data coupling method to fuse the stereo visual odometry and IMU information to obtain high-precision positioning. In the complex and open laboratory environment, the experimental results of fused localization show that the accuracy and robustness of the mobile robot localization are significantly improved. The global maximum error is reduced by 15%, and the variance is reduced by 5%.","PeriodicalId":120073,"journal":{"name":"2022 IEEE International Conference on Mechatronics and Automation (ICMA)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2022-08-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE International Conference on Mechatronics and Automation (ICMA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICMA54519.2022.9856360","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

Vision-based mobile positioning technology has broad application prospects, but it is easily disturbed by external environmental factors, and its positioning accuracy and robustness in complex environments are poor. This paper therefore designs a high-precision visual positioning strategy for complex environments by fusing stereo visual odometry with Inertial Measurement Unit (IMU) data. A multi-sensor calibration method compensates for the measurement error of the IMU and the parameter error of the stereo camera. A timestamp-based multi-sensor synchronization and alignment method is also designed to realize synchronous acquisition and processing of the sensor data. Based on the Unscented Kalman Filter (UKF), a nonlinear data-coupling method fuses the stereo visual odometry and IMU information to obtain high-precision positioning. In a complex, open laboratory environment, experiments on the fused localization show that the accuracy and robustness of mobile robot localization are significantly improved: the global maximum error is reduced by 15%, and the variance is reduced by 5%.
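
The abstract describes two ideas that lend themselves to a small illustration: timestamp-based alignment of camera and IMU data, and UKF-based fusion of stereo visual odometry with IMU information. The Python sketch below is a minimal, assumed rendering of those ideas, not the authors' implementation: it uses the third-party filterpy library, a planar constant-velocity state [px, py, vx, vy], nearest-timestamp pairing, and placeholder noise values, all of which are illustrative assumptions.

# Minimal sketch (assumed model): fuse stereo-VO position fixes with an
# IMU-driven prediction step via an Unscented Kalman Filter.
import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints


def fx(x, dt, accel=np.zeros(2)):
    """Constant-velocity process model driven by (bias-compensated) IMU acceleration."""
    px, py, vx, vy = x
    return np.array([px + vx * dt + 0.5 * accel[0] * dt**2,
                     py + vy * dt + 0.5 * accel[1] * dt**2,
                     vx + accel[0] * dt,
                     vy + accel[1] * dt])


def hx(x):
    """Measurement model: stereo visual odometry is treated as observing position only."""
    return x[:2]


def nearest_imu(t_cam, imu_stamps, imu_samples):
    """Timestamp-based alignment: pick the IMU sample closest to the camera stamp."""
    idx = int(np.argmin(np.abs(np.asarray(imu_stamps) - t_cam)))
    return imu_samples[idx]


points = MerweScaledSigmaPoints(n=4, alpha=0.1, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=4, dim_z=2, dt=0.05, fx=fx, hx=hx, points=points)
ukf.x = np.zeros(4)                         # start at the origin, at rest
ukf.P *= 0.1                                # initial state uncertainty (assumed)
ukf.Q = np.diag([1e-4, 1e-4, 1e-3, 1e-3])   # process noise (assumed)
ukf.R = np.diag([5e-3, 5e-3])               # stereo-VO measurement noise (assumed)


def fuse(cam_stamps, vo_positions, imu_stamps, imu_accels):
    """Run predict/update over time-aligned camera frames and return the fused track."""
    track, t_prev = [], cam_stamps[0]
    for t, z in zip(cam_stamps, vo_positions):
        accel = nearest_imu(t, imu_stamps, imu_accels)
        ukf.predict(dt=max(t - t_prev, 1e-3), accel=accel)
        ukf.update(np.asarray(z))
        track.append(ukf.x.copy())
        t_prev = t
    return np.array(track)

In this sketch the IMU only drives the prediction step while the visual odometry corrects it; the paper's actual state vector, calibration compensation, and noise tuning are not specified in the abstract and would replace the placeholder choices above.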