SiLK-SLAM: accurate, robust and versatile visual SLAM with simple learned keypoints

Jianjun Yao, Yingzhao Li
Industrial Robot. Published 2024-03-11. DOI: https://doi.org/10.1108/ir-11-2023-0309

Abstract

Purpose

Handcrafted keypoints exhibit weak repeatability, leading to tracking failures in visual simultaneous localization and mapping (SLAM) systems under challenging conditions such as illumination change, rapid rotation and large viewpoint variation. Learning-based keypoints are more repeatable but incur considerable computational cost. This paper proposes a keypoint extraction algorithm that strikes a balance between accuracy and efficiency, aiming at accurate, robust and versatile visual localization in highly complex scenes.
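The repeatability the abstract refers to is commonly quantified as the fraction of keypoints detected in one image that reappear, within a pixel tolerance, at their predicted location in another view. A minimal sketch of that standard metric, assuming the two views are related by a known homography (function names and the tolerance are illustrative, not from the paper):

```python
import numpy as np

def warp_points(pts, H):
    """Apply a 3x3 homography to Nx2 pixel coordinates."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])  # to homogeneous coordinates
    warped = pts_h @ H.T
    return warped[:, :2] / warped[:, 2:3]             # back to pixel coordinates

def repeatability(kpts_a, kpts_b, H_ab, tol=3.0):
    """Fraction of keypoints in image A that have a keypoint in image B
    within `tol` pixels of their warped location."""
    warped = warp_points(kpts_a, H_ab)
    dists = np.linalg.norm(warped[:, None, :] - kpts_b[None, :, :], axis=2)
    return float(np.mean(dists.min(axis=1) <= tol))

# Toy check: identity homography and identical keypoints give repeatability 1.0
kpts = np.array([[10.0, 20.0], [30.0, 40.0], [50.0, 60.0]])
print(repeatability(kpts, kpts, np.eye(3)))  # 1.0
```

A detector whose keypoints survive illumination change and viewpoint variation scores close to 1.0 under this metric; the tracking failures described above correspond to low scores.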

Design/methodology/approach

SiLK-SLAM first refines the state-of-the-art learned extractor SiLK and introduces a postprocessing algorithm that homogenizes the spatial distribution of keypoints while improving runtime efficiency. It further devises a reliable relocalization strategy, PCPnP, based on progressive and consistent sampling, to bolster robustness.
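The abstract does not detail the postprocessing step, but keypoint homogenization is commonly implemented by bucketing candidates into a coarse grid and keeping only the highest-scoring detection per cell, which spreads keypoints evenly across the image instead of letting them cluster on high-texture regions. A hedged sketch of that generic technique (the cell size and all names here are assumptions, not taken from SiLK-SLAM):

```python
import numpy as np

def homogenize_keypoints(pts, scores, cell=32):
    """Keep the single highest-scoring keypoint in each cell x cell pixel bucket.

    pts    : (N, 2) array of (x, y) pixel coordinates
    scores : (N,) detection scores
    Returns sorted indices of the surviving keypoints.
    """
    cells = (pts // cell).astype(np.int64)  # grid cell of each keypoint
    keep = {}
    for i, key in enumerate(map(tuple, cells)):
        best = keep.get(key)
        if best is None or scores[i] > scores[best]:
            keep[key] = i                   # retain the top score per cell
    return sorted(keep.values())

# (5,5) and (10,10) share cell (0,0); only the higher-scoring one survives.
pts = np.array([[5, 5], [10, 10], [100, 100]])
scores = np.array([0.9, 0.5, 0.7])
print(homogenize_keypoints(pts, scores))  # [0, 2]
```

Besides evening out the spatial distribution, capping keypoints at one per cell also bounds the number of descriptors passed to matching, which is one plausible source of the efficiency gain the abstract mentions.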

Findings

Empirical evaluations on the TUM, KITTI and EuRoC data sets show that SiLK-SLAM achieves higher localization accuracy than ORB-SLAM3 and other methods, improving on ORB-SLAM3 by up to 70.99%, 87.20% and 85.27% on the three data sets, respectively. Relocalization experiments confirm that SiLK-SLAM produces precise and repeatable keypoints, demonstrating its robustness in challenging environments.
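Percentage gains of this kind conventionally denote the relative reduction in trajectory error: with a baseline error e_base and a new error e_new, the improvement is (e_base − e_new) / e_base × 100. A small illustration with made-up numbers (these are not the paper's measurements):

```python
def relative_improvement(e_base, e_new):
    """Percentage reduction of error e_new relative to baseline e_base."""
    return (e_base - e_new) / e_base * 100.0

# Hypothetical absolute trajectory errors (metres): a 70.99% improvement would
# correspond, for example, to a baseline of 1.0000 m reduced to 0.2901 m.
print(round(relative_improvement(1.0000, 0.2901), 2))  # 70.99
```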

Originality/value

SiLK-SLAM achieves high localization accuracy and resilience in challenging scenarios, which is important for improving the autonomy of robots navigating complex environments. Code is available at https://github.com/Pepper-FlavoredChewingGum/SiLK-SLAM.
