A photogrammetric approach for real‐time visual SLAM applied to an omnidirectional system

Thaisa Aline Correia Garcia, Antonio Maria Garcia Tommaselli, Letícia Ferrari Castanheiro, Mariana Batista Campos
The Photogrammetric Record · DOI: 10.1111/phor.12494 · Published 2024-05-01

Abstract

The problem of sequentially estimating the exterior orientation of imaging sensors while reconstructing the three‐dimensional environment in real time is commonly known as visual simultaneous localisation and mapping (vSLAM). Omnidirectional optical sensors have been increasingly used in vSLAM solutions, mainly because they provide a wider view of the scene, allowing more features to be extracted. However, dealing with unmodelled points in the hyperhemispherical field poses challenges, mainly due to the complex lens geometry entailed in the image formation process. These challenges can be addressed by rigorous photogrammetric models that appropriately handle the geometry of fisheye lens cameras. Thus, this study presents a real‐time vSLAM approach for omnidirectional systems, adapting ORB‐SLAM with a rigorous projection model (equisolid‐angle). The implementation was conducted on the Nvidia Jetson TX2 board, and the approach was evaluated using hyperhemispherical images captured by a dual‐fisheye camera (Ricoh Theta S) embedded in a mobile backpack platform. The trajectory covered a distance of 140 m, with the approach demonstrating accuracy better than 0.12 m at the beginning and achieving metre‐level accuracy at the end of the trajectory. Additionally, we compared the performance of our proposed approach with a generic model for fisheye lens cameras.
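The equisolid‐angle projection mentioned in the abstract maps the incidence angle θ (between the incoming ray and the optical axis) to a radial image distance r = 2f·sin(θ/2). A minimal sketch of this forward projection is given below; the function name, parameters, and pixel‐coordinate convention are illustrative and not taken from the paper's implementation.

```python
import numpy as np

def equisolid_project(point_cam, f, cx, cy):
    """Project a 3D point in the camera frame with the equisolid-angle
    fisheye model: r = 2 * f * sin(theta / 2), where theta is the angle
    between the ray and the optical axis (+Z)."""
    X, Y, Z = point_cam
    theta = np.arctan2(np.hypot(X, Y), Z)   # incidence angle
    r = 2.0 * f * np.sin(theta / 2.0)       # radial distance on the image plane
    phi = np.arctan2(Y, X)                  # azimuth around the optical axis
    return cx + r * np.cos(phi), cy + r * np.sin(phi)
```

Note that θ may exceed 90°, which is what lets this model handle the hyperhemispherical field of view of dual‐fisheye cameras such as the Ricoh Theta S.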