Mobile 3D scanning and mapping for freely rotating and vertically descended LiDAR

F. Arzberger, J. Zevering, A. Bredenbeck, D. Borrmann, A. Nüchter
{"title":"Mobile 3D scanning and mapping for freely rotating and vertically descended LiDAR","authors":"F. Arzberger, J. Zevering, A. Bredenbeck, D. Borrmann, A. Nüchter","doi":"10.1109/SSRR56537.2022.10018586","DOIUrl":null,"url":null,"abstract":"Situational awareness in search and rescue missions is key to successful operations, e.g., in collapsed buildings, underground mine shafts, construction sites, and underwater caves. LiDAR sensors in robotics play an increasingly important role in this context, as do robust and application-specific algorithms for simultaneous localization and mapping (SLAM). In many of these scenarios mapping requires the utilization of a vertically descended scanning system. This work presents a mobile system designed to solve this task, including a SLAM approach for descended LiDAR sensors with small field of view (FoV), which are in uncontrolled rotation. The SLAM approach is based on planar polygon matching and is not limited to the presented scenario. We test the system by lowering it from a crane inside a tall building at a fire-fighter school, applying our offline SLAM approach, and comparing the resulting point clouds of the environment with ground truth maps acquired by a terrestrial laser scanner (TLS). We also compare the SLAM approach to a state-of-the-art approach with respect to runtime and accuracy of the resulting maps. Our solution achieves comparable mapping accuracy at 0.2% of the runtime.","PeriodicalId":272862,"journal":{"name":"2022 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR)","volume":"7 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-11-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SSRR56537.2022.10018586","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

Situational awareness in search and rescue missions is key to successful operations, e.g., in collapsed buildings, underground mine shafts, construction sites, and underwater caves. LiDAR sensors in robotics play an increasingly important role in this context, as do robust and application-specific algorithms for simultaneous localization and mapping (SLAM). In many of these scenarios mapping requires the utilization of a vertically descended scanning system. This work presents a mobile system designed to solve this task, including a SLAM approach for descended LiDAR sensors with small field of view (FoV), which are in uncontrolled rotation. The SLAM approach is based on planar polygon matching and is not limited to the presented scenario. We test the system by lowering it from a crane inside a tall building at a fire-fighter school, applying our offline SLAM approach, and comparing the resulting point clouds of the environment with ground truth maps acquired by a terrestrial laser scanner (TLS). We also compare the SLAM approach to a state-of-the-art approach with respect to runtime and accuracy of the resulting maps. Our solution achieves comparable mapping accuracy at 0.2% of the runtime.
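The abstract describes a SLAM approach that registers consecutive scans by matching planar polygons. The paper's implementation is not reproduced here; the sketch below only illustrates the basic plane-based registration geometry such a pipeline relies on, under assumed conventions: each plane is reduced to a unit normal and a signed offset, matched normals give the rotation via a Kabsch/SVD fit, and the translation follows from the change in plane offsets. The function names, plane representation, and thresholds are illustrative assumptions, not the authors' code.

```python
import numpy as np

# Illustrative sketch only (not the authors' implementation): registers two
# scans from matched planes, each plane given as (unit normal n, offset d)
# with the plane equation n . x = d in its scan's frame.

def match_planes(planes_a, planes_b, angle_thresh_deg=10.0):
    """Greedily pair planes whose normals are nearly parallel."""
    cos_thresh = np.cos(np.radians(angle_thresh_deg))
    pairs, used = [], set()
    for i, (na, _) in enumerate(planes_a):
        best_j, best_cos = None, cos_thresh
        for j, (nb, _) in enumerate(planes_b):
            if j in used:
                continue
            c = float(np.dot(na, nb))
            if c > best_cos:
                best_j, best_cos = j, c
        if best_j is not None:
            used.add(best_j)
            pairs.append((i, best_j))
    return pairs

def estimate_pose(planes_a, planes_b, pairs):
    """Pose (R, t) with x_a = R @ x_b + t from >= 3 non-parallel plane pairs."""
    Na = np.array([planes_a[i][0] for i, _ in pairs])   # target normals
    Nb = np.array([planes_b[j][0] for _, j in pairs])   # source normals
    # Kabsch/SVD fit: rotation R that best maps the source normals onto the
    # target normals.
    H = Nb.T @ Na
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    # With n_a = R n_b, the plane offsets satisfy n_a . t = d_a - d_b,
    # a small linear system solved for the translation t.
    da = np.array([planes_a[i][1] for i, _ in pairs])
    db = np.array([planes_b[j][1] for _, j in pairs])
    t, *_ = np.linalg.lstsq(Na, da - db, rcond=None)
    return R, t
```

In the paper the planes come from polygons extracted from a small-FoV LiDAR in uncontrolled rotation, so the actual matching also has to reason about polygon overlap and sensor orientation; the sketch above only shows the underlying registration step.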