F. Arzberger, J. Zevering, A. Bredenbeck, D. Borrmann, A. Nüchter
Title: Mobile 3D scanning and mapping for freely rotating and vertically descended LiDAR
DOI: 10.1109/SSRR56537.2022.10018586
Venue: 2022 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR)
Published: 2022-11-08
Citations: 1
Abstract
Situational awareness in search and rescue missions is key to successful operations, e.g., in collapsed buildings, underground mine shafts, construction sites, and underwater caves. LiDAR sensors in robotics play an increasingly important role in this context, as do robust and application-specific algorithms for simultaneous localization and mapping (SLAM). In many of these scenarios, mapping requires the use of a vertically descended scanning system. This work presents a mobile system designed to solve this task, including a SLAM approach for descended LiDAR sensors with a small field of view (FoV) that undergo uncontrolled rotation. The SLAM approach is based on planar polygon matching and is not limited to the presented scenario. We test the system by lowering it from a crane inside a tall building at a firefighter school, applying our offline SLAM approach, and comparing the resulting point clouds of the environment with ground-truth maps acquired by a terrestrial laser scanner (TLS). We also compare the SLAM approach to a state-of-the-art approach with respect to runtime and accuracy of the resulting maps. Our solution achieves comparable mapping accuracy at 0.2% of the runtime.
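The paper itself does not spell out its planar-polygon-matching pipeline in this abstract, so the following is only a generic sketch of the underlying idea of plane-based registration: given corresponding planes (in Hessian normal form, n·x = d) extracted from two scans, a rigid transform can be recovered by aligning the paired normals with the Kabsch/SVD method and then solving for the translation from the plane offsets. The function name `align_planes` and the whole formulation are illustrative assumptions, not the authors' actual algorithm.

```python
import numpy as np

def align_planes(normals_src, d_src, normals_dst, d_dst):
    """Estimate a rigid transform (R, t) aligning corresponding planes.

    Each plane is n·x = d with unit normal n. For a transform x' = R x + t,
    normals map as n' = R n and offsets as d' = d + n'·t, so:
      1. R is found from the paired normals via Kabsch (SVD of the
         cross-covariance, with a determinant fix to exclude reflections).
      2. t is the least-squares solution of n_dst·t = d_dst - d_src.
    Requires at least three planes with non-coplanar normals.
    """
    A = np.asarray(normals_src, dtype=float)  # source normals, one per row
    B = np.asarray(normals_dst, dtype=float)  # target normals, one per row
    H = A.T @ B                               # 3x3 cross-covariance of normals
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                        # rotation with det(R) = +1
    # Offsets shift by the projection of t onto each (rotated) normal.
    t, *_ = np.linalg.lstsq(B, np.asarray(d_dst) - np.asarray(d_src), rcond=None)
    return R, t
```

Matching planes instead of raw points is what makes such approaches fast: a scan is reduced to a handful of plane parameters before registration, which is consistent with the large runtime advantage the abstract reports, though the actual method in the paper may differ substantially from this sketch.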