Robust Multisensor Fusion for Reliable Mapping and Navigation in Degraded Visual Conditions
Moritz Torchalla, Marius Schnaubelt, Kevin Daun, O. Stryk
2021 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), 25 October 2021
DOI: 10.1109/SSRR53300.2021.9597866
We address the problem of robust simultaneous mapping and localization in degraded visual conditions using low-cost off-the-shelf radars. Current methods often rely on high-end radar sensors or are tightly coupled to specific sensors, which limits their applicability to new robots. In contrast, we present a sensor-agnostic processing pipeline built on a novel forward sensor model for accurate updates of signed-distance-function-based maps, combined with robust optimization techniques that yield accurate pose estimates. Our evaluation demonstrates accurate mapping and pose estimation in indoor environments under poor visual conditions, and higher accuracy than existing methods on publicly available benchmark data.
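The abstract mentions updating signed-distance-function-based maps from range sensors, but this record contains no implementation details. As a rough illustration only, the following sketch shows the standard truncated-SDF ray-integration scheme that such pipelines typically build on: for each range measurement, cells along the sensor ray are updated with a weighted running average of the signed distance to the measured surface. The function name, grid layout, and parameters are all hypothetical, not taken from the paper.

```python
import numpy as np

def update_tsdf(tsdf, weights, origin, hit, resolution=0.05, trunc=0.3):
    """Integrate one range measurement into a 2-D truncated SDF grid.

    tsdf, weights : 2-D arrays (signed distance in metres / update counts)
    origin, hit   : sensor position and measured endpoint in world coordinates
    """
    origin = np.asarray(origin, dtype=float)
    hit = np.asarray(hit, dtype=float)
    direction = hit - origin
    dist = np.linalg.norm(direction)
    if dist == 0.0:
        return
    direction /= dist
    # Sample cells along the ray, extending trunc metres past the surface
    # so that cells just behind the hit receive negative distances.
    for s in np.arange(0.0, dist + trunc, resolution):
        p = origin + s * direction
        i = int(round(p[0] / resolution))
        j = int(round(p[1] / resolution))
        if not (0 <= i < tsdf.shape[0] and 0 <= j < tsdf.shape[1]):
            continue
        # Signed distance to the measured surface, clipped to +/- trunc.
        sdf = np.clip(dist - s, -trunc, trunc)
        # Weighted running average, as in standard TSDF fusion.
        w = weights[i, j]
        tsdf[i, j] = (tsdf[i, j] * w + sdf) / (w + 1.0)
        weights[i, j] = w + 1.0
```

In a real sensor-agnostic pipeline, the per-cell weight would additionally depend on a forward sensor model (e.g. down-weighting noisy radar returns), which is the part this sketch deliberately leaves out.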