Day and Night Collaborative Dynamic Mapping in Unstructured Environment Based on Multimodal Sensors
Yufeng Yue, Chule Yang, Jun Zhang, Mingxing Wen, Zhenyu Wu, Haoyuan Zhang, Danwei W. Wang
2020 IEEE International Conference on Robotics and Automation (ICRA), pp. 2981-2987, May 2020
DOI: 10.1109/ICRA40945.2020.9197072
Citations: 17
Abstract
Enabling long-term operation during day and night for collaborative robots requires a comprehensive understanding of the unstructured environment. Moreover, in dynamic environments, robots must be able to recognize dynamic objects and collaboratively build a global map. This paper proposes a novel approach for dynamic collaborative mapping based on multimodal environmental perception. For each mission, the robots first apply a heterogeneous sensor fusion model to detect humans and separate them from the scene, yielding static observations. Collaborative mapping is then performed to estimate the relative position between robots, and the local 3D maps are integrated into a globally consistent 3D map. Experiments are conducted in a rainforest, both day and night, with people moving through the scene. The results demonstrate the accuracy, robustness, and versatility of the approach in 3D map fusion missions.
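To make the two-stage pipeline concrete, below is a minimal NumPy sketch of the idea the abstract describes: remove detected humans from each robot's scan to obtain static observations, then transform one robot's static map into the other's frame and merge them. All names, the axis-aligned box format, and the transform convention are illustrative assumptions, not the authors' actual implementation.

```python
# Sketch of dynamic-object removal followed by map fusion.
# Assumptions (not from the paper): human detections arrive as
# axis-aligned 3D boxes, and the inter-robot transform T_ab is
# already estimated (e.g., by scan matching between local maps).
import numpy as np

def filter_dynamic_points(points, human_boxes):
    """Drop points falling inside any detected-human bounding box.

    points:      (N, 3) array of 3D points from one robot's scan.
    human_boxes: list of (min_xyz, max_xyz) axis-aligned boxes.
    Returns the static subset of the scan.
    """
    keep = np.ones(len(points), dtype=bool)
    for lo, hi in human_boxes:
        inside = np.all((points >= lo) & (points <= hi), axis=1)
        keep &= ~inside
    return points[keep]

def fuse_local_maps(map_a, map_b, T_ab):
    """Merge robot B's static map into robot A's frame.

    T_ab: 4x4 homogeneous transform from B's frame to A's frame.
    """
    homog = np.hstack([map_b, np.ones((len(map_b), 1))])
    map_b_in_a = (T_ab @ homog.T).T[:, :3]
    return np.vstack([map_a, map_b_in_a])

# Toy usage: one person standing near the origin of robot A's scan.
scan = np.random.uniform(-5, 5, size=(1000, 3))
boxes = [(np.array([-0.5, -0.5, 0.0]), np.array([0.5, 0.5, 1.8]))]
static_scan = filter_dynamic_points(scan, boxes)

T = np.eye(4)
T[:3, 3] = [10.0, 0.0, 0.0]   # robot B assumed 10 m away along x
global_map = fuse_local_maps(static_scan, static_scan.copy(), T)
```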