Tracking and co-location of global point clouds for large-area indoor environments
Nick Michiels, Lode Jorissen, Jeroen Put, Jori Liesenborgs, Isjtar Vandebroeck, Eric Joris, Frank Van Reeth
Virtual Reality, volume 18 (2024). Published 2024-05-04. DOI: 10.1007/s10055-024-01004-0
Abstract
Extended reality (XR) experiences are on the verge of widespread adoption across diverse application domains. An essential part of the technology is accurate tracking and localization of the headset to create an immersive experience. A subset of applications requires perfect co-location between the real and the virtual world, where virtual objects are aligned with their real-world counterparts. Current headsets support co-location for small areas, but suffer from drift when scaling up to larger ones such as buildings or factories. This paper proposes tools and solutions for this challenge by splitting simultaneous localization and mapping (SLAM) into separate mapping and localization stages. In the pre-processing stage, a feature map is built for the entire tracking area, and a global optimizer corrects the deformations caused by drift, guided by a sparse set of ground-truth markers in the point cloud of a laser scan. Optionally, the map is refined further by matching features between the ground-truth keyframe images and the corresponding renderings of the SLAM point cloud estimate. In the second, real-time stage, the rectified feature map is used to perform localization and sensor fusion between the global tracking and the headset. The results show that the approach achieves robust co-location between the virtual and the real 3D environment for large and complex tracking environments.
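To make the mapping stage concrete, the sketch below illustrates the simplest building block of rectifying a drifted SLAM map against ground-truth markers: a rigid (Kabsch) alignment from sparse marker correspondences. This is a minimal illustration, not the authors' implementation; the paper's global optimizer corrects non-rigid drift deformations, which a single rigid transform cannot. All names and the toy data are assumptions.

import numpy as np

def rigid_align(slam_markers: np.ndarray, gt_markers: np.ndarray):
    """Estimate rotation R and translation t mapping slam_markers onto
    gt_markers (both (N, 3) arrays of corresponding marker positions)."""
    mu_s = slam_markers.mean(axis=0)
    mu_g = gt_markers.mean(axis=0)
    # Cross-covariance of the centered correspondences.
    H = (slam_markers - mu_s).T @ (gt_markers - mu_g)
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force a proper rotation (det(R) = +1).
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_g - R @ mu_s
    return R, t

# Usage (hypothetical marker positions): rectify the whole SLAM feature map.
slam_markers = np.array([[0.0, 0.0, 0.0], [4.1, 0.1, 0.0], [4.0, 3.9, 0.2]])
gt_markers   = np.array([[0.0, 0.0, 0.0], [4.0, 0.0, 0.0], [4.0, 4.0, 0.0]])
R, t = rigid_align(slam_markers, gt_markers)
feature_map = np.random.rand(1000, 3)        # stand-in for the SLAM point cloud
rectified_map = feature_map @ R.T + t        # apply x' = R x + t row-wise

In practice, a large-area map would be divided into segments or keyframe poses and optimized jointly (e.g., pose-graph style), with the marker constraints anchoring the graph; the rigid alignment above is only the per-segment core of such a scheme.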
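The real-time stage fuses the low-latency headset tracking with slower global localization in the rectified map. A common co-location pattern, sketched below under assumed names (this is not the paper's fusion algorithm), is to maintain the offset transform between the headset's local frame and the map frame, updating it whenever a global fix arrives and applying it to every local pose in between.

import numpy as np

def inv_rigid(T: np.ndarray) -> np.ndarray:
    """Invert a 4x4 rigid transform without a general matrix inverse."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

class CoLocationFilter:
    def __init__(self) -> None:
        self.T_map_from_local = np.eye(4)  # current map<-local offset

    def on_global_fix(self, T_map_headset: np.ndarray,
                      T_local_headset: np.ndarray, alpha: float = 0.1) -> None:
        """Called at the (slow) rate of the global localizer. Blends the newly
        measured offset into the running estimate; a production system would
        slerp the rotation instead of this simple translation blend."""
        measured = T_map_headset @ inv_rigid(T_local_headset)
        self.T_map_from_local[:3, 3] = (
            (1.0 - alpha) * self.T_map_from_local[:3, 3]
            + alpha * measured[:3, 3])
        self.T_map_from_local[:3, :3] = measured[:3, :3]  # rotation taken as-is

    def map_pose(self, T_local_headset: np.ndarray) -> np.ndarray:
        """Called every frame: lift the low-latency local pose into the map
        frame so virtual content stays co-located with the real world."""
        return self.T_map_from_local @ T_local_headset

Blending the offset rather than snapping to each global fix keeps virtual objects from visibly jumping when a correction arrives, at the cost of converging to the rectified pose over a few updates.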
About the journal
The journal, established in 1995, publishes original research in Virtual Reality, Augmented and Mixed Reality that shapes and informs the community. The multidisciplinary nature of the field means that submissions are welcomed on a wide range of topics including, but not limited to:
Original research studies of Virtual Reality, Augmented Reality, Mixed Reality and real-time visualization applications
Development and evaluation of systems, tools, techniques and software that advance the field, including:
Display technologies, including Head Mounted Displays, simulators and immersive displays
Haptic technologies, including novel devices, interaction and rendering
Interaction management, including gesture control, eye gaze, biosensors and wearables
Tracking technologies
VR/AR/MR in medicine, including training, surgical simulation, rehabilitation, and tissue/organ modelling.
Impactful and original applications and studies of VR/AR/MR’s utility in areas such as manufacturing, business, telecommunications, arts, education, design, entertainment and defence
Research demonstrating new techniques and approaches to designing, building and evaluating virtual and augmented reality systems
Original research studies assessing the social, ethical, data or legal aspects of VR/AR/MR.