Delta Path Tracing for Real-Time Global Illumination in Mixed Reality
Yang Xu, Yu Jiang, Shibo Wang, Kang Li, Guohua Geng
2023 IEEE Conference Virtual Reality and 3D User Interfaces (VR), March 2023
DOI: 10.1109/VR55154.2023.00020
Abstract
Visual coherence between real and virtual objects is important in mixed reality (MR), and illumination consistency is one of the key aspects of achieving this coherence. Apart from matching the illumination of the virtual objects to that of the real environment, the change in illumination that inserted virtual objects produce on the real scene should also be considered, but it is difficult to compute in real time due to the heavy computational demands of global illumination. In this work, we propose delta path tracing (DPT), which computes only the radiance blocked by the virtual objects from the light sources at the primary hit points of Monte Carlo path tracing, and then combines the blocked radiance and multi-bounce indirect illumination with the image of the real scene. Multiple importance sampling (MIS) between the BRDF and the environment map is performed to handle all-frequency environment maps captured by a panoramic camera. Compared to conventional differential rendering methods, our method markedly reduces the number of environment-map accesses and avoids rendering the scene twice, which significantly improves performance. We implement our method using hardware-accelerated ray tracing on modern GPUs, and the results demonstrate that it renders global illumination at real-time frame rates and produces plausible visual coherence between real and virtual objects in MR environments.
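To make the pipeline concrete, the sketch below is a minimal single-pixel illustration of the idea described in the abstract, not the authors' implementation: at a primary hit point on the real scene, a Monte Carlo estimate of the direct radiance that the virtual object blocks from the environment map is computed and composited with the captured real image. The toy environment map, the single virtual sphere, the uniform hemisphere sampling (standing in for the paper's MIS between BRDF and environment-map sampling), the placeholder indirect term, and the compositing equation L_out = L_real - L_blocked + L_virtual_indirect are all illustrative assumptions inferred from the abstract.

```python
# Minimal sketch of the delta-path-tracing idea (illustrative assumptions,
# not the paper's implementation): estimate only the direct radiance that an
# inserted virtual object blocks at a real-scene point, then composite
#   L_out = L_real - L_blocked + L_virtual_indirect
import numpy as np

rng = np.random.default_rng(0)

def sample_hemisphere(n=1):
    """Uniformly sample directions on the upper hemisphere (a stand-in for
    importance-sampling a captured panoramic environment map)."""
    u1, u2 = rng.random(n), rng.random(n)
    z = u1
    r = np.sqrt(np.maximum(0.0, 1.0 - z * z))
    phi = 2.0 * np.pi * u2
    return np.stack([r * np.cos(phi), r * np.sin(phi), z], axis=-1)

def env_radiance(d):
    """Toy all-frequency environment map: a sharp 'sun' lobe plus dim sky."""
    sun = np.array([0.3, 0.3, 0.9]); sun /= np.linalg.norm(sun)
    lobe = np.maximum(d @ sun, 0.0) ** 200
    return 0.2 + 50.0 * lobe

def occluded_by_virtual(p, d, center, radius):
    """Shadow ray tested against the *virtual* geometry only (a sphere)."""
    oc = p - center
    b = oc @ d
    c = oc @ oc - radius * radius
    disc = b * b - c
    if disc < 0.0:
        return False
    t = -b - np.sqrt(disc)
    return t > 1e-4

def blocked_direct_radiance(p, normal, albedo, center, radius, spp=256):
    """Monte Carlo estimate of the direct illumination that the virtual
    sphere blocks at real-scene point p (Lambertian BRDF)."""
    dirs = sample_hemisphere(spp)
    cos_t = np.maximum(dirs @ normal, 0.0)
    pdf = 1.0 / (2.0 * np.pi)                      # uniform-hemisphere pdf
    blocked = np.array([occluded_by_virtual(p, d, center, radius) for d in dirs])
    f = albedo / np.pi                             # Lambertian BRDF
    contrib = f * env_radiance(dirs) * cos_t / pdf
    return np.mean(np.where(blocked, contrib, 0.0))

# One shading point on a real ground plane, with a virtual sphere above it.
p = np.array([0.0, 0.0, 0.0])
normal = np.array([0.0, 0.0, 1.0])
sphere_center, sphere_radius = np.array([0.1, 0.1, 0.5]), 0.4

L_real = 3.0                 # radiance of this pixel in the captured real image
L_blocked = blocked_direct_radiance(p, normal, albedo=0.7,
                                    center=sphere_center, radius=sphere_radius)
L_virtual_indirect = 0.05    # placeholder for multi-bounce light from the sphere

L_out = L_real - L_blocked + L_virtual_indirect
print(f"blocked direct radiance: {L_blocked:.3f}, composited pixel: {L_out:.3f}")
```

Because only the blocked portion of the direct lighting is estimated and the real image is reused as-is, the environment map is queried only for occluded sample directions and the real scene never has to be path-traced twice, which is the performance argument the abstract makes against conventional differential rendering.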