Zimu Yi, Ke Xie, Jiahui Lyu, Minglun Gong, Hui Huang
DOI: 10.1109/VR55154.2023.00051
2023 IEEE Conference Virtual Reality and 3D User Interfaces (VR), March 2023
Where to Render: Studying Renderability for IBR of Large-Scale Scenes
Image-based rendering (IBR) enables real scenes to be presented interactively to viewers and is therefore a key component for implementing VR telepresence. The quality of IBR results depends on the set of pre-captured views, the rendering algorithm used, and the camera parameters of the novel view to be synthesized. Numerous methods have been proposed for optimizing the set of captured images and enhancing the rendering algorithms. However, the question of from which regions IBR methods can synthesize satisfactory results has not yet been well studied. In this work, we introduce the concept of renderability, which predicts the quality of IBR results at any given viewpoint and view direction. Consequently, the renderability values evaluated over the 5D camera parameter space form a field, which effectively guides viewpoint/trajectory selection for IBR, especially for challenging large-scale 3D scenes. To demonstrate this capability, we designed two VR applications: a path planner that allows users to navigate through sparsely captured scenes with controllable rendering quality, and a view selector that provides an overview of a scene from diverse, high-quality perspectives. We believe the renderability concept, the proposed evaluation method, and the suggested applications will motivate and facilitate the use of IBR in various interactive settings.
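To make the "5D camera parameter space" concrete: a camera pose here consists of a 3D viewpoint plus a 2D view direction (e.g. yaw and pitch), and evaluating renderability at each such pose yields a scalar field over that 5D space. The sketch below is only illustrative and does not implement the paper's actual renderability metric (which is not specified in this abstract); `renderability` is a hypothetical placeholder score, and the grid/selection helpers simply show how such a field could drive a view selector.

```python
# Hedged sketch: discretize the 5D camera parameter space (x, y, z, yaw, pitch),
# evaluate a placeholder renderability score at each pose, and greedily pick
# the highest-scoring views. The real metric from the paper is NOT reproduced
# here; renderability() below is an illustrative assumption only.
import itertools
import math

def renderability(x, y, z, yaw, pitch):
    """Placeholder stand-in for the paper's renderability evaluation.
    Arbitrarily favors mid-altitude viewpoints looking slightly downward."""
    return math.exp(-((z - 30.0) / 20.0) ** 2) * math.cos(pitch + 0.3)

def build_field(xs, ys, zs, yaws, pitches):
    """Evaluate the score over a discretized 5D camera-parameter grid,
    returning a dict mapping each (x, y, z, yaw, pitch) pose to its value."""
    return {pose: renderability(*pose)
            for pose in itertools.product(xs, ys, zs, yaws, pitches)}

def best_views(field, k=3):
    """View-selector idea: return the k poses with the highest renderability."""
    return sorted(field, key=field.get, reverse=True)[:k]

# Coarse grid over a hypothetical scene volume.
field = build_field(xs=[0.0, 50.0], ys=[0.0, 50.0], zs=[10.0, 30.0, 60.0],
                    yaws=[0.0, math.pi], pitches=[-0.3, 0.0])
top = best_views(field, k=3)
```

A path planner in the same spirit could then search this field for a trajectory whose per-pose renderability never drops below a user-chosen threshold, rather than just taking the top-k poses.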