3D location estimation of light sources in room-scale scenes
Lucas Pometti, Matthieu Fradet, P. Hirtzlin, Pierrick Jouet
2021 International Conference on 3D Immersion (IC3D), 2021-12-08. DOI: 10.1109/IC3D53758.2021.9687218
In this paper we present our ongoing work on light source estimation in room-scale scenes for more photorealistic experiences. Our sole input is an up-to-date textured 3D mesh of a real, uncontrolled environment captured with a consumer mobile device. Our approach is based on the detection of real shadows in a single RGB-D image rendered from a top viewpoint. Unlike prior work, it relies neither on object-based segmentation nor on simplifying assumptions about the scene geometry or poorly textured surfaces. The 3D locations of light sources are estimated automatically; for now, the lighting model is completed with intensity values obtained interactively through a GUI that displays augmentations on the scanned scene. This lighting model can then be reused to light the MR scene coherently during mobile experiences. Results on various indoor and outdoor scenes show the beginnings of promising work. To illustrate the complexity of the problem and to make the community aware of how much correct lighting matters for user perception, we also show candidly how slightly inaccurate light estimation results or incomplete geometry knowledge can go completely unnoticed in some simple cases, yet deeply impact the photorealism of the final rendering in others.
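The abstract does not spell out how shadow detections are turned into 3D light positions, but the underlying geometry admits a simple formulation: each detected shadow point, paired with the 3D point of the geometry casting it (both back-projectable from the top-view RGB-D rendering via its depth channel), defines a ray on which the light must lie, and several such rays can be intersected in a least-squares sense. The sketch below illustrates that geometric idea only; it is not the authors' implementation, and the function and variable names (estimate_point_light, shadow_pts, caster_pts) are hypothetical.

    import numpy as np

    def estimate_point_light(shadow_pts, caster_pts):
        """Least-squares intersection of shadow->caster rays.

        Each shadow point paired with the 3D point of the occluding
        geometry defines a ray that must pass through the light source.
        With N >= 2 non-parallel rays, the light position is the point
        minimizing the sum of squared distances to all rays.
        """
        S = np.asarray(shadow_pts, dtype=float)   # (N, 3) shadow points
        C = np.asarray(caster_pts, dtype=float)   # (N, 3) caster points
        D = C - S
        D /= np.linalg.norm(D, axis=1, keepdims=True)  # unit ray directions

        # Normal equations: sum_i (I - d_i d_i^T) x = sum_i (I - d_i d_i^T) s_i
        A = np.zeros((3, 3))
        b = np.zeros(3)
        for s, d in zip(S, D):
            P = np.eye(3) - np.outer(d, d)  # projector orthogonal to ray i
            A += P
            b += P @ s
        return np.linalg.solve(A, b)

    if __name__ == "__main__":
        # Synthetic check: a point light at (1, 2, 3) casting shadows of two
        # caster points onto the ground plane z = 0.
        casters = [(0.5, 1.0, 1.5), (2.0, 2.0, 1.0)]
        shadows = [(0.0, 0.0, 0.0), (2.5, 2.0, 0.0)]
        print(estimate_point_light(shadows, casters))  # ~ [1. 2. 3.]

A single top viewpoint is convenient for this kind of reasoning because shadows on the floor and the objects casting them are both visible in one rendering, so each pixel's 3D position follows directly from the depth buffer; how the paper actually matches shadows to casters is not described in the abstract.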