Title: A General Albedo Recovery Approach for Aerial Photogrammetric Images through Inverse Rendering
Authors: Shuang Song, Rongjun Qin
Journal: arXiv - CS - Graphics
Publication date: 2024-09-04
DOI: arxiv-2409.03032 (https://doi.org/arxiv-2409.03032)
Citations: 0
Abstract
Modeling outdoor scenes for a synthetic 3D environment requires recovering
reflectance/albedo information from raw images, an ill-posed problem due to
the complicated, unmodeled physics involved (e.g., indirect lighting, volume
scattering, specular reflection). The problem remains unsolved in practical
contexts. The recovered albedo can facilitate model relighting and shading,
which further enhances the realism of rendered models and their use in
digital twins. Typically, photogrammetric 3D models simply take the source
images as texture materials, which inherently embeds unwanted lighting
artifacts (present at the time of capture) into the texture. These polluted
textures are therefore suboptimal for realistic rendering in a synthetic
environment. In addition, the embedded environmental lighting challenges
photo-consistency across different images, causing image-matching
uncertainties. This paper presents a general image formation model for albedo
recovery from typical aerial photogrammetric images under natural
illumination and derives the inverse model to resolve the albedo through
inverse rendering and intrinsic image decomposition. Our approach builds on
the fact that both the sun illumination and the scene geometry are estimable
in aerial photogrammetry and can therefore provide direct inputs to this
ill-posed problem. This physics-based approach requires no input beyond the
data acquired in a typical drone-based photogrammetric collection and was
shown to favorably outperform existing approaches. We also demonstrate that
the recovered albedo images can in turn improve typical image processing
tasks in photogrammetry, such as feature and dense matching, and edge and
line extraction.
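The core idea of inverting a known image formation model can be illustrated with a minimal sketch. The snippet below is NOT the paper's actual model; it assumes a simplified Lambertian formation under a single sun direction plus a uniform sky term, with per-pixel surface normals and a cast-shadow mask taken as given (in aerial photogrammetry these would come from the reconstructed geometry and estimated sun position). All function and parameter names here are hypothetical, for illustration only.

```python
import numpy as np

def recover_albedo(image, normals, sun_dir, sun_irradiance,
                   sky_irradiance, shadow_mask, eps=1e-6):
    """Invert a toy Lambertian image formation model (illustrative only):

        I = albedo * (E_sun * max(n . l, 0) * visibility + E_sky)

    image        : (N,) observed intensities (single channel, linear)
    normals      : (N, 3) unit surface normals from the reconstructed mesh
    sun_dir      : (3,) unit vector toward the sun (from sun-position estimate)
    shadow_mask  : (N,) 1.0 where the sun is visible, 0.0 in cast shadow
    """
    # Per-pixel cosine of the sun incidence angle, clamped at the horizon.
    cos_theta = np.clip(normals @ sun_dir, 0.0, None)
    # Total shading = direct sun term (gated by shadow visibility) + ambient sky.
    shading = sun_irradiance * cos_theta * shadow_mask + sky_irradiance
    # Albedo is the observed intensity divided by the shading; eps guards
    # against division by zero in fully dark regions.
    return image / np.maximum(shading, eps)
```

In this simplified setting, a shadowed pixel is still recoverable through the sky term alone, which is one reason the ambient component matters: without it, shaded regions would be degenerate for the inversion.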