NeRFOrtho: Orthographic Projection Images Generation based on Neural Radiance Fields

IF 7.5 | CAS Tier 1 (Earth Sciences) | JCR Q1 Earth and Planetary Sciences | International Journal of Applied Earth Observation and Geoinformation | Pub Date: 2025-01-16 | DOI: 10.1016/j.jag.2025.104378
Dongdong Yue, Xinyi Liu, Yi Wan, Yongjun Zhang, Maoteng Zheng, Weiwei Fan, Jiachen Zhong
{"title":"nerforth:基于神经辐射场的正射影图像生成","authors":"Dongdong Yue, Xinyi Liu, Yi Wan, Yongjun Zhang, Maoteng Zheng, Weiwei Fan, Jiachen Zhong","doi":"10.1016/j.jag.2025.104378","DOIUrl":null,"url":null,"abstract":"The application value of orthographic projection images is substantial, especially in the field of remote sensing for True Digital Orthophoto Map (TDOM) generation. Existing methods for orthographic projection image generation primarily involve geometric correction or explicit projection of photogrammetric mesh models. However, the former suffers from projection differences and stitching lines, while the latter is plagued by poor model quality and high costs. This paper presents NeRFOrtho, a new method for generating orthographic projection images from neural radiance fields at arbitrary angles. By constructing Neural Radiance Fields from multi-view images with known viewpoints and positions, the projection method is altered to render orthographic projection images on a plane where projection rays are parallel to each other. In comparison to existing orthographic projection image generation methods, this approach produces orthographic projection images devoid of projection differences and distortions, while offering superior texture details and higher precision. We also show the applicative potential of the method when rendering TDOM and the texture of building façade.","PeriodicalId":50341,"journal":{"name":"International Journal of Applied Earth Observation and Geoinformation","volume":"7 1","pages":""},"PeriodicalIF":7.5000,"publicationDate":"2025-01-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"NeRFOrtho: Orthographic Projection Images Generation based on Neural Radiance Fields\",\"authors\":\"Dongdong Yue, Xinyi Liu, Yi Wan, Yongjun Zhang, Maoteng Zheng, Weiwei Fan, Jiachen Zhong\",\"doi\":\"10.1016/j.jag.2025.104378\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The application value of orthographic projection images is substantial, especially in the field of remote sensing for True Digital Orthophoto Map (TDOM) generation. Existing methods for orthographic projection image generation primarily involve geometric correction or explicit projection of photogrammetric mesh models. However, the former suffers from projection differences and stitching lines, while the latter is plagued by poor model quality and high costs. This paper presents NeRFOrtho, a new method for generating orthographic projection images from neural radiance fields at arbitrary angles. By constructing Neural Radiance Fields from multi-view images with known viewpoints and positions, the projection method is altered to render orthographic projection images on a plane where projection rays are parallel to each other. In comparison to existing orthographic projection image generation methods, this approach produces orthographic projection images devoid of projection differences and distortions, while offering superior texture details and higher precision. 
We also show the applicative potential of the method when rendering TDOM and the texture of building façade.\",\"PeriodicalId\":50341,\"journal\":{\"name\":\"International Journal of Applied Earth Observation and Geoinformation\",\"volume\":\"7 1\",\"pages\":\"\"},\"PeriodicalIF\":7.5000,\"publicationDate\":\"2025-01-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Journal of Applied Earth Observation and Geoinformation\",\"FirstCategoryId\":\"89\",\"ListUrlMain\":\"https://doi.org/10.1016/j.jag.2025.104378\",\"RegionNum\":1,\"RegionCategory\":\"地球科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"Earth and Planetary Sciences\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Applied Earth Observation and Geoinformation","FirstCategoryId":"89","ListUrlMain":"https://doi.org/10.1016/j.jag.2025.104378","RegionNum":1,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"Earth and Planetary Sciences","Score":null,"Total":0}
Citations: 0

Abstract

The application value of orthographic projection images is substantial, especially in the field of remote sensing for True Digital Orthophoto Map (TDOM) generation. Existing methods for orthographic projection image generation primarily involve geometric correction or explicit projection of photogrammetric mesh models. However, the former suffers from projection differences and stitching lines, while the latter is plagued by poor model quality and high costs. This paper presents NeRFOrtho, a new method for generating orthographic projection images from neural radiance fields at arbitrary angles. By constructing Neural Radiance Fields from multi-view images with known viewpoints and positions, the projection method is altered to render orthographic projection images on a plane where projection rays are parallel to each other. In comparison to existing orthographic projection image generation methods, this approach produces orthographic projection images devoid of projection differences and distortions, while offering superior texture details and higher precision. We also show the applicative potential of the method when rendering TDOM and the texture of building façades.
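The core idea is a change of ray geometry rather than a new scene representation: instead of casting rays through a single pinhole center of projection, the renderer casts a bundle of mutually parallel rays from a projection plane and volume-renders the trained radiance field along them. The NumPy sketch below illustrates this ray construction; it is not the authors' implementation, and the function name, the plane parameterization (plane_center, right, up, extent), and the nadir example are illustrative assumptions.

```python
# Minimal sketch of orthographic ray generation for a NeRF-style volume renderer.
# Illustrative assumption of how parallel-projection rays can be built,
# not the NeRFOrtho reference code.
import numpy as np

def orthographic_rays(plane_center, view_dir, right, up, width, height, extent):
    """Return per-pixel ray origins and directions for a parallel projection.

    plane_center : (3,) world-space center of the projection plane
    view_dir     : (3,) shared ray direction (e.g. nadir for a TDOM)
    right, up    : (3,) unit vectors spanning the plane
    width, height: output image size in pixels
    extent       : (2,) metric size of the plane along right and up
    """
    view_dir = view_dir / np.linalg.norm(view_dir)
    # Pixel centers mapped to metric offsets on the projection plane.
    u = ((np.arange(width) + 0.5) / width - 0.5) * extent[0]
    v = ((np.arange(height) + 0.5) / height - 0.5) * extent[1]
    uu, vv = np.meshgrid(u, v)
    # Unlike a pinhole camera, ray origins are spread across the plane ...
    origins = plane_center + uu[..., None] * right + vv[..., None] * up
    # ... while every ray shares the same direction, so all rays are parallel.
    directions = np.broadcast_to(view_dir, origins.shape).copy()
    return origins, directions

# Example: a nadir-looking plane 300 m above the scene covering a 1 km x 1 km
# footprint, roughly how one would set up a TDOM-like orthographic view.
origins, directions = orthographic_rays(
    plane_center=np.array([0.0, 0.0, 300.0]),
    view_dir=np.array([0.0, 0.0, -1.0]),
    right=np.array([1.0, 0.0, 0.0]),
    up=np.array([0.0, 1.0, 0.0]),
    width=1024, height=1024, extent=(1000.0, 1000.0),
)
```

Sampling points along these rays and compositing the field's predicted density and color then proceeds as in standard NeRF rendering; only the ray generation changes.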
Source journal: International Journal of Applied Earth Observation and Geoinformation
CiteScore: 10.20
Self-citation rate: 8.00%
Articles published: 49
Review time: 7.2 months
Journal introduction: The International Journal of Applied Earth Observation and Geoinformation publishes original papers that utilize earth observation data for natural resource and environmental inventory and management. These data primarily originate from remote sensing platforms, including satellites and aircraft, supplemented by surface and subsurface measurements. Addressing natural resources such as forests, agricultural land, soils, and water, as well as environmental concerns like biodiversity, land degradation, and hazards, the journal explores conceptual and data-driven approaches. It covers geoinformation themes like capturing, databasing, visualization, interpretation, data quality, and spatial uncertainty.
Latest articles in this journal
Change detection of slow-moving landslide with multi-source SBAS-InSAR and Light-U2Net
CUG-STCN: A seabed topography classification framework based on knowledge graph-guided vision mamba network
A tree crown edge-aware clipping algorithm for airborne LiDAR point clouds
Reduced sediment load and vegetation restoration leading to clearer water color in the Yellow River: Evidence from 38 years of Landsat observations
Empirical methods to determine surface air temperature from satellite-retrieved data