Generative networks for spatio-temporal gap filling of Sentinel-2 reflectances

ISPRS Journal of Photogrammetry and Remote Sensing | IF 12.2 | Q1, Physical Geography (CAS Region 1, Earth Science) | Published: 2025-02-01 | Vol. 220, pp. 637-648 | DOI: 10.1016/j.isprsjprs.2025.01.016
Maria Gonzalez-Calabuig, Miguel-Ángel Fernández-Torres, Gustau Camps-Valls
Citations: 0

Abstract

Earth observation from satellite sensors offers the possibility to monitor natural ecosystems by deriving spatially explicit and temporally resolved biogeophysical parameters. Optical remote sensing, however, suffers from missing data, mainly due to the presence of clouds, sensor malfunctioning, and atmospheric conditions. This study proposes a novel deep learning architecture to address gap filling of satellite reflectances, more precisely the visible and near-infrared bands, and illustrates its performance on high-resolution Sentinel-2 data. We introduce GANFilling, a generative adversarial network capable of sequence-to-sequence translation, which comprises convolutional long short-term memory layers to effectively exploit complete dependencies in space-time series data. We focus on Europe and evaluate the method's performance quantitatively (through distortion and perceptual metrics) and qualitatively (via visual inspection and visual quality metrics). Quantitatively, our model offers the best trade-off between denoising corrupted data and preserving noise-free information, underscoring the importance of considering multiple metrics jointly when assessing gap-filling tasks. Qualitatively, it successfully deals with various noise sources, such as clouds and missing data, constituting a robust solution across multiple scenarios and settings. We also illustrate and quantify the quality of the generated product in the relevant downstream application of vegetation greenness forecasting, where using GANFilling enhances forecasting in approximately 70% of the considered regions in Europe. This research contributes to underlining the utility of deep learning for Earth observation data, which allows for improved spatially and temporally resolved monitoring of the Earth's surface.
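To make the gap-filling task concrete: given a stack of reflectance images over time with cloud-masked pixels marked as missing, the goal is to reconstruct plausible values at those locations. The sketch below is NOT the paper's GANFilling architecture; it is a minimal, hypothetical per-pixel linear-interpolation baseline along the time axis, of the kind such a generative model would be compared against.

```python
import numpy as np

def temporal_gap_fill(series: np.ndarray) -> np.ndarray:
    """Fill NaN gaps in a (T, H, W) reflectance stack by per-pixel linear
    interpolation along the time axis. A naive baseline, not GANFilling:
    it ignores spatial context entirely, which is exactly what the
    ConvLSTM-based generator in the paper is designed to exploit."""
    T, H, W = series.shape
    t = np.arange(T)
    filled = series.copy()
    flat = filled.reshape(T, -1)  # view: (T, H*W), one column per pixel
    for p in range(flat.shape[1]):
        col = flat[:, p]
        mask = np.isnan(col)
        if mask.any() and not mask.all():
            # np.interp clamps to the nearest valid value at the edges
            col[mask] = np.interp(t[mask], t[~mask], col[~mask])
    return filled

# Toy example: 5 time steps of a 2x2 scene with a steadily brightening signal;
# the acquisition at t=2 is fully clouded (all NaN).
stack = np.tile(np.linspace(0.1, 0.5, 5)[:, None, None], (1, 2, 2))
stack[2] = np.nan
out = temporal_gap_fill(stack)
```

After filling, the clouded step at t=2 is recovered as the midpoint of its temporal neighbors (0.3 here). A learned model such as GANFilling replaces this hand-crafted rule with spatio-temporal features, so it can also handle gaps at the start or end of a sequence and non-linear surface dynamics.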
Source journal
ISPRS Journal of Photogrammetry and Remote Sensing (Engineering & Technology: Imaging Science & Photographic Technology)
CiteScore: 21.00
Self-citation rate: 6.30%
Articles per year: 273
Review time: 40 days
About the journal: The ISPRS Journal of Photogrammetry and Remote Sensing (P&RS) serves as the official journal of the International Society for Photogrammetry and Remote Sensing (ISPRS). It acts as a platform for scientists and professionals worldwide who are involved in various disciplines that utilize photogrammetry, remote sensing, spatial information systems, computer vision, and related fields. The journal aims to facilitate communication and dissemination of advancements in these disciplines, while also acting as a comprehensive source of reference and archive. P&RS endeavors to publish high-quality, peer-reviewed research papers that are preferably original and have not been published before. These papers can cover scientific/research, technological development, or application/practical aspects. Additionally, the journal welcomes papers that are based on presentations from ISPRS meetings, as long as they are considered significant contributions to the aforementioned fields. In particular, P&RS encourages the submission of papers that are of broad scientific interest, showcase innovative applications (especially in emerging fields), have an interdisciplinary focus, discuss topics that have received limited attention in P&RS or related journals, or explore new directions in scientific or professional realms. It is preferred that theoretical papers include practical applications, while papers focusing on systems and applications should include a theoretical background.
Latest articles in this journal:
- Predictability of Earth's greenness
- KGBDCNet: keyword-guided building damage captioning network for bi-temporal remote sensing images
- Label-free mangrove mapping from temporally consistent PlanetScope imagery with interpretable deep unfolding network
- Towards operational tracking of weekly crop progress using VIIRS land surface phenology product across the continental United States
- Spatiotemporal CNN framework for quantifying crop-specific salinity damage in coastal agriculture