Structural Representation-Guided GAN for Remote Sensing Image Cloud Removal

Jiajun Yang;Wenjing Wang;Keyan Chen;Liqin Liu;Zhengxia Zou;Zhenwei Shi
IEEE Geoscience and Remote Sensing Letters, vol. 22, pp. 1-5
DOI: 10.1109/LGRS.2024.3516078
Published: December 12, 2024

Abstract

Optical remote sensing imagery is often compromised by cloud cover, making effective cloud-removal techniques essential for enhancing the usability of such data. We designed a novel structural representation-guided generative adversarial network (GAN) framework for cloud removal, in which structure and gradient branches are integrated into the network, helping the model focus on the structural representations of ground objects during image reconstruction. Unlike previous methods that concentrate on recovering pixel information, we emphasize learning the structural information of remote sensing images. We then use error feedback to fuse features from the structural auxiliary branch, guiding the image reconstruction process. During the training phase, synthetic cloud images supervise the optimization of the cloud-removal network, while real cloud images are used in an adversarial, unsupervised manner to improve the network's generalization ability. Additionally, multitemporal revisit images from remote sensing satellites serve as auxiliary inputs, helping the network remove thick clouds reliably. We evaluated our framework on a dataset derived from SEN12MS-CR, and the proposed method outperformed classical cloud-removal methods in both objective performance and subjective visual quality. Furthermore, compared to other methods, our approach achieved superior cloud-removal results on real images.
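The abstract's core idea, supervising reconstruction on structural (gradient) representations rather than raw pixels alone, can be illustrated with a minimal sketch. The function names and the choice of Sobel filters below are assumptions for illustration; the paper's actual gradient branch and error-feedback fusion are learned network components, not fixed filters.

```python
import numpy as np

def sobel_gradients(img):
    """Gradient-magnitude map of a 2-D image via fixed Sobel filters
    (valid convolution, so the output shrinks by 2 in each dimension)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T  # vertical-edge filter is the transpose of the horizontal one
    h, w = img.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            patch = img[i:i + 3, j:j + 3]
            gx[i, j] = np.sum(patch * kx)
            gy[i, j] = np.sum(patch * ky)
    return np.hypot(gx, gy)

def structure_loss(pred, target):
    """L1 distance between the gradient maps of a reconstructed image and
    its cloud-free reference: penalizes mismatch in edge structure rather
    than raw per-pixel intensity error."""
    return np.mean(np.abs(sobel_gradients(pred) - sobel_gradients(target)))
```

In a full training loop such a term would be added to the usual pixel-reconstruction and adversarial losses, steering the generator toward recovering the edges and object boundaries that cloud cover obscures.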