Hang Sun;Zhaoru Yao;Bo Du;Jun Wan;Dong Ren;Lyuyang Tong
Title: Spatial–Frequency Residual-Guided Dynamic Perceptual Network for Remote Sensing Image Haze Removal
DOI: 10.1109/TGRS.2025.3543728
Journal: IEEE Transactions on Geoscience and Remote Sensing, vol. 63, pp. 1–16 (JCR Q1, Engineering, Electrical & Electronic; Impact Factor 8.6)
Published: 2025-02-19 (Journal Article)
Article URL: https://ieeexplore.ieee.org/document/10892218/
Code: https://github.com/789as-syl/SFRDP-Net
Citations: 0
Abstract
Recently, deep neural networks have been extensively explored in remote sensing image haze removal and achieved remarkable performance. However, most existing haze removal methods fail to effectively leverage the fusion of spatial and frequency information, which is crucial for learning more representative features. Moreover, the prevalent perceptual loss used in dehazing model training overlooks the diversity among perceptual channels, leading to performance degradation. To address these issues, we propose a spatial-frequency residual-guided dynamic perceptual network (SFRDP-Net) for remote sensing image haze removal. Specifically, we first propose a residual-guided spatial-frequency interaction (RSFI) module, which incorporates a bidirectional residual complementary mechanism (BRCM) and a frequency residual enhanced attention (FREA). Both BRCM and FREA exploit spatial-frequency complementarity to guide more effective fusion of spatial and frequency information, thus enhancing feature representation capability and improving haze removal performance. Furthermore, a dynamic channel weighting perceptual loss (DCWP-Loss) is developed to impose constraints with varying strengths on different perceptual channels, advancing the reconstruction of high-quality haze-free images. Experiments on challenging benchmark datasets demonstrate our SFRDP-Net outperforms several state-of-the-art haze removal methods. The code is released publicly at https://github.com/789as-syl/SFRDP-Net.
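The RSFI module's core idea, per the abstract, is fusing complementary spatial and frequency information. As a rough illustration of that general principle (not the paper's actual BRCM/FREA design, whose details are not given here), the following NumPy sketch runs a feature map through a spatial branch (local smoothing as a stand-in for a convolution) and a frequency branch (FFT amplitude re-weighting with phase preserved), then averages the two; all function names and the attenuation factor are hypothetical:

```python
import numpy as np

def spatial_frequency_fuse(feat):
    """Illustrative spatial-frequency fusion (NOT the paper's RSFI module).

    feat: 2-D feature map of shape (H, W).
    Returns a fused map combining a spatial branch and a frequency branch.
    """
    h, w = feat.shape

    # Spatial branch: 3x3 box filter as a stand-in for a learned conv layer.
    pad = np.pad(feat, 1, mode="edge")
    spatial = sum(
        pad[i:i + h, j:j + w] for i in range(3) for j in range(3)
    ) / 9.0

    # Frequency branch: damp the FFT amplitude (a crude haze-suppression
    # proxy) while keeping the phase, then transform back.
    f = np.fft.fft2(feat)
    amp, phase = np.abs(f), np.angle(f)
    amp = amp * 0.8  # hypothetical fixed attenuation; a real model learns this
    freq = np.real(np.fft.ifft2(amp * np.exp(1j * phase)))

    # Fuse the two complementary branches.
    return 0.5 * (spatial + freq)
```

In the actual network the branch operations and their fusion weights would be learned, and the paper's residual guidance (BRCM) steers which branch's information dominates at each location.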
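The abstract also states that DCWP-Loss imposes constraints of varying strength on different perceptual channels rather than averaging them uniformly. One simple way to realize that idea (a hypothetical sketch, not the paper's exact formulation) is to weight each channel's error by a softmax over the per-channel error magnitudes, so poorly reconstructed channels are penalized harder:

```python
import numpy as np

def dcwp_style_loss(pred_feats, target_feats, temperature=1.0):
    """Sketch of a dynamically channel-weighted perceptual loss.

    pred_feats, target_feats: arrays of shape (C, H, W), e.g. backbone
    feature maps of the dehazed output and the haze-free ground truth.
    The exact weighting scheme of the paper's DCWP-Loss may differ.
    """
    # Per-channel mean absolute error.
    per_channel = np.abs(pred_feats - target_feats).mean(axis=(1, 2))  # (C,)

    # Dynamic weights: softmax over channels, so channels with larger
    # error receive proportionally stronger constraints.
    w = np.exp(per_channel / temperature)
    w /= w.sum()

    # Weighted sum replaces the uniform channel average of a plain
    # perceptual loss.
    return float((w * per_channel).sum())
```

Because the weights increase with the per-channel error, this loss is never smaller than the uniform channel average, concentrating gradient pressure on the worst-reconstructed perceptual channels.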
Journal Description
IEEE Transactions on Geoscience and Remote Sensing (TGRS) is a monthly publication that focuses on the theory, concepts, and techniques of science and engineering as applied to sensing the land, oceans, atmosphere, and space; and the processing, interpretation, and dissemination of this information.