Title: Edge and Flow Guided Iterative CNN for Remote Sensing Image Change Detection
Authors: Yuting Liu; Shihua Li; Ze He; Kaitong Liu
DOI: 10.1109/TGRS.2025.3550973
Journal: IEEE Transactions on Geoscience and Remote Sensing, vol. 63, pp. 1-22 (JCR Q1, Engineering, Electrical & Electronic; impact factor 8.6; Region 1, Earth Science)
Publication date: 2025-03-14 (Journal Article)
Article URL: https://ieeexplore.ieee.org/document/10925393/
Code: https://github.com/LYT-work/EFICNN
Open access: no
Citations: 0
Abstract
Change detection (CD) is an essential aspect of urban planning and resource management. Deep learning (DL) has the potential to detect complex changes from massive data more automatically than traditional methods. However, current DL-based CD methods have limited abilities to efficiently extract bi-temporal features, emphasize real changes, detect weak edges, and integrate multiscale change information in detail. To address these challenges, we propose an edge- and flow-guided iterative convolutional neural network (EFICNN). Our network introduces an efficient iterative backbone (IB) as the feature extractor, obtained through pretraining and fine-tuning. Embedding the IB into a Siamese architecture makes it possible to extract richer bi-temporal features from the input remote sensing (RS) images. On this basis, we design a 3-D difference enhancement module (3D-DEM) that uses a parameter-free 3-D attention mechanism to emphasize and connect bi-temporal differential and concatenated change features. Additionally, an edge-guided attention module (EGAM) is developed to enhance weak edges; it combines reverse attention with Laplacian-pyramid-based edge attention to capture change backgrounds and high-frequency change edges, respectively. Inspired by optical flow alignment between adjacent video frames, we employ a flow-guided fusion module (FGFM) to propagate fine-grained features from deep to shallow layers and dynamically optimize the multiscale fusion process. Qualitative and quantitative experiments on three publicly available datasets demonstrate that our method outperforms 17 state-of-the-art methods in real change recognition and edge refinement. Our code is available at https://github.com/LYT-work/EFICNN.
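The abstract's 3D-DEM relies on a parameter-free 3-D attention mechanism but does not spell out its formula. The sketch below is a minimal NumPy illustration assuming a SimAM-style energy function (the best-known parameter-free 3-D attention): each neuron is gated by the sigmoid of an inverse-energy term that grows with its deviation from the channel mean, so distinctive activations (such as change responses) are emphasized without adding any learnable parameters. The function name and the choice of SimAM are assumptions for illustration, not the paper's exact module.

```python
import numpy as np

def parameter_free_3d_attention(x, lam=1e-4):
    """SimAM-style parameter-free 3-D attention (assumed formulation).

    x   : feature map of shape (C, H, W)
    lam : small regularizer in the energy denominator
    """
    C, H, W = x.shape
    n = H * W - 1  # neurons per channel, excluding the target one
    mu = x.mean(axis=(1, 2), keepdims=True)          # channel-wise mean
    d = (x - mu) ** 2                                # squared deviation
    var = d.sum(axis=(1, 2), keepdims=True) / n      # channel-wise variance
    e_inv = d / (4.0 * (var + lam)) + 0.5            # inverse energy per neuron
    # Sigmoid gating: neurons far from the mean get weights closer to 1.
    return x * (1.0 / (1.0 + np.exp(-e_inv)))
```

Because the weights come from per-channel statistics across the full (C, H, W) volume rather than separate channel/spatial branches, this kind of attention is "3-D" and adds no parameters, which matches the efficiency claim in the abstract.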
Journal Introduction:
IEEE Transactions on Geoscience and Remote Sensing (TGRS) is a monthly publication that focuses on the theory, concepts, and techniques of science and engineering as applied to sensing the land, oceans, atmosphere, and space; and the processing, interpretation, and dissemination of this information.