Edge and Flow Guided Iterative CNN for Remote Sensing Image Change Detection

IF 8.6 · CAS Region 1 (Earth Sciences) · Q1 Engineering, Electrical & Electronic
IEEE Transactions on Geoscience and Remote Sensing · Vol. 63, pp. 1-22 · Pub Date: 2025-03-14 · DOI: 10.1109/TGRS.2025.3550973
Yuting Liu;Shihua Li;Ze He;Kaitong Liu
Citations: 0

Abstract

Change detection (CD) is an essential aspect of urban planning and resource management. Deep learning (DL) has the potential to detect complex changes from massive data more automatically than traditional methods. However, current DL-based CD methods have limited abilities to efficiently extract bi-temporal features, emphasize real changes, detect weak edges, and ultimately integrate multiscale change information in detail. To address these challenges, we propose an edge and flow guided iterative convolutional neural network (EFICNN). Our network introduces an innovative and efficient iterative backbone (IB) as the feature extractor through pretrained fine-tuning. By embedding the IB into a Siamese architecture, it is possible to extract richer bi-temporal features from the input remote sensing (RS) images. On this basis, we design a 3-D difference enhancement module (3D-DEM) that utilizes a parameter-free 3-D attention mechanism to emphasize and connect bi-temporal differential and concatenated change features. Additionally, an edge-guided attention module (EGAM) is developed to enhance weak edges. This module combines reverse attention and edge attention based on the Laplacian pyramid to capture change backgrounds and high-frequency change edges, respectively. Inspired by the optical flow alignment between adjacent frames, we ultimately employ a flow-guided fusion module (FGFM) to promote the propagation of fine-grained features from deep to shallow and dynamically optimize the multiscale fusion process. Qualitative and quantitative experiments on three publicly available datasets demonstrate that our method outperforms 17 SOTA methods in terms of real change recognition and edge refinement. Our code is available at https://github.com/LYT-work/EFICNN.
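The abstract describes the 3-D attention in 3D-DEM only as "parameter-free." The paper's exact formulation is not given here, but a well-known parameter-free attention of this kind is the SimAM energy-based weighting, which scores each activation by how much it deviates from its channel mean and needs no learned parameters. The sketch below is an illustrative NumPy implementation of that idea, not EFICNN's actual module:

```python
import numpy as np

def simam_attention(x, eps=1e-4):
    """SimAM-style parameter-free attention over a (C, H, W) feature map:
    each activation is gated by an energy term measuring its deviation
    from the per-channel mean, with no learnable weights."""
    c, h, w = x.shape
    n = h * w - 1
    mu = x.mean(axis=(1, 2), keepdims=True)        # per-channel mean
    d = (x - mu) ** 2                              # squared deviation
    var = d.sum(axis=(1, 2), keepdims=True) / n    # per-channel variance
    e_inv = d / (4 * (var + eps)) + 0.5            # inverse energy per unit
    return x * (1.0 / (1.0 + np.exp(-e_inv)))      # sigmoid gating

feat = np.random.default_rng(0).standard_normal((8, 16, 16)).astype(np.float32)
out = simam_attention(feat)
print(out.shape)  # (8, 16, 16)
```

Because the weighting is derived entirely from the statistics of the input tensor, it can be applied to concatenated or differential bi-temporal features without adding parameters, which matches the abstract's stated motivation.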
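The EGAM's edge attention is "based on the Laplacian pyramid," i.e., on the high-frequency residual left after subtracting a Gaussian-blurred copy of a signal from itself. As a hedged illustration of that residual (again, not the paper's code), one level of the decomposition can be computed in NumPy as follows; `gaussian_blur` is a minimal separable blur written for this sketch:

```python
import numpy as np

def gaussian_blur(img, k=5, sigma=1.0):
    """Separable Gaussian blur of a 2-D array with reflect padding."""
    ax = np.arange(k) - k // 2
    g = np.exp(-(ax ** 2) / (2 * sigma ** 2))
    g /= g.sum()
    pad = k // 2
    p = np.pad(img, pad, mode="reflect")                       # (H+2p, W+2p)
    rows = np.array([np.convolve(r, g, mode="valid") for r in p])      # blur width
    return np.array([np.convolve(c, g, mode="valid") for c in rows.T]).T  # blur height

def laplacian_edges(img):
    """One Laplacian-pyramid level: the input minus its blurred version
    keeps only high-frequency content (edges, fine texture)."""
    hf = img - gaussian_blur(img)
    return np.abs(hf) / (np.abs(hf).max() + 1e-8)              # normalized edge map

img = np.zeros((32, 32), dtype=np.float32)
img[8:24, 8:24] = 1.0                                          # bright square
edges = laplacian_edges(img)
print(edges.shape)  # (32, 32); response concentrates on the square's boundary
```

Using this residual as an attention map emphasizes exactly the weak, thin boundaries that plain downsampled features tend to lose, which is the failure mode the abstract says EGAM targets.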
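The FGFM is said to be inspired by optical-flow alignment between adjacent frames: a predicted flow field registers deep, coarse features to a shallower scale before fusion. The core operation in such flow-guided fusion schemes is bilinear warping by a dense flow field; the NumPy sketch below shows that warping step alone (the flow itself would be predicted by the network, which is omitted here):

```python
import numpy as np

def flow_warp(feat, flow):
    """Warp a (H, W) feature map by a dense flow field of shape (H, W, 2)
    using bilinear sampling; flow[..., 0] is the x-offset, flow[..., 1]
    the y-offset, with sampling coordinates clipped to the image."""
    h, w = feat.shape
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    x = np.clip(xs + flow[..., 0], 0, w - 1)
    y = np.clip(ys + flow[..., 1], 0, h - 1)
    x0 = np.floor(x).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    y0 = np.floor(y).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    wx, wy = x - x0, y - y0
    top = feat[y0, x0] * (1 - wx) + feat[y0, x1] * wx
    bot = feat[y1, x0] * (1 - wx) + feat[y1, x1] * wx
    return top * (1 - wy) + bot * wy

f = np.arange(16, dtype=np.float32).reshape(4, 4)
shift = np.zeros((4, 4, 2))
shift[..., 0] = 1.0                 # every pixel samples one column to the right
warped = flow_warp(f, shift)
print(warped[0])  # [1. 2. 3. 3.]
```

In a fusion module, the deep feature map would first be upsampled, then warped by the learned flow so that its activations line up spatially with the shallow features before the two are combined.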
Source journal

IEEE Transactions on Geoscience and Remote Sensing (Engineering & Technology - Geochemistry & Geophysics)

CiteScore: 11.50
Self-citation rate: 28.00%
Annual publication volume: 1912
Review time: 4.0 months
Journal overview: IEEE Transactions on Geoscience and Remote Sensing (TGRS) is a monthly publication that focuses on the theory, concepts, and techniques of science and engineering as applied to sensing the land, oceans, atmosphere, and space; and the processing, interpretation, and dissemination of this information.
Latest articles from this journal

Fine-Scale Structure Reconstruction of Weather Radar Echoes via Blind Super-Resolution
Generalized Iterative Sparse Maximum Likelihood Algorithm for the Detection of Buried Targets
Unsupervised Snowy-Weather Point Cloud Denoising via Two-Stage Filter-Network Collaboration
Numerical Study on Anisotropic Permeability Inversion from Dipole Seismoelectric Logging in Fluid-saturated Porous Formations
Hybrid F-K Filtering and Deep Learning for P/S Separation in DAS VSP Data