Remote Sensing Change Detection With Bitemporal and Differential Feature Interactive Perception

Hao Chang;Peijin Wang;Wenhui Diao;Guangluan Xu;Xian Sun
{"title":"利用位时和差异特征交互感知进行遥感变化检测","authors":"Hao Chang;Peijin Wang;Wenhui Diao;Guangluan Xu;Xian Sun","doi":"10.1109/TIP.2024.3424335","DOIUrl":null,"url":null,"abstract":"Recently, the transformer has achieved notable success in remote sensing (RS) change detection (CD). Its outstanding long-distance modeling ability can effectively recognize the change of interest (CoI). However, in order to obtain the precise pixel-level change regions, many methods directly integrate the stacked transformer blocks into the UNet-style structure, which causes the high computation costs. Besides, the existing methods generally consider bitemporal or differential features separately, which makes the utilization of ground semantic information still insufficient. In this paper, we propose the multiscale dual-space interactive perception network (MDIPNet) to fill these two gaps. On the one hand, we simplify the stacked multi-head transformer blocks into the single-layer single-head attention module and further introduce the lightweight parallel fusion module (LPFM) to perform the efficient information integration. On the other hand, based on the simplified attention mechanism, we propose the cross-space perception module (CSPM) to connect the bitemporal and differential feature spaces, which can help our model suppress the pseudo changes and mine the more abundant semantic consistency of CoI. Extensive experiment results on three challenging datasets and one urban expansion scene indicate that compared with the mainstream CD methods, our MDIPNet obtains the state-of-the-art (SOTA) performance while further controlling the computation costs.","PeriodicalId":94032,"journal":{"name":"IEEE transactions on image processing : a publication of the IEEE Signal Processing Society","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-07-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Remote Sensing Change Detection With Bitemporal and Differential Feature Interactive Perception\",\"authors\":\"Hao Chang;Peijin Wang;Wenhui Diao;Guangluan Xu;Xian Sun\",\"doi\":\"10.1109/TIP.2024.3424335\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Recently, the transformer has achieved notable success in remote sensing (RS) change detection (CD). Its outstanding long-distance modeling ability can effectively recognize the change of interest (CoI). However, in order to obtain the precise pixel-level change regions, many methods directly integrate the stacked transformer blocks into the UNet-style structure, which causes the high computation costs. Besides, the existing methods generally consider bitemporal or differential features separately, which makes the utilization of ground semantic information still insufficient. In this paper, we propose the multiscale dual-space interactive perception network (MDIPNet) to fill these two gaps. On the one hand, we simplify the stacked multi-head transformer blocks into the single-layer single-head attention module and further introduce the lightweight parallel fusion module (LPFM) to perform the efficient information integration. On the other hand, based on the simplified attention mechanism, we propose the cross-space perception module (CSPM) to connect the bitemporal and differential feature spaces, which can help our model suppress the pseudo changes and mine the more abundant semantic consistency of CoI. 
Extensive experiment results on three challenging datasets and one urban expansion scene indicate that compared with the mainstream CD methods, our MDIPNet obtains the state-of-the-art (SOTA) performance while further controlling the computation costs.\",\"PeriodicalId\":94032,\"journal\":{\"name\":\"IEEE transactions on image processing : a publication of the IEEE Signal Processing Society\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-07-15\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE transactions on image processing : a publication of the IEEE Signal Processing Society\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10599227/\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on image processing : a publication of the IEEE Signal Processing Society","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10599227/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Recently, the transformer has achieved notable success in remote sensing (RS) change detection (CD). Its strong long-range modeling ability can effectively recognize the change of interest (CoI). However, to obtain precise pixel-level change regions, many methods directly integrate stacked transformer blocks into a UNet-style structure, which incurs high computation costs. In addition, existing methods generally consider bitemporal or differential features separately, so ground semantic information remains insufficiently exploited. In this paper, we propose the multiscale dual-space interactive perception network (MDIPNet) to fill these two gaps. On the one hand, we simplify the stacked multi-head transformer blocks into a single-layer, single-head attention module and further introduce a lightweight parallel fusion module (LPFM) for efficient information integration. On the other hand, based on the simplified attention mechanism, we propose a cross-space perception module (CSPM) that connects the bitemporal and differential feature spaces, helping the model suppress pseudo changes and mine richer semantic consistency of the CoI. Extensive experiments on three challenging datasets and one urban expansion scene show that, compared with mainstream CD methods, MDIPNet achieves state-of-the-art (SOTA) performance while further controlling computation costs.
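The abstract describes replacing stacked multi-head transformer blocks with a single-layer, single-head attention module and using it to connect the bitemporal and differential feature spaces. The sketch below is only a minimal illustration of that general idea in plain PyTorch, under assumed shapes and hypothetical names (CrossSpaceAttentionSketch, dim, feat_t1, feat_t2, feat_diff are all assumptions); it is not the paper's CSPM, LPFM, or MDIPNet implementation.

```python
# Minimal sketch (not the authors' code): single-head cross-attention in which
# differential features query the concatenated bitemporal features.
import torch
import torch.nn as nn


class CrossSpaceAttentionSketch(nn.Module):
    """Single-layer, single-head attention linking differential and bitemporal features."""

    def __init__(self, dim: int):
        super().__init__()
        self.q_proj = nn.Conv2d(dim, dim, kernel_size=1)            # queries from differential features
        self.kv_proj = nn.Conv2d(2 * dim, 2 * dim, kernel_size=1)   # keys/values from bitemporal features
        self.out_proj = nn.Conv2d(dim, dim, kernel_size=1)
        self.scale = dim ** -0.5

    def forward(self, feat_t1, feat_t2, feat_diff):
        # feat_t1, feat_t2, feat_diff: (B, C, H, W) feature maps from a shared backbone
        b, c, h, w = feat_diff.shape
        q = self.q_proj(feat_diff).flatten(2).transpose(1, 2)        # (B, HW, C)
        kv = self.kv_proj(torch.cat([feat_t1, feat_t2], dim=1))      # (B, 2C, H, W)
        k, v = kv.chunk(2, dim=1)                                    # each (B, C, H, W)
        k = k.flatten(2)                                             # (B, C, HW)
        v = v.flatten(2).transpose(1, 2)                             # (B, HW, C)
        attn = torch.softmax((q @ k) * self.scale, dim=-1)           # (B, HW, HW)
        out = (attn @ v).transpose(1, 2).reshape(b, c, h, w)         # back to (B, C, H, W)
        return self.out_proj(out) + feat_diff                        # residual keeps differential cues


if __name__ == "__main__":
    t1 = torch.randn(2, 64, 32, 32)
    t2 = torch.randn(2, 64, 32, 32)
    diff = torch.abs(t1 - t2)                   # one common way to form differential features
    module = CrossSpaceAttentionSketch(dim=64)
    print(module(t1, t2, diff).shape)           # torch.Size([2, 64, 32, 32])
```

Using a single head and a single layer keeps the attention cheap relative to stacked multi-head blocks, which matches the efficiency motivation stated in the abstract; the exact fusion and multiscale design of MDIPNet is described in the paper itself.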