RAMSF: A Novel Generic Framework for Optical Remote Sensing Multimodal Spatial-Spectral Fusion

IF 8.6 | CAS Tier 1 (Earth Science) | Q1 Engineering, Electrical & Electronic | IEEE Transactions on Geoscience and Remote Sensing | Pub Date: 2025-03-19 | DOI: 10.1109/TGRS.2025.3552937
Chuang Liu;Zhiqi Zhang;Mi Wang
IEEE Transactions on Geoscience and Remote Sensing, vol. 63, pp. 1-22, 2025.
Citations: 0

Abstract

Optical remote sensing (ORS) multimodal spatial-spectral fusion (MSF) aims to obtain high-resolution images containing fine-grained spatial details and high-fidelity spectral information, which are crucial for downstream tasks and real-world applications. Existing methods can yield promising outcomes in specific fusion scenarios. However, due to the coarse representation of spatial details and the imprecise alignment of spatial-spectral features, the majority of methods encounter difficulties in balancing spatial and spectral preservation. This imbalance tends to cause distortion in the fused image, rendering these task-specific methods less adaptable and more challenging to apply simultaneously to different ORS-MSF tasks. To address this gap, this article introduces a generic framework that focuses on generalization and practical applicability, rather than solely optimizing the performance of models in a specific fusion task. By conducting a comprehensive analysis of theoretical models and network architectures, we systematically decompose the fusion process into two distinct phases, namely, detail reconstruction and feature alignment. Consequently, the proposed framework consists of two fundamental components: low-frequency-driven high-frequency salient detail reconstruction (LHSDR) and coordinate-modal-guided spatial-spectral feature progressive alignment (CSFPA). In LHSDR, the joint spatial degradation process in various frequency directions from diverse modal data is estimated and salient details are derived in a hierarchical integration, with low frequencies driving high ones. These coupled high-frequency details could lay the foundation for subsequent implementation of high-fidelity fusion. Furthermore, CSFPA estimates the joint spectral degradation process by establishing coordinate-mode relations between coupled high-frequency details and corresponding spectral information in the continuous domain. 
As a result, fused images with high spatial-spectral fidelity are obtained through fine detail reconstruction and accurate feature alignment. Ten datasets derived from three different ORS-MSF tasks are utilized in the experiments, comprising eight simulated and five real test sets. Our proposed methodology demonstrates robust fusion performance and generalization capability on data with different spectral bands at various resolutions. All implementations will be published on our website.
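To make the abstract's two-phase recipe concrete, the sketch below illustrates the generic idea behind detail reconstruction and injection for spatial-spectral fusion: extract high-frequency spatial detail from the high-resolution modality and inject it into the upsampled spectral bands. This is a minimal, classical detail-injection sketch, not the authors' RAMSF/LHSDR/CSFPA implementation; the function names, array shapes, and the box-blur stand-in for the spatial degradation model are all assumptions.

```python
# Minimal sketch (NOT the RAMSF implementation): generic high-frequency
# detail injection for spatial-spectral fusion. A box blur stands in for
# the estimated spatial degradation; per-band gains stand in for the
# spectral alignment step. All shapes and names are illustrative.
import numpy as np

def box_blur(img, k=5):
    """Crude low-pass filter: mean over a k-by-k window with edge padding."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def detail_inject_fuse(pan, ms_up):
    """pan: (H, W) high-resolution image; ms_up: (B, H, W) upsampled bands.

    Returns (B, H, W) fused bands: each band plus gain-scaled detail.
    """
    detail = pan - box_blur(pan)  # high-frequency salient detail
    fused = np.empty_like(ms_up, dtype=float)
    for b, band in enumerate(ms_up):
        # Per-band gain roughly matches detail amplitude to the band's
        # intensity level (a stand-in for proper spectral alignment).
        gain = band.mean() / max(pan.mean(), 1e-8)
        fused[b] = band + gain * detail
    return fused
```

The sketch preserves each band's low-frequency (spectral) content while borrowing spatial detail from the sharper modality; RAMSF's contribution, per the abstract, is doing both steps far more carefully, with hierarchical low-frequency-driven detail reconstruction and coordinate-guided alignment in the continuous domain.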
Source Journal

IEEE Transactions on Geoscience and Remote Sensing
Category: Engineering & Technology - Geochemistry & Geophysics
CiteScore: 11.50
Self-citation rate: 28.00%
Annual article count: 1912
Review time: 4.0 months
About the journal: IEEE Transactions on Geoscience and Remote Sensing (TGRS) is a monthly publication that focuses on the theory, concepts, and techniques of science and engineering as applied to sensing the land, oceans, atmosphere, and space; and the processing, interpretation, and dissemination of this information.
Latest Articles in This Journal

- A Hierarchical Vision-Language Model-Guided Feature Fusion Framework for Referring Remote Sensing Image Segmentation
- Efficient One-Step Orthogonal Consensus Framework for Multi-View Remote Sensing Clustering
- Temporally-Similar Structure-Aware Spatiotemporal Fusion of Satellite Images
- WCDMF-Net: Wavelet-based Cross-Domain Multistage Feature Fusion Network for Infrared Small Target Detection
- Satellite Video Continuous Space-Time Super-Resolution via Mask-Based Temporal-Aware Warping and Cross-Level Frequency Integration