Multiscale Bilateral Attention Fusion Network for Pansharpening
Zhongyuan Guo; Jiawei Li; Jia Lei; Jinyuan Liu; Shihua Zhou; Bin Wang; Nikola K. Kasabov
IEEE Transactions on Artificial Intelligence, vol. 5, no. 11, pp. 5828-5843
DOI: 10.1109/TAI.2024.3418378
Published: 2024-06-24
https://ieeexplore.ieee.org/document/10570347/
Abstract
High-resolution multispectral (HRMS) images combine spatial and spectral information originating from panchromatic (PAN) and low-resolution multispectral (LRMS) images. Pansharpening is an effective and widely used technique for obtaining HRMS images. However, most pansharpening approaches align the resolutions of the PAN and LRMS images through direct interpolation, which may introduce artifacts and distort the color of the fused results. To address this issue, an unsupervised progressive pansharpening framework, MSBANet, is proposed, which adopts a multistage fusion strategy. Each stage contains an attention interactive extraction module (AIEM) and a multiscale bilateral fusion module (MBFM). The AIEM extracts spatial and spectral features from the input images and captures the correlations between those features. The MBFM efficiently integrates the information produced by the AIEM and improves the context awareness of MSBANet. We design a hybrid loss function that enhances the ability of the fusion network to preserve spectral and texture details. In qualitative and quantitative experiments on four datasets, MSBANet outperformed state-of-the-art pansharpening techniques. The code will be released.
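The abstract gives no implementation details, but the overall data flow it describes (upsample the LRMS image, then repeatedly refine it through an extraction step and a fusion step over several stages) can be illustrated with a heavily simplified sketch. Everything below is an assumption for illustration: `aiem_stub`, `mbfm_stub`, and `msbanet_stub` are hypothetical stand-ins, not the paper's actual modules, and the toy arithmetic merely mimics the stage-wise structure, not attention or multiscale fusion.

```python
import numpy as np

def aiem_stub(pan, hrms):
    """Hypothetical stand-in for the attention interactive extraction
    module (AIEM): returns the two feature streams plus a toy
    cross-correlation map between them (the real module uses attention)."""
    corr = pan * hrms.mean(axis=-1)  # (H, W) toy spatial-spectral interaction
    return pan, hrms, corr

def mbfm_stub(pan_feat, ms_feat, corr):
    """Hypothetical stand-in for the multiscale bilateral fusion module
    (MBFM): blends the spatial and spectral streams, modulated by the
    correlation map (the real module fuses at multiple scales)."""
    return 0.5 * (pan_feat[..., None] + ms_feat) + 0.1 * corr[..., None]

def msbanet_stub(pan, lrms, n_stages=3):
    """Toy progressive multistage loop in the spirit of the framework:
    each stage refines the HRMS estimate from the previous stage."""
    # Upsample LRMS to PAN resolution by nearest-neighbour repetition.
    scale = pan.shape[0] // lrms.shape[0]
    hrms = np.repeat(np.repeat(lrms, scale, axis=0), scale, axis=1)
    for _ in range(n_stages):
        p, m, c = aiem_stub(pan, hrms)
        hrms = mbfm_stub(p, m, c)
    return hrms

pan = np.random.rand(64, 64)       # high-resolution panchromatic band
lrms = np.random.rand(16, 16, 4)   # low-resolution 4-band multispectral
out = msbanet_stub(pan, lrms)
print(out.shape)  # (64, 64, 4): HRMS estimate at PAN resolution
```

The only point the sketch makes is structural: the fused result is produced progressively, with each stage consuming the previous estimate together with the PAN input, rather than in a single interpolation-and-merge pass.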