Nonreference object-based pansharpening quality assessment
Shiva Aghapour Maleki, Hassan Ghassemian, Maryam Imani
Egyptian Journal of Remote Sensing and Space Sciences, vol. 27, no. 2, pp. 227–241. Published 2024-03-21. DOI: 10.1016/j.ejrs.2024.03.002
Article page: https://www.sciencedirect.com/science/article/pii/S111098232400022X
Abstract
Pansharpening fuses panchromatic (PAN) and multispectral (MS) images to obtain a high-resolution image with enhanced spectral and spatial information. Assessing the quality of the resulting fused image is challenging because no high-resolution reference image exists. Numerous methods have been proposed to address this, ranging from assessment at reduced resolution to full-resolution evaluation. Many existing approaches are pixel-based: quality metrics are computed on individual pixels and then averaged. In this article, we introduce a novel object-based method for assessing the quality of pansharpened images at full resolution. Object-based assessment reflects how different areas of the fused image respond to the fusion process. Our approach extracts objects from the given image and evaluates each extracted object, thereby capturing the distinct responses of different objects within the fused image to the fusion process. The proposed method leverages an object extraction technique known as segmentation by nearest neighbor (SNN) to extract objects from the MS image; this technique segments the image based on its own characteristics, without any parameter tuning. The extracted objects are then mapped onto both the PAN and fused images. The proposed spectral index measures the spectral homogeneity of the fused image's objects, and the proposed spatial index measures the spatial content injected from the PAN image into the fused image's objects. Experimental results underscore the robustness and reliability of the proposed method. Additionally, by visualizing distortion values on object maps, we gain insight into fusion quality across diverse areas within the scene.
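To make the evaluation flow concrete, the sketch below gives a minimal Python (NumPy-only) version of the object-based idea described above. It assumes an object label map is already available and uses two simple stand-in measures per object: a mean spectral angle (as a spectral-consistency proxy) and a correlation between PAN and fused high-pass detail (as a spatial-injection proxy). These are illustrative placeholders, not the SNN segmentation or the exact spectral and spatial indices proposed in the paper.

```python
# Illustrative sketch of an object-based pansharpening quality check.
# NOTE: the object map and the two per-object scores are placeholders,
# NOT the SNN segmentation or the exact indices proposed in the paper.
import numpy as np

def _smooth(img, k=5):
    """Box-filter smoothing used to form a simple high-pass detail image."""
    pad = k // 2
    padded = np.pad(img, pad, mode="reflect")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def per_object_scores(ms_up, pan, fused, labels):
    """Compute placeholder spectral/spatial scores for each object.

    ms_up  : (H, W, B) multispectral image upsampled to PAN resolution
    pan    : (H, W)    panchromatic image
    fused  : (H, W, B) pansharpened image
    labels : (H, W)    integer object map (one label per object)
    """
    scores = {}
    fused_intensity = fused.mean(axis=2)                 # crude intensity proxy
    pan_hp = pan - _smooth(pan)                          # PAN high-pass detail
    fused_hp = fused_intensity - _smooth(fused_intensity)

    for obj in np.unique(labels):
        mask = labels == obj

        # Spectral: mean spectral angle between MS and fused spectra
        # inside the object (a common consistency stand-in).
        ms_vec, fu_vec = ms_up[mask], fused[mask]        # (N, B) each
        cosang = np.sum(ms_vec * fu_vec, axis=1) / (
            np.linalg.norm(ms_vec, axis=1) * np.linalg.norm(fu_vec, axis=1) + 1e-12)
        spectral = float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))).mean())

        # Spatial: correlation between PAN detail and fused detail in the
        # object, a proxy for how much PAN spatial content was injected.
        a, b = pan_hp[mask], fused_hp[mask]
        spatial = float(np.corrcoef(a, b)[0, 1]) if a.std() > 0 and b.std() > 0 else 0.0

        scores[int(obj)] = {"spectral_angle_deg": spectral, "pan_detail_corr": spatial}
    return scores
```

In the paper itself, the object map comes from SNN segmentation of the MS image, and the placeholder scores above are replaced by the proposed spectral-homogeneity and spatial-injection indices; the resulting per-object values can then be painted back onto the object map to visualize distortion across different areas of the scene.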
About the Journal
The Egyptian Journal of Remote Sensing and Space Sciences (EJRS) encompasses a comprehensive range of topics within Remote Sensing, Geographic Information Systems (GIS), planetary geology, and space technology development, including theories, applications, and modeling. EJRS aims to disseminate high-quality, peer-reviewed research focusing on the advancement of remote sensing and GIS technologies and their practical applications for effective planning, sustainable development, and environmental resource conservation. The journal particularly welcomes innovative papers with broad scientific appeal.