{"title":"Blind quality evaluator for multi-exposure fusion image via joint sparse features and complex-wavelet statistical characteristics","authors":"Benquan Yang, Yueli Cui, Lihong Liu, Guang Chen, Jiamin Xu, Junhao Lin","doi":"10.1007/s00530-024-01404-x","DOIUrl":null,"url":null,"abstract":"<p>Multi-Exposure Fusion (MEF) technique aims to fuse multiple images taken from the same scene at different exposure levels into an image with more details. Although more and more MEF algorithms have been developed, how to effectively evaluate the quality of MEF images has not been thoroughly investigated. To address this issue, a blind quality evaluator for MEF image via joint sparse features and complex-wavelet statistical characteristics is developed. Specifically, considering that color and structure distortions are inevitably introduced during the MEF operations, we first train a color dictionary in the Lab color space based on the color perception mechanism of human visual system, and extract sparse perceptual features to capture the color and structure distortions. Given an MEF image to be evaluated, its components in both luminance and color channels are derived first. Subsequently, these obtained components are sparsely encoded using the trained color dictionary, and the perceived sparse features are extracted from the derived sparse coefficients. In addition, considering the insensitivity of sparse features towards weak structural information in images, complex steerable pyramid decomposition is further performed over the generated chromaticity map. Consequently, perceptual features of magnitude, phase and cross-scale structural similarity index are extracted from complex wavelet coefficients within the chromaticity map as quality-aware features. Experimental results demonstrate that our proposed metric outperforms the existing classic image quality evaluation metrics while maintaining high accordance with human visual perception.</p>","PeriodicalId":3,"journal":{"name":"ACS Applied Electronic Materials","volume":null,"pages":null},"PeriodicalIF":4.3000,"publicationDate":"2024-07-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACS Applied Electronic Materials","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1007/s00530-024-01404-x","RegionNum":3,"RegionCategory":"材料科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Abstract
The Multi-Exposure Fusion (MEF) technique aims to fuse multiple images of the same scene, captured at different exposure levels, into a single image with richer detail. Although an increasing number of MEF algorithms have been developed, how to effectively evaluate the quality of MEF images has not been thoroughly investigated. To address this issue, a blind quality evaluator for MEF images via joint sparse features and complex-wavelet statistical characteristics is developed. Specifically, considering that color and structure distortions are inevitably introduced during MEF operations, we first train a color dictionary in the Lab color space based on the color perception mechanism of the human visual system, and extract sparse perceptual features to capture these color and structure distortions. Given an MEF image to be evaluated, its luminance and color channel components are derived first. These components are then sparsely encoded using the trained color dictionary, and perceptual sparse features are extracted from the resulting sparse coefficients. In addition, because sparse features are insensitive to weak structural information in images, complex steerable pyramid decomposition is further performed on the generated chromaticity map. Magnitude, phase, and cross-scale structural-similarity features are then extracted from the complex wavelet coefficients of the chromaticity map as quality-aware features. Experimental results demonstrate that our proposed metric outperforms existing classic image quality assessment metrics while remaining highly consistent with human visual perception.
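To make the sparse-feature step concrete, the following Python sketch illustrates the general idea under assumed settings: 8×8 patches, a 128-atom dictionary, and OMP with 6 non-zero coefficients are illustrative choices, not the paper's actual configuration, and the pooled coefficient statistics are a plausible stand-in for the paper's perceptual sparse features. It assumes scikit-learn and scikit-image are available.

```python
# A minimal sketch of the sparse-feature step: learn a color dictionary
# from Lab-channel patches offline, then sparse-code a test MEF image
# against it and pool simple coefficient statistics as features.
# Patch size, dictionary size, and sparsity level are assumptions.
import numpy as np
from skimage.color import rgb2lab
from skimage.util import view_as_windows
from sklearn.decomposition import DictionaryLearning, sparse_encode

def extract_patches(channel, patch=8, step=8):
    """Collect non-overlapping patch vectors from one Lab channel."""
    windows = view_as_windows(channel, (patch, patch), step=step)
    return windows.reshape(-1, patch * patch)

def train_color_dictionary(train_images_rgb, n_atoms=128):
    """Offline stage: learn an overcomplete dictionary on Lab patches."""
    patches = []
    for img in train_images_rgb:
        lab = rgb2lab(img)                      # L, a, b channels
        for c in range(3):
            patches.append(extract_patches(lab[..., c]))
    X = np.vstack(patches).astype(np.float64)
    X -= X.mean(axis=1, keepdims=True)          # remove per-patch DC
    learner = DictionaryLearning(n_components=n_atoms,
                                 transform_algorithm="omp", max_iter=20)
    learner.fit(X)
    return learner.components_                  # (n_atoms, patch*patch)

def sparse_features(img_rgb, dictionary, n_nonzero=6):
    """Online stage: sparse-code luminance and color channels."""
    lab = rgb2lab(img_rgb)
    feats = []
    for c in range(3):                          # luminance + two color channels
        X = extract_patches(lab[..., c]).astype(np.float64)
        X -= X.mean(axis=1, keepdims=True)
        codes = sparse_encode(X, dictionary, algorithm="omp",
                              n_nonzero_coefs=n_nonzero)
        # pool simple statistics of the sparse coefficients as features
        feats += [np.abs(codes).mean(), codes.std(), (codes != 0).mean()]
    return np.asarray(feats)
```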
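The complex-wavelet step can be sketched in a similar spirit. The paper uses a complex steerable pyramid; in the snippet below the dual-tree complex wavelet transform (the dtcwt package) stands in, since both produce oriented complex subbands with meaningful magnitude and phase. The chromaticity proxy (magnitude of Lab's a/b components) and the SSIM-based cross-scale comparison are assumptions for illustration, not the paper's exact definitions.

```python
# A minimal sketch of the complex-wavelet feature step, with the
# dual-tree complex wavelet transform standing in for the paper's
# complex steerable pyramid. Assumes dtcwt and scikit-image.
import numpy as np
import dtcwt
from skimage.color import rgb2lab
from skimage.transform import resize
from skimage.metrics import structural_similarity as ssim

def chromaticity_map(img_rgb):
    """A simple chromaticity proxy: magnitude of Lab's a/b components."""
    lab = rgb2lab(img_rgb)
    return np.hypot(lab[..., 1], lab[..., 2])

def complex_wavelet_features(img_rgb, nlevels=4):
    chroma = chromaticity_map(img_rgb)
    pyr = dtcwt.Transform2d().forward(chroma, nlevels=nlevels)
    feats, mags = [], []
    for level_bands in pyr.highpasses:          # complex, (H, W, 6) per level
        mag = np.abs(level_bands)
        phase = np.angle(level_bands)
        mags.append(mag.mean(axis=-1))          # orientation-pooled magnitude
        # magnitude and phase statistics as quality-aware features
        feats += [mag.mean(), mag.std(),
                  np.cos(phase).mean(), np.sin(phase).mean()]
    # cross-scale structural similarity between adjacent magnitude maps
    for fine, coarse in zip(mags[:-1], mags[1:]):
        coarse_up = resize(coarse, fine.shape)
        rng = max(fine.max() - fine.min(), 1e-8)
        feats.append(ssim(fine, coarse_up, data_range=rng))
    return np.asarray(feats)
```

In an evaluator of this kind, the sparse and complex-wavelet feature vectors would typically be concatenated and mapped to a quality score by a learned regressor; the abstract does not specify the regression model, so that stage is omitted here.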