{"title":"A Content-Aware Full-Reference Image Quality Assessment Method Using a Gram Matrix and Signal-to-Noise","authors":"Shuqi Han;Yueting Huang;Mingliang Zhou;Xuekai Wei;Fan Jia;Xu Zhuang;Fei Cheng;Tao Xiang;Yong Feng;Huayan Pu;Jun Luo","doi":"10.1109/TBC.2024.3410707","DOIUrl":null,"url":null,"abstract":"With the emergence of transformer-based feature extractors, the effect of image quality assessment (IQA) has improved, but its interpretability is limited. In addition, images repaired by generative adversarial networks (GANs) produce realistic textures and spatial misalignments with high-quality images. In this paper, we develop a content-aware full-reference IQA method without changing the original convolutional neural network feature extractor. First, image signal-to-noise (SNR) mapping is performed experimentally to verify its superior content-aware ability, and based on the SNR mapping of the reference image, we fuse multiscale distortion and normal image features according to a fusion strategy that enhances the informative area. Second, judging the quality of GAN-generated images from the perspective of focusing on content may ignore the alignment between pixels; therefore, we add a Gram-matrix-based texture enhancement module to boost the texture information between distorted and normal difference features. Finally, experiments on numerous public datasets prove the superior performance of the proposed method in predicting image quality.","PeriodicalId":13159,"journal":{"name":"IEEE Transactions on Broadcasting","volume":"70 4","pages":"1279-1291"},"PeriodicalIF":3.2000,"publicationDate":"2024-06-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Broadcasting","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10577449/","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0
Abstract
With the emergence of transformer-based feature extractors, the performance of image quality assessment (IQA) has improved, but its interpretability remains limited. In addition, images restored by generative adversarial networks (GANs) exhibit realistic textures but also spatial misalignments with respect to high-quality images. In this paper, we develop a content-aware full-reference IQA method without changing the original convolutional neural network feature extractor. First, signal-to-noise ratio (SNR) mapping of the image is performed experimentally to verify its superior content-aware ability, and based on the SNR map of the reference image, we fuse multiscale features of the distorted and normal images according to a fusion strategy that enhances the informative areas. Second, judging the quality of GAN-generated images solely from a content perspective may ignore pixel-level alignment; therefore, we add a Gram-matrix-based texture enhancement module to strengthen the texture information in the difference features between the distorted and normal images. Finally, experiments on numerous public datasets demonstrate the superior performance of the proposed method in predicting image quality.
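The abstract names two building blocks without giving their exact formulation: an SNR map of the reference image used to weight informative regions, and a Gram matrix over CNN feature maps used to capture texture. The sketch below is a minimal, illustrative rendering of those two ideas only; the function names, the block-wise mean-squared-over-variance SNR definition, and the random toy inputs are assumptions for demonstration, not the authors' implementation.

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a CNN feature map (channel-wise inner products).

    features: array of shape (C, H, W), channel-first.
    Returns a (C, C) matrix normalized by the number of spatial positions,
    the standard texture/style representation.
    """
    c, h, w = features.shape
    flat = features.reshape(c, h * w)       # (C, N) with N = H * W
    return flat @ flat.T / (h * w)          # (C, C)

def local_snr_map(ref, window=8, eps=1e-8):
    """Block-wise SNR map of a grayscale reference image (illustrative only).

    Each non-overlapping window gets SNR = mean**2 / variance, in dB.
    The paper's actual SNR mapping may be defined differently.
    """
    h, w = ref.shape
    hb, wb = h // window, w // window
    blocks = ref[:hb * window, :wb * window].reshape(hb, window, wb, window)
    mu = blocks.mean(axis=(1, 3))
    var = blocks.var(axis=(1, 3))
    return 10.0 * np.log10(mu ** 2 / (var + eps) + eps)

# Toy usage: random arrays stand in for CNN features and a reference image.
feat_ref = np.random.rand(64, 32, 32)
feat_dist = np.random.rand(64, 32, 32)
texture_gap = np.linalg.norm(gram_matrix(feat_ref) - gram_matrix(feat_dist))
snr = local_snr_map(np.random.rand(256, 256))
print(texture_gap, snr.shape)
```

In this reading, the SNR map would serve as a spatial weight when fusing multiscale distorted and reference features, while the Gram-matrix difference supplies a texture-sensitive term that is less affected by small pixel misalignments in GAN-restored images.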
About the Journal
The Society’s Field of Interest is “Devices, equipment, techniques and systems related to broadcast technology, including the production, distribution, transmission, and propagation aspects.” In addition to this formal FOI statement, which is used to provide guidance to the Publications Committee in the selection of content, the AdCom has further resolved that “broadcast systems includes all aspects of transmission, propagation, and reception.”