Blind image quality assessment by pairwise ranking image series

IF 3.1 | CAS Zone 3 (Computer Science) | Q2 TELECOMMUNICATIONS | China Communications | Pub Date: 2023-09-01 | DOI: 10.23919/JCC.2023.00.102
Li Xu, Xiuhua Jiang
{"title":"图像序列两两排序的盲图像质量评价","authors":"Li Xu, Xiuhua Jiang","doi":"10.23919/JCC.2023.00.102","DOIUrl":null,"url":null,"abstract":"Image quality assessment (IQA) is constantly innovating, but there are still three types of stickers that have not been resolved: the \"content sticker\" — limitation of training set, the \"annotation sticker\" — subjective instability in opinion scores and the \"distortion sticker\" — disordered distortion settings. In this paper, a No-Reference Image Quality Assessment (NR IQA) approach is proposed to deal with the problems. For \"content sticker\", we introduce the idea of pairwise comparison and generate a largescale ranking set to pre-train the network; For \"annotation sticker\", the absolute noise-containing subjective scores are transformed into ranking comparison results, and we design an indirect unsupervised regression based on Eigenvalue Decomposition (EVD); For \"distortion sticker\", we propose a perception-based distortion classification method, which makes the distortion types clear and refined. Experiments have proved that our NR IQA approach Experiments show that the algorithm performs well and has good generalization ability. Furthermore, the proposed perception based distortion classification method would be able to provide insights on how the visual related studies may be developed and to broaden our understanding of human visual system.","PeriodicalId":9814,"journal":{"name":"China Communications","volume":"20 1","pages":"127-143"},"PeriodicalIF":3.1000,"publicationDate":"2023-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Blind image quality assessment by pairwise ranking image series\",\"authors\":\"Li Xu, Xiuhua Jiang\",\"doi\":\"10.23919/JCC.2023.00.102\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Image quality assessment (IQA) is constantly innovating, but there are still three types of stickers that have not been resolved: the \\\"content sticker\\\" — limitation of training set, the \\\"annotation sticker\\\" — subjective instability in opinion scores and the \\\"distortion sticker\\\" — disordered distortion settings. In this paper, a No-Reference Image Quality Assessment (NR IQA) approach is proposed to deal with the problems. For \\\"content sticker\\\", we introduce the idea of pairwise comparison and generate a largescale ranking set to pre-train the network; For \\\"annotation sticker\\\", the absolute noise-containing subjective scores are transformed into ranking comparison results, and we design an indirect unsupervised regression based on Eigenvalue Decomposition (EVD); For \\\"distortion sticker\\\", we propose a perception-based distortion classification method, which makes the distortion types clear and refined. Experiments have proved that our NR IQA approach Experiments show that the algorithm performs well and has good generalization ability. 
Furthermore, the proposed perception based distortion classification method would be able to provide insights on how the visual related studies may be developed and to broaden our understanding of human visual system.\",\"PeriodicalId\":9814,\"journal\":{\"name\":\"China Communications\",\"volume\":\"20 1\",\"pages\":\"127-143\"},\"PeriodicalIF\":3.1000,\"publicationDate\":\"2023-09-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"China Communications\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.23919/JCC.2023.00.102\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"TELECOMMUNICATIONS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"China Communications","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.23919/JCC.2023.00.102","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"TELECOMMUNICATIONS","Score":null,"Total":0}
Citations: 0

Abstract

Image quality assessment (IQA) is constantly innovating, but three types of "stickers" remain unresolved: the "content sticker" (limitation of the training set), the "annotation sticker" (subjective instability in opinion scores), and the "distortion sticker" (disordered distortion settings). In this paper, a No-Reference Image Quality Assessment (NR IQA) approach is proposed to deal with these problems. For the "content sticker", we introduce the idea of pairwise comparison and generate a large-scale ranking set to pre-train the network. For the "annotation sticker", the absolute, noise-containing subjective scores are transformed into ranking comparison results, and we design an indirect unsupervised regression based on Eigenvalue Decomposition (EVD). For the "distortion sticker", we propose a perception-based distortion classification method, which makes the distortion types clear and refined. Experiments show that the proposed NR IQA approach performs well and has good generalization ability. Furthermore, the perception-based distortion classification method can provide insights into how visual-perception studies may develop and broaden our understanding of the human visual system.
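To make the pairwise pre-training idea concrete, below is a minimal sketch of a quality network trained with a margin ranking loss on image pairs whose relative quality is known by construction of the ranking set. The tiny CNN, the synthetic tensors, and the choice of loss are illustrative assumptions only; the abstract does not specify the paper's actual backbone, ranking-set generation, or training objective.

```python
# Minimal sketch: pairwise-ranking pre-training for NR-IQA.
# The backbone, data, and loss here are assumptions, not the paper's method.
import torch
import torch.nn as nn

class QualityNet(nn.Module):
    """Tiny CNN mapping an image to a scalar quality score."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 1)

    def forward(self, x):
        return self.head(self.features(x).flatten(1)).squeeze(1)

net = QualityNet()
criterion = nn.MarginRankingLoss(margin=0.5)   # hinge on score differences
optimizer = torch.optim.Adam(net.parameters(), lr=1e-4)

# One synthetic "ranked pair": img_a is known (e.g. lighter distortion)
# to be of higher quality than img_b.
img_a = torch.rand(8, 3, 64, 64)
img_b = torch.rand(8, 3, 64, 64)
target = torch.ones(8)                         # +1 means score(a) > score(b)

optimizer.zero_grad()
loss = criterion(net(img_a), net(img_b), target)
loss.backward()
optimizer.step()
```

Because the supervision is a comparison rather than an absolute score, such ranking pairs can be generated automatically at scale without subjective annotation, which is the point of the "content sticker" remedy.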
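For the EVD-based indirect regression, the abstract only states that ranking comparison results feed an unsupervised regression built on eigenvalue decomposition. The sketch below shows one standard spectral way to recover a global score vector from pairwise comparisons, using the principal eigenvector of a positive preference matrix; the win-count matrix and smoothing here are assumptions, and the paper's actual construction may differ.

```python
# Sketch: recovering quality scores from pairwise comparisons via EVD
# (principal-eigenvector ranking). Illustrative only.
import numpy as np

def scores_from_pairwise(wins: np.ndarray) -> np.ndarray:
    """wins[i, j] = number of times image i was judged better than image j."""
    # Reciprocal preference matrix: win ratios, smoothed to stay strictly
    # positive so Perron-Frobenius guarantees a real, positive eigenvector.
    pref = (wins + 1.0) / (wins.T + 1.0)
    np.fill_diagonal(pref, 1.0)
    eigvals, eigvecs = np.linalg.eig(pref)
    principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
    scores = np.abs(principal)
    return scores / scores.sum()                # normalised score vector

# Toy example: image 0 beats 1 and 2, image 1 beats 2.
wins = np.array([[0, 5, 5],
                 [0, 0, 5],
                 [0, 0, 0]], dtype=float)
print(scores_from_pairwise(wins))               # descending scores: 0 > 1 > 2
```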
Source journal: China Communications (Engineering & Technology, Telecommunications)
CiteScore: 8.00
Self-citation rate: 12.20%
Articles published: 2868
Average review time: 8.6 months
About the journal: China Communications (ISSN 1673-5447) is an English-language monthly journal cosponsored by the China Institute of Communications (CIC) and the IEEE Communications Society (IEEE ComSoc). It is aimed at readers in industry, universities, research and development organizations, and government agencies in the field of Information and Communications Technologies (ICTs) worldwide. The journal's main objective is to promote academic exchange in the ICT sector and publish high-quality papers that contribute to the global ICT industry. It provides instant access to the latest articles and papers, presenting leading-edge research achievements, tutorial overviews, and descriptions of significant practical applications of technology. China Communications has been indexed in SCIE (Science Citation Index Expanded) since January 2007, and all articles have been available in the IEEE Xplore digital library since January 2013.
Latest articles from this journal
Secure short-packet transmission in uplink massive MU-MIMO assisted URLLC under imperfect CSI
IoV and blockchain-enabled driving guidance strategy in complex traffic environment
Multi-source underwater DOA estimation using PSO-BP neural network based on high-order cumulant optimization
An overview of interactive immersive services
Performance analysis in SWIPT-based bidirectional D2D communications in cellular networks