Title: No-Reference Image Quality Assessment: An Attention Driven Approach
Authors: Diqi Chen, Yizhou Wang, Hongyu Ren, Wen Gao
Venue: 2019 IEEE Winter Conference on Applications of Computer Vision (WACV)
Publication date: 2019-01-01
DOI: 10.1109/WACV.2019.00046
Citations: 1
Abstract
In this paper, we tackle no-reference image quality assessment (NR-IQA), which aims to predict the perceptual quality of a test image without referencing its pristine-quality counterpart. The free-energy brain theory implies that the human visual system (HVS) tends to predict the pristine image while perceiving a distorted one. Moreover, image quality assessment depends heavily on how humans attend to distorted images. Motivated by these observations, we first restore the distorted image. Then, given the distorted-restored pair, we make the first attempt to formulate NR-IQA as a dynamic attentional process and implement it via reinforcement learning. The reward is derived from two tasks: classifying the distortion type and predicting the perceptual score of a test image. The model learns a policy to sample a sequence of fixation areas with the goal of maximizing the expected accumulated reward. The observations of the fixation areas are aggregated through a recurrent neural network (RNN) and a robust averaging strategy that assigns different weights to different fixation areas. Extensive experiments on TID2008, TID2013 and CSIQ demonstrate the superiority of our method.
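The "robust averaging" step above can be illustrated with a minimal sketch: per-fixation feature vectors (e.g., RNN hidden states) are pooled with learned softmax weights rather than a plain mean, so informative fixation areas dominate the aggregate. This is an illustrative simplification, not the paper's exact formulation; the function name, the linear scoring vector `w_score`, and the temperature parameter are all hypothetical.

```python
import numpy as np

def robust_average(features, w_score, temperature=1.0):
    """Pool per-fixation features with softmax weights (illustrative sketch).

    features : (T, D) array, one D-dim feature per fixation area
    w_score  : (D,) hypothetical learned scoring vector
    Returns the weighted-average feature (D,) and the weights (T,).
    """
    scores = (features @ w_score) / temperature        # one scalar score per fixation
    weights = np.exp(scores - scores.max())            # numerically stable softmax
    weights = weights / weights.sum()
    return weights @ features, weights

# Toy usage: 5 fixation areas, 8-dim features.
rng = np.random.default_rng(0)
features = rng.standard_normal((5, 8))
w_score = rng.standard_normal(8)
pooled, weights = robust_average(features, w_score)
```

Unlike uniform averaging, fixations whose features score higher under `w_score` receive larger weights, which is the behavior the abstract attributes to the robust averaging strategy.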