Rate-Distortion-Perception Tradeoff Based on the Conditional-Distribution Perception Measure
Sadaf Salehkalaibar, Jun Chen, Ashish Khisti, Wei Yu
arXiv - CS - Information Theory, arXiv:2401.12207, published 2024-01-22

Abstract
We study the rate-distortion-perception (RDP) tradeoff for a memoryless
source model in the asymptotic limit of large block-lengths. Our perception
measure is based on a divergence between the distributions of the source and
reconstruction sequences conditioned on the encoder output, which was first
proposed in [1], [2]. We consider the case when there is no shared randomness
between the encoder and the decoder. For the case of discrete memoryless
sources, we derive a single-letter characterization of the RDP function, thus
settling a problem that remains open for the marginal metric introduced in Blau
and Michaeli [3] (with no shared randomness). Our achievability scheme is based
on lossy source coding with a posterior reference map proposed in [4]. For the
case of continuous-valued sources under the squared-error distortion measure and
the squared quadratic Wasserstein perception measure, we also derive a single-letter
characterization and show that a noise-adding mechanism at the decoder suffices
to achieve the optimal representation. For the case of zero perception loss, we
show that our characterization interestingly coincides with the results for the
marginal metric derived in [5], [6] and again demonstrate that zero perception
loss can be achieved with a $3$-dB penalty in the minimum distortion. Finally,
we specialize our results to the case of Gaussian sources. We derive the RDP
function for vector Gaussian sources and propose a waterfilling-type solution.
We also partially characterize the RDP function for a mixture of vector
Gaussians.
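
The $3$-dB penalty can be illustrated numerically for a scalar Gaussian source. The sketch below is not the paper's construction; it assumes the classical Gaussian distortion-rate function $D(R)=\sigma^2 2^{-2R}$ and the known marginal-metric construction in which the decoder rescales the MMSE estimate so that the reconstruction exactly matches the source marginal $\mathcal{N}(0,\sigma^2)$ (zero Wasserstein perception loss). Under these assumptions the distortion becomes $2\sigma^2(1-\sqrt{1-2^{-2R}})$, which exceeds $D(R)$ by at most a factor of two, i.e., at most $3$ dB:

```python
import math

def classical_distortion(rate, sigma2=1.0):
    # Classical distortion-rate function of a scalar Gaussian source
    # under squared error: D(R) = sigma^2 * 2^(-2R).
    return sigma2 * 2.0 ** (-2.0 * rate)

def perfect_perception_distortion(rate, sigma2=1.0):
    # Hypothetical zero-perception-loss decoder (assumed construction):
    # rescale the MMSE estimate, whose variance is sigma^2 * (1 - 2^(-2R)),
    # so the reconstruction marginal is exactly N(0, sigma^2). The resulting
    # mean squared error is 2 * sigma^2 * (1 - sqrt(1 - 2^(-2R))).
    eps = 2.0 ** (-2.0 * rate)
    return 2.0 * sigma2 * (1.0 - math.sqrt(1.0 - eps))

if __name__ == "__main__":
    # The penalty in dB stays below 3 dB for all positive rates and
    # approaches 3 dB only as the rate tends to zero.
    for rate in (0.25, 0.5, 1.0, 2.0, 4.0):
        d = classical_distortion(rate)
        dp = perfect_perception_distortion(rate)
        penalty_db = 10.0 * math.log10(dp / d)
        print(f"R={rate:4.2f}  D(R)={d:.4f}  D(R, P=0)={dp:.4f}  "
              f"penalty={penalty_db:.2f} dB")
```

Writing $\epsilon = 2^{-2R}$, the penalty ratio is $2(1-\sqrt{1-\epsilon})/\epsilon$, which lies in $(1, 2]$ and attains $2$ (exactly $3$ dB) only in the limit $R \to 0$, consistent with the statement above that zero perception loss costs at most $3$ dB in distortion.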