Rate-Distortion-Perception Tradeoff Based on the Conditional-Distribution Perception Measure

Sadaf Salehkalaibar, Jun Chen, Ashish Khisti, Wei Yu
{"title":"基于条件分布感知测量的速率-失真-感知权衡","authors":"Sadaf Salehkalaibar, Jun Chen, Ashish Khisti, Wei Yu","doi":"arxiv-2401.12207","DOIUrl":null,"url":null,"abstract":"We study the rate-distortion-perception (RDP) tradeoff for a memoryless\nsource model in the asymptotic limit of large block-lengths. Our perception\nmeasure is based on a divergence between the distributions of the source and\nreconstruction sequences conditioned on the encoder output, which was first\nproposed in [1], [2]. We consider the case when there is no shared randomness\nbetween the encoder and the decoder. For the case of discrete memoryless\nsources we derive a single-letter characterization of the RDP function, thus\nsettling a problem that remains open for the marginal metric introduced in Blau\nand Michaeli [3] (with no shared randomness). Our achievability scheme is based\non lossy source coding with a posterior reference map proposed in [4]. For the\ncase of continuous valued sources under squared error distortion measure and\nsquared quadratic Wasserstein perception measure we also derive a single-letter\ncharacterization and show that a noise-adding mechanism at the decoder suffices\nto achieve the optimal representation. For the case of zero perception loss, we\nshow that our characterization interestingly coincides with the results for the\nmarginal metric derived in [5], [6] and again demonstrate that zero perception\nloss can be achieved with a $3$-dB penalty in the minimum distortion. Finally\nwe specialize our results to the case of Gaussian sources. We derive the RDP\nfunction for vector Gaussian sources and propose a waterfilling type solution.\nWe also partially characterize the RDP function for a mixture of vector\nGaussians.","PeriodicalId":501433,"journal":{"name":"arXiv - CS - Information Theory","volume":"11 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-01-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Rate-Distortion-Perception Tradeoff Based on the Conditional-Distribution Perception Measure\",\"authors\":\"Sadaf Salehkalaibar, Jun Chen, Ashish Khisti, Wei Yu\",\"doi\":\"arxiv-2401.12207\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We study the rate-distortion-perception (RDP) tradeoff for a memoryless\\nsource model in the asymptotic limit of large block-lengths. Our perception\\nmeasure is based on a divergence between the distributions of the source and\\nreconstruction sequences conditioned on the encoder output, which was first\\nproposed in [1], [2]. We consider the case when there is no shared randomness\\nbetween the encoder and the decoder. For the case of discrete memoryless\\nsources we derive a single-letter characterization of the RDP function, thus\\nsettling a problem that remains open for the marginal metric introduced in Blau\\nand Michaeli [3] (with no shared randomness). Our achievability scheme is based\\non lossy source coding with a posterior reference map proposed in [4]. For the\\ncase of continuous valued sources under squared error distortion measure and\\nsquared quadratic Wasserstein perception measure we also derive a single-letter\\ncharacterization and show that a noise-adding mechanism at the decoder suffices\\nto achieve the optimal representation. 
For the case of zero perception loss, we\\nshow that our characterization interestingly coincides with the results for the\\nmarginal metric derived in [5], [6] and again demonstrate that zero perception\\nloss can be achieved with a $3$-dB penalty in the minimum distortion. Finally\\nwe specialize our results to the case of Gaussian sources. We derive the RDP\\nfunction for vector Gaussian sources and propose a waterfilling type solution.\\nWe also partially characterize the RDP function for a mixture of vector\\nGaussians.\",\"PeriodicalId\":501433,\"journal\":{\"name\":\"arXiv - CS - Information Theory\",\"volume\":\"11 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-01-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - CS - Information Theory\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2401.12207\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Information Theory","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2401.12207","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

We study the rate-distortion-perception (RDP) tradeoff for a memoryless source model in the asymptotic limit of large block-lengths. Our perception measure is based on a divergence between the distributions of the source and reconstruction sequences conditioned on the encoder output, which was first proposed in [1], [2]. We consider the case when there is no shared randomness between the encoder and the decoder. For the case of discrete memoryless sources we derive a single-letter characterization of the RDP function, thus settling a problem that remains open for the marginal metric introduced in Blau and Michaeli [3] (with no shared randomness). Our achievability scheme is based on lossy source coding with a posterior reference map proposed in [4]. For the case of continuous-valued sources under the squared error distortion measure and the squared quadratic Wasserstein perception measure, we also derive a single-letter characterization and show that a noise-adding mechanism at the decoder suffices to achieve the optimal representation. For the case of zero perception loss, we show that our characterization interestingly coincides with the results for the marginal metric derived in [5], [6], and again demonstrate that zero perception loss can be achieved with a 3-dB penalty in the minimum distortion. Finally, we specialize our results to the case of Gaussian sources. We derive the RDP function for vector Gaussian sources and propose a waterfilling-type solution. We also partially characterize the RDP function for a mixture of vector Gaussians.
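To make the distinction between the two perception constraints concrete, the following is a rough sketch in our own notation; the paper's exact definitions, choice of divergence, and normalization may differ. With M denoting the encoder output, the conditional-distribution measure of [1], [2] constrains the posteriors of the source and the reconstruction given M,

\mathbb{E}_{M}\left[ d\left( P_{X^n \mid M},\ P_{\hat{X}^n \mid M} \right) \right] \le \lambda ,

whereas the marginal metric of Blau and Michaeli [3] constrains only the unconditional laws,

d\left( P_{X^n},\ P_{\hat{X}^n} \right) \le \lambda ,

where d(·,·) is a divergence between distributions and \lambda is the permitted perception loss. For a jointly convex d, Jensen's inequality shows that the conditional constraint implies the marginal one, so it is at least as demanding.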
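For the Gaussian claims, a small numerical illustration may help. The Python sketch below is ours, not taken from the paper: it assumes a unit-variance scalar Gaussian source, the classical distortion-rate function D(R) = sigma^2 * 2^(-2R) under squared error, the standard closed form of the squared quadratic Wasserstein distance between two scalar Gaussians, and the abstract's statement that zero perception loss costs a factor-of-2 increase in the minimum distortion.

import math

def classical_distortion_rate(rate_bits: float, sigma2: float = 1.0) -> float:
    # Perception-unconstrained distortion-rate function of a scalar Gaussian
    # source under squared error: D(R) = sigma^2 * 2^(-2R).
    return sigma2 * 2.0 ** (-2.0 * rate_bits)

def w2_squared_scalar_gaussians(mu1: float, var1: float,
                                mu2: float, var2: float) -> float:
    # Squared quadratic Wasserstein distance between N(mu1, var1) and N(mu2, var2):
    # W2^2 = (mu1 - mu2)^2 + (sqrt(var1) - sqrt(var2))^2.
    return (mu1 - mu2) ** 2 + (math.sqrt(var1) - math.sqrt(var2)) ** 2

rate = 1.0  # bits per source symbol
d_unconstrained = classical_distortion_rate(rate)
d_zero_perception = 2.0 * d_unconstrained  # factor-of-2 penalty stated in the abstract
print(f"D(R)                                  : {d_unconstrained:.4f}")
print(f"D(R) with zero perception loss (2*D)  : {d_zero_perception:.4f}")
print(f"penalty                               : {10.0 * math.log10(2.0):.2f} dB")  # ~3.01 dB
# The perception measure itself, evaluated on two example Gaussian laws:
print(f"W2^2(N(0,1), N(0,0.25)) = {w2_squared_scalar_gaussians(0.0, 1.0, 0.0, 0.25):.4f}")

A factor-of-2 increase in mean squared error corresponds to 10*log10(2), roughly 3.01 dB, which is how we read the abstract's 3-dB penalty.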