Capacity of Noisy Permutation Channels
Jennifer Tang, Yury Polyanskiy
2022 IEEE International Symposium on Information Theory (ISIT), June 26, 2022
DOI: 10.1109/ISIT50566.2022.9834509
Citations: 7
Abstract
We establish the capacity of a class of communication channels introduced in [2]. The n-letter input from a finite alphabet is passed through a discrete memoryless channel $P_{Z\mid X}$, and the output n-letter sequence is then uniformly permuted. We show that the maximal communication rate (normalized by log n) equals $\frac{1}{2}\left( {\operatorname{rank} \left( {{P_{Z\mid X}}} \right) - 1} \right)$ whenever $P_{Z\mid X}$ is strictly positive. This is done by establishing a converse bound matching the achievability of [2]. The two main ingredients of our proof are (1) a sharp bound on the entropy of a vector sampled uniformly from a type class and observed through a DMC, and (2) a covering ε-net of the probability simplex with Kullback-Leibler divergence as the metric. In addition to strictly positive DMCs, we also find the noisy permutation capacity for q-ary erasure channels, the Z-channel, and others.
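The capacity formula for strictly positive channels can be illustrated with a minimal sketch (not code from the paper): given a row-stochastic transition matrix $P_{Z\mid X}$ with all entries positive, the coefficient of log n in the capacity is (rank − 1)/2. The function name `permutation_capacity_coeff` is our own choice for illustration.

```python
# Illustrative sketch, assuming only the rank formula stated in the
# abstract: for a strictly positive DMC with transition matrix P
# (rows = inputs, columns = outputs), the noisy permutation channel
# capacity is (rank(P) - 1)/2 in units of log n.
import numpy as np

def permutation_capacity_coeff(P):
    """Return (rank(P) - 1) / 2, the coefficient of log n."""
    P = np.asarray(P, dtype=float)
    assert np.all(P > 0), "formula requires a strictly positive DMC"
    assert np.allclose(P.sum(axis=1), 1.0), "rows must be distributions"
    return (np.linalg.matrix_rank(P) - 1) / 2

# Binary symmetric channel with crossover 0.1: rank 2, so the rate
# is (1/2) log n and the coefficient is 0.5.
bsc = [[0.9, 0.1], [0.1, 0.9]]
print(permutation_capacity_coeff(bsc))  # 0.5
```

Note that this coefficient applies only to strictly positive matrices; the erasure and Z-channel cases treated in the paper are not covered by this formula.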