{"title":"关于D-ary范诺码","authors":"F. Cicalese, Eros Rossi","doi":"10.1109/ISIT44484.2020.9174023","DOIUrl":null,"url":null,"abstract":"We define a D-ary Fano code based on a natural generalization of the splitting criterion of the binary Fano code to the case of D-ary code. We show that this choice allows for an efficient computation of the code tree and also leads to a strong guarantee with respect to the redundancy of the resulting code: for any source distribution p = p1,… pn1) for D = 2, 3,4 the resulting code satisfies\\begin{equation*}\\bar L - {H_D}({\\mathbf{p}}) \\leq 1 - {p_{\\min }}, \\tag{1}\\end{equation*}where $\\bar L$ is the average codeword length, pmin = mini pi, and ${H_D}({\\mathbf{p}}) = \\sum\\nolimits_{i = 1}^n {{p_i}{{\\log }_D}\\frac{1}{{{p_i}}}} $ (the D-ary entropy);2) inequality (1) holds for every D ≥ 2 whenever every internal node has exactly D children in the code tree produced by our construction.We also formulate a conjecture on the basic step applied by our algorithm in each internal node of the code tree, that, if true, would imply that the bound in (1) is actually achieved for all D ≥ 2 without the restriction of item 2.","PeriodicalId":159311,"journal":{"name":"2020 IEEE International Symposium on Information Theory (ISIT)","volume":"43 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"On D-ary Fano Codes\",\"authors\":\"F. Cicalese, Eros Rossi\",\"doi\":\"10.1109/ISIT44484.2020.9174023\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We define a D-ary Fano code based on a natural generalization of the splitting criterion of the binary Fano code to the case of D-ary code. 
We show that this choice allows for an efficient computation of the code tree and also leads to a strong guarantee with respect to the redundancy of the resulting code: for any source distribution p = p1,… pn1) for D = 2, 3,4 the resulting code satisfies\\\\begin{equation*}\\\\bar L - {H_D}({\\\\mathbf{p}}) \\\\leq 1 - {p_{\\\\min }}, \\\\tag{1}\\\\end{equation*}where $\\\\bar L$ is the average codeword length, pmin = mini pi, and ${H_D}({\\\\mathbf{p}}) = \\\\sum\\\\nolimits_{i = 1}^n {{p_i}{{\\\\log }_D}\\\\frac{1}{{{p_i}}}} $ (the D-ary entropy);2) inequality (1) holds for every D ≥ 2 whenever every internal node has exactly D children in the code tree produced by our construction.We also formulate a conjecture on the basic step applied by our algorithm in each internal node of the code tree, that, if true, would imply that the bound in (1) is actually achieved for all D ≥ 2 without the restriction of item 2.\",\"PeriodicalId\":159311,\"journal\":{\"name\":\"2020 IEEE International Symposium on Information Theory (ISIT)\",\"volume\":\"43 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-06-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2020 IEEE International Symposium on Information Theory (ISIT)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ISIT44484.2020.9174023\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 IEEE International Symposium on Information Theory (ISIT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISIT44484.2020.9174023","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
We define a D-ary Fano code based on a natural generalization of the splitting criterion of the binary Fano code to the case of D-ary codes. We show that this choice allows for an efficient computation of the code tree and also leads to a strong guarantee on the redundancy of the resulting code: for any source distribution $\mathbf{p} = p_1, \dots, p_n$,

1) for D = 2, 3, 4 the resulting code satisfies
\begin{equation*}
\bar{L} - H_D(\mathbf{p}) \leq 1 - p_{\min}, \tag{1}
\end{equation*}
where $\bar{L}$ is the average codeword length, $p_{\min} = \min_i p_i$, and $H_D(\mathbf{p}) = \sum_{i=1}^{n} p_i \log_D \frac{1}{p_i}$ is the D-ary entropy;

2) inequality (1) holds for every D ≥ 2 whenever every internal node has exactly D children in the code tree produced by our construction.

We also formulate a conjecture on the basic step applied by our algorithm in each internal node of the code tree that, if true, would imply that the bound in (1) is actually achieved for all D ≥ 2 without the restriction of item 2.
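To give a concrete feel for the kind of construction the abstract refers to, the sketch below builds a D-ary prefix code by recursively cutting the symbols, sorted by decreasing probability, into up to D contiguous groups of roughly equal total probability, then compares the resulting average length with the D-ary entropy and with the 1 - p_min bound of (1). Everything here is an assumption made for illustration: the greedy splitting rule, the helper names (split_into_groups, build_code, d_ary_entropy), and the toy distribution are not taken from the paper, whose actual splitting criterion (and hence its redundancy guarantee) is not reproduced in the abstract.

```python
import math

# Illustrative sketch only: a classical Fano-style splitting heuristic
# generalized to a D-ary alphabet. This is NOT the splitting criterion of
# Cicalese and Rossi (the abstract does not specify it), so this heuristic
# does not carry the paper's redundancy guarantee.

def split_into_groups(items, probs, D):
    """Greedily cut `items` (sorted by decreasing probability) into at most D
    contiguous, non-empty groups of roughly equal total probability."""
    k = min(D, len(items))
    remaining = sum(probs[i] for i in items)
    groups, start = [], 0
    for g in range(k):
        groups_left = k - g
        if groups_left == 1:
            end = len(items)                      # last group takes the rest
        else:
            # Leave at least one symbol for each group still to be formed.
            max_end = len(items) - (groups_left - 1)
            target = remaining / groups_left
            acc, end = probs[items[start]], start + 1
            # Extend the group while the next symbol brings it closer to the target mass.
            while end < max_end and abs(acc + probs[items[end]] - target) <= abs(acc - target):
                acc += probs[items[end]]
                end += 1
        group = items[start:end]
        remaining -= sum(probs[i] for i in group)
        groups.append(group)
        start = end
    return groups

def build_code(items, probs, D, prefix, codes):
    """Recursively assign D-ary codewords (strings over the digits 0..D-1)."""
    if len(items) == 1:
        codes[items[0]] = prefix or "0"
        return
    for digit, group in enumerate(split_into_groups(items, probs, D)):
        build_code(group, probs, D, prefix + str(digit), codes)

def d_ary_entropy(probs, D):
    """H_D(p) = sum_i p_i * log_D(1 / p_i)."""
    return sum(p * math.log(1.0 / p, D) for p in probs if p > 0)

if __name__ == "__main__":
    p = [0.35, 0.20, 0.15, 0.10, 0.08, 0.07, 0.05]   # toy source distribution
    D = 3
    order = sorted(range(len(p)), key=lambda i: -p[i])
    codes = {}
    build_code(order, p, D, "", codes)
    avg_len = sum(p[i] * len(codes[i]) for i in range(len(p)))
    print("codewords:     ", codes)
    print("average length:", round(avg_len, 3))
    print("H_D(p):        ", round(d_ary_entropy(p, D), 3))
    print("1 - p_min:     ", 1 - min(p))
```

On this toy distribution the printed redundancy (average length minus H_D(p)) stays well below 1 - p_min, which matches the flavor of bound (1); this is only a sanity check of the heuristic sketch above, not a reproduction of the paper's algorithm or its guarantee.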