{"title":"Uniform Estimates on Length of Programs and Computing Algorithmic Complexities for Quantitative Information Measures","authors":"Rohit Kumar Verma, M. B. Laxmi","doi":"10.9734/jamcs/2024/v39i51889","DOIUrl":null,"url":null,"abstract":"Shannon entropy and Kolmogorov complexity are two conceptually distinct information metrics since the latter is based on probability distributions while the former is based on program size. All recursive probability distributions, however, are known to have an expected Up to a constant that solely depends on the distribution, the Kolmogorov complexity value is equal to its Shannon entropy. We investigate if a comparable correlation exists between Renyi and Havrda- Charvat Entropy entropies order α, indicating that it is consistent solely with Renyi and Havrda- Charvat entropies of order 1.\nKolmogorov noted that the characteristics of Shannon entropy and algorithmic complexity are comparable. We examine a single facet of this resemblance. Specifically, linear inequalities that hold true for Shannon entropy and for Kolmogorov complexity. As it happens, the following are true: (1) all linear inequalities that hold true for Shannon entropy and vice versa for Kolmogorov complexity; (2) all linear inequalities that hold true for ranks of finite subsets of linear spaces for Shannon entropy; and (3) the reverse is untrue.","PeriodicalId":503149,"journal":{"name":"Journal of Advances in Mathematics and Computer Science","volume":"68 S11","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-04-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Advances in Mathematics and Computer Science","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.9734/jamcs/2024/v39i51889","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Shannon entropy and Kolmogorov complexity are two conceptually distinct information measures: the former is defined in terms of probability distributions, while the latter is defined in terms of program size. It is known, however, that for every recursive probability distribution the expected value of Kolmogorov complexity equals the Shannon entropy, up to an additive constant that depends only on the distribution. We investigate whether a comparable relationship holds for Rényi and Havrda-Charvát entropies of order α, and show that it holds only for the Rényi and Havrda-Charvát entropies of order 1.
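To make the order-1 claim concrete, the following minimal Python sketch compares the three measures on a small distribution. It assumes the standard base-2 forms of the Rényi and Havrda-Charvát entropies (the paper's exact normalization may differ), and the distribution p is a hypothetical example chosen for illustration, not taken from the paper.

```python
import numpy as np

# Standard base-2 definitions (an assumption; the paper's normalization
# may differ):
#   Shannon:         H(P)   = -sum_i p_i log2 p_i
#   Renyi:           H_a(P) = log2(sum_i p_i^a) / (1 - a),        a != 1
#   Havrda-Charvat:  H_a(P) = (sum_i p_i^a - 1) / (2^(1-a) - 1),  a != 1

def shannon(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # drop zero-probability outcomes
    return -np.sum(p * np.log2(p))

def renyi(p, a):
    p = np.asarray(p, dtype=float)
    return np.log2(np.sum(p ** a)) / (1.0 - a)

def havrda_charvat(p, a):
    p = np.asarray(p, dtype=float)
    return (np.sum(p ** a) - 1.0) / (2.0 ** (1.0 - a) - 1.0)

p = [0.5, 0.25, 0.125, 0.125]  # hypothetical example distribution
print(f"Shannon: {shannon(p):.4f} bits")  # 1.7500
for a in (0.5, 0.999, 1.001, 2.0):
    print(f"a={a}: Renyi={renyi(p, a):.4f}  "
          f"Havrda-Charvat={havrda_charvat(p, a):.4f}")
# Both families converge to the Shannon value as a -> 1 (by L'Hopital's
# rule) and move away from it otherwise, consistent with the claim that
# the expected-complexity correspondence is special to order 1.
```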
Kolmogorov observed that Shannon entropy and algorithmic complexity have similar properties. We examine one facet of this resemblance: the linear inequalities valid for Shannon entropy and those valid for Kolmogorov complexity. It turns out that (1) the linear inequalities valid for Shannon entropy are exactly those valid for Kolmogorov complexity; (2) every linear inequality valid for ranks of finite subsets of linear spaces is also valid for Shannon entropy; and (3) the converse of (2) does not hold.
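For concreteness, one inequality that holds in all three settings is submodularity, while Ingleton's inequality is the standard witness that claim (3) cannot be reversed. The LaTeX sketch below records these examples under the usual definitions; the specific inequalities are illustrative and are not taken from the paper.

```latex
% Submodularity holds in all three settings (for Kolmogorov complexity,
% up to a logarithmic additive term in the length n of the strings):
\begin{align*}
  H(X,Y) + K := H(Y,Z) &\ge H(X,Y,Z) + H(Y), \\
  K(x,y) + K(y,z) &\ge K(x,y,z) + K(y) - O(\log n), \\
  \operatorname{rk}(A \cup B) + \operatorname{rk}(B \cup C)
    &\ge \operatorname{rk}(A \cup B \cup C) + \operatorname{rk}(B).
\end{align*}
% Ingleton's inequality, by contrast, holds for ranks of subsets of a
% linear space but can fail for Shannon entropy, which witnesses (3):
\begin{equation*}
  I(A;B) \le I(A;B \mid C) + I(A;B \mid D) + I(C;D).
\end{equation*}
```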