Uniform Estimates on Length of Programs and Computing Algorithmic Complexities for Quantitative Information Measures

Rohit Kumar Verma, M. B. Laxmi
{"title":"程序长度的统一估算和定量信息度量的计算算法复杂性","authors":"Rohit Kumar Verma, M. B. Laxmi","doi":"10.9734/jamcs/2024/v39i51889","DOIUrl":null,"url":null,"abstract":"Shannon entropy and Kolmogorov complexity are two conceptually distinct information metrics since the latter is based on probability distributions while the former is based on program size. All recursive probability distributions, however, are known to have an expected Up to a constant that solely depends on the distribution, the Kolmogorov complexity value is equal to its Shannon entropy. We investigate if a comparable correlation exists between Renyi and Havrda- Charvat Entropy entropies order α, indicating that it is consistent solely with Renyi and Havrda- Charvat entropies of order 1.\nKolmogorov noted that the characteristics of Shannon entropy and algorithmic complexity are comparable. We examine a single facet of this resemblance. Specifically, linear inequalities that hold true for Shannon entropy and for Kolmogorov complexity. As it happens, the following are true: (1) all linear inequalities that hold true for Shannon entropy and vice versa for Kolmogorov complexity; (2) all linear inequalities that hold true for ranks of finite subsets of linear spaces for Shannon entropy; and (3) the reverse is untrue.","PeriodicalId":503149,"journal":{"name":"Journal of Advances in Mathematics and Computer Science","volume":"68 S11","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-04-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Uniform Estimates on Length of Programs and Computing Algorithmic Complexities for Quantitative Information Measures\",\"authors\":\"Rohit Kumar Verma, M. B. Laxmi\",\"doi\":\"10.9734/jamcs/2024/v39i51889\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Shannon entropy and Kolmogorov complexity are two conceptually distinct information metrics since the latter is based on probability distributions while the former is based on program size. All recursive probability distributions, however, are known to have an expected Up to a constant that solely depends on the distribution, the Kolmogorov complexity value is equal to its Shannon entropy. We investigate if a comparable correlation exists between Renyi and Havrda- Charvat Entropy entropies order α, indicating that it is consistent solely with Renyi and Havrda- Charvat entropies of order 1.\\nKolmogorov noted that the characteristics of Shannon entropy and algorithmic complexity are comparable. We examine a single facet of this resemblance. Specifically, linear inequalities that hold true for Shannon entropy and for Kolmogorov complexity. 
As it happens, the following are true: (1) all linear inequalities that hold true for Shannon entropy and vice versa for Kolmogorov complexity; (2) all linear inequalities that hold true for ranks of finite subsets of linear spaces for Shannon entropy; and (3) the reverse is untrue.\",\"PeriodicalId\":503149,\"journal\":{\"name\":\"Journal of Advances in Mathematics and Computer Science\",\"volume\":\"68 S11\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-04-12\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Advances in Mathematics and Computer Science\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.9734/jamcs/2024/v39i51889\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Advances in Mathematics and Computer Science","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.9734/jamcs/2024/v39i51889","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

Shannon entropy and Kolmogorov complexity are two conceptually distinct information measures: the former is defined in terms of probability distributions, while the latter is defined in terms of program size. It is known, however, that for every recursive probability distribution the expected Kolmogorov complexity equals the Shannon entropy, up to an additive constant that depends only on the distribution. We investigate whether a comparable relationship holds for Rényi and Havrda-Charvát entropies of order α, and find that it holds only for the Rényi and Havrda-Charvát entropies of order 1.

Kolmogorov noted that Shannon entropy and algorithmic complexity share many properties. We examine one facet of this resemblance: the linear inequalities that are valid for Shannon entropy and for Kolmogorov complexity. It turns out that (1) every linear inequality valid for Shannon entropy is also valid for Kolmogorov complexity, and vice versa; (2) every linear inequality valid for Shannon entropy is also valid for ranks of finite subsets of linear spaces; and (3) the converse of (2) does not hold.
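The abstract's comparison of Shannon entropy with the order-α entropy families can be illustrated numerically. The following minimal Python sketch is not from the paper; the function names and example distributions are our own illustrative choices. It computes Shannon, Rényi, and Havrda-Charvát entropies for a small discrete distribution, shows that both generalized entropies approach the Shannon value as α → 1, and checks one classical linear inequality for Shannon entropy (subadditivity), of the kind discussed in the second part of the abstract.

import numpy as np

def shannon_entropy(p):
    # Shannon entropy in bits: H(P) = -sum_i p_i log2 p_i.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def renyi_entropy(p, alpha):
    # Renyi entropy of order alpha: H_a(P) = log2(sum_i p_i^a) / (1 - a),
    # which tends to the Shannon entropy as alpha -> 1.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return shannon_entropy(p)
    return float(np.log2(np.sum(p ** alpha)) / (1.0 - alpha))

def havrda_charvat_entropy(p, alpha):
    # Havrda-Charvat structural alpha-entropy:
    # H_a(P) = (sum_i p_i^a - 1) / (2^(1-a) - 1),
    # which also tends to the Shannon entropy as alpha -> 1.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return shannon_entropy(p)
    return float((np.sum(p ** alpha) - 1.0) / (2.0 ** (1.0 - alpha) - 1.0))

if __name__ == "__main__":
    dist = [0.5, 0.25, 0.125, 0.125]   # arbitrary example distribution
    print(f"Shannon entropy: {shannon_entropy(dist):.6f} bits")
    for a in (0.5, 0.99, 1.001, 2.0):
        print(f"alpha={a:<5}: Renyi={renyi_entropy(dist, a):.6f}, "
              f"Havrda-Charvat={havrda_charvat_entropy(dist, a):.6f}")

    # A classical linear inequality valid for Shannon entropy, subadditivity
    # H(X) + H(Y) >= H(X, Y), checked on a small joint distribution.
    joint = np.array([[0.3, 0.1],
                      [0.1, 0.5]])
    h_x = shannon_entropy(joint.sum(axis=1))
    h_y = shannon_entropy(joint.sum(axis=0))
    h_xy = shannon_entropy(joint.ravel())
    print(f"H(X)+H(Y)={h_x + h_y:.6f} >= H(X,Y)={h_xy:.6f}:",
          h_x + h_y >= h_xy - 1e-12)

For α far from 1 the three measures diverge, while near α = 1 both generalized entropies collapse to the Shannon value; this is the numerical counterpart of the order-1 restriction stated in the abstract.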