{"title":"度量空间的最大熵","authors":"Tom Leinster;Emily Roff","doi":"10.1093/qmath/haab003","DOIUrl":null,"url":null,"abstract":"We define a one-parameter family of entropies, each assigning a real number to any probability measure on a compact metric space (or, more generally, a compact Hausdorff space with a notion of similarity between points). These entropies generalise the Shannon and Renyi entropies of information theory. We prove that on any space X, there is a single probability measure maximising all these entropies simultaneously. Moreover, all the entropies have the same maximum value: the maximum entropy of X. As X is scaled up, the maximum entropy grows; its asymptotics determine geometric information about X, including the volume and dimension. We also study the large-scale limit of the maximising measure itself, arguing that it should be regarded as the canonical or uniform measure on X. Primarily we work not with entropy itself but its exponential, called diversity and (in its finite form) used as a measure of biodiversity. Our main theorem was first proved in the finite case by Leinster and Meckes.","PeriodicalId":54522,"journal":{"name":"Quarterly Journal of Mathematics","volume":"72 4","pages":"1271-1309"},"PeriodicalIF":0.6000,"publicationDate":"2021-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/iel7/8016816/9690900/09690907.pdf","citationCount":"10","resultStr":"{\"title\":\"The Maximum Entropy of a Metric Space\",\"authors\":\"Tom Leinster;Emily Roff\",\"doi\":\"10.1093/qmath/haab003\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We define a one-parameter family of entropies, each assigning a real number to any probability measure on a compact metric space (or, more generally, a compact Hausdorff space with a notion of similarity between points). These entropies generalise the Shannon and Renyi entropies of information theory. We prove that on any space X, there is a single probability measure maximising all these entropies simultaneously. Moreover, all the entropies have the same maximum value: the maximum entropy of X. As X is scaled up, the maximum entropy grows; its asymptotics determine geometric information about X, including the volume and dimension. We also study the large-scale limit of the maximising measure itself, arguing that it should be regarded as the canonical or uniform measure on X. Primarily we work not with entropy itself but its exponential, called diversity and (in its finite form) used as a measure of biodiversity. 
Our main theorem was first proved in the finite case by Leinster and Meckes.\",\"PeriodicalId\":54522,\"journal\":{\"name\":\"Quarterly Journal of Mathematics\",\"volume\":\"72 4\",\"pages\":\"1271-1309\"},\"PeriodicalIF\":0.6000,\"publicationDate\":\"2021-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://ieeexplore.ieee.org/iel7/8016816/9690900/09690907.pdf\",\"citationCount\":\"10\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Quarterly Journal of Mathematics\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/9690907/\",\"RegionNum\":4,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"MATHEMATICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Quarterly Journal of Mathematics","FirstCategoryId":"100","ListUrlMain":"https://ieeexplore.ieee.org/document/9690907/","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"MATHEMATICS","Score":null,"Total":0}
We define a one-parameter family of entropies, each assigning a real number to any probability measure on a compact metric space (or, more generally, a compact Hausdorff space with a notion of similarity between points). These entropies generalise the Shannon and Renyi entropies of information theory. We prove that on any space X, there is a single probability measure maximising all these entropies simultaneously. Moreover, all the entropies have the same maximum value: the maximum entropy of X. As X is scaled up, the maximum entropy grows; its asymptotics determine geometric information about X, including the volume and dimension. We also study the large-scale limit of the maximising measure itself, arguing that it should be regarded as the canonical or uniform measure on X. Primarily we work not with entropy itself but its exponential, called diversity and (in its finite form) used as a measure of biodiversity. Our main theorem was first proved in the finite case by Leinster and Meckes.
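For orientation, the finite-case quantities that the abstract generalises can be sketched as follows; this is a summary of the standard definitions of Rényi entropy, Hill numbers, and similarity-sensitive diversity, not the paper's own general construction on compact spaces. For a probability distribution $p = (p_1, \dots, p_n)$ on a finite set and $q \geq 0$, $q \neq 1$, the Rényi entropy of order q and its exponential, the diversity (or Hill number) of order q, are

\[
H_q(p) = \frac{1}{1-q} \log \sum_{i \,:\, p_i > 0} p_i^{\,q},
\qquad
D_q(p) = \exp H_q(p) = \Biggl( \sum_{i \,:\, p_i > 0} p_i^{\,q} \Biggr)^{1/(1-q)},
\]

with the limit $q \to 1$ recovering the Shannon entropy $H_1(p) = -\sum_i p_i \log p_i$. In the finite similarity-sensitive setting of Leinster and Meckes, a matrix $Z$ of pairwise similarities (for points of a metric space, typically $Z_{ij} = e^{-d(x_i, x_j)}$) enters the formula:

\[
D_q^Z(p) = \Biggl( \sum_{i \,:\, p_i > 0} p_i \, \bigl( (Zp)_i \bigr)^{q-1} \Biggr)^{1/(1-q)},
\]

which reduces to the Hill number when $Z$ is the identity matrix. The paper's main theorem extends the existence of a single measure maximising all these diversities simultaneously from this finite case to compact metric (or Hausdorff) spaces.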
About the journal:
The Quarterly Journal of Mathematics publishes original contributions to pure mathematics. All major areas of pure mathematics are represented on the editorial board.