{"title":"最大熵及其相关方法","authors":"I. Csiszár","doi":"10.1109/WITS.1994.513853","DOIUrl":null,"url":null,"abstract":"Originally coming from physics, maximum entropy (ME) has been promoted to a general principle of inference primarily by the works of Jaynes. ME applies to the problem of inferring a probability mass (or density) function, or any non-negative function p(x), when the available information specifies a set E of feasible functions, and there is a prior guess q /spl notin/ E. The author will review the arguments that have been put forward for justifying ME. In this author's opinion, the strongest theoretical support to ME is provided by the axiomatic approach. This shows that, in some sense, ME is the only logically consistent method of inferring a function subject to linear constraints.","PeriodicalId":423518,"journal":{"name":"Proceedings of 1994 Workshop on Information Theory and Statistics","volume":"45 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1994-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":"{\"title\":\"Maximum entropy and related methods\",\"authors\":\"I. Csiszár\",\"doi\":\"10.1109/WITS.1994.513853\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Originally coming from physics, maximum entropy (ME) has been promoted to a general principle of inference primarily by the works of Jaynes. ME applies to the problem of inferring a probability mass (or density) function, or any non-negative function p(x), when the available information specifies a set E of feasible functions, and there is a prior guess q /spl notin/ E. The author will review the arguments that have been put forward for justifying ME. In this author's opinion, the strongest theoretical support to ME is provided by the axiomatic approach. 
This shows that, in some sense, ME is the only logically consistent method of inferring a function subject to linear constraints.\",\"PeriodicalId\":423518,\"journal\":{\"name\":\"Proceedings of 1994 Workshop on Information Theory and Statistics\",\"volume\":\"45 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1994-10-27\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"9\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of 1994 Workshop on Information Theory and Statistics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/WITS.1994.513853\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of 1994 Workshop on Information Theory and Statistics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/WITS.1994.513853","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract: Originally coming from physics, maximum entropy (ME) has been promoted to a general principle of inference primarily by the works of Jaynes. ME applies to the problem of inferring a probability mass (or density) function, or any non-negative function p(x), when the available information specifies a set E of feasible functions and there is a prior guess q ∉ E. The author reviews the arguments that have been put forward to justify ME. In this author's opinion, the strongest theoretical support for ME is provided by the axiomatic approach, which shows that, in some sense, ME is the only logically consistent method of inferring a function subject to linear constraints.
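The inference problem the abstract describes can be made concrete with a small numerical sketch (not from the paper itself; the choice of prior and constraint here follows Jaynes' classic loaded-die illustration, used as an assumed example). With a uniform prior guess q over the six faces and the single linear constraint E_p[X] = 4.5, the ME solution — equivalently, the I-projection of q onto the constraint set — has the exponential-family form p(x) ∝ q(x)·exp(λx), with λ chosen so the constraint holds:

```python
import numpy as np

# Assumed illustration (not from the paper): Jaynes' die with mean 4.5.
# Prior guess q is uniform over faces 1..6; the feasible set E is all
# distributions p with E_p[X] = 4.5. The ME / minimum-relative-entropy
# solution is the exponential tilt p(x) = q(x) * exp(lam * x) / Z(lam).
faces = np.arange(1, 7)
q = np.full(6, 1.0 / 6.0)
target_mean = 4.5

def tilted(lam):
    """Exponentially tilted distribution q(x) * exp(lam * x), normalized."""
    w = q * np.exp(lam * faces)
    return w / w.sum()

def mean_at(lam):
    return tilted(lam) @ faces

# The tilted mean is strictly increasing in lam, so a simple bisection
# on lam suffices to satisfy the linear constraint.
lo, hi = -10.0, 10.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if mean_at(mid) < target_mean:
        lo = mid
    else:
        hi = mid
lam = 0.5 * (lo + hi)
p = tilted(lam)

print(np.round(p, 4))       # probabilities increase toward face 6
print(round(p @ faces, 6))  # constraint E_p[X] = 4.5 is met
```

Because there is only one linear constraint, a scalar bisection finds the tilt parameter; with several constraints E_p[f_i] = a_i the same exponential form holds with one multiplier per constraint, and a convex optimization over the multipliers replaces the bisection.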