Bounding Quantum Correlations: The Role of the Shannon Information in the Information Causality Principle

Natasha Oughton, Christopher G. Timpson

Entropy (JCR Q2, Physics, Multidisciplinary; Impact Factor 2.1) · Journal Article · Published 2024-06-29 · DOI: 10.3390/e26070562 · Citations: 0

Abstract
The Information Causality principle was proposed to re-derive the Tsirelson bound, an upper limit on the strength of quantum correlations, and has been suggested as a candidate law of nature. The principle states that the Shannon information about Alice's distant database gained by Bob after receiving an m-bit message cannot exceed m bits, even when Alice and Bob share non-local resources. As originally formulated, it can be shown that the principle is violated exactly when the strength of the shared correlations exceeds the Tsirelson bound. However, we demonstrate here that when an alternative measure of information, one of the Rényi measures, is chosen, the Information Causality principle no longer arrives at the correct value for the Tsirelson bound. We argue that neither the assumption of particular 'intuitive' properties of uncertainty measures, nor pragmatic choices about how to optimise costs associated with communication, is sufficient to uniquely motivate the choice of the Shannon measure from amongst the more general Rényi measures. We conclude that the dependence of the success of Information Causality on mere convention undermines its claimed significance as a foundational principle.
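As an illustration (not taken from the paper itself): the Rényi entropies form a one-parameter family of uncertainty measures, H_α(p) = log₂(Σᵢ pᵢ^α)/(1−α), which recovers the Shannon entropy in the limit α → 1. That the measures genuinely differ for α ≠ 1 is what makes the choice among them non-trivial for Information Causality. A minimal numeric sketch:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p_i log2 p_i, in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(p) = log2(sum p_i^alpha) / (1 - alpha), in bits.

    In the limit alpha -> 1 this recovers the Shannon entropy.
    """
    if alpha == 1:
        return shannon_entropy(p)
    return math.log2(sum(x ** alpha for x in p)) / (1 - alpha)

p = [0.5, 0.25, 0.25]
print(shannon_entropy(p))        # 1.5 bits
print(renyi_entropy(p, 2))       # collision entropy, strictly below 1.5 here
print(renyi_entropy(p, 1.0001))  # approaches the Shannon value as alpha -> 1
```

For a non-uniform distribution the Rényi measures disagree with the Shannon measure (here H₂ ≈ 1.415 < H₁ = 1.5 bits), which is why substituting one for the other in the Information Causality argument can shift the derived bound.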
About the Journal

Entropy (ISSN 1099-4300) is an international and interdisciplinary journal of entropy and information studies, publishing reviews, regular research papers, and short notes. Our aim is to encourage scientists to publish their theoretical and experimental work in as much detail as possible. There is no restriction on the length of papers. Where computations or experiments are reported, sufficient detail must be provided so that the results can be reproduced.