{"title":"多方协议,信息复杂性和隐私","authors":"Iordanis Kerenidis, A. Rosén, Florent Urrutia","doi":"10.1145/3313230","DOIUrl":null,"url":null,"abstract":"We introduce a new information-theoretic measure, which we call Public Information Complexity (PIC), as a tool for the study of multi-party computation protocols, and of quantities such as their communication complexity, or the amount of randomness they require in the context of information-theoretic private computations. We are able to use this measure directly in the natural asynchronous message-passing peer-to-peer model and show a number of interesting properties and applications of our new notion: The Public Information Complexity is a lower bound on the Communication Complexity and an upper bound on the Information Complexity; the difference between the Public Information Complexity and the Information Complexity provides a lower bound on the amount of randomness used in a protocol; any communication protocol can be compressed to its Public Information Cost; and an explicit calculation of the zero-error Public Information Complexity of the k-party, n-bit Parity function, where a player outputs the bitwise parity of the inputs. The latter result also establishes that the amount of randomness needed by a private protocol that computes this function is Ω (n).","PeriodicalId":198744,"journal":{"name":"ACM Transactions on Computation Theory (TOCT)","volume":"12 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-06-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":"{\"title\":\"Multi-Party Protocols, Information Complexity and Privacy\",\"authors\":\"Iordanis Kerenidis, A. Rosén, Florent Urrutia\",\"doi\":\"10.1145/3313230\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We introduce a new information-theoretic measure, which we call Public Information Complexity (PIC), as a tool for the study of multi-party computation protocols, and of quantities such as their communication complexity, or the amount of randomness they require in the context of information-theoretic private computations. We are able to use this measure directly in the natural asynchronous message-passing peer-to-peer model and show a number of interesting properties and applications of our new notion: The Public Information Complexity is a lower bound on the Communication Complexity and an upper bound on the Information Complexity; the difference between the Public Information Complexity and the Information Complexity provides a lower bound on the amount of randomness used in a protocol; any communication protocol can be compressed to its Public Information Cost; and an explicit calculation of the zero-error Public Information Complexity of the k-party, n-bit Parity function, where a player outputs the bitwise parity of the inputs. 
The latter result also establishes that the amount of randomness needed by a private protocol that computes this function is Ω (n).\",\"PeriodicalId\":198744,\"journal\":{\"name\":\"ACM Transactions on Computation Theory (TOCT)\",\"volume\":\"12 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2016-06-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"5\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ACM Transactions on Computation Theory (TOCT)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3313230\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACM Transactions on Computation Theory (TOCT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3313230","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Multi-Party Protocols, Information Complexity and Privacy
We introduce a new information-theoretic measure, which we call Public Information Complexity (PIC), as a tool for the study of multi-party computation protocols and of quantities such as their communication complexity or the amount of randomness they require in the context of information-theoretically private computations. We are able to use this measure directly in the natural asynchronous message-passing peer-to-peer model, and we show a number of interesting properties and applications of our new notion: the Public Information Complexity is a lower bound on the Communication Complexity and an upper bound on the Information Complexity; the difference between the Public Information Complexity and the Information Complexity provides a lower bound on the amount of randomness used in a protocol; any communication protocol can be compressed to its Public Information Cost; and we give an explicit calculation of the zero-error Public Information Complexity of the k-party, n-bit Parity function, where a player outputs the bitwise parity of the inputs. The latter result also establishes that the amount of randomness needed by a private protocol that computes this function is Ω(n).
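For concreteness, the classic one-time-pad ring protocol for the k-party, n-bit Parity function uses exactly n random bits, which matches the Ω(n) randomness lower bound above up to constants. The minimal Python sketch below illustrates that folklore protocol, not any construction from the paper; the function and variable names are our own, and privacy here holds only against a single honest-but-curious player.

```python
import secrets

def random_mask(n: int) -> int:
    """n-bit uniformly random mask: the protocol's only source of randomness."""
    return secrets.randbits(n)

def private_parity(inputs: list[int], n: int) -> int:
    """
    Folklore ring protocol for the k-party, n-bit bitwise Parity function
    (illustrative only; not the paper's construction).

    Player 1 one-time-pads its input with an n-bit mask r; each subsequent
    player XORs in its own input and forwards the running value; player 1
    finally removes r. Every message an intermediate player sees is uniformly
    random and independent of the other inputs, so no single semi-honest
    player learns more than the output. The protocol consumes exactly n
    random bits, consistent (up to constants) with the Ω(n) bound above.
    """
    k = len(inputs)
    assert k >= 2
    r = random_mask(n)

    # Player 1 masks its input and sends it around the ring.
    message = inputs[0] ^ r
    for i in range(1, k):
        message ^= inputs[i]  # player i+1 folds in its own input and forwards

    # The message returns to player 1, who strips the mask and outputs the parity.
    return message ^ r

if __name__ == "__main__":
    n = 8
    xs = [0b10110010, 0b01101100, 0b11110000]
    assert private_parity(xs, n) == xs[0] ^ xs[1] ^ xs[2]
    print(f"bitwise parity = {private_parity(xs, n):0{n}b}")
```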