{"title":"快速学习需要良好的记忆:奇偶性学习的时空下界","authors":"R. Raz","doi":"10.1145/3186563","DOIUrl":null,"url":null,"abstract":"We prove that any algorithm for learning parities requires either a memory of quadratic size or an exponential number of samples. This proves a recent conjecture of Steinhardt, Valiant and Wager [15] and shows that for some learning problems a large storage space is crucial. More formally, in the problem of parity learning, an unknown string x ϵ {0,1}n was chosen uniformly at random. A learner tries to learn x from a stream of samples (a1, b1), (a2, b2)..., where each at is uniformly distributed over {0,1}n and bt is the inner product of at and x, modulo 2. We show that any algorithm for parity learning, that uses less than n2/25 bits of memory, requires an exponential number of samples. Previously, there was no non-trivial lower bound on the number of samples needed, for any learning problem, even if the allowed memory size is O(n) (where n is the space needed to store one sample). We also give an application of our result in the field of bounded-storage cryptography. We show an encryption scheme that requires a private key of length n, as well as time complexity of n per encryption/decryption of each bit, and is provenly and unconditionally secure as long as the attacker uses less than n2/25 memory bits and the scheme is used at most an exponential number of times. Previous works on bounded-storage cryptography assumed that the memory size used by the attacker is at most linear in the time needed for encryption/decryption.","PeriodicalId":414001,"journal":{"name":"2016 IEEE 57th Annual Symposium on Foundations of Computer Science (FOCS)","volume":"11 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-02-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"77","resultStr":"{\"title\":\"Fast Learning Requires Good Memory: A Time-Space Lower Bound for Parity Learning\",\"authors\":\"R. Raz\",\"doi\":\"10.1145/3186563\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We prove that any algorithm for learning parities requires either a memory of quadratic size or an exponential number of samples. This proves a recent conjecture of Steinhardt, Valiant and Wager [15] and shows that for some learning problems a large storage space is crucial. More formally, in the problem of parity learning, an unknown string x ϵ {0,1}n was chosen uniformly at random. A learner tries to learn x from a stream of samples (a1, b1), (a2, b2)..., where each at is uniformly distributed over {0,1}n and bt is the inner product of at and x, modulo 2. We show that any algorithm for parity learning, that uses less than n2/25 bits of memory, requires an exponential number of samples. Previously, there was no non-trivial lower bound on the number of samples needed, for any learning problem, even if the allowed memory size is O(n) (where n is the space needed to store one sample). We also give an application of our result in the field of bounded-storage cryptography. We show an encryption scheme that requires a private key of length n, as well as time complexity of n per encryption/decryption of each bit, and is provenly and unconditionally secure as long as the attacker uses less than n2/25 memory bits and the scheme is used at most an exponential number of times. 
Previous works on bounded-storage cryptography assumed that the memory size used by the attacker is at most linear in the time needed for encryption/decryption.\",\"PeriodicalId\":414001,\"journal\":{\"name\":\"2016 IEEE 57th Annual Symposium on Foundations of Computer Science (FOCS)\",\"volume\":\"11 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2016-02-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"77\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2016 IEEE 57th Annual Symposium on Foundations of Computer Science (FOCS)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3186563\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 IEEE 57th Annual Symposium on Foundations of Computer Science (FOCS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3186563","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Fast Learning Requires Good Memory: A Time-Space Lower Bound for Parity Learning
We prove that any algorithm for learning parities requires either a memory of quadratic size or an exponential number of samples. This proves a recent conjecture of Steinhardt, Valiant, and Wager [15] and shows that for some learning problems a large storage space is crucial. More formally, in the problem of parity learning, an unknown string x ∈ {0,1}^n is chosen uniformly at random. A learner tries to learn x from a stream of samples (a_1, b_1), (a_2, b_2), ..., where each a_t is uniformly distributed over {0,1}^n and b_t is the inner product of a_t and x, modulo 2. We show that any algorithm for parity learning that uses less than n^2/25 bits of memory requires an exponential number of samples. Previously, no non-trivial lower bound on the number of samples was known for any learning problem, even when the allowed memory size is O(n) (where n is the space needed to store one sample). We also give an application of our result to bounded-storage cryptography. We show an encryption scheme that requires a private key of length n and time complexity of n per encryption/decryption of each bit, and that is provably and unconditionally secure as long as the attacker uses fewer than n^2/25 memory bits and the scheme is used at most an exponential number of times. Previous works on bounded-storage cryptography assumed that the memory size used by the attacker is at most linear in the time needed for encryption/decryption.
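To make the setup concrete, below is a minimal illustrative sketch: toy code of our own, not from the paper (the names sample, learn_by_elimination, encrypt_bit, and the parameter n = 16 are all assumptions for illustration). It generates the parity-sample stream, recovers x with Gaussian elimination over GF(2), the classical learner whose roughly n^2-bit memory footprint the theorem shows is essentially unavoidable for sample-efficient learning, and sketches a one-bit encryption step consistent with the parameters stated in the abstract (key x of length n, time linear in n per bit).

```python
# Illustrative sketch, not from the paper: toy code for the parity-sample
# stream, the classical Gaussian-elimination learner (whose ~n^2-bit memory
# the theorem shows is unavoidable for fast learning), and a one-bit
# encryption step matching the parameters stated in the abstract.
import random

n = 16                                       # toy size; the bounds are asymptotic in n
x = [random.randrange(2) for _ in range(n)]  # hidden string, also the private key

def sample():
    """One stream element (a_t, b_t) with b_t = <a_t, x> mod 2."""
    a = [random.randrange(2) for _ in range(n)]
    b = sum(ai * xi for ai, xi in zip(a, x)) % 2
    return a, b

def learn_by_elimination():
    """Recover x by Gaussian elimination over GF(2).

    Stores up to n independent equations of n+1 bits each, i.e. Theta(n^2)
    bits of memory, and needs only about n samples in expectation. This is
    the quadratic-memory side of the memory/sample trade-off.
    """
    rows = []    # stored equations, each a pair (a, b)
    pivots = {}  # pivot column -> index into rows, in insertion order
    while len(rows) < n:
        a, b = sample()
        # Reduce the fresh equation against the stored rows. Iterating in
        # insertion order is sound: each stored row has zeros at all pivot
        # columns discovered before it.
        for col, idx in pivots.items():
            if a[col]:
                ra, rb = rows[idx]
                a = [u ^ v for u, v in zip(a, ra)]
                b ^= rb
        pivot = next((i for i, ai in enumerate(a) if ai), None)
        if pivot is None:
            continue  # linearly dependent sample; discard it
        pivots[pivot] = len(rows)
        rows.append((a, b))
    # Back-substitute, highest pivot first (with n rows, every column is a pivot).
    sol = [0] * n
    for pivot in sorted(pivots, reverse=True):
        a, b = rows[pivots[pivot]]
        sol[pivot] = b ^ (sum(a[j] & sol[j] for j in range(pivot + 1, n)) % 2)
    return sol

def encrypt_bit(m):
    """Ciphertext (a, <a,x> XOR m): each ciphertext is exactly one parity sample."""
    a, b = sample()
    return a, b ^ m

def decrypt_bit(a, c):
    return c ^ (sum(ai * xi for ai, xi in zip(a, x)) % 2)

assert learn_by_elimination() == x
m = random.randrange(2)
ct = encrypt_bit(m)
assert decrypt_bit(*ct) == m
```

The sketch also makes the cryptographic connection visible: each ciphertext (a, <a,x> XOR m) hands an eavesdropper exactly one parity sample, so an attacker who keeps fewer than n^2/25 bits of state is in the position of a memory-bounded parity learner and, by the paper's lower bound, needs exponentially many encryptions to recover the key.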