{"title":"De-randomizing BPP: the state of the art","authors":"A. Wigderson","doi":"10.1109/CCC.1999.766263","DOIUrl":null,"url":null,"abstract":"The introduction of randomization into efficient computation has been one of the most fertile and useful ideas in computer science. In cryptography and asynchronous computing, randomization makes possible tasks that are impossible to perform deterministically. Even for function computation, many examples are known in which randomization allows considerable savings in resources like space and time over deterministic algorithms, or even \"only\" simplifies them. But to what extent is this seeming power of randomness over determinism real? The most famous concrete version of this question regards the power of BPP, the class of problems solvable by probabilistic polynomial time algorithms making small constant error. What is the relative power of such algorithms compared to deterministic ones? This is largely open. On the one hand, it is possible that P=BPP, i.e., randomness is useless for solving new problems in polynomial-time. On the other, we might have BPP=EXP, which would say that randomness would be a nearly omnipotent tool for algorithm design. The only viable path towards resolving this problem was the concept of \"pseudorandom generators\", and the \"hardness vs. randomness\" paradigm: BPP can be nontrivially simulated by deterministic algorithms, if some hard function is available. While the hard functions above needed in fact to be one-way functions, completely different pseudo-random generators allowed the use of any hard function in EXP for such nontrivial simulation. Further progress considerably weakened the hardness requirement, and considerably strengthened the deterministic simulation.","PeriodicalId":432015,"journal":{"name":"Proceedings. Fourteenth Annual IEEE Conference on Computational Complexity (Formerly: Structure in Complexity Theory Conference) (Cat.No.99CB36317)","volume":"164 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1999-05-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings. Fourteenth Annual IEEE Conference on Computational Complexity (Formerly: Structure in Complexity Theory Conference) (Cat.No.99CB36317)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CCC.1999.766263","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

The introduction of randomization into efficient computation has been one of the most fertile and useful ideas in computer science. In cryptography and asynchronous computing, randomization makes possible tasks that are impossible to perform deterministically. Even for function computation, many examples are known in which randomization allows considerable savings in resources like space and time over deterministic algorithms, or even "only" simplifies them. But to what extent is this seeming power of randomness over determinism real? The most famous concrete version of this question regards the power of BPP, the class of problems solvable by probabilistic polynomial time algorithms making small constant error. What is the relative power of such algorithms compared to deterministic ones? This is largely open. On the one hand, it is possible that P=BPP, i.e., randomness is useless for solving new problems in polynomial-time. On the other, we might have BPP=EXP, which would say that randomness would be a nearly omnipotent tool for algorithm design. The only viable path towards resolving this problem was the concept of "pseudorandom generators", and the "hardness vs. randomness" paradigm: BPP can be nontrivially simulated by deterministic algorithms, if some hard function is available. While the hard functions above needed in fact to be one-way functions, completely different pseudo-random generators allowed the use of any hard function in EXP for such nontrivial simulation. Further progress considerably weakened the hardness requirement, and considerably strengthened the deterministic simulation.
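
As a concrete illustration of the "hardness vs. randomness" paradigm mentioned in the abstract, the LaTeX sketch below spells out the standard seed-enumeration argument by which a sufficiently strong pseudorandom generator yields a deterministic simulation of BPP. The generator G, the seed length \ell(n), and the size bound t(n) are generic placeholders chosen for illustration, not notation taken from the paper.

% A minimal sketch of PRG-based derandomization (assumed, illustrative notation).
% Suppose G : \{0,1\}^{\ell(n)} \to \{0,1\}^{m(n)} is computable in time 2^{O(\ell(n))}
% and \varepsilon-fools every Boolean circuit C of size at most t(n):
\[
  \Bigl|\, \Pr_{r \in \{0,1\}^{m(n)}}\bigl[C(r)=1\bigr]
         - \Pr_{s \in \{0,1\}^{\ell(n)}}\bigl[C(G(s))=1\bigr] \,\Bigr| \;<\; \varepsilon .
\]
% Let A(x, r) be a BPP algorithm using m(n) random bits with error at most 1/3,
% and let t(n) bound the circuit size of A on inputs of length n. For \varepsilon < 1/6,
% a majority vote over all 2^{\ell(n)} seeds decides the same language:
\[
  A'(x) \;=\; \operatorname{Maj}_{s \in \{0,1\}^{\ell(n)}} A\bigl(x, G(s)\bigr),
  \qquad \text{running in time } 2^{O(\ell(n))} \cdot \mathrm{poly}(n).
\]
% Seed length \ell(n) = O(\log n) gives BPP = P, while \ell(n) = n^{o(1)} already
% yields a nontrivial (subexponential-time) deterministic simulation of BPP.

In the paradigm the abstract describes, such a generator G is built from a hard function, and the survey tracks how the hardness required to build it was progressively weakened while the resulting deterministic simulation was strengthened.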