Patterns of Scalable Bayesian Inference

E. Angelino, Matthew J. Johnson, Ryan P. Adams
DOI: 10.1561/2200000052
Journal: Found. Trends Mach. Learn.
Published: 2016-02-16
Citations: 81

Abstract

Datasets are growing not just in size but in complexity, creating a demand for rich models and quantification of uncertainty. Bayesian methods are an excellent fit for this demand, but scaling Bayesian inference is a challenge. In response to this challenge, there has been considerable recent work based on varying assumptions about model structure, underlying computational resources, and the importance of asymptotic correctness. As a result, there is a zoo of ideas with few clear overarching principles. In this paper, we seek to identify unifying principles, patterns, and intuitions for scaling Bayesian inference. We review existing work on utilizing modern computing resources with both MCMC and variational approximation techniques. From this taxonomy of ideas, we characterize the general principles that have proven successful for designing scalable inference procedures and comment on the path forward.
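For context on the baseline the surveyed methods aim to accelerate (this is not an algorithm from the paper itself, just an illustrative sketch): a standard random-walk Metropolis-Hastings sampler must evaluate the full posterior log-density at every step, which for conditionally independent data costs one pass over the entire dataset per proposal. A minimal self-contained version, targeting a standard normal for simplicity:

```python
import math
import random

def log_density(x):
    # Unnormalized log-density of a standard normal target.
    # In a Bayesian model this would be log-prior plus a sum of
    # per-datapoint log-likelihoods -- the O(N) cost per step
    # that scalable MCMC methods (subsampling, parallelism,
    # approximate acceptance tests) try to reduce.
    return -0.5 * x * x

def metropolis_hastings(log_p, x0, n_steps, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings over a 1-D target."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_steps):
        proposal = x + rng.gauss(0.0, step)
        # Accept with probability min(1, p(proposal) / p(x)),
        # computed in log space for numerical stability.
        if math.log(rng.random()) < log_p(proposal) - log_p(x):
            x = proposal
        samples.append(x)
    return samples

samples = metropolis_hastings(log_density, x0=0.0, n_steps=20000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

The empirical mean and variance of the chain should approach 0 and 1 respectively; the function names and parameters here are illustrative choices, not from the paper.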