Projective Integral Updates for High-Dimensional Variational Inference

SIAM/ASA Journal on Uncertainty Quantification, Volume 12, Issue 1, Pages 69-100, March 2024. Pub Date: 2024-02-08. DOI: 10.1137/22m1529919
Jed A. Duersch
{"title":"高维变量推理的投影积分更新","authors":"Jed A. Duersch","doi":"10.1137/22m1529919","DOIUrl":null,"url":null,"abstract":"SIAM/ASA Journal on Uncertainty Quantification, Volume 12, Issue 1, Page 69-100, March 2024. <br/> Abstract. Variational inference is an approximation framework for Bayesian inference that seeks to improve quantified uncertainty in predictions by optimizing a simplified distribution over parameters to stand in for the full posterior. Capturing model variations that remain consistent with training data enables more robust predictions by reducing parameter sensitivity. This work introduces a fixed-point optimization for variational inference that is applicable when every feasible log density can be expressed as a linear combination of functions from a given basis. In such cases, the optimizer becomes a fixed-point of projective integral updates. When the basis spans univariate quadratics in each parameter, the feasible distributions are Gaussian mean-fields and the projective integral updates yield quasi-Newton variational Bayes (QNVB). Other bases and updates are also possible. Since these updates require high-dimensional integration, this work begins by proposing an efficient quasirandom sequence of quadratures for mean-field distributions. Each iterate of the sequence contains two evaluation points that combine to correctly integrate all univariate quadratic functions and, if the mean-field factors are symmetric, all univariate cubics. More importantly, averaging results over short subsequences achieves periodic exactness on a much larger space of multivariate polynomials of quadratic total degree. The corresponding variational updates require four loss evaluations with standard (not second-order) backpropagation to eliminate error terms from over half of all multivariate quadratic basis functions. This integration technique is motivated by first proposing stochastic blocked mean-field quadratures, which may be useful in other contexts. A PyTorch implementation of QNVB allows for better control over model uncertainty during training than competing methods. Experiments demonstrate superior generalizability for multiple learning problems and architectures.","PeriodicalId":2,"journal":{"name":"ACS Applied Bio Materials","volume":null,"pages":null},"PeriodicalIF":4.6000,"publicationDate":"2024-02-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Projective Integral Updates for High-Dimensional Variational Inference\",\"authors\":\"Jed A. Duersch\",\"doi\":\"10.1137/22m1529919\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"SIAM/ASA Journal on Uncertainty Quantification, Volume 12, Issue 1, Page 69-100, March 2024. <br/> Abstract. Variational inference is an approximation framework for Bayesian inference that seeks to improve quantified uncertainty in predictions by optimizing a simplified distribution over parameters to stand in for the full posterior. Capturing model variations that remain consistent with training data enables more robust predictions by reducing parameter sensitivity. This work introduces a fixed-point optimization for variational inference that is applicable when every feasible log density can be expressed as a linear combination of functions from a given basis. In such cases, the optimizer becomes a fixed-point of projective integral updates. 
When the basis spans univariate quadratics in each parameter, the feasible distributions are Gaussian mean-fields and the projective integral updates yield quasi-Newton variational Bayes (QNVB). Other bases and updates are also possible. Since these updates require high-dimensional integration, this work begins by proposing an efficient quasirandom sequence of quadratures for mean-field distributions. Each iterate of the sequence contains two evaluation points that combine to correctly integrate all univariate quadratic functions and, if the mean-field factors are symmetric, all univariate cubics. More importantly, averaging results over short subsequences achieves periodic exactness on a much larger space of multivariate polynomials of quadratic total degree. The corresponding variational updates require four loss evaluations with standard (not second-order) backpropagation to eliminate error terms from over half of all multivariate quadratic basis functions. This integration technique is motivated by first proposing stochastic blocked mean-field quadratures, which may be useful in other contexts. A PyTorch implementation of QNVB allows for better control over model uncertainty during training than competing methods. Experiments demonstrate superior generalizability for multiple learning problems and architectures.\",\"PeriodicalId\":2,\"journal\":{\"name\":\"ACS Applied Bio Materials\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":4.6000,\"publicationDate\":\"2024-02-08\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ACS Applied Bio Materials\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://doi.org/10.1137/22m1529919\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"MATERIALS SCIENCE, BIOMATERIALS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACS Applied Bio Materials","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1137/22m1529919","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATERIALS SCIENCE, BIOMATERIALS","Score":null,"Total":0}
Citations: 0

Abstract

Abstract. Variational inference is an approximation framework for Bayesian inference that seeks to improve quantified uncertainty in predictions by optimizing a simplified distribution over parameters to stand in for the full posterior. Capturing model variations that remain consistent with training data enables more robust predictions by reducing parameter sensitivity. This work introduces a fixed-point optimization for variational inference that is applicable when every feasible log density can be expressed as a linear combination of functions from a given basis. In such cases, the optimizer becomes a fixed-point of projective integral updates. When the basis spans univariate quadratics in each parameter, the feasible distributions are Gaussian mean-fields and the projective integral updates yield quasi-Newton variational Bayes (QNVB). Other bases and updates are also possible. Since these updates require high-dimensional integration, this work begins by proposing an efficient quasirandom sequence of quadratures for mean-field distributions. Each iterate of the sequence contains two evaluation points that combine to correctly integrate all univariate quadratic functions and, if the mean-field factors are symmetric, all univariate cubics. More importantly, averaging results over short subsequences achieves periodic exactness on a much larger space of multivariate polynomials of quadratic total degree. The corresponding variational updates require four loss evaluations with standard (not second-order) backpropagation to eliminate error terms from over half of all multivariate quadratic basis functions. This integration technique is motivated by first proposing stochastic blocked mean-field quadratures, which may be useful in other contexts. A PyTorch implementation of QNVB allows for better control over model uncertainty during training than competing methods. Experiments demonstrate superior generalizability for multiple learning problems and architectures.
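The quadrature idea described in the abstract can be made concrete with a small numerical sketch. The reading below is illustrative rather than the paper's exact construction: each iterate evaluates the integrand at an antithetic pair mu ± sigma⊙s for a sign vector s, which integrates every univariate quadratic (and, by symmetry of the Gaussian factors, every univariate cubic) exactly under a Gaussian mean-field, and averaging over a short block of sign vectors with pairwise-balanced columns (here a 4×4 Hadamard block) also cancels the off-diagonal cross terms of a multivariate quadratic. All helper names (gaussian_quadratic_expectation, block_average) are hypothetical.

```python
import numpy as np

def gaussian_quadratic_expectation(A, b, c, mu, sigma):
    """Closed-form E[f(x)] for f(x) = x^T A x + b^T x + c, x ~ N(mu, diag(sigma^2))."""
    return mu @ A @ mu + np.sum(np.diag(A) * sigma**2) + b @ mu + c

def block_average(f, mu, sigma, signs):
    """Average f over antithetic pairs mu +/- sigma*s, one pair per row s of `signs`."""
    return np.mean([0.5 * (f(mu + sigma * s) + f(mu - sigma * s)) for s in signs])

rng = np.random.default_rng(0)
d = 4
A = rng.standard_normal((d, d)); A = 0.5 * (A + A.T)  # arbitrary symmetric quadratic term
b, c = rng.standard_normal(d), 1.7
mu, sigma = rng.standard_normal(d), rng.uniform(0.5, 2.0, size=d)
f = lambda x: x @ A @ x + b @ x + c

# 4x4 Hadamard sign block: its columns are pairwise balanced, so averaging the four
# antithetic pairs also cancels every off-diagonal x_i * x_j error term.
H = np.array([[1,  1,  1,  1],
              [1, -1,  1, -1],
              [1,  1, -1, -1],
              [1, -1, -1,  1]], dtype=float)

exact = gaussian_quadratic_expectation(A, b, c, mu, sigma)
approx = block_average(f, mu, sigma, H)
print(exact, approx)  # agree to floating-point precision for any quadratic f
```

The full Hadamard block is used here only to make the exactness visible numerically; per the abstract, the actual variational updates achieve cancellation of over half of the multivariate quadratic cross terms from just four loss evaluations per step, with standard rather than second-order backpropagation.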