Hyperparameter Estimation in Bayesian MAP Estimation: Parameterizations and Consistency

Matthew M. Dunlop, T. Helin, A. Stuart
DOI: 10.5802/smai-jcm.62
Journal: The SMAI journal of computational mathematics
Published: 2019-05-10 (Journal Article)
Citations: 15

Abstract

The Bayesian formulation of inverse problems is attractive for three primary reasons: it provides a clear modelling framework; means for uncertainty quantification; and it allows for principled learning of hyperparameters. The posterior distribution may be explored by sampling methods, but for many problems it is computationally infeasible to do so. In this situation maximum a posteriori (MAP) estimators are often sought. Whilst these are relatively cheap to compute, and have an attractive variational formulation, a key drawback is their lack of invariance under change of parameterization. This is a particularly significant issue when hierarchical priors are employed to learn hyperparameters. In this paper we study the effect of the choice of parameterization on MAP estimators when a conditionally Gaussian hierarchical prior distribution is employed. Specifically we consider the centred parameterization, the natural parameterization in which the unknown state is solved for directly, and the noncentred parameterization, which works with a whitened Gaussian as the unknown state variable, and arises when considering dimension-robust MCMC algorithms; MAP estimation is well-defined in the nonparametric setting only for the noncentred parameterization. However, we show that MAP estimates based on the noncentred parameterization are not consistent as estimators of hyperparameters; conversely, we show that limits of finite-dimensional centred MAP estimators are consistent as the dimension tends to infinity. We also consider empirical Bayesian hyperparameter estimation, show consistency of these estimates, and demonstrate that they are more robust with respect to noise than centred MAP estimates. An underpinning concept throughout is that hyperparameters may only be recovered up to measure equivalence, a well-known phenomenon in the context of the Ornstein-Uhlenbeck process.
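The centred/noncentred distinction described above can be sketched numerically. The following is a minimal illustration, not code from the paper: it assumes a conditionally Gaussian prior of the simple form u | τ ~ N(0, τ²C₀) for a fixed covariance C₀, where the centred parameterization treats u as the unknown state, while the noncentred parameterization works with a whitened variable ξ ~ N(0, I) and recovers u = τ L₀ξ with C₀ = L₀L₀ᵀ.

```python
import numpy as np

# Hypothetical setup (not from the paper): hierarchical prior
# u | tau ~ N(0, tau^2 * C0) on R^n, with C0 a fixed SPD covariance.
rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
C0 = A @ A.T + n * np.eye(n)   # fixed SPD covariance
L0 = np.linalg.cholesky(C0)    # C0 = L0 @ L0.T

tau = 2.0                      # hyperparameter (amplitude)

# Noncentred parameterization: draw a whitened Gaussian xi ~ N(0, I),
# then map it to the state via u = tau * L0 @ xi.
xi = rng.standard_normal(n)
u_noncentred = tau * L0 @ xi

# Centred parameterization: work with u ~ N(0, tau^2 * C0) directly;
# reusing the same noise xi shows the two maps agree pathwise here.
u_centred = np.linalg.cholesky(tau**2 * C0) @ xi

# Both parameterizations induce the same prior law on u; for tau > 0
# the Cholesky factor of tau^2 * C0 is exactly tau * L0.
assert np.allclose(u_noncentred, u_centred)
```

The point of the noncentred variable ξ is that its prior N(0, I) does not depend on τ, which is what makes MAP estimation well-defined in the nonparametric limit for that parameterization, even though, as the abstract notes, the resulting hyperparameter estimates need not be consistent.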