{"title":"Empirical Bayesian Inference Using a Support Informed Prior","authors":"Jiahui Zhang, A. Gelb, Theresa Scarnati","doi":"10.1137/21m140794x","DOIUrl":"https://doi.org/10.1137/21m140794x","url":null,"abstract":"","PeriodicalId":56064,"journal":{"name":"Siam-Asa Journal on Uncertainty Quantification","volume":null,"pages":null},"PeriodicalIF":2.0,"publicationDate":"2022-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"74672154","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Extrapolated Polynomial Lattice Rule Integration in Computational Uncertainty Quantification","authors":"J. Dick, M. Longo, C. Schwab","doi":"10.1137/20m1338137","DOIUrl":"https://doi.org/10.1137/20m1338137","url":null,"abstract":"","PeriodicalId":56064,"journal":{"name":"Siam-Asa Journal on Uncertainty Quantification","volume":null,"pages":null},"PeriodicalIF":2.0,"publicationDate":"2022-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"90846664","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Certified Dimension Reduction for Bayesian Updating with the Cross-Entropy Method","authors":"Max Ehre, Rafael Flock, M. Fußeder, I. Papaioannou, D. Štraub","doi":"10.1137/22m1484031","DOIUrl":"https://doi.org/10.1137/22m1484031","url":null,"abstract":"In inverse problems, the parameters of a model are estimated based on observations of the model response. The Bayesian approach is powerful for solving such problems; one formulates a prior distribution for the parameter state that is updated with the observations to compute the posterior parameter distribution. Solving for the posterior distribution can be challenging when, e.g., prior and posterior significantly differ from one another and/or the parameter space is high-dimensional. We use a sequence of importance sampling measures that arise by tempering the likelihood to approach inverse problems exhibiting a significant distance between prior and posterior. Each importance sampling measure is identified by cross-entropy minimization as proposed in the context of Bayesian inverse problems in Engel et al. (2021). To efficiently address problems with high-dimensional parameter spaces we set up the minimization procedure in a low-dimensional subspace of the original parameter space. The principal idea is to analyse the spectrum of the second-moment matrix of the gradient of the log-likelihood function to identify a suitable subspace. Following Zahm et al. (2021), an upper bound on the Kullback-Leibler-divergence between full-dimensional and subspace posterior is provided, which can be utilized to determine the effective dimension of the inverse problem corresponding to a prescribed approximation error bound. We suggest heuristic criteria for optimally selecting the number of model and model gradient evaluations in each iteration of the importance sampling sequence. We investigate the performance of this approach using examples from engineering mechanics set in various parameter space dimensions.","PeriodicalId":56064,"journal":{"name":"Siam-Asa Journal on Uncertainty Quantification","volume":null,"pages":null},"PeriodicalIF":2.0,"publicationDate":"2022-06-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"42537963","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Test Comparison for Sobol Indices over Nested Sets of Variables","authors":"T. Klein, Nicolas Peteilh, P. Rochet","doi":"10.1137/21m1457370","DOIUrl":"https://doi.org/10.1137/21m1457370","url":null,"abstract":"Sensitivity indices are commonly used to quantify the relative influence of any specific group of input variables on the output of a computer code. One crucial question is then to decide whether a given set of variables has a significant impact on the output. Sobol indices are often used to measure this impact but their estimation can be difficult as they usually require a particular design of experiment. In this work, we take advantage of the monotonicity of Sobol indices with respect to set inclusion to test the influence of some of the input variables. The method does not rely on a direct estimation of the Sobol indices and can be performed under classical iid sampling designs.","PeriodicalId":56064,"journal":{"name":"Siam-Asa Journal on Uncertainty Quantification","volume":null,"pages":null},"PeriodicalIF":2.0,"publicationDate":"2022-03-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"76520278","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Wavenumber-explicit parametric holomorphy of Helmholtz solutions in the context of uncertainty quantification","authors":"E. Spence, J. Wunsch","doi":"10.48550/arXiv.2203.10270","DOIUrl":"https://doi.org/10.48550/arXiv.2203.10270","url":null,"abstract":"A crucial role in the theory of uncertainty quantification (UQ) of PDEs is played by the regularity of the solution with respect to the stochastic parameters; indeed, a key property one seeks to establish is that the solution is holomorphic with respect to (the complex extensions of) the parameters. In the context of UQ for the high-frequency Helmholtz equation, a natural question is therefore: how does this parametric holomorphy depend on the wavenumber $k$? The recent paper [Ganesh, Kuo, Sloan 2021] showed for a particular nontrapping variable-coefficient Helmholtz problem with affine dependence of the coefficients on the stochastic parameters that the solution operator can be analytically continued a distance $sim k^{-1}$ into the complex plane. In this paper, we generalise the result in [Ganesh, Kuo, Sloan 2021] about $k$-explicit parametric holomorphy to a much wider class of Helmholtz problems with arbitrary (holomorphic) dependence on the stochastic parameters; we show that in all cases the region of parametric holomorphy decreases with $k$, and show how the rate of decrease with $k$ is dictated by whether the unperturbed Helmholtz problem is trapping or nontrapping. We then give examples of both trapping and nontrapping problems where these bounds on the rate of decrease with $k$ of the region of parametric holomorphy are sharp, with the trapping examples coming from the recent results of [Galkowski, Marchand, Spence 2021]. An immediate implication of these results is that the $k$-dependent restrictions imposed on the randomness in the analysis of quasi-Monte Carlo (QMC) methods in [Ganesh, Kuo, Sloan 2021] arise from a genuine feature of the Helmholtz equation with $k$ large (and not, for example, a suboptimal bound).","PeriodicalId":56064,"journal":{"name":"Siam-Asa Journal on Uncertainty Quantification","volume":null,"pages":null},"PeriodicalIF":2.0,"publicationDate":"2022-03-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"81689028","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Varying Coefficient Models and Design Choice for Bayes Linear Emulation of Complex Computer Models with Limited Model Evaluations","authors":"Amy L. Wilson, M. Goldstein, C. Dent","doi":"10.1137/20m1318560","DOIUrl":"https://doi.org/10.1137/20m1318560","url":null,"abstract":"","PeriodicalId":56064,"journal":{"name":"Siam-Asa Journal on Uncertainty Quantification","volume":null,"pages":null},"PeriodicalIF":2.0,"publicationDate":"2022-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"74537949","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Effective Generation of Compressed Stationary Gaussian Fields","authors":"R. Sawko, M. Zimon","doi":"10.1137/20m1375541","DOIUrl":"https://doi.org/10.1137/20m1375541","url":null,"abstract":"","PeriodicalId":56064,"journal":{"name":"Siam-Asa Journal on Uncertainty Quantification","volume":null,"pages":null},"PeriodicalIF":2.0,"publicationDate":"2022-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"81656270","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Evaluating Forecasts for High-Impact Events Using Transformed Kernel Scores","authors":"S. Allen, D. Ginsbourger, Johanna F. Ziegel","doi":"10.1137/22m1532184","DOIUrl":"https://doi.org/10.1137/22m1532184","url":null,"abstract":"It is informative to evaluate a forecaster's ability to predict outcomes that have a large impact on the forecast user. Although weighted scoring rules have become a well-established tool to achieve this, such scores have been studied almost exclusively in the univariate case, with interest typically placed on extreme events. However, a large impact may also result from events not considered to be extreme from a statistical perspective: the interaction of several moderate events could also generate a high impact. Compound weather events provide a good example of this. To assess forecasts made for high-impact events, this work extends existing results on weighted scoring rules by introducing weighted multivariate scores. To do so, we utilise kernel scores. We demonstrate that the threshold-weighted continuous ranked probability score (twCRPS), arguably the most well-known weighted scoring rule, is a kernel score. This result leads to a convenient representation of the twCRPS when the forecast is an ensemble, and also permits a generalisation that can be employed with alternative kernels, allowing us to introduce, for example, a threshold-weighted energy score and threshold-weighted variogram score. To illustrate the additional information that these weighted multivariate scoring rules provide, results are presented for a case study in which the weighted scores are used to evaluate daily precipitation accumulation forecasts, with particular interest on events that could lead to flooding.","PeriodicalId":56064,"journal":{"name":"Siam-Asa Journal on Uncertainty Quantification","volume":null,"pages":null},"PeriodicalIF":2.0,"publicationDate":"2022-02-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"42030795","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Multilevel Delayed Acceptance MCMC","authors":"Mikkel B. Lykkegaard, T. Dodwell, C. Fox, Grigorios Mingas, Robert Scheichl","doi":"10.1137/22m1476770","DOIUrl":"https://doi.org/10.1137/22m1476770","url":null,"abstract":"We develop a novel Markov chain Monte Carlo (MCMC) method that exploits a hierarchy of models of increasing complexity to efficiently generate samples from an unnormalized target distribution. Broadly, the method rewrites the Multilevel MCMC approach of Dodwell et al. (2015) in terms of the Delayed Acceptance (DA) MCMC of Christen&Fox (2005). In particular, DA is extended to use a hierarchy of models of arbitrary depth, and allow subchains of arbitrary length. We show that the algorithm satisfies detailed balance, hence is ergodic for the target distribution. Furthermore, multilevel variance reduction is derived that exploits the multiple levels and subchains, and an adaptive multilevel correction to coarse-level biases is developed. Three numerical examples of Bayesian inverse problems are presented that demonstrate the advantages of these novel methods. The software and examples are available in PyMC3.","PeriodicalId":56064,"journal":{"name":"Siam-Asa Journal on Uncertainty Quantification","volume":null,"pages":null},"PeriodicalIF":2.0,"publicationDate":"2022-02-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"84765354","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Intermediate Variable Emulation: Using Internal Processes in Simulators to Build More Informative Emulators","authors":"R. H. Oughton, M. Goldstein, J. Hemmings","doi":"10.1137/20m1370902","DOIUrl":"https://doi.org/10.1137/20m1370902","url":null,"abstract":"","PeriodicalId":56064,"journal":{"name":"Siam-Asa Journal on Uncertainty Quantification","volume":null,"pages":null},"PeriodicalIF":2.0,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"73997235","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}