Choosing Observation Operators to Mitigate Model Error in Bayesian Inverse Problems

Nada Cvetković, Han Cheng Lie, Harshit Bansal, Karen Veroy

SIAM/ASA Journal on Uncertainty Quantification, Volume 12, Issue 3, pp. 723–758, September 2024.
DOI: 10.1137/23m1602140
Published: 2024-07-10 · Citations: 0
Abstract
In statistical inference, a discrepancy between the parameter-to-observable map that generates the data and the parameter-to-observable map that is used for inference can lead to misspecified likelihoods and thus to incorrect estimates. In many inverse problems, the parameter-to-observable map is the composition of a linear state-to-observable map called an “observation operator” and a possibly nonlinear parameter-to-state map called the “model.” We consider such Bayesian inverse problems where the discrepancy in the parameter-to-observable map is due to the use of an approximate model that differs from the best model, i.e., to nonzero “model error.” Multiple approaches have been proposed to address such discrepancies, each leading to a specific posterior. We show how to use local Lipschitz stability estimates of posteriors with respect to likelihood perturbations to bound the Kullback–Leibler divergence of the posterior of each approach with respect to the posterior associated with the best model. Our bounds lead to criteria for choosing observation operators that mitigate the effect of model error for Bayesian inverse problems of this type. We illustrate the feasibility of one such criterion on an advection-diffusion-reaction PDE inverse problem and use this example to discuss the importance and challenges of model error-aware inference.
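The idea that the choice of observation operator controls how strongly model error distorts the posterior can be illustrated in the simplest possible setting. The sketch below is not the paper's method; it is a minimal linear-Gaussian toy (illustrative matrices `G_best`, `G_approx`, and candidate operators chosen by hand) in which the posterior is available in closed form, so the Kullback–Leibler divergence between the posterior under an approximate model and the posterior under the best model can be computed exactly for each candidate observation operator.

```python
import numpy as np

def gaussian_posterior(O, G, y, prior_cov, noise_cov):
    """Posterior of theta ~ N(0, prior_cov) given y = O @ G @ theta + noise.

    O is the (linear) observation operator, G the (here linear) model,
    so A = O @ G is the parameter-to-observable map.
    """
    A = O @ G
    K = prior_cov @ A.T @ np.linalg.inv(A @ prior_cov @ A.T + noise_cov)
    mean = K @ y
    cov = prior_cov - K @ A @ prior_cov
    return mean, cov

def kl_gaussian(m0, S0, m1, S1):
    """KL( N(m0, S0) || N(m1, S1) ) for multivariate Gaussians."""
    d = len(m0)
    S1_inv = np.linalg.inv(S1)
    diff = m1 - m0
    return 0.5 * (np.trace(S1_inv @ S0) + diff @ S1_inv @ diff - d
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

# Illustrative setup: the approximate model differs from the best model
# only in the coupling entry (0, 1), i.e. nonzero "model error" there.
G_best = np.array([[1.0, 0.5], [0.0, 1.0]])
G_approx = np.array([[1.0, 0.3], [0.0, 1.0]])
prior_cov = np.eye(2)
noise_cov = 0.1 * np.eye(1)
theta_true = np.array([1.0, -0.5])
rng = np.random.default_rng(0)

# Two candidate observation operators: the first sees the state component
# affected by the model error, the second does not.
for O in [np.array([[1.0, 0.0]]), np.array([[0.0, 1.0]])]:
    y = O @ G_best @ theta_true + rng.normal(0.0, np.sqrt(0.1), size=1)
    m_best, S_best = gaussian_posterior(O, G_best, y, prior_cov, noise_cov)
    m_apx, S_apx = gaussian_posterior(O, G_approx, y, prior_cov, noise_cov)
    print("O =", O, " KL(approx || best) =", kl_gaussian(m_apx, S_apx, m_best, S_best))
```

In this toy, the second observation operator observes only the part of the state where the two models agree, so the KL divergence between the two posteriors vanishes, while the first operator yields a strictly positive divergence: the observation operator alone determines how much the model error is felt in the posterior, which is the phenomenon the paper's criteria quantify in the general (nonlinear, infinite-dimensional) setting.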
About the Journal
SIAM/ASA Journal on Uncertainty Quantification (JUQ) publishes research articles presenting significant mathematical, statistical, algorithmic, and application advances in uncertainty quantification, defined as the interface of complex modeling of processes and data, especially characterizations of the uncertainties inherent in the use of such models. The journal also focuses on related fields such as sensitivity analysis, model validation, model calibration, data assimilation, and code verification. It solicits papers describing new ideas that could lead to significant progress in methodology for uncertainty quantification, as well as review articles on particular aspects of the field. The journal is dedicated to nurturing synergistic interactions between the mathematical, statistical, computational, and applications communities involved in uncertainty quantification and related areas. JUQ is jointly offered by SIAM and the American Statistical Association.