Leveraging Joint Sparsity in Hierarchical Bayesian Learning
Jan Glaubitz and Anne Gelb
SIAM/ASA Journal on Uncertainty Quantification, Volume 12, Issue 2, Pages 442-472, June 2024. DOI: 10.1137/23m156255x

Abstract. We present a hierarchical Bayesian learning approach to infer jointly sparse parameter vectors from multiple measurement vectors. Our model uses separate conditionally Gaussian priors for each parameter vector and common gamma-distributed hyperparameters to enforce joint sparsity. The resulting joint-sparsity-promoting priors are combined with existing Bayesian inference methods to generate a new family of algorithms. Our numerical experiments, which include a multicoil magnetic resonance imaging application, demonstrate that our new approach consistently outperforms commonly used hierarchical Bayesian methods.
Ensemble Kalman Filters with Resampling
Omar Al-Ghattas, Jiajun Bao, and Daniel Sanz-Alonso
SIAM/ASA Journal on Uncertainty Quantification, Volume 12, Issue 2, Pages 411-441, June 2024. DOI: 10.1137/23m1594935

Abstract. Filtering is concerned with online estimation of the state of a dynamical system from partial and noisy observations. In applications where the state of the system is high dimensional, ensemble Kalman filters are often the method of choice. These algorithms rely on an ensemble of interacting particles to sequentially estimate the state as new observations become available. Despite the practical success of ensemble Kalman filters, theoretical understanding is hindered by the intricate dependence structure of the interacting particles. This paper investigates ensemble Kalman filters that incorporate an additional resampling step to break the dependency between particles. The new algorithm is amenable to a theoretical analysis that extends and improves upon those available for filters without resampling, while also performing well in numerical examples.
Nonparametric Estimation for Independent and Identically Distributed Stochastic Differential Equations with Space-Time Dependent Coefficients
Fabienne Comte and Valentine Genon-Catalot
SIAM/ASA Journal on Uncertainty Quantification, Volume 12, Issue 2, Pages 377-410, June 2024. DOI: 10.1137/23m1581662

Abstract. We consider [math] independent and identically distributed one-dimensional inhomogeneous diffusion processes [math] with drift [math] and diffusion coefficient [math], where [math] and the functions [math] and [math] are known. Our concern is the nonparametric estimation of the [math]-dimensional unknown function [math] from the continuous observation of the sample paths [math] throughout a fixed time interval [math]. A collection of projection estimators belonging to a product of finite-dimensional subspaces of [math] is built. The [math]-risk is defined by the expectation of either an empirical norm or a deterministic norm fitted to the problem. Rates of convergence for large [math] are discussed. A data-driven choice of the dimensions of the projection spaces is proposed. The theoretical results are illustrated by numerical experiments on simulated data.
Wavelet-Based Density Estimation for Persistent Homology
Konstantin Häberle, Barbara Bravi, and Anthea Monod
SIAM/ASA Journal on Uncertainty Quantification, Volume 12, Issue 2, Pages 347-376, June 2024. DOI: 10.1137/23m1573811

Abstract. Persistent homology is a central methodology in topological data analysis that has been successfully implemented in many fields and is becoming increasingly popular and relevant. The output of persistent homology is a persistence diagram—a multiset of points supported on the upper half-plane—that is often used as a statistical summary of the topological features of data. In this paper, we study the random nature of persistent homology and estimate the density of expected persistence diagrams from observations using wavelets; we show that our wavelet-based estimator is optimal. Furthermore, we propose an estimator that offers a sparse representation of the expected persistence diagram that achieves near-optimality. We demonstrate the utility of our contributions in a machine learning task in the context of dynamical systems.
Nonasymptotic Bounds for Suboptimal Importance Sampling
Carsten Hartmann and Lorenz Richter
SIAM/ASA Journal on Uncertainty Quantification, Volume 12, Issue 2, Pages 309-346, June 2024. DOI: 10.1137/21m1427760

Abstract. Importance sampling is a popular variance reduction method for Monte Carlo estimation, where an evident question is how to design good proposal distributions. While optimal (zero-variance) estimators are theoretically possible in most cases, in practice only suboptimal proposal distributions are available, and these can degrade statistical performance significantly, leading to large relative errors that counteract the original intention. Previous analyses of importance sampling have often focused on asymptotic arguments that work well in a large deviations regime. In this article, we provide lower and upper bounds on the relative error in a nonasymptotic setting. They depend on the deviation of the actual proposal from optimality, and we thus identify potential robustness issues that importance sampling may have, especially in high dimensions. We particularly focus on path sampling problems for diffusion processes with nonvanishing noise, for which generating good proposals comes with additional technical challenges. We provide numerous numerical examples that support our findings and demonstrate the applicability of the derived bounds.
Computing Statistical Moments Via Tensorization of Polynomial Chaos Expansions
Rafael Ballester-Ripoll
SIAM/ASA Journal on Uncertainty Quantification, Volume 12, Issue 2, Pages 289-308, June 2024. DOI: 10.1137/23m155428x

Abstract. We present an algorithm for estimating higher-order statistical moments of multidimensional functions expressed as polynomial chaos expansions (PCE). The algorithm starts by decomposing the PCE into a low-rank tensor network using a combination of tensor-train and Tucker decompositions. It then efficiently calculates the desired moments in the compressed tensor domain, leveraging the highly linear structure of the network. Using three benchmark engineering functions, we demonstrate that our approach offers substantial speed improvements over alternative algorithms while maintaining a minimal and adjustable approximation error. Additionally, our method can calculate moments even when the input variable distribution is altered, incurring only a small additional computational cost and without requiring retraining of the regressor.
Calculation of Epidemic First Passage and Peak Time Probability Distributions
Jacob Curran-Sebastian, Lorenzo Pellis, Ian Hall, and Thomas House
SIAM/ASA Journal on Uncertainty Quantification, Volume 12, Issue 2, Pages 242-261, June 2024. DOI: 10.1137/23m1548049

Abstract. Understanding the timing of the peak of a disease outbreak forms an important part of epidemic forecasting. In many cases, such information is essential for planning increased hospital bed demand and for the design of public health interventions. The time taken for an outbreak to become large is inherently stochastic and, therefore, uncertain, but after a sufficient number of infections has been reached the subsequent dynamics can be modeled accurately using ordinary differential equations. Here, we present analytical and numerical methods for approximating the time at which a stochastic model of a disease outbreak reaches a large number of cases and for quantifying the uncertainty arising from demographic stochasticity around that time. We then project this uncertainty forwards in time using an ordinary differential equation model in order to obtain a distribution for the peak timing of the epidemic that agrees closely with large simulations but that, for error tolerances relevant to most realistic applications, requires a fraction of the computational cost of full Monte Carlo approaches.
A Method of Moments Estimator for Interacting Particle Systems and their Mean Field Limit
Grigorios A. Pavliotis and Andrea Zanoni
SIAM/ASA Journal on Uncertainty Quantification, Volume 12, Issue 2, Pages 262-288, June 2024. DOI: 10.1137/22m153848x

Abstract. We study the problem of learning unknown parameters in stochastic interacting particle systems with polynomial drift, interaction, and diffusion functions from the path of a single particle in the system. Our estimator is obtained by solving a linear system which is constructed by imposing appropriate conditions on the moments of the invariant distribution of the mean field limit and on the quadratic variation of the process. Our approach is easy to implement as it only requires the approximation of the moments via the ergodic theorem and the solution of a low-dimensional linear system. Moreover, we prove that our estimator is asymptotically unbiased in the limits of infinite data and infinite number of particles (mean field limit). In addition, we present several numerical experiments that validate the theoretical analysis and show the effectiveness of our methodology to accurately infer parameters in systems of interacting particles.
Subsampling of Parametric Models with Bifidelity Boosting
Nuojin Cheng, Osman Asif Malik, Yiming Xu, Stephen Becker, Alireza Doostan, and Akil Narayan
SIAM/ASA Journal on Uncertainty Quantification, Volume 12, Issue 2, Pages 213-241, June 2024. DOI: 10.1137/22m1524989

Abstract. Least squares regression is a ubiquitous tool for building emulators (a.k.a. surrogate models) of problems across science and engineering for purposes such as design space exploration and uncertainty quantification. When the regression data are generated using an experimental design process (e.g., a quadrature grid) involving computationally expensive models, or when the data size is large, sketching techniques have shown promise at reducing the cost of the construction of the regression model while ensuring accuracy comparable to that of the full data. However, random sketching strategies, such as those based on leverage scores, lead to regression errors that are random and may exhibit large variability. To mitigate this issue, we present a novel boosting approach that leverages cheaper, lower-fidelity data of the problem at hand to identify the best sketch among a set of candidate sketches. This in turn specifies the sketch of the intended high-fidelity model and the associated data. We provide theoretical analyses of this bifidelity boosting (BFB) approach and discuss the conditions the low- and high-fidelity data must satisfy for a successful boosting. In doing so, we derive a bound on the residual norm of the BFB sketched solution relating it to its ideal, but computationally expensive, high-fidelity boosted counterpart. Empirical results on both manufactured and PDE data corroborate the theoretical analyses and illustrate the efficacy of the BFB solution in reducing the regression error, as compared to the nonboosted solution.
{"title":"Corrigendum: Quasi–Monte Carlo Finite Element Analysis for Wave Propagation in Heterogeneous Random Media","authors":"M. Ganesh, Frances Y. Kuo, Ian H. Sloan","doi":"10.1137/23m1624609","DOIUrl":"https://doi.org/10.1137/23m1624609","url":null,"abstract":"SIAM/ASA Journal on Uncertainty Quantification, Volume 12, Issue 1, Page 212-212, March 2024. <br/> Abstract.","PeriodicalId":56064,"journal":{"name":"Siam-Asa Journal on Uncertainty Quantification","volume":null,"pages":null},"PeriodicalIF":2.0,"publicationDate":"2024-03-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140562208","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}