{"title":"Convergence Rates for the Maximum A Posteriori Estimator in PDE-Regression Models with Random Design","authors":"Maximilian Siebel","doi":"arxiv-2409.03417","DOIUrl":null,"url":null,"abstract":"We consider the statistical inverse problem of recovering a parameter\n$\\theta\\in H^\\alpha$ from data arising from the Gaussian regression problem\n\\begin{equation*} Y = \\mathscr{G}(\\theta)(Z)+\\varepsilon \\end{equation*} with nonlinear forward\nmap $\\mathscr{G}:\\mathbb{L}^2\\to\\mathbb{L}^2$, random design points $Z$ and\nGaussian noise $\\varepsilon$. The estimation strategy is based on a least\nsquares approach under $\\Vert\\cdot\\Vert_{H^\\alpha}$-constraints. We establish\nthe existence of a least squares estimator $\\hat{\\theta}$ as a maximizer for a\ngiven functional under Lipschitz-type assumptions on the forward map\n$\\mathscr{G}$. A general concentration result is shown, which is used to prove\nconsistency and upper bounds for the prediction error. The corresponding rates\nof convergence reflect not only the smoothness of the parameter of interest but\nalso the ill-posedness of the underlying inverse problem. We apply the general\nmodel to the Darcy problem, where the recovery of an unknown coefficient\nfunction $f$ of a PDE is of interest. For this example, we also provide\ncorresponding rates of convergence for the prediction and estimation errors.\nAdditionally, we briefly discuss the applicability of the general model to\nother problems.","PeriodicalId":501379,"journal":{"name":"arXiv - STAT - Statistics Theory","volume":"68 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - STAT - Statistics Theory","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.03417","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
We consider the statistical inverse problem of recovering a parameter
$\theta\in H^\alpha$ from data arising from the Gaussian regression problem
\begin{equation*} Y = \mathscr{G}(\theta)(Z)+\varepsilon \end{equation*} with nonlinear forward
map $\mathscr{G}:\mathbb{L}^2\to\mathbb{L}^2$, random design points $Z$ and
Gaussian noise $\varepsilon$. The estimation strategy is based on a least
squares approach under $\Vert\cdot\Vert_{H^\alpha}$-constraints. We establish
the existence of a least squares estimator $\hat{\theta}$ as a maximizer of a
given functional under Lipschitz-type assumptions on the forward map
$\mathscr{G}$. We prove a general concentration result, which yields
consistency and upper bounds on the prediction error. The corresponding rates
of convergence reflect not only the smoothness of the parameter of interest but
also the ill-posedness of the underlying inverse problem. We apply the general
model to the Darcy problem, in which the goal is to recover an unknown
coefficient function $f$ of a PDE. For this example, we also provide
corresponding rates of convergence for the prediction and estimation errors.
Additionally, we briefly discuss the applicability of the general model to
other problems.
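
To make the estimation strategy concrete, the following is a minimal sketch of the $H^\alpha$-constrained least squares criterion, assuming $n$ i.i.d. observations $(Y_1,Z_1),\dots,(Y_n,Z_n)$ of the regression model and a constraint radius $R>0$ (neither symbol appears in the abstract; the paper fixes the exact formulation):
\begin{equation*} \hat{\theta} \in \operatorname*{arg\,min}_{\theta\in H^\alpha,\ \Vert\theta\Vert_{H^\alpha}\le R} \ \frac{1}{n}\sum_{i=1}^{n}\bigl(Y_i-\mathscr{G}(\theta)(Z_i)\bigr)^2. \end{equation*}
Equivalently, $\hat{\theta}$ maximizes the negative empirical squared loss, which matches the abstract's description of $\hat{\theta}$ as a maximizer of a given functional.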
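
For the Darcy application, a standard formulation (an assumption here; the paper specifies the domain, boundary conditions and parametrization) takes $\mathscr{G}$ to be the coefficient-to-solution map of a divergence-form elliptic PDE,
\begin{equation*} \nabla\cdot\bigl(f\,\nabla u_f\bigr) = g \ \text{on } \mathcal{O}, \qquad u_f = 0 \ \text{on } \partial\mathcal{O}, \qquad \mathscr{G}(f) = u_f, \end{equation*}
where $\mathcal{O}$ is a bounded domain, $g$ a known source term and $f$ the unknown coefficient; the parameter $\theta$ would then encode $f$, e.g. via a positivity-ensuring link function.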