Nicolás García Trillos, Ryan Murray, Matthew Thorpe
Rates of convergence for regression with the graph poly-Laplacian
Sampling Theory, Signal Processing, and Data Analysis, vol. 21, no. 2, article 35 (2023). DOI: 10.1007/s43670-023-00075-5. Published online 27 November 2023. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10682086/pdf/
Cited by: 1
Abstract
In the (special) smoothing spline problem one considers a variational problem with a quadratic data fidelity penalty and Laplacian regularization. Higher-order regularity can be obtained by replacing the Laplacian regulariser with a poly-Laplacian regulariser. The methodology is readily adapted to graphs, and here we consider graph poly-Laplacian regularization in a fully supervised, non-parametric, noise-corrupted regression problem. In particular, given a dataset $\{x_i\}_{i=1}^n$ and a set of noisy labels $\{y_i\}_{i=1}^n \subset \mathbb{R}$, we let $u_n : \{x_i\}_{i=1}^n \to \mathbb{R}$ be the minimizer of an energy which consists of a data fidelity term and an appropriately scaled graph poly-Laplacian term. When $y_i = g(x_i) + \xi_i$, for iid noise $\xi_i$, and using the geometric random graph, we identify (with high probability) the rate of convergence of $u_n$ to $g$ in the large-data limit $n \to \infty$. Furthermore, our rate is close to the known rate of convergence in the usual smoothing spline model.