{"title":"Consistency of Fractional Graph-Laplacian Regularization in Semisupervised Learning with Finite Labels","authors":"Adrien Weihs, Matthew Thorpe","doi":"10.1137/23m1559087","DOIUrl":null,"url":null,"abstract":"SIAM Journal on Mathematical Analysis, Volume 56, Issue 4, Page 4253-4295, August 2024. <br/> Abstract. Laplace learning is a popular machine learning algorithm for finding missing labels from a small number of labeled feature vectors using the geometry of a graph. More precisely, Laplace learning is based on minimizing a graph-Dirichlet energy, equivalently a discrete Sobolev [math] seminorm, constrained to taking the values of known labels on a given subset. The variational problem is asymptotically ill-posed as the number of unlabeled feature vectors goes to infinity for finite given labels due to a lack of regularity in minimizers of the continuum Dirichlet energy in any dimension higher than one. In particular, continuum minimizers are not continuous. One solution is to consider higher-order regularization, which is the analogue of minimizing Sobolev [math] seminorms. In this paper we consider the asymptotics of minimizing a graph variant of the Sobolev [math] seminorm with pointwise constraints. We show that, as expected, one needs [math], where [math] is the dimension of the data manifold. We also show that there must be an upper bound on the connectivity of the graph; that is, highly connected graphs lead to degenerate behavior of the minimizer even when [math].","PeriodicalId":51150,"journal":{"name":"SIAM Journal on Mathematical Analysis","volume":null,"pages":null},"PeriodicalIF":2.2000,"publicationDate":"2024-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"SIAM Journal on Mathematical Analysis","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1137/23m1559087","RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
Citations: 0
Abstract
Laplace learning is a popular machine learning algorithm for finding missing labels from a small number of labeled feature vectors using the geometry of a graph. More precisely, Laplace learning is based on minimizing a graph-Dirichlet energy, equivalently a discrete Sobolev $H^1$ seminorm, constrained to taking the values of known labels on a given subset. The variational problem is asymptotically ill-posed as the number of unlabeled feature vectors goes to infinity for finite given labels, due to a lack of regularity in minimizers of the continuum Dirichlet energy in any dimension higher than one. In particular, continuum minimizers are not continuous. One solution is to consider higher-order regularization, which is the analogue of minimizing Sobolev $H^s$ seminorms. In this paper we consider the asymptotics of minimizing a graph variant of the Sobolev $H^s$ seminorm with pointwise constraints. We show that, as expected, one needs $s > d/2$, where $d$ is the dimension of the data manifold. We also show that there must be an upper bound on the connectivity of the graph; that is, highly connected graphs lead to degenerate behavior of the minimizer even when $s > d/2$.
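Below is a minimal Python sketch of the construction the abstract describes: classical Laplace learning (minimizing $u^\top L u$ subject to the known labels) and a fractional variant (minimizing $u^\top L^s u$, with $L^s$ defined spectrally). The Gaussian-weighted graph, the spectral definition of $L^s$, and all function and variable names are illustrative assumptions for this sketch, not the paper's exact discretization.

```python
# A minimal sketch of Laplace learning and a fractional variant on a toy
# dataset. The Gaussian kernel graph and spectral fractional power are
# illustrative assumptions; they are not taken from the paper.
import numpy as np

def graph_laplacian(X, eps=0.3):
    """Unnormalized graph Laplacian with a Gaussian kernel of bandwidth eps."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-d2 / eps**2)
    np.fill_diagonal(W, 0.0)          # no self-loops
    return np.diag(W.sum(axis=1)) - W

def fractional_laplace_learning(L, labeled_idx, labels, s=1.0):
    """Minimize u^T L^s u subject to u[i] = labels[i] on the labeled set.

    s = 1 recovers classical Laplace learning; s > d/2 is the regime the
    paper identifies as necessary for well-posedness in the continuum limit.
    """
    # Fractional power via the spectral decomposition (L is symmetric PSD).
    lam, V = np.linalg.eigh(L)
    Ls = V @ np.diag(np.maximum(lam, 0.0) ** s) @ V.T

    n = L.shape[0]
    unlabeled = np.setdiff1d(np.arange(n), labeled_idx)
    u = np.zeros(n)
    u[labeled_idx] = labels
    # First-order optimality: (L^s u) = 0 on unlabeled nodes; solve for them.
    A = Ls[np.ix_(unlabeled, unlabeled)]
    b = -Ls[np.ix_(unlabeled, labeled_idx)] @ labels
    u[unlabeled] = np.linalg.solve(A, b)
    return u

# Toy usage: two clusters on the line, one label per cluster.
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(-1, 0.1, (20, 1)), rng.normal(1, 0.1, (20, 1))])
L = graph_laplacian(X)
u = fractional_laplace_learning(L, labeled_idx=np.array([0, 20]),
                                labels=np.array([-1.0, 1.0]), s=1.0)
print(np.sign(u[:5]), np.sign(u[-5:]))  # labels propagate within each cluster
```

In this small example even $s = 1$ labels the clusters correctly; the paper's analysis concerns the asymptotic regime where the number of unlabeled points grows with the labeled set held fixed, which is where $s > d/2$ and an upper bound on the graph's connectivity become necessary to avoid degenerate minimizers.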
Journal Description:
SIAM Journal on Mathematical Analysis (SIMA) features research articles of the highest quality employing innovative analytical techniques to treat problems in the natural sciences. Every paper has content that is primarily analytical and that employs mathematical methods in such areas as partial differential equations, the calculus of variations, functional analysis, approximation theory, harmonic or wavelet analysis, or dynamical systems. Additionally, every paper relates to a model for natural phenomena in such areas as fluid mechanics, materials science, quantum mechanics, biology, mathematical physics, or to the computational analysis of such phenomena.
Submission of a manuscript to a SIAM journal is a representation by the author that the manuscript has not been published elsewhere and has not been simultaneously submitted for publication elsewhere.
Typical papers for SIMA do not exceed 35 journal pages. Substantial deviations from this page limit require that the referees, editor, and editor-in-chief be convinced that the increased length is both required by the subject matter and justified by the quality of the paper.