{"title":"基于系数的正则化分布回归","authors":"Yuan Mao , Lei Shi , Zheng-Chu Guo","doi":"10.1016/j.jat.2023.105995","DOIUrl":null,"url":null,"abstract":"<div><p>In this paper, we consider the coefficient-based regularized distribution regression which aims to regress from probability measures to real-valued responses over a reproducing kernel Hilbert space (RKHS), where the regularization is put on the coefficients and kernels are assumed to be indefinite. The algorithm involves two stages of sampling, the first stage sample consists of distributions and the second stage sample is obtained from these distributions. The asymptotic behavior of the algorithm is comprehensively studied across different regularity ranges of the regression function. Explicit learning rates are derived by using kernel mean embedding and integral operator techniques. We obtain the optimal rates under some mild conditions, which match the one-stage sampled minimax optimal rate. Compared with the kernel methods for distribution regression in existing literature, the algorithm under consideration does not require the kernel to be symmetric or positive semi-definite and hence provides a simple paradigm for designing indefinite kernel methods, which enriches the theme of the distribution regression. To the best of our knowledge, this is the first result for distribution regression with indefinite kernels, and our algorithm can improve the learning performance against saturation effect.</p></div>","PeriodicalId":54878,"journal":{"name":"Journal of Approximation Theory","volume":null,"pages":null},"PeriodicalIF":0.9000,"publicationDate":"2023-11-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Coefficient-based regularized distribution regression\",\"authors\":\"Yuan Mao , Lei Shi , Zheng-Chu Guo\",\"doi\":\"10.1016/j.jat.2023.105995\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>In this paper, we consider the coefficient-based regularized distribution regression which aims to regress from probability measures to real-valued responses over a reproducing kernel Hilbert space (RKHS), where the regularization is put on the coefficients and kernels are assumed to be indefinite. The algorithm involves two stages of sampling, the first stage sample consists of distributions and the second stage sample is obtained from these distributions. The asymptotic behavior of the algorithm is comprehensively studied across different regularity ranges of the regression function. Explicit learning rates are derived by using kernel mean embedding and integral operator techniques. We obtain the optimal rates under some mild conditions, which match the one-stage sampled minimax optimal rate. Compared with the kernel methods for distribution regression in existing literature, the algorithm under consideration does not require the kernel to be symmetric or positive semi-definite and hence provides a simple paradigm for designing indefinite kernel methods, which enriches the theme of the distribution regression. 
To the best of our knowledge, this is the first result for distribution regression with indefinite kernels, and our algorithm can improve the learning performance against saturation effect.</p></div>\",\"PeriodicalId\":54878,\"journal\":{\"name\":\"Journal of Approximation Theory\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.9000,\"publicationDate\":\"2023-11-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Approximation Theory\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0021904523001338\",\"RegionNum\":3,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"MATHEMATICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Approximation Theory","FirstCategoryId":"100","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0021904523001338","RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATHEMATICS","Score":null,"Total":0}
Coefficient-based regularized distribution regression
Abstract: In this paper, we consider coefficient-based regularized distribution regression, which aims to regress from probability measures to real-valued responses over a reproducing kernel Hilbert space (RKHS); the regularization is imposed on the coefficients, and the kernels may be indefinite. The algorithm involves two stages of sampling: the first-stage sample consists of distributions, and the second-stage sample is drawn from these distributions. The asymptotic behavior of the algorithm is studied comprehensively across different regularity ranges of the regression function, and explicit learning rates are derived using kernel mean embedding and integral operator techniques. Under mild conditions, we obtain optimal rates that match the one-stage sampled minimax optimal rate. Compared with the kernel methods for distribution regression in the existing literature, the algorithm under consideration does not require the kernel to be symmetric or positive semi-definite, and hence provides a simple paradigm for designing indefinite kernel methods, enriching the theme of distribution regression. To the best of our knowledge, this is the first result for distribution regression with indefinite kernels, and our algorithm can improve learning performance by overcoming the saturation effect.
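To make the two-stage scheme concrete, below is a minimal illustrative sketch in Python/NumPy of a coefficient-based regularized estimator of this general kind; it is not the paper's exact formulation. The kernel value between two distributions is approximated by averaging a base kernel over the second-stage samples (an empirical surrogate for working with kernel mean embeddings), and the coefficients solve a ridge-type problem whose linear system is well-posed even when the base kernel, here a sigmoid kernel, is indefinite. The function names, the choice of base kernel, and the scaling of the penalty term are assumptions made for illustration.

```python
import numpy as np

def sigmoid_kernel(X, Y, a=0.5, b=1.0):
    """A classic indefinite (non-positive-semi-definite) base kernel."""
    return np.tanh(a * X @ Y.T + b)

def distribution_gram(samples_list, base_kernel):
    """Approximate the kernel between distributions by averaging the base
    kernel over the second-stage samples drawn from each distribution."""
    m = len(samples_list)
    G = np.empty((m, m))
    for i in range(m):
        for j in range(m):
            G[i, j] = base_kernel(samples_list[i], samples_list[j]).mean()
    return G

def fit_coefficients(G, y, lam):
    """Coefficient-based regularization: minimize
        (1/m) * ||G @ alpha - y||**2 + lam * ||alpha||**2
    over alpha.  The normal equations below are solvable for any lam > 0,
    whether or not G is symmetric or positive semi-definite."""
    m = len(y)
    A = G.T @ G / m + lam * np.eye(m)
    b = G.T @ y / m
    return np.linalg.solve(A, b)

def predict(alpha, samples_list, new_samples, base_kernel):
    """Evaluate f(x) = sum_j alpha_j * K(x, x_j) at a new distribution,
    again represented only through a finite sample."""
    k = np.array([base_kernel(new_samples, s).mean() for s in samples_list])
    return float(alpha @ k)

# Toy two-stage data: 30 Gaussian distributions (first stage), 50 draws from
# each (second stage); the response is the mean of each distribution.
rng = np.random.default_rng(0)
means = np.linspace(-2.0, 2.0, 30)
samples = [rng.normal(loc=t, scale=1.0, size=(50, 1)) for t in means]

G = distribution_gram(samples, sigmoid_kernel)
alpha = fit_coefficients(G, means, lam=1e-3)
print(predict(alpha, samples, rng.normal(loc=0.7, scale=1.0, size=(50, 1)), sigmoid_kernel))
```

The design point mirrored here is that the penalty acts on the coefficient vector rather than on an RKHS norm, so nothing in the fitting step requires the Gram matrix to be symmetric or positive semi-definite; the paper's contribution concerns the learning-rate analysis of estimators of this type under two-stage sampling.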
Journal introduction:
The Journal of Approximation Theory is devoted to advances in pure and applied approximation theory and related areas. These areas include, among others:
• Classical approximation
• Abstract approximation
• Constructive approximation
• Degree of approximation
• Fourier expansions
• Interpolation of operators
• General orthogonal systems
• Interpolation and quadratures
• Multivariate approximation
• Orthogonal polynomials
• Padé approximation
• Rational approximation
• Spline functions of one and several variables
• Approximation by radial basis functions in Euclidean spaces, on spheres, and on more general manifolds
• Special functions with strong connections to classical harmonic analysis, orthogonal polynomials, and approximation theory (as opposed to combinatorics, number theory, representation theory, generating functions, formal theory, and so forth)
• Approximation theoretic aspects of real or complex function theory, difference or differential equations, function spaces, or harmonic analysis
• Wavelet Theory and its applications in signal and image processing, and in differential equations with special emphasis on connections between wavelet theory and elements of approximation theory (such as approximation orders, Besov and Sobolev spaces, and so forth)
• Gabor (Weyl-Heisenberg) expansions and sampling theory.