Learning with centered reproducing kernels
Pub Date: 2024-02-01 | DOI: 10.1142/s0219530523400018
Chendi Wang, Xin Guo, Qiang Wu
Kernel-based learning algorithms have been studied extensively over the past two decades for their successful applications in scientific research and industrial problem solving. In classical kernel methods, such as kernel ridge regression and support vector machines, an unregularized offset term appears naturally. While its importance can be defended in some situations, it is debatable in others. It is commonly agreed, however, that the offset term introduces essential challenges to the optimization and theoretical analysis of these algorithms. In this paper, we demonstrate that kernel ridge regression (KRR) with an offset is closely connected to regularization schemes involving centered reproducing kernels. With the aid of this connection and the theory of centered reproducing kernels, we establish generalization error bounds for KRR with an offset. These bounds show that the algorithm can achieve minimax optimal rates.
Distributed robust regression with correntropy losses and regularization kernel networks
Pub Date: 2024-02-01 | DOI: 10.1142/s0219530523500355
Ting Hu, Renjie Guo
Distributed learning has attracted considerable attention in recent years for its power to handle big data in various science and engineering problems. Based on a divide-and-conquer strategy, this paper studies a distributed robust regression algorithm that combines correntropy losses with coefficient regularization in the scheme of kernel networks, where the kernel functions are not required to be symmetric or positive semi-definite. We establish explicit convergence results for this distributed algorithm in terms of the number of data partitions and the robustness and regularization parameters. We show that, with suitable parameter choices, the distributed robust algorithm attains the minimax optimal convergence rate while reducing the computational complexity and memory requirements of the standard (non-distributed) algorithm.
Remarks on the normal hyperbolic mean curvature flow
Pub Date: 2024-02-01 | DOI: 10.1142/s0219530523500343
Qian Cheng, Chun-Lei He, Shou-Jun Huang
In this paper, we investigate various aspects of the normal hyperbolic mean curvature flow introduced by LeFloch and Smoczyk. Remarkably, the equation admits the null condition in the three-dimensional case and satisfies only the first null condition when [Formula: see text]. Based on these findings, we obtain global existence of smooth solutions, as well as stability of hyperplanes under this flow when [Formula: see text], which relates to the famous Bernstein theorem. Some explicit solutions of this flow are also derived. We emphasize that the null structures of this hyperbolic mean curvature flow have not been observed before.