Without imposing light-tailed noise assumptions, we prove that Tikhonov regularization for Gaussian Empirical Gain Maximization (EGM) in a reproducing kernel Hilbert space (RKHS) is consistent, and we further establish its fast exponential-type convergence rates. In the literature, Gaussian EGM was proposed in various contexts to tackle robust estimation problems and has been applied extensively in a great variety of real-world applications. An RKHS is frequently chosen as the hypothesis space, and Tikhonov regularization plays a crucial role in model selection. Although Gaussian EGM itself has been studied theoretically in a series of recent papers and is by now well understood, the theoretical understanding of its Tikhonov-regularized variants in an RKHS is still limited. Several fundamental challenges remain, especially when light-tailed noise assumptions are absent. To fill this gap and address these challenges, we conduct the present study and make the following contributions. First, under weak moment conditions, we establish a new comparison theorem that enables the investigation of the asymptotic mean calibration properties of regularized Gaussian EGM. Second, under the same weak moment conditions, we show that regularized Gaussian EGM estimators are consistent and further establish their fast exponential-type convergence rates. Our study justifies the feasibility of regularized Gaussian EGM in tackling robust regression problems and explains its robustness from a theoretical viewpoint. Moreover, new technical tools, including probabilistic initial upper bounds, confined effective hypothesis spaces, and novel comparison theorems, are introduced and developed, which can facilitate the analysis of general regularized empirical gain maximization schemes of the same vein as regularized Gaussian EGM.
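To make the estimation scheme under study concrete, the following is a minimal numerical sketch of regularized Gaussian EGM, not an implementation from the paper. It maximizes the empirical Gaussian gain penalized by the squared RKHS norm, i.e. (1/n) Σ_i exp(−(y_i − f(x_i))²/σ²) − λ‖f‖²_K over f = Σ_j α_j K(x_j, ·), using a standard half-quadratic strategy in which each step reduces to a weighted kernel ridge solve. The kernel choice, bandwidths, and the half-quadratic solver are all illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gram matrix of the Gaussian (RBF) reproducing kernel.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def regularized_gaussian_egm(X, y, sigma=1.0, lam=0.01, gamma=1.0, n_iter=50):
    """Sketch: fit f = sum_j alpha_j K(x_j, .) by maximizing
    (1/n) sum_i exp(-(y_i - f(x_i))^2 / sigma^2) - lam * ||f||_K^2.
    Half-quadratic optimization: with weights w_i = exp(-r_i^2 / sigma^2)
    held fixed, the surrogate problem is weighted kernel ridge regression,
    whose first-order condition is (W K + n lam sigma^2 I) alpha = W y."""
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    alpha = np.zeros(n)
    for _ in range(n_iter):
        r = y - K @ alpha                      # current residuals
        w = np.exp(-(r ** 2) / sigma ** 2)     # Gaussian gain weights
        A = w[:, None] * K + n * lam * sigma ** 2 * np.eye(n)
        alpha = np.linalg.solve(A, w * y)      # weighted ridge update
    return alpha, K

# Toy illustration: a smooth target with one gross outlier. The Gaussian
# gain downweights the outlier (its weight is exp(-r^2/sigma^2) ~ 0),
# so the fit tracks the inliers.
X = np.linspace(0.0, 3.0, 30).reshape(-1, 1)
y = np.sin(X).ravel()
y[10] = 10.0  # inject an outlier
alpha, K = regularized_gaussian_egm(X, y)
pred = K @ alpha
```

The key point of the sketch is that robustness enters only through the residual-dependent weights: observations with large residuals contribute exponentially small weight to each ridge solve, which is the mechanism whose consistency the paper analyzes without light-tailed noise assumptions.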