On the convergence of gradient descent for robust functional linear regression

Journal of Complexity · IF 1.8 · CAS Zone 2, Mathematics · JCR Q1, Mathematics · Pub Date: 2024-04-30 · DOI: 10.1016/j.jco.2024.101858
Cheng Wang, Jun Fan
{"title":"On the convergence of gradient descent for robust functional linear regression","authors":"Cheng Wang ,&nbsp;Jun Fan","doi":"10.1016/j.jco.2024.101858","DOIUrl":null,"url":null,"abstract":"<div><p>Functional data analysis offers a set of statistical methods concerned with extracting insights from intrinsically infinite-dimensional data and has attracted considerable amount of attentions in the past few decades. In this paper, we study robust functional linear regression model with a scalar response and a functional predictor in the framework of reproducing kernel Hilbert spaces. A gradient descent algorithm with early stopping is introduced to solve the corresponding empirical risk minimization problem associated with robust loss functions. By appropriately selecting the early stopping rule and the scaling parameter of the robust losses, the convergence of the proposed algorithm is established when the response variable is bounded or satisfies a moment condition. Explicit learning rates with respect to both estimation and prediction error are provided in terms of regularity of the regression function and eigenvalue decay rate of the integral operator induced by the reproducing kernel and covariance function.</p></div>","PeriodicalId":50227,"journal":{"name":"Journal of Complexity","volume":null,"pages":null},"PeriodicalIF":1.8000,"publicationDate":"2024-04-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Complexity","FirstCategoryId":"100","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0885064X24000359","RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS","Score":null,"Total":0}
Citations: 0

Abstract

Functional data analysis offers a set of statistical methods for extracting insights from intrinsically infinite-dimensional data and has attracted considerable attention over the past few decades. In this paper, we study a robust functional linear regression model with a scalar response and a functional predictor in the framework of reproducing kernel Hilbert spaces. A gradient descent algorithm with early stopping is introduced to solve the corresponding empirical risk minimization problem associated with robust loss functions. By appropriately selecting the early stopping rule and the scaling parameter of the robust losses, the convergence of the proposed algorithm is established when the response variable is bounded or satisfies a moment condition. Explicit learning rates for both estimation and prediction error are provided in terms of the regularity of the regression function and the eigenvalue decay rate of the integral operator induced by the reproducing kernel and the covariance function.
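
The paper itself provides no code; the following is a minimal, illustrative sketch of the kind of procedure the abstract describes: gradient descent with early stopping applied to the empirical risk of a functional linear model under a Huber-type robust loss with scaling parameter sigma, with the slope function represented through a reproducing kernel on a discretization grid. The Gaussian kernel, the function names (fit_robust_flr, huber_grad), and all parameter values are assumptions made for illustration, not details taken from the paper.

```python
import numpy as np

def gaussian_kernel(s, t, bandwidth=0.2):
    """Reproducing kernel K(s, t) on [0, 1] (illustrative choice)."""
    return np.exp(-((s - t) ** 2) / (2 * bandwidth ** 2))

def huber_grad(r, sigma=1.0):
    """Derivative of the Huber loss with scaling parameter sigma."""
    return np.where(np.abs(r) <= sigma, r, sigma * np.sign(r))

def fit_robust_flr(X, y, grid, T=200, sigma=1.0):
    """Gradient descent with early stopping after T iterations.

    X    : (n, m) array, functional predictors sampled on `grid`
    y    : (n,) array, scalar responses
    grid : (m,) array, sampling points in [0, 1]
    Returns coefficients alpha with beta(t) = sum_j alpha[j] * K(t, grid[j]).
    """
    n, m = X.shape
    w = np.full(m, (grid[-1] - grid[0]) / (m - 1))   # simple quadrature weights
    K = gaussian_kernel(grid[:, None], grid[None, :])
    # Phi[i, j] approximates the integral of X_i(t) * K(t, grid[j]) dt,
    # so the linear predictor <X_i, beta> is (Phi @ alpha)[i].
    Phi = (X * w) @ K
    L = np.linalg.norm(Phi, ord=2) ** 2 / n          # smoothness bound of the risk
    eta = 1.0 / L                                    # constant, stable step size
    alpha = np.zeros(m)                              # start from the zero function
    for _ in range(T):                               # early stopping: only T steps
        residual = Phi @ alpha - y
        alpha -= eta * Phi.T @ huber_grad(residual, sigma) / n
    return alpha

# Synthetic example: beta*(t) = sin(2*pi*t), heavy-tailed (Student-t) noise.
rng = np.random.default_rng(0)
grid = np.linspace(0.0, 1.0, 50)
basis = np.sin(np.outer(np.arange(1, 6), np.pi * grid))   # 5 smooth components
X = rng.standard_normal((200, 5)) @ basis
beta_true = np.sin(2 * np.pi * grid)
y = X @ beta_true / len(grid) + 0.5 * rng.standard_t(df=2, size=200)
alpha_hat = fit_robust_flr(X, y, grid, T=300, sigma=1.0)
beta_hat = gaussian_kernel(grid[:, None], grid[None, :]) @ alpha_hat
```

In this sketch the early-stopping iteration T and the Huber scale sigma play the roles of the tuning parameters whose joint choice drives the convergence rates analyzed in the paper; in practice both would be selected from the data, for example by cross-validation.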

Source Journal

Journal of Complexity (Engineering & Technology - Computer Science: Theory & Methods)
CiteScore: 3.10
Self-citation rate: 17.60%
Articles per year: 57
Review time: >12 weeks
Journal Description

The multidisciplinary Journal of Complexity publishes original research papers that contain substantial mathematical results on complexity as broadly conceived. Outstanding review papers will also be published. In the area of computational complexity, the focus is on complexity over the reals, with an emphasis on lower bounds and optimal algorithms. The Journal of Complexity also publishes articles that provide major new algorithms or make important progress on upper bounds. Other models of computation, such as the Turing machine model, are also of interest. Computational complexity results in a wide variety of areas are solicited. Areas include:
• Approximation theory
• Biomedical computing
• Compressed computing and sensing
• Computational finance
• Computational number theory
• Computational stochastics
• Control theory
• Cryptography
• Design of experiments
• Differential equations
• Discrete problems
• Distributed and parallel computation
• High- and infinite-dimensional problems
• Information-based complexity
• Inverse and ill-posed problems
• Machine learning
• Markov chain Monte Carlo
• Monte Carlo and quasi-Monte Carlo
• Multivariate integration and approximation
• Noisy data
• Nonlinear and algebraic equations
• Numerical analysis
• Operator equations
• Optimization
• Quantum computing
• Scientific computation
• Tractability of multivariate problems
• Vision and image understanding
Latest Articles from this Journal

Stefan Heinrich is the Winner of the 2024 Best Paper Award of the Journal of Complexity
Best Paper Award of the Journal of Complexity
Matthieu Dolbeault is the winner of the 2024 Joseph F. Traub Information-Based Complexity Young Researcher Award
Optimal recovery of linear operators from information of random functions
Intractability results for integration in tensor product spaces