Stochastic Gradient Descent with Preconditioned Polyak Step-Size

IF 0.7 · CAS Tier 4 (Mathematics) · JCR Q3 (Mathematics, Applied) · Computational Mathematics and Mathematical Physics · Pub Date: 2024-06-07 · DOI: 10.1134/s0965542524700052
F. Abdukhakimov, C. Xiang, D. Kamzolov, M. Takáč
{"title":"采用预处理波利克步长的随机梯度下降法","authors":"F. Abdukhakimov, C. Xiang, D. Kamzolov, M. Takáč","doi":"10.1134/s0965542524700052","DOIUrl":null,"url":null,"abstract":"<h3 data-test=\"abstract-sub-heading\">Abstract</h3><p>Stochastic Gradient Descent (SGD) is one of the many iterative optimization methods that are widely used in solving machine learning problems. These methods display valuable properties and attract researchers and industrial machine learning engineers with their simplicity. However, one of the weaknesses of this type of methods is the necessity to tune learning rate (step-size) for every loss function and dataset combination to solve an optimization problem and get an efficient performance in a given time budget. Stochastic Gradient Descent with Polyak Step-size (SPS) is a method that offers an update rule that alleviates the need of fine-tuning the learning rate of an optimizer. In this paper, we propose an extension of SPS that employs preconditioning techniques, such as Hutchinson’s method, Adam, and AdaGrad, to improve its performance on badly scaled and/or ill-conditioned datasets.</p>","PeriodicalId":55230,"journal":{"name":"Computational Mathematics and Mathematical Physics","volume":null,"pages":null},"PeriodicalIF":0.7000,"publicationDate":"2024-06-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Stochastic Gradient Descent with Preconditioned Polyak Step-Size\",\"authors\":\"F. Abdukhakimov, C. Xiang, D. Kamzolov, M. Takáč\",\"doi\":\"10.1134/s0965542524700052\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<h3 data-test=\\\"abstract-sub-heading\\\">Abstract</h3><p>Stochastic Gradient Descent (SGD) is one of the many iterative optimization methods that are widely used in solving machine learning problems. These methods display valuable properties and attract researchers and industrial machine learning engineers with their simplicity. However, one of the weaknesses of this type of methods is the necessity to tune learning rate (step-size) for every loss function and dataset combination to solve an optimization problem and get an efficient performance in a given time budget. Stochastic Gradient Descent with Polyak Step-size (SPS) is a method that offers an update rule that alleviates the need of fine-tuning the learning rate of an optimizer. 
In this paper, we propose an extension of SPS that employs preconditioning techniques, such as Hutchinson’s method, Adam, and AdaGrad, to improve its performance on badly scaled and/or ill-conditioned datasets.</p>\",\"PeriodicalId\":55230,\"journal\":{\"name\":\"Computational Mathematics and Mathematical Physics\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.7000,\"publicationDate\":\"2024-06-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computational Mathematics and Mathematical Physics\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1134/s0965542524700052\",\"RegionNum\":4,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"MATHEMATICS, APPLIED\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computational Mathematics and Mathematical Physics","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1134/s0965542524700052","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
Citations: 0


Abstract

Stochastic Gradient Descent (SGD) is one of the many iterative optimization methods widely used in solving machine learning problems. These methods display valuable properties and attract researchers and industrial machine learning engineers with their simplicity. However, one weakness of this class of methods is the need to tune the learning rate (step size) for every combination of loss function and dataset in order to solve an optimization problem efficiently within a given time budget. Stochastic Gradient Descent with Polyak Step-size (SPS) offers an update rule that alleviates the need to fine-tune the learning rate of an optimizer. In this paper, we propose an extension of SPS that employs preconditioning techniques, such as Hutchinson’s method, Adam, and AdaGrad, to improve its performance on badly scaled and/or ill-conditioned datasets.
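Since the abstract names the update rule but does not state it, here is a minimal sketch of the idea, assuming the capped form of the Polyak step-size, gamma_t = min{(f_i(x_t) - f_i*) / (c · ||∇f_i(x_t)||²), gamma_max}, with f_i* = 0 (the usual interpolation assumption). The preconditioned variant below swaps the Euclidean norm for the norm induced by a diagonal preconditioner D, built here AdaGrad-style from accumulated squared gradients (one of the three options the abstract lists). The function names sps_step and psps_step and the constants c = 0.5 and gamma_max = 1 are illustrative choices, not the authors' implementation.

```python
import numpy as np

def sps_step(x, loss, grad, f_star=0.0, c=0.5, gamma_max=1.0):
    """SGD step with the capped Polyak step-size:
    gamma = min((f_i(x) - f_i*) / (c * ||g||^2), gamma_max)."""
    g = grad(x)
    gamma = min((loss(x) - f_star) / (c * (g @ g) + 1e-12), gamma_max)
    return x - gamma * g

def psps_step(x, loss, grad, G_acc, f_star=0.0, c=0.5, gamma_max=1.0, eps=1e-8):
    """Preconditioned sketch: an AdaGrad-style diagonal D from accumulated
    squared gradients; the Polyak numerator is divided by the induced norm
    ||g||_{D^{-1}}^2 = g^T D^{-1} g, and the step moves along D^{-1} g."""
    g = grad(x)
    G_acc += g * g                     # running sum of squared gradients
    d = np.sqrt(G_acc) + eps           # diagonal of the preconditioner D
    g_prec = g / d                     # D^{-1} g
    gamma = min((loss(x) - f_star) / (c * (g @ g_prec) + 1e-12), gamma_max)
    return x - gamma * g_prec, G_acc

# Toy check on a badly scaled quadratic f(x) = 0.5 x^T A x, whose minimum is 0:
A = np.diag([1.0, 1e4])
loss = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x, G = np.array([1.0, 1.0]), np.zeros(2)
for _ in range(200):
    x, G = psps_step(x, loss, grad, G)
print(loss(x))   # small despite the 1e4 gap between curvatures
```

In the stochastic setting, loss and grad would be evaluated on a sampled mini-batch, and the paper's Hutchinson-based variant would instead estimate the Hessian diagonal from Hessian-vector products with random sign vectors; neither is implemented in this sketch.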

Source journal: Computational Mathematics and Mathematical Physics
Categories: Mathematics, Applied; Physics, Mathematical
CiteScore: 1.50
Self-citation rate: 14.30%
Articles per year: 125
Review time: 4-8 weeks
About the journal: Computational Mathematics and Mathematical Physics is a monthly journal published in collaboration with the Russian Academy of Sciences. The journal includes reviews and original papers on computational mathematics, computational methods of mathematical physics, informatics, and other mathematical sciences. The journal welcomes reviews and original articles from all countries in English or Russian.
Latest articles in this journal:
- Difference Operator Approximations on Nonstandard Rectangular Grid
- The MDM Algorithm and the Sylvester Problem
- Regularization of the Solution to Degenerate Systems of Algebraic Equations Exemplified by Identification of the Virial Equation of State of a Real Gas
- New Classes of Solutions of the σ-Commutation Problem (σ ≠ 0, ±1) for Toeplitz and Hankel Matrices within a Unified Approach
- Complex Narayana Quaternions