Accelerate Distributed Stochastic Descent for Nonconvex Optimization with Momentum

Foundations and Trends in Machine Learning (IF 65.3, Q1, COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE) · Pub Date: 2020-11-01 · DOI: 10.1109/MLHPCAI4S51975.2020.00011
Guojing Cong, Tianyi Liu
{"title":"Accelerate Distributed Stochastic Descent for Nonconvex Optimization with Momentum","authors":"Guojing Cong, Tianyi Liu","doi":"10.1109/MLHPCAI4S51975.2020.00011","DOIUrl":null,"url":null,"abstract":"Momentum method has been used extensively in optimizers for deep learning. Recent studies show that distributed training through K-step averaging has many nice properties. We propose a momentum method for such model averaging approaches. At each individual learner level traditional stochastic gradient is applied. At the meta-level (global learner level), one momentum term is applied and we call it block momentum. We analyze the convergence and scaling properties of such momentum methods. Our experimental results show that block momentum not only accelerates training, but also achieves better results.","PeriodicalId":47667,"journal":{"name":"Foundations and Trends in Machine Learning","volume":"517 1","pages":"29-39"},"PeriodicalIF":65.3000,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Foundations and Trends in Machine Learning","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/MLHPCAI4S51975.2020.00011","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
引用次数: 0

Abstract

Momentum methods have been used extensively in optimizers for deep learning. Recent studies show that distributed training through K-step averaging has many desirable properties. We propose a momentum method for such model-averaging approaches. At the level of each individual learner, traditional stochastic gradient descent is applied. At the meta level (the global-learner level), a single momentum term is applied, which we call block momentum. We analyze the convergence and scaling properties of such momentum methods. Our experimental results show that block momentum not only accelerates training but also achieves better results.
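The abstract does not spell out the update rule, so the following is a minimal sketch of how K-step averaging with a momentum term applied only at the global (meta) level might be organized. The function names, the momentum coefficient `beta`, and the way the averaged per-round delta is folded into the momentum buffer are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def local_sgd(w, shard, grad_fn, lr, k):
    """Plain SGD on one learner: K steps on that learner's data shard."""
    for x in shard[:k]:
        w = w - lr * grad_fn(w, x)
    return w

def block_momentum_round(w_global, v, shards, grad_fn, lr=0.01, k=8, beta=0.9):
    """One communication round: K local SGD steps per learner, model averaging,
    then a single momentum ("block momentum") update at the global level."""
    # Each learner starts from the current global model and runs K SGD steps.
    local_models = [local_sgd(w_global.copy(), s, grad_fn, lr, k) for s in shards]
    # K-step averaging: combine the local models into one averaged model.
    w_avg = np.mean(local_models, axis=0)
    # The averaged progress of this round plays the role of a gradient at the meta level.
    delta = w_avg - w_global
    # Block momentum: accumulate the per-round delta with coefficient beta (assumed form).
    v = beta * v + delta
    return w_global + v, v
```

In this sketch the momentum state `v` lives only at the global level; the individual learners run plain SGD, matching the abstract's description of traditional stochastic gradient at the learner level and a single momentum term at the meta level.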
Source Journal

Foundations and Trends in Machine Learning (COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE)
CiteScore: 108.50
Self-citation rate: 0.00%
Articles per year: 5
Journal Description: Each issue of Foundations and Trends® in Machine Learning comprises a monograph of at least 50 pages written by research leaders in the field. We aim to publish monographs that provide an in-depth, self-contained treatment of topics where there have been significant new developments. Typically, this means that the monographs we publish will contain a significant level of mathematical detail (to describe the central methods and/or theory for the topic at hand), and will not eschew these details by simply pointing to existing references. Literature surveys and original research papers do not fall within these aims.
Latest articles in this journal:
Model-based Reinforcement Learning: A Survey
Probabilistic Learning
Reinforcement Learning
Support Vector Machine
Advanced Clustering