Hyperfast second-order local solvers for efficient statistically preconditioned distributed optimization

EURO Journal on Computational Optimization · IF 2.6 · Q2 (Operations Research & Management Science) · Pub Date: 2022-01-01 · DOI: 10.1016/j.ejco.2022.100045
Pavel Dvurechensky, Dmitry Kamzolov, Aleksandr Lukashevich, Soomin Lee, Erik Ordentlich, César A. Uribe, Alexander Gasnikov
{"title":"Hyperfast second-order local solvers for efficient statistically preconditioned distributed optimization","authors":"Pavel Dvurechensky ,&nbsp;Dmitry Kamzolov ,&nbsp;Aleksandr Lukashevich ,&nbsp;Soomin Lee ,&nbsp;Erik Ordentlich ,&nbsp;César A. Uribe ,&nbsp;Alexander Gasnikov","doi":"10.1016/j.ejco.2022.100045","DOIUrl":null,"url":null,"abstract":"<div><p>Statistical preconditioning enables fast methods for distributed large-scale empirical risk minimization problems. In this approach, multiple worker nodes compute gradients in parallel, which are then used by the central node to update the parameter by solving an auxiliary (preconditioned) smaller-scale optimization problem. The recently proposed Statistically Preconditioned Accelerated Gradient (SPAG) method <span>[1]</span> has complexity bounds superior to other such algorithms but requires an exact solution for computationally intensive auxiliary optimization problems at every iteration. In this paper, we propose an Inexact SPAG (InSPAG) and explicitly characterize the accuracy by which the corresponding auxiliary subproblem needs to be solved to guarantee the same convergence rate as the exact method. We build our results by first developing an inexact adaptive accelerated Bregman proximal gradient method for general optimization problems under relative smoothness and strong convexity assumptions, which may be of independent interest. Moreover, we explore the properties of the auxiliary problem in the InSPAG algorithm assuming Lipschitz third-order derivatives and strong convexity. For such problem class, we develop a linearly convergent Hyperfast second-order method and estimate the total complexity of the InSPAG method with hyperfast auxiliary problem solver. Finally, we illustrate the proposed method's practical efficiency by performing large-scale numerical experiments on logistic regression models. To the best of our knowledge, these are the first empirical results on implementing high-order methods on large-scale problems, as we work with data where the dimension is of the order of 3 million, and the number of samples is 700 million.</p></div>","PeriodicalId":51880,"journal":{"name":"EURO Journal on Computational Optimization","volume":"10 ","pages":"Article 100045"},"PeriodicalIF":2.6000,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2192440622000211/pdfft?md5=295cb611041330f3ffad8993cf73fef2&pid=1-s2.0-S2192440622000211-main.pdf","citationCount":"14","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"EURO Journal on Computational Optimization","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2192440622000211","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"OPERATIONS RESEARCH & MANAGEMENT SCIENCE","Score":null,"Total":0}
Citations: 14

Abstract

Statistical preconditioning enables fast methods for distributed large-scale empirical risk minimization problems. In this approach, multiple worker nodes compute gradients in parallel, which are then used by the central node to update the parameters by solving an auxiliary (preconditioned) smaller-scale optimization problem. The recently proposed Statistically Preconditioned Accelerated Gradient (SPAG) method [1] has complexity bounds superior to those of other such algorithms but requires the exact solution of a computationally intensive auxiliary optimization problem at every iteration. In this paper, we propose an Inexact SPAG (InSPAG) method and explicitly characterize the accuracy to which the corresponding auxiliary subproblem needs to be solved to guarantee the same convergence rate as the exact method. We build our results by first developing an inexact adaptive accelerated Bregman proximal gradient method for general optimization problems under relative smoothness and strong convexity assumptions, which may be of independent interest. Moreover, we explore the properties of the auxiliary problem in the InSPAG algorithm under the assumptions of Lipschitz third-order derivatives and strong convexity. For this problem class, we develop a linearly convergent Hyperfast second-order method and estimate the total complexity of the InSPAG method with the Hyperfast auxiliary problem solver. Finally, we illustrate the proposed method's practical efficiency by performing large-scale numerical experiments on logistic regression models. To the best of our knowledge, these are the first empirical results on implementing high-order methods on large-scale problems: we work with data where the dimension is of the order of 3 million and the number of samples is 700 million.
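To make the setup concrete, here is a minimal Python sketch of the loop the abstract describes: workers return gradients of the global logistic loss, and the central node updates the parameters by (approximately) solving the preconditioned auxiliary subproblem, a Bregman proximal step with respect to its local loss. Everything below is an illustrative assumption rather than the authors' code: the function names, the toy data, and the off-the-shelf L-BFGS inner solve that stands in for the paper's Hyperfast second-order method; SPAG's acceleration (extrapolation) step is also omitted.

```python
# Illustrative sketch of statistically preconditioned Bregman steps.
# Hypothetical names; generic L-BFGS replaces the Hyperfast inner solver.
import numpy as np
from scipy.optimize import minimize

def logistic_grad(w, X, y):
    """Gradient of the mean logistic loss, labels y in {-1, +1}."""
    s = 1.0 / (1.0 + np.exp(y * (X @ w)))        # sigmoid(-y * Xw)
    return -(X.T @ (y * s)) / X.shape[0]

def bregman_step(w, grad_w, X_loc, y_loc, mu, eta):
    """Approximately solve the auxiliary subproblem
         argmin_v  <grad_w, v> + (1/eta) * D_phi(v, w),
       where phi(v) = local logistic loss + (mu/2)||v||^2 and D_phi is
       its Bregman divergence; InSPAG only requires this to be solved
       to a prescribed accuracy."""
    def phi_grad(v):
        return logistic_grad(v, X_loc, y_loc) + mu * v
    g_w = phi_grad(w)                            # grad of phi at center w
    def fun(v):
        loss = np.logaddexp(0.0, -y_loc * (X_loc @ v)).mean()
        phi_v = loss + 0.5 * mu * (v @ v)
        # terms of D_phi(v, w) that are constant in v are dropped
        val = grad_w @ v + (phi_v - g_w @ v) / eta
        return val, grad_w + (phi_grad(v) - g_w) / eta
    return minimize(fun, w, jac=True, method="L-BFGS-B").x

# Toy driver: three "workers" hold shards of synthetic data; the central
# node preconditions with the first shard's local loss.
rng = np.random.default_rng(0)
d, n = 20, 600
X = rng.normal(size=(n, d))
y = np.where(X @ rng.normal(size=d) > 0, 1.0, -1.0)
shards = np.array_split(np.arange(n), 3)
w = np.zeros(d)
for _ in range(30):
    # workers compute shard gradients in parallel; center averages them
    g = sum(logistic_grad(w, X[i], y[i]) * len(i) for i in shards) / n
    w = bregman_step(w, g, X[shards[0]], y[shards[0]], mu=1e-3, eta=1.0)
```

In InSPAG the point is precisely that this inner solve need not be exact: the paper quantifies how loose the inner tolerance may be while still preserving SPAG's convergence rate, and replaces the generic inner solver above with a linearly convergent Hyperfast second-order method.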

Source journal

EURO Journal on Computational Optimization (Operations Research & Management Science)
CiteScore: 3.50
Self-citation rate: 0.00%
Articles per year: 28
Review time: 60 days
Journal description

The aim of this journal is to contribute to the many areas in which Operations Research and Computer Science are tightly connected with each other. More precisely, the common element in all contributions to this journal is the use of computers for the solution of optimization problems. Both methodological contributions and innovative applications are considered, but validation through convincing computational experiments is desirable. The journal publishes three types of articles: (i) research articles, (ii) tutorials, and (iii) surveys. A research article presents original methodological contributions. A tutorial provides an introduction to an advanced topic designed to ease the use of the relevant methodology. A survey provides a wide overview of a given subject by summarizing and organizing research results.
Latest articles from this journal

- Unboxing Tree ensembles for interpretability: A hierarchical visualization tool and a multivariate optimal re-built tree
- An effective hybrid decomposition approach to solve the network-constrained stochastic unit commitment problem in large-scale power systems
- Advances in nonlinear optimization and equilibrium problems – Special issue editorial
- The Marguerite Frank Award for the best EJCO paper 2023
- A variable metric proximal stochastic gradient method: An application to classification problems