Stochastic linearized generalized alternating direction method of multipliers: Expected convergence rates and large deviation properties
Jia Hu, T. Guo, Congying Han
Mathematical Structures in Computer Science, published 2023-03-14
DOI: 10.1017/s096012952300004x
Citations: 2
Abstract
The alternating direction method of multipliers (ADMM) has received much attention in optimization, computer science, and related fields. The generalized ADMM (G-ADMM) proposed by Eckstein and Bertsekas incorporates an acceleration factor and is more efficient than the original ADMM. However, G-ADMM is not applicable to models in which the objective function value (or its gradient) is computationally costly or even impossible to evaluate. In this paper, we consider the two-block separable convex optimization problem with linear constraints, where only noisy estimates of the gradient of the objective function are accessible. In this setting, we propose a stochastic linearized generalized ADMM (SLG-ADMM), in which the two subproblems are approximated by linearization strategies, and we analyze its expected convergence rates and large deviation properties. In particular, we show that the worst-case expected convergence rates of SLG-ADMM are $\mathcal{O}\left(N^{-1/2}\right)$ and $\mathcal{O}\left(\ln N \cdot N^{-1}\right)$ for general convex and strongly convex problems, respectively, where $N$ is the iteration number (similarly hereinafter). Moreover, with high probability, SLG-ADMM attains $\mathcal{O}\left(\ln N \cdot N^{-1/2}\right)$ and $\mathcal{O}\left(\left(\ln N\right)^{2} \cdot N^{-1}\right)$ bounds on both the constraint violation and the objective error for general convex and strongly convex problems, respectively.
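The precise SLG-ADMM scheme and its analysis are developed in the paper itself; as a rough illustration only, the sketch below applies a stochastic linearized, over-relaxed ADMM iteration to a toy two-block problem. All problem data, the noise model, the step sizes, and the exact z-step (rather than a second linearization) are invented here for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-block problem:  min 0.5||x - c||^2 + 0.5||z - d||^2  s.t.  x - z = 0,
# i.e. A = I, B = -I, b = 0.  The optimum is x* = z* = (c + d) / 2 = (2, 2).
c = np.array([1.0, 3.0])
d = np.array([3.0, 1.0])

def noisy_grad_f(x):
    """Only a noisy estimate of grad f(x) = x - c is assumed available."""
    return (x - c) + 0.1 * rng.standard_normal(x.shape)

beta = 1.0     # penalty parameter of the augmented Lagrangian
alpha = 1.5    # relaxation ("acceleration") factor, taken in (0, 2)
N = 2000       # iteration number

x = np.zeros(2)
z = np.zeros(2)
lam = np.zeros(2)    # multiplier for the constraint x - z = 0
xbar = np.zeros(2)   # ergodic average, the iterate convergence rates refer to

for k in range(1, N + 1):
    eta = 0.5 / np.sqrt(k)  # decaying step size, matching the O(N^{-1/2}) regime
    # x-step: one stochastic-gradient step on the augmented Lagrangian,
    # i.e. f is replaced by its linearization at the current iterate.
    x = x - eta * (noisy_grad_f(x) - lam + beta * (x - z))
    # Generalized-ADMM relaxation of the x-block before the z-update.
    r = alpha * x + (1.0 - alpha) * z
    # z-step: exact minimization (g is a simple quadratic in this toy example).
    z = (d - lam + beta * r) / (1.0 + beta)
    # Multiplier update.
    lam = lam - beta * (r - z)
    xbar += (x - xbar) / k
```

With these choices the averaged iterate `xbar` approaches the solution $(2, 2)$ despite the gradient noise. On a non-toy problem the z-step would typically be a proximal step, and the faster $\mathcal{O}(\ln N \cdot N^{-1})$ rate would additionally require strong convexity.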
Journal description:
Mathematical Structures in Computer Science is a journal of theoretical computer science which focuses on the application of ideas from the structural side of mathematics and mathematical logic to computer science. The journal aims to bridge the gap between theoretical contributions and software design, publishing original papers of a high standard and broad surveys with original perspectives in all areas of computing, provided that ideas or results from logic, algebra, geometry, category theory or other areas of logic and mathematics form a basis for the work. The journal welcomes applications to computing based on the use of specific mathematical structures (e.g. topological and order-theoretic structures) as well as on proof-theoretic notions or results.