Dual Descent Augmented Lagrangian Method and Alternating Direction Method of Multipliers

Impact factor: 2.6 · CAS Zone 1 (Mathematics) · JCR Q1 (Mathematics, Applied)
SIAM Journal on Optimization · Published: 2024-05-07 · DOI: 10.1137/21m1449099
Kaizhao Sun, Xu Andy Sun
{"title":"Dual Descent Augmented Lagrangian Method and Alternating Direction Method of Multipliers","authors":"Kaizhao Sun, Xu Andy Sun","doi":"10.1137/21m1449099","DOIUrl":null,"url":null,"abstract":"SIAM Journal on Optimization, Volume 34, Issue 2, Page 1679-1707, June 2024. <br/> Abstract. Classical primal-dual algorithms attempt to solve [math] by alternately minimizing over the primal variable [math] through primal descent and maximizing the dual variable [math] through dual ascent. However, when [math] is highly nonconvex with complex constraints in [math], the minimization over [math] may not achieve global optimality and, hence, the dual ascent step loses its valid intuition. This observation motivates us to propose a new class of primal-dual algorithms for nonconvex constrained optimization with the key feature to reverse dual ascent to a conceptually new dual descent, in a sense, elevating the dual variable to the same status as the primal variable. Surprisingly, this new dual scheme achieves some best iteration complexities for solving nonconvex optimization problems. In particular, when the dual descent step is scaled by a fractional constant, we name it scaled dual descent (SDD), otherwise, unscaled dual descent (UDD). For nonconvex multiblock optimization with nonlinear equality constraints, we propose SDD-alternating direction method of multipliers (SDD-ADMM) and show that it finds an [math]-stationary solution in [math] iterations. The complexity is further improved to [math] and [math] under proper conditions. We also propose UDD-augmented Lagrangian method (UDD-ALM), combining UDD with ALM, for weakly convex minimization over affine constraints. We show that UDD-ALM finds an [math]-stationary solution in [math] iterations. These complexity bounds for both algorithms either achieve or improve the best-known results in the ADMM and ALM literature. Moreover, SDD-ADMM addresses a long-standing limitation of existing ADMM frameworks.","PeriodicalId":49529,"journal":{"name":"SIAM Journal on Optimization","volume":null,"pages":null},"PeriodicalIF":2.6000,"publicationDate":"2024-05-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"SIAM Journal on Optimization","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1137/21m1449099","RegionNum":1,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
Citations: 0

Abstract

SIAM Journal on Optimization, Volume 34, Issue 2, Page 1679-1707, June 2024.
Classical primal-dual algorithms attempt to solve $\max_{\mu}\min_{x} L(x,\mu)$ by alternately minimizing over the primal variable $x$ through primal descent and maximizing over the dual variable $\mu$ through dual ascent. However, when $L(x,\mu)$ is highly nonconvex with complex constraints in $x$, the minimization over $x$ may not achieve global optimality and, hence, the dual ascent step loses its valid intuition. This observation motivates us to propose a new class of primal-dual algorithms for nonconvex constrained optimization whose key feature is to reverse dual ascent to a conceptually new dual descent, in a sense elevating the dual variable to the same status as the primal variable. Surprisingly, this new dual scheme achieves some of the best iteration complexities for solving nonconvex optimization problems. In particular, when the dual descent step is scaled by a fractional constant, we name it scaled dual descent (SDD); otherwise, unscaled dual descent (UDD). For nonconvex multiblock optimization with nonlinear equality constraints, we propose the SDD alternating direction method of multipliers (SDD-ADMM) and show that it finds an $\epsilon$-stationary solution in $O(\epsilon^{-4})$ iterations. The complexity is further improved to $O(\epsilon^{-3})$ and $O(\epsilon^{-2})$ under proper conditions. We also propose the UDD augmented Lagrangian method (UDD-ALM), combining UDD with ALM, for weakly convex minimization over affine constraints. We show that UDD-ALM finds an $\epsilon$-stationary solution in $O(\epsilon^{-2})$ iterations. These complexity bounds for both algorithms either achieve or improve the best-known results in the ADMM and ALM literature. Moreover, SDD-ADMM addresses a long-standing limitation of existing ADMM frameworks.
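To make the contrast concrete, here is a hedged sketch. Assume the affinely constrained setting the abstract mentions for UDD-ALM, $\min_x f(x)$ subject to $Ax = b$, with augmented Lagrangian $\mathcal{L}_\rho(x,\mu) = f(x) + \mu^\top (Ax - b) + \frac{\rho}{2}\|Ax - b\|^2$. Classical ALM alternates a primal minimization with dual ascent, while the dual descent idea reverses the direction of the multiplier update; the step size $\alpha$ below is an illustrative placeholder, not the paper's SDD/UDD rule:

x^{k+1} \in \arg\min_x \mathcal{L}_\rho(x, \mu^k)        (primal descent)
\mu^{k+1} = \mu^k + \rho\,(A x^{k+1} - b)                (classical dual ascent)
\mu^{k+1} = \mu^k - \alpha\,(A x^{k+1} - b)              (dual descent: direction reversed)

The NumPy sketch below runs the classical loop on a toy equality-constrained quadratic program and marks the line where a dual descent step would flip the sign; Q, A, b, rho, and alpha are illustrative assumptions, and this is not the paper's algorithm.

import numpy as np

rng = np.random.default_rng(0)
n, m = 20, 5
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
Q = np.eye(n)  # f(x) = 0.5 * x^T Q x, a simple strongly convex choice

rho = 10.0         # penalty parameter
mu = np.zeros(m)   # dual variable
for k in range(100):
    # Primal step: exact minimizer of L_rho(., mu^k) for this quadratic f,
    # solving (Q + rho A^T A) x = A^T (rho b - mu)
    x = np.linalg.solve(Q + rho * A.T @ A, A.T @ (rho * b - mu))
    # Classical dual ascent; a dual *descent* step would instead move in the
    # opposite direction, e.g. mu -= alpha * (A @ x - b) for a chosen alpha
    mu = mu + rho * (A @ x - b)

print("constraint residual:", np.linalg.norm(A @ x - b))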
Source journal
SIAM Journal on Optimization (Mathematics - Applied Mathematics)
CiteScore: 5.30
Self-citation rate: 9.70%
Articles per year: 101
Review time: 6-12 weeks
About the journal: The SIAM Journal on Optimization contains research articles on the theory and practice of optimization. The areas addressed include linear and quadratic programming, convex programming, nonlinear programming, complementarity problems, stochastic optimization, combinatorial optimization, integer programming, and convex, nonsmooth and variational analysis. Contributions may emphasize optimization theory, algorithms, software, computational practice, applications, or the links between these subjects.
Latest articles in this journal
- Corrigendum and Addendum: Newton Differentiability of Convex Functions in Normed Spaces and of a Class of Operators
- Newton-Based Alternating Methods for the Ground State of a Class of Multicomponent Bose–Einstein Condensates
- Minimum Spanning Trees in Infinite Graphs: Theory and Algorithms
- On Minimal Extended Representations of Generalized Power Cones
- A Functional Model Method for Nonconvex Nonsmooth Conditional Stochastic Optimization