{"title":"基于牛顿-CG 的一般非凸圆锥优化的障碍增强拉格朗日方法","authors":"Chuan He, Heng Huang, Zhaosong Lu","doi":"10.1007/s10589-024-00603-6","DOIUrl":null,"url":null,"abstract":"<p>In this paper we consider finding an approximate second-order stationary point (SOSP) of general nonconvex conic optimization that minimizes a twice differentiable function subject to nonlinear equality constraints and also a convex conic constraint. In particular, we propose a Newton-conjugate gradient (Newton-CG) based barrier-augmented Lagrangian method for finding an approximate SOSP of this problem. Under some mild assumptions, we show that our method enjoys a total inner iteration complexity of <span>\\({\\widetilde{{{\\,\\mathrm{\\mathcal {O}}\\,}}}}(\\epsilon ^{-11/2})\\)</span> and an operation complexity of <span>\\({\\widetilde{{{\\,\\mathrm{\\mathcal {O}}\\,}}}}(\\epsilon ^{-11/2}\\min \\{n,\\epsilon ^{-5/4}\\})\\)</span> for finding an <span>\\((\\epsilon ,\\sqrt{\\epsilon })\\)</span>-SOSP of general nonconvex conic optimization with high probability. Moreover, under a constraint qualification, these complexity bounds are improved to <span>\\({\\widetilde{{{\\,\\mathrm{\\mathcal {O}}\\,}}}}(\\epsilon ^{-7/2})\\)</span> and <span>\\({\\widetilde{{{\\,\\mathrm{\\mathcal {O}}\\,}}}}(\\epsilon ^{-7/2}\\min \\{n,\\epsilon ^{-3/4}\\})\\)</span>, respectively. To the best of our knowledge, this is the first study on the complexity of finding an approximate SOSP of general nonconvex conic optimization. Preliminary numerical results are presented to demonstrate superiority of the proposed method over first-order methods in terms of solution quality.</p>","PeriodicalId":55227,"journal":{"name":"Computational Optimization and Applications","volume":"165 1","pages":""},"PeriodicalIF":1.6000,"publicationDate":"2024-08-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A Newton-CG based barrier-augmented Lagrangian method for general nonconvex conic optimization\",\"authors\":\"Chuan He, Heng Huang, Zhaosong Lu\",\"doi\":\"10.1007/s10589-024-00603-6\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>In this paper we consider finding an approximate second-order stationary point (SOSP) of general nonconvex conic optimization that minimizes a twice differentiable function subject to nonlinear equality constraints and also a convex conic constraint. In particular, we propose a Newton-conjugate gradient (Newton-CG) based barrier-augmented Lagrangian method for finding an approximate SOSP of this problem. Under some mild assumptions, we show that our method enjoys a total inner iteration complexity of <span>\\\\({\\\\widetilde{{{\\\\,\\\\mathrm{\\\\mathcal {O}}\\\\,}}}}(\\\\epsilon ^{-11/2})\\\\)</span> and an operation complexity of <span>\\\\({\\\\widetilde{{{\\\\,\\\\mathrm{\\\\mathcal {O}}\\\\,}}}}(\\\\epsilon ^{-11/2}\\\\min \\\\{n,\\\\epsilon ^{-5/4}\\\\})\\\\)</span> for finding an <span>\\\\((\\\\epsilon ,\\\\sqrt{\\\\epsilon })\\\\)</span>-SOSP of general nonconvex conic optimization with high probability. Moreover, under a constraint qualification, these complexity bounds are improved to <span>\\\\({\\\\widetilde{{{\\\\,\\\\mathrm{\\\\mathcal {O}}\\\\,}}}}(\\\\epsilon ^{-7/2})\\\\)</span> and <span>\\\\({\\\\widetilde{{{\\\\,\\\\mathrm{\\\\mathcal {O}}\\\\,}}}}(\\\\epsilon ^{-7/2}\\\\min \\\\{n,\\\\epsilon ^{-3/4}\\\\})\\\\)</span>, respectively. 
To the best of our knowledge, this is the first study on the complexity of finding an approximate SOSP of general nonconvex conic optimization. Preliminary numerical results are presented to demonstrate superiority of the proposed method over first-order methods in terms of solution quality.</p>\",\"PeriodicalId\":55227,\"journal\":{\"name\":\"Computational Optimization and Applications\",\"volume\":\"165 1\",\"pages\":\"\"},\"PeriodicalIF\":1.6000,\"publicationDate\":\"2024-08-30\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computational Optimization and Applications\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1007/s10589-024-00603-6\",\"RegionNum\":2,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"MATHEMATICS, APPLIED\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computational Optimization and Applications","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1007/s10589-024-00603-6","RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
A Newton-CG based barrier-augmented Lagrangian method for general nonconvex conic optimization
In this paper we consider finding an approximate second-order stationary point (SOSP) of general nonconvex conic optimization, which minimizes a twice differentiable function subject to nonlinear equality constraints and a convex conic constraint. In particular, we propose a Newton-conjugate gradient (Newton-CG) based barrier-augmented Lagrangian method for finding an approximate SOSP of this problem. Under some mild assumptions, we show that our method achieves a total inner iteration complexity of \(\widetilde{\mathcal{O}}(\epsilon^{-11/2})\) and an operation complexity of \(\widetilde{\mathcal{O}}(\epsilon^{-11/2}\min\{n,\epsilon^{-5/4}\})\) for finding an \((\epsilon,\sqrt{\epsilon})\)-SOSP of general nonconvex conic optimization with high probability. Moreover, under a constraint qualification, these complexity bounds improve to \(\widetilde{\mathcal{O}}(\epsilon^{-7/2})\) and \(\widetilde{\mathcal{O}}(\epsilon^{-7/2}\min\{n,\epsilon^{-3/4}\})\), respectively. To the best of our knowledge, this is the first study of the complexity of finding an approximate SOSP of general nonconvex conic optimization. Preliminary numerical results are presented to demonstrate the superiority of the proposed method over first-order methods in terms of solution quality.
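The abstract describes the method only at a high level. As background, in the unconstrained setting an \((\epsilon_g,\epsilon_H)\)-SOSP is commonly understood as a point \(x\) with \(\|\nabla f(x)\|\le \epsilon_g\) and \(\lambda_{\min}(\nabla^2 f(x))\ge -\epsilon_H\); the paper works with a generalization of this notion to the conic setting, with \((\epsilon_g,\epsilon_H)=(\epsilon,\sqrt{\epsilon})\). The sketch below illustrates, under strong simplifying assumptions, how a barrier-augmented Lagrangian outer loop with a Newton-CG inner solver can be organized. It is not the authors' algorithm: the cone is taken to be the nonnegative orthant with a log-barrier, the constraint curvature is dropped from the subproblem Hessian (a Gauss-Newton-style simplification), the parameter schedules are placeholders, and the capped-CG negative-curvature steps and second-order certificates that underlie the paper's complexity bounds are omitted. The toy problem at the bottom is made up for illustration.

```python
# Minimal illustrative sketch (NOT the authors' algorithm): cone = nonnegative
# orthant with a log-barrier; subproblem Hessian drops constraint curvature;
# no capped-CG / negative-curvature machinery; placeholder parameter schedules.
import numpy as np


def barrier_aug_lagrangian(f, grad_f, hess_f, c, jac_c, x0, lam0,
                           mu=1.0, rho=10.0, eps=1e-4, max_outer=30):
    """Approximately solve: min f(x) s.t. c(x) = 0, x >= 0 (simple conic case)."""
    x, lam = x0.astype(float), lam0.astype(float)
    for _ in range(max_outer):
        # Barrier-augmented Lagrangian for the current (lam, rho, mu).
        def L(z):
            return (f(z) - mu * np.sum(np.log(z))
                    + lam @ c(z) + 0.5 * rho * c(z) @ c(z))

        def gL(z):
            return grad_f(z) - mu / z + jac_c(z).T @ (lam + rho * c(z))

        def HL(z):
            J = jac_c(z)
            return hess_f(z) + np.diag(mu / z ** 2) + rho * (J.T @ J)

        x = newton_cg(L, gL, HL, x, tol=max(eps, mu))
        lam = lam + rho * c(x)   # first-order multiplier update
        if np.linalg.norm(c(x)) <= eps and np.linalg.norm(gL(x)) <= eps:
            break
        mu *= 0.1                # drive the barrier weight toward zero
        rho *= 2.0               # tighten the penalty
    return x, lam


def newton_cg(L, gL, HL, x, tol, max_iter=100):
    """Damped Newton iteration with CG on the Newton system; falls back to a
    gradient step when CG detects (near-)negative curvature."""
    for _ in range(max_iter):
        g = gL(x)
        if np.linalg.norm(g) <= tol:
            break
        d = cg_solve(HL(x), -g)
        if d is None or g @ d >= 0:       # not a usable descent direction
            d = -g
        t = 1.0                           # backtracking Armijo line search,
        while (np.any(x + t * d <= 0)     # keeping iterates strictly feasible
               or L(x + t * d) > L(x) + 1e-4 * t * (g @ d)):
            t *= 0.5
            if t < 1e-12:
                return x
        x = x + t * d
    return x


def cg_solve(H, b, tol=1e-8, max_iter=200):
    """Conjugate gradients for H d = b; returns None on nonpositive curvature."""
    d = np.zeros_like(b)
    r = b.copy()
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Hp = H @ p
        curv = p @ Hp
        if curv <= 1e-12 * (p @ p):       # negative/zero curvature detected
            return d if np.any(d) else None
        alpha = rs / curv
        d = d + alpha * p
        r = r - alpha * Hp
        rs_new = r @ r
        if np.sqrt(rs_new) <= tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return d


if __name__ == "__main__":
    # Made-up toy problem: min (x1-1)^2 + (x2-2)^2  s.t.  x1 + x2 = 2,  x >= 0.
    f = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2
    grad_f = lambda x: np.array([2.0 * (x[0] - 1.0), 2.0 * (x[1] - 2.0)])
    hess_f = lambda x: 2.0 * np.eye(2)
    c = lambda x: np.array([x[0] + x[1] - 2.0])
    jac_c = lambda x: np.array([[1.0, 1.0]])
    x_opt, lam_opt = barrier_aug_lagrangian(f, grad_f, hess_f, c, jac_c,
                                            x0=np.array([1.0, 1.0]),
                                            lam0=np.zeros(1))
    print(x_opt)   # expected to approach (0.5, 1.5)
```

In the paper's setting, the inner solver would presumably be a Newton-CG routine equipped with negative-curvature detection, so that approximate second-order (not merely first-order) stationarity of each barrier-augmented Lagrangian subproblem can be certified; that machinery is deliberately left out of this sketch.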
About the journal:
Computational Optimization and Applications is a peer-reviewed journal committed to the timely publication of research and tutorial papers on the analysis and development of computational algorithms and modeling technology for optimization. Algorithms either for general classes of optimization problems or for more specific applied problems are of interest. Stochastic algorithms as well as deterministic algorithms will be considered. Papers that provide both theoretical analysis and carefully designed computational experiments are particularly welcome.
Topics of interest include, but are not limited to, the following:
Large Scale Optimization,
Unconstrained Optimization,
Linear Programming,
Quadratic Programming, Complementarity Problems, and Variational Inequalities,
Constrained Optimization,
Nondifferentiable Optimization,
Integer Programming,
Combinatorial Optimization,
Stochastic Optimization,
Multiobjective Optimization,
Network Optimization,
Complexity Theory,
Approximations and Error Analysis,
Parametric Programming and Sensitivity Analysis,
Parallel Computing, Distributed Computing, and Vector Processing,
Software, Benchmarks, Numerical Experimentation and Comparisons,
Modelling Languages and Systems for Optimization,
Automatic Differentiation,
Applications in Engineering, Finance, Optimal Control, Optimal Design, Operations Research, Transportation, Economics, Communications, Manufacturing, and Management Science.