Survey of sequential convex programming and generalized Gauss-Newton methods

F. Messerer, Katrin Baumgärtner, M. Diehl
{"title":"Survey of sequential convex programming and generalized Gauss-Newton methods","authors":"F. Messerer, Katrin Baumgärtner, M. Diehl","doi":"10.1051/proc/202171107","DOIUrl":null,"url":null,"abstract":"We provide an overview of a class of iterative convex approximation methods for nonlinear optimization problems with convex-over-nonlinear substructure. These problems are characterized by outer convexities on the one hand, and nonlinear, generally nonconvex, but differentiable functions on the other hand. All methods from this class use only first order derivatives of the nonlinear functions and sequentially solve convex optimization problems. All of them are different generalizations of the classical Gauss-Newton (GN) method. We focus on the smooth constrained case and on three methods to address it: Sequential Convex Programming (SCP), Sequential Convex Quadratic Programming (SCQP), and Sequential Quadratically Constrained Quadratic Programming (SQCQP). While the first two methods were previously known, the last is newly proposed and investigated in this paper. We show under mild assumptions that SCP, SCQP and SQCQP have exactly the same local linear convergence – or divergence – rate. We then discuss the special case in which the solution is fully determined by the active constraints, and show that for this case the KKT conditions are sufficient for local optimality and that SCP, SCQP and SQCQP even converge quadratically. In the context of parameter estimation with symmetric convex loss functions, the possible divergence of the methods can in fact be an advantage that helps them to avoid some undesirable local minima: generalizing existing results, we show that the presented methods converge to a local minimum if and only if this local minimum is stable against a mirroring operation applied to the measurement data of the estimation problem. All results are illustrated by numerical experiments on a tutorial example.","PeriodicalId":53260,"journal":{"name":"ESAIM Proceedings and Surveys","volume":"34 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2021-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"28","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ESAIM Proceedings and Surveys","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1051/proc/202171107","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 28

Abstract

We provide an overview of a class of iterative convex approximation methods for nonlinear optimization problems with convex-over-nonlinear substructure. These problems are characterized by outer convexities on the one hand, and nonlinear, generally nonconvex, but differentiable functions on the other hand. All methods from this class use only first order derivatives of the nonlinear functions and sequentially solve convex optimization problems. All of them are different generalizations of the classical Gauss-Newton (GN) method. We focus on the smooth constrained case and on three methods to address it: Sequential Convex Programming (SCP), Sequential Convex Quadratic Programming (SCQP), and Sequential Quadratically Constrained Quadratic Programming (SQCQP). While the first two methods were previously known, the last is newly proposed and investigated in this paper. We show under mild assumptions that SCP, SCQP and SQCQP have exactly the same local linear convergence – or divergence – rate. We then discuss the special case in which the solution is fully determined by the active constraints, and show that for this case the KKT conditions are sufficient for local optimality and that SCP, SCQP and SQCQP even converge quadratically. In the context of parameter estimation with symmetric convex loss functions, the possible divergence of the methods can in fact be an advantage that helps them to avoid some undesirable local minima: generalizing existing results, we show that the presented methods converge to a local minimum if and only if this local minimum is stable against a mirroring operation applied to the measurement data of the estimation problem. All results are illustrated by numerical experiments on a tutorial example.
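To make the structure concrete: the problem class surveyed here is the minimization of an outer convex function composed with an inner nonlinear map. The notation below is an illustrative sketch in the style of this literature, not necessarily the paper's own:

\[
\min_{x \in \mathbb{R}^n} \; \phi(F(x)), \qquad \phi \text{ convex}, \quad F \text{ nonlinear and differentiable}.
\]

The methods of the SCP family keep the outer convexity exact and linearize only the inner nonlinearity at the current iterate $x_k$, so that each iteration solves the convex subproblem

\[
x_{k+1} \in \arg\min_{x} \; \phi\bigl(F(x_k) + J_F(x_k)(x - x_k)\bigr),
\]

where $J_F$ is the Jacobian of $F$; constraints with the same convex-over-nonlinear structure are treated analogously. For the choice $\phi(r) = \tfrac{1}{2}\|r\|_2^2$ the subproblem is a linear least-squares problem and the iteration reduces to the classical Gauss-Newton method. Below is a minimal, self-contained Python sketch of this Gauss-Newton special case; the function names and the exponential-fit example are hypothetical, chosen only for illustration, and are not taken from the paper:

    import numpy as np

    def gauss_newton(F, J, x0, tol=1e-10, max_iter=50):
        """Minimize 0.5 * ||F(x)||^2 by the classical Gauss-Newton iteration,
        i.e. the SCP subproblem for the outer convexity phi(r) = 0.5 * ||r||^2."""
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            r, Jk = F(x), J(x)
            # Convex (here: linear least-squares) subproblem: min_d ||r + Jk d||^2.
            d = np.linalg.lstsq(Jk, -r, rcond=None)[0]
            x = x + d
            if np.linalg.norm(d) < tol:
                break
        return x

    # Hypothetical usage: fit y ~ exp(theta * t) by nonlinear least squares.
    t = np.linspace(0.0, 1.0, 20)
    y = np.exp(0.7 * t)
    F = lambda th: np.exp(th[0] * t) - y                   # residual vector
    J = lambda th: (t * np.exp(th[0] * t)).reshape(-1, 1)  # Jacobian of F
    theta = gauss_newton(F, J, x0=[0.0])                   # converges to ~0.7

Replacing the linear least-squares subproblem with a general convex subproblem, for instance one built from a symmetric convex loss as in the paper's parameter-estimation setting, yields an SCP iteration of the kind analyzed in the survey.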