{"title":"Fast convergence to non-isolated minima: four equivalent conditions for $${\\textrm{C}^{2}}$$ functions","authors":"Quentin Rebjock, Nicolas Boumal","doi":"10.1007/s10107-024-02136-6","DOIUrl":null,"url":null,"abstract":"<p>Optimization algorithms can see their local convergence rates deteriorate when the Hessian at the optimum is singular. These singularities are inescapable when the optima are non-isolated. Yet, under the right circumstances, several algorithms preserve their favorable rates even when optima form a continuum (e.g., due to over-parameterization). This has been explained under various structural assumptions, including the Polyak–Łojasiewicz condition, Quadratic Growth and the Error Bound. We show that, for cost functions which are twice continuously differentiable (<span>\\(\\textrm{C}^2\\)</span>), those three (local) properties are equivalent. Moreover, we show they are equivalent to the Morse–Bott property, that is, local minima form differentiable submanifolds, and the Hessian of the cost function is positive definite along its normal directions. We leverage this insight to improve local convergence guarantees for safe-guarded Newton-type methods under any (hence all) of the above assumptions. First, for adaptive cubic regularization, we secure quadratic convergence even with inexact subproblem solvers. Second, for trust-region methods, we argue capture can fail with an exact subproblem solver, then proceed to show linear convergence with an inexact one (Cauchy steps).</p>","PeriodicalId":2,"journal":{"name":"ACS Applied Bio Materials","volume":null,"pages":null},"PeriodicalIF":4.6000,"publicationDate":"2024-09-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACS Applied Bio Materials","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1007/s10107-024-02136-6","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATERIALS SCIENCE, BIOMATERIALS","Score":null,"Total":0}
Abstract
Optimization algorithms can see their local convergence rates deteriorate when the Hessian at the optimum is singular. These singularities are inescapable when the optima are non-isolated. Yet, under the right circumstances, several algorithms preserve their favorable rates even when optima form a continuum (e.g., due to over-parameterization). This has been explained under various structural assumptions, including the Polyak–Łojasiewicz condition, Quadratic Growth, and the Error Bound. We show that, for cost functions that are twice continuously differentiable (\(\textrm{C}^2\)), those three (local) properties are equivalent. Moreover, we show they are equivalent to the Morse–Bott property, that is, local minima form differentiable submanifolds, and the Hessian of the cost function is positive definite along its normal directions. We leverage this insight to improve local convergence guarantees for safeguarded Newton-type methods under any (hence all) of the above assumptions. First, for adaptive cubic regularization, we secure quadratic convergence even with inexact subproblem solvers. Second, for trust-region methods, we argue that capture can fail with an exact subproblem solver, then proceed to show linear convergence with an inexact one (Cauchy steps).
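For reference, the standard local forms of the three conditions named above can be stated as follows, using notation introduced here for illustration only: a set of minimizers \(X\), minimum value \(f^*\), a constant \(\mu > 0\), and a neighborhood \(U\) of \(X\). The paper's precise local definitions may differ in constants and neighborhoods.

\[
\begin{aligned}
\text{Polyak–Łojasiewicz (PL):}\quad & f(x) - f^* \le \tfrac{1}{2\mu}\,\|\nabla f(x)\|^2 && \text{for all } x \in U,\\
\text{Quadratic Growth (QG):}\quad & f(x) - f^* \ge \tfrac{\mu}{2}\,\mathrm{dist}(x, X)^2 && \text{for all } x \in U,\\
\text{Error Bound (EB):}\quad & \mathrm{dist}(x, X) \le \tfrac{1}{\mu}\,\|\nabla f(x)\| && \text{for all } x \in U.
\end{aligned}
\]

The fourth property, Morse–Bott, is the geometric one stated in the abstract: locally, \(X\) is a differentiable submanifold and \(\nabla^2 f(x)\) is positive definite when restricted to the normal space of \(X\) at each \(x \in X\).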