{"title":"An inexact regularized proximal Newton method for nonconvex and nonsmooth optimization","authors":"Ruyu Liu, Shaohua Pan, Yuqia Wu, Xiaoqi Yang","doi":"10.1007/s10589-024-00560-0","DOIUrl":null,"url":null,"abstract":"<p>This paper focuses on the minimization of a sum of a twice continuously differentiable function <i>f</i> and a nonsmooth convex function. An inexact regularized proximal Newton method is proposed by an approximation to the Hessian of <i>f</i> involving the <span>\\(\\varrho \\)</span>th power of the KKT residual. For <span>\\(\\varrho =0\\)</span>, we justify the global convergence of the iterate sequence for the KL objective function and its R-linear convergence rate for the KL objective function of exponent 1/2. For <span>\\(\\varrho \\in (0,1)\\)</span>, by assuming that cluster points satisfy a locally Hölderian error bound of order <i>q</i> on a second-order stationary point set and a local error bound of order <span>\\(q>1\\!+\\!\\varrho \\)</span> on the common stationary point set, respectively, we establish the global convergence of the iterate sequence and its superlinear convergence rate with order depending on <i>q</i> and <span>\\(\\varrho \\)</span>. A dual semismooth Newton augmented Lagrangian method is also developed for seeking an inexact minimizer of subproblems. Numerical comparisons with two state-of-the-art methods on <span>\\(\\ell _1\\)</span>-regularized Student’s <i>t</i>-regressions, group penalized Student’s <i>t</i>-regressions, and nonconvex image restoration confirm the efficiency of the proposed method.</p>","PeriodicalId":55227,"journal":{"name":"Computational Optimization and Applications","volume":"40 1","pages":""},"PeriodicalIF":1.6000,"publicationDate":"2024-02-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computational Optimization and Applications","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1007/s10589-024-00560-0","RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
Abstract
This paper focuses on the minimization of the sum of a twice continuously differentiable function f and a nonsmooth convex function. An inexact regularized proximal Newton method is proposed, based on an approximation to the Hessian of f that involves the \(\varrho\)th power of the KKT residual. For \(\varrho = 0\), we justify the global convergence of the iterate sequence for a Kurdyka–Łojasiewicz (KL) objective function and its R-linear convergence rate for a KL objective function of exponent 1/2. For \(\varrho \in (0,1)\), by assuming that the cluster points satisfy a locally Hölderian error bound of order q on a second-order stationary point set and a local error bound of order \(q > 1+\varrho\) on the common stationary point set, respectively, we establish the global convergence of the iterate sequence and its superlinear convergence rate, with order depending on q and \(\varrho\). A dual semismooth Newton augmented Lagrangian method is also developed to seek inexact minimizers of the subproblems. Numerical comparisons with two state-of-the-art methods on \(\ell_1\)-regularized Student's t-regressions, group penalized Student's t-regressions, and nonconvex image restoration confirm the efficiency of the proposed method.
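For orientation, the following display is a minimal sketch of a generic regularized proximal Newton subproblem of the kind described above, not the paper's exact formulation; the symbols g (the nonsmooth convex term), \(G_k\) (an approximation of \(\nabla^2 f(x^k)\)), \(r(x^k)\) (the KKT residual at \(x^k\)), and the constant \(c > 0\) are illustrative placeholders:

\[
x^{k+1} \approx \arg\min_{x}\; \langle \nabla f(x^k),\, x - x^k\rangle + \frac{1}{2}\,(x - x^k)^\top \big(G_k + \mu_k I\big)(x - x^k) + g(x), \qquad \mu_k = c\, r(x^k)^{\varrho}.
\]

Under this reading, taking \(\varrho = 0\) keeps the regularization \(\mu_k\) of constant order, whereas for \(\varrho \in (0,1)\) it vanishes together with the KKT residual, which is the mechanism behind the superlinear rates stated in the abstract.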
Journal Introduction:
Computational Optimization and Applications is a peer-reviewed journal committed to the timely publication of research and tutorial papers on the analysis and development of computational algorithms and modeling technology for optimization. Algorithms for either general classes of optimization problems or more specific applied problems are of interest. Both stochastic and deterministic algorithms will be considered. Papers that provide theoretical analysis along with carefully designed computational experiments are particularly welcome.
Topics of interest include, but are not limited to, the following:
Large Scale Optimization,
Unconstrained Optimization,
Linear Programming,
Quadratic Programming, Complementarity Problems, and Variational Inequalities,
Constrained Optimization,
Nondifferentiable Optimization,
Integer Programming,
Combinatorial Optimization,
Stochastic Optimization,
Multiobjective Optimization,
Network Optimization,
Complexity Theory,
Approximations and Error Analysis,
Parametric Programming and Sensitivity Analysis,
Parallel Computing, Distributed Computing, and Vector Processing,
Software, Benchmarks, Numerical Experimentation and Comparisons,
Modelling Languages and Systems for Optimization,
Automatic Differentiation,
Applications in Engineering, Finance, Optimal Control, Optimal Design, Operations Research, Transportation, Economics, Communications, Manufacturing, and Management Science.