{"title":"超越平滑性和可分性的坐标下降方法","authors":"Flavia Chorobura, Ion Necoara","doi":"10.1007/s10589-024-00556-w","DOIUrl":null,"url":null,"abstract":"<p>This paper deals with convex nonsmooth optimization problems. We introduce a general smooth approximation framework for the original function and apply random (accelerated) coordinate descent methods for minimizing the corresponding smooth approximations. Our framework covers the most important classes of smoothing techniques from the literature. Based on this general framework for the smooth approximation and using coordinate descent type methods we derive convergence rates in function values for the original objective. Moreover, if the original function satisfies a growth condition, then we prove that the smooth approximations also inherits this condition and consequently the convergence rates are improved in this case. We also present a relative randomized coordinate descent algorithm for solving nonseparable minimization problems with the objective function relative smooth along coordinates w.r.t. a (possibly nonseparable) differentiable function. For this algorithm we also derive convergence rates in the convex case and under the growth condition for the objective.</p>","PeriodicalId":55227,"journal":{"name":"Computational Optimization and Applications","volume":"53 78 1","pages":""},"PeriodicalIF":1.6000,"publicationDate":"2024-02-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Coordinate descent methods beyond smoothness and separability\",\"authors\":\"Flavia Chorobura, Ion Necoara\",\"doi\":\"10.1007/s10589-024-00556-w\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>This paper deals with convex nonsmooth optimization problems. We introduce a general smooth approximation framework for the original function and apply random (accelerated) coordinate descent methods for minimizing the corresponding smooth approximations. Our framework covers the most important classes of smoothing techniques from the literature. Based on this general framework for the smooth approximation and using coordinate descent type methods we derive convergence rates in function values for the original objective. Moreover, if the original function satisfies a growth condition, then we prove that the smooth approximations also inherits this condition and consequently the convergence rates are improved in this case. We also present a relative randomized coordinate descent algorithm for solving nonseparable minimization problems with the objective function relative smooth along coordinates w.r.t. a (possibly nonseparable) differentiable function. 
For this algorithm we also derive convergence rates in the convex case and under the growth condition for the objective.</p>\",\"PeriodicalId\":55227,\"journal\":{\"name\":\"Computational Optimization and Applications\",\"volume\":\"53 78 1\",\"pages\":\"\"},\"PeriodicalIF\":1.6000,\"publicationDate\":\"2024-02-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computational Optimization and Applications\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1007/s10589-024-00556-w\",\"RegionNum\":2,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"MATHEMATICS, APPLIED\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computational Optimization and Applications","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1007/s10589-024-00556-w","RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
Coordinate descent methods beyond smoothness and separability
This paper deals with convex nonsmooth optimization problems. We introduce a general smooth approximation framework for the original function and apply random (accelerated) coordinate descent methods to minimize the corresponding smooth approximations. Our framework covers the most important classes of smoothing techniques from the literature. Based on this general smooth approximation framework and using coordinate descent type methods, we derive convergence rates in function values for the original objective. Moreover, if the original function satisfies a growth condition, we prove that the smooth approximations inherit this condition, and consequently the convergence rates improve in this case. We also present a relative randomized coordinate descent algorithm for solving nonseparable minimization problems whose objective function is relatively smooth along coordinates with respect to a (possibly nonseparable) differentiable function. For this algorithm, we also derive convergence rates in the convex case and under the growth condition on the objective.
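To make the approach concrete, below is a minimal Python sketch of the smoothing-plus-coordinate-descent idea described in the abstract: the nonsmooth objective f(x) = ||Ax - b||_1 is replaced by a Huber-type smooth approximation f_mu, which is then minimized by random coordinate descent with coordinate-wise Lipschitz step sizes. This is only an illustration of the general technique, not the paper's algorithm; the problem data, the smoothing parameter mu, and the iteration count are all illustrative assumptions.

```python
# Illustrative sketch only (not the paper's method): Huber smoothing of an
# l1 objective, minimized by random coordinate descent. All names and
# constants (A, b, mu, iteration count) are assumptions for the example.
import numpy as np

rng = np.random.default_rng(0)
m, n = 50, 20
A = rng.standard_normal((m, n))      # illustrative problem data
b = rng.standard_normal(m)
mu = 1e-2                            # smoothing parameter; f_mu -> f as mu -> 0

# Huber smoothing of f(x) = ||A x - b||_1: each residual r_i is replaced by
# h_mu(r_i) = r_i^2 / (2 mu) if |r_i| <= mu, else |r_i| - mu / 2.
def f_mu(x):
    r = np.abs(A @ x - b)
    return np.where(r <= mu, r**2 / (2 * mu), r - mu / 2).sum()

# Coordinate Lipschitz constants of grad f_mu: L_i = ||A[:, i]||^2 / mu.
L = (A**2).sum(axis=0) / mu

x = np.zeros(n)
r = A @ x - b                        # maintained residual A x - b
for _ in range(20000):
    i = rng.integers(n)                          # pick a random coordinate
    g_i = A[:, i] @ np.clip(r / mu, -1.0, 1.0)   # partial derivative of f_mu
    step = g_i / L[i]                            # coordinate gradient step
    x[i] -= step
    r -= step * A[:, i]                          # keep the residual in sync

print("smoothed objective:", f_mu(x))
print("original objective ||Ax - b||_1:", np.abs(A @ x - b).sum())
```

As mu decreases, f_mu approximates f more tightly but its coordinate Lipschitz constants grow like 1/mu, which is the usual accuracy-versus-step-size trade-off in smoothing-based methods.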
Journal overview:
Computational Optimization and Applications is a peer-reviewed journal committed to the timely publication of research and tutorial papers on the analysis and development of computational algorithms and modeling technology for optimization. Algorithms for either general classes of optimization problems or more specific applied problems are of interest. Both stochastic and deterministic algorithms will be considered. Papers that provide theoretical analysis along with carefully designed computational experiments are particularly welcome.
Topics of interest include, but are not limited to, the following:
Large Scale Optimization,
Unconstrained Optimization,
Linear Programming,
Quadratic Programming, Complementarity Problems, and Variational Inequalities,
Constrained Optimization,
Nondifferentiable Optimization,
Integer Programming,
Combinatorial Optimization,
Stochastic Optimization,
Multiobjective Optimization,
Network Optimization,
Complexity Theory,
Approximations and Error Analysis,
Parametric Programming and Sensitivity Analysis,
Parallel Computing, Distributed Computing, and Vector Processing,
Software, Benchmarks, Numerical Experimentation and Comparisons,
Modelling Languages and Systems for Optimization,
Automatic Differentiation,
Applications in Engineering, Finance, Optimal Control, Optimal Design, Operations Research, Transportation, Economics, Communications, Manufacturing, and Management Science.