{"title":"非凸多目标优化问题的 BFGS 型算法的全局收敛性","authors":"L. F. Prudente, D. R. Souza","doi":"10.1007/s10589-024-00571-x","DOIUrl":null,"url":null,"abstract":"<p>We propose a modified BFGS algorithm for multiobjective optimization problems with global convergence, even in the absence of convexity assumptions on the objective functions. Furthermore, we establish a local superlinear rate of convergence of the method under usual conditions. Our approach employs Wolfe step sizes and ensures that the Hessian approximations are updated and corrected at each iteration to address the lack of convexity assumption. Numerical results shows that the introduced modifications preserve the practical efficiency of the BFGS method.</p>","PeriodicalId":55227,"journal":{"name":"Computational Optimization and Applications","volume":"11 1","pages":""},"PeriodicalIF":1.6000,"publicationDate":"2024-04-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Global convergence of a BFGS-type algorithm for nonconvex multiobjective optimization problems\",\"authors\":\"L. F. Prudente, D. R. Souza\",\"doi\":\"10.1007/s10589-024-00571-x\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>We propose a modified BFGS algorithm for multiobjective optimization problems with global convergence, even in the absence of convexity assumptions on the objective functions. Furthermore, we establish a local superlinear rate of convergence of the method under usual conditions. Our approach employs Wolfe step sizes and ensures that the Hessian approximations are updated and corrected at each iteration to address the lack of convexity assumption. Numerical results shows that the introduced modifications preserve the practical efficiency of the BFGS method.</p>\",\"PeriodicalId\":55227,\"journal\":{\"name\":\"Computational Optimization and Applications\",\"volume\":\"11 1\",\"pages\":\"\"},\"PeriodicalIF\":1.6000,\"publicationDate\":\"2024-04-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computational Optimization and Applications\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1007/s10589-024-00571-x\",\"RegionNum\":2,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"MATHEMATICS, APPLIED\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computational Optimization and Applications","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1007/s10589-024-00571-x","RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
Global convergence of a BFGS-type algorithm for nonconvex multiobjective optimization problems
We propose a modified BFGS algorithm for multiobjective optimization problems that is globally convergent even in the absence of convexity assumptions on the objective functions. Furthermore, we establish a local superlinear rate of convergence of the method under the usual conditions. Our approach employs Wolfe step sizes and ensures that the Hessian approximations are updated and corrected at each iteration to compensate for the absence of a convexity assumption. Numerical results show that the introduced modifications preserve the practical efficiency of the BFGS method.
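To make the ingredients mentioned in the abstract concrete, below is a minimal single-objective sketch of a BFGS iteration that combines a Wolfe line search with a curvature correction of the quasi-Newton update. It is an illustration of the general idea only, not the authors' multiobjective algorithm: the use of scipy.optimize.line_search for the Wolfe step size and the Li-Fukushima-style shift of y when the curvature condition fails are assumptions made here for the sketch.

```python
# Illustrative sketch only: single-objective BFGS with a Wolfe line search
# and a curvature correction for nonconvex problems.  The correction rule
# below (shifting y so that y^T s stays positive) is an assumed, generic
# choice, not necessarily the paper's exact update.
import numpy as np
from scipy.optimize import line_search


def bfgs_corrected(f, grad, x0, tol=1e-6, max_iter=200, eps=1e-8):
    n = x0.size
    H = np.eye(n)                # inverse Hessian approximation
    x = x0.copy()
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        d = -H @ g               # quasi-Newton search direction
        # Wolfe step size (scipy enforces Armijo + curvature conditions)
        alpha, *_ = line_search(f, grad, x, d, gfk=g)
        if alpha is None:        # line search failed; crude fallback
            alpha = 1e-3
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # Correction for nonconvexity: if y^T s is not sufficiently
        # positive, shift y along s so the BFGS update stays well defined.
        sy = y @ s
        if sy < eps * (s @ s):
            y = y + (eps + max(0.0, -sy / (s @ s))) * s
            sy = y @ s
        rho = 1.0 / sy
        V = np.eye(n) - rho * np.outer(s, y)
        H = V @ H @ V.T + rho * np.outer(s, s)   # standard inverse BFGS update
        x, g = x_new, g_new
    return x


# Usage example: minimize the (nonconvex) Rosenbrock function
f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
    200 * (x[1] - x[0] ** 2),
])
print(bfgs_corrected(f, grad, np.array([-1.2, 1.0])))
```

Roughly speaking, in the multiobjective setting of the paper the search direction comes from a subproblem over all objectives rather than a single gradient, and the correction is applied to the Hessian approximations used in that subproblem; the details are given in the article.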
About the journal:
Computational Optimization and Applications is a peer-reviewed journal committed to the timely publication of research and tutorial papers on the analysis and development of computational algorithms and modeling technology for optimization. Algorithms for either general classes of optimization problems or more specific applied problems are of interest. Both stochastic and deterministic algorithms will be considered. Papers that provide theoretical analysis along with carefully designed computational experiments are particularly welcome.
Topics of interest include, but are not limited to, the following:
Large Scale Optimization,
Unconstrained Optimization,
Linear Programming,
Quadratic Programming, Complementarity Problems, and Variational Inequalities,
Constrained Optimization,
Nondifferentiable Optimization,
Integer Programming,
Combinatorial Optimization,
Stochastic Optimization,
Multiobjective Optimization,
Network Optimization,
Complexity Theory,
Approximations and Error Analysis,
Parametric Programming and Sensitivity Analysis,
Parallel Computing, Distributed Computing, and Vector Processing,
Software, Benchmarks, Numerical Experimentation and Comparisons,
Modelling Languages and Systems for Optimization,
Automatic Differentiation,
Applications in Engineering, Finance, Optimal Control, Optimal Design, Operations Research, Transportation, Economics, Communications, Manufacturing, and Management Science.