{"title":"A family of conjugate gradient methods with guaranteed positiveness and descent for vector optimization","authors":"Qing-Rui He, Sheng-Jie Li, Bo-Ya Zhang, Chun-Rong Chen","doi":"10.1007/s10589-024-00609-0","DOIUrl":null,"url":null,"abstract":"<p>In this paper, we seek a new modification way to ensure the positiveness of the conjugate parameter and, based on the Dai-Yuan (DY) method in the vector setting, propose an associated family of conjugate gradient (CG) methods with guaranteed descent for solving unconstrained vector optimization problems. Several special members of the family are analyzed and the (sufficient) descent condition is established for them (in the vector sense). Under mild conditions, a general convergence result for the CG methods with specific parameters is presented, which, in particular, covers the global convergence of the aforementioned members. Furthermore, for the purpose of comparison, we then consider the direct extension versions of some Dai-Yuan type methods which are obtained by modifying the DY method of the scalar case. These vector extensions can retrieve the classical parameters in the scalar minimization case and their descent property and global convergence are also studied under mild assumptions. Finally, numerical experiments are given to illustrate the practical behavior of all proposed methods.</p>","PeriodicalId":55227,"journal":{"name":"Computational Optimization and Applications","volume":"47 1","pages":""},"PeriodicalIF":1.6000,"publicationDate":"2024-09-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computational Optimization and Applications","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1007/s10589-024-00609-0","RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
Citations: 0
Abstract
In this paper, we seek a new way to modify the conjugate parameter so as to ensure its positiveness and, building on the Dai-Yuan (DY) method in the vector setting, propose an associated family of conjugate gradient (CG) methods with guaranteed descent for solving unconstrained vector optimization problems. Several special members of the family are analyzed, and the (sufficient) descent condition is established for them in the vector sense. Under mild conditions, a general convergence result for the CG methods with specific parameters is presented, which in particular covers the global convergence of the aforementioned members. Furthermore, for comparison, we consider direct vector extensions of some Dai-Yuan-type methods, obtained by modifying the scalar DY method. These extensions recover the classical parameters in the scalar minimization case, and their descent properties and global convergence are likewise studied under mild assumptions. Finally, numerical experiments are presented to illustrate the practical behavior of all proposed methods.
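For context (this display is background, not reproduced from the paper), the classical scalar Dai-Yuan update that the abstract generalizes reads as follows, where g_k = ∇f(x_k) is the gradient and d_k the search direction:

```latex
% Classical scalar CG iteration with the Dai-Yuan (DY) parameter;
% the paper's vector-valued family generalizes this choice of \beta_k.
\[
  x_{k+1} = x_k + \alpha_k d_k, \qquad
  d_{k+1} = -g_{k+1} + \beta_k^{\mathrm{DY}} d_k, \qquad
  \beta_k^{\mathrm{DY}} = \frac{\lVert g_{k+1} \rVert^{2}}{d_k^{\top}\,(g_{k+1} - g_k)}.
\]
```

In the scalar case, the curvature part of the Wolfe conditions ensures d_k^T (g_{k+1} - g_k) > 0, so β_k^DY > 0 automatically; the positiveness and descent guarantees claimed in the abstract concern analogous properties for the modified vector-valued parameters.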
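As a concrete, deliberately simplified illustration, the sketch below implements the classical scalar DY method, not the paper's vector-valued family; the function name `dy_cg`, the Armijo backtracking line search, and the restart safeguard are all assumptions made for the sake of a short runnable example (DY convergence theory is usually stated under Wolfe line searches):

```python
# Minimal scalar Dai-Yuan CG sketch (illustrative only; NOT the paper's
# vector-valued method). Assumes a smooth objective f with gradient grad.
import numpy as np

def dy_cg(f, grad, x0, tol=1e-6, max_iter=500):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:  # safeguard: restart if d is not a descent direction
            d = -g
        # Armijo backtracking line search (the DY analysis normally assumes
        # Wolfe conditions, which also keep the DY denominator positive).
        alpha, rho, c = 1.0, 0.5, 1e-4
        while alpha > 1e-12 and f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        denom = d.dot(g_new - g)
        # Dai-Yuan parameter; fall back to steepest descent if denom degenerates.
        beta = g_new.dot(g_new) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: minimize the Rosenbrock function from a standard starting point.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
print(dy_cg(f, grad, [-1.2, 1.0]))  # should approach the minimizer (1, 1)
```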
Journal Description
Computational Optimization and Applications is a peer-reviewed journal committed to the timely publication of research and tutorial papers on the analysis and development of computational algorithms and modeling technology for optimization. Algorithms either for general classes of optimization problems or for more specific applied problems are of interest. Stochastic algorithms as well as deterministic algorithms will be considered. Papers that provide theoretical analysis together with carefully designed computational experiments are particularly welcome.
Topics of interest include, but are not limited to, the following:
Large Scale Optimization,
Unconstrained Optimization,
Linear Programming,
Quadratic Programming,
Complementarity Problems and Variational Inequalities,
Constrained Optimization,
Nondifferentiable Optimization,
Integer Programming,
Combinatorial Optimization,
Stochastic Optimization,
Multiobjective Optimization,
Network Optimization,
Complexity Theory,
Approximations and Error Analysis,
Parametric Programming and Sensitivity Analysis,
Parallel Computing, Distributed Computing, and Vector Processing,
Software, Benchmarks, Numerical Experimentation and Comparisons,
Modelling Languages and Systems for Optimization,
Automatic Differentiation,
Applications in Engineering, Finance, Optimal Control, Optimal Design, Operations Research,
Transportation, Economics, Communications, Manufacturing, and Management Science.