{"title":"用于无约束优化的有限内存子空间最小化共轭梯度算法","authors":"Zexian Liu, Yu-Hong Dai, Hongwei Liu","doi":"10.1007/s11590-024-02131-y","DOIUrl":null,"url":null,"abstract":"<p>Subspace minimization conjugate gradient (SMCG) methods are a class of quite efficient iterative methods for unconstrained optimization. The orthogonality is an important property of linear conjugate gradient method. It is however observed that the orthogonality of the gradients in linear conjugate gradient method is often lost, which usually causes slow convergence. Based on SMCG<span>\\(\\_\\)</span>BB (Liu and Liu in J Optim Theory Appl 180(3):879–906, 2019), we combine subspace minimization conjugate gradient method with the limited memory technique and present a limited memory subspace minimization conjugate gradient algorithm for unconstrained optimization. The proposed method includes two types of iterations: SMCG iteration and quasi-Newton (QN) iteration. In the SMCG iteration, the search direction is determined by solving a quadratic approximation problem, in which the important parameter is estimated based on some properties of the objective function at the current iterative point. In the QN iteration, a modified quasi-Newton method in the subspace is proposed to improve the orthogonality. Additionally, a modified strategy for choosing the initial stepsize is exploited. The global convergence of the proposed method is established under weak conditions. Some numerical results indicate that, for the tested functions in the CUTEr library, the proposed method has a great improvement over SMCG<span>\\(\\_\\)</span>BB, and it is comparable to the latest limited memory conjugate gradient software package CG<span>\\(\\_\\)</span>DESCENT (6.8) (Hager and Zhang in SIAM J Optim 23(4):2150–2168, 2013) and is also superior to the famous limited memory BFGS (L-BFGS) method.</p>","PeriodicalId":49720,"journal":{"name":"Optimization Letters","volume":"21 1","pages":""},"PeriodicalIF":1.3000,"publicationDate":"2024-07-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A limited memory subspace minimization conjugate gradient algorithm for unconstrained optimization\",\"authors\":\"Zexian Liu, Yu-Hong Dai, Hongwei Liu\",\"doi\":\"10.1007/s11590-024-02131-y\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Subspace minimization conjugate gradient (SMCG) methods are a class of quite efficient iterative methods for unconstrained optimization. The orthogonality is an important property of linear conjugate gradient method. It is however observed that the orthogonality of the gradients in linear conjugate gradient method is often lost, which usually causes slow convergence. Based on SMCG<span>\\\\(\\\\_\\\\)</span>BB (Liu and Liu in J Optim Theory Appl 180(3):879–906, 2019), we combine subspace minimization conjugate gradient method with the limited memory technique and present a limited memory subspace minimization conjugate gradient algorithm for unconstrained optimization. The proposed method includes two types of iterations: SMCG iteration and quasi-Newton (QN) iteration. In the SMCG iteration, the search direction is determined by solving a quadratic approximation problem, in which the important parameter is estimated based on some properties of the objective function at the current iterative point. In the QN iteration, a modified quasi-Newton method in the subspace is proposed to improve the orthogonality. 
Additionally, a modified strategy for choosing the initial stepsize is exploited. The global convergence of the proposed method is established under weak conditions. Some numerical results indicate that, for the tested functions in the CUTEr library, the proposed method has a great improvement over SMCG<span>\\\\(\\\\_\\\\)</span>BB, and it is comparable to the latest limited memory conjugate gradient software package CG<span>\\\\(\\\\_\\\\)</span>DESCENT (6.8) (Hager and Zhang in SIAM J Optim 23(4):2150–2168, 2013) and is also superior to the famous limited memory BFGS (L-BFGS) method.</p>\",\"PeriodicalId\":49720,\"journal\":{\"name\":\"Optimization Letters\",\"volume\":\"21 1\",\"pages\":\"\"},\"PeriodicalIF\":1.3000,\"publicationDate\":\"2024-07-02\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Optimization Letters\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1007/s11590-024-02131-y\",\"RegionNum\":4,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"MATHEMATICS, APPLIED\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Optimization Letters","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1007/s11590-024-02131-y","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
Citations: 0
Abstract
Subspace minimization conjugate gradient (SMCG) methods are a class of quite efficient iterative methods for unconstrained optimization. Orthogonality is an important property of the linear conjugate gradient method; it is observed, however, that the orthogonality of the gradients is often lost in practice, which usually causes slow convergence. Based on SMCG_BB (Liu and Liu, J Optim Theory Appl 180(3):879–906, 2019), we combine the subspace minimization conjugate gradient method with the limited memory technique and present a limited memory subspace minimization conjugate gradient algorithm for unconstrained optimization. The proposed method includes two types of iterations: SMCG iterations and quasi-Newton (QN) iterations. In an SMCG iteration, the search direction is determined by solving a quadratic approximation problem, in which the important parameter is estimated based on properties of the objective function at the current iterate. In a QN iteration, a modified quasi-Newton method in the subspace is proposed to improve the orthogonality. Additionally, a modified strategy for choosing the initial stepsize is exploited. The global convergence of the proposed method is established under weak conditions. Numerical results on the test functions in the CUTEr library indicate that the proposed method improves substantially on SMCG_BB, is competitive with the latest limited memory conjugate gradient software package CG_DESCENT (6.8) (Hager and Zhang, SIAM J Optim 23(4):2150–2168, 2013), and is also superior to the well-known limited memory BFGS (L-BFGS) method.
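To make the description above concrete, the following is a minimal NumPy sketch of the subspace-minimization idea behind SMCG methods: the search direction is obtained by minimizing a quadratic model of the objective over the two-dimensional subspace spanned by the current gradient and the previous step. It is not the algorithm of the paper: the function name smcg_direction, the BB-like estimate of g^T B g, and the safeguard thresholds are illustrative assumptions made here; the paper estimates this parameter from properties of the objective at the current iterate and adds QN iterations and a limited memory scheme on top.

import numpy as np

def smcg_direction(g, s, y, rho=None):
    """Illustrative SMCG step: minimize the quadratic model
    m(d) = g.T @ d + 0.5 * d.T @ B @ d over the subspace span{g, s},
    where g is the current gradient, s = x_k - x_{k-1}, y = g_k - g_{k-1}.
    B is never formed; its action on the subspace is approximated via
    g.T @ B @ s ~= g.T @ y,  s.T @ B @ s ~= s.T @ y,  g.T @ B @ g ~= rho.
    """
    gg, gs, gy, sy = g @ g, g @ s, g @ y, s @ y
    if sy <= 0:
        return -g  # no usable curvature information: fall back to steepest descent
    if rho is None:
        # Hypothetical BB-like estimate of g^T B g, clipped to stay positive;
        # the paper derives its own estimate from properties of the objective
        # at the current iterate.
        rho = max(gy / sy, 1e-8) * gg
    # Writing d = u*g + v*s, the stationarity conditions of the 2-D model are
    #   [rho  gy] [u]     [gg]
    #   [gy   sy] [v] = - [gs]
    M = np.array([[rho, gy], [gy, sy]])
    rhs = -np.array([gg, gs])
    try:
        u, v = np.linalg.solve(M, rhs)
    except np.linalg.LinAlgError:
        return -g
    d = u * g + v * s
    # Safeguard: fall back to steepest descent if d is not a descent direction.
    if d @ g >= -1e-10 * np.linalg.norm(d) * np.linalg.norm(g):
        return -g
    return d

Only a handful of inner products are needed to form and solve this 2x2 subproblem, which is part of what makes SMCG-type iterations cheap on large problems; the full algorithm of the paper additionally performs quasi-Newton iterations in a subspace and employs the limited memory technique, as the abstract describes.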
About the journal:
Optimization Letters is an international journal covering all aspects of optimization, including theory, algorithms, computational studies, and applications, and providing an outlet for rapid publication of short communications in the field. Originality, significance, quality and clarity are the essential criteria for choosing the material to be published.
Optimization Letters has been expanding in all directions at an astonishing rate during the last few decades. New algorithmic and theoretical techniques have been developed, the diffusion into other disciplines has proceeded at a rapid pace, and our knowledge of all aspects of the field has grown even more profound. At the same time one of the most striking trends in optimization is the constantly increasing interdisciplinary nature of the field.
Optimization Letters aims to communicate in a timely fashion all recent developments in optimization with concise short articles (limited to a total of ten journal pages). Such concise articles will be easily accessible to readers working in any aspect of optimization who wish to be informed of recent developments.