Two modified conjugate gradient methods for solving unconstrained optimization and application

Abd Elhamid Mehamdia, Y. Chaib, T. Bechouat
{"title":"求解无约束优化问题的两种修正共轭梯度法及其应用","authors":"Abd Elhamid Mehamdia, Y. Chaib, T. Bechouat","doi":"10.1051/ro/2023010","DOIUrl":null,"url":null,"abstract":"Conjugate gradient methods are a popular class of iterative methods for solving linear systems of equations and nonlinear optimization problems as they do not require the storage of any matrices. In order to obtain a theoretically effective and numerically efficient method, two modified conjugate gradient methods ( called the MCB1 and MCB2 methods ) are proposed. In which the coefficient βk in the two proposed methods is inspired by the structure of the conjugate gradient parameters in some existing conjugate gradient methods. Under the strong Wolfe line search, the sufficient descent property and global convergence of the MCB1 method are proved. Moreover, the MCB2 method generates a descent direction independently of any line search and produces good convergence properties when the strong Wolfe line search is employed. Preliminary numerical results show that the MCB1 and MCB2 methods are effective and robust in minimizing some unconstrained optimization problems and each of these modifications outperforms the four famous conjugate gradient methods. Furthermore, the proposed algorithms were extended to solve the problem of mode function.","PeriodicalId":20872,"journal":{"name":"RAIRO Oper. Res.","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2023-01-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Two modified conjugate gradient methods for solving unconstrained optimization and application\",\"authors\":\"Abd Elhamid Mehamdia, Y. Chaib, T. Bechouat\",\"doi\":\"10.1051/ro/2023010\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Conjugate gradient methods are a popular class of iterative methods for solving linear systems of equations and nonlinear optimization problems as they do not require the storage of any matrices. In order to obtain a theoretically effective and numerically efficient method, two modified conjugate gradient methods ( called the MCB1 and MCB2 methods ) are proposed. In which the coefficient βk in the two proposed methods is inspired by the structure of the conjugate gradient parameters in some existing conjugate gradient methods. Under the strong Wolfe line search, the sufficient descent property and global convergence of the MCB1 method are proved. Moreover, the MCB2 method generates a descent direction independently of any line search and produces good convergence properties when the strong Wolfe line search is employed. Preliminary numerical results show that the MCB1 and MCB2 methods are effective and robust in minimizing some unconstrained optimization problems and each of these modifications outperforms the four famous conjugate gradient methods. Furthermore, the proposed algorithms were extended to solve the problem of mode function.\",\"PeriodicalId\":20872,\"journal\":{\"name\":\"RAIRO Oper. Res.\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-01-31\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"RAIRO Oper. 
Res.\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1051/ro/2023010\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"RAIRO Oper. Res.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1051/ro/2023010","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

Conjugate gradient methods are a popular class of iterative methods for solving linear systems of equations and nonlinear optimization problems, as they do not require the storage of any matrices. In order to obtain a theoretically effective and numerically efficient method, two modified conjugate gradient methods (called the MCB1 and MCB2 methods) are proposed, in which the coefficient βk is inspired by the structure of the conjugate gradient parameters in several existing conjugate gradient methods. Under the strong Wolfe line search, the sufficient descent property and global convergence of the MCB1 method are proved. Moreover, the MCB2 method generates a descent direction independently of any line search and exhibits good convergence properties when the strong Wolfe line search is employed. Preliminary numerical results show that the MCB1 and MCB2 methods are effective and robust in minimizing a collection of unconstrained optimization problems, and that each of these modifications outperforms four well-known conjugate gradient methods. Furthermore, the proposed algorithms are extended to solve the mode function problem.
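For context, the general nonlinear conjugate gradient framework to which MCB1 and MCB2 belong, together with the strong Wolfe conditions and the sufficient descent property mentioned in the abstract, can be written as below. This is only the standard setting; the specific βk formulas proposed in the paper are not given in the abstract and are not reproduced here, and δ, σ, c are the usual line-search and descent constants.

```latex
% Generic nonlinear conjugate gradient iteration for min f(x), with g_k = \nabla f(x_k):
x_{k+1} = x_k + \alpha_k d_k, \qquad
d_k = \begin{cases} -g_k, & k = 0,\\ -g_k + \beta_k\, d_{k-1}, & k \ge 1. \end{cases}

% Strong Wolfe conditions on the step size \alpha_k (with 0 < \delta < \sigma < 1):
f(x_k + \alpha_k d_k) \le f(x_k) + \delta\, \alpha_k\, g_k^{\top} d_k,
\qquad
\bigl| g(x_k + \alpha_k d_k)^{\top} d_k \bigr| \le \sigma\, \bigl| g_k^{\top} d_k \bigr|.

% Sufficient descent property (c > 0 a constant independent of k):
g_k^{\top} d_k \le -c\, \lVert g_k \rVert^{2} \quad \text{for all } k.
```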
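A minimal runnable sketch of this family of methods is given below, assuming a standard PRP+ coefficient as a stand-in for the paper's MCB1/MCB2 coefficients, which are not stated in the abstract. The names nonlinear_cg, f, grad are illustrative only; the step size is computed with SciPy's line search, which enforces the strong Wolfe conditions.

```python
import numpy as np
from scipy.optimize import line_search


def nonlinear_cg(f, grad, x0, max_iter=1000, tol=1e-6):
    """Generic nonlinear conjugate gradient method with a strong Wolfe line search.

    beta below is the PRP+ coefficient, used only as a stand-in for the
    MCB1/MCB2 coefficients of the paper, which are not reproduced here.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                        # d_0 = -g_0
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # SciPy's line_search returns a step satisfying the strong Wolfe conditions.
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:         # line search failed; take a small safeguarded step
            alpha = 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))   # PRP+ stand-in for beta_k
        d = -g_new + beta * d     # d_{k+1} = -g_{k+1} + beta_k * d_k
        x, g = x_new, g_new
    return x


# Example: minimize the Rosenbrock function shipped with SciPy.
if __name__ == "__main__":
    from scipy.optimize import rosen, rosen_der
    print(nonlinear_cg(rosen, rosen_der, np.array([-1.2, 1.0])))
```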