{"title":"A New Modified Secant Condition for Non-linear Conjugate Gradient Methods with Global Convergence","authors":"Farhan Khalaf Muord, Muna M. M. Ali","doi":"10.52783/cana.v31.1056","DOIUrl":null,"url":null,"abstract":"The Conjugate Gradient Methods(CGM) are well-recognized techniques for handling nonlinear optimization problems. Dai and Liao (2001) employ the secant condition approach, this study utilizes the modified secant condition proposed by Yabe-Takano (2004) and Zhang and Xu (2001), which is satisfied at each iteration through the implementation of the strong Wolf-line search condition. Additionally, please provide three novel categories of conjugate gradient algorithms of this nature. We examined 15 well-known test functions. This novel approach utilises the existing gradient and function value to accurately approximate the goal function with high-order precision. The worldwide convergence of our novel algorithms is demonstrated under certain conditions. Numerical results are provided, and the efficiency is proven by comparing it to other approaches.","PeriodicalId":40036,"journal":{"name":"Communications on Applied Nonlinear Analysis","volume":"170 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-07-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Communications on Applied Nonlinear Analysis","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.52783/cana.v31.1056","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"Mathematics","Score":null,"Total":0}
Citations: 0
Abstract
Conjugate gradient methods (CGM) are well-recognized techniques for solving nonlinear optimization problems. Whereas Dai and Liao (2001) employed the classical secant condition, this study uses the modified secant condition proposed by Yabe and Takano (2004) and Zhang and Xu (2001), which is enforced at each iteration through a strong Wolfe line search. We propose three new classes of conjugate gradient algorithms of this type and evaluate them on 15 well-known test functions. The approach uses both gradient and function-value information to approximate the objective function with high-order accuracy. Global convergence of the new algorithms is established under suitable conditions. Numerical results are reported, and efficiency is demonstrated by comparison with other approaches.
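To make the ingredients concrete, the following is a minimal illustrative sketch (not the paper's exact new algorithms) of a Dai–Liao-type conjugate gradient iteration that uses a Yabe–Takano-style modified secant vector, which incorporates function values via θ_k = 6(f_k − f_{k+1}) + 3(g_k + g_{k+1})ᵀs_k. The parameters `t` and `rho`, the safeguards, and the use of SciPy's Wolfe-condition `line_search` are assumptions of this sketch, chosen for readability rather than taken from the paper.

```python
import numpy as np
from scipy.optimize import line_search  # Wolfe-condition line search

def dai_liao_cg(f, grad, x0, t=0.1, rho=1.0, tol=1e-6, max_iter=200):
    """Illustrative Dai-Liao-type CG with a Yabe-Takano-style
    modified secant vector; t, rho and safeguards are assumed defaults."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    fx = f(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Step length satisfying the Wolfe conditions
        alpha = line_search(f, grad, x, d, gfk=g, old_fval=fx)[0]
        if alpha is None:
            alpha = 1e-4  # fallback if the line search fails
        x_new = x + alpha * d
        g_new = grad(x_new)
        f_new = f(x_new)
        s = x_new - x
        y = g_new - g
        # Modified secant vector: z = y + rho * theta / (s^T s) * s,
        # with theta built from function values and gradients
        theta = 6.0 * (fx - f_new) + 3.0 * (g + g_new) @ s
        z = y + rho * theta / (s @ s) * s
        denom = d @ z
        # Dai-Liao-type parameter: beta = g_{k+1}^T (z - t s) / (d^T z)
        beta = (g_new @ (z - t * s)) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        if g_new @ d >= 0.0:
            d = -g_new  # restart with steepest descent if not a descent direction
        x, g, fx = x_new, g_new, f_new
    return x
```

On a strictly convex quadratic, θ_k vanishes and the modified secant vector reduces to the ordinary y_k, so the sketch behaves like a standard Dai–Liao method there; the function-value correction matters only for genuinely nonlinear objectives.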