Yoksal A. Laylani, Hisham M. Khudhur, Edrees M. Nori, K. Abbo
{"title":"Hestenes-Stiefel和戴元共轭梯度法的杂交","authors":"Yoksal A. Laylani, Hisham M. Khudhur, Edrees M. Nori, K. Abbo","doi":"10.29020/nybg.ejpam.v16i2.4746","DOIUrl":null,"url":null,"abstract":"The paper in discusses conjugate gradient methods, which are often used for unconstrained optimization and are the subject of this discussion. In the process of studying and implementing conjugate gradient algorithms, it is standard practice to assume that the descent condition is true. Despite the fact that this sort of approach very seldom results in search routes that slope in a downward direction, this assumption is made routinely. As a result of this research, we propose a revised method known as the improved hybrid conjugate gradient technique. This method is a convex combination of the Dai-Yuan and Hestenes-Stiefel methodologies. The descending property and global convergence are both exhibited by the Wolfe line search. The numerical data demonstrates that the strategy that was presented is an efficient one.","PeriodicalId":51807,"journal":{"name":"European Journal of Pure and Applied Mathematics","volume":" ","pages":""},"PeriodicalIF":1.0000,"publicationDate":"2023-04-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A hybridization of the Hestenes-Stiefel and Dai-Yuan Conjugate Gradient Methods\",\"authors\":\"Yoksal A. Laylani, Hisham M. Khudhur, Edrees M. Nori, K. Abbo\",\"doi\":\"10.29020/nybg.ejpam.v16i2.4746\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The paper in discusses conjugate gradient methods, which are often used for unconstrained optimization and are the subject of this discussion. In the process of studying and implementing conjugate gradient algorithms, it is standard practice to assume that the descent condition is true. 
Despite the fact that this sort of approach very seldom results in search routes that slope in a downward direction, this assumption is made routinely. As a result of this research, we propose a revised method known as the improved hybrid conjugate gradient technique. This method is a convex combination of the Dai-Yuan and Hestenes-Stiefel methodologies. The descending property and global convergence are both exhibited by the Wolfe line search. The numerical data demonstrates that the strategy that was presented is an efficient one.\",\"PeriodicalId\":51807,\"journal\":{\"name\":\"European Journal of Pure and Applied Mathematics\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":1.0000,\"publicationDate\":\"2023-04-30\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"European Journal of Pure and Applied Mathematics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.29020/nybg.ejpam.v16i2.4746\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"MATHEMATICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"European Journal of Pure and Applied Mathematics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.29020/nybg.ejpam.v16i2.4746","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS","Score":null,"Total":0}
A hybridization of the Hestenes-Stiefel and Dai-Yuan Conjugate Gradient Methods
This paper discusses conjugate gradient methods, which are widely used for unconstrained optimization. When studying and implementing conjugate gradient algorithms, it is standard practice to assume that the descent condition holds, even though such methods do not always generate descent search directions. In this work we propose an improved hybrid conjugate gradient method, constructed as a convex combination of the Dai-Yuan and Hestenes-Stiefel methods. Under the Wolfe line search, the method satisfies the descent property and converges globally. Numerical results demonstrate that the proposed strategy is efficient.
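To make the idea concrete, the sketch below implements a generic nonlinear conjugate gradient loop whose beta parameter is a convex combination of the Hestenes-Stiefel and Dai-Yuan formulas. The paper's specific rule for choosing the mixing parameter is not reproduced here; a fixed `theta = 0.5` is a hypothetical choice for illustration, and a simple Armijo backtracking search stands in for the full Wolfe line search.

```python
import numpy as np

# Standard beta formulas, with y = g_new - g_old and d the previous direction:
#   HS: beta = (g_new . y) / (d . y)      DY: beta = ||g_new||^2 / (d . y)
# The hybrid beta is (1 - theta) * beta_HS + theta * beta_DY.

def hybrid_cg(f, grad, x0, theta=0.5, tol=1e-8, max_iter=500):
    """Nonlinear CG with a convex combination of the HS and DY betas.

    theta=0.5 is an illustrative fixed mixing parameter, not the
    adaptive choice analyzed in the paper.
    """
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search enforcing the Armijo (sufficient
        # decrease) part of the Wolfe conditions -- a simplification.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d @ y
        if abs(denom) < 1e-16:
            beta = 0.0  # degenerate denominator: fall back to steepest descent
        else:
            beta_hs = (g_new @ y) / denom
            beta_dy = (g_new @ g_new) / denom
            beta = (1 - theta) * beta_hs + theta * beta_dy
        d = -g_new + beta * d
        if g_new @ d >= 0:  # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage on a simple quadratic whose minimizer is (1, 2).
f = lambda x: (x[0] - 1) ** 2 + 10 * (x[1] - 2) ** 2
grad = lambda x: np.array([2 * (x[0] - 1), 20 * (x[1] - 2)])
x_star = hybrid_cg(f, grad, np.array([0.0, 0.0]))
```

The descent safeguard and the restart on a near-zero denominator are common practical additions; the convergence theory in the paper rests on the Wolfe conditions rather than on these heuristics.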