Alternative extension of the Hager–Zhang conjugate gradient method for vector optimization

Authors: Qingjie Hu, Liping Zhu, Yu Chen
DOI: 10.1007/s10589-023-00548-2
Published: 2024-01-24 (Journal Article)

Abstract

Recently, Gonçalves and Prudente proposed an extension of the Hager–Zhang nonlinear conjugate gradient method for vector optimization (Comput Optim Appl 76:889–916, 2020). They first demonstrated that a direct extension of the Hager–Zhang method to vector optimization may fail to produce descent in the vector sense, even when an exact line search is employed. Using a sufficiently accurate line search, they then introduced a self-adjusting Hager–Zhang conjugate gradient method in the vector sense and proved its global convergence without regular restarts or any convexity assumptions. In this paper, we propose an alternative extension of the Hager–Zhang nonlinear conjugate gradient method for vector optimization that preserves its desirable scalar property, namely sufficient descent, without relying on any line search or convexity assumption. Furthermore, we establish its global convergence under the Wolfe line search and mild assumptions. Finally, numerical experiments are presented to illustrate the practical behavior of the proposed method.
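For context, the scalar property the abstract refers to comes from the original Hager–Zhang method (2005), whose search direction satisfies the sufficient descent bound g_k^T d_k <= -(7/8)||g_k||^2 independently of the line search. The sketch below is a minimal, hypothetical illustration of that scalar update on a small convex quadratic test problem (the matrix A, vector b, and the backtracking Armijo search are assumptions for the demo, not the method or test set of the paper, which uses a Wolfe line search and vector-valued objectives):

```python
import numpy as np

def hz_direction(g_new, g_old, d_old):
    """Scalar Hager-Zhang CG direction.

    beta = (y - 2*d*||y||^2/(d^T y))^T g_new / (d^T y), with a steepest-descent
    restart when d^T y is numerically zero. The resulting direction satisfies
    g_new^T d <= -(7/8)*||g_new||^2 regardless of the step size used.
    """
    y = g_new - g_old
    dy = d_old @ y
    if abs(dy) < 1e-12:                       # degenerate curvature: restart
        return -g_new
    beta = ((y - 2.0 * d_old * (y @ y) / dy) @ g_new) / dy
    return -g_new + beta * d_old

# Hypothetical test problem: f(x) = 0.5 x^T A x - b^T x, strictly convex.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

x = np.zeros(2)
g = grad(x)
d = -g
for _ in range(100):
    if np.linalg.norm(g) < 1e-10:
        break
    # Backtracking Armijo search: a simple stand-in for the Wolfe search
    # analyzed in the paper.
    t = 1.0
    while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
        t *= 0.5
    x_new = x + t * d
    g_new = grad(x_new)
    d_new = hz_direction(g_new, g, d)
    # Sufficient descent holds without any line-search requirement:
    assert g_new @ d_new <= -7.0 / 8.0 * (g_new @ g_new) + 1e-10
    x, g, d = x_new, g_new, d_new

print(np.linalg.norm(A @ x - b))  # residual of the stationarity condition A x = b
```

The in-loop assertion is the point of the demo: the bound g^T d <= -(7/8)||g||^2 is checked at every iteration even though the step sizes come from a crude backtracking rule. The paper's contribution is an extension that keeps exactly this kind of line-search-free descent guarantee in the vector-optimization setting.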