{"title":"非线性欠定最小二乘问题的分割预处理方案","authors":"Nadja Vater, Alfio Borzì","doi":"10.1002/nla.2558","DOIUrl":null,"url":null,"abstract":"The convergence of preconditioned gradient methods for nonlinear underdetermined least squares problems arising in, for example, supervised learning of overparameterized neural networks is investigated. In this general setting, conditions are given that guarantee the existence of global minimizers that correspond to zero residuals and a proof of the convergence of a gradient method to these global minima is presented. In order to accelerate convergence of the gradient method, different preconditioning strategies are developed and analyzed. In particular, a left randomized preconditioner and a right coarse‐level correction preconditioner are combined and investigated. It is demonstrated that the resulting split preconditioned two‐level gradient method incorporates the advantages of both approaches and performs very efficiently.","PeriodicalId":49731,"journal":{"name":"Numerical Linear Algebra with Applications","volume":null,"pages":null},"PeriodicalIF":1.8000,"publicationDate":"2024-04-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A split preconditioning scheme for nonlinear underdetermined least squares problems\",\"authors\":\"Nadja Vater, Alfio Borzì\",\"doi\":\"10.1002/nla.2558\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The convergence of preconditioned gradient methods for nonlinear underdetermined least squares problems arising in, for example, supervised learning of overparameterized neural networks is investigated. In this general setting, conditions are given that guarantee the existence of global minimizers that correspond to zero residuals and a proof of the convergence of a gradient method to these global minima is presented. In order to accelerate convergence of the gradient method, different preconditioning strategies are developed and analyzed. In particular, a left randomized preconditioner and a right coarse‐level correction preconditioner are combined and investigated. It is demonstrated that the resulting split preconditioned two‐level gradient method incorporates the advantages of both approaches and performs very efficiently.\",\"PeriodicalId\":49731,\"journal\":{\"name\":\"Numerical Linear Algebra with Applications\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":1.8000,\"publicationDate\":\"2024-04-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Numerical Linear Algebra with Applications\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1002/nla.2558\",\"RegionNum\":3,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"MATHEMATICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Numerical Linear Algebra with Applications","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1002/nla.2558","RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS","Score":null,"Total":0}
A split preconditioning scheme for nonlinear underdetermined least squares problems
The convergence of preconditioned gradient methods for nonlinear underdetermined least squares problems, arising for example in the supervised learning of overparameterized neural networks, is investigated. In this general setting, conditions are given that guarantee the existence of global minimizers corresponding to zero residuals, and a proof of the convergence of a gradient method to these global minima is presented. To accelerate the convergence of the gradient method, different preconditioning strategies are developed and analyzed. In particular, a left randomized preconditioner and a right coarse-level correction preconditioner are combined and investigated. It is demonstrated that the resulting split preconditioned two-level gradient method incorporates the advantages of both approaches and performs very efficiently.
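As a rough sketch of the setting (based only on the abstract; the notation and the concrete iteration below are assumptions, not the authors' exact formulation), the problem is a nonlinear least squares problem

$$\min_{x\in\mathbb{R}^n}\; \tfrac{1}{2}\,\|F(x)\|_2^2, \qquad F:\mathbb{R}^n\to\mathbb{R}^m, \quad m<n,$$

which is underdetermined because there are more unknowns than residuals, so zero-residual global minimizers with $F(x^\ast)=0$ can exist. A plain gradient step reads $x_{k+1}=x_k-\alpha\,J(x_k)^\top F(x_k)$ with Jacobian $J$. With a left preconditioner $M_L$ (e.g., a randomized sketching matrix applied to the residual) and a right preconditioner acting through a symmetric positive definite matrix $P_R$ (e.g., built from a coarse-level correction), a split preconditioned step could take the form

$$x_{k+1} \;=\; x_k \;-\; \alpha\, P_R\, J(x_k)^\top M_L^\top M_L\, F(x_k),$$

that is, the gradient of $\tfrac{1}{2}\|M_L F(x)\|_2^2$ rescaled on the right by $P_R$. A minimal numerical toy along these lines (all choices here, including the tanh residual map, the fresh Gaussian sketch per step, and the identity placeholder for the coarse-level correction, are illustrative assumptions):

    import numpy as np

    rng = np.random.default_rng(0)
    m, n = 20, 50                                # m < n: underdetermined
    A = rng.standard_normal((m, n)) / np.sqrt(n)

    def F(x):
        # toy nonlinear residual; tanh mimics a network nonlinearity
        return np.tanh(A @ x) - 0.4

    def J(x):
        # Jacobian of F at x
        return (1.0 - np.tanh(A @ x) ** 2)[:, None] * A

    s = 10                                       # sketch size, left preconditioner
    P_R = np.eye(n)                              # placeholder: coarse-level correction
    x, alpha = np.zeros(n), 0.1

    for k in range(10000):
        S = rng.standard_normal((s, m)) / np.sqrt(s)   # fresh randomized sketch
        g = J(x).T @ (S.T @ (S @ F(x)))                # grad of (1/2)||S F(x)||^2
        x = x - alpha * (P_R @ g)                      # split preconditioned step
        if np.linalg.norm(F(x)) < 1e-6:
            break

    print(k, np.linalg.norm(F(x)))

Because the problem is underdetermined and a zero-residual solution exists here (the target lies in the range of tanh and A has full row rank almost surely), the sketched gradient noise vanishes at the solution, so a constant step size can still drive the residual to zero.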
Journal overview:
Manuscripts submitted to Numerical Linear Algebra with Applications should include large-scale, broad-interest applications in which challenging computational results are integral to the approach investigated and analysed. Manuscripts that, in the Editor's view, do not satisfy these conditions will not be accepted for review.
Numerical Linear Algebra with Applications welcomes submissions addressing the development, analysis, and application of linear algebra algorithms for problems arising in multilinear (tensor) algebra, in statistics (such as Markov chains), and in the deterministic and stochastic modelling of large-scale networks, together with algorithm development, performance analysis, and related computational aspects.
Topics covered include: Standard and Generalized Conjugate Gradients, Multigrid and Other Iterative Methods; Preconditioning Methods; Direct Solution Methods; Numerical Methods for Eigenproblems; Newton-like Methods for Nonlinear Equations; Parallel and Vectorizable Algorithms in Numerical Linear Algebra; Application of Methods of Numerical Linear Algebra in Science, Engineering and Economics.