{"title":"Entropy-based convergence rates of greedy algorithms","authors":"Yuwen Li, Jonathan W. Siegel","doi":"10.1142/s0218202524500143","DOIUrl":null,"url":null,"abstract":"<p>We present convergence estimates of two types of greedy algorithms in terms of the entropy numbers of underlying compact sets. In the first part, we measure the error of a standard greedy reduced basis method for parametric PDEs by the entropy numbers of the solution manifold in Banach spaces. This contrasts with the classical analysis based on the Kolmogorov <span><math altimg=\"eq-00001.gif\" display=\"inline\" overflow=\"scroll\"><mi>n</mi></math></span><span></span>-widths and enables us to obtain direct comparisons between the algorithm error and the entropy numbers, where the multiplicative constants are explicit and simple. The entropy-based convergence estimate is sharp and improves upon the classical width-based analysis of reduced basis methods for elliptic model problems. In the second part, we derive a novel and simple convergence analysis of the classical orthogonal greedy algorithm for nonlinear dictionary approximation using the entropy numbers of the symmetric convex hull of the dictionary. This also improves upon existing results by giving a direct comparison between the algorithm error and the entropy numbers.</p>","PeriodicalId":18311,"journal":{"name":"Mathematical Models and Methods in Applied Sciences","volume":"135 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-02-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Mathematical Models and Methods in Applied Sciences","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1142/s0218202524500143","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
We present convergence estimates of two types of greedy algorithms in terms of the entropy numbers of underlying compact sets. In the first part, we measure the error of a standard greedy reduced basis method for parametric PDEs by the entropy numbers of the solution manifold in Banach spaces. This contrasts with the classical analysis based on the Kolmogorov n-widths and enables us to obtain direct comparisons between the algorithm error and the entropy numbers, where the multiplicative constants are explicit and simple. The entropy-based convergence estimate is sharp and improves upon the classical width-based analysis of reduced basis methods for elliptic model problems. In the second part, we derive a novel and simple convergence analysis of the classical orthogonal greedy algorithm for nonlinear dictionary approximation using the entropy numbers of the symmetric convex hull of the dictionary. This also improves upon existing results by giving a direct comparison between the algorithm error and the entropy numbers.
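To make the first part concrete: a greedy reduced basis method builds the reduced space one snapshot at a time, at each step adding the parametric solution that is currently worst approximated by the space built so far. The following is a minimal sketch of that selection loop, assuming a finite training set of precomputed snapshots and using the exact projection error in place of the cheap a posteriori error estimator used in practice (the "weak" greedy variant); the function name and arguments are illustrative and are not taken from the paper.

```python
import numpy as np

def greedy_reduced_basis(snapshots, n_basis, tol=1e-10):
    """Strong greedy selection over a finite training set of snapshots.

    snapshots : array of shape (m, P); column j is the solution u(mu_j).
    n_basis   : maximum number of reduced basis functions to select.

    Returns the indices of the selected snapshots and an orthonormal basis Q.
    """
    m, P = snapshots.shape
    Q = np.zeros((m, 0))
    chosen = []
    for _ in range(n_basis):
        # Projection error of every snapshot onto the current reduced space.
        errors = np.linalg.norm(snapshots - Q @ (Q.T @ snapshots), axis=0)
        j = int(np.argmax(errors))
        if errors[j] < tol:
            break
        chosen.append(j)
        # Gram-Schmidt step: orthonormalize the new snapshot against Q.
        v = snapshots[:, j] - Q @ (Q.T @ snapshots[:, j])
        Q = np.hstack([Q, (v / np.linalg.norm(v))[:, None]])
    return chosen, Q
```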
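For the second part, the classical orthogonal greedy algorithm (known in signal processing as orthogonal matching pursuit) adds, at each step, the dictionary element most correlated with the current residual and then re-projects the target onto the span of all elements selected so far. Below is a minimal sketch assuming a finite dictionary stored as the columns of a matrix; again, the names are illustrative, not code from the paper.

```python
import numpy as np

def orthogonal_greedy(f, dictionary, n_steps):
    """Orthogonal greedy algorithm over a finite dictionary.

    f          : target vector, shape (m,)
    dictionary : columns are normalized dictionary elements, shape (m, N)
    n_steps    : number of greedy iterations

    Returns the selected column indices and the final approximation of f.
    """
    selected = []
    approx = np.zeros_like(f, dtype=float)
    for _ in range(n_steps):
        residual = f - approx
        # Greedy selection: atom most correlated with the current residual.
        scores = np.abs(dictionary.T @ residual)
        k = int(np.argmax(scores))
        if k not in selected:
            selected.append(k)
        # Orthogonal projection of f onto the span of the selected atoms.
        A = dictionary[:, selected]
        coef, *_ = np.linalg.lstsq(A, f, rcond=None)
        approx = A @ coef
    return selected, approx
```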