Pub Date : 2020-05-18 DOI: 10.1515/9783110635461-004
{"title":"4. On the power of random information","authors":"","doi":"10.1515/9783110635461-004","DOIUrl":"https://doi.org/10.1515/9783110635461-004","url":null,"abstract":"","PeriodicalId":443134,"journal":{"name":"Multivariate Algorithms and Information-Based Complexity","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-05-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130719572","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2020-05-18 DOI: 10.1515/9783110635461-001
{"title":"1. The control variate integration algorithm for multivariate functions defined at scattered data points","authors":"","doi":"10.1515/9783110635461-001","DOIUrl":"https://doi.org/10.1515/9783110635461-001","url":null,"abstract":"","PeriodicalId":443134,"journal":{"name":"Multivariate Algorithms and Information-Based Complexity","volume":"4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-05-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122732458","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2020-05-18 DOI: 10.1515/9783110635461-003
{"title":"3. RBF-based penalized least-squares approximation of noisy scattered data on the sphere","authors":"","doi":"10.1515/9783110635461-003","DOIUrl":"https://doi.org/10.1515/9783110635461-003","url":null,"abstract":"","PeriodicalId":443134,"journal":{"name":"Multivariate Algorithms and Information-Based Complexity","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-05-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116492485","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2020-05-18 DOI: 10.1515/9783110635461-006
{"title":"6. ε-Superposition and truncation dimensions, and multivariate method for∞-variate linear problems","authors":"","doi":"10.1515/9783110635461-006","DOIUrl":"https://doi.org/10.1515/9783110635461-006","url":null,"abstract":"","PeriodicalId":443134,"journal":{"name":"Multivariate Algorithms and Information-Based Complexity","volume":"104 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-05-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116011729","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2020-05-18 DOI: 10.1515/9783110635461-201
Preface: Multivariate algorithms and information-based complexity

The authors of this book include several of the invited speakers at the workshop Multivariate Algorithms and Information-Based Complexity, which was part of the RICAM Special Semester on Multivariate Algorithms and their Foundations in Number Theory in the fall of 2018. The special semester consisted of four larger and two smaller workshops on various topics, ranging from Pseudo-Randomness and Discrepancy Theory to Information-Based Complexity and Uncertainty Quantification. This book arises from the second workshop, which took place at the Johann Radon Institute for Computational and Applied Mathematics (RICAM) of the Austrian Academy of Sciences in Linz, Austria, on November 5–9, 2018.

Multivariate continuous problems occur in a multitude of practical applications, such as physics, finance, computer graphics, and chemistry. The number of variables involved, d, can be in the hundreds or thousands. The information complexity of a given problem is the minimal number of information operations required by the best algorithm to solve the problem for a prescribed set of inputs within a certain error threshold, ε. Typical examples of information operations are function values and linear functionals. The field of information-based complexity (IBC), founded by Traub and Wozniakowski in the 1980s, analyzes the information complexity of multivariate problems and determines how it depends on d and ε. A crucial question is under which circumstances one can avoid a curse of dimensionality, namely, exponential growth of the information complexity with d.

This book addresses the analysis of multivariate (continuous) problems, especially from the IBC viewpoint. The problems discussed by the authors reflect the breadth of current inquiry under the umbrella of multivariate algorithms and IBC. The chapter entitled "The control variate integration algorithm for multivariate functions defined at scattered data points" studies a method of approximating the integral of a multivariate function in which one uses the exact integral of a control variate based on a least-squares multivariate quasi-interpolant. Numerical examples demonstrate that such an algorithm can overcome the curse of dimensionality for multivariate least-squares fits. The second chapter, titled "An adaptive random bit multilevel algorithm for SDEs", considers the approximation of expectations of functionals applied to the solutions of stochastic differential equations by employing Monte Carlo methods based on random bits instead of random numbers. An adaptive random bit multilevel algorithm is provided and compared numerically to other methods. The chapter "RBF-based penalized least-squares approximation of noisy scattered data on the sphere" deals with the approximation of noisy scattered data on the 2-dimensional unit sphere. In particular, global and local penalized least-squares approximations based on radial basis functions (RBFs) are explored. The authors of the fourth chapter in this book, "On the power of random information", …
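The control variate idea summarized in the preface can be illustrated with a minimal, hedged Python sketch. It shows only the generic estimator I ≈ I_g + (1/n) Σ (f(x_i) − g(x_i)); the surrogate g, its exact integral I_g, and all names below are illustrative assumptions, not the chapter's quasi-interpolant construction.

# Hedged sketch (not the chapter's algorithm): a generic control-variate
# Monte Carlo estimator for the integral of f over the unit cube [0, 1]^d.
# The chapter builds the control variate g from a least-squares multivariate
# quasi-interpolant with a known exact integral; here g and I_g are assumed given.

import numpy as np

def control_variate_integral(f, g, I_g, d, n, seed=None):
    """Estimate the integral of f over [0, 1]^d using g as a control variate.

    f    : integrand, maps an (n, d) array of points to n values
    g    : control variate with known exact integral I_g over [0, 1]^d
    n    : number of Monte Carlo samples
    """
    rng = np.random.default_rng(seed)
    x = rng.random((n, d))            # uniform samples in the unit cube
    residual = f(x) - g(x)            # only the residual f - g is sampled
    return I_g + residual.mean()      # exact part plus Monte Carlo correction

if __name__ == "__main__":
    # Toy usage with a hypothetical integrand and a crude linear surrogate.
    d = 10
    f = lambda x: np.prod(1.0 + 0.1 * (x - 0.5), axis=1)
    g = lambda x: 1.0 + 0.1 * np.sum(x - 0.5, axis=1)   # linearized surrogate
    I_g = 1.0                                           # its exact integral
    print(control_variate_integral(f, g, I_g, d, n=10_000, seed=0))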
Pub Date : 2019-03-26 DOI: 10.1515/9783110635461-007
7. Adaptive approximation for multivariate linear problems with inputs lying in a cone
Yuhan Ding, F. J. Hickernell, P. Kritzer, Simon Mak
We study adaptive approximation algorithms for general multivariate linear problems where the sets of input functions are non-convex cones. While it is known that adaptive algorithms perform essentially no better than non-adaptive algorithms for convex input sets, the situation may be different for non-convex sets. A typical example considered here is function approximation based on series expansions. Given an error tolerance, we use series coefficients of the input to construct an approximate solution such that the error does not exceed this tolerance. We study the situation where we can bound the norm of the input based on a pilot sample, and the situation where we keep track of the decay rate of the series coefficients of the input. Moreover, we consider situations where it makes sense to infer coordinate and smoothness importance. Besides performing an error analysis, we also study the information cost of our algorithms and the computational complexity of our problems, and we identify conditions under which we can avoid a curse of dimensionality.
{"title":"7. Adaptive approximation for multivariate linear problems with inputs lying in a cone","authors":"Yuhan Ding, F. J. Hickernell, P. Kritzer, Simon Mak","doi":"10.1515/9783110635461-007","DOIUrl":"https://doi.org/10.1515/9783110635461-007","url":null,"abstract":"We study adaptive approximation algorithms for general multivariate linear problems where the sets of input functions are non-convex cones. While it is known that adaptive algorithms perform essentially no better than non-adaptive algorithms for convex input sets, the situation may be different for non-convex sets. A typical example considered here is function approximation based on series expansions. Given an error tolerance, we use series coefficients of the input to construct an approximate solution such that the error does not exceed this tolerance. We study the situation where we can bound the norm of the input based on a pilot sample, and the situation where we keep track of the decay rate of the series coefficients of the input. Moreover, we consider situations where it makes sense to infer coordinate and smoothness importance. Besides performing an error analysis, we also study the information cost of our algorithms and the computational complexity of our problems, and we identify conditions under which we can avoid a curse of dimensionality.","PeriodicalId":443134,"journal":{"name":"Multivariate Algorithms and Information-Based Complexity","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115702199","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date : 2019-02-26 DOI: 10.1515/9783110635461-002
2. An adaptive random bit multilevel algorithm for SDEs
M. Giles, M. Hefter, Lukas Mayer, K. Ritter
We study the approximation of expectations $\operatorname{E}(f(X))$ for solutions $X$ of stochastic differential equations and functionals $f$ on the path space by means of Monte Carlo algorithms that only use random bits instead of random numbers. We construct an adaptive random bit multilevel algorithm, which is based on the Euler scheme, the Lévy-Ciesielski representation of the Brownian motion, and asymptotically optimal random bit approximations of the standard normal distribution. We numerically compare this algorithm with the adaptive classical multilevel Euler algorithm for a geometric Brownian motion, an Ornstein-Uhlenbeck process, and a Cox-Ingersoll-Ross process.
{"title":"2. An adaptive random bit multilevel algorithm for SDEs","authors":"M. Giles, M. Hefter, Lukas Mayer, K. Ritter","doi":"10.1515/9783110635461-002","DOIUrl":"https://doi.org/10.1515/9783110635461-002","url":null,"abstract":"We study the approximation of expectations $operatorname{E}(f(X))$ for solutions $X$ of stochastic differential equations and functionals $f$ on the path space by means of Monte Carlo algorithms that only use random bits instead of random numbers. We construct an adaptive random bit multilevel algorithm, which is based on the Euler scheme, the L'evy-Ciesielski representation of the Brownian motion, and asymptotically optimal random bit approximations of the standard normal distribution. We numerically compare this algorithm with the adaptive classical multilevel Euler algorithm for a geometric Brownian motion, an Ornstein-Uhlenbeck process, and a Cox-Ingersoll-Ross process.","PeriodicalId":443134,"journal":{"name":"Multivariate Algorithms and Information-Based Complexity","volume":"179 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-02-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125821181","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}