{"title":"Gradient-enhanced sparse Hermite polynomial expansions for pricing and hedging high-dimensional American options","authors":"Jiefei Yang, Guanglian Li","doi":"arxiv-2405.02570","DOIUrl":null,"url":null,"abstract":"We propose an efficient and easy-to-implement gradient-enhanced least squares\nMonte Carlo method for computing price and Greeks (i.e., derivatives of the\nprice function) of high-dimensional American options. It employs the sparse\nHermite polynomial expansion as a surrogate model for the continuation value\nfunction, and essentially exploits the fast evaluation of gradients. The\nexpansion coefficients are computed by solving a linear least squares problem\nthat is enhanced by gradient information of simulated paths. We analyze the\nconvergence of the proposed method, and establish an error estimate in terms of\nthe best approximation error in the weighted $H^1$ space, the statistical error\nof solving discrete least squares problems, and the time step size. We present\ncomprehensive numerical experiments to illustrate the performance of the\nproposed method. 
The results show that it outperforms the state-of-the-art\nleast squares Monte Carlo method with more accurate price, Greeks, and optimal\nexercise strategies in high dimensions but with nearly identical computational\ncost, and it can deliver comparable results with recent neural network-based\nmethods up to dimension 100.","PeriodicalId":501294,"journal":{"name":"arXiv - QuantFin - Computational Finance","volume":"20 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-05-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - QuantFin - Computational Finance","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2405.02570","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
We propose an efficient and easy-to-implement gradient-enhanced least squares
Monte Carlo method for computing price and Greeks (i.e., derivatives of the
price function) of high-dimensional American options. It employs the sparse
Hermite polynomial expansion as a surrogate model for the continuation value
function, and essentially exploits the fast evaluation of gradients. The
expansion coefficients are computed by solving a linear least squares problem
that is enhanced by gradient information of simulated paths. We analyze the
convergence of the proposed method, and establish an error estimate in terms of
the best approximation error in the weighted $H^1$ space, the statistical error
of solving discrete least squares problems, and the time step size. We present
comprehensive numerical experiments to illustrate the performance of the
proposed method. The results show that it outperforms the state-of-the-art
least squares Monte Carlo method, yielding more accurate prices, Greeks, and
optimal exercise strategies in high dimensions at nearly identical computational
cost, and that it delivers results comparable to those of recent neural
network-based methods up to dimension 100.
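
The core idea of gradient-enhanced least squares can be sketched in one dimension: expand the target in probabilists' Hermite polynomials and fit the coefficients by regressing on both sampled function values and sampled gradients, stacking the two sets of linear equations into a single least squares system. The sketch below is a minimal illustration under assumed choices (a polynomial target, expansion degree 5, 400 samples), not the paper's actual high-dimensional sparse-expansion implementation; the continuation value and pathwise gradients of the real method are replaced here by a known test function and its derivative.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermeval, hermeder

rng = np.random.default_rng(0)
degree = 5                        # expansion order (assumption for illustration)
x = rng.standard_normal(400)      # stand-in for simulated paths at one time step

# Stand-in for the continuation value and its pathwise gradient; chosen as an
# exact Hermite polynomial, y = He_3(x) + He_1(x) = x^3 - 2x, so the fit is exact.
y = x**3 - 2.0 * x
dy = 3.0 * x**2 - 2.0

# Value design matrix: V[i, k] = He_k(x_i).
V = np.stack(
    [hermeval(x, np.eye(degree + 1)[k]) for k in range(degree + 1)], axis=1
)
# Gradient design matrix: D[i, k] = He_k'(x_i), via coefficient differentiation.
D = np.stack(
    [hermeval(x, hermeder(np.eye(degree + 1)[k])) for k in range(degree + 1)],
    axis=1,
)

# Gradient-enhanced system: stack value equations on top of gradient equations
# and solve one linear least squares problem for the expansion coefficients.
A = np.vstack([V, D])
b = np.concatenate([y, dy])
coef, *_ = np.linalg.lstsq(A, b, rcond=None)

# Plain value-only least squares, for comparison with the enhanced fit.
coef_plain, *_ = np.linalg.lstsq(V, y, rcond=None)
```

In this toy setting both fits recover the true coefficients (1 on `He_1` and `He_3`, 0 elsewhere); the point of the stacked system is that in the Monte Carlo setting the extra gradient rows add information per simulated path at little cost, which is what drives the accuracy gains reported in the abstract.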