{"title":"深度relu神经网络克服偏积分微分方程的维数诅咒","authors":"Lukas Gonon, C. Schwab","doi":"10.1142/s0219530522500129","DOIUrl":null,"url":null,"abstract":"Deep neural networks (DNNs) with ReLU activation function are proved to be able to express viscosity solutions of linear partial integrodifferental equations (PIDEs) on state spaces of possibly high dimension $d$. Admissible PIDEs comprise Kolmogorov equations for high-dimensional diffusion, advection, and for pure jump L\\'{e}vy processes. We prove for such PIDEs arising from a class of jump-diffusions on $\\mathbb{R}^d$, that for any compact $K\\subset \\mathbb{R}^d$, there exist constants $C,{\\mathfrak{p}},{\\mathfrak{q}}>0$ such that for every $\\varepsilon \\in (0,1]$ and for every $d\\in \\mathbb{N}$ the normalized (over $K$) DNN $L^2$-expression error of viscosity solutions of the PIDE is of size $\\varepsilon$ with DNN size bounded by $Cd^{\\mathfrak{p}}\\varepsilon^{-\\mathfrak{q}}$. In particular, the constant $C>0$ is independent of $d\\in \\mathbb{N}$ and of $\\varepsilon \\in (0,1]$ and depends only on the coefficients in the PIDE and the measure used to quantify the error. This establishes that ReLU DNNs can break the curse of dimensionality (CoD for short) for viscosity solutions of linear, possibly degenerate PIDEs corresponding to Markovian jump-diffusion processes. As a consequence of the employed techniques we also obtain that expectations of a large class of path-dependent functionals of the underlying jump-diffusion processes can be expressed without the CoD.","PeriodicalId":55519,"journal":{"name":"Analysis and Applications","volume":" ","pages":""},"PeriodicalIF":2.0000,"publicationDate":"2021-02-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"15","resultStr":"{\"title\":\"Deep relu neural networks overcome the curse of dimensionality for partial integrodifferential equations\",\"authors\":\"Lukas Gonon, C. 
Schwab\",\"doi\":\"10.1142/s0219530522500129\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Deep neural networks (DNNs) with ReLU activation function are proved to be able to express viscosity solutions of linear partial integrodifferental equations (PIDEs) on state spaces of possibly high dimension $d$. Admissible PIDEs comprise Kolmogorov equations for high-dimensional diffusion, advection, and for pure jump L\\\\'{e}vy processes. We prove for such PIDEs arising from a class of jump-diffusions on $\\\\mathbb{R}^d$, that for any compact $K\\\\subset \\\\mathbb{R}^d$, there exist constants $C,{\\\\mathfrak{p}},{\\\\mathfrak{q}}>0$ such that for every $\\\\varepsilon \\\\in (0,1]$ and for every $d\\\\in \\\\mathbb{N}$ the normalized (over $K$) DNN $L^2$-expression error of viscosity solutions of the PIDE is of size $\\\\varepsilon$ with DNN size bounded by $Cd^{\\\\mathfrak{p}}\\\\varepsilon^{-\\\\mathfrak{q}}$. In particular, the constant $C>0$ is independent of $d\\\\in \\\\mathbb{N}$ and of $\\\\varepsilon \\\\in (0,1]$ and depends only on the coefficients in the PIDE and the measure used to quantify the error. This establishes that ReLU DNNs can break the curse of dimensionality (CoD for short) for viscosity solutions of linear, possibly degenerate PIDEs corresponding to Markovian jump-diffusion processes. 
As a consequence of the employed techniques we also obtain that expectations of a large class of path-dependent functionals of the underlying jump-diffusion processes can be expressed without the CoD.\",\"PeriodicalId\":55519,\"journal\":{\"name\":\"Analysis and Applications\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":2.0000,\"publicationDate\":\"2021-02-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"15\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Analysis and Applications\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1142/s0219530522500129\",\"RegionNum\":2,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"MATHEMATICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Analysis and Applications","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1142/s0219530522500129","RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS","Score":null,"Total":0}
Deep ReLU neural networks overcome the curse of dimensionality for partial integrodifferential equations
Deep neural networks (DNNs) with the ReLU activation function are proved to be able to express viscosity solutions of linear partial integrodifferential equations (PIDEs) on state spaces of possibly high dimension $d$. Admissible PIDEs comprise Kolmogorov equations for high-dimensional diffusion, advection, and for pure jump L\'{e}vy processes. We prove for such PIDEs arising from a class of jump-diffusions on $\mathbb{R}^d$, that for any compact $K\subset \mathbb{R}^d$, there exist constants $C,{\mathfrak{p}},{\mathfrak{q}}>0$ such that for every $\varepsilon \in (0,1]$ and for every $d\in \mathbb{N}$ the normalized (over $K$) DNN $L^2$-expression error of viscosity solutions of the PIDE is of size $\varepsilon$ with DNN size bounded by $Cd^{\mathfrak{p}}\varepsilon^{-\mathfrak{q}}$. In particular, the constant $C>0$ is independent of $d\in \mathbb{N}$ and of $\varepsilon \in (0,1]$ and depends only on the coefficients in the PIDE and the measure used to quantify the error. This establishes that ReLU DNNs can break the curse of dimensionality (CoD for short) for viscosity solutions of linear, possibly degenerate PIDEs corresponding to Markovian jump-diffusion processes. As a consequence of the techniques employed, we also obtain that expectations of a large class of path-dependent functionals of the underlying jump-diffusion processes can be expressed without the CoD.
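The quantitative content of the main result can be restated compactly. The display below is a sketch only, with notation read off from the abstract: $u$ denotes the viscosity solution of the PIDE, $u_\varepsilon^d$ the function realized by the approximating ReLU DNN, and the "normalized (over $K$)" $L^2$-error is read as integration against a probability measure $\mu$ on $K$ (the precise measure is the one fixed in the paper):

$$ \left( \int_K \big| u(x) - u_\varepsilon^d(x) \big|^2 \, \mu(\mathrm{d}x) \right)^{1/2} \le \varepsilon, \qquad \operatorname{size}\big(u_\varepsilon^d\big) \le C\, d^{\mathfrak{p}}\, \varepsilon^{-\mathfrak{q}}, $$

with $C, \mathfrak{p}, \mathfrak{q} > 0$ independent of both $d$ and $\varepsilon$. The point is that the network size grows only polynomially in the dimension $d$ (and in $\varepsilon^{-1}$), rather than exponentially in $d$; this polynomial dependence is precisely what "breaking the curse of dimensionality" means here.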
Journal description:
Analysis and Applications publishes high quality mathematical papers that treat those parts of analysis which have direct or potential applications to the physical and biological sciences and engineering. Some of the topics from analysis include approximation theory, asymptotic analysis, calculus of variations, integral equations, integral transforms, ordinary and partial differential equations, delay differential equations, and perturbation methods. The primary aim of the journal is to encourage the development of new techniques and results in applied analysis.