Generalization Error in the Deep Ritz Method with Smooth Activation Functions
Author: Janne Siipola
DOI: 10.4208/cicp.oa-2023-0253
Journal: Communications in Computational Physics (Q1, Physics, Mathematical; Impact Factor 2.6)
Publication date: 2024-04-01
Citation count: 0
Abstract
Generalization Error in the Deep Ritz Method with Smooth Activation Functions
The Deep Ritz method is a deep learning paradigm for solving partial differential equations. In this article we study the generalization error of the Deep Ritz method. We focus on the quintessential problem, the Poisson equation. We show that the generalization error of the Deep Ritz method converges to zero at the rate $\frac{C}{\sqrt{n}}$, and we discuss the constant $C$. Results are obtained for shallow and residual neural networks with smooth activation functions.
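As an illustration (a sketch of our own, not an experiment from the paper), the $\frac{C}{\sqrt{n}}$ rate is the familiar Monte Carlo rate for the empirical Ritz energy evaluated at $n$ random collocation points. The NumPy snippet below checks this for a 1D Poisson problem $-u'' = f$ on $(0,1)$ with the known solution $u(x) = \sin(\pi x)$, whose exact Ritz energy is $-\pi^2/4$; all function names here are hypothetical.

```python
import numpy as np

# Minimal sketch (assumed setup, not the paper's experiment): Monte Carlo
# estimation of the Ritz energy E(u) = int_0^1 (0.5*|u'|^2 - f*u) dx for the
# 1D Poisson problem -u'' = f on (0,1), u(0) = u(1) = 0, with the known
# solution u(x) = sin(pi*x) and forcing f(x) = pi^2 * sin(pi*x).
# The exact energy is -pi^2/4; the RMS error should decay like C/sqrt(n).

def ritz_integrand(x):
    # 0.5*|u'(x)|^2 - f(x)*u(x) for u(x) = sin(pi*x)
    return 0.5 * (np.pi * np.cos(np.pi * x)) ** 2 - np.pi ** 2 * np.sin(np.pi * x) ** 2

EXACT_ENERGY = -np.pi ** 2 / 4

def mc_energy(n, rng):
    # Empirical Ritz energy from n i.i.d. uniform collocation points
    x = rng.uniform(0.0, 1.0, size=n)
    return ritz_integrand(x).mean()

rng = np.random.default_rng(0)
trials = 200
rms_error = {}
for n in (100, 10_000):
    errs = [mc_energy(n, rng) - EXACT_ENERGY for _ in range(trials)]
    rms_error[n] = float(np.sqrt(np.mean(np.square(errs))))
    print(f"n = {n:6d}: RMS generalization error ~ {rms_error[n]:.4f}")
```

Increasing $n$ by a factor of 100 should shrink the RMS error by roughly a factor of 10, consistent with the $\frac{C}{\sqrt{n}}$ rate; the constant $C$ here reflects the variance of the integrand.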
Journal introduction:
Communications in Computational Physics (CiCP) publishes original research and survey papers of high scientific value in computational modeling of physical problems. Results in multi-physics and multi-scale innovative computational methods and modeling in all physical sciences will be featured.