Adaptive neural-domain refinement for solving time-dependent differential equations

Toni Schneidereit, Michael Breuß
{"title":"Adaptive neural-domain refinement for solving time-dependent differential equations","authors":"Toni Schneidereit, Michael Breuß","doi":"10.1186/s13662-023-03789-x","DOIUrl":null,"url":null,"abstract":"Abstract A classic approach for solving differential equations with neural networks builds upon neural forms, which employ the differential equation with a discretisation of the solution domain. Making use of neural forms for time-dependent differential equations, one can apply the recently developed method of domain segmentation. That is, the domain may be split into several subdomains, on which the optimisation problem is solved. In classic adaptive numerical methods, the mesh as well as the domain may be refined or decomposed, in order to improve the accuracy. Also, the degree of approximation accuracy may be adapted. Therefore, it is desirable to transfer such important and successful strategies to the field of neural-network-based solutions. In the presented work, we propose a novel adaptive neural approach to meet this aim for solving time-dependent problems. To this end, each subdomain is reduced in size until the optimisation is resolved up to a predefined training accuracy. In addition, while the neural networks employed are by default small, we propose a means to adjust also the number of neurons in an adaptive way. We introduce conditions to automatically confirm the solution reliability and optimise computational parameters whenever it is necessary. Results are provided for several initial-value problems that illustrate important computational properties of the method.","PeriodicalId":72091,"journal":{"name":"Advances in continuous and discrete models","volume":"40 3","pages":"0"},"PeriodicalIF":2.3000,"publicationDate":"2023-10-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Advances in continuous and discrete models","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1186/s13662-023-03789-x","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS","Score":null,"Total":0}
引用次数: 0

Abstract

A classic approach for solving differential equations with neural networks builds upon neural forms, which employ the differential equation with a discretisation of the solution domain. Making use of neural forms for time-dependent differential equations, one can apply the recently developed method of domain segmentation. That is, the domain may be split into several subdomains, on which the optimisation problem is solved. In classic adaptive numerical methods, the mesh as well as the domain may be refined or decomposed in order to improve the accuracy. Also, the degree of approximation accuracy may be adapted. Therefore, it is desirable to transfer such important and successful strategies to the field of neural-network-based solutions. In the presented work, we propose a novel adaptive neural approach to meet this aim for solving time-dependent problems. To this end, each subdomain is reduced in size until the optimisation is resolved up to a predefined training accuracy. In addition, while the neural networks employed are small by default, we propose a means to also adjust the number of neurons in an adaptive way. We introduce conditions to automatically confirm the solution reliability and to optimise computational parameters whenever necessary. Results are provided for several initial-value problems that illustrate important computational properties of the method.
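The abstract describes subdomains that are repeatedly shrunk until the optimisation on each of them reaches a prescribed training accuracy. The following Python sketch illustrates that idea for a scalar initial-value problem. It is not the authors' implementation: the neural-form ansatz u(t) = u0 + (t − t0)·N(t), the test equation u' = −u, the shrink factor, the optimiser, and the network size are all illustrative assumptions, and the adaptive adjustment of the neuron count mentioned in the abstract is omitted.

```python
# Hedged sketch of adaptive neural-domain refinement for an IVP u'(t) = f(t, u), u(t0) = u0.
# All names, constants, and the test problem below are illustrative assumptions.
import torch

def f(t, u):                      # assumed test right-hand side: u' = -u
    return -u

class SmallNet(torch.nn.Module):  # small default network, one hidden layer
    def __init__(self, hidden=5):
        super().__init__()
        self.hidden = torch.nn.Linear(1, hidden)
        self.out = torch.nn.Linear(hidden, 1)

    def forward(self, t):
        return self.out(torch.tanh(self.hidden(t)))

def train_subdomain(t0, t1, u0, net, n_points=20, epochs=2000, lr=1e-2):
    """Minimise the neural-form residual on [t0, t1]; return final loss and u(t1)."""
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    t = torch.linspace(t0, t1, n_points).reshape(-1, 1).requires_grad_(True)
    for _ in range(epochs):
        opt.zero_grad()
        # neural form: u(t) = u0 + (t - t0) * N(t) satisfies the initial condition exactly
        u = u0 + (t - t0) * net(t)
        du = torch.autograd.grad(u, t, torch.ones_like(u), create_graph=True)[0]
        loss = torch.mean((du - f(t, u)) ** 2)
        loss.backward()
        opt.step()
    with torch.no_grad():
        t1_tensor = torch.tensor([[t1]])
        u_end = u0 + (t1 - t0) * net(t1_tensor)
    return loss.item(), u_end.item()

def adaptive_solve(t_start, t_end, u0, tol=1e-6, shrink=0.5, h0=1.0):
    """Shrink each subdomain until the training loss meets tol, then advance."""
    t, u, h = t_start, u0, h0
    pieces = []
    while t < t_end:
        h_try = min(h, t_end - t)
        while True:
            net = SmallNet()
            loss, u_next = train_subdomain(t, t + h_try, u, net)
            if loss <= tol or h_try < 1e-3:   # accept, or stop shrinking further
                break
            h_try *= shrink                    # refine: reduce the subdomain size
        pieces.append((t, t + h_try, loss))
        t, u, h = t + h_try, u_next, h_try
    return pieces

if __name__ == "__main__":
    for a, b, loss in adaptive_solve(0.0, 2.0, 1.0):
        print(f"subdomain [{a:.3f}, {b:.3f}]  training loss {loss:.2e}")
```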
