A generalized framework of neural networks for Hamiltonian systems

Journal of Computational Physics, Volume 521, Article 113536
Published: 2024-10-28 | DOI: 10.1016/j.jcp.2024.113536
Impact Factor: 3.8 | JCR: Q2 (Computer Science, Interdisciplinary Applications) | CAS: Region 2 (Physics and Astronomy)
Philipp Horn, Veronica Saz Ulibarrena, Barry Koren, Simon Portegies Zwart
https://www.sciencedirect.com/science/article/pii/S0021999124007848
Citations: 0

Abstract

When solving Hamiltonian systems with numerical integrators, preserving the symplectic structure can be crucial for many problems. At the same time, solving chaotic or stiff problems requires integrators that approximate the trajectories with extreme precision. Integrating Hamilton's equations to a level of reliability sufficient for scientific interpretation can therefore be computationally expensive. A neural network, however, can be a viable alternative to numerical integrators, offering high-fidelity solutions orders of magnitude faster.
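The importance of symplecticity for classical integrators can be illustrated with the pendulum, one of the paper's test problems. Below is a minimal sketch (not the authors' code) comparing explicit Euler with semi-implicit (symplectic) Euler; the symplectic method's energy error stays bounded while explicit Euler's grows:

```python
import numpy as np

def hamiltonian(q, p):
    """Pendulum energy: H(q, p) = p^2 / 2 - cos(q)."""
    return 0.5 * p**2 - np.cos(q)

def symplectic_euler(q, p, h):
    """Semi-implicit (symplectic) Euler: update p first, then q with the new p."""
    p = p - h * np.sin(q)
    q = q + h * p
    return q, p

def explicit_euler(q, p, h):
    """Standard explicit Euler: both updates use the old state."""
    return q + h * p, p - h * np.sin(q)

h, n_steps = 0.05, 5000
q_s, p_s = 1.0, 0.0        # symplectic trajectory
q_e, p_e = 1.0, 0.0        # explicit trajectory
H0 = hamiltonian(1.0, 0.0)

for _ in range(n_steps):
    q_s, p_s = symplectic_euler(q_s, p_s, h)
    q_e, p_e = explicit_euler(q_e, p_e, h)

drift_symp = abs(hamiltonian(q_s, p_s) - H0)
drift_expl = abs(hamiltonian(q_e, p_e) - H0)
print(f"energy drift  symplectic: {drift_symp:.3f}   explicit: {drift_expl:.3f}")
```

The symplectic variant conserves a nearby "modified" Hamiltonian exactly, which is why its energy error oscillates within an O(h) band instead of drifting.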
To understand whether preserving symplecticity also matters when neural networks are used, we analyze three well-known neural network architectures that build the symplectic structure into the network's topology. These architectures share many similarities, which allows us to formulate a new, generalized framework in which Symplectic Recurrent Neural Networks, SympNets, and HénonNets are included as special cases. Additionally, this framework enables us to find novel network topologies by transitioning between the established ones.
We compare the new Generalized Hamiltonian Neural Networks (GHNNs) against the established SympNets, HénonNets, and physics-unaware multilayer perceptrons, using data for a pendulum, a double pendulum, and a gravitational three-body problem. To ensure a fair comparison, the hyperparameters of the different networks are chosen so that all four architectures have the same prediction speed during inference. A special focus lies on the networks' ability to generalize beyond the training data. The GHNNs outperform all other architectures on the problems considered.
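The symplectic architectures above are built from layers that are symplectic maps by construction. A minimal illustration of the general idea (not the paper's GHNN definition, and with hypothetical parameter names): a shear layer p -> p + grad V(q), with V(q) = a . tanh(Kq + b), is exactly symplectic, which can be checked numerically via the Jacobian condition J^T Omega J = Omega:

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 2, 8                      # phase-space half-dimension, hidden width

# Hypothetical parameters of one shear layer with V(q) = a . tanh(K q + b).
K = rng.standard_normal((m, d))
a = rng.standard_normal(m)
b = rng.standard_normal(m)

def layer(x):
    """Shear map (q, p) -> (q, p + grad V(q)); any map of this form is symplectic."""
    q, p = x[:d], x[d:]
    z = K @ q + b
    grad_V = K.T @ (a * (1.0 - np.tanh(z) ** 2))   # gradient of a . tanh(Kq + b)
    return np.concatenate([q, p + grad_V])

# Verify symplecticity: the Jacobian J must satisfy J^T Omega J = Omega.
x0 = rng.standard_normal(2 * d)
eps = 1e-6
J = np.column_stack([
    (layer(x0 + eps * e) - layer(x0 - eps * e)) / (2 * eps)   # central differences
    for e in np.eye(2 * d)
])
Omega = np.block([[np.zeros((d, d)), np.eye(d)],
                  [-np.eye(d), np.zeros((d, d))]])
err = np.max(np.abs(J.T @ Omega @ J - Omega))
print(f"symplecticity defect: {err:.2e}")   # zero up to finite-difference noise
assert err < 1e-6
```

Composing many such shears (alternating which half of the phase space is updated) yields an expressive network that is symplectic regardless of the learned parameters, which is the shared trait the paper's generalized framework exploits.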
Source journal: Journal of Computational Physics (Physics / Computer Science: Interdisciplinary Applications)
CiteScore: 7.60
Self-citation rate: 14.60%
Articles per year: 763
Review time: 5.8 months
Journal description: Journal of Computational Physics thoroughly treats the computational aspects of physical problems, presenting techniques for the numerical solution of mathematical equations arising in all areas of physics. The journal seeks to emphasize methods that cross disciplinary boundaries. It also publishes short notes of 4 pages or less (including figures, tables, and references but excluding title pages). Letters to the Editor commenting on articles already published in the Journal will also be considered. Neither notes nor letters should have an abstract.