Structure-preserving dimensionality reduction for learning Hamiltonian dynamics

Journal of Computational Physics, Vol. 528, Article 113832. Impact Factor 3.8; JCR Q2 (Computer Science, Interdisciplinary Applications); CAS Region 2 (Physics and Astronomy). Published: 2025-05-01 (online 2025-02-10). DOI: 10.1016/j.jcp.2025.113832
Jānis Bajārs, Dāvis Kalvāns
Citations: 0

Abstract

Structure-preserving data-driven learning algorithms have recently received considerable attention, e.g., the development of the symplecticity-preserving neural networks SympNets for learning the flow of a Hamiltonian system. The preservation of structural properties by neural networks has been shown to produce qualitatively better long-time predictions. Learning the flow of high-dimensional Hamiltonian dynamics still poses a great challenge due to the increase in neural network model complexity and, thus, the significant increase in training time. In this work, we investigate dimensionality reduction techniques for training datasets of solutions to Hamiltonian dynamics that can be well modeled in a lower-dimensional subspace. For learning the flow of such Hamiltonian dynamics with the symplecticity-preserving neural networks SympNets, we propose dimensionality reduction with the proper symplectic decomposition (PSD). PSD was originally proposed to obtain symplectic reduced-order models of Hamiltonian systems. We demonstrate the proposed purely data-driven approach by learning the nonlinear localized discrete breather solutions in a one-dimensional crystal lattice model. Considering three near-optimal PSD solutions, i.e., cotangent lift, complex SVD, and dimension-reduced nonlinear programming solutions, we find that learning the PSD-reduced Hamiltonian dynamics is not only more computationally efficient than learning the whole high-dimensional model, but also yields comparably good qualitative long-time predictions. Specifically, the cotangent lift and nonlinear programming PSD solutions demonstrate significantly enhanced long-term prediction capabilities, outperforming the approach of learning Hamiltonian dynamics with non-symplectic proper orthogonal decomposition (POD) dimensionality reduction.
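To illustrate the cotangent-lift PSD construction mentioned in the abstract, the following is a minimal NumPy sketch based on the standard definition of the method, not the authors' code; the names `Q`, `P`, `k`, and `cotangent_lift_basis` are illustrative. A single basis Phi is extracted from the stacked position and momentum snapshots, and the block-diagonal matrix built from it is symplectic by construction:

```python
import numpy as np

def J(n):
    """Canonical symplectic matrix J_{2n} = [[0, I_n], [-I_n, 0]]."""
    I, Z = np.eye(n), np.zeros((n, n))
    return np.block([[Z, I], [-I, Z]])

def cotangent_lift_basis(Q, P, k):
    """Cotangent-lift PSD basis from snapshot matrices.

    Q, P : (n, m) arrays of position / momentum snapshots.
    Returns A of shape (2n, 2k) satisfying A^T J_{2n} A = J_{2k}.
    """
    # Shared basis Phi: leading k left singular vectors of [Q | P].
    Phi, _, _ = np.linalg.svd(np.hstack([Q, P]), full_matrices=False)
    Phi = Phi[:, :k]
    n = Q.shape[0]
    A = np.zeros((2 * n, 2 * k))
    A[:n, :k] = Phi   # position block
    A[n:, k:] = Phi   # momentum block
    return A
```

Because Phi has orthonormal columns, the symplectic inverse A^+ = J_{2k}^T A^T J_{2n} reduces to A^T, so a high-dimensional state z = (q, p) is projected by y = A^T z and reconstructed by z ≈ A y. The complex-SVD and nonlinear-programming PSD variants discussed in the abstract replace this particular choice of basis while keeping the same symplecticity constraint.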
Source journal: Journal of Computational Physics (Physics; Computer Science, Interdisciplinary Applications)
CiteScore: 7.60
Self-citation rate: 14.60%
Articles per year: 763
Review time: 5.8 months
Journal description: Journal of Computational Physics thoroughly treats the computational aspects of physical problems, presenting techniques for the numerical solution of mathematical equations arising in all areas of physics. The journal seeks to emphasize methods that cross disciplinary boundaries. The Journal of Computational Physics also publishes short notes of 4 pages or less (including figures, tables, and references but excluding title pages). Letters to the Editor commenting on articles already published in this Journal will also be considered. Neither notes nor letters should have an abstract.
Latest articles in this journal:
- Low Mach number compressible multiphase particle-in-cell method for viscous flow problem
- A cut-cell based high-fidelity wall-modeled LES framework for compressible wall-bounded turbulent flows
- A unified 2D conservative framework for multi-medium continuum mechanics by cut-cell interface method
- A differentiable wall-modeled large-eddy simulation method for high-Reynolds-number wall-bounded turbulent flows
- A GENERIC-guided active learning SPH method for viscoelastic fluids using Gaussian process regression