Gradient-preserving hyper-reduction of nonlinear dynamical systems via discrete empirical interpolation

C. Pagliantini, Federico Vismara
{"title":"基于离散经验插值的非线性动力系统的保梯度超约化","authors":"C. Pagliantini, Federico Vismara","doi":"10.1137/22M1503890","DOIUrl":null,"url":null,"abstract":"This work proposes a hyper-reduction method for nonlinear parametric dynamical systems characterized by gradient fields such as Hamiltonian systems and gradient flows. The gradient structure is associated with conservation of invariants or with dissipation and hence plays a crucial role in the description of the physical properties of the system. Traditional hyper-reduction of nonlinear gradient fields yields efficient approximations that, however, lack the gradient structure. We focus on Hamiltonian gradients and we propose to first decompose the nonlinear part of the Hamiltonian, mapped into a suitable reduced space, into the sum of d terms, each characterized by a sparse dependence on the system state. Then, the hyper-reduced approximation is obtained via discrete empirical interpolation (DEIM) of the Jacobian of the derived d-valued nonlinear function. The resulting hyper-reduced model retains the gradient structure and its computationally complexity is independent of the size of the full model. Moreover, a priori error estimates show that the hyper-reduced model converges to the reduced model and the Hamiltonian is asymptotically preserved. Whenever the nonlinear Hamiltonian gradient is not globally reducible, i.e. its evolution requires high-dimensional DEIM approximation spaces, an adaptive strategy is performed. This consists in updating the hyper-reduced Hamiltonian via a low-rank correction of the DEIM basis. Numerical tests demonstrate the applicability of the proposed approach to general nonlinear operators and runtime speedups compared to the full and the reduced models.","PeriodicalId":21812,"journal":{"name":"SIAM J. Sci. Comput.","volume":"69 1","pages":"2725-"},"PeriodicalIF":0.0000,"publicationDate":"2022-06-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"Gradient-preserving hyper-reduction of nonlinear dynamical systems via discrete empirical interpolation\",\"authors\":\"C. Pagliantini, Federico Vismara\",\"doi\":\"10.1137/22M1503890\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This work proposes a hyper-reduction method for nonlinear parametric dynamical systems characterized by gradient fields such as Hamiltonian systems and gradient flows. The gradient structure is associated with conservation of invariants or with dissipation and hence plays a crucial role in the description of the physical properties of the system. Traditional hyper-reduction of nonlinear gradient fields yields efficient approximations that, however, lack the gradient structure. We focus on Hamiltonian gradients and we propose to first decompose the nonlinear part of the Hamiltonian, mapped into a suitable reduced space, into the sum of d terms, each characterized by a sparse dependence on the system state. Then, the hyper-reduced approximation is obtained via discrete empirical interpolation (DEIM) of the Jacobian of the derived d-valued nonlinear function. The resulting hyper-reduced model retains the gradient structure and its computationally complexity is independent of the size of the full model. Moreover, a priori error estimates show that the hyper-reduced model converges to the reduced model and the Hamiltonian is asymptotically preserved. Whenever the nonlinear Hamiltonian gradient is not globally reducible, i.e. 
its evolution requires high-dimensional DEIM approximation spaces, an adaptive strategy is performed. This consists in updating the hyper-reduced Hamiltonian via a low-rank correction of the DEIM basis. Numerical tests demonstrate the applicability of the proposed approach to general nonlinear operators and runtime speedups compared to the full and the reduced models.\",\"PeriodicalId\":21812,\"journal\":{\"name\":\"SIAM J. Sci. Comput.\",\"volume\":\"69 1\",\"pages\":\"2725-\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-06-03\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"SIAM J. Sci. Comput.\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1137/22M1503890\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"SIAM J. Sci. Comput.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1137/22M1503890","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 3

Abstract

This work proposes a hyper-reduction method for nonlinear parametric dynamical systems characterized by gradient fields, such as Hamiltonian systems and gradient flows. The gradient structure is associated with conservation of invariants or with dissipation and hence plays a crucial role in the description of the physical properties of the system. Traditional hyper-reduction of nonlinear gradient fields yields efficient approximations that, however, lack the gradient structure. We focus on Hamiltonian gradients and propose to first decompose the nonlinear part of the Hamiltonian, mapped into a suitable reduced space, into the sum of d terms, each characterized by a sparse dependence on the system state. The hyper-reduced approximation is then obtained via discrete empirical interpolation (DEIM) of the Jacobian of the derived d-valued nonlinear function. The resulting hyper-reduced model retains the gradient structure, and its computational complexity is independent of the size of the full model. Moreover, a priori error estimates show that the hyper-reduced model converges to the reduced model and that the Hamiltonian is asymptotically preserved. Whenever the nonlinear Hamiltonian gradient is not globally reducible, i.e., its evolution requires high-dimensional DEIM approximation spaces, an adaptive strategy is employed: the hyper-reduced Hamiltonian is updated via a low-rank correction of the DEIM basis. Numerical tests demonstrate the applicability of the proposed approach to general nonlinear operators and show runtime speedups compared to the full and the reduced models.
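
The hyper-reduction described above builds on the discrete empirical interpolation method (DEIM). As background only, the following Python sketch illustrates classical DEIM: greedy point selection and the interpolatory approximation f(x) ≈ U (P^T U)^{-1} P^T f(x), applied here to a synthetic pointwise nonlinearity. The function names, dimensions, and test nonlinearity are illustrative assumptions; this is not the authors' gradient-preserving algorithm, which instead interpolates the Jacobian of a decomposed d-valued nonlinear function so that the hyper-reduced term remains a gradient.

```python
import numpy as np

def deim_indices(U):
    """Greedy DEIM point selection (Chaturantabut & Sorensen, 2010).
    U: (N, m) POD basis of snapshots of the nonlinear term."""
    N, m = U.shape
    p = [int(np.argmax(np.abs(U[:, 0])))]
    for j in range(1, m):
        # Interpolate the j-th basis vector at the points selected so far ...
        c = np.linalg.solve(U[:, :j][p, :], U[p, j])
        # ... and add the index where the interpolation residual is largest.
        r = U[:, j] - U[:, :j] @ c
        p.append(int(np.argmax(np.abs(r))))
    return np.array(p)

def deim_approx(f_sampled, U, p):
    """DEIM reconstruction f ~= U (P^T U)^{-1} P^T f from the m sampled entries of f."""
    return U @ np.linalg.solve(U[p, :], f_sampled)

# Synthetic usage: snapshots of a pointwise nonlinearity on random states.
rng = np.random.default_rng(0)
N, n_snap, m = 200, 40, 10
X = rng.standard_normal((N, n_snap))                  # state snapshots
F = np.sin(X) + 0.1 * X**3                            # nonlinear-term snapshots
U = np.linalg.svd(F, full_matrices=False)[0][:, :m]   # POD basis of the nonlinearity
p = deim_indices(U)

f_full = np.sin(X[:, 0]) + 0.1 * X[:, 0]**3           # full nonlinear evaluation
f_deim = deim_approx(f_full[p], U, p)                 # uses only the m sampled entries
print("relative DEIM error:", np.linalg.norm(f_deim - f_full) / np.linalg.norm(f_full))
```

As the abstract notes, applying this classical approximation directly to a Hamiltonian gradient ∇H yields an efficient surrogate that is in general no longer the gradient of any scalar function, so conservation or dissipation properties are lost; preserving that structure under DEIM is precisely what the proposed method achieves.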