CGKN: A deep learning framework for modeling complex dynamical systems and efficient data assimilation

IF 3.8 | JCR Q2, Computer Science, Interdisciplinary Applications | CAS Zone 2, Physics & Astronomy | Journal of Computational Physics | Pub Date: 2025-07-01 | Epub Date: 2025-03-24 | DOI: 10.1016/j.jcp.2025.113950
Chuanqi Chen, Nan Chen, Yinling Zhang, Jin-Long Wu
Abstract

Deep learning is widely used to predict complex dynamical systems in many scientific and engineering areas. However, the black-box nature of these deep learning models presents significant challenges for carrying out simultaneous data assimilation (DA), which is a crucial technique for state estimation, model identification, and reconstructing missing data. Integrating ensemble-based DA methods with nonlinear deep learning models is computationally expensive and may suffer from large sampling errors. To address these challenges, we introduce a deep learning framework designed to simultaneously provide accurate forecasts and efficient DA. It is named Conditional Gaussian Koopman Network (CGKN), which transforms general nonlinear systems into nonlinear neural differential equations with conditional Gaussian structures. CGKN aims to retain essential nonlinear components while applying systematic and minimal simplifications to facilitate the development of analytic formulae for nonlinear DA. This allows for seamless integration of DA performance into the deep learning training process, eliminating the need for empirical tuning as required in ensemble methods. CGKN compensates for structural simplifications by lifting the dimension of the system, which is motivated by Koopman theory. Nevertheless, CGKN exploits special nonlinear dynamics within the lifted space. This enables the model to capture extreme events and strong non-Gaussian features in joint and marginal distributions with appropriate uncertainty quantification. We demonstrate the effectiveness of CGKN for both prediction and DA on three strongly nonlinear and non-Gaussian turbulent systems: the projected stochastic Burgers–Sivashinsky equation, the Lorenz 96 system, and the El Niño-Southern Oscillation. The results justify the robustness and computational efficiency of CGKN.
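The analytic DA formulae referenced in the abstract rest on the conditional Gaussian filtering framework: when the unobserved state enters the dynamics linearly conditioned on the observed trajectory, its posterior distribution is Gaussian, and the posterior mean and covariance satisfy closed-form equations (the Liptser–Shiryaev conditional Gaussian filter), so no ensemble sampling is needed. The sketch below illustrates this on a hypothetical scalar toy system; the drift functions `f1`, `g1`, `f2`, `g2` and all parameters are illustrative assumptions, not the paper's learned CGKN model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical conditional Gaussian system (illustration only):
#   du1 = (f1(u1) + g1(u1) * u2) dt + s1 dW1   (observed state)
#   du2 = (f2(u1) + g2(u1) * u2) dt + s2 dW2   (unobserved state)
f1 = lambda u1: -u1
g1 = lambda u1: 1.0 + 0.5 * np.sin(u1)
f2 = lambda u1: 0.5 * np.cos(u1)
g2 = lambda u1: -1.0
s1, s2 = 0.5, 0.5

dt, n_steps = 1e-3, 20000
u1, u2 = 0.0, 0.0          # true (simulated) states
mu, R = 0.0, 1.0           # posterior mean / variance of u2 given the u1 path
err2 = 0.0

for _ in range(n_steps):
    # Euler-Maruyama step of the true system.
    du1 = (f1(u1) + g1(u1) * u2) * dt + s1 * np.sqrt(dt) * rng.standard_normal()
    u2 += (f2(u1) + g2(u1) * u2) * dt + s2 * np.sqrt(dt) * rng.standard_normal()

    # Analytic filter: Euler discretization of the conditional Gaussian
    # mean/variance equations, driven by the observed increment du1.
    innov = du1 - (f1(u1) + g1(u1) * mu) * dt
    mu += (f2(u1) + g2(u1) * mu) * dt + (R * g1(u1) / s1**2) * innov
    R += (2.0 * g2(u1) * R + s2**2 - (R * g1(u1))**2 / s1**2) * dt

    u1 += du1
    err2 += (mu - u2)**2

rmse = np.sqrt(err2 / n_steps)
print(f"filter RMSE = {rmse:.3f}, posterior variance = {R:.3f}")
```

Because the update is a pair of closed-form recursions rather than an ensemble, its per-step cost is negligible, which is what allows CGKN-style models to fold DA performance directly into the training loss.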
Source journal: Journal of Computational Physics (Physics; Computer Science: Interdisciplinary Applications)
CiteScore: 7.60 | Self-citation rate: 14.60% | Articles per year: 763 | Review time: 5.8 months
Journal description: Journal of Computational Physics thoroughly treats the computational aspects of physical problems, presenting techniques for the numerical solution of mathematical equations arising in all areas of physics. The journal seeks to emphasize methods that cross disciplinary boundaries. The Journal of Computational Physics also publishes short notes of 4 pages or less (including figures, tables, and references but excluding title pages). Letters to the Editor commenting on articles already published in this Journal will also be considered. Neither notes nor letters should have an abstract.