Scalable algorithms for physics-informed neural and graph networks

IF 2.4 | Q3 (Computer Science, Artificial Intelligence) | Data-Centric Engineering | Pub Date: 2022-05-16 | DOI: 10.1017/dce.2022.24
K. Shukla, Mengjia Xu, N. Trask, G. Karniadakis
{"title":"基于物理的神经网络和图网络的可扩展算法","authors":"K. Shukla, Mengjia Xu, N. Trask, G. Karniadakis","doi":"10.1017/dce.2022.24","DOIUrl":null,"url":null,"abstract":"Abstract Physics-informed machine learning (PIML) has emerged as a promising new approach for simulating complex physical and biological systems that are governed by complex multiscale processes for which some data are also available. In some instances, the objective is to discover part of the hidden physics from the available data, and PIML has been shown to be particularly effective for such problems for which conventional methods may fail. Unlike commercial machine learning where training of deep neural networks requires big data, in PIML big data are not available. Instead, we can train such networks from additional information obtained by employing the physical laws and evaluating them at random points in the space–time domain. Such PIML integrates multimodality and multifidelity data with mathematical models, and implements them using neural networks or graph networks. Here, we review some of the prevailing trends in embedding physics into machine learning, using physics-informed neural networks (PINNs) based primarily on feed-forward neural networks and automatic differentiation. For more complex systems or systems of systems and unstructured data, graph neural networks (GNNs) present some distinct advantages, and here we review how physics-informed learning can be accomplished with GNNs based on graph exterior calculus to construct differential operators; we refer to these architectures as physics-informed graph networks (PIGNs). We present representative examples for both forward and inverse problems and discuss what advances are needed to scale up PINNs, PIGNs and more broadly GNNs for large-scale engineering problems.","PeriodicalId":34169,"journal":{"name":"DataCentric Engineering","volume":" ","pages":""},"PeriodicalIF":2.4000,"publicationDate":"2022-05-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"19","resultStr":"{\"title\":\"Scalable algorithms for physics-informed neural and graph networks\",\"authors\":\"K. Shukla, Mengjia Xu, N. Trask, G. Karniadakis\",\"doi\":\"10.1017/dce.2022.24\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Abstract Physics-informed machine learning (PIML) has emerged as a promising new approach for simulating complex physical and biological systems that are governed by complex multiscale processes for which some data are also available. In some instances, the objective is to discover part of the hidden physics from the available data, and PIML has been shown to be particularly effective for such problems for which conventional methods may fail. Unlike commercial machine learning where training of deep neural networks requires big data, in PIML big data are not available. Instead, we can train such networks from additional information obtained by employing the physical laws and evaluating them at random points in the space–time domain. Such PIML integrates multimodality and multifidelity data with mathematical models, and implements them using neural networks or graph networks. Here, we review some of the prevailing trends in embedding physics into machine learning, using physics-informed neural networks (PINNs) based primarily on feed-forward neural networks and automatic differentiation. 
For more complex systems or systems of systems and unstructured data, graph neural networks (GNNs) present some distinct advantages, and here we review how physics-informed learning can be accomplished with GNNs based on graph exterior calculus to construct differential operators; we refer to these architectures as physics-informed graph networks (PIGNs). We present representative examples for both forward and inverse problems and discuss what advances are needed to scale up PINNs, PIGNs and more broadly GNNs for large-scale engineering problems.\",\"PeriodicalId\":34169,\"journal\":{\"name\":\"DataCentric Engineering\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":2.4000,\"publicationDate\":\"2022-05-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"19\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"DataCentric Engineering\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1017/dce.2022.24\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"DataCentric Engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1017/dce.2022.24","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 19

Abstract

Physics-informed machine learning (PIML) has emerged as a promising new approach for simulating complex physical and biological systems that are governed by complex multiscale processes for which some data are also available. In some instances, the objective is to discover part of the hidden physics from the available data, and PIML has been shown to be particularly effective for such problems for which conventional methods may fail. Unlike commercial machine learning where training of deep neural networks requires big data, in PIML big data are not available. Instead, we can train such networks from additional information obtained by employing the physical laws and evaluating them at random points in the space–time domain. Such PIML integrates multimodality and multifidelity data with mathematical models, and implements them using neural networks or graph networks. Here, we review some of the prevailing trends in embedding physics into machine learning, using physics-informed neural networks (PINNs) based primarily on feed-forward neural networks and automatic differentiation. For more complex systems or systems of systems and unstructured data, graph neural networks (GNNs) present some distinct advantages, and here we review how physics-informed learning can be accomplished with GNNs based on graph exterior calculus to construct differential operators; we refer to these architectures as physics-informed graph networks (PIGNs). We present representative examples for both forward and inverse problems and discuss what advances are needed to scale up PINNs, PIGNs and more broadly GNNs for large-scale engineering problems.
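The abstract describes the core PINN mechanism: a feed-forward network is trained so that the governing equations, evaluated by automatic differentiation at random collocation points in the space–time domain, are satisfied alongside any boundary conditions or data. The sketch below is illustrative only and is not code from the paper; it assumes a toy 1D Poisson problem -u''(x) = pi^2 sin(pi x) on [0, 1] with u(0) = u(1) = 0 and uses JAX for the nested automatic differentiation.

```python
# Minimal PINN sketch (illustrative assumption, not the paper's code):
# train u_theta(x) so that the PDE residual -u'' - f, computed by autodiff
# at random collocation points, vanishes together with the boundary conditions.
import jax
import jax.numpy as jnp

def init_params(key, layers=(1, 32, 32, 1)):
    # Small feed-forward network; He-style initialization.
    params = []
    for m, n in zip(layers[:-1], layers[1:]):
        key, sub = jax.random.split(key)
        w = jax.random.normal(sub, (m, n)) * jnp.sqrt(2.0 / m)
        params.append((w, jnp.zeros(n)))
    return params

def mlp(params, x):
    h = jnp.atleast_1d(x)
    for w, b in params[:-1]:
        h = jnp.tanh(h @ w + b)
    w, b = params[-1]
    return (h @ w + b)[0]                      # scalar output u_theta(x)

def residual(params, x):
    # PDE residual -u''(x) - f(x) at one collocation point, via nested autodiff.
    u_xx = jax.grad(jax.grad(lambda z: mlp(params, z)))(x)
    f = jnp.pi**2 * jnp.sin(jnp.pi * x)
    return -u_xx - f

def loss(params, x_col):
    r = jax.vmap(lambda x: residual(params, x))(x_col)
    bc = mlp(params, 0.0)**2 + mlp(params, 1.0)**2   # u(0) = u(1) = 0
    return jnp.mean(r**2) + bc

@jax.jit
def step(params, x_col, lr=1e-3):
    grads = jax.grad(loss)(params, x_col)
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

key = jax.random.PRNGKey(0)
params = init_params(key)
for it in range(2000):
    key, sub = jax.random.split(key)
    x_col = jax.random.uniform(sub, (64,))     # random collocation points in (0, 1)
    params = step(params, x_col)
```

The abstract also notes that PIGNs construct differential operators from graph exterior calculus rather than from automatic differentiation. A second small sketch, again an assumption for illustration: on a graph, the edge-node incidence matrix acts as a discrete gradient, its transpose (with a sign) as a discrete divergence, and their composition yields a graph Laplacian that can play the role of a differential operator on unstructured data.

```python
# Graph-exterior-calculus sketch (illustrative assumption, not the paper's code).
import jax.numpy as jnp

# A 4-node path graph 0-1-2-3; one row per edge, -1 at the tail, +1 at the head.
B = jnp.array([[-1.,  1.,  0., 0.],
               [ 0., -1.,  1., 0.],
               [ 0.,  0., -1., 1.]])

grad_u = lambda u: B @ u        # edge values: difference of u along each edge
div_q  = lambda q: -B.T @ q     # node values: net outflow at each node
laplacian = B.T @ B             # graph Laplacian; a discrete -u'' at interior nodes

u = jnp.array([0., 1., 4., 9.])
print(grad_u(u))                # [1., 3., 5.]
print(laplacian @ u)            # [-1., -2., -2., 5.]
```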
Source journal
Data-Centric Engineering (Engineering: General Engineering)
CiteScore: 5.60
Self-citation rate: 0.00%
Articles published: 26
Review time: 12 weeks