On a computational paradigm for a class of fractional order direct and inverse problems in terms of physics-informed neural networks with the attention mechanism

IF 3.1 · CAS Tier 3, Computer Science · JCR Q2, Computer Science, Interdisciplinary Applications · Journal of Computational Science · Pub Date: 2025-02-01 · DOI: 10.1016/j.jocs.2024.102514
M. Srati, A. Oulmelk, L. Afraites, A. Hadri, M.A. Zaky, A.S. Hendy
Journal of Computational Science, Volume 85, February 2025, Article 102514. Available at: https://www.sciencedirect.com/science/article/pii/S1877750324003077
Citations: 0

Abstract

Physics-Informed Neural Networks (PINNs) have recently gained significant attention for their ability to solve both forward and inverse problems associated with linear and nonlinear fractional partial differential equations (PDEs). However, PINNs relying on feedforward neural networks (FNNs) overlook the crucial temporal dependencies inherent in practical physical systems. As a result, they fail to globally propagate the initial-condition constraints and to accurately capture the true solutions under various scenarios. In contrast, the attention mechanism offers a flexible means to implicitly exploit patterns within inputs and, moreover, to establish relationships between arbitrary query locations and inputs. We therefore present an attention-based framework for PINNs, which we term PINNs-Transformer (Zhao et al., 2023). The framework is constructed from self-attention and a set of point-wise multilayer perceptrons (MLPs). The novelty lies in applying the framework to various fractional differential equations with stiff dynamics, as well as to their inverse formulations. We validate the PINNs-Transformer on two examples: one involving a time-fractional diffusion equation, and the other focused on identifying a space-dependent parameter associated with the direct problem described in the first example. We reinforce these findings with a numerical comparison against variants of PINN methods based on criteria such as relative error, complexity, memory requirements, and execution time.
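The abstract names the two ingredients such a method must combine: an encoder built from self-attention plus point-wise MLPs, and a handling of the time-fractional (Caputo) derivative that enters the PDE residual. The sketch below illustrates both pieces in NumPy; it is not the authors' implementation, and all function names, weight shapes, and the choice of the standard L1 discretization for the Caputo derivative are illustrative assumptions.

```python
import math
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_block(X, Wq, Wk, Wv, W1, b1, W2, b2):
    """One encoder layer in the spirit of the abstract's description:
    single-head self-attention followed by a point-wise MLP, each
    wrapped in a residual connection. X has shape (n_points, d)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    A = softmax(Q @ K.T / math.sqrt(K.shape[-1]))   # (n, n) attention weights
    H = X + A @ V                                   # self-attention + residual
    return H + np.tanh(H @ W1 + b1) @ W2 + b2       # point-wise MLP + residual

def caputo_l1(f_vals, dt, alpha):
    """L1 finite-difference approximation of the Caputo derivative of
    order alpha in (0, 1) at the last grid point t_n = n * dt, given
    samples f(t_0), ..., f(t_n) on a uniform grid."""
    n = len(f_vals) - 1
    k = np.arange(n)
    b = (k + 1.0) ** (1.0 - alpha) - k ** (1.0 - alpha)  # L1 weights
    diffs = f_vals[n - k] - f_vals[n - k - 1]            # backward differences
    return diffs @ b * dt ** (-alpha) / math.gamma(2.0 - alpha)

# The attention layer maps a batch of query points to an equally shaped
# output: 8 space-time collocation points embedded in width d = 4.
d = 4
X = rng.standard_normal((8, d))
params = [rng.standard_normal(s) * 0.1 for s in
          [(d, d), (d, d), (d, d), (d, 16), (16,), (16, d), (d,)]]
Y = attention_block(X, *params)

# The L1 scheme is exact for piecewise-linear f; for f(t) = t the Caputo
# derivative is D^alpha t = t^(1-alpha) / Gamma(2 - alpha).
alpha, dt, n = 0.5, 0.01, 100
t = np.linspace(0.0, n * dt, n + 1)
approx = caputo_l1(t, dt, alpha)
exact = (n * dt) ** (1.0 - alpha) / math.gamma(1.5)
```

In a full PINNs-style training loop, `caputo_l1` (or an automatic-differentiation analogue) would supply the fractional time derivative inside the PDE residual loss, while `attention_block` replaces the plain feedforward trunk; the inverse problem would additionally treat the space-dependent coefficient as a trainable parameter.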
Journal of Computational Science
Subject categories: Computer Science, Interdisciplinary Applications; Computer Science, Theory & Methods
CiteScore: 5.50
Self-citation rate: 3.00%
Articles per year: 227
Review time: 41 days
Aims and scope: Computational Science is a rapidly growing multi- and interdisciplinary field that uses advanced computing and data analysis to understand and solve complex problems. It has reached a level of predictive capability that now firmly complements the traditional pillars of experimentation and theory. Recent advances in experimental techniques such as detectors, on-line sensor networks and high-resolution imaging techniques have opened up new windows into physical and biological processes at many levels of detail. The resulting data explosion allows for detailed data-driven modeling and simulation. This new discipline in science combines computational thinking, modern computational methods, devices and collateral technologies to address problems far beyond the scope of traditional numerical methods. Computational science typically unifies three distinct elements:
• Modeling, Algorithms and Simulations (e.g. numerical and non-numerical, discrete and continuous);
• Software developed to solve science (e.g., biological, physical, and social), engineering, medicine, and humanities problems;
• Computer and information science that develops and optimizes the advanced system hardware, software, networking, and data management components (e.g. problem solving environments).
Latest articles in this journal:
• Establishing a massively parallel computational model of the adaptive immune response
• Unsupervised continual learning by cross-level, instance-group and pseudo-group discrimination with hard attention
• A cluster-based opposition differential evolution algorithm boosted by a local search for ECG signal classification
• Community-based voting approach to enhance the spreading dynamics by identifying a group of influential spreaders in complex networks
• Deep dive into generative models through feature interpoint distances