{"title":"Differentiable programming across the PDE and Machine Learning barrier","authors":"Nacime Bouziani, David A. Ham, Ado Farsi","doi":"arxiv-2409.06085","DOIUrl":null,"url":null,"abstract":"The combination of machine learning and physical laws has shown immense\npotential for solving scientific problems driven by partial differential\nequations (PDEs) with the promise of fast inference, zero-shot generalisation,\nand the ability to discover new physics. Examples include the use of\nfundamental physical laws as inductive bias to machine learning algorithms,\nalso referred to as physics-driven machine learning, and the application of\nmachine learning to represent features not represented in the differential\nequations such as closures for unresolved spatiotemporal scales. However, the\nsimulation of complex physical systems by coupling advanced numerics for PDEs\nwith state-of-the-art machine learning demands the composition of specialist\nPDE solving frameworks with industry-standard machine learning tools.\nHand-rolling either the PDE solver or the neural net will not cut it. In this\nwork, we introduce a generic differentiable programming abstraction that\nprovides scientists and engineers with a highly productive way of specifying\nend-to-end differentiable models coupling machine learning and PDE-based\ncomponents, while relying on code generation for high performance. Our\ninterface automates the coupling of arbitrary PDE-based systems and machine\nlearning models and unlocks new applications that could not hitherto be\ntackled, while only requiring trivial changes to existing code. Our framework\nhas been adopted in the Firedrake finite-element library and supports the\nPyTorch and JAX ecosystems, as well as downstream libraries.","PeriodicalId":501162,"journal":{"name":"arXiv - MATH - Numerical Analysis","volume":"10 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - MATH - Numerical Analysis","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.06085","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
The combination of machine learning and physical laws has shown immense
potential for solving scientific problems driven by partial differential
equations (PDEs) with the promise of fast inference, zero-shot generalisation,
and the ability to discover new physics. Examples include the use of
fundamental physical laws as an inductive bias for machine learning algorithms,
also referred to as physics-driven machine learning, and the use of machine
learning to represent features not captured by the differential equations, such
as closures for unresolved spatiotemporal scales. However, the
simulation of complex physical systems by coupling advanced numerics for PDEs
with state-of-the-art machine learning demands the composition of specialist
PDE-solving frameworks with industry-standard machine learning tools;
hand-rolling either the PDE solver or the neural network is not sufficient. In this
work, we introduce a generic differentiable programming abstraction that
provides scientists and engineers with a highly productive way of specifying
end-to-end differentiable models coupling machine learning and PDE-based
components, while relying on code generation for high performance. Our
interface automates the coupling of arbitrary PDE-based systems and machine
learning models, and unlocks new applications that hitherto could not be
tackled, while requiring only trivial changes to existing code. Our framework
has been adopted in the Firedrake finite-element library and supports the
PyTorch and JAX ecosystems, as well as downstream libraries.
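The abstract contains no code, so the following is only a minimal sketch of the kind of end-to-end coupling it describes: a Firedrake PDE solve embedded inside a PyTorch computation, with gradients flowing back through the solver to the network weights via the adjoint. It assumes the firedrake.ml.pytorch module and its fem_operator wrapper around a pyadjoint ReducedFunctional, as shipped with recent Firedrake releases; the mesh, equation, network, and tensor shapes are illustrative placeholders rather than anything taken from the paper, and exact names or shape conventions may differ between versions.

```python
import torch
import firedrake as fd
from firedrake.adjoint import continue_annotation, Control, ReducedFunctional
from firedrake.ml.pytorch import fem_operator  # assumed module path; may vary by release

# Toy PDE: a screened Poisson problem on the unit square, with the
# right-hand side f acting as the control supplied by the neural network.
mesh = fd.UnitSquareMesh(16, 16)
V = fd.FunctionSpace(mesh, "CG", 1)

continue_annotation()  # start recording operations on the adjoint tape

f = fd.Function(V)  # source term, to be produced by the network
u = fd.Function(V)  # PDE solution
v = fd.TestFunction(V)
F = (fd.inner(fd.grad(u), fd.grad(v)) + fd.inner(u, v) - fd.inner(f, v)) * fd.dx
fd.solve(F == 0, u)

# Scalar observable of the solution; any assembled functional of u would do.
J = fd.assemble(fd.inner(u, u) * fd.dx)
Jhat = ReducedFunctional(J, Control(f))

# Wrap the PDE-constrained functional as a PyTorch-differentiable operator.
G = fem_operator(Jhat)

# A small network proposes the source term from a latent vector.
# Firedrake works in float64, so the PyTorch side is kept in double precision.
net = torch.nn.Sequential(torch.nn.Linear(10, V.dim()), torch.nn.Tanh()).double()
z = torch.randn(10, dtype=torch.float64)
f_torch = net(z)

loss = G(f_torch)  # forward pass runs the Firedrake solve
loss.backward()    # backward pass runs the adjoint solve and reaches the network weights
```

The abstract states that JAX is supported as well; presumably the same pattern applies there, with the reduced functional exposed as a JAX-differentiable callable instead of a PyTorch one.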