Active Learning for Neural PDE Solvers

Daniel Musekamp, Marimuthu Kalimuthu, David Holzmüller, Makoto Takamoto, Mathias Niepert

arXiv:2408.01536 · arXiv - CS - Neural and Evolutionary Computing · Published 2024-08-02
Citations: 0
Abstract
Solving partial differential equations (PDEs) is a fundamental problem in
engineering and science. While neural PDE solvers can be more efficient than
established numerical solvers, they often require large amounts of training
data that is costly to obtain. Active Learning (AL) could help surrogate models
reach the same accuracy with smaller training sets by querying classical
solvers with more informative initial conditions and PDE parameters. While AL
is more common in other domains, it has yet to be studied extensively for
neural PDE solvers. To bridge this gap, we introduce AL4PDE, a modular and
extensible active learning benchmark. It provides multiple parametric PDEs and
state-of-the-art surrogate models for the solver-in-the-loop setting, enabling
the evaluation of existing and the development of new AL methods for PDE
solving. We use the benchmark to evaluate batch active learning algorithms such
as uncertainty- and feature-based methods. We show that AL reduces the average
error by up to 71% compared to random sampling and significantly reduces
worst-case errors. Moreover, AL generates similar datasets across repeated
runs, with consistent distributions over the PDE parameters and initial
conditions. The acquired datasets are reusable, providing benefits for
surrogate models not involved in the data generation.
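The solver-in-the-loop setting described above can be sketched in a few lines: an ensemble surrogate scores a pool of candidate PDE parameters by predictive disagreement, and the most uncertain candidates are sent to the (expensive) classical solver to grow the training set. This is a minimal, hypothetical illustration of uncertainty-based batch acquisition, not the AL4PDE API; all names (`run_classical_solver`, `Surrogate`, `select_batch`) and the toy linear model are assumptions for the sake of the example.

```python
# Hypothetical sketch of one solver-in-the-loop active learning round:
# an ensemble of surrogates, each trained on a bootstrap resample,
# provides an uncertainty score (prediction variance) over a candidate
# pool; the most uncertain batch is labeled by the classical solver.
# Names and the toy linear surrogate are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def run_classical_solver(params):
    """Stand-in for an expensive numerical PDE solver: maps PDE
    parameters to a solution summary (here, a toy nonlinearity)."""
    return np.sin(3.0 * params).sum(axis=-1, keepdims=True)

class Surrogate:
    """Toy linear surrogate fit by least squares; an ensemble of
    bootstrap-trained copies provides a disagreement signal."""
    def fit(self, X, y):
        A = np.hstack([X, np.ones((len(X), 1))])  # add bias column
        self.w, *_ = np.linalg.lstsq(A, y, rcond=None)
        return self

    def predict(self, X):
        A = np.hstack([X, np.ones((len(X), 1))])
        return A @ self.w

def select_batch(ensemble, pool, batch_size):
    """Uncertainty-based batch acquisition: pick the pool points
    where the ensemble members disagree most (highest variance)."""
    preds = np.stack([m.predict(pool) for m in ensemble])  # (E, N, 1)
    scores = preds.var(axis=0).squeeze(-1)                 # (N,)
    return np.argsort(scores)[-batch_size:]

# --- one active learning round on a 2-D toy parameter space ---
X_train = rng.uniform(-1, 1, size=(8, 2))
y_train = run_classical_solver(X_train)
pool = rng.uniform(-1, 1, size=(200, 2))  # candidate PDE parameters

ensemble = []
for _ in range(5):
    idx = rng.integers(0, len(X_train), len(X_train))  # bootstrap resample
    ensemble.append(Surrogate().fit(X_train[idx], y_train[idx]))

chosen = select_batch(ensemble, pool, batch_size=4)
X_new = pool[chosen]
y_new = run_classical_solver(X_new)   # query the expensive solver
X_train = np.vstack([X_train, X_new])
y_train = np.vstack([y_train, y_new])
```

In a real setting the toy linear model would be replaced by a neural surrogate and the pool by sampled initial conditions and PDE parameters; the loop structure (score pool, acquire batch, query solver, retrain) stays the same.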