Projection-based model-order reduction for unstructured meshes with graph autoencoders

Liam K. Magargal (Department of Mechanical Engineering and Mechanics, Lehigh University, Bethlehem, PA, United States), Parisa Khodabakhshi (Department of Mechanical Engineering and Mechanics, Lehigh University, Bethlehem, PA, United States), Steven N. Rodriguez (Computational Multiphysics Systems Laboratory, United States Naval Research Laboratory, Washington, DC, United States), Justin W. Jaworski (Kevin T. Crofton Department of Aerospace and Ocean Engineering, Virginia Tech, Blacksburg, VA, United States), John G. Michopoulos (Computational Multiphysics Systems Laboratory, United States Naval Research Laboratory, Washington, DC, United States)

arXiv - CS - Computational Engineering, Finance, and Science, arXiv:2407.13669, published 2024-07-18
Abstract
This paper presents a graph autoencoder architecture capable of performing projection-based model-order reduction (PMOR) on advection-dominated flows modeled by unstructured meshes. The autoencoder is coupled with the time-integration scheme from a traditional deep least-squares Petrov-Galerkin projection and provides the first deployment of a graph autoencoder in a PMOR framework. The graph autoencoder is constructed in two parts: (1) generating a hierarchy of reduced graphs to emulate the compressive ability of convolutional neural networks (CNNs), and (2) training a message-passing operation at each step in the hierarchy of reduced graphs to emulate the filtering process of a CNN. The resulting framework is more flexible than traditional CNN-based autoencoders because it extends to unstructured meshes. To highlight the capabilities of the proposed framework, named geometric deep least-squares Petrov-Galerkin (GD-LSPG), we benchmark the method on a one-dimensional Burgers' equation problem with a structured mesh, and we demonstrate the flexibility of GD-LSPG by deploying it to a two-dimensional Euler equations model that uses an unstructured mesh. For very low-dimensional latent spaces, the proposed framework achieves considerably better accuracy than traditional affine projections.
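To make the two-part construction concrete, the following is a minimal NumPy sketch (not the authors' code; the names GraphLevel, message_pass, and encode are hypothetical) of an encoder that alternates a learned message-passing step with pooling onto a precomputed hierarchy of reduced graphs, mirroring the filter-then-pool pattern of a CNN.

```python
# Minimal sketch of a hierarchical graph encoder: message passing
# (CNN-filter analogue) followed by pooling onto a coarser graph
# (CNN-pooling analogue). All names here are hypothetical.
import numpy as np

class GraphLevel:
    """One level of the reduced-graph hierarchy.

    adj  : (n, n) adjacency of this level's graph (self-loops included)
    pool : (m, n) assignment matrix mapping this level's n nodes onto
           the m nodes of the next, coarser level (rows sum to 1)
    """
    def __init__(self, adj, pool):
        deg = adj.sum(axis=1, keepdims=True)
        self.adj_norm = adj / deg          # row-normalized neighbor aggregation
        self.pool = pool

def message_pass(level, x, weight):
    """One learned filtering step: aggregate neighbor features, then mix
    channels with a trainable weight matrix and a ReLU nonlinearity."""
    return np.maximum(level.adj_norm @ x @ weight, 0.0)

def encode(levels, weights, x):
    """Alternate message passing and pooling down the hierarchy, then
    flatten the coarsest node features into a low-dimensional latent vector."""
    for level, w in zip(levels, weights):
        x = message_pass(level, x, w)
        x = level.pool @ x
    return x.reshape(-1)

# Toy example: a 4-node path graph coarsened to 2 nodes.
adj = np.array([[1, 1, 0, 0],
                [1, 1, 1, 0],
                [0, 1, 1, 1],
                [0, 0, 1, 1]], dtype=float)
pool = np.array([[0.5, 0.5, 0.0, 0.0],
                 [0.0, 0.0, 0.5, 0.5]])
rng = np.random.default_rng(0)
levels = [GraphLevel(adj, pool)]
weights = [rng.standard_normal((3, 2))]    # 3 input channels -> 2 channels
z = encode(levels, weights, rng.standard_normal((4, 3)))
print(z.shape)                              # (4,) latent coordinates
```

A decoder would plausibly mirror this structure with unpooling (e.g., transposed assignment matrices) and further message passing; training both halves end to end yields the nonlinear latent space used by the projection.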
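The latent state produced by such an encoder is then advanced with the deep LSPG time-integration scheme the abstract refers to. Below is a hedged illustration (an assumed, simplified form of that step, not the paper's implementation): one Gauss-Newton solve minimizes the norm of the full-order implicit residual evaluated on the decoded state, where `g` and `residual` are hypothetical user-supplied callables.

```python
# Sketch of one deep-LSPG time step: solve  min_z || residual(g(z)) ||_2
# by Gauss-Newton with a finite-difference Jacobian of residual(g(.)).
import numpy as np

def lspg_step(g, residual, z, eps=1e-6, iters=10, tol=1e-10):
    """g        : decoder, latent (k,) -> full-order state (N,)
    residual : implicit-step residual, full-order state (N,) -> (N,)
    z        : current latent state (k,), used as the initial guess
    """
    z = np.asarray(z, dtype=float)
    for _ in range(iters):
        r = residual(g(z))
        if np.linalg.norm(r) < tol:          # converged
            break
        J = np.empty((r.size, z.size))       # Jacobian d residual(g(z)) / d z
        for j in range(z.size):
            e = np.zeros_like(z)
            e[j] = eps
            J[:, j] = (residual(g(z + e)) - r) / eps
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)  # least-squares update
        z = z + step
    return z
```

For backward Euler applied to du/dt = f(u), for instance, residual(u) = u - u_prev - dt * f(u); the loop above then carries out the LSPG minimization over the latent coordinates rather than the full-order state.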