Three-dimensional generative adversarial networks for turbulent flow estimation from wall measurements

Antonio Cuéllar, Alejandro Güemes, Andrea Ianiro, Óscar Flores, Ricardo Vinuesa, Stefano Discetti
{"title":"根据壁面测量结果估算湍流的三维生成对抗网络","authors":"Antonio Cuéllar, Alejandro Güemes, Andrea Ianiro, Óscar Flores, Ricardo Vinuesa, Stefano Discetti","doi":"arxiv-2409.06548","DOIUrl":null,"url":null,"abstract":"Different types of neural networks have been used to solve the flow sensing\nproblem in turbulent flows, namely to estimate velocity in wall-parallel planes\nfrom wall measurements. Generative adversarial networks (GANs) are among the\nmost promising methodologies, due to their more accurate estimations and better\nperceptual quality. This work tackles this flow sensing problem in the vicinity\nof the wall, addressing for the first time the reconstruction of the entire\nthree-dimensional (3-D) field with a single network, i.e. a 3-D GAN. With this\nmethodology, a single training and prediction process overcomes the limitation\npresented by the former approaches based on the independent estimation of\nwall-parallel planes. The network is capable of estimating the 3-D flow field\nwith a level of error at each wall-normal distance comparable to that reported\nfrom wall-parallel plane estimations and at a lower training cost in terms of\ncomputational resources. The direct full 3-D reconstruction also unveils a\ndirect interpretation in terms of coherent structures. It is shown that the\naccuracy of the network depends directly on the wall footprint of each\nindividual turbulent structure. It is observed that wall-attached structures\nare predicted more accurately than wall-detached ones, especially at larger\ndistances from the wall. 
Among wall-attached structures, smaller sweeps are\nreconstructed better than small ejections, while large ejections are\nreconstructed better than large sweeps as a consequence of their more intense\nfootprint.","PeriodicalId":501125,"journal":{"name":"arXiv - PHYS - Fluid Dynamics","volume":"1 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Three-dimensional generative adversarial networks for turbulent flow estimation from wall measurements\",\"authors\":\"Antonio Cuéllar, Alejandro Güemes, Andrea Ianiro, Óscar Flores, Ricardo Vinuesa, Stefano Discetti\",\"doi\":\"arxiv-2409.06548\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Different types of neural networks have been used to solve the flow sensing\\nproblem in turbulent flows, namely to estimate velocity in wall-parallel planes\\nfrom wall measurements. Generative adversarial networks (GANs) are among the\\nmost promising methodologies, due to their more accurate estimations and better\\nperceptual quality. This work tackles this flow sensing problem in the vicinity\\nof the wall, addressing for the first time the reconstruction of the entire\\nthree-dimensional (3-D) field with a single network, i.e. a 3-D GAN. With this\\nmethodology, a single training and prediction process overcomes the limitation\\npresented by the former approaches based on the independent estimation of\\nwall-parallel planes. The network is capable of estimating the 3-D flow field\\nwith a level of error at each wall-normal distance comparable to that reported\\nfrom wall-parallel plane estimations and at a lower training cost in terms of\\ncomputational resources. The direct full 3-D reconstruction also unveils a\\ndirect interpretation in terms of coherent structures. 
It is shown that the\\naccuracy of the network depends directly on the wall footprint of each\\nindividual turbulent structure. It is observed that wall-attached structures\\nare predicted more accurately than wall-detached ones, especially at larger\\ndistances from the wall. Among wall-attached structures, smaller sweeps are\\nreconstructed better than small ejections, while large ejections are\\nreconstructed better than large sweeps as a consequence of their more intense\\nfootprint.\",\"PeriodicalId\":501125,\"journal\":{\"name\":\"arXiv - PHYS - Fluid Dynamics\",\"volume\":\"1 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-09-10\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - PHYS - Fluid Dynamics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2409.06548\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - PHYS - Fluid Dynamics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.06548","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

Different types of neural networks have been used to solve the flow-sensing problem in turbulent flows, namely estimating velocity in wall-parallel planes from wall measurements. Generative adversarial networks (GANs) are among the most promising methodologies, owing to their more accurate estimates and better perceptual quality. This work tackles the flow-sensing problem in the vicinity of the wall, addressing for the first time the reconstruction of the entire three-dimensional (3-D) field with a single network, i.e. a 3-D GAN. With this methodology, a single training and prediction process overcomes the limitation of former approaches based on the independent estimation of wall-parallel planes. The network estimates the 3-D flow field with an error at each wall-normal distance comparable to that reported for wall-parallel-plane estimations, and at a lower training cost in terms of computational resources. The full 3-D reconstruction also enables a direct interpretation in terms of coherent structures: the accuracy of the network depends directly on the wall footprint of each individual turbulent structure. Wall-attached structures are predicted more accurately than wall-detached ones, especially at larger distances from the wall. Among wall-attached structures, small sweeps are reconstructed better than small ejections, while large ejections are reconstructed better than large sweeps, as a consequence of their more intense footprint.
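The abstract evaluates the 3-D estimate by its error at each wall-normal distance, i.e. one error value per wall-parallel plane of the reconstructed field. The sketch below is a hypothetical illustration of that plane-wise metric, not code from the paper: the field shapes, variable names, and the synthetic "prediction" are all assumptions made for the example.

```python
import numpy as np

# Hypothetical illustration (not the paper's code): compare a synthetic
# "predicted" 3-D velocity field against a reference field, plane by plane
# in the wall-normal direction, mirroring how the abstract reports error
# at each wall-normal distance.
rng = np.random.default_rng(0)

ny, nx, nz = 8, 16, 16                     # assumed wall-normal, streamwise, spanwise grid
u_ref = rng.standard_normal((ny, nx, nz))  # stand-in reference field
u_pred = u_ref + 0.1 * rng.standard_normal((ny, nx, nz))  # small estimation error

def planewise_rmse(pred, ref):
    """RMS error over each wall-parallel plane: one value per wall-normal index."""
    err = pred - ref
    return np.sqrt(np.mean(err**2, axis=(1, 2)))

rmse = planewise_rmse(u_pred, u_ref)  # shape (ny,): error vs wall-normal distance
```

A curve of `rmse` against wall-normal distance is the kind of per-plane comparison the abstract refers to when matching the 3-D GAN against independent wall-parallel-plane estimators.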