On the Deep Active-Subspace Method

IF 2.1 · CAS Tier 3 (Engineering & Technology) · JCR Q2 (Mathematics, Interdisciplinary Applications) · SIAM/ASA Journal on Uncertainty Quantification · Pub Date: 2023-02-02 · DOI: 10.1137/21m1463240
W. Edeling
{"title":"On the Deep Active-Subspace Method","authors":"W. Edeling","doi":"10.1137/21m1463240","DOIUrl":null,"url":null,"abstract":". The deep active-subspace method is a neural-network based tool for the propagation of uncertainty through computational models with high-dimensional input spaces. Unlike the original active-subspace method, it does not require access to the gradient of the model. It relies on an orthogonal projection matrix constructed with Gram--Schmidt orthogonalization to reduce the input dimensionality. This matrix is incorporated into a neural network as the weight matrix of the first hidden layer (acting as an orthogonal encoder), and optimized using back propagation to identify the active subspace of the input. We propose several theoretical extensions, starting with a new analytic relation for the derivatives of Gram--Schmidt vectors, which are required for back propagation. We also study the use of vector-valued model outputs, which is difficult in the case of the original active-subspace method. Additionally, we investigate an alternative neural network with an encoder without embedded orthonormality, which shows equally good performance compared to the deep active-subspace method. Two epidemiological models are considered as applications, where one requires supercomputer access to generate the training data.","PeriodicalId":56064,"journal":{"name":"Siam-Asa Journal on Uncertainty Quantification","volume":"61 1","pages":"62-90"},"PeriodicalIF":2.1000,"publicationDate":"2023-02-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Siam-Asa Journal on Uncertainty Quantification","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1137/21m1463240","RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATHEMATICS, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Citations: 1

Abstract

The deep active-subspace method is a neural-network-based tool for the propagation of uncertainty through computational models with high-dimensional input spaces. Unlike the original active-subspace method, it does not require access to the gradient of the model. It relies on an orthogonal projection matrix constructed with Gram-Schmidt orthogonalization to reduce the input dimensionality. This matrix is incorporated into a neural network as the weight matrix of the first hidden layer (acting as an orthogonal encoder) and optimized using backpropagation to identify the active subspace of the input. We propose several theoretical extensions, starting with a new analytic relation for the derivatives of Gram-Schmidt vectors, which are required for backpropagation. We also study the use of vector-valued model outputs, which is difficult in the case of the original active-subspace method. Additionally, we investigate an alternative neural network with an encoder without embedded orthonormality, which shows equally good performance compared to the deep active-subspace method. Two epidemiological models are considered as applications, one of which requires supercomputer access to generate the training data.
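To make the construction described above concrete, below is a minimal sketch (not the author's implementation) of the core idea: a trainable weight matrix is re-orthonormalized with Gram-Schmidt on every forward pass so that the first layer acts as an orthogonal encoder onto a low-dimensional active subspace, and this encoder is trained jointly with the rest of the surrogate by backpropagation. The paper derives an analytic relation for the Gram-Schmidt derivatives; this sketch instead lets PyTorch's automatic differentiation handle them. The class name `DeepActiveSubspaceSurrogate`, the layer sizes, and the toy data are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of a deep active-subspace style surrogate (assumptions noted above).
import torch
import torch.nn as nn


def gram_schmidt(W: torch.Tensor) -> torch.Tensor:
    """Orthonormalize the columns of W (shape D x d) with classical Gram-Schmidt."""
    cols = []
    for j in range(W.shape[1]):
        v = W[:, j]
        for q in cols:
            v = v - (q @ v) * q          # remove components along earlier directions
        cols.append(v / v.norm())        # normalize the remainder
    return torch.stack(cols, dim=1)      # D x d orthonormal projection matrix


class DeepActiveSubspaceSurrogate(nn.Module):
    def __init__(self, dim_in: int, dim_active: int, hidden: int = 64, dim_out: int = 1):
        super().__init__()
        # Raw (unconstrained) encoder weights; orthonormality is enforced in forward().
        self.W = nn.Parameter(torch.randn(dim_in, dim_active))
        # Maps the active variables to the (possibly vector-valued) model output.
        self.head = nn.Sequential(
            nn.Linear(dim_active, hidden), nn.Tanh(), nn.Linear(hidden, dim_out)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        Q = gram_schmidt(self.W)   # orthogonal encoder, differentiated by autograd
        y = x @ Q                  # active variables, shape (batch, dim_active)
        return self.head(y)


# Usage on synthetic data: a 20-D toy model whose output depends mainly on one direction.
if __name__ == "__main__":
    torch.manual_seed(0)
    X = torch.rand(512, 20)
    f = torch.exp(0.3 * X.sum(dim=1, keepdim=True))   # toy high-dimensional model output
    model = DeepActiveSubspaceSurrogate(dim_in=20, dim_active=1)
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    for _ in range(500):
        opt.zero_grad()
        loss = torch.mean((model(X) - f) ** 2)
        loss.backward()
        opt.step()
```

After training, the columns of the orthonormalized matrix Q span the learned active subspace, and the surrogate can be evaluated cheaply for uncertainty propagation in place of the original model.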
Source journal: SIAM/ASA Journal on Uncertainty Quantification (Mathematics - Statistics and Probability)
CiteScore: 3.70
Self-citation rate: 0.00%
Articles published per year: 51
About the journal: SIAM/ASA Journal on Uncertainty Quantification (JUQ) publishes research articles presenting significant mathematical, statistical, algorithmic, and application advances in uncertainty quantification, defined as the interface of complex modeling of processes and data, especially characterizations of the uncertainties inherent in the use of such models. The journal also covers related fields such as sensitivity analysis, model validation, model calibration, data assimilation, and code verification, and solicits papers describing new ideas that could lead to significant progress in methodology for uncertainty quantification, as well as review articles on particular aspects. The journal is dedicated to nurturing synergistic interactions between the mathematical, statistical, computational, and applications communities involved in uncertainty quantification and related areas. JUQ is jointly offered by SIAM and the American Statistical Association.