Fire scene reconstruction and determination of the fire evolution (i.e., item-to-item ignition events) from the post-fire compartment is an extremely difficult task because of the time-integrated nature of the observed damage. Bayesian methods are ideal for making inferences among hypotheses given observations and naturally incorporate uncertainties. A Bayesian methodology is developed for assigning probabilities, based on damage signatures, to the items that may have initiated a compartment fire. Exercising this methodology requires uncertainty quantification of these damage signatures. A simple compartment configuration was used to quantify the uncertainty in damage predictions by the Fire Dynamics Simulator (FDS) and by a compartment fire evolution program, JT-risk, as compared with experimentally derived damage signatures. Surrogate sensors spaced within the compartment use heat flux data collected over the course of the simulations to inform damage models. Experimental repeatability showed up to 4% uncertainty in damage signatures between replicates. Uncertainties for FDS and JT-risk ranged from 12% up to 32% when compared to experimental damage. Separately, the evolution physics of a simple three-fuel-package problem with surrogate damage sensors was characterized in a compartment using experimental data, FDS, and JT-risk predictions. A simple ignition model was used for each of the fuel packages. The Bayesian methodology was exercised using the collected damage signatures, cycling through each of the three fuel packages as the hypothesized first item ignited, combined with the previously quantified uncertainties. Only reconstruction using experimental data was able to confidently identify the true hypothesis among the three scenarios.
{"title":"Experimental and Modeling Uncertainty Considerations for Determining the First Item Ignited in a Compartment Using a Bayesian Method","authors":"J. Cabrera, R. Moser, O. Ezekoye","doi":"10.1115/1.4052796","DOIUrl":"https://doi.org/10.1115/1.4052796","url":null,"abstract":"\u0000 Fire scene reconstruction and determining the fire evolution (i.e. item-to-item ignition events) using the post-fire compartment is an extremely difficult task because of the time-integrated nature of the observed damages. Bayesian methods are ideal for making inferences amongst hypotheses given observations and are able to naturally incorporate uncertainties.\u0000 A Bayesian methodology for determining probabilities to items that may have initiated the fire in a compartment from damage signatures is developed. Exercise of this methodology requires uncertainty quantification of these damage signatures. A simple compartment configuration was used to quantify the uncertainty in damage predictions by Fire Dynamics Simulator (FDS), and a compartment evolution program, JT-risk as compared to experimentally derived damage signatures. Surrogate sensors spaced within the compartment use heat flux data collected over the course of the simulations to inform damage models. Experimental repeatability showed up to 4% uncertainty in damage signatures between replicates . Uncertainties for FDS and JT-risk ranged from 12% up to 32% when compared to experimental damages.\u0000 Separately, the evolution physics of a simple three fuel package problem with surrogate damage sensors were characterized in a compartment using experimental data, FDS, and JT-risk predictions. An simple ignition model was used for each of the fuel packages. The Bayesian methodology was exercised using the damage signatures collected, cycling through each of the three fuel packages, and combined with the previously quantified uncertainties. Only reconstruction using experimental data was able to confidently predict the true hypothesis from the three scenarios.","PeriodicalId":52254,"journal":{"name":"Journal of Verification, Validation and Uncertainty Quantification","volume":" ","pages":""},"PeriodicalIF":0.6,"publicationDate":"2021-10-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45535369","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
This paper documents a computational fluid dynamics (CFD) validation benchmark experiment for flow through three parallel, heated channels from one plenum to another. The test section was installed in a facility designed for natural-convection benchmark validation experiments. The focus of these experiments was the highly coupled thermal-fluid dynamics that occur between mixing jets in the upper plenum of the wind tunnel. A thermal instability in mixing jets, called thermal striping, can cause damage to structures, which is a concern for high-temperature gas reactors. Nine experimental cases were explored by varying the relative channel temperature or the blower speed. The boundary conditions for CFD validation were measured and tabulated along with their uncertainties. Geometry measurements of the triple-channel test section were used to build an as-built solid model for use in simulation. The outer tunnel and channel surface temperatures, the pressure drop across the test section, atmospheric conditions, and the inflow into the upper plenum were measured or calculated as boundary conditions. The air velocity and temperature were measured in the jet-mixing region of the upper plenum as system response quantities.
{"title":"Benchmark Validation Experiment of Plenum-to-Plenum Flow Through Heated Parallel Channels","authors":"A. W. Parker, Barton L. Smith","doi":"10.1115/1.4052763","DOIUrl":"https://doi.org/10.1115/1.4052763","url":null,"abstract":"\u0000 This paper documents a computational fluid dynamics (CFD) validation benchmark experiment for flow through three parallel, heated channels from one plenum to another. The test section was installed into a facility designed for natural convection benchmark validation experiments. The focus of these experiments was the highly-coupled thermal-fluid dynamics that occur between mixing jets in the upper plenum of the wind tunnel. A thermal instability in mixing jets, called thermal striping, can cause damage to structures which is a concern for High Temperature Gas Reactors. Nine experimental cases were explored by varying the relative channel temperature or blower speed. The boundary conditions for CFD validation were measured and tabulated along with an uncertainty. Geometry measurements of the triple channel test section were used to make an as-built solid model for use in simulation. The outer tunnel and channel surface temperatures, the pressure drop across the test section, atmospheric conditions, and inflow into the upper plenum were measured or calculated for the boundary conditions. The air velocity and temperature were measured in the jet mixing region of the upper plenum as system response quantities.","PeriodicalId":52254,"journal":{"name":"Journal of Verification, Validation and Uncertainty Quantification","volume":" ","pages":""},"PeriodicalIF":0.6,"publicationDate":"2021-10-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"47262484","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Modeling and simulation (M&S) is seen as a means to mitigate the difficulties associated with the increased system complexity, integration, and cross-coupling effects encountered during the development of aircraft subsystems. Consequently, knowledge of model validity is necessary for making robust and justified design decisions. This paper presents a method for using coverage metrics to formulate an optimal model validation strategy. Three fundamentally different and industrially relevant use-cases are presented. The first use-case entails the successive identification of validation settings, and the second considers the simultaneous identification of n validation settings. The latter is then expanded, in a third use-case, to incorporate a secondary model-based objective into the optimization problem. The approach is designed to be scalable and generic to models of industrially relevant complexity. As a result, experiments for validation are selected objectively with little manual effort.
{"title":"Optimal Selection of Model Validation Experiments: Guided by Coverage","authors":"Robert Hällqvist, R. Braun, M. Eek, P. Krus","doi":"10.1115/1.4051497","DOIUrl":"https://doi.org/10.1115/1.4051497","url":null,"abstract":"\u0000 Modeling and Simulation (M&S) is seen as a means to mitigate the difficulties associated with increased system complexity, integration, and cross-couplings effects encountered during development of aircraft subsystems. As a consequence, knowledge of model validity is necessary for taking robust and justified design decisions. This paper presents a method for using coverage metrics to formulate an optimal model validation strategy. Three fundamentally different and industrially relevant use-cases are presented. The first use-case entails the successive identification of validation settings, and the second considers the simultaneous identification of n validation settings. The latter of these two use-cases is finally expanded to incorporate a secondary model-based objective to the optimization problem in a third use-case. The approach presented is designed to be scalable and generic to models of industrially relevant complexity. As a result, selecting experiments for validation is done objectively with little required manual effort.","PeriodicalId":52254,"journal":{"name":"Journal of Verification, Validation and Uncertainty Quantification","volume":" ","pages":""},"PeriodicalIF":0.6,"publicationDate":"2021-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44088987","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Response to “Closure on the Discussion of “Models, Uncertainty, and the Sandia V&V Challenge Problem” ” (Oberkampf, W. L., and Balch, M. S., ASME J. Verif. Valid. Uncert., 2020, 5(3), p. 035501-1)","authors":"G. Hazelrigg, G. Klutke","doi":"10.1115/1.4051591","DOIUrl":"https://doi.org/10.1115/1.4051591","url":null,"abstract":"","PeriodicalId":52254,"journal":{"name":"Journal of Verification, Validation and Uncertainty Quantification","volume":" ","pages":""},"PeriodicalIF":0.6,"publicationDate":"2021-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"42827017","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
We present a framework for establishing the credibility of a machine learning (ML) model used to predict a key process control variable setting that maximizes product quality in a component manufacturing application. Our model coupled a purely data-based ML model with a physics-based adjustment that encoded subject-matter expertise of the physical process. Establishing the credibility of the resulting model provided the basis for eliminating a costly intermediate testing process that was previously used to determine the control variable setting.
{"title":"Credibility Assessment of Machine Learning in a Manufacturing Process Application","authors":"G. Banyay, Clarence Worrell, S. E. Sidener, Joshua S. Kaizer","doi":"10.1115/1.4051717","DOIUrl":"https://doi.org/10.1115/1.4051717","url":null,"abstract":"\u0000 We present a framework for establishing credibility of a machine learning (ML) model used to predict a key process control variable setting to maximize product quality in a component manufacturing application. Our model coupled a purely data-based ML model with a physics-based adjustment that encoded subject matter expertise of the physical process. Establishing credibility of the resulting model provided the basis for eliminating a costly intermediate testing process that was previously used to determine the control variable setting.","PeriodicalId":52254,"journal":{"name":"Journal of Verification, Validation and Uncertainty Quantification","volume":" ","pages":""},"PeriodicalIF":0.6,"publicationDate":"2021-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"41925707","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Uncertainty quantification (UQ) is an important step in the verification and validation of scientific computing. Validation can be inconclusive when uncertainties are larger than acceptable ranges for both the simulation and the experiment. Therefore, uncertainty reduction (UR) is important for achieving a meaningful validation. A unique approach in this paper is to separate model error from uncertainty so that UR can reveal the model error. This paper aims to share lessons learned from the UQ and UR of a horizontal shock tube simulation whose goal is to validate the particle drag force model for compressible multiphase flow. First, simulation UQ revealed an inconsistency in the simulation predictions due to the numerical flux scheme, which was clearly shown using a parametric design of experiments. Improving the numerical flux scheme removed the uncertainty due to this inconsistency while increasing the overall prediction error. Second, the mismatch between the geometry of the experiments and the simplified one-dimensional simulation model was identified as a lack of knowledge. After modifying the simulation conditions and experiments, the error due to the mismatch turned out to be small, which was unexpected based on expert opinion. Last, the uncertainty in the initial particle volume fraction was reduced through rigorous UQ. Together, these UR measures revealed a hidden modeling error in the simulation predictions, which can lead to model improvement in the future. We summarize the lessons learned from this exercise in terms of empty success, useful failure, and deceptive success.
{"title":"Uncertainty Reduction for Model Error Detection in Multiphase Shock Tube Simulation","authors":"Chanyoung Park, Samaun Nili, Justin T. Mathew, F. Ouellet, R. Koneru, N. Kim, S. Balachandar, R. Haftka","doi":"10.1115/1.4051407","DOIUrl":"https://doi.org/10.1115/1.4051407","url":null,"abstract":"\u0000 Uncertainty quantification (UQ) is an important step in the verification and validation of scientific computing. Validation can be inconclusive when uncertainties are larger than acceptable ranges for both simulation and experiment. Therefore, uncertainty reduction (UR) is important to achieve meaningful validation. A unique approach in this paper is to separate model error from uncertainty such that UR can reveal the model error. This paper aims to share lessons learned from UQ and UR of a horizontal shock tube simulation, whose goal is to validate the particle drag force model for the compressible multiphase flow. First, simulation UQ revealed the inconsistency in simulation predictions due to the numerical flux scheme, which was clearly shown using the parametric design of experiments. By improving the numerical flux scheme, the uncertainty due to inconsistency was removed, while increasing the overall prediction error. Second, the mismatch between the geometry of the experiments and the simplified 1D simulation model was identified as a lack of knowledge. After modifying simulation conditions and experiments, it turned out that the error due to the mismatch was small, which was unexpected based on expert opinions. Last, the uncertainty in the initial volume fraction of particles was reduced based on rigorous UQ. All these UR measures worked together to reveal the hidden modeling error in the simulation predictions, which can lead to a model improvement in the future. We summarized the lessons learned from this exercise in terms of empty success, useful failure, and deceptive success.","PeriodicalId":52254,"journal":{"name":"Journal of Verification, Validation and Uncertainty Quantification","volume":" ","pages":""},"PeriodicalIF":0.6,"publicationDate":"2021-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"46906057","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
This paper presents a practical methodology for propagating and processing the uncertainties associated with random measurement and estimation errors (which vary from test to test) and systematic measurement and estimation errors (uncertain but similar from test to test) in the inputs and outputs of replicate tests, in order to characterize the response variability of stochastically varying test units. Also treated are test-condition control variability from test to test and sampling uncertainty due to limited numbers of replicate tests. These aleatory variabilities and epistemic uncertainties produce uncertainty in the computed statistics of output response quantities. The methodology was developed in the context of processing experimental data for “real-space” (RS) model validation comparisons against model-predicted statistics and their uncertainty. The methodology is flexible and sufficient for many types of experimental and data uncertainty, offering the most extensive data uncertainty quantification (UQ) treatment of any model validation method the authors are aware of. It handles both interval and probabilistic uncertainty descriptions and can be performed at relatively little computational cost through the use of simple and effective dimension- and order-adaptive polynomial response surfaces in a Monte Carlo (MC) uncertainty propagation approach. A key feature of the progressively upgraded response surfaces is that they enable estimation of the propagation error contributed by the surrogate model. A sensitivity analysis of the relative contributions of the various uncertainty sources to the total uncertainty of the statistical estimates is also presented. The methodologies are demonstrated on real experimental validation data, involving all the mentioned sources and types of error and uncertainty, from five replicate tests of pressure vessels heated and pressurized to failure. Simple spreadsheet procedures are used for all processing operations.
{"title":"Processing Aleatory and Epistemic Uncertainties in Experimental Data From Sparse Replicate Tests of Stochastic Systems for Real-Space Model Validation","authors":"V. Romero, A. Black","doi":"10.1115/1.4051069","DOIUrl":"https://doi.org/10.1115/1.4051069","url":null,"abstract":"\u0000 This paper presents a practical methodology for propagating and processing uncertainties associated with random measurement and estimation errors (that vary from test-to-test) and systematic measurement and estimation errors (uncertain but similar from test-to-test) in inputs and outputs of replicate tests to characterize response variability of stochastically varying test units. Also treated are test condition control variability from test-to-test and sampling uncertainty due to limited numbers of replicate tests. These aleatory variabilities and epistemic uncertainties result in uncertainty on computed statistics of output response quantities. The methodology was developed in the context of processing experimental data for “real-space” (RS) model validation comparisons against model-predicted statistics and uncertainty thereof. The methodology is flexible and sufficient for many types of experimental and data uncertainty, offering the most extensive data uncertainty quantification (UQ) treatment of any model validation method the authors are aware of. It handles both interval and probabilistic uncertainty descriptions and can be performed with relatively little computational cost through use of simple and effective dimension- and order-adaptive polynomial response surfaces in a Monte Carlo (MC) uncertainty propagation approach. A key feature of the progressively upgraded response surfaces is that they enable estimation of propagation error contributed by the surrogate model. Sensitivity analysis of the relative contributions of the various uncertainty sources to the total uncertainty of statistical estimates is also presented. The methodologies are demonstrated on real experimental validation data involving all the mentioned sources and types of error and uncertainty in five replicate tests of pressure vessels heated and pressurized to failure. Simple spreadsheet procedures are used for all processing operations.","PeriodicalId":52254,"journal":{"name":"Journal of Verification, Validation and Uncertainty Quantification","volume":" ","pages":""},"PeriodicalIF":0.6,"publicationDate":"2021-05-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45313509","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
This work investigates the importance of verification and validation (V&V) for achieving predictive scale-resolving simulations (SRS) of turbulence, i.e., computations capable of resolving a fraction of the turbulent flow scales. Toward this end, we propose a novel but simple V&V strategy based on grid and physical-resolution refinement studies that can be used even when the exact initial flow conditions are unknown or reference data are unavailable. This is particularly relevant for transient and transitional flow problems, as well as for the improvement of turbulence models. We start by presenting a literature survey of results obtained with distinct SRS models for flows past circular cylinders. It confirms the importance of V&V by illustrating a large variability of results that is independent of the selected mathematical model and Reynolds number. The proposed V&V strategy is then used on three representative problems of practical interest. The results illustrate that it is possible to conduct reliable verification and validation exercises with SRS models and demonstrate the importance of V&V for predictive SRS of turbulence. Most notably, the data also confirm the advantages and potential of the proposed V&V strategy: separate assessment of numerical and modeling errors, enhanced flow-physics analysis, identification of key flow phenomena, and the ability to operate when the exact flow conditions are unknown or reference data are unavailable.
{"title":"Verification and Validation: the Path to Predictive Scale-Resolving Simulations of Turbulence","authors":"F. Pereira, Fernando Grinstein, Daniel Israel, L. Eça","doi":"10.1115/1.4053884","DOIUrl":"https://doi.org/10.1115/1.4053884","url":null,"abstract":"\u0000 This work investigates the importance of verification and validation (V&V) to achieve predictive scale-resolving simulations (SRS) of turbulence, i.e., computations capable of resolving a fraction of the turbulent flow scales. Toward this end, we propose a novel but simple V&V strategy based on grid and physical resolution refinement studies that can be used even when the exact initial flow conditions are unknown, or reference data are unavailable. This is particularly relevant for transient and transitional flow problems, as well as for the improvement of turbulence models. We start by presenting a literature survey of results obtained with distinct SRS models for flows past circular cylinders. It confirms the importance of V&V by illustrating a large variability of results, which is independent of the selected mathematical model and Reynolds number. The proposed V&V strategy is then used on three representative problems of practical interest. The results illustrate that it is possible to conduct reliable verification and validation exercises with SRS models, and evidence the importance of V&V to predictive SRS of turbulence. Most notably, the data also confirm the advantages and potential of the proposed V&V strategy: separate assessment of numerical and modeling errors, enhanced flow physics analysis, identification of key flow phenomena, and ability to operate when the exact flow conditions are unknown or reference data are unavailable.","PeriodicalId":52254,"journal":{"name":"Journal of Verification, Validation and Uncertainty Quantification","volume":" ","pages":""},"PeriodicalIF":0.6,"publicationDate":"2021-03-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45375874","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Computational modeling and simulation is a central tool of science and engineering, directed at solving partial differential equations for which analytical solutions are unavailable. The continuous equations are generally discretized in time, space, energy, etc., to obtain approximate solutions using a numerical method. The aspiration is for the numerical solutions to converge asymptotically to the exact-but-unknown solution as the discretization size approaches zero. A generally applicable procedure to assure convergence is unavailable. Richardson extrapolation is the main method for dealing with this challenge, but its assumptions introduce uncertainty into the resulting approximation. We use info-gap decision theory to model and manage its main uncertainty, namely, the rate of convergence of the numerical solutions. The theory is illustrated with a numerical application to Hertz contact in solid mechanics.
{"title":"Richardson Extrapolation: An Info-Gap Analysis of Numerical Uncertainty","authors":"Y. Ben-Haim, F. Hemez","doi":"10.1115/1.4048004","DOIUrl":"https://doi.org/10.1115/1.4048004","url":null,"abstract":"\u0000 Computational modeling and simulation is a central tool in science and engineering, directed at solving partial differential equations for which analytical solutions are unavailable. The continuous equations are generally discretized in time, space, energy, etc., to obtain approximate solutions using a numerical method. The aspiration is for the numerical solutions to asymptotically converge to the exact-but-unknown solution as the discretization size approaches zero. A generally applicable procedure to assure convergence is unavailable. The Richardson extrapolation is the main method for dealing with this challenge, but its assumptions introduce uncertainty to the resulting approximation. We use info-gap decision theory to model and manage its main uncertainty, namely, in the rate of convergence of numerical solutions. The theory is illustrated with a numerical application to Hertz contact in solid mechanics.","PeriodicalId":52254,"journal":{"name":"Journal of Verification, Validation and Uncertainty Quantification","volume":"1 1","pages":""},"PeriodicalIF":0.6,"publicationDate":"2020-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"42901275","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Engineers and computational scientists often study the behavior of their simulations through repeated solutions with variations in their parameters, which can be, for instance, boundary values or initial conditions. Through such simulation ensembles, uncertainty in a solution is studied as a function of the various input parameters. The solutions of numerical simulations are often temporal functions, spatial maps, or spatio-temporal outputs. The usual way to deal with such complex outputs is to limit the analysis to several probes in the temporal/spatial domain. This leads to smaller and more tractable ensembles of functional outputs (curves) with their associated input parameters: augmented ensembles of curves. This article describes a system for the interactive exploration and analysis of such augmented ensembles. Descriptive statistics on the functional outputs are computed by principal component analysis (PCA) projection, kernel density estimation, and the computation of high-density regions, which makes it possible to derive functional quantiles and detect outliers. Brushing and linking the elements of the system allows in-depth analysis of the ensemble. The system supports functional descriptive statistics, cluster detection, and, finally, a visual sensitivity analysis via cobweb plots. We present two synthetic examples and then validate our approach in an industrial use-case concerning a marine current study using a hydraulic solver.
{"title":"A Visual Sensitivity Analysis for Parameter-Augmented Ensembles of Curves","authors":"A. Ribés, Joachim Pouderoux, B. Iooss","doi":"10.1115/1.4046020","DOIUrl":"https://doi.org/10.1115/1.4046020","url":null,"abstract":"Engineers and computational scientists often study the behavior of their simulations by repeated solutions with variations in their parameters, which can be, for instance, boundary values or initial conditions. Through such simulation ensembles, uncertainty in a solution is studied as a function of various input parameters. Solutions of numerical simulations are often temporal functions, spatial maps, or spatio-temporal outputs. The usual way to deal with such complex outputs is to limit the analysis to several probes in the temporal/spatial domain. This leads to smaller and more tractable ensembles of functional outputs (curves) with their associated input parameters: augmented ensembles of curves. This article describes a system for the interactive exploration and analysis of such augmented ensembles. Descriptive statistics on the functional outputs are performed by principal component analysis (PCA) projection, kernel density estimation, and the computation of high density regions. This makes possible the calculation of functional quantiles and outliers. Brushing and linking the elements of the system allows in-depth analysis of the ensemble. The system allows for functional descriptive statistics, cluster detection, and finally, for the realization of a visual sensitivity analysis via cobweb plots. We present two synthetic examples and then validate our approach in an industrial use-case concerning a marine current study using a hydraulic solver.","PeriodicalId":52254,"journal":{"name":"Journal of Verification, Validation and Uncertainty Quantification","volume":" ","pages":""},"PeriodicalIF":0.6,"publicationDate":"2019-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"47240511","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}