{"title":"将模拟结果与实验数据进行比较的基于似然和深度的标准,以支持数值模拟器的验证","authors":"Amandine Marrel, Heloise Velardo, Antoine Bouloré","doi":"10.1615/int.j.uncertaintyquantification.2023046666","DOIUrl":null,"url":null,"abstract":"Within the framework of Best-Estimate-Plus-Uncertainty approaches, the assessment of model parameter uncertainties, associated with numerical simulators, is a key element in safety analysis. The results (or outputs) of the simulation must be compared and validated against experimental values, when such data is available. This validation step, as part of the broader Verification, Validation and Uncertainty Quantification process, is required to ensure a reliable use of the simulator for modeling and prediction. This work aims to define quantitative criteria to support this validation for multivariate outputs, while taking into account modeling uncertainties (uncertain input parameters) and experimental uncertainties (measurement uncertainties). For this purpose, different statistical indicators, based on likelihood or statistical depths, are investigated and extended to the multidimensional case. First, the properties of the criteria are studied, either analytically or by simulation, for some specific cases (Gaussian distribution for experimental uncertainties, identical distributions of experiments and simulations, particular discrepancies). Then, some natural extensions to multivariate outputs are proposed, with guidelines for practical use depending on the objectives of the validation (strict/hard or average validation). From this, transformed criteria are proposed to make them more comparable and less sensitive to the dimension of the output. It is shown that these transformations allow for a fairer and more relevant comparison and interpretation of the different criteria. Finally, these criteria are applied to a code dedicated to nuclear material behavior simulation.","PeriodicalId":48814,"journal":{"name":"International Journal for Uncertainty Quantification","volume":"33 1","pages":"0"},"PeriodicalIF":1.5000,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Likelihood and depth-based criteria for comparing simulation results with experimental data, in support to validation of numerical simulators\",\"authors\":\"Amandine Marrel, Heloise Velardo, Antoine Bouloré\",\"doi\":\"10.1615/int.j.uncertaintyquantification.2023046666\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Within the framework of Best-Estimate-Plus-Uncertainty approaches, the assessment of model parameter uncertainties, associated with numerical simulators, is a key element in safety analysis. The results (or outputs) of the simulation must be compared and validated against experimental values, when such data is available. This validation step, as part of the broader Verification, Validation and Uncertainty Quantification process, is required to ensure a reliable use of the simulator for modeling and prediction. This work aims to define quantitative criteria to support this validation for multivariate outputs, while taking into account modeling uncertainties (uncertain input parameters) and experimental uncertainties (measurement uncertainties). For this purpose, different statistical indicators, based on likelihood or statistical depths, are investigated and extended to the multidimensional case. 
First, the properties of the criteria are studied, either analytically or by simulation, for some specific cases (Gaussian distribution for experimental uncertainties, identical distributions of experiments and simulations, particular discrepancies). Then, some natural extensions to multivariate outputs are proposed, with guidelines for practical use depending on the objectives of the validation (strict/hard or average validation). From this, transformed criteria are proposed to make them more comparable and less sensitive to the dimension of the output. It is shown that these transformations allow for a fairer and more relevant comparison and interpretation of the different criteria. Finally, these criteria are applied to a code dedicated to nuclear material behavior simulation.\",\"PeriodicalId\":48814,\"journal\":{\"name\":\"International Journal for Uncertainty Quantification\",\"volume\":\"33 1\",\"pages\":\"0\"},\"PeriodicalIF\":1.5000,\"publicationDate\":\"2023-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Journal for Uncertainty Quantification\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1615/int.j.uncertaintyquantification.2023046666\",\"RegionNum\":4,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"ENGINEERING, MULTIDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal for Uncertainty Quantification","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1615/int.j.uncertaintyquantification.2023046666","RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, MULTIDISCIPLINARY","Score":null,"Total":0}
Likelihood and depth-based criteria for comparing simulation results with experimental data, in support to validation of numerical simulators
Abstract:
Within the framework of Best-Estimate-Plus-Uncertainty approaches, the assessment of model parameter uncertainties associated with numerical simulators is a key element in safety analysis. The results (or outputs) of the simulation must be compared and validated against experimental values when such data are available. This validation step, as part of the broader Verification, Validation, and Uncertainty Quantification process, is required to ensure reliable use of the simulator for modeling and prediction. This work aims to define quantitative criteria to support this validation for multivariate outputs, while taking into account modeling uncertainties (uncertain input parameters) and experimental uncertainties (measurement uncertainties). For this purpose, different statistical indicators, based on likelihood or statistical depths, are investigated and extended to the multidimensional case. First, the properties of the criteria are studied, either analytically or by simulation, for some specific cases (Gaussian distribution for experimental uncertainties, identical distributions of experiments and simulations, particular discrepancies). Then, natural extensions to multivariate outputs are proposed, with guidelines for practical use depending on the objective of the validation (strict/hard or average validation). From this, transformed criteria are proposed to make them more comparable and less sensitive to the dimension of the output. It is shown that these transformations allow for a fairer and more relevant comparison and interpretation of the different criteria. Finally, these criteria are applied to a code dedicated to nuclear material behavior simulation.
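To make the two families of indicators mentioned in the abstract more concrete, the following is a minimal Python sketch: a likelihood-based criterion that scores an experimental measurement against the distribution of simulation outputs under an assumed Gaussian measurement-error model, and a Mahalanobis-type statistical depth of a multivariate measurement within the cloud of simulation outputs. The function names, the averaged-likelihood estimator, and the choice of the Mahalanobis depth are illustrative assumptions; they do not reproduce the authors' exact criteria.

```python
# Illustrative sketch (not the paper's implementation) of a likelihood-based
# and a depth-based agreement criterion between simulation outputs and an
# experimental measurement.
import numpy as np

def gaussian_likelihood_criterion(simulations, y_exp, sigma_exp):
    """Average Gaussian likelihood of the scalar measurement y_exp given each
    simulation output, assuming y_exp = y_sim + eps with eps ~ N(0, sigma_exp^2).
    `simulations` is a 1D array of code outputs obtained by propagating the
    uncertain input parameters."""
    var = sigma_exp ** 2
    dens = np.exp(-0.5 * (y_exp - simulations) ** 2 / var) / np.sqrt(2.0 * np.pi * var)
    return dens.mean()

def mahalanobis_depth(simulations, y_exp):
    """Mahalanobis statistical depth of the multivariate measurement y_exp
    within the cloud of simulation outputs (n_sim x d array). Values close
    to 1 indicate a central, hence plausible, measurement."""
    mu = simulations.mean(axis=0)
    cov = np.atleast_2d(np.cov(simulations, rowvar=False))
    diff = np.atleast_1d(y_exp - mu)
    d2 = diff @ np.linalg.solve(cov, diff)  # squared Mahalanobis distance
    return 1.0 / (1.0 + d2)

# Toy usage: 1000 simulation runs of a 3-dimensional output vs. one experiment.
rng = np.random.default_rng(0)
sims = rng.normal(loc=[1.0, 2.0, 0.5], scale=0.1, size=(1000, 3))
y_exp = np.array([1.02, 1.95, 0.52])
print(mahalanobis_depth(sims, y_exp))                                  # depth-based
print(gaussian_likelihood_criterion(sims[:, 0], y_exp[0], sigma_exp=0.05))  # likelihood-based, 1st output
```

Raw criteria of this kind typically shrink as the output dimension grows, which is the motivation for the transformed, dimension-insensitive criteria proposed in the paper.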
About the journal:
The International Journal for Uncertainty Quantification disseminates information of permanent interest in the areas of analysis, modeling, design and control of complex systems in the presence of uncertainty. The journal seeks to emphasize methods that cross stochastic analysis, statistical modeling and scientific computing. Systems of interest are governed by differential equations possibly with multiscale features. Topics of particular interest include representation of uncertainty, propagation of uncertainty across scales, resolving the curse of dimensionality, long-time integration for stochastic PDEs, data-driven approaches for constructing stochastic models, validation, verification and uncertainty quantification for predictive computational science, and visualization of uncertainty in high-dimensional spaces. Bayesian computation and machine learning techniques are also of interest for example in the context of stochastic multiscale systems, for model selection/classification, and decision making. Reports addressing the dynamic coupling of modern experiments and modeling approaches towards predictive science are particularly encouraged. Applications of uncertainty quantification in all areas of physical and biological sciences are appropriate.