Establishing the credibility of computational fluid dynamics (CFD) models for multiphase flow applications is increasingly becoming a mainstream requirement. However, the established verification and validation (V&V) standards have been demonstrated primarily for single-phase flow applications, and studies addressing their applicability to multiphase flows have been limited. Hence, their application may not be trivial and may require thorough investigation. We propose to adopt the ASME V&V 20 Standard and explore its applicability to multiphase flows through several extensions that introduce best practices. In the current study, the proposed verification, validation, and uncertainty quantification (VVUQ) framework is presented, and its preliminary application is demonstrated using the simulation of granular discharge through a conical hopper, a configuration commonly employed in industrial processes. As part of the proposed extensions to the V&V methodology, a detailed survey of subject matter experts, including CFD modelers and experimentalists, was conducted. The survey results highlighted the need for a quantitative assessment of importance ranking, in addition to a sensitivity study, before embarking on simulation and experimental campaigns. Hence, a screening study followed by a global sensitivity analysis was performed to identify the most influential parameters for the CFD simulation as the first phase of the process, which is presented in this paper. The results show that the particle–particle coefficients of restitution and friction are the most important parameters for the granular discharge flow problem chosen to demonstrate the process. Identifying these parameters is important to determine their effect on the quantities of interest and to improve the confidence level in numerical predictions.
Toward the Development of a Verification, Validation, and Uncertainty Quantification Framework for Granular and Multiphase Flows—Part 1: Screening Study and Sensitivity Analysis. A. Gel, A. Vaidheeswaran, Jordan Musser, C. Tong. Journal of Verification, Validation and Uncertainty Quantification, September 2018. doi:10.1115/1.4041745
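The screening-plus-sensitivity workflow described in the abstract above can be illustrated with a Morris-style elementary-effects sketch. The toy discharge-rate function and parameter ranges below are hypothetical stand-ins (not the paper's hopper model or MFiX setup), chosen only to show how a mean-absolute-effect ranking is computed:

```python
import random
import statistics

def discharge_rate(e_pp, mu_pp, mu_pw, d_p):
    # Hypothetical response: discharge rate penalized by particle-particle
    # friction (mu_pp) and restitution (e_pp); illustrative only.
    return 10.0 * (1.0 - 0.8 * mu_pp) * (1.0 - 0.5 * e_pp) \
                * (1.0 - 0.1 * mu_pw) * (1.0 + 0.05 * d_p)

def elementary_effects(model, bounds, r=20, delta=0.1, seed=0):
    """Morris one-at-a-time screening: mean |elementary effect| per parameter."""
    rng = random.Random(seed)
    effects = {n: [] for n in bounds}
    for _ in range(r):
        # random base point, leaving room for the one-at-a-time step
        x = {n: lo + rng.random() * (hi - lo) * (1 - delta)
             for n, (lo, hi) in bounds.items()}
        y0 = model(**x)
        for n, (lo, hi) in bounds.items():
            xp = dict(x)
            xp[n] += delta * (hi - lo)          # perturb one input at a time
            effects[n].append(abs(model(**xp) - y0) / delta)
    return {n: statistics.mean(v) for n, v in effects.items()}

bounds = {"e_pp": (0.5, 0.95), "mu_pp": (0.1, 0.6),
          "mu_pw": (0.1, 0.6), "d_p": (0.5, 2.0)}
mu_star = elementary_effects(discharge_rate, bounds)
ranking = sorted(mu_star, key=mu_star.get, reverse=True)
```

With this toy model, particle–particle friction and restitution dominate the ranking, mirroring the paper's conclusion; in a real campaign each model call would be a full CFD run per sample.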
When making computational simulation predictions of multiphysics engineering systems, sources of uncertainty in the prediction need to be acknowledged and included in the analysis within the current paradigm of striving for simulation credibility. A thermal analysis of an aerospace geometry was performed at Sandia National Laboratories. For this analysis, a verification, validation, and uncertainty quantification (VVUQ) workflow provided structure, resulting in the quantification of significant uncertainty sources including spatial numerical error and material property parametric uncertainty. It was hypothesized that the parametric uncertainty and numerical errors were independent and separable for this application. This hypothesis was supported by performing uncertainty quantification (UQ) simulations at multiple mesh resolutions, while resource constraints limited the number of medium- and high-resolution simulations. Based on this supported hypothesis, a prediction combining parametric uncertainty with a systematic mesh bias is used to make a margin assessment that avoids unnecessary uncertainty obscuring the results and optimizes the use of computing resources.
Separability of Mesh Bias and Parametric Uncertainty for a Full System Thermal Analysis. Benjamin Schroeder, H. Silva, K. Smith. Journal of Verification, Validation and Uncertainty Quantification, July 2018. doi:10.1115/VVS2018-9339
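The separability argument pairs a mesh-bias estimate from grid refinement with a parametric UQ sample. A minimal sketch, with invented grid sizes, temperatures, and a one-parameter surrogate in place of the Sandia thermal model:

```python
import math
import random
import statistics

# Hypothetical coarse/medium/fine peak temperatures from a grid-refinement
# study with refinement ratio 2 (invented numbers, not the Sandia analysis).
h = [4.0, 2.0, 1.0]            # representative cell sizes
T = [512.0, 506.0, 504.5]      # simulated peak temperature at each resolution

# Observed order of accuracy and Richardson-extrapolated estimate
p = math.log((T[0] - T[1]) / (T[1] - T[2])) / math.log(2.0)
T_exact = T[2] + (T[2] - T[1]) / (2.0 ** p - 1.0)
mesh_bias = T[2] - T_exact     # systematic error remaining on the fine mesh

# Parametric UQ at a single resolution; the separability hypothesis lets the
# mesh bias be applied afterward as an additive correction.
rng = random.Random(1)
def surrogate(k):              # hypothetical response to a conductivity k
    return 504.5 + 25.0 * (k - 1.0)

samples = [surrogate(rng.gauss(1.0, 0.05)) for _ in range(2000)]
corrected = [t - mesh_bias for t in samples]   # bias-shifted prediction
```

Here the observed order works out to 2 and the mesh bias to 0.5; a margin assessment would then be made against the bias-corrected parametric distribution.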
The purpose of this paper is not to present new results; rather, it is to show that the current approach to model validation is not consistent with the accepted mathematics of probability theory. Specifically, we argue that the Sandia V&V Challenge Problem is ill-posed in that the answers sought do not, mathematically, exist. We apply our arguments to show the types of mistakes present in the papers published in the Journal of Verification, Validation and Uncertainty Quantification, Volume 1, along with the challenge problem. Further, we argue that, when the problem is properly posed, both the applicable methodology and the solution techniques are easily drawn from the well-developed mathematics of probability and decision theory. The unfortunate aspect of the challenge problem as currently stated is that it leads to incorrect and inappropriate mathematical approaches that should be avoided and corrected in the current literature.
Models, Uncertainty, and the Sandia V&V Challenge Problem. G. Hazelrigg, G. Klutke. Journal of Verification, Validation and Uncertainty Quantification, July 2018. doi:10.1115/VVS2018-9308
M. Diez, R. Broglia, D. Durante, A. Olivieri, E. Campana, F. Stern
The objective of this work is to provide and use both experimental fluid dynamics (EFD) data and computational fluid dynamics (CFD) results to validate a regular-wave uncertainty quantification (UQ) model of ship response in irregular waves, based on a set of stochastic regular waves with variable frequency. As a secondary objective, preliminary statistical studies are required to assess EFD and CFD irregular wave errors and uncertainties versus theoretical values and to evaluate EFD and CFD resistance and motion uncertainties and, in the latter case, errors versus EFD values. UQ methods include analysis of the autocovariance matrix and block-bootstrap of time-series values (primary variable). Additionally, the height (secondary variable) associated with the mean-crossing period is assessed by the bootstrap method. Errors and confidence intervals of statistical estimators are used to define validation criteria. The application is a two-degrees-of-freedom (heave and pitch) towed Delft catamaran with a length between perpendiculars equal to 3 m (scale factor equal to 33), sailing at a Froude number equal to 0.425 in head waves at scaled sea state 5. Validation variables are x-force, heave and pitch motions, vertical acceleration of the bridge, and vertical velocity of the flight deck. Autocovariance and block-bootstrap methods for primary variables provide consistent and complementary results: the autocovariance is used to assess the uncertainty associated with expected values and standard deviations and is able to identify undesired self-repetition in the irregular wave signal; block-bootstrap methods are used to assess additional statistical estimators such as the mode and quantiles. Secondary variables are used for an additional assessment of the quality of experimental and simulation data, as they are generally more difficult to model and predict than primary variables.
Finally, the regular wave UQ model provides a good approximation of the desired irregular wave statistics, with average errors smaller than 5% and validation uncertainties close to 10%.
Statistical Assessment and Validation of Experimental and Computational Ship Response in Irregular Waves. Journal of Verification, Validation and Uncertainty Quantification, June 2018. doi:10.1115/1.4041372
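The block-bootstrap procedure for correlated time series used above can be sketched as follows; the AR(1) record stands in for a measured heave signal (synthetic data, not the Delft catamaran series):

```python
import random
import statistics

def block_bootstrap_ci(x, block=25, n_boot=500, alpha=0.05, seed=2):
    """Moving-block bootstrap confidence interval for the mean of a correlated
    time series: whole blocks are resampled to preserve autocorrelation."""
    rng = random.Random(seed)
    n = len(x)
    means = []
    for _ in range(n_boot):
        resampled = []
        while len(resampled) < n:
            s = rng.randrange(n - block + 1)   # random block start
            resampled.extend(x[s:s + block])
        means.append(statistics.mean(resampled[:n]))
    means.sort()
    return (means[int(n_boot * alpha / 2)],
            means[int(n_boot * (1 - alpha / 2)) - 1])

# Synthetic correlated record: AR(1) noise as a stand-in for EFD/CFD data
rng = random.Random(0)
x = [0.0]
for _ in range(1999):
    x.append(0.8 * x[-1] + rng.gauss(0.0, 0.6))

lo, hi = block_bootstrap_ci(x)
```

The interval (lo, hi) is the kind of estimator uncertainty the paper compares against validation criteria; an i.i.d. bootstrap on the same series would understate it.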
Uncertainty quantification (UQ) is gaining in maturity and importance in engineering analysis. While historical engineering analysis and design methods have relied heavily on safety factors (SF) with built-in conservatism, modern approaches require detailed assessment of reliability to provide optimized and balanced designs. This paper presents methodologies that support the transition toward this type of approach. Fundamental concepts are described for UQ in general engineering analysis. These include consideration of the sources of uncertainty and their categorization. Of particular importance are the categorization of aleatory and epistemic uncertainties and their separate propagation through a UQ analysis. This familiar concept is referred to here as a "two-dimensional" approach, and it provides for the assessment of both the probability of a predicted occurrence and the credibility in that prediction. Unique to the approach presented here is the adaptation of the concept of a bounding probability box to that of a credible probability box. This requires estimates of probability distributions for all uncertainties, both aleatory and epistemic. The propagation of these distributions through the uncertainty analysis provides for the assessment of probability related to the system response, along with a quantification of credibility in that prediction. Details of a generalized methodology for UQ in this framework are presented, and approaches for interpreting results are described. Illustrative examples are presented.
A General Methodology for Uncertainty Quantification in Engineering Analyses Using a Credible Probability Box. M. E. Ewing, B. Liechty, D. L. Black. Journal of Verification, Validation and Uncertainty Quantification, June 2018. doi:10.1115/1.4041490
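The two-dimensional (nested) sampling behind a credible probability box can be sketched as an outer epistemic loop over an inner aleatory loop; assigning a distribution (rather than an interval) to the epistemic variable is what makes the box "credible" rather than "bounding." The strength and load numbers below are invented for illustration:

```python
import random
import statistics

rng = random.Random(3)

# Outer loop: epistemic uncertainty in the mean strength, expressed as a
# distribution (credible p-box) rather than an interval (bounding p-box).
epistemic_means = [rng.gauss(50.0, 1.5) for _ in range(50)]

curves = []
for sm in epistemic_means:
    # Inner loop: aleatory variability (material scatter and load) for one
    # fixed epistemic realization; each pass yields one CDF of the margin.
    margins = [rng.gauss(sm, 2.0) - rng.uniform(40.0, 55.0) for _ in range(400)]
    curves.append(sum(m < 0.0 for m in margins) / len(margins))  # P(failure)

p_lo, p_hi = min(curves), max(curves)   # p-box bounds on failure probability
p_med = statistics.median(curves)       # central credibility summary
```

Plotting the family of inner-loop CDFs would trace out the p-box; here only the failure-probability cut through it is reported.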
U. Otgonbaatar, E. Baglietto, Y. Caffari, N. Todreas, G. Lenci
In this work, a general methodology and innovative framework to characterize and quantify the representativeness uncertainty of performance indicator measurements of power generation systems are proposed. The representativeness uncertainty refers to the difference between a measured value of a performance indicator quantity and its reference true value; it arises from the inherent variability of the quantity being measured. The main objectives of the methodology are to characterize and reduce the representativeness uncertainty by adopting numerical simulation in combination with experimental data and to improve the physical description of the measurement. The methodology is applied to an industrial case study for demonstration. The case study involves a computational fluid dynamics (CFD) simulation of an orifice plate-based mass flow rate measurement, using a commercially available package. Using the insight obtained from the CFD simulation, the representativeness uncertainty in the mass flow rate measurement is quantified, and the associated random uncertainties are comprehensively accounted for. Both parametric and nonparametric implementations of the methodology are illustrated. The case study also illustrates how the methodology is used to quantitatively test the level of statistical significance of the CFD simulation result after accounting for the relevant uncertainties.
A Methodology for Characterizing Representativeness Uncertainty in Performance Indicator Measurements of Power Generating Systems. Journal of Verification, Validation and Uncertainty Quantification, June 2018. doi:10.1115/1.4041687
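The notion of representativeness uncertainty (a local reading standing in for an area-averaged true value) can be shown with a toy pipe-flow example; the 1/7th-power-law velocity profile and mid-radius probe location are assumptions for illustration, not the paper's orifice-plate case:

```python
import math

# Hypothetical turbulent pipe profile (1/7th power law), unit centerline velocity
def u(r, R=1.0):
    return (1.0 - r / R) ** (1.0 / 7.0)

# "True" value: area-averaged velocity over the cross section, by midpoint
# quadrature in the radial coordinate (weight 2*pi*r, normalized by pi*R^2).
N = 100_000
area_avg = sum(u((i + 0.5) / N) * 2.0 * ((i + 0.5) / N) * (1.0 / N)
               for i in range(N))

point_reading = u(0.5)                      # single probe at mid-radius
representativeness_error = point_reading - area_avg
```

For the 1/7th-power law the area average is 2n^2/((n+1)(2n+1)) = 98/120 of the centerline value, so the mid-radius probe overreads by roughly 9 percent; a CFD-resolved profile plays the role of the "true" field in the paper's methodology.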
Andrew Atkinson, R. Hill, J. Pignatiello, G. Vining, E. White, E. Chicken
Model verification and validation (V&V) remain critical steps in the simulation model development process. A model requires verification to ensure that it has been correctly transitioned from conceptual form to computerized form. A model also requires validation to substantiate that it accurately represents the system it is meant to simulate. Validation assessments are complex when the system and model both generate high-dimensional functional output. To handle this complexity, this paper reviews several wavelet-based approaches for assessing models of this type and introduces a new concept for highlighting the areas of contrast and congruity between system and model data. This concept identifies the individual wavelet coefficients that correspond to the areas of discrepancy between the system and the model.
Exposing System and Model Disparity and Agreement Using Wavelets. Journal of Verification, Validation and Uncertainty Quantification, June 2018. doi:10.1115/1.4041265
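Localizing system/model disagreement in wavelet coefficients can be sketched with a Haar transform; the signals below are synthetic, with a deliberate localized offset injected into the "model":

```python
import math

def haar(x):
    """Full orthonormal Haar DWT; input length must be a power of two.
    Returns [approximation] + detail coefficients, coarsest to finest."""
    out = list(x)
    n = len(out)
    coeffs = []
    while n > 1:
        a = [(out[2 * i] + out[2 * i + 1]) / math.sqrt(2) for i in range(n // 2)]
        d = [(out[2 * i] - out[2 * i + 1]) / math.sqrt(2) for i in range(n // 2)]
        coeffs = d + coeffs     # prepend this level's detail coefficients
        out = a
        n //= 2
    return out + coeffs

system = [math.sin(2 * math.pi * i / 16) for i in range(16)]
# "Model" output: same signal plus a localized disagreement on samples 8..11
model = [s + (0.5 if 8 <= i < 12 else 0.0) for i, s in enumerate(system)]

diff = [abs(a - b) for a, b in zip(haar(system), haar(model))]
flagged = [i for i, d in enumerate(diff) if d > 0.25]  # discrepant coefficients
```

Because the transform is linear, the coefficient differences are exactly the Haar transform of the injected bump, so only the few coefficients whose supports overlap samples 8..11 are flagged; that is the localization property the paper exploits.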
Fractional calculus provides a rigorous mathematical framework for describing anomalous stochastic processes by generalizing classical differential equations to their fractional-order counterparts. By introducing the fractional orders as uncertain variables, we develop an operator-based uncertainty quantification framework in the context of stochastic fractional partial differential equations (SFPDEs) subject to additive random noise. We characterize different sources of uncertainty and then propagate their associated randomness to the system response by employing a probabilistic collocation method (PCM). We develop a fast, stable, and convergent Petrov–Galerkin spectral method in the physical domain to formulate the forward solver used to simulate each realization of the random variables in the sampling procedure.
Operator-Based Uncertainty Quantification of Stochastic Fractional Partial Differential Equations. E. Kharazmi, Mohsen Zayernouri. Journal of Verification, Validation and Uncertainty Quantification, May 2018. doi:10.1115/1.4046093
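A probabilistic collocation step can be sketched with a Gauss-Legendre rule over an uncertain fractional order: the deterministic problem is solved once per node, and weighted sums of the outputs give the moments. The "solver" below is a placeholder decay function, not the paper's Petrov-Galerkin SFPDE solver, and the uniform range for the order is an assumption:

```python
import math

# Uncertain fractional order: alpha ~ Uniform(0.6, 0.9) (illustrative choice)
a_lo, a_hi = 0.6, 0.9

def solver(alpha):
    # Placeholder for a deterministic fractional-PDE solve: a smooth,
    # Mittag-Leffler-like decay evaluated at t = 2 (hypothetical QoI).
    return 1.0 / (1.0 + 2.0 ** alpha / math.gamma(1.0 + alpha))

# 3-point Gauss-Legendre rule on [-1, 1]
nodes = [-math.sqrt(0.6), 0.0, math.sqrt(0.6)]
weights = [5.0 / 9.0, 8.0 / 9.0, 5.0 / 9.0]

# Collocation: one deterministic solve per node, mapped to [a_lo, a_hi];
# dividing the weight sum by 2 converts quadrature to a uniform expectation.
vals = [solver(0.5 * (a_hi - a_lo) * z + 0.5 * (a_hi + a_lo)) for z in nodes]
mean = sum(w * v for w, v in zip(weights, vals)) / 2.0
var = sum(w * v * v for w, v in zip(weights, vals)) / 2.0 - mean ** 2
```

The appeal of PCM is visible even in this sketch: three forward solves reproduce the moments of the output distribution to quadrature accuracy for a smooth response.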
F. Dias, J. Vargas, Sam Yang, V. Kava, W. Balmant, A. Mariano, J. Ordonez
A dynamic physics-based model developed for the prediction of biohydrogen production in a compact tubular photobioreactor (PBR) was calibrated experimentally. The spatial domain in the model was discretized with lumped control volumes, and the principles of classical thermodynamics, mass, species, and heat transfer were combined to derive a system of ordinary differential equations whose solution was the temperature and mass fraction distributions across the entire system. Two microalgae species, namely Acutodesmus obliquus and Chlamydomonas reinhardtii strain cc125, were cultured in triplicate with different culture media via indirect biophotolysis. Measured biomass and hydrogen concentrations were then used to adjust the specific microalgae growth and hydrogen production rates in the model based on the residual sum of squares (RSS) and the direct search method. Despite its simplicity, the presented volume element model was shown to predict both hydrogen and biomass concentrations over time well. The microalgae growth rate for each species was determined as 2.16 μalga,0 s−1 and 0.91 μalga,0 s−1 for A. obliquus and C. reinhardtii strain cc125, respectively, where μalga,0 is the specific growth rate of Scenedesmus almeriensis for a given temperature and irradiance. The adjusted maximum hydrogen production rates for the local nonmutant A. obliquus and for C. reinhardtii strain cc125 were 1.3 × 10−7 s−1 and 4.1 × 10−7 s−1. Consequently, these hydrogen production rates were 59 times and 19 times smaller, respectively, than that for the mutant C. reinhardtii strain cc849.
Experimental Calibration of a Biohydrogen Production Estimation Model. Journal of Verification, Validation and Uncertainty Quantification, May 2018. doi:10.1115/VVS2018-9341
Pub Date: 2017-09-01 (Epub: 2017-10-31). DOI: 10.1115/1.4038175
Brecca M M Gaffney, Cory L Christiansen, Amanda M Murray, Casey A Myers, Peter J Laz, Bradley S Davidson
Joint kinetic measurement is a fundamental tool used to quantify compensatory movement patterns in participants with transtibial amputation (TTA). Joint kinetics are calculated through inverse dynamics (ID) and depend on segment kinematics, external forces, and both segment and prosthetic inertial parameters (PIPs); yet the individual influence of PIPs on ID is unknown. The objective of this investigation was to assess the importance of parameterizing PIPs when calculating ID using a probabilistic analysis. A series of Monte Carlo simulations was performed to assess the influence of uncertainty in PIPs on ID. Multivariate input distributions were generated from experimentally measured PIPs (foot/shank: mass, center of mass (COM), moment of inertia) of ten prostheses, and the output distributions were the hip and knee joint kinetics. Confidence bounds (2.5-97.5%) and sensitivity of outputs to model input parameters were calculated throughout one gait cycle. Results demonstrated that PIPs had a larger influence on joint kinetics during the swing period than the stance period (e.g., maximum hip flexion/extension moment confidence bound size: stance 5.6 N·m, swing 11.4 N·m). Joint kinetics were most sensitive to shank mass during both the stance and swing periods. Accurate measurement of prosthesis shank mass is therefore necessary to calculate joint kinetics with ID in participants with TTA who use passive prostheses consisting of total contact carbon fiber sockets and dynamic elastic response feet during walking.
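The Monte Carlo procedure described above, sampling inertial parameters from input distributions, propagating them through inverse dynamics, and reporting 2.5-97.5% confidence bounds, can be sketched in miniature. The single-segment moment equation, the independent Gaussian distributions, and all numerical values below are illustrative assumptions (the study used multivariate distributions fit to ten measured prostheses and a full-body ID model).

```python
import random

def swing_moment(mass, com, inertia, ang_acc=10.0, g=9.81):
    """Toy single-segment swing-phase moment about the knee (assumption):
    inertial term I*alpha plus gravitational term m*g*d."""
    return inertia * ang_acc + mass * g * com

def monte_carlo_bounds(n=20000, seed=0):
    """Sample inertial parameters, propagate through the moment equation,
    and return the empirical 2.5% and 97.5% bounds of the output."""
    rng = random.Random(seed)
    moments = []
    for _ in range(n):
        mass = rng.gauss(1.5, 0.15)       # shank + prosthesis mass [kg], assumed
        com = rng.gauss(0.20, 0.02)       # COM distance from knee [m], assumed
        inertia = rng.gauss(0.05, 0.005)  # moment of inertia [kg*m^2], assumed
        moments.append(swing_moment(mass, com, inertia))
    moments.sort()
    return moments[int(0.025 * n)], moments[int(0.975 * n)]

if __name__ == "__main__":
    lo, hi = monte_carlo_bounds()
    print(f"2.5-97.5% confidence bound on swing moment: [{lo:.2f}, {hi:.2f}] N·m")
```

The bound width (hi - lo) is the quantity the study compared between stance and swing; a sensitivity ranking follows by correlating each sampled input with the output across the same Monte Carlo draws.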
{"title":"The Effects of Prosthesis Inertial Parameters on Inverse Dynamics: A Probabilistic Analysis.","authors":"Brecca M M Gaffney, Cory L Christiansen, Amanda M Murray, Casey A Myers, Peter J Laz, Bradley S Davidson","doi":"10.1115/1.4038175","DOIUrl":"https://doi.org/10.1115/1.4038175","url":null,"abstract":"<p><p>Joint kinetic measurement is a fundamental tool used to quantify compensatory movement patterns in participants with transtibial amputation (TTA). Joint kinetics are calculated through inverse dynamics (ID) and depend on segment kinematics, external forces, and both segment and prosthetic inertial parameters (PIPS); yet the individual influence of PIPs on ID is unknown. The objective of this investigation was to assess the importance of parameterizing PIPs when calculating ID using a probabilistic analysis. A series of Monte Carlo simulations were performed to assess the influence of uncertainty in PIPs on ID. Multivariate input distributions were generated from experimentally measured PIPs (foot/shank: mass, center of mass (COM), moment of inertia) of ten prostheses and output distributions were hip and knee joint kinetics. Confidence bounds (2.5-97.5%) and sensitivity of outputs to model input parameters were calculated throughout one gait cycle. Results demonstrated that PIPs had a larger influence on joint kinetics during the swing period than the stance period (e.g., maximum hip flexion/extension moment confidence bound size: stance = 5.6 N·m, swing: 11.4 N·m). Joint kinetics were most sensitive to shank mass during both the stance and swing periods. 
Accurate measurement of prosthesis shank mass is necessary to calculate joint kinetics with ID in participants with TTA with passive prostheses consisting of total contact carbon fiber sockets and dynamic elastic response feet during walking.</p>","PeriodicalId":52254,"journal":{"name":"Journal of Verification, Validation and Uncertainty Quantification","volume":"2 3","pages":"0310031-310038"},"PeriodicalIF":0.6,"publicationDate":"2017-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1115/1.4038175","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"40601865","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}