Construction is a major resource-consuming and GHG-emitting sector that must change its practices to align with climate goals. This transition is supported by the emergence of new materials and digital fabrication techniques such as robotic filament winding of bio-based fibre-polymer composites (FPC). This recent technique allows for the efficient use of FPC by placing filaments where they are structurally needed, reducing fabrication waste and offering large design flexibility. Despite these benefits, applications of fibre-wound bio-composites (FW-BC) have been limited by the difficulty of proving the safety of such structures. Past examples relied on iterative design processes, including full-scale testing of components, a major barrier to wider implementation. Probabilistic reliability assessments are a promising alternative but are challenging due to the complex interaction of parameters at the four composite levels and their associated uncertainties. These uncertainties are large for FW-BC because of the natural origin of the fibre and the variability introduced by robotic fabrication. The present work proposes a strategy for characterising the uncertainties at each level and understanding their relations across levels as a starting point for a reliability assessment of fibre-wound structures.
Dupas, N., Amudhan, K., & Gil Pérez, M. (2025). Multi-scale uncertainty mapping in fibre-wound bio-composite structures. ce/papers, 8(3-4), 451-458. https://doi.org/10.1002/cepa.3336
The reliability of engineering structures and infrastructure is crucial for their long-term functionality. Extending the lifespan of concrete structures in critical infrastructure is essential for sustainability and resilience. This research explores how reliability-oriented safety factors can achieve this. By analysing normative design rules, particularly variability measures such as the coefficient of variation (COV) and standard deviation of materials, the study identifies significant safety margins that can be utilized to extend service life. It demonstrates how reliability-based methods can integrate these margins into quality control systems. The research highlights the importance of statistical parameters, such as means, standard deviations, and fractile values, in improving assessment precision and reliability. Findings show that semi-probabilistic safety concepts and adherence to standard design and conformity regulations can significantly increase service life. This approach enhances infrastructure durability and safety while promoting sustainable development by optimizing existing engineering assets.
Strauss, A., & Faghfouri, S. (2025). A Reliability-Oriented Approach to the Ensuring Longevity of critical Infrastructure. ce/papers, 8(3-4), 30-33. https://doi.org/10.1002/cepa.3361
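The safety margins discussed above arise from the gap between mean material performance and the characteristic (fractile) value used in design. A minimal sketch of the 5%-fractile computation, assuming a normally distributed strength with known COV (the factor 1.645 is the 5% standard-normal quantile; the numbers are illustrative, not from the paper):

```python
def characteristic_value(mean, cov, k=1.645):
    """5% fractile x_k = mean * (1 - k * COV) for a normally
    distributed material property with known coefficient of variation."""
    return mean * (1.0 - k * cov)

# Illustrative: mean cylinder strength 38 MPa, COV 10%
xk = characteristic_value(38.0, 0.10)
# A tighter COV from quality control raises x_k for the same mean:
xk_tight = characteristic_value(38.0, 0.06)
```

The difference `xk_tight - xk` is the kind of margin that conformity control can make available for service-life extension.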
Ring flange connections are commonly used to connect the tower of offshore wind turbines with monopile foundations. Studs in these ring flange connections are sensitive to fatigue damage accumulation, which is aggravated by gaps between the flanges. Such gaps are inevitable because of imperfections (out-of-flatness) of the flanges. Given the sensitivity of the forces in the studs with respect to the gaps, this study investigates the gaps of ring flange connections and proposes a method to assess the gap height and angle. To this end, the joint distributions of the gap heights and angles are determined for different tower diameters and production tolerances using manufacturing data. The type and parameters of the marginal distributions of the gap heights and angles are determined using the Maximum Likelihood Estimation method. Data censoring is applied to ensure that the tails of the gap height and gap angle distributions are accurately captured. Then, using the censored data and corresponding marginal distributions, the copula of the joint distribution is determined. The results show that the data is best represented by Weibull and Beta distributions for the gap height and angle, and the dependence structure is best described with Clayton copulas.
Baarssen, H., Allaix, D., Leonetti, D., & Maljaars, J. (2025). Assessment of the Distribution of the Gaps Between Ring Flange Connections in Offshore Wind Turbines. ce/papers, 8(3-4), 289-297. https://doi.org/10.1002/cepa.3360
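The study above couples fitted margins through a Clayton copula. A minimal sketch of sampling a Clayton copula by conditional inversion, with uniforms then mapped to a physical margin; theta and all margin parameters here are illustrative assumptions, not the paper's fitted values:

```python
import math
import random

def clayton_sample(theta, n, seed=0):
    """Sample n pairs (u, v) from a Clayton copula via conditional
    inversion. Kendall's tau of Clayton(theta) is theta / (theta + 2)."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        u = rng.random()
        w = rng.random()
        # conditional quantile of V given U = u for Clayton(theta)
        v = (u ** (-theta) * (w ** (-theta / (1.0 + theta)) - 1.0) + 1.0) ** (-1.0 / theta)
        pairs.append((u, v))
    return pairs

sample = clayton_sample(theta=2.0, n=2000)
# Inverse-CDF transform to a hypothetical Weibull gap-height margin (mm):
heights = [1.5 * (-math.log(1.0 - u)) ** (1.0 / 1.8) for u, _ in sample]
```

Clayton concentrates dependence in the lower tail; the paper's data-driven choice of copula family and censoring scheme is of course more involved.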
High-precision, image-based intelligent crack detection has gained significant attention in the structural health monitoring of concrete structures. Convolutional neural networks (CNNs) are widely used for automatic crack detection due to their high accuracy and efficiency, enabling engineers to accelerate the detection process and take timely corrective actions. However, selecting optimal hyperparameters for CNNs during network training is a challenging task that greatly influences classification accuracy. The traditional trial-and-error approach for hyperparameter selection is both time-consuming and inefficient, necessitating automated methods to achieve optimal performance. In recent years, various optimization techniques have been increasingly adopted for this purpose. However, given the vast number of available methods, identifying an algorithm that balances both accuracy and computational efficiency remains a significant challenge. This study presents a comprehensive comparison of conventional probabilistic and deterministic methods for CNN hyperparameter selection. The findings indicate that incorporating stochastic methods alongside CNNs during the training process for segmenting crack images in concrete structures yields superior performance compared to the investigated deterministic approaches.
Jafariasl, J., Spyridis, P., & Ploennigs, J. (2025). Automated Screening for Cracks in Concrete Structures Using an Optimized Convolutional Neural Network. ce/papers, 8(3-4), 368-373. https://doi.org/10.1002/cepa.3351
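The stochastic-versus-deterministic contrast above can be illustrated without training a CNN: compare grid search and random search on a cheap surrogate objective. The surrogate function, ranges, and optimum location below are hypothetical stand-ins for an expensive training run:

```python
import math
import random

def val_accuracy(lr, dropout):
    """Hypothetical surrogate for CNN validation accuracy;
    peaks near lr = 1e-3, dropout = 0.3 (illustrative only)."""
    return math.exp(-((math.log10(lr) + 3.0) ** 2) / 2.0
                    - ((dropout - 0.3) ** 2) / 0.02)

def grid_search(n_per_axis):
    """Deterministic: evaluate a fixed n_per_axis x n_per_axis grid."""
    lrs = [10.0 ** (-5.0 + 4.0 * i / (n_per_axis - 1)) for i in range(n_per_axis)]
    drops = [0.5 * i / (n_per_axis - 1) for i in range(n_per_axis)]
    return max((val_accuracy(lr, d), lr, d) for lr in lrs for d in drops)

def random_search(budget, seed=1):
    """Stochastic: sample the same ranges uniformly (log-uniform for lr)."""
    rng = random.Random(seed)
    best = (-1.0, None, None)
    for _ in range(budget):
        lr, d = 10.0 ** rng.uniform(-5.0, -1.0), rng.uniform(0.0, 0.5)
        best = max(best, (val_accuracy(lr, d), lr, d))
    return best

best_grid = grid_search(4)     # 16 deterministic evaluations
best_rand = random_search(16)  # 16 stochastic evaluations, same budget
```

With equal budgets, random search is not tied to grid spacing, which is one intuition behind the stochastic methods favoured in the study.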
Glazing is increasingly used as a structural element in modern buildings. Current standards require characterizing this material's fracture strength (fc) distribution with a minimum of 30 “valid” tests (i.e., samples with fracture initiation in the inner ring area of a co-axial double ring test), excluding “invalid” test results. However, “invalid” tests are not incorrect or faulty; rather, they represent censored data that still contain information which can be exploited for enhanced accuracy. Thus, this study applies Bayesian updating to extract and incorporate such information in glazing fracture strength characterization. This approach is demonstrated using data from a recent experimental study which adopted coaxial double-ring standardized tests for fracture strength characterization at 25 °C and 275 °C. For this case study, the paper also examines how the coefficient of variation of the 5% quantile of fc changes as a function of the number of tests executed. The proposed methodology reduces the uncertainty in estimating the characteristic value of glazing fracture strength and improves testing efficiency. Indeed, it achieves, with fewer tests, the same confidence level as that implied by performing the number of tests required by the standard, particularly at elevated temperatures.
Peng, M., Franchini, A., Jovanović, B., & Van Coile, R. (2025). Exploiting invalid test results for assessing the distribution of glazing fracture strength. ce/papers, 8(3-4), 144-151. https://doi.org/10.1002/cepa.3342
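The censoring idea above can be sketched with a grid-based Bayesian update: valid tests contribute through the likelihood density, while each invalid test contributes through the survival function, since the inner-ring area survived at least the stress at which the outer-ring fracture occurred. The normal strength model, sigma, prior range, and all test values are illustrative assumptions, not the paper's data:

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def normal_sf(x, mu, sigma):
    """P(X > x) for a normal variable, via the complementary error function."""
    return 0.5 * math.erfc((x - mu) / (sigma * math.sqrt(2.0)))

def posterior_mean(valid, censored, sigma=8.0):
    """Grid-based Bayesian update of the mean fracture strength (MPa),
    flat prior on [40, 100]; censored tests enter via the survival function."""
    mus = [40.0 + 0.1 * i for i in range(601)]
    weights = []
    for mu in mus:
        like = 1.0
        for x in valid:
            like *= normal_pdf(x, mu, sigma)
        for c in censored:
            like *= normal_sf(c, mu, sigma)
        weights.append(like)
    z = sum(weights)
    return sum(m * w for m, w in zip(mus, weights)) / z

valid_tests = [62.0, 70.0, 66.0, 75.0]  # fracture initiated in the inner ring
invalid_tests = [68.0, 72.0]            # fracture outside: right-censored info
m_without = posterior_mean(valid_tests, [])
m_with = posterior_mean(valid_tests, invalid_tests)
```

Including the censored tests shifts the posterior mean upward here, reflecting the extra survival information that the standard's exclusion rule would discard.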
Sensitivity analysis is a pivotal tool in civil engineering, facilitating the identification of influential parameters in complex systems and aiding model optimization as well as interpretability and explainability, especially in the fields of machine learning and nonlinear Finite Element modeling. This paper explores the comparative efficacy of Sobol and Shapley sensitivity analysis methods in addressing civil engineering challenges, with a focus on evaluating their applicability and reliability in real-world scenarios. Sobol's method, based on variance decomposition, provides a comprehensive measure of both individual and interaction effects of input variables. Meanwhile, the Shapley method, rooted in cooperative game theory, offers a fair allocation of variable contributions, particularly in nonlinear and interdependent systems. To demonstrate their effectiveness, this study proposes case studies involving the performance analysis of fiber-reinforced concrete (FRC). Using these case studies, the sensitivity results derived from the Sobol and Shapley methods are compared in terms of computational efficiency, accuracy in capturing parameter interactions, and interpretation of results. This study underscores the strengths and limitations of these sensitivity analysis techniques and their potential to enhance the design and performance evaluation of civil engineering structures, particularly in highly complex settings involving uncertainty and correlation. The proposed framework serves as a guide for selecting appropriate sensitivity methods based on problem-specific requirements, advancing robust and reliable uncertainty management.
Mellios, N., & Spyridis, P. (2025). Comparative Analysis of Sobol and Shapley Methods for Sensitivity Analysis in Civil Engineering: Case Studies on Fibre-Reinforced Concrete Performance. ce/papers, 8(3-4), 345-351. https://doi.org/10.1002/cepa.3363
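The variance-decomposition idea behind Sobol's method can be shown on a toy model with known indices. A minimal pick-freeze Monte Carlo estimator (a sketch under the assumption of independent uniform inputs; dedicated libraries such as SALib implement this more robustly):

```python
import random

def sobol_first_order(f, d, n=20000, seed=42):
    """Saltelli-style pick-freeze estimator of first-order Sobol indices:
    S_i = E[f(B) * (f(AB_i) - f(A))] / Var(f), where AB_i is matrix A
    with its i-th column swapped in from matrix B."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(d)] for _ in range(n)]
    B = [[rng.random() for _ in range(d)] for _ in range(n)]
    fA = [f(x) for x in A]
    fB = [f(x) for x in B]
    mean = sum(fA) / n
    var = sum((y - mean) ** 2 for y in fA) / n
    S = []
    for i in range(d):
        fABi = [f(a[:i] + [b[i]] + a[i + 1:]) for a, b in zip(A, B)]
        S.append(sum(fb * (fab - fa)
                     for fb, fab, fa in zip(fB, fABi, fA)) / (n * var))
    return S

# Toy linear response Y = 2*X1 + X2 on [0,1]^2, with analytical
# first-order indices S1 = 4/5 and S2 = 1/5
S1, S2 = sobol_first_order(lambda x: 2.0 * x[0] + x[1], d=2)
```

For a linear model the first-order indices sum to one; interactions in a nonlinear FRC model would leave a gap, which is where the Shapley allocation becomes attractive.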
The design of coastal protection structures requires design parameters that accurately represent the hydrodynamic conditions along the coast. Currently, these input variables are based on univariate probability models, which do not take into account the joint probability of water levels and waves. Bivariate modeling of the probability with Copula models offers an alternative.
Copulas can be used to describe the non-linear dependencies between water level and wave height and to calculate joint probabilities of occurrence. However, the application of this methodology places greater demands on the underlying data. As the data available in the study area does not meet the requirements, statistical methods are used to generate the data. First, various Copulas are adapted to physically consistent combinations of water level and wave height extracted from storm surge events and validated. Next, the Copulas are used to calculate design water levels and wave heights for selected return intervals. The bivariate design parameters are compared with the univariate ones in a simplified design example for wave run-up on a dike.
The validation of various models shows that the Frank Copula best describes the dependency structure. For the same return intervals, the bivariate design parameters are lower than those determined with the univariate method. The available data allow only a limited application of the Copulas for design issues in the study area. Nevertheless, Copulas have the potential to replace the univariate methods for determining the design parameters.
Kaehler, C., & Saathoff, F. (2025). Determination of the hydrodynamic design parameters water level and wave height using Copula models for the design of coastal protection structures on the Baltic Sea of Germany. ce/papers, 8(3-4), 41-50. https://doi.org/10.1002/cepa.3359
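The joint-probability calculation described above can be sketched directly from the Frank copula's closed form. The dependence parameter theta = 5.0 and the 99% quantiles are illustrative assumptions, not the fitted values from the study:

```python
import math

def frank_cdf(u, v, theta):
    """Frank copula C(u, v); theta > 0 models positive dependence."""
    num = (math.exp(-theta * u) - 1.0) * (math.exp(-theta * v) - 1.0)
    return -math.log(1.0 + num / (math.exp(-theta) - 1.0)) / theta

def joint_exceedance(u, v, theta):
    """AND-scenario probability P(U > u and V > v) by inclusion-exclusion."""
    return 1.0 - u - v + frank_cdf(u, v, theta)

# Water level and wave height each at their 99% quantile:
p_dep = joint_exceedance(0.99, 0.99, theta=5.0)
p_ind = 0.01 * 0.01  # what an independence assumption would give
```

Positive dependence makes the joint exceedance much more likely than the independence product, which is why bivariate design values for the same joint return interval come out lower than their univariate counterparts.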
Environmental factors play a critical role in the corrosion of reinforced concrete (RC) structures, directly impacting their durability, safety, and serviceability. Corrosion can lead to increased displacements, cracking, or even structural collapse, while also incurring significant economic costs. These issues are expected to intensify in certain regions due to the effects of climate change on corrosion mechanisms. In this study, the probability of steel depassivation was first estimated using climate variables—temperature, relative humidity, and CO2 concentration—predicted by a machine learning model (Random Forest) trained on historical data. For the propagation phase, the present study employs an alternative Finite Element Method based on Positions (FEMP), using laminated frame elements. The effect of corrosion-induced reduction of the steel area was incorporated into the model to simulate long-term degradation of RC elements. Monte Carlo simulation was used to compute the failure probabilities. The proposed method was tested under various environmental conditions for RC structures located in Brazil. The results demonstrate significant regional variation in depassivation times and failure probabilities, with nearly a 10% increase in SLS failure probability 60 years after depassivation. The study highlights the critical influence of macroclimatic variables on corrosion progression and structural reliability, suggesting that current design codes may not fully capture localized environmental effects.
Teodoro, C. P., Bastidas-Arteaga, E., & Carrazedo, R. (2025). Probabilistic Corrosion Impact Analysis Under a Changing Climate: A Numerical Model for Reinforced Concrete Structures. ce/papers, 8(3-4), 21-29. https://doi.org/10.1002/cepa.3340
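The Monte Carlo step above can be illustrated on a drastically simplified limit state, P(R - S < 0), where corrosion reduces the mean resistance. The Gaussian variables and all numbers below are illustrative assumptions, not the paper's FEMP-based model:

```python
import random

def failure_probability(corrosion_loss, n=200_000, seed=7):
    """Crude Monte Carlo estimate of Pf = P(R - S < 0) for a member whose
    mean resistance is reduced by the given corrosion-loss fraction."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        R = rng.gauss(300.0 * (1.0 - corrosion_loss), 30.0)  # resistance, kN
        S = rng.gauss(180.0, 40.0)                           # load effect, kN
        if R - S < 0.0:
            failures += 1
    return failures / n

pf_intact = failure_probability(0.0)
pf_corroded = failure_probability(0.10)  # assumed 10% loss after propagation
```

Even a modest loss of mean resistance moves the reliability index noticeably (here from about 2.4 to 1.8), which is the mechanism behind the reported growth of SLS failure probability over the propagation phase.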
One of the essential components for the reliability assessment of existing bridges is the collection of Weigh-In-Motion (WIM) observations. These observations provide valuable data on traffic composition, including vehicle loads and individual axle distances. However, at locations where WIM stations are not present, probabilistic predictive models are required to assess the uncertainty in the traffic flows and traffic loads. In this study, we investigate the use of Gaussian copula-based Bayesian Networks (GCBN) to create a virtual dataset of WIM observations. This dataset is termed “virtual” because it has never been measured at any specific location. Given the uncertainty in inter-vehicle distances, we propose conceptualizing the flow of vehicles as a series of convoys. For traffic composition, vehicle types are sampled from WIM datasets based on the assumption that these datasets represent the variability of vehicle loads. This virtual dataset is then employed to assess the impact of traffic loads on bridges within the Dutch motorway network. Results from the approach utilized confirmed the suitability of the proposed GCBN for generating a virtual dataset that closely reflects the expected traffic composition.
Miguel Angel Mendoza-Lugo, Diego Lorenzo Allaix, Benjamin Cerar, Liesette la Gasse: “Virtual WIM datasets for the assessment of bridge-specific traffic load effects”. ce/papers 8(3-4), 227–233 (2025). https://doi.org/10.1002/cepa.3318
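The core of the Gaussian copula approach above can be sketched in a few lines: draw correlated standard-normal samples, map them to uniforms through the normal CDF, then push the uniforms through the marginal inverse CDFs of the traffic variables. The minimal sketch below uses two illustrative variables (gross vehicle weight and axle spacing) with an assumed rank correlation and assumed lognormal/gamma marginals; none of these values are taken from the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Assumed correlation between gross vehicle weight (GVW) and axle
# spacing -- an illustrative value, not from the paper.
rho = 0.6
R = np.array([[1.0, rho], [rho, 1.0]])

# Step 1: draw correlated standard-normal samples.
z = rng.multivariate_normal(mean=np.zeros(2), cov=R, size=10_000)

# Step 2: map to uniforms through the standard normal CDF (the Gaussian copula).
u = stats.norm.cdf(z)

# Step 3: push the uniforms through assumed marginal inverse CDFs.
gvw_kN = stats.lognorm(s=0.4, scale=300).ppf(u[:, 0])   # gross weight, kN
spacing_m = stats.gamma(a=8, scale=0.5).ppf(u[:, 1])    # axle spacing, m

# The sampled pairs form a "virtual" WIM record that preserves the
# target dependence in rank terms while matching the chosen marginals.
virtual_wim = np.column_stack([gvw_kN, spacing_m])
```

In a full GCBN the correlation matrix and marginals would be estimated from measured WIM records at instrumented sites, and the network structure would encode which variables are conditionally dependent.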
Elisabeth Stierschneider, Alexios Tamparopoulos, Oliver Zeman, Konrad Bergmeister
The product qualification of bonded fasteners is regulated in a European Assessment Document, which includes a comprehensive test program to evaluate the sensitivity of bonded fasteners to different installation and environmental conditions. Sustained load testing, as part of this program, is used for the long-term displacement forecast to assess the creep behaviour of the anchors over time. The prescribed standard procedure for the assessment of sustained load tests is the Findley power-law methodology. The measured displacements are extrapolated to a working life of 50 years and compared with the displacement at loss of adhesion as the limiting value. As a consequence of this power-law extrapolation, the inherent measurement uncertainty of the displacements is also extrapolated over time. The first step in quantifying this influence is the determination of the measurement uncertainty for the considered sustained load testing task and the equipment used for a specific sustained load data set. Based on Monte Carlo simulations with the measured displacements and the combined standard uncertainty of the testing task as input parameters, a large number of displacement curves is generated.
Elisabeth Stierschneider, Alexios Tamparopoulos, Oliver Zeman, Konrad Bergmeister: “Influence of measurement uncertainty inherent in sustained load testing data on the displacement forecast of bonded anchors”. ce/papers 8(3-4), 266–273 (2025). https://doi.org/10.1002/cepa.3328 (open access)
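The procedure described above can be sketched as follows: fit the Findley power law d(t) = d0 + a·t^b to the measured displacement record, then repeatedly perturb the record with the combined standard uncertainty, refit, and extrapolate each fit to 50 years. The scatter of the 50-year forecasts quantifies the influence of the measurement uncertainty. All numerical values below (record length, uncertainty, Findley parameters) are illustrative assumptions, not data from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)

# Findley power law: displacement grows as d0 + a * t**b.
def findley(t, d0, a, b):
    return d0 + a * t**b

# Synthetic "measured" sustained-load record over 2000 h -- illustrative only.
t_meas = np.linspace(1, 2000, 50)
d_meas = findley(t_meas, 0.10, 0.02, 0.35)          # displacement, mm

u_c = 0.005                  # assumed combined standard uncertainty, mm
t_50y = 50 * 365.25 * 24     # 50-year working life in hours

# Monte Carlo: perturb the record, refit, extrapolate to 50 years.
d50 = []
for _ in range(500):
    noisy = d_meas + rng.normal(0.0, u_c, size=d_meas.size)
    p, _ = curve_fit(findley, t_meas, noisy, p0=(0.1, 0.02, 0.3), maxfev=10_000)
    d50.append(findley(t_50y, *p))
d50 = np.asarray(d50)

# The spread of d50 is the measurement-uncertainty contribution to the
# 50-year displacement forecast, to be compared against the displacement
# at loss of adhesion as the limiting value.
```

Because the extrapolation horizon (50 years) is several orders of magnitude beyond the test duration, even a small uncertainty in the fitted exponent b produces a visibly larger scatter in d50 than in the measured range, which is exactly the amplification effect the paper quantifies.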