Existing infrared thermography detection of void defects in external prestressed pipelines involves a wide variety of test conditions, making it impractical to explore the applicable conditions exhaustively by experiment. To address this issue, the key parameters of a numerical model of hydration heat transfer in the grouting material of prestressed pipes were established by fitting simulation results to field experiments. Simulation models were then constructed under various conditions to investigate the factors affecting the detection of void defects by infrared thermal imaging: the presence or absence of steel strands, the size of the void defect, and the material and wall thickness of the pipe. The results demonstrate that the presence of steel strands reduces defect identifiability, with the maximum thermal contrast (MaxΔT) decreasing by 1.117 °C in high-density polyethylene (HDPE) pipes with a 100% void area. Galvanized steel pipes (GSP) are more difficult to inspect than HDPE pipes owing to their lower emissivity; for a GSP pipe with a 60% void area, MaxΔT is 18.96% lower than for the corresponding HDPE pipe. As the void size increases, defect identifiability gradually improves, and void defects with an area ratio larger than 26% can be detected. For both pipe types, the infrared detection time window narrows as the wall thickness increases, with the most pronounced reduction observed for 30% void defects. This study provides a reference and theoretical basis for the infrared thermal imaging detection of void defects in externally prestressed pipes.
