Xiaohua Bao, Jiazhi Huang, Jun Shen, Xianlong Wu, Tao Wang, Xiangsheng Chen, Hongzhi Cui
A novel method of void detection in rebar-affected areas based on transfer learning and improved YOLOv8

Tunnelling and Underground Space Technology, Volume 158, Article 106440
DOI: 10.1016/j.tust.2025.106440
Published: 2025-01-31 | Impact Factor: 6.7 | JCR: Q1 (Construction & Building Technology)
URL: https://www.sciencedirect.com/science/article/pii/S0886779825000781
Cited by: 0
Abstract
The rebar mesh inside the tunnel lining introduces significant interference in defect detection, reducing defect visibility in GPR images. This study proposes a global-to-local secondary recognition method based on an improved YOLOv8 model to address this challenge. Two datasets—global and local GPR images—were used, with an attention mechanism integrated into the YOLOv8 architecture. The improved YOLOv8 architecture increased the mean Average Precision (mAP) by 9.36% and 3.86% on the two datasets, respectively. Optimal performance was achieved with a rebar spacing of 0.4 m, yielding a secondary recognition confidence of 0.867, while a rebar-defect distance of 1.20 m yielded a confidence of 0.858. The model accurately identified the void defect shapes. Compared to traditional rebar signal suppression methods, this approach simplifies data processing, enhances accuracy, and reduces training costs. A tunnel field case further validated the method, boosting GPR image recognition confidence from 0.37 to 0.73 and significantly improving the automated detection of tunnel lining defects.
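The global-to-local secondary recognition described in the abstract can be sketched as a two-stage control flow: a global detector proposes candidate regions, each region is expanded and cropped, and a local detector re-examines the crop. The sketch below is a minimal, hypothetical illustration of that flow only; `global_model`, `local_model`, and the `margin` value are stand-ins and do not reproduce the paper's improved YOLOv8 networks or attention mechanism.

```python
def expand_box(box, margin, img_w, img_h):
    """Expand a (x1, y1, x2, y2) box by a pixel margin, clipped to the image."""
    x1, y1, x2, y2 = box
    return (max(0, x1 - margin), max(0, y1 - margin),
            min(img_w, x2 + margin), min(img_h, y2 + margin))

def secondary_recognition(image_size, global_model, local_model, margin=16):
    """Stage 1: global detection proposes candidate regions.
    Stage 2: each expanded crop is re-detected by the local model,
    and the higher of the two confidences is kept."""
    img_w, img_h = image_size
    results = []
    for box, conf in global_model():
        crop = expand_box(box, margin, img_w, img_h)
        for local_box, local_conf in local_model(crop):
            results.append((local_box, max(conf, local_conf)))
    return results
```

For example, a low-confidence global hit can be confirmed at higher confidence by the local pass (mirroring the 0.37 to 0.73 improvement reported in the field case, though the numbers here are illustrative):

```python
global_model = lambda: [((10, 10, 50, 50), 0.37)]          # one candidate region
local_model = lambda crop: [(crop, 0.73)]                  # dummy local detector
secondary_recognition((100, 100), global_model, local_model)
```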
Journal Introduction
Tunnelling and Underground Space Technology is an international journal which publishes authoritative articles encompassing the development of innovative uses of underground space and the results of high quality research into improved, more cost-effective techniques for the planning, geo-investigation, design, construction, operation and maintenance of underground and earth-sheltered structures. The journal provides an effective vehicle for the improved worldwide exchange of information on developments in underground technology - and the experience gained from its use - and is strongly committed to publishing papers on the interdisciplinary aspects of creating, planning, and regulating underground space.