{"title":"利用深度学习对基于计算机断层扫描的径向支气管内超声周边肺部病变进行图像模拟","authors":"Chunxi Zhang, Yongzheng Zhou, Chuanqi Sun, Jilei Zhang, Junxiang Chen, Xiaoxuan Zheng, Ying Li, Xiaoyao Liu, Weiping Liu, Jiayuan Sun","doi":"10.1097/eus.0000000000000079","DOIUrl":null,"url":null,"abstract":"<h3>Background and Objectives </h3>\n<p>Radial endobronchial ultrasound (R-EBUS) plays an important role during transbronchial sampling of peripheral pulmonary lesions (PPLs). However, existing navigational bronchoscopy systems provide no guidance for R-EBUS. To guide intraoperative R-EBUS probe manipulation, we aimed to simulate R-EBUS images of PPLs from preoperative computed tomography (CT) data using deep learning.</p>\n<h3>Materials and Methods </h3>\n<p>Preoperative CT and intraoperative ultrasound data of PPLs in 250 patients who underwent R-EBUS–guided transbronchial lung biopsy were retrospectively collected. Two-dimensional CT sections perpendicular to the biopsy path were transformed into ultrasonic reflection and transmission images using an ultrasound propagation model to obtain the initial simulated R-EBUS images. A cycle generative adversarial network was trained to improve the realism of initial simulated images. Objective and subjective indicators were used to evaluate the similarity between real and simulated images.</p>\n<h3>Results </h3>\n<p>Wasserstein distances showed that utilizing the cycle generative adversarial network significantly improved the similarity between real and simulated R-EBUS images. There was no statistically significant difference in the long axis, short axis, and area between real and simulated lesions (all <em xmlns:mrws=\"http://webservices.ovid.com/mrws/1.0\">P</em> > 0.05). Based on the experts’ evaluation, a median similarity score of ≥4 on a 5-point scale was obtained for lesion size, shape, margin, internal echoes, and overall similarity.</p>\n<h3>Conclusions </h3>\n<p>Simulated R-EBUS images of PPLs generated by our method can closely mimic the corresponding real images, demonstrating the potential of our method to provide guidance for intraoperative R-EBUS probe manipulation.</p>","PeriodicalId":11577,"journal":{"name":"Endoscopic Ultrasound","volume":null,"pages":null},"PeriodicalIF":4.4000,"publicationDate":"2024-08-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Computed tomography–based radial endobronchial ultrasound image simulation of peripheral pulmonary lesions using deep learning\",\"authors\":\"Chunxi Zhang, Yongzheng Zhou, Chuanqi Sun, Jilei Zhang, Junxiang Chen, Xiaoxuan Zheng, Ying Li, Xiaoyao Liu, Weiping Liu, Jiayuan Sun\",\"doi\":\"10.1097/eus.0000000000000079\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<h3>Background and Objectives </h3>\\n<p>Radial endobronchial ultrasound (R-EBUS) plays an important role during transbronchial sampling of peripheral pulmonary lesions (PPLs). However, existing navigational bronchoscopy systems provide no guidance for R-EBUS. To guide intraoperative R-EBUS probe manipulation, we aimed to simulate R-EBUS images of PPLs from preoperative computed tomography (CT) data using deep learning.</p>\\n<h3>Materials and Methods </h3>\\n<p>Preoperative CT and intraoperative ultrasound data of PPLs in 250 patients who underwent R-EBUS–guided transbronchial lung biopsy were retrospectively collected. 
Two-dimensional CT sections perpendicular to the biopsy path were transformed into ultrasonic reflection and transmission images using an ultrasound propagation model to obtain the initial simulated R-EBUS images. A cycle generative adversarial network was trained to improve the realism of initial simulated images. Objective and subjective indicators were used to evaluate the similarity between real and simulated images.</p>\\n<h3>Results </h3>\\n<p>Wasserstein distances showed that utilizing the cycle generative adversarial network significantly improved the similarity between real and simulated R-EBUS images. There was no statistically significant difference in the long axis, short axis, and area between real and simulated lesions (all <em xmlns:mrws=\\\"http://webservices.ovid.com/mrws/1.0\\\">P</em> > 0.05). Based on the experts’ evaluation, a median similarity score of ≥4 on a 5-point scale was obtained for lesion size, shape, margin, internal echoes, and overall similarity.</p>\\n<h3>Conclusions </h3>\\n<p>Simulated R-EBUS images of PPLs generated by our method can closely mimic the corresponding real images, demonstrating the potential of our method to provide guidance for intraoperative R-EBUS probe manipulation.</p>\",\"PeriodicalId\":11577,\"journal\":{\"name\":\"Endoscopic Ultrasound\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":4.4000,\"publicationDate\":\"2024-08-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Endoscopic Ultrasound\",\"FirstCategoryId\":\"3\",\"ListUrlMain\":\"https://doi.org/10.1097/eus.0000000000000079\",\"RegionNum\":1,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"GASTROENTEROLOGY & HEPATOLOGY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Endoscopic Ultrasound","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.1097/eus.0000000000000079","RegionNum":1,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"GASTROENTEROLOGY & HEPATOLOGY","Score":null,"Total":0}
Computed tomography–based radial endobronchial ultrasound image simulation of peripheral pulmonary lesions using deep learning
Background and Objectives
Radial endobronchial ultrasound (R-EBUS) plays an important role during transbronchial sampling of peripheral pulmonary lesions (PPLs). However, existing navigational bronchoscopy systems provide no guidance for R-EBUS. To guide intraoperative R-EBUS probe manipulation, we aimed to simulate R-EBUS images of PPLs from preoperative computed tomography (CT) data using deep learning.
Materials and Methods
Preoperative CT and intraoperative ultrasound data of PPLs in 250 patients who underwent R-EBUS–guided transbronchial lung biopsy were retrospectively collected. Two-dimensional CT sections perpendicular to the biopsy path were transformed into ultrasonic reflection and transmission images using an ultrasound propagation model to obtain the initial simulated R-EBUS images. A cycle generative adversarial network was trained to improve the realism of the initial simulated images. Objective and subjective indicators were used to evaluate the similarity between real and simulated images.
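Although the study's implementation is not reproduced here, the core idea of such an ultrasound propagation model can be illustrated with a short sketch: convert CT Hounsfield units to an approximate acoustic-impedance map, then march radially outward from the probe position within the slice, recording the interface reflection at each step and the energy transmitted onward. The HU-to-impedance mapping, polar sampling, and all function names below are illustrative assumptions, not the authors' method.

```python
# Minimal, illustrative sketch (not the authors' implementation) of turning a
# 2-D CT section into radial reflection and transmission images.
import numpy as np

def hu_to_impedance(hu_slice):
    """Rough mapping from Hounsfield units to acoustic impedance.

    Assumes density scales linearly with HU around water (1000 kg/m^3)
    and a constant soft-tissue speed of sound of 1540 m/s.
    """
    density = 1000.0 * (1.0 + hu_slice / 1000.0)          # kg/m^3 (approximation)
    speed_of_sound = 1540.0                                # m/s (assumption)
    return np.clip(density, 1.0, None) * speed_of_sound    # Z = rho * c

def simulate_radial_ebus(hu_slice, probe_rc, n_rays=360, n_samples=256):
    """Return (reflection, transmission) images sampled on a polar grid.

    hu_slice : 2-D CT section perpendicular to the biopsy path (HU values)
    probe_rc : (row, col) position of the R-EBUS probe within the slice
    """
    impedance = hu_to_impedance(hu_slice.astype(np.float64))
    rows, cols = impedance.shape
    reflection = np.zeros((n_rays, n_samples))
    transmission = np.ones((n_rays, n_samples))

    angles = np.linspace(0.0, 2.0 * np.pi, n_rays, endpoint=False)
    radii = np.linspace(1.0, min(rows, cols) / 2.0, n_samples)

    for i, theta in enumerate(angles):
        z_prev = None
        energy = 1.0                                       # remaining forward energy
        for j, r in enumerate(radii):
            rr = int(round(probe_rc[0] + r * np.sin(theta)))
            cc = int(round(probe_rc[1] + r * np.cos(theta)))
            if not (0 <= rr < rows and 0 <= cc < cols):
                break
            z = impedance[rr, cc]
            if z_prev is not None:
                # Intensity reflection coefficient at the tissue interface
                refl = ((z - z_prev) / (z + z_prev)) ** 2
                reflection[i, j] = energy * refl
                energy *= (1.0 - refl)                     # energy passed onward
            transmission[i, j] = energy
            z_prev = z
    return reflection, transmission
```

The resulting polar reflection and transmission maps would correspond to the "initial simulated R-EBUS images" described above, which a cycle generative adversarial network then refines toward the appearance of real probe data.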
Results
Wasserstein distances showed that utilizing the cycle generative adversarial network significantly improved the similarity between real and simulated R-EBUS images. There was no statistically significant difference in the long axis, short axis, and area between real and simulated lesions (all P > 0.05). Based on the experts’ evaluation, a median similarity score of ≥4 on a 5-point scale was obtained for lesion size, shape, margin, internal echoes, and overall similarity.
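As an illustration of the objective comparison, one common way to compute a Wasserstein distance between a real and a simulated image is to compare their gray-level distributions with scipy.stats.wasserstein_distance. The exact feature representation used in the study is not specified here, so the choice of raw pixel-intensity values in the sketch below is an assumption.

```python
# Hedged sketch: 1-D Wasserstein distance between the gray-level
# distributions of a real and a simulated R-EBUS image.  Comparing raw
# pixel intensities is an assumption for this example, not necessarily
# the representation used in the study.
import numpy as np
from scipy.stats import wasserstein_distance

def image_wasserstein(real_img: np.ndarray, simulated_img: np.ndarray) -> float:
    """Smaller values indicate more similar intensity distributions."""
    return wasserstein_distance(real_img.ravel().astype(float),
                                simulated_img.ravel().astype(float))

# Example usage with random placeholders standing in for real data
rng = np.random.default_rng(0)
real = rng.integers(0, 256, size=(512, 512))
fake = rng.integers(0, 256, size=(512, 512))
print(f"Wasserstein distance: {image_wasserstein(real, fake):.3f}")
```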
Conclusions
Simulated R-EBUS images of PPLs generated by our method can closely mimic the corresponding real images, demonstrating the potential of our method to provide guidance for intraoperative R-EBUS probe manipulation.
About the Journal
Endoscopic Ultrasound, a publication of the Euro-EUS Scientific Committee, the Asia-Pacific EUS Task Force, and the Latin American Chapter of EUS, is a peer-reviewed online journal with quarterly print-on-demand compilation of published issues. The journal's full text is available online at http://www.eusjournal.com. The journal provides free access (open access) to its contents and permits authors to self-archive the final accepted version of their articles in any OAI-compliant institutional or subject-based repository. The journal does not charge for submission, processing, or publication of manuscripts, or even for color reproduction of photographs.