A Simultaneous Polyp and Lumen Detection Framework Toward Autonomous Robotic Colonoscopy

Wing Yin Ng; Yehui Li; Tianle Pan; Yichong Sun; Qi Dou; Pheng Ann Heng; Philip Wai Yan Chiu; Zheng Li

IEEE Transactions on Medical Robotics and Bionics (JCR Q2, Engineering, Biomedical)
DOI: 10.1109/TMRB.2024.3349623
Published: 2024-01-04
URL: https://ieeexplore.ieee.org/document/10380749/
Citations: 0
Abstract
Colorectal cancer is one of the deadliest diseases, with high incidence and mortality worldwide. Robotic colonoscopes have been extensively developed to provide alternative solutions for colon screening. Nevertheless, most robotic colonoscopes remain at a low autonomy level, which leads to non-intuitive manipulation and limits their clinical translation. This paper proposes a deep learning-based framework for simultaneous polyp and lumen detection, which aims to automate robotic colonoscopes to achieve intelligent and autonomous manipulation in the two aspects of navigation and diagnosis. Two fully annotated datasets, a real colon dataset with 40,186 images and a colon phantom dataset, are developed to facilitate polyp and lumen detection in both clinical and laboratory environments. Benchmarking of various object detection models achieves an average precision of 0.827 and an average recall of 0.866. Experimental validation is conducted in both a commercialized colon phantom and an ex-vivo porcine colon, using an electromagnetically actuated soft-tethered colonoscope as a case study, with the results indicating that the colonoscope can successfully perform autonomous navigation and automatic polyp detection under the proposed unified framework. This work promotes clinical applications of robotic colonoscopy by enhancing its autonomy level with artificial intelligence techniques.
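The reported average precision and recall come from benchmarking object detection models on the annotated datasets. As a hedged illustration only (the paper does not publish its evaluation code, and the function names below are hypothetical), precision and recall for bounding-box detections at a single IoU threshold can be sketched as:

```python
def iou(a, b):
    # Boxes given as (x1, y1, x2, y2); returns intersection-over-union.
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def precision_recall(preds, gts, iou_thresh=0.5):
    # Greedy one-to-one matching: each predicted box claims at most one
    # still-unmatched ground-truth box whose IoU meets the threshold.
    matched = set()
    tp = 0
    for p in preds:
        best_i, best_v = -1, 0.0
        for i, g in enumerate(gts):
            if i in matched:
                continue
            v = iou(p, g)
            if v >= iou_thresh and v > best_v:
                best_i, best_v = i, v
        if best_i >= 0:
            matched.add(best_i)
            tp += 1
    precision = tp / len(preds) if preds else 0.0
    recall = tp / len(gts) if gts else 0.0
    return precision, recall
```

Standard benchmarks (e.g., COCO-style AP/AR) additionally sweep the confidence threshold and average over multiple IoU thresholds; this sketch shows only the core IoU-matching step at one operating point.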