A Simultaneous Polyp and Lumen Detection Framework Toward Autonomous Robotic Colonoscopy

IEEE Transactions on Medical Robotics and Bionics (IF 3.4, Q2, Engineering, Biomedical) · Pub Date: 2024-01-04 · DOI: 10.1109/TMRB.2024.3349623
Wing Yin Ng;Yehui Li;Tianle Pan;Yichong Sun;Qi Dou;Pheng Ann Heng;Philip Wai Yan Chiu;Zheng Li
{"title":"A Simultaneous Polyp and Lumen Detection Framework Toward Autonomous Robotic Colonoscopy","authors":"Wing Yin Ng;Yehui Li;Tianle Pan;Yichong Sun;Qi Dou;Pheng Ann Heng;Philip Wai Yan Chiu;Zheng Li","doi":"10.1109/TMRB.2024.3349623","DOIUrl":null,"url":null,"abstract":"Colorectal Cancer is one of the deadliest diseases with a high incidence and mortality worldwide. Robotic colonoscopes have been extensively developed to provide alternative solutions for colon screening. Nevertheless, most robotic colonoscopes remain a low autonomy level, which leads to non-intuitive manipulation and limits their clinical translation. This paper proposes a deep learning-based framework for simultaneous polyp and lumen detection, which aims to automates robotic colonoscopes to achieve intelligent and autonomous manipulations in two aspects of navigation and diagnosis. Two fully annotated datasets, including a real colon dataset, with 40186 images, and a colon phantom dataset, are developed to facilitate polyp and lumen detection in both clinical and laboratory environments. Benchmarking of various object detection models achieve an average precision of 0.827, and an average recall of 0.866. Experimental validation is conducted in both a commercialized colon phantom and an ex-vivo porcine colon using an electromagnetically actuated soft-tethered colonoscope as a case study, with the results indicate that the colonoscope can successfully perform autonomous navigation and automatic polyp detection under the proposed unified framework. This work promotes clinical applications of robotic colonoscopy by enhancing its autonomy level with artificial intelligence techniques.","PeriodicalId":73318,"journal":{"name":"IEEE transactions on medical robotics and bionics","volume":null,"pages":null},"PeriodicalIF":3.4000,"publicationDate":"2024-01-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on medical robotics and bionics","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10380749/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, BIOMEDICAL","Score":null,"Total":0}
Citations: 0

Abstract

Colorectal cancer is one of the deadliest diseases, with a high incidence and mortality worldwide. Robotic colonoscopes have been extensively developed to provide alternative solutions for colon screening. Nevertheless, most robotic colonoscopes remain at a low autonomy level, which leads to non-intuitive manipulation and limits their clinical translation. This paper proposes a deep learning-based framework for simultaneous polyp and lumen detection, which aims to automate robotic colonoscopes and achieve intelligent, autonomous manipulation in two aspects: navigation and diagnosis. Two fully annotated datasets, a real colon dataset with 40,186 images and a colon phantom dataset, are developed to facilitate polyp and lumen detection in both clinical and laboratory environments. Benchmarking of various object detection models achieves an average precision of 0.827 and an average recall of 0.866. Experimental validation is conducted in both a commercialized colon phantom and an ex-vivo porcine colon using an electromagnetically actuated soft-tethered colonoscope as a case study, with the results indicating that the colonoscope can successfully perform autonomous navigation and automatic polyp detection under the proposed unified framework. This work promotes clinical applications of robotic colonoscopy by enhancing its autonomy level with artificial intelligence techniques.
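The framework treats polyps and lumens as two object classes handled by a single detector. As a rough illustration only, the sketch below shows how such a two-class detector could be set up and run on one colonoscopy frame with torchvision; the model choice (Faster R-CNN), class indices, and score threshold are assumptions for illustration and not the authors' implementation.

    # Minimal sketch (not the paper's code): a two-class "polyp"/"lumen" detector.
    # Model family, class ids, and threshold are illustrative assumptions.
    import torch
    import torchvision
    from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

    CLASS_NAMES = {1: "polyp", 2: "lumen"}  # index 0 is reserved for background

    def build_detector(num_classes: int = 3) -> torch.nn.Module:
        # Start from a COCO-pretrained detector and swap the box head so it
        # predicts background + polyp + lumen.
        model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
        in_features = model.roi_heads.box_predictor.cls_score.in_features
        model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)
        return model

    @torch.no_grad()
    def detect(model: torch.nn.Module, frame: torch.Tensor, score_thresh: float = 0.5):
        # frame: float tensor of shape (3, H, W) with values in [0, 1]
        model.eval()
        output = model([frame])[0]
        keep = output["scores"] >= score_thresh
        return [
            (CLASS_NAMES[int(label)], box.tolist(), float(score))
            for label, box, score in zip(
                output["labels"][keep], output["boxes"][keep], output["scores"][keep]
            )
        ]

    if __name__ == "__main__":
        model = build_detector()
        dummy_frame = torch.rand(3, 480, 640)  # placeholder for a colonoscopy frame
        for name, box, score in detect(model, dummy_frame):
            print(f"{name}: {box} (score={score:.2f})")

In such a setup, the lumen detections would feed the navigation controller (steering toward the lumen center) while the polyp detections would drive the diagnostic output, matching the two roles the abstract describes.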