{"title":"Real-time tracking of surgical instruments based on spatio-temporal context and deep learning.","authors":"Zijian Zhao, Zhaorui Chen, Sandrine Voros, Xiaolin Cheng","doi":"10.1080/24699322.2018.1560097","DOIUrl":null,"url":null,"abstract":"<p><p>ABSTARCT Real-time tool tracking in minimally invasive-surgery (MIS) has numerous applications for computer-assisted interventions (CAIs). Visual tracking approaches are a promising solution to real-time surgical tool tracking, however, many approaches may fail to complete tracking when the tracker suffers from issues such as motion blur, adverse lighting, specular reflections, shadows, and occlusions. We propose an automatic real-time method for two-dimensional tool detection and tracking based on a spatial transformer network (STN) and spatio-temporal context (STC). Our method exploits both the ability of a convolutional neural network (CNN) with an in-house trained STN and STC to accurately locate the tool at high speed. Then we compared our method experimentally with other four general of CAIs' visual tracking methods using eight existing online and in-house datasets, covering both <i>in vivo</i> abdominal, cardiac and retinal clinical cases in which different surgical instruments were employed. The experiments demonstrate that our method achieved great performance with respect to the accuracy and the speed. It can track a surgical tool without labels in real time in the most challenging of cases, with an accuracy that is equal to and sometimes surpasses most state-of-the-art tracking algorithms. Further improvements to our method will focus on conditions of occlusion and multi-instruments.</p>","PeriodicalId":56051,"journal":{"name":"Computer Assisted Surgery","volume":null,"pages":null},"PeriodicalIF":1.5000,"publicationDate":"2019-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1080/24699322.2018.1560097","citationCount":"31","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computer Assisted Surgery","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.1080/24699322.2018.1560097","RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2019/2/14 0:00:00","PubModel":"Epub","JCR":"Q3","JCRName":"SURGERY","Score":null,"Total":0}
Citations: 31
Abstract
Real-time tool tracking in minimally invasive surgery (MIS) has numerous applications in computer-assisted interventions (CAIs). Visual tracking approaches are a promising solution to real-time surgical tool tracking; however, many approaches fail when the tracker encounters challenges such as motion blur, adverse lighting, specular reflections, shadows, and occlusions. We propose an automatic real-time method for two-dimensional tool detection and tracking based on a spatial transformer network (STN) and spatio-temporal context (STC). Our method exploits the ability of a convolutional neural network (CNN) with an in-house trained STN, combined with STC, to accurately locate the tool at high speed. We then compared our method experimentally with four other general visual tracking methods used in CAIs on eight existing online and in-house datasets, covering in vivo abdominal, cardiac, and retinal clinical cases in which different surgical instruments were employed. The experiments demonstrate that our method achieves strong performance in both accuracy and speed: it can track a surgical tool without labels in real time in the most challenging cases, with an accuracy that equals, and sometimes surpasses, that of most state-of-the-art tracking algorithms. Further improvements to our method will focus on occlusion and multi-instrument conditions.
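The abstract describes the detection component only at a high level (a CNN with a spatial transformer network), so the following is a minimal PyTorch sketch of a generic STN module of the kind the method builds on. It is not the authors' in-house trained STN; the layer sizes, the 64x64 input resolution, and the class name SpatialTransformer are illustrative assumptions.

```python
# Minimal spatial transformer network (STN) sketch: a localization network
# predicts a 2-D affine transform, which is then used to warp (e.g. crop and
# zoom) the input frame in a fully differentiable way.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SpatialTransformer(nn.Module):
    def __init__(self):
        super().__init__()
        # Localization network: regresses the 6 parameters of an affine transform.
        self.localization = nn.Sequential(
            nn.Conv2d(3, 8, kernel_size=7), nn.MaxPool2d(2), nn.ReLU(True),
            nn.Conv2d(8, 10, kernel_size=5), nn.MaxPool2d(2), nn.ReLU(True),
        )
        self.fc_loc = nn.Sequential(
            nn.Linear(10 * 12 * 12, 32), nn.ReLU(True),
            nn.Linear(32, 6),
        )
        # Initialize the regressor to the identity transform ("no warp").
        self.fc_loc[2].weight.data.zero_()
        self.fc_loc[2].bias.data.copy_(
            torch.tensor([1, 0, 0, 0, 1, 0], dtype=torch.float))

    def forward(self, x):
        xs = self.localization(x)
        theta = self.fc_loc(xs.flatten(1)).view(-1, 2, 3)
        # Sample the input on the predicted affine grid (differentiable warping).
        grid = F.affine_grid(theta, x.size(), align_corners=False)
        return F.grid_sample(x, grid, align_corners=False)

# Example usage: warp a batch of four 64x64 RGB frames.
frames = torch.randn(4, 3, 64, 64)
warped = SpatialTransformer()(frames)
```

In a tracking pipeline of this kind, the warped output would typically be fed to the downstream CNN detector, while the STC model propagates the estimated tool location between frames; the exact coupling used in the paper is not specified in the abstract.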
About the journal:
Computer Assisted Surgery aims to improve patient care by advancing the utilization of computers during treatment; to evaluate the benefits and risks associated with the integration of advanced digital technologies into surgical practice; to disseminate clinical and basic research relevant to stereotactic surgery, minimal access surgery, endoscopy, and surgical robotics; to encourage interdisciplinary collaboration between engineers and physicians in developing new concepts and applications; to educate clinicians about the principles and techniques of computer assisted surgery and therapeutics; and to serve the international scientific community as a medium for the transfer of new information relating to theory, research, and practice in biomedical imaging and the surgical specialties.
The scope of Computer Assisted Surgery encompasses all fields within surgery, as well as biomedical imaging and instrumentation, and digital technology employed as an adjunct to imaging in diagnosis, therapeutics, and surgery. Topics featured include frameless as well as conventional stereotactic procedures, surgery guided by intraoperative ultrasound or magnetic resonance imaging, image guided focused irradiation, robotic surgery, and any therapeutic interventions performed with the use of digital imaging technology.