Title: A Novel Task-Specific Upper-Extremity Rehabilitation System with Interactive Game-Based Interface for Stroke Patients
Authors: Veena Jayasree-Krishnan, Dhruv Gamdha, Brian S. Goldberg, Shramana Ghosh, P. Raghavan, V. Kapila
Published in: 2019 International Symposium on Medical Robotics (ISMR)
Pub Date: 2019-05-08 | DOI: 10.1109/ISMR.2019.8710184
Abstract: We present a novel task-specific upper-extremity rehabilitation system that uses an instrumented cup and an interactive gaming environment to promote patient engagement during repetitive rehabilitative exercise. The system tracks the movement of the cup as a stroke patient uses her forearm to perform a complex, goal-oriented, task-specific activity: grasping, lifting, and tilting the cup to drink from it. A force-sensitive resistive sensor mounted on the cup constantly monitors the grasp force, and the cup is endowed with rich features that allow a webcam to track and estimate its location and pose with the aid of state-of-the-art machine learning algorithms. This bi-manual forearm rehabilitation system is designed to enable stroke patients to perform an activity of daily living (grasping, lifting, and tilting) repetitively at home as an exercise, so that they can relearn the arm, wrist, and hand motions involved in drinking.
Title: Towards the development of a voice-controlled exoskeleton system for restoring hand function
Authors: Xuefeng Wang, Phillip Tran, Sarah M. Callahan, S. Wolf, J. Desai
Published in: 2019 International Symposium on Medical Robotics (ISMR)
Pub Date: 2019-04-03 | DOI: 10.1109/ISMR.2019.8710195
Abstract: Spinal cord injury (SCI) at the C-5 level causes loss of fine motor control in the hand and fingers, while stroke often causes hemiparesis, which impairs arm and hand function. Both afflictions render the individual unable to complete activities of daily living (ADL). In this work, an exotendon glove system is designed for repetitive task practice (RTP) to improve the efficacy of hand- and finger-function rehabilitation in spinal cord injury patients. Common ADL tasks are evaluated through correlation analysis to inform the design of the exotendon glove. A novel slack-enabling mechanism is introduced, and a smartphone-app voice-control interface is developed to increase the glove's efficiency and effectiveness. The range of motion (ROM) of the exotendon glove in individual finger movements and ADL tasks is experimentally identified, and its performance in ADL tasks is experimentally evaluated.
Title: Grasp Taxonomy for Robot Assistants Inferred from Finger Pressure and Flexion
Authors: Bahareh Abbasi, M. Sharifzadeh, E. Noohi, S. Parastegari, M. Žefran
Published in: 2019 International Symposium on Medical Robotics (ISMR)
Pub Date: 2019-04-03 | DOI: 10.1109/ISMR.2019.8710191
Abstract: Grasping is an integral part of manipulation actions in activities of daily living, and programming by demonstration is a powerful paradigm for teaching assistive robots how to perform a grasp. Since finger configuration and finger force are the fundamental features that need to be controlled during a grasp, using these variables is a natural choice for learning by demonstration. An important question then becomes whether the existing grasp taxonomies are appropriate when one considers these modalities. The goal of our paper is to answer this question by investigating grasp patterns that can be inferred from a static analysis of the grasp data, taken while the object is securely grasped. Human grasp data are measured using a newly developed data glove; the data include pressure-sensor measurements from eighteen areas of the hand and measurements from bend sensors placed at the finger joints. The pressure-sensor measurements are calibrated and mapped to force using a novel data-driven approach. Unsupervised learning is then used to identify patterns for different grasp types, with multiple clustering algorithms partitioning the data. When the results are taken in aggregate, the 25 human grasp types are reduced to 9 distinct clusters.
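The aggregation step above (merging 25 grasp types into a handful of clusters via unsupervised learning) can be illustrated with a plain k-means pass. This is a minimal sketch, not the authors' pipeline: the feature vectors, the fixed initial centers, and the two-cluster setup are all made up for illustration.

```python
# Minimal k-means sketch: cluster hypothetical grasp feature vectors
# (e.g. normalized force + flexion) so similar grasp types merge.

def kmeans(points, centers, iters=20):
    """Plain k-means with fixed initial centers (deterministic)."""
    labels = []
    for _ in range(iters):
        # assign each point to its nearest center (squared distance)
        labels = [min(range(len(centers)),
                      key=lambda c: sum((p - q) ** 2
                                        for p, q in zip(pt, centers[c])))
                  for pt in points]
        # recompute each center as the mean of its assigned points
        for c in range(len(centers)):
            member = [pt for pt, lab in zip(points, labels) if lab == c]
            if member:
                centers[c] = [sum(x) / len(member) for x in zip(*member)]
    return labels, centers

# Two synthetic "grasp types": light precision grasps vs. forceful power grasps.
points = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.15),   # light grasps
          (0.9, 0.8), (0.8, 0.9), (0.85, 0.85)]   # forceful grasps
labels, centers = kmeans(points, centers=[[0.0, 0.0], [1.0, 1.0]])
```

With clean synthetic data the six samples collapse into the expected two clusters; the paper runs several clustering algorithms and aggregates their partitions rather than trusting a single run like this.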
Title: Surface Model Extraction from Indentation Curves of Hyperelastic Simulation for Abnormality Detection
Authors: Yingqiao Yang, K. Yung, Robert T. W. Hung, J. Foster, K. Yu
Published in: 2019 International Symposium on Medical Robotics (ISMR)
Pub Date: 2019-04-03 | DOI: 10.1109/ISMR.2019.8710188
Abstract: Manual palpation for the detection of anomalies is not possible through the small incisions of robotic minimally invasive surgery. The proposed novel approach enables robotic palpation by deforming the tissue surface with an indenter and analyzing the induced surface shape for indications of abnormalities underneath. Three-dimensional hyperelastic finite element models were used to simulate the tool-tissue interaction of a hemispherical indenter pushing downwards onto the tissue surface. Curve-fitting methods were employed to characterize the indentation curve of the deformed surface of either normal or abnormal tissue with an empirical equation. By analyzing these equations, we developed volume-based and gradient-based methods to investigate how the tumor position affects the surface deformation behavior of the tissue. The simulation results indicate obvious differences in surface deformation between healthy and diseased tissue, owing to the higher stiffness of the tumor. A significant advantage of the proposed method is that it greatly broadens the detection area: it provides estimates of the tumor's direction and distance from the area surrounding the indentation site, whereas previous studies only predicted the presence of a tumor within the contact area.
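The curve-fitting step can be pictured as fitting a parametric decay profile to the simulated surface deflection around the indenter. The exponential model w(r) = a * exp(-b * r), the closed-form log-space least-squares fit, and the synthetic data below are all assumptions for illustration; the paper's actual empirical equation is not reproduced here.

```python
import math

def fit_exp_decay(r, w):
    """Fit w(r) = a * exp(-b * r) by linear least squares on log w."""
    y = [math.log(wi) for wi in w]          # log-space: y = log a - b * r
    n = len(r)
    rbar = sum(r) / n
    ybar = sum(y) / n
    slope = (sum((ri - rbar) * (yi - ybar) for ri, yi in zip(r, y))
             / sum((ri - rbar) ** 2 for ri in r))
    a = math.exp(ybar - slope * rbar)       # intercept back to linear scale
    return a, -slope

# Synthetic noise-free indentation profile: deflection (mm) vs. radial
# distance (mm), generated with a = 2.0 mm and b = 0.5 mm^-1.
r = [0.0, 1.0, 2.0, 3.0, 4.0]
w = [2.0 * math.exp(-0.5 * ri) for ri in r]
a, b = fit_exp_decay(r, w)
```

A stiffer inclusion under the surface would show up as a change in the fitted parameters relative to healthy tissue, which is the kind of signal the volume-based and gradient-based analyses exploit.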
Title: Design of a robotic instrument for minimally invasive waterjet surgery
Authors: C. Schlenk, Andrea Schwier, M. Heiss, T. Bahls, A. Albu-Schäffer
Published in: 2019 International Symposium on Medical Robotics (ISMR)
Pub Date: 2019-04-03 | DOI: 10.1109/ISMR.2019.8710186
Abstract: Using a pressurized waterjet to cut or abrade tissue is a well-established method in surgery. This paper presents a robotic tool for minimally invasive waterjet surgery with two degrees of freedom, integrated suction, and optional splash protection. The function of the tool and the effect of the splash protection on the suction of the applied water were successfully evaluated in tests with ballistic gelatine. Based on this evaluated design, a concept for further increasing the oscillation frequency of the instrument tip is introduced, and the results of preliminary tests of a simplified mockup are presented.
Title: Automatic Detection of Needle Puncture in a Simulated Cannulation Task
Authors: Jianxin Gao, Irfan Kil, R. Groff, R. Singapogu
Published in: 2019 International Symposium on Medical Robotics (ISMR)
Pub Date: 2019-04-03 | DOI: 10.1109/ISMR.2019.8710181
Abstract: Cannulation for hemodialysis is the process of inserting a tube into the body for the removal or delivery of fluids. The cannulation process can cause several problems if the bottom of the arteriovenous fistula (AVF) or graft vessel wall is punctured by the dialysis needle. One of the main reasons for improper cannulation is a lack of skill in performing the various steps of cannulation: palpation to diagnose fistula health, needle entry, obtaining blood in the cannula (termed "flashback"), and threading the needle. In this paper, we introduce a method for automatically detecting needle insertion in a simulated fistula that can be used for the assessment and training of cannulation skills. Initial validation studies suggest that the algorithm successfully detects puncture within 0.1 s on average and within 0.5 s in the worst case. In the future, we will incorporate needle-insertion detection into a simulator for objectively examining the assessment and training of hemodialysis cannulation skills.
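A common signature of vessel-wall puncture is a sharp drop in insertion force once the wall yields. The sketch below flags the first such drop in a sampled force trace; the threshold, sample rate, and trace are hypothetical, and the paper's actual detection algorithm and sensing setup are not reproduced here.

```python
# Hypothetical drop-based puncture detector: insertion force builds as
# the needle loads the vessel wall, then falls abruptly at puncture.

def detect_puncture(force, drop=0.3):
    """Return the index of the first abrupt per-sample force drop, or None."""
    for i in range(1, len(force)):
        if force[i - 1] - force[i] > drop:
            return i
    return None

# Simulated force trace (N), sampled at 10 Hz: ramp-up, then puncture.
trace = [0.0, 0.2, 0.4, 0.6, 0.8, 0.3, 0.25, 0.2]
idx = detect_puncture(trace)
```

At a 10 Hz sample rate a detection at sample 5 corresponds to 0.5 s into the trace, which gives a feel for how the reported 0.1 s average latency constrains sampling and thresholding choices.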
Title: Augmented Reality Application for Aiding Tumor Resection in Skull-Base Surgery
Authors: Niveditha Kalavakonda, L. Sekhar, B. Hannaford
Published in: 2019 International Symposium on Medical Robotics (ISMR)
Pub Date: 2019-04-03 | DOI: 10.1109/ISMR.2019.8710203
Abstract: Treatment of cancer patients has improved with advances in tumor resection techniques for skull-base surgery. However, a secondary procedure or chemotherapy is often required to treat residual tumor and prevent recurrence. With the advent of assistive technology such as augmented reality, a myriad of possibilities has opened up for the field of surgery. This work explores the development of an augmented reality application to improve the tumor margin excised during surgical procedures, with a focus on skull-base surgery. An isosurface reconstruction algorithm was integrated with the Microsoft HoloLens, a self-contained holographic computer, to enable visualization of Computed Tomography (CT) imaging superimposed in 3D on the patient. The results suggest that, although the device has limitations at its current stage, the Microsoft HoloLens could be used for planning and for overlaying imaging information on the patient for real-time removal of lesions. The modules developed could also be extended to other types of surgery involving visualization of Digital Imaging and Communications in Medicine (DICOM) files.
Title: Optimizing Robot-Assisted Surgery Suture Plans to Avoid Joint Limits and Singularities
Authors: Brijen Thananjeyan, A. Tanwani, Jessica J. Ji, Danyal Fer, Vatsal Patel, S. Krishnan, Ken Goldberg
Published in: 2019 International Symposium on Medical Robotics (ISMR)
Pub Date: 2019-04-03 | DOI: 10.1109/ISMR.2019.8710194
Abstract: Laparoscopic robots such as the da Vinci Research Kit encounter joint limits and singularities during procedures, leading to errors and prolonged operating times. We propose the Circle Suture Placement Problem: optimizing the location and direction of four evenly spaced stay sutures on surgical mesh for robot-assisted hernia surgery. We present an algorithm for this problem that runs in 0.4 seconds on a desktop with commodity hardware. Simulated results integrating data from expert surgeon demonstrations suggest that optimizing over both suture position and direction increases dexterity reward by 11%-57% over baseline algorithms that optimize over either position or direction alone.
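The joint search over position and direction can be pictured as a grid search over the angular offset of four evenly spaced sutures on a circle, combined with a choice of needle direction per plan. The toy dexterity reward below is a stand-in (the paper derives its reward from expert surgeon demonstrations), and all numbers here are illustrative.

```python
import math

def dexterity(theta, direction):
    """Toy reward: penalize poses near an assumed joint-limit angle (pi),
    and slightly favor one of two discrete needle directions."""
    limit_penalty = abs(math.cos(theta / 2.0))
    return limit_penalty + (0.1 if direction == 1 else 0.0)

def best_plan(n_sutures=4, n_offsets=36):
    """Grid search over circle offset and needle direction."""
    best = (-1.0, None, None)
    for k in range(n_offsets):
        offset = 2 * math.pi * k / n_offsets
        for direction in (0, 1):
            thetas = [offset + 2 * math.pi * i / n_sutures
                      for i in range(n_sutures)]
            reward = sum(dexterity(t, direction) for t in thetas)
            if reward > best[0]:
                best = (reward, offset, direction)
    return best

reward, offset, direction = best_plan()
```

Optimizing offset and direction together, as in this sketch, is what distinguishes the joint approach from baselines that fix one of the two variables.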
Title: Multicamera 3D Reconstruction of Dynamic Surgical Cavities: Camera Grouping and Pair Sequencing
Authors: Yun-Hsuan Su, Kevin Huang, B. Hannaford
Published in: 2019 International Symposium on Medical Robotics (ISMR)
Pub Date: 2019-04-03 | DOI: 10.1109/ISMR.2019.8710190
Abstract: Dynamic 3D reconstruction of surgical cavities is essential in a wide range of computer-assisted surgical intervention applications, including but not limited to surgical guidance, pre-operative image registration, and vision-based force estimation. According to a survey on vision-based 3D reconstruction for abdominal minimally invasive surgery (MIS) [1], real-time 3D reconstruction and tissue deformation recovery remain open challenges to researchers. The main challenges include specular reflections from the wet tissue surface and the highly dynamic nature of abdominal surgical scenes. This work aims to overcome these obstacles by using multiple, independently moving RGB cameras with distinct viewpoints to generate an accurate measurement of tissue deformation at the volume of interest (VOI), and proposes a novel, efficient camera pairing algorithm. Experimental results validate the proposed camera grouping and pair sequencing; the evaluation used the Raven-II [2] surgical robot system for tool navigation, the Medtronic Stealth Station s7 surgical navigation system for real-time camera pose monitoring, and the Space Spider white-light scanner to derive the ground-truth 3D model.
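One ingredient of camera pairing is grouping cameras into stereo pairs using a geometric criterion. The greedy sketch below pairs cameras by ascending baseline distance; this is a speculative simplification (the paper's pairing criterion and sequencing are more elaborate), and the camera positions are made up.

```python
import itertools
import math

def pair_cameras(positions):
    """Greedily pair cameras by ascending pairwise distance between
    optical centers; returns a list of index pairs."""
    dists = sorted((math.dist(positions[i], positions[j]), i, j)
                   for i, j in itertools.combinations(range(len(positions)), 2))
    used, pairs = set(), []
    for _, i, j in dists:
        if i not in used and j not in used:   # each camera joins one pair
            pairs.append((i, j))
            used |= {i, j}
    return pairs

# Four hypothetical camera centers (meters): two tight stereo clusters.
cams = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (1.0, 0.0, 0.0), (1.1, 0.0, 0.0)]
pairs = pair_cameras(cams)
```

A real system would also weigh view overlap on the VOI and triangulation angle, not just baseline length; the greedy matching structure stays the same.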
Title: A Smartphone and Permanent Magnet-based Needle Guidance System
Authors: Zhuo Zhao, Z. Tse
Published in: 2019 International Symposium on Medical Robotics (ISMR)
Pub Date: 2019-04-03 | DOI: 10.1109/ISMR.2019.8710201
Abstract: The use of image-guided percutaneous needle-based methods is on the rise. The accuracy of such methods depends heavily on needle placement: accurate placement minimizes patient risk and improves health outcomes, and combining the method with a needle guidance system can improve accuracy significantly. In this paper, we develop a low-cost needle guidance system based on a permanent magnet and a smartphone to improve the accuracy of image-guided percutaneous needle-based procedures. The system tracks both translation and angulation. Open-air tests show an average radial error of 0.83 mm for translation tracking; for angulation tracking, the average errors were 0.71° and 0.61° for left-right and anterior-posterior angulation, respectively. This accuracy is comparable to that of some commercially available tracking devices, and the proposed method shows strong potential for various clinical applications.