Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems: Latest Publications
Pub Date: 2018-10-01; Epub Date: 2019-01-07; DOI: 10.1109/IROS.2018.8593407
Mengyu Fu, Alan Kuntz, Robert J Webster, Ron Alterovitz
Lung cancer is the deadliest form of cancer, and early diagnosis is critical to favorable survival rates. Definitive diagnosis of lung cancer typically requires needle biopsy. Common lung nodule biopsy approaches either carry significant risk or are incapable of accessing large regions of the lung, such as in the periphery. Deploying a steerable needle from a bronchoscope and steering through the lung allows for safe biopsy while improving the accessibility of lung nodules in the lung periphery. In this work, we present a method for extracting a cost map automatically from pulmonary CT images, and utilizing the cost map to efficiently plan safe motions for a steerable needle through the lung. The cost map encodes obstacles that should be avoided, such as the lung pleura, bronchial tubes, and large blood vessels, and additionally formulates a cost for the rest of the lung which corresponds to an approximate likelihood that a blood vessel exists at each location in the anatomy. We then present a motion planning approach that utilizes the cost map to generate paths that minimize accumulated cost while safely reaching a goal location in the lung.
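As a rough illustration of the cost-accumulation idea (not the paper's planner, which works in 3D anatomy under steerable-needle kinematic constraints and a vessel-likelihood cost), a minimal sketch could run Dijkstra's algorithm over a 2D cost grid, accumulating per-cell cost and treating `None` cells as hard obstacles:

```python
import heapq

def min_cost_path(cost, start, goal):
    """Dijkstra over a 2D cost grid: minimize accumulated cell cost.

    cost[r][c] is the cost of entering cell (r, c); None marks an obstacle.
    Returns (path, total_cost). Assumes the goal is reachable.
    """
    rows, cols = len(cost), len(cost[0])
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        r, c = node
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and cost[nr][nc] is not None:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(pq, (nd, (nr, nc)))
    # Walk predecessors back from the goal to reconstruct the path.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[goal]
```

The function name, grid layout, and 4-connected neighborhood are illustrative assumptions; the paper additionally shapes the cost from an approximate blood-vessel likelihood and plans curvature-constrained needle paths.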
{"title":"Safe Motion Planning for Steerable Needles Using Cost Maps Automatically Extracted from Pulmonary Images.","authors":"Mengyu Fu, Alan Kuntz, Robert J Webster, Ron Alterovitz","doi":"10.1109/IROS.2018.8593407","DOIUrl":"10.1109/IROS.2018.8593407","url":null,"abstract":"<p><p>Lung cancer is the deadliest form of cancer, and early diagnosis is critical to favorable survival rates. Definitive diagnosis of lung cancer typically requires needle biopsy. Common lung nodule biopsy approaches either carry significant risk or are incapable of accessing large regions of the lung, such as in the periphery. Deploying a steerable needle from a bronchoscope and steering through the lung allows for safe biopsy while improving the accessibility of lung nodules in the lung periphery. In this work, we present a method for extracting a cost map automatically from pulmonary CT images, and utilizing the cost map to efficiently plan safe motions for a steerable needle through the lung. The cost map encodes obstacles that should be avoided, such as the lung pleura, bronchial tubes, and large blood vessels, and additionally formulates a cost for the rest of the lung which corresponds to an approximate likelihood that a blood vessel exists at each location in the anatomy. We then present a motion planning approach that utilizes the cost map to generate paths that minimize accumulated cost while safely reaching a goal location in the lung.</p>","PeriodicalId":74523,"journal":{"name":"Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems. 
IEEE/RSJ International Conference on Intelligent Robots and Systems","volume":"2018 ","pages":"4942-4949"},"PeriodicalIF":0.0,"publicationDate":"2018-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6519054/pdf/nihms-1024548.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"37417954","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2018-10-01; Epub Date: 2019-01-07; DOI: 10.1109/IROS.2018.8593766
Siddarth Jain, Brenna Argall
Effective human-robot collaboration in shared control requires reasoning about the intentions of the human user. In this work, we present a mathematical formulation for human intent recognition during assistive teleoperation under shared autonomy. Our recursive Bayesian filtering approach models and fuses multiple non-verbal observations to probabilistically reason about the intended goal of the user. In addition to contextual observations, we model and incorporate the human agent's behavior as goal-directed actions with adjustable rationality to inform the underlying intent. We examine human inference on robot motion and furthermore validate our approach with a human subjects study, conducted with novice subjects, that evaluates the autonomy's intent-inference performance under a variety of goal scenarios and tasks. Results show that our approach outperforms existing solutions and demonstrates that the probabilistic fusion of multiple observations improves intent inference and performance for shared-control operation.
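A single step of a recursive Bayesian goal update can be sketched as follows, assuming a Boltzmann-rational ("adjustable rationality") action likelihood over a discrete goal set. The function name, the scalar per-goal action costs, and the single-observation form are simplifying assumptions; the paper fuses multiple non-verbal observations:

```python
import math

def update_goal_belief(belief, action_cost_per_goal, beta=1.0):
    """One recursive Bayesian update: P(g | a) ∝ P(a | g) P(g).

    belief: dict goal -> prior probability.
    action_cost_per_goal: dict goal -> cost of the observed action if that
        goal were intended (lower cost = more rational action for that goal).
    beta: rationality coefficient; higher beta = more deterministic human.
    """
    posterior = {}
    for g, prior in belief.items():
        likelihood = math.exp(-beta * action_cost_per_goal[g])  # Boltzmann model
        posterior[g] = likelihood * prior
    z = sum(posterior.values())
    return {g: p / z for g, p in posterior.items()}  # normalize
```

Feeding successive observations through this update concentrates belief on the goal whose predicted actions best explain what the user actually did.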
Title: Recursive Bayesian Human Intent Recognition in Shared-Control Robotics
Pages: 3905-3912
Pub Date: 2018-10-01; Epub Date: 2019-01-07; DOI: 10.1109/IROS.2018.8594471
Ran Hao, Orhan Özgüner, M Cenk Çavuşoğlu
This paper presents an approach to surgical tool tracking using stereo vision for the da Vinci® Surgical Robotic System. The proposed method is based on robot kinematics, computer vision techniques, and Bayesian state estimation. It employs a silhouette rendering algorithm to create virtual images of the surgical tool, generating the silhouette of the defined tool geometry as seen through the da Vinci® endoscopes. The virtual rendering provides the tool representation in image form, making it possible to measure the distance between the rendered tool and the real tool in the endoscopic stereo image streams. A particle filter algorithm employing the virtual rendering method is then used for surgical tool tracking. The tracking performance is evaluated on an actual da Vinci® surgical robotic system and on a ROS/Gazebo-based simulation of the da Vinci® system.
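The particle filter's role can be sketched with a scalar toy state, where a user-supplied `distance_fn` stands in for the image-space distance between the silhouette rendering and the observed stereo images. All names, the random-walk motion model, and the Gaussian likelihood are illustrative assumptions, not the paper's implementation:

```python
import math
import random

def particle_filter_step(particles, weights, motion_noise, distance_fn):
    """One predict-update-resample cycle of a particle filter.

    particles: list of scalar state hypotheses (e.g. a tool pose parameter).
    weights: prior particle weights (normalized).
    distance_fn(p): rendering-vs-observation distance for hypothesis p.
    """
    # Predict: diffuse each particle with random-walk motion noise.
    moved = [p + random.gauss(0.0, motion_noise) for p in particles]
    # Update: weight by a Gaussian likelihood of the rendering distance.
    w = [wi * math.exp(-0.5 * distance_fn(p) ** 2) for p, wi in zip(moved, weights)]
    total = sum(w)
    w = [x / total for x in w]
    # Resample: draw particles proportional to weight, reset to uniform weights.
    resampled = random.choices(moved, weights=w, k=len(moved))
    return resampled, [1.0 / len(moved)] * len(moved)
```

Iterating this step concentrates the particle cloud on hypotheses whose rendered silhouette best matches the observed tool.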
Title: Vision-Based Surgical Tool Pose Estimation for the da Vinci® Robotic Surgical System
Pages: 1298-1305
E Erdem Tuna, Taoming Liu, Russell C Jackson, Nate Lombard Poirot, Mac Russell, M Cenk Çavuşoğlu
This paper presents a free-space open-loop dynamic response analysis for an MRI-guided, magnetically actuated, steerable intravascular catheter system. The catheter tip is embedded with a set of current-carrying micro-coils. The catheter is directly actuated via the magnetic torques generated on these coils by the magnetic field of the magnetic resonance imaging (MRI) scanner. The relationship between the input current commands and the catheter tip deflection angle is inherently nonlinear. This nonlinearity is analyzed with a pendulum model, which is then used to perform an approximate input-output linearization. A black-box system identification is then performed for frequency response analysis of the linearized dynamics. The optimal estimated model is reduced by observing the modes and considering the Nyquist frequency of the camera system used to track the catheter motion. The reduced model is experimentally validated with 3D open-loop Cartesian free-space trajectories. This study paves the way for effective and accurate free-space closed-loop control of the robotic catheter with real-time feedback from MRI guidance in subsequent research.
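The black-box frequency response step can be illustrated with a generic single-tone empirical estimate: probe the (linearized) system with a sinusoid and take the ratio of the Fourier coefficients of output and input at the probe frequency. This is a hedged sketch of standard practice, not the authors' identification pipeline:

```python
import cmath
import math

def frequency_response_at(u, y, freq, dt):
    """Estimate gain and phase of G(jw) at one frequency.

    u, y: sampled input and output sequences (same length, sample period dt).
    freq: probe frequency in Hz; works best with an integer number of periods.
    """
    n = len(u)
    w = 2.0 * math.pi * freq
    # Discrete Fourier coefficients of input and output at the probe frequency.
    U = sum(u[k] * cmath.exp(-1j * w * k * dt) for k in range(n))
    Y = sum(y[k] * cmath.exp(-1j * w * k * dt) for k in range(n))
    G = Y / U
    return abs(G), cmath.phase(G)
```

Sweeping the probe frequency and repeating this estimate traces out an empirical Bode plot to which a parametric model can then be fitted.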
Title: Analysis of Dynamic Response of an MRI-Guided Magnetically-Actuated Steerable Catheter System
Pub Date: 2018-10-01
Pages: 4927-4934
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6329396/pdf/nihms-998083.pdf
Pub Date: 2018-10-01; Epub Date: 2019-01-07; DOI: 10.1109/IROS.2018.8593807
Niravkumar A Patel, Jiawen Yan, David Levi, Reza Monfaredi, Kevin Cleary, Iulian Iordachita
This paper presents a body-mounted, four degree-of-freedom (4-DOF) parallel-mechanism robot for image-guided percutaneous interventions. The design is optimized to be lightweight and compact so that the robot can be mounted on the patient's body. Its modular design can be adapted to assist various image-guided, needle-based percutaneous interventions such as arthrography, biopsy, and brachytherapy seed placement. The robot mechanism and control system are designed and manufactured with components compatible with imaging modalities including magnetic resonance imaging (MRI) and computed tomography (CT). The version of the robot presented in this paper is optimized for shoulder arthrography under MRI guidance; a Z-shaped fiducial frame attached to the robot provides accurate and repeatable registration with the MR scanner coordinate system. Here we present the mechanical design of the manipulator, the robot kinematics, the robot calibration procedure, and a preliminary bench-top accuracy assessment. The bench-top evaluation of the robotic manipulator shows average translational errors of 1.01 mm and 0.96 mm along the X and Z axes, respectively, and average rotational errors of 3.06 degrees and 2.07 degrees about the X and Z axes, respectively.
Title: Body-Mounted Robot for Image-Guided Percutaneous Interventions: Mechanical Design and Preliminary Accuracy Evaluation
Pages: 1443-1448
Pub Date: 2018-10-01; Epub Date: 2019-01-07; DOI: 10.1109/IROS.2018.8594023
Siavash Rezazadeh, David Quintero, Nikhil Divekar, Robert D Gregg
Although there has been recent progress in control of multi-joint prosthetic legs for periodic tasks such as walking, volitional control of these systems for non-periodic maneuvers is still an open problem. In this paper, we develop a new controller that is capable of both periodic walking and common volitional leg motions based on a piecewise holonomic phase variable through a finite state machine. The phase variable is constructed by measuring the thigh angle, and the transitions in the finite state machine are formulated through sensing foot contact together with attributes of a nominal reference gait trajectory. The controller was implemented on a powered knee-ankle prosthesis and tested with a transfemoral amputee subject, who successfully performed a wide range of periodic and non-periodic tasks, including low- and high-speed walking, quick start and stop, backward walking, walking over obstacles, and kicking a soccer ball. The proposed approach is expected to provide better understanding of volitional motions and lead to more reliable control of multi-joint prostheses for a wider range of tasks.
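A simplified thigh-angle phase variable (not the paper's piecewise holonomic construction with its finite-state transitions) can be sketched by normalizing the thigh angle and folding in the direction of motion, so that phase advances monotonically from 0 to 1 over a stride. The function name and angle bounds below are illustrative assumptions:

```python
def phase_variable(thigh_angle, thigh_velocity, q_min, q_max):
    """Map thigh angle (rad) to a stride phase in [0, 1].

    Forward thigh motion (velocity >= 0) covers phase [0, 0.5];
    backward motion covers (0.5, 1.0], so phase advances monotonically.
    q_min, q_max: expected thigh-angle range over the stride.
    """
    s = (thigh_angle - q_min) / (q_max - q_min)  # normalize to [0, 1]
    s = min(max(s, 0.0), 1.0)                    # clamp out-of-range readings
    if thigh_velocity >= 0.0:
        return 0.5 * s          # first half of stride: angle increasing
    return 1.0 - 0.5 * s        # second half: angle decreasing back to q_min
```

Driving joint reference trajectories from such a phase, rather than from a clock, is what lets the wearer's own leg motion pace both periodic and volitional movements.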
Title: A Phase Variable Approach to Volitional Control of Powered Knee-Ankle Prostheses
Pages: 2292-2298
Pub Date: 2018-10-01; Epub Date: 2019-01-07; DOI: 10.1109/iros.2018.8594060
Using a robot as a social stimulus offers several advantages over using another animal; in rat-robot studies in particular, robots can produce social behaviour that is reproducible across trials. In the current work, we outline a framework for rat-robot interaction studies consisting of a novel rat-sized robot (PiRat), models of robotic behaviour, and a position tracking system for both robot and rat. We present the design of the framework, including constraints on autonomy, latency, and control. We pilot-tested our framework by individually running the robot with eight different rats, first through a habituation stage and then with PiRat performing two different types of behaviour: avoiding and frequently approaching. We evaluate the performance of the framework on latency and autonomy, and on its ability to influence the behaviour of individual rats. We find that the framework meets its constraints, engages some of the rats (as measured by the number of meetings), and features a control scheme that produces reproducible behaviour in rats. These features represent a first demonstration of a closed-loop rat-robot framework.
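At their simplest, the avoiding and approaching behaviours reduce to a velocity command along the robot-rat bearing. The sketch below is a toy stand-in under assumed names and units, ignoring the tracking, latency, and control constraints the framework actually manages:

```python
import math

def pirat_velocity(robot_xy, rat_xy, mode, speed=0.1):
    """Closed-loop velocity command from tracked positions.

    mode "approach" steers toward the rat; any other mode steers away.
    Returns a unit-direction velocity scaled to `speed` (m/s, assumed).
    """
    dx = rat_xy[0] - robot_xy[0]
    dy = rat_xy[1] - robot_xy[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return (0.0, 0.0)  # coincident positions: no defined bearing
    sign = 1.0 if mode == "approach" else -1.0
    return (sign * speed * dx / dist, sign * speed * dy / dist)
```

A real implementation would run this at the tracker's update rate and layer arena-boundary and collision constraints on top of the raw bearing command.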
Title: PiRat: An autonomous framework for studying social behaviour in rats and robots
Pages: 7601-7608
Pub Date: 2017-09-01; Epub Date: 2017-12-14; DOI: 10.1109/IROS.2017.8205979
Christopher J Nycz, Radian Gondokaryono, Paulo Carvalho, Nirav Patel, Marek Wartenberg, Julie G Pilitsis, Gregory S Fischer
The use of magnetic resonance imaging (MRI) for guiding robotic surgical devices has shown great potential for performing precisely targeted and controlled interventions. To fully realize these benefits, devices must work safely within the tight confines of the MRI bore without negatively impacting image quality. Here we expand on previous work exploring MRI-guided robots for neural interventions by presenting the mechanical design and assessment of a device for positioning, orienting, and inserting an interstitial ultrasound-based ablation probe. Building on our previous work, we have added a 2 degree-of-freedom (DOF) needle driver for use with the aforementioned probe, revised the mechanical design to improve strength and function, and evaluated the mechanism's accuracy and its effect on MR image quality. The result is a 7-DOF MRI robot capable of positioning a needle tip and orienting its axis with accuracies of 1.37 ± 0.06 mm and 0.79° ± 0.41°, inserting the needle along its axis with an accuracy of 0.06 ± 0.07 mm, and rotating it about its axis with an accuracy of 0.77° ± 1.31°. This was accomplished with no significant reduction in SNR caused by the robot's presence in the MRI bore, a reduction of at most 10.3% in SNR from running the robot's motors during a scan, and no visible paramagnetic artifacts.
Title: Mechanical Validation of an MRI Compatible Stereotactic Neurosurgery Robot in Preparation for Pre-Clinical Trials
Pages: 1677-1684
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5912942/pdf/nihms959530.pdf
Pub Date: 2017-01-01; Epub Date: 2017-12-14; DOI: 10.1109/IROS.2017.8206210
Justin D Opfermann, Simon Leonard, Ryan S Decker, Nicholas A Uebele, Christopher E Bayne, Arjun S Joshi, Axel Krieger
This paper describes a surgical robot performing semi-autonomous electrosurgery for tumor resection and evaluates its accuracy using a visual servoing paradigm. We describe the design and integration of a novel, multi-degree-of-freedom electrosurgical tool for the smart tissue autonomous robot (STAR). Standardized line tests are executed to determine ideal cut parameters in three different types of porcine tissue. STAR is then programmed with the ideal cut settings for porcine tissue and compared against expert surgeons using open and laparoscopic techniques in a line-cutting task. We conclude with a proof-of-concept demonstration in which STAR semi-autonomously resects pseudo-tumors in porcine tissue using visual servoing. When tasked to excise tumors with a consistent 4 mm margin, STAR semi-autonomously dissects tissue with an average margin of 3.67 mm and a standard deviation of 0.89 mm.
Title: Semi-Autonomous Electrosurgery for Tumor Resection Using a Multi-Degree of Freedom Electrosurgical Tool and Visual Servoing
Pages: 3653-3659
Pub Date: 2016-10-01; Epub Date: 2016-12-01; DOI: 10.1109/IROS.2016.7759102
Max L Balter, Alvin I Chen, Alex Fromholtz, Alex Gorshkov, Tim J Maguire, Martin L Yarmush
Diagnostic blood testing is the most prevalent medical procedure performed in the world and forms the cornerstone of modern health care delivery. Yet blood tests are still predominantly carried out in centralized labs using large-volume samples acquired by manual venipuncture, and no end-to-end solution from blood draw to sample analysis exists today. Our group is developing a platform device that merges robotic phlebotomy with automated diagnostics to rapidly deliver patient information at the site of the blood draw. The system couples an image-guided venipuncture robot, designed to address the challenges of routine venous access, with a centrifuge-based blood analyzer to obtain quantitative measurements of hematology. In this paper, we first present the system design and architecture of the integrated device. We then perform a series of in vitro experiments to evaluate the cannulation accuracy of the system on blood vessel phantoms. Next, we assess the effects of vessel diameter, needle gauge, flow rate, and viscosity on the rate of sample collection. Finally, we demonstrate proof-of-concept of a white cell assay on the blood analyzer using in vitro human samples spiked with fluorescently labeled microbeads.
Title: System Design and Development of a Robotic Device for Automated Venipuncture and Diagnostic Blood Cell Analysis
Pages: 514-520