Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems: Latest Publications
Pub Date: 2019-11-04 | Epub Date: 2020-01-27 | DOI: 10.1109/IROS40897.2019.8968575
Sherdil Niyaz, Alan Kuntz, Oren Salzman, Ron Alterovitz, Siddhartha S Srinivasa
A motion-planning problem's setup can drastically affect the quality of solutions returned by the planner. In this work we consider optimizing these setups, with a focus on doing so in a computationally efficient fashion. Our approach interleaves optimization with motion planning, which allows us to consider the actual motions required of the robot. Similar prior work has treated the planner as a black box: our key insight is that opening this box in a simple yet effective manner enables a more efficient approach, by allowing us to bound the work done by the planner to optimizer-relevant computations. Finally, we apply our approach to a surgically relevant motion-planning task, where experiments validate our approach by more efficiently optimizing the fixed insertion pose of a surgical robot.
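The bounded-evaluation idea lends itself to a compact sketch: candidate setups are scored by a planner whose work is capped, aborting as soon as a candidate's partial cost can no longer beat the best setup found so far. Everything here (`plan_cost`, `optimize_setup`, the random per-step cost model) is an illustrative stand-in, not the paper's planner.

```python
import random

def plan_cost(setup, budget, best_so_far):
    """Toy 'planner': accumulates a per-step motion cost for a setup,
    stopping early once the partial cost exceeds the best full cost
    seen so far. Returns the total cost, or None if pruned."""
    random.seed(setup)  # make each setup's simulated cost deterministic
    cost = 0.0
    for _ in range(budget):
        cost += random.uniform(0.0, 1.0)
        if cost >= best_so_far:  # bound the planner's work to useful computation
            return None          # prune: this setup cannot beat the incumbent
    return cost

def optimize_setup(candidate_setups, budget=50):
    """Evaluate each candidate setup, letting the incumbent's cost
    bound the planning effort spent on later candidates."""
    best_setup, best_cost = None, float("inf")
    for setup in candidate_setups:
        c = plan_cost(setup, budget, best_cost)
        if c is not None and c < best_cost:
            best_setup, best_cost = setup, c
    return best_setup, best_cost
```

The first candidate always runs to completion (the incumbent cost starts at infinity); later candidates are cut off as soon as they are provably worse, which is the source of the efficiency gain.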
Title: Optimizing Motion-Planning Problem Setup via Bounded Evaluation with Application to Following Surgical Trajectories. Pages: 1355-1362. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7172036/pdf/nihms-1576704.pdf
Pub Date: 2019-11-01 | Epub Date: 2020-01-27 | DOI: 10.1109/IROS40897.2019.8967751
Zhaoshuo Li, Mahya Shahbazi, Niravkumar Patel, Eimear O' Sullivan, Haojie Zhang, Khushi Vyas, Preetham Chalasani, Peter L Gehlbach, Iulian Iordachita, Guang-Zhong Yang, Russell H Taylor
In this paper, a novel semi-autonomous control framework is presented for enabling probe-based confocal laser endomicroscopy (pCLE) scanning of the retinal tissue. With pCLE, retinal layers such as the nerve fiber layer (NFL) and retinal ganglion cell (RGC) layer can be scanned and characterized in real time for improved diagnosis and surgical outcome prediction. However, the limited field of view of the pCLE system and the micron-scale optimal focus distance of the probe, both on the order of physiological hand tremor, act as barriers to successful manual scanning of retinal tissue. Therefore, a novel sensorless framework is proposed for real-time semi-autonomous endomicroscopy scanning during retinal surgery. The framework consists of the Steady-Hand Eye Robot (SHER) integrated with a pCLE system, where the motion of the probe is controlled semi-autonomously. Through a hybrid motion control strategy, the system autonomously controls the confocal probe to optimize the sharpness and quality of the pCLE images, while providing the surgeon with the ability to scan the tissue in a tremor-free manner. Effectiveness of the proposed architecture is validated through experimental evaluations as well as a user study involving 9 participants. Statistical analyses show that the proposed framework reduces the workload experienced by the users in a statistically significant manner, while also enhancing their performance in retaining pCLE images with optimized quality.
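Autofocus by sharpness optimization, as described above, can be illustrated with a toy hill-climbing step: compare the sharpness of images captured at neighboring probe depths and move toward the sharper one. The gradient-based `sharpness` score and the `capture`/`autofocus_step` interface are assumptions for illustration, not the paper's hybrid controller.

```python
def sharpness(image):
    """Simple sharpness score: sum of squared intensity differences
    between horizontally adjacent pixels (a proxy for image gradient
    energy, which peaks when the probe is in focus)."""
    return sum((row[i + 1] - row[i]) ** 2
               for row in image
               for i in range(len(row) - 1))

def autofocus_step(capture, z, step=1e-3):
    """One hill-climbing step: move the probe to whichever neighboring
    depth yields the sharper image. `capture(z)` is assumed to return
    a 2D list of pixel intensities at probe depth z."""
    candidates = [z - step, z, z + step]
    return max(candidates, key=lambda d: sharpness(capture(d)))
```

Repeating this step drives the probe toward the depth of maximum sharpness; the real system would do this continuously while the surgeon sweeps laterally.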
Title: A Novel Semi-Autonomous Control Framework for Retina Confocal Endomicroscopy Scanning. Pages: 7083-7090.
Pub Date: 2019-11-01 | Epub Date: 2020-01-27 | DOI: 10.1109/IROS40897.2019.8967806
Ali Ebrahimi, Farshid Alambeigi, Ingrid E Zimmer-Galler, Peter Gehlbach, Russell H Taylor, Iulian Iordachita
When robotic assistance is introduced into vitreoretinal surgery, the surgeon experiences reduced sensory input that is otherwise derived from the tool's interaction with the eye wall (sclera). We speculate that disconnecting the surgeon from this sensory input may increase the risk of injury to the eye and affect the surgeon's usual technique. On the other hand, autonomous robot motion intended to enhance patient safety might inhibit the surgeon's tool manipulation and diminish surgeon comfort with the procedure. In this study, to investigate the parameters of patient safety and surgeon comfort in robot-assisted eye surgery, we implemented three different approaches designed to keep the scleral force in a safe range during a synergic eye manipulation task. To assess surgeon comfort during these procedures, the amount of interference with the surgeon's usual maneuvers was analyzed by defining quantitative comfort metrics. The first two scleral force control approaches are based on an adaptive force control method in which the robot actively counteracts any excessive force on the sclera. The third control method is based on a virtual fixture approach in which a virtual wall is created for the surgeon in the unsafe directions of manipulation. The performance of these approaches was evaluated in user studies with two experienced retinal surgeons, and the outcomes of the procedure were assessed using the defined safety and comfort metrics. Results of these analyses indicate the significance of the chosen control paradigm for achieving a safe and comfortable robot-assisted eye surgery.
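As a rough illustration of the virtual-fixture idea, the gate below zeroes any commanded velocity that would further load the sclera once the measured force exceeds a safe limit. The one-dimensional interface and the 0.12 N threshold are hypothetical placeholders; the paper's controllers are richer than this.

```python
def safe_velocity(v_cmd, scleral_force, f_safe=0.12):
    """Virtual-fixture-style gate (illustrative, 1D): pass the surgeon's
    commanded velocity through unchanged while the scleral force is
    within the safe range; once it exceeds f_safe (newtons, assumed),
    block only motion in the direction that would increase the load."""
    if abs(scleral_force) <= f_safe:
        return v_cmd
    if v_cmd * scleral_force > 0:  # pushing further into the sclera
        return 0.0                 # virtual wall: stop this direction
    return v_cmd                   # motion that relieves the force is allowed
```

The key property, shared with the paper's virtual-wall method, is asymmetry: the fixture constrains only the unsafe direction, so the surgeon can always back off the sclera freely.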
Title: Toward Improving Patient Safety and Surgeon Comfort in a Synergic Robot-Assisted Eye Surgery: A Comparative Study. Pages: 7075-7082.
Pub Date: 2019-11-01 | Epub Date: 2020-01-27 | DOI: 10.1109/IROS40897.2019.8968172
Alan Kuntz, Mengyu Fu, Ron Alterovitz
We present a method that plans motions for a concentric tube robot to automatically reach surgical targets inside the body while avoiding obstacles, where the patient's anatomy is represented by point clouds. Point clouds can be generated intra-operatively via endoscopic instruments, enabling the system to update obstacle representations over time as the patient anatomy changes during surgery. Our new motion planning method uses a combination of sampling-based motion planning methods and local optimization to efficiently handle point cloud data and quickly compute high quality plans. The local optimization step uses an interior point optimization method, ensuring that the computed plan is feasible and avoids obstacles at every iteration. This enables the motion planner to run in an anytime fashion, i.e., the method can be stopped at any time and the best solution found up until that point is returned. We demonstrate the method's efficacy in three anatomical scenarios, including two generated from endoscopic videos of real patient anatomy.
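The anytime property described above (the method can be stopped at any time, returning the best plan found so far) can be sketched as a simple loop over a plan sampler and a cost function. Both `sample_plan` and `cost` are placeholders here; the paper combines parallel sampling with interior-point local optimization rather than pure sampling.

```python
import time

def anytime_plan(sample_plan, cost, time_budget_s):
    """Anytime planning loop (illustrative): repeatedly sample candidate
    plans, keep the lowest-cost feasible one found so far, and return
    the incumbent when the time budget expires. Stopping earlier would
    simply return the current incumbent."""
    best, best_cost = None, float("inf")
    deadline = time.monotonic() + time_budget_s
    while time.monotonic() < deadline:
        p = sample_plan()
        c = cost(p)
        if c < best_cost:
            best, best_cost = p, c
    return best, best_cost
```

Because every iteration leaves a feasible incumbent in hand, the caller can trade planning time for plan quality on the fly, which is exactly what intra-operative use requires.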
Title: Planning High-Quality Motions for Concentric Tube Robots in Point Clouds via Parallel Sampling and Optimization. Pages: 2205-2212.
Pub Date: 2018-10-01 | Epub Date: 2019-01-07 | DOI: 10.1109/IROS.2018.8593407
Mengyu Fu, Alan Kuntz, Robert J Webster, Ron Alterovitz
Lung cancer is the deadliest form of cancer, and early diagnosis is critical to favorable survival rates. Definitive diagnosis of lung cancer typically requires needle biopsy. Common lung nodule biopsy approaches either carry significant risk or are incapable of accessing large regions of the lung, such as in the periphery. Deploying a steerable needle from a bronchoscope and steering through the lung allows for safe biopsy while improving the accessibility of lung nodules in the lung periphery. In this work, we present a method for extracting a cost map automatically from pulmonary CT images, and utilizing the cost map to efficiently plan safe motions for a steerable needle through the lung. The cost map encodes obstacles that should be avoided, such as the lung pleura, bronchial tubes, and large blood vessels, and additionally formulates a cost for the rest of the lung which corresponds to an approximate likelihood that a blood vessel exists at each location in the anatomy. We then present a motion planning approach that utilizes the cost map to generate paths that minimize accumulated cost while safely reaching a goal location in the lung.
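The cost-map planning step can be illustrated with a standard minimum-cost search on a 2D grid, where each cell's value stands in for the approximate likelihood of encountering a vessel and obstacles carry infinite cost. This Dijkstra sketch is a simplification: the paper plans for a steerable needle in 3D under curvature constraints.

```python
import heapq

def min_cost_path(cost_map, start, goal):
    """Dijkstra over a 2D cost map: each cell's value is the cost of
    entering it (float('inf') marks obstacles such as vessels or the
    pleura). Returns (path, accumulated cost)."""
    rows, cols = len(cost_map), len(cost_map[0])
    dist, prev = {start: 0.0}, {}
    pq = [(0.0, start)]
    while pq:
        d, cell = heapq.heappop(pq)
        if cell == goal:
            break
        if d > dist.get(cell, float("inf")):
            continue  # stale queue entry
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost_map[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = cell
                    heapq.heappush(pq, (nd, (nr, nc)))
    # walk predecessors back from the goal
    path, cell = [], goal
    while cell in prev:
        path.append(cell)
        cell = prev[cell]
    return [start] + path[::-1], dist.get(goal, float("inf"))
```

Minimizing accumulated cell cost rather than path length is what biases the planner away from regions likely to contain vessels, not just away from hard obstacles.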
Title: Safe Motion Planning for Steerable Needles Using Cost Maps Automatically Extracted from Pulmonary Images. Pages: 4942-4949. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6519054/pdf/nihms-1024548.pdf
Pub Date: 2018-10-01 | Epub Date: 2019-01-07 | DOI: 10.1109/IROS.2018.8593766
Siddarth Jain, Brenna Argall
Effective human-robot collaboration in shared control requires reasoning about the intentions of the human user. In this work, we present a mathematical formulation for human intent recognition during assistive teleoperation under shared autonomy. Our recursive Bayesian filtering approach models and fuses multiple non-verbal observations to probabilistically reason about the intended goal of the user. In addition to contextual observations, we model and incorporate the human agent's behavior as goal-directed actions with adjustable rationality to inform the underlying intent. We examine human inference on robot motion and furthermore validate our approach with a human subjects study that evaluates autonomy intent inference performance under a variety of goal scenarios and tasks, by novice subjects. Results show that our approach outperforms existing solutions and demonstrates that the probabilistic fusion of multiple observations improves intent inference and performance for shared-control operation.
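The recursive Bayesian update with an adjustable-rationality action model can be sketched as follows: each goal's posterior is its prior times a Boltzmann likelihood that rewards actions making progress toward that goal. The 2D point-goal setup, the `beta` rationality parameter, and the helper names are illustrative assumptions, not the paper's exact formulation.

```python
import math

def dist(a, b):
    """Euclidean distance between 2D points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def step(state, action):
    """Apply a 2D velocity-like action to a state."""
    return (state[0] + action[0], state[1] + action[1])

def intent_update(belief, goals, state, action, beta=2.0):
    """One recursive Bayes update: P(g | a) ∝ P(a | g) · P(g), where
    P(a | g) is Boltzmann-rational; actions that reduce the distance to
    a goal are exponentially more likely under that goal. Larger beta
    models a more rational (less noisy) user."""
    posterior = {}
    for g, prior in belief.items():
        progress = dist(state, goals[g]) - dist(step(state, action), goals[g])
        posterior[g] = prior * math.exp(beta * progress)
    z = sum(posterior.values())
    return {g: p / z for g, p in posterior.items()}
```

Fusing further observation channels, as the paper does, amounts to multiplying additional likelihood terms into the same posterior before normalizing.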
Title: Recursive Bayesian Human Intent Recognition in Shared-Control Robotics. Pages: 3905-3912.
Pub Date: 2018-10-01 | Epub Date: 2019-01-07 | DOI: 10.1109/IROS.2018.8594471
Ran Hao, Orhan Özgüner, M Cenk Çavuşoğlu
This paper presents an approach to surgical tool tracking using stereo vision for the da Vinci® Surgical Robotic System. The proposed method is based on robot kinematics, computer vision techniques, and Bayesian state estimation. It employs a silhouette rendering algorithm to create virtual images of the surgical tool by generating the silhouette of the defined tool geometry under the da Vinci® robot endoscopes. The virtual rendering method provides the tool representation in image form, which makes it possible to measure the distance between the rendered tool and the real tool in endoscopic stereo image streams. A particle filter employing the virtual rendering method is then used for surgical tool tracking. The tracking performance is evaluated on an actual da Vinci® surgical robotic system and a ROS/Gazebo-based simulation of the da Vinci® system.
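A minimal one-dimensional particle filter conveys the structure of the tracking loop: diffuse particles with process noise, weight each by how well its rendered silhouette matches the observation, and resample. Here `render` stands in for the paper's silhouette rendering of the tool geometry, and the noise model and image-space distance are assumptions for illustration.

```python
import math
import random

def particle_filter_step(particles, measured, render, noise=0.05):
    """One particle-filter iteration over a 1D pose (illustrative).
    `render(p)` maps a hypothesized pose to an image-space measurement,
    standing in for silhouette rendering; `measured` is the observed
    value extracted from the endoscopic stream."""
    # predict: diffuse particles with Gaussian process noise
    particles = [p + random.gauss(0.0, noise) for p in particles]
    # update: weight by agreement between rendered and observed values
    weights = [math.exp(-abs(render(p) - measured) / noise) for p in particles]
    z = sum(weights)
    weights = [w / z for w in weights]
    # resample proportionally to weight
    particles = random.choices(particles, weights=weights, k=len(particles))
    return particles
```

In the paper's setting the weighting step compares rendered and real tool silhouettes in stereo images; the overall predict-weight-resample cycle is the same.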
Title: Vision-Based Surgical Tool Pose Estimation for the da Vinci® Robotic Surgical System. Pages: 1298-1305.
E Erdem Tuna, Taoming Liu, Russell C Jackson, Nate Lombard Poirot, Mac Russell, M Cenk Çavuşoğlu
This paper presents a free-space open-loop dynamic response analysis for an MRI-guided magnetically-actuated steerable intra-vascular catheter system. The catheter tip is embedded with a set of current carrying micro-coils. The catheter is directly actuated via the magnetic torques generated on these coils by the magnetic field of the magnetic resonance imaging (MRI) scanner. The relationship between the input current commands and catheter tip deflection angle presents an inherent nonlinearity in the proposed catheter system. The system nonlinearity is analyzed by utilizing a pendulum model. The pendulum model is used to describe the system nonlinearity and to perform an approximate input-output linearization. Then, a black-box system identification approach is performed for frequency response analysis of the linearized dynamics. The optimal estimated model is reduced by observing the modes and considering the Nyquist frequency of the camera system that is used to track the catheter motion. The reduced model is experimentally validated with 3D open-loop Cartesian free-space trajectories. This study paves the way for effective and accurate free-space closed-loop control of the robotic catheter with real-time feedback from MRI guidance in subsequent research.
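The pendulum-based linearization step can be illustrated with the textbook form of approximate input-output linearization: for pendulum-like dynamics J·θ'' + b·θ' + k·sin(θ) = u, the nonlinearity is canceled by feedback so the remaining dynamics respond linearly to a new input v. The scalar model and gain `k` below are generic assumptions, not the catheter system's identified parameters.

```python
import math

def linearizing_input(v, theta, k=1.0):
    """Approximate input-output linearization for a pendulum-like model
    J·theta'' + b·theta' + k·sin(theta) = u: choosing
    u = v + k·sin(theta) cancels the sinusoidal term, leaving the
    linear dynamics J·theta'' + b·theta' = v, on which standard
    frequency-response identification can then be performed."""
    return v + k * math.sin(theta)
```

After this cancellation, a black-box frequency sweep on v, as described above, identifies a linear model whose order can then be reduced against the camera's Nyquist frequency.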
Title: Analysis of Dynamic Response of an MRI-Guided Magnetically-Actuated Steerable Catheter System. Pub Date: 2018-10-01. Pages: 4927-4934. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6329396/pdf/nihms-998083.pdf
Pub Date: 2018-10-01 | Epub Date: 2019-01-07 | DOI: 10.1109/IROS.2018.8594023
Siavash Rezazadeh, David Quintero, Nikhil Divekar, Robert D Gregg
Although there has been recent progress in control of multi-joint prosthetic legs for periodic tasks such as walking, volitional control of these systems for non-periodic maneuvers is still an open problem. In this paper, we develop a new controller that is capable of both periodic walking and common volitional leg motions based on a piecewise holonomic phase variable through a finite state machine. The phase variable is constructed by measuring the thigh angle, and the transitions in the finite state machine are formulated through sensing foot contact together with attributes of a nominal reference gait trajectory. The controller was implemented on a powered knee-ankle prosthesis and tested with a transfemoral amputee subject, who successfully performed a wide range of periodic and non-periodic tasks, including low- and high-speed walking, quick start and stop, backward walking, walking over obstacles, and kicking a soccer ball. The proposed approach is expected to provide better understanding of volitional motions and lead to more reliable control of multi-joint prostheses for a wider range of tasks.
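The phase-variable and state-machine ideas can be sketched in miniature: the measured thigh angle is mapped onto a normalized phase, and foot contact drives transitions between stance and swing. The angle limits and transition rules below are illustrative placeholders, not the controller's actual piecewise, mode-dependent construction.

```python
def phase_variable(theta, theta_min=-0.2, theta_max=0.6):
    """Map the measured thigh angle (radians; limits are assumed values)
    onto a normalized phase in [0, 1] that parameterizes the reference
    joint trajectories, replacing time as the gait clock."""
    s = (theta - theta_min) / (theta_max - theta_min)
    return min(1.0, max(0.0, s))

def next_mode(mode, foot_contact, phase):
    """Toy finite-state transition: enter stance on foot contact, and
    return to swing once contact is lost and the phase has completed."""
    if mode == "swing" and foot_contact:
        return "stance"
    if mode == "stance" and not foot_contact and phase >= 1.0:
        return "swing"
    return mode
```

Because the phase is driven by the user's thigh motion rather than a timer, the same controller tracks both periodic walking and volitional, non-periodic motions such as kicking.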
Title: A Phase Variable Approach to Volitional Control of Powered Knee-Ankle Prostheses. Pages: 2292-2298.
Pub Date : 2018-10-01 Epub Date: 2019-01-07 DOI: 10.1109/IROS.2018.8593807
Niravkumar A Patel, Jiawen Yan, David Levi, Reza Monfaredi, Kevin Cleary, Iulian Iordachita
This paper presents a body-mounted, four-degree-of-freedom (4-DOF) parallel-mechanism robot for image-guided percutaneous interventions. The design of the robot is optimized to be lightweight and compact so that it can be mounted on the patient's body. Its modular design can be adapted to assist various image-guided, needle-based percutaneous interventions such as arthrography, biopsy, and brachytherapy seed placement. The robot mechanism and the control system are designed and manufactured with components compatible with imaging modalities including Magnetic Resonance Imaging (MRI) and Computed Tomography (CT). The current version of the robot presented in this paper is optimized for shoulder arthrography under MRI guidance; a Z-shaped fiducial frame attached to the robot provides accurate and repeatable registration with the MR scanner's coordinate system. Here we present the mechanical design of the manipulator, the robot kinematics, the robot calibration procedure, and a preliminary bench-top accuracy assessment. The bench-top accuracy evaluation of the robotic manipulator shows average translational errors of 1.01 mm and 0.96 mm along the X and Z axes, respectively, and average rotational errors of 3.06 degrees and 2.07 degrees about the X and Z axes, respectively.
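The bench-top evaluation above reports mean per-axis translational and rotational errors. A minimal sketch of how such a per-axis metric is computed (the sample positions below are illustrative placeholders, not the paper's measurements):

```python
# Illustrative only: mean absolute per-axis error between robot-reported
# positions and ground-truth targets, as in a bench-top accuracy study.

def mean_abs_error(measured, target):
    """Mean absolute per-sample error between paired measurements (same units)."""
    assert len(measured) == len(target)
    return sum(abs(m - t) for m, t in zip(measured, target)) / len(measured)

# Hypothetical X-axis needle-tip positions (mm): robot-reported vs. ground truth.
x_measured = [10.9, 20.2, 29.1, 41.0]
x_target = [10.0, 19.5, 30.0, 40.2]

print(f"avg X translational error: {mean_abs_error(x_measured, x_target):.2f} mm")
```

The same computation applied to orientation readings (in degrees) yields the rotational error figures; the paper's registration step via the Z-shaped fiducial frame is what makes measured and target poses comparable in a common coordinate system.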
{"title":"Body-Mounted Robot for Image-Guided Percutaneous Interventions: Mechanical Design and Preliminary Accuracy Evaluation.","authors":"Niravkumar A Patel, Jiawen Yan, David Levi, Reza Monfaredi, Kevin Cleary, Iulian Iordachita","doi":"10.1109/IROS.2018.8593807","DOIUrl":"10.1109/IROS.2018.8593807","url":null,"abstract":"<p><p>This paper presents a body-mounted, four-degree-of-freedom (4-DOF) parallel-mechanism robot for image-guided percutaneous interventions. The design of the robot is optimized to be lightweight and compact so that it can be mounted on the patient's body. Its modular design can be adapted to assist various image-guided, needle-based percutaneous interventions such as arthrography, biopsy, and brachytherapy seed placement. The robot mechanism and the control system are designed and manufactured with components compatible with imaging modalities including Magnetic Resonance Imaging (MRI) and Computed Tomography (CT). The current version of the robot presented in this paper is optimized for shoulder arthrography under MRI guidance; a Z-shaped fiducial frame attached to the robot provides accurate and repeatable registration with the MR scanner's coordinate system. Here we present the mechanical design of the manipulator, the robot kinematics, the robot calibration procedure, and a preliminary bench-top accuracy assessment. The bench-top accuracy evaluation of the robotic manipulator shows average translational errors of 1.01 mm and 0.96 mm along the X and Z axes, respectively, and average rotational errors of 3.06 degrees and 2.07 degrees about the X and Z axes, respectively.</p>","PeriodicalId":74523,"journal":{"name":"Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE/RSJ International Conference on Intelligent Robots and Systems","volume":"2018 ","pages":"1443-1448"},"PeriodicalIF":0.0,"publicationDate":"2018-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1109/IROS.2018.8593807","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"37338684","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}