Evaluation of Pre-Training with the da Vinci Skills Simulator on Motor Skill Acquisition in a Surgical Robotics Curriculum
Edoardo Battaglia, Bradly Mueller, D. Hogg, R. Rege, Daniel Scott, A. M. Fey
Pub Date: 2021-11-03 | DOI: 10.1142/s2424905x21500069
Training for robotic surgery can be challenging due to the complexity of the technology, as well as the high demand for robotic systems, which must be used primarily for clinical care. While robotic surgical skills are traditionally trained using the robotic hardware coupled with physical simulated tissue models and test-beds, there has been increasing interest in using virtual reality (VR) simulators. VR offers some advantages, such as the ability to record and track metrics associated with learning. However, evidence of skill transfer from virtual environments to physical robotic tasks has yet to be fully demonstrated. In this work, we evaluate the effect of VR pre-training on performance during a standardized robotic dry-lab training curriculum, in which trainees perform a set of tasks and are evaluated with a score based on completion time and errors made during the task. Results show that VR pre-training has a weakly significant effect ([Formula: see text]) on reducing the number of repetitions required to achieve proficiency on the robotic task; however, it does not significantly improve performance on any individual robotic task. This suggests that important skills are learned during physical training with the surgical robotic system that cannot yet be replaced by VR training.
Design of a Wearable Fingertip Haptic Device: Investigating Materials of Varying Stiffness for Mapping the Variable Compliance Platform
Samir Morad, Zainab Jaffer, S. Dogramadzi
Pub Date: 2021-11-03 | DOI: 10.1142/s2424905x21500057
Previously, a pneumatic fingertip haptic device (FHD) was developed for virtual reality applications. In this paper, the feasibility of representing tissues of varying stiffness is investigated. Two physical properties, stiffness and Young’s modulus, of the variable compliance platform (VCP) were compared with those of a set of bolus materials representing soft tissues. The Young’s moduli of the bolus materials were ten times higher than those of the VCP, whereas the stiffness values were fairly similar. Hence, stiffness is the common parameter that could be used to map the FHD to the bolus materials.
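The distinction the abstract above draws between Young’s modulus (a material property) and stiffness (a structural property) can be illustrated with the standard relation for an axially loaded prismatic sample, k = E·A/L: geometry can offset a tenfold modulus gap. This is a minimal sketch with made-up numbers, not the paper’s measured values.

```python
# Sketch: why materials with very different Young's moduli can still
# present similar stiffness. For a prismatic sample loaded axially,
# stiffness k = E * A / L, so geometry can offset a 10x modulus gap.
# All numbers below are illustrative assumptions, not from the paper.

def axial_stiffness(E_pa: float, area_m2: float, length_m: float) -> float:
    """Axial stiffness k = E*A/L of a prismatic sample, in N/m."""
    return E_pa * area_m2 / length_m

# Hypothetical bolus material: 10x higher modulus, smaller cross-section.
k_bolus = axial_stiffness(E_pa=1.0e6, area_m2=1.0e-5, length_m=0.01)
# Hypothetical VCP: 10x lower modulus, larger cross-section.
k_vcp = axial_stiffness(E_pa=1.0e5, area_m2=1.0e-4, length_m=0.01)

print(k_bolus, k_vcp)  # both 1000.0 N/m despite the 10x modulus gap
```

Under these assumed geometries the two samples feel identical to a fingertip pressing on them, which is why stiffness, rather than modulus, is the natural parameter for the mapping.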
Mapping Surgeons Hand/Finger Movements to Surgical Tool Motion During Conventional Microsurgery Using Machine Learning
Mohammad Fattahi Sani, R. Ascione, S. Dogramadzi
Pub Date: 2021-10-18 | DOI: 10.1142/s2424905x21500045
Purpose: Recent developments in robotics and artificial intelligence (AI) have led to significant advances in healthcare technologies, enhancing robot-assisted minimally invasive surgery (RAMIS) in some surgical specialties. However, current human–robot interfaces lack intuitive teleoperation and cannot mimic the surgeon’s hand/finger sensing required for fine-motion micro-surgeries. These limitations make teleoperated robotic surgery less suitable for, e.g., cardiac surgery, and difficult to learn for established surgeons. We report a pilot study showing an intuitive way of recording and mapping the surgeon’s gross hand motion and fine synergic motion during cardiac micro-surgery as a way to enhance future intuitive teleoperation. Methods: We set out to develop a prototype system able to train a Deep Neural Network (DNN) by mapping wrist, hand and surgical tool real-time data acquisition (RTDA) inputs during mock-up heart micro-surgery procedures. The trained network was used to estimate the tool pose from refined hand joint angles. The outputs of the network were the surgical tool orientation and jaw angle, acquired by an optical motion capture system. Results: Based on the surgeon’s feedback during mock micro-surgery, the developed wearable system with lightweight motion-tracking sensors did not interfere with the surgery or instrument handling. The wearable motion tracking system used 12 finger/thumb/wrist joint angle sensors to generate meaningful datasets representing the inputs of the DNN, with new hand joint angles added as necessary based on comparing the estimated tool poses against the measured tool pose. The DNN architecture was optimized for the highest estimation accuracy and the ability to determine the tool pose with the least mean squared error. This novel approach showed that the surgical instrument’s pose, an essential requirement for teleoperation, can be accurately estimated from the recorded surgeon’s hand/finger movements with a mean squared error (MSE) of less than 0.3%. Conclusion: We have developed a system to capture fine movements of the surgeon’s hand during micro-surgery that could enhance future remote teleoperation of similar surgical tools. More work is needed to refine this approach and confirm its potential role in teleoperation.
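The regression task described above, 12 joint-angle inputs mapped to a tool orientation plus jaw angle, can be sketched as a small fully-connected network. The layer sizes, activation, and weights below are assumptions for illustration only; the paper's actual DNN architecture and training procedure are not reproduced here.

```python
# Minimal sketch of the kind of regressor described in the abstract:
# 12 finger/thumb/wrist joint angles in, 4 values out (3 orientation
# components + 1 jaw angle). Architecture and weights are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(joint_angles: np.ndarray, weights) -> np.ndarray:
    """Forward pass of a small fully-connected regression network."""
    h = joint_angles
    for i, (W, b) in enumerate(weights):
        h = h @ W + b
        if i < len(weights) - 1:
            h = np.tanh(h)  # hidden nonlinearity; output layer stays linear
    return h

# 12 inputs -> 32 hidden units -> 4 outputs.
sizes = [(12, 32), (32, 4)]
weights = [(rng.normal(0.0, 0.1, s), np.zeros(s[1])) for s in sizes]

batch = rng.uniform(-1.0, 1.0, (5, 12))  # 5 frames of 12 joint angles
pred = mlp_forward(batch, weights)
print(pred.shape)  # (5, 4)
```

In a real system the weights would be fit by minimizing the MSE between predicted and optically tracked tool poses, which is the loss the paper reports.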
Smart Surgical Light: Identification of Surgical Field States Using Time of Flight Sensors
Yuta Itabashi, Fumihiko Nakamura, Hiroki Kajita, H. Saito, M. Sugimoto
Pub Date: 2021-10-18 | DOI: 10.1142/s2424905x21410026
This work presents a method for identifying surgical field states using time-of-flight (ToF) sensors mounted on a surgical light. Understanding the state of the surgical field is important in a smart surgical room. In this study, we aimed to identify surgical field states using 28 ToF sensors installed on a surgical light. In the experiments, we obtained a sensor dataset by varying the number of people, their posture, and their movement state under the surgical light. The identification accuracy of the proposed system was evaluated by applying machine learning techniques. The system can be realized simply by attaching ToF sensors to the surface of an existing surgical light.
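The pipeline above, a vector of range readings per frame classified into a field state, can be sketched with a toy nearest-centroid classifier over synthetic 28-sensor frames. The data generation, class labels, and classifier choice are all assumptions; the paper does not specify this particular model.

```python
# Toy sketch of state identification from a ring of ToF ranges: each
# sample is a vector of 28 distances, classified by nearest class
# centroid. Synthetic data and labels are assumptions, not the paper's.
import numpy as np

rng = np.random.default_rng(1)
N_SENSORS = 28

def make_sample(n_hands: int) -> np.ndarray:
    """Synthetic ToF frame: hands under the light shorten some ranges."""
    frame = np.full(N_SENSORS, 1.0)  # ~1 m down to the table
    occluded = rng.choice(N_SENSORS, size=4 * n_hands, replace=False)
    frame[occluded] = 0.5            # hand at ~0.5 m
    return frame + rng.normal(0.0, 0.01, N_SENSORS)

def nearest_centroid(frame: np.ndarray, centroids: dict) -> int:
    """Return the label of the closest class centroid."""
    return min(centroids, key=lambda c: np.linalg.norm(frame - centroids[c]))

# "Train" centroids for 0, 1 or 2 hands in the field from synthetic frames.
centroids = {k: np.mean([make_sample(k) for _ in range(50)], axis=0)
             for k in (0, 1, 2)}
print(nearest_centroid(make_sample(2), centroids))
```

A learned classifier (the paper applies machine learning techniques without this sketch's simplifications) would replace the centroid rule, but the input representation, one range vector per frame, is the same idea.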
Spatiotemporal Video Highlight by Neural Network Considering Gaze and Hands of Surgeon in Egocentric Surgical Videos
Keitaro Yoshida, Ryo Hachiuma, Hisako Tomita, Jingjing Pan, Kris Kitani, Hiroki Kajita, T. Hayashida, M. Sugimoto
Pub Date: 2021-10-09 | DOI: 10.1142/s2424905x21410014
Multicamera 3D Viewpoint Adjustment for Robotic Surgery via Deep Reinforcement Learning
Yun-Hsuan Su, Kevin Huang, B. Hannaford
Pub Date: 2021-07-10 | DOI: 10.1142/S2424905X21400031
While robot-assisted minimally invasive surgery (RMIS) procedures afford a variety of benefits over open surgery and manual laparoscopic operations (including increased tool dexterity, reduced pati...
Super Resolution for Improved Positioning of an MRI-Guided Spinal Cellular Injection Robot
D. E. Martinez, Waiman Meinhold, J. Oshinski, Ai-Ping Hu, J. Ueda
Pub Date: 2021-04-12 | DOI: 10.1142/S2424905X2140002X
This paper presents the development of a magnetic resonance imaging (MRI)-conditional needle positioning robot designed for spinal cellular injection. High-accuracy targeting is achieved by combining a high-precision, parallel-plane needle-orientation mechanism driven by linear piezoelectric actuators with an iterative super-resolution (SR) visual navigation algorithm using multi-planar MR imaging. In previous work, the authors developed an MRI-conditional robot whose positioning performance exceeds the standard resolution of MRI, making the MRI resolution the limiting factor for navigation. This paper further explores the application of SR to images for robot guidance, evaluating positioning performance through simulations as well as benchtop and MRI experiments.
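The core idea behind multi-acquisition super-resolution, combining sub-pixel-shifted samplings onto a finer grid, can be shown in one dimension. This is a textbook illustration of the principle only, not the iterative SR algorithm or MR imaging model used in the paper.

```python
# Toy 1D illustration of multi-frame super-resolution: two low-res
# samplings of the same signal, offset by half a pixel, interleave
# into a grid at twice the resolution. Principle only; the paper's
# iterative multi-planar MRI algorithm is far more involved.
import numpy as np

def interleave_sr(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Merge two half-pixel-shifted low-res 1D frames onto a 2x grid."""
    hi = np.empty(frame_a.size * 2)
    hi[0::2] = frame_a  # samples taken at even high-res positions
    hi[1::2] = frame_b  # samples taken at odd high-res positions
    return hi

# High-res ground truth, "acquired" twice with a half-pixel offset.
truth = np.arange(8, dtype=float)
low_a, low_b = truth[0::2], truth[1::2]
recovered = interleave_sr(low_a, low_b)
print(np.array_equal(recovered, truth))  # True
```

The practical difficulty, which the ideal example hides, is that the inter-frame shifts must be estimated and the frames denoised, which is why iterative SR schemes are used in practice.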
3D Steerable Biopsy Needle with a Motorized Manipulation System and Ultrasound Tracking to Navigate inside Tissue
Blayton Padasdao, Z. K. Varnamkhasti, B. Konh
Pub Date: 2021-03-31 | DOI: 10.1142/S2424905X21500033
Needle insertion techniques are used in several minimally invasive procedures for diagnostic and therapeutic purposes. For example, in tissue biopsy, a small sample of suspicious tissue is extracted using percutaneous needles for further analysis. A clinically significant biopsy sample is a definitive factor in cancer diagnosis; therefore, precise placement of the needle tip at the target location is necessary. However, it is often challenging to guide and track the needle along a desired path to reach the target precisely while avoiding sensitive organs or large arteries. Needle steering has been an active field of research in the past decade, with researchers introducing passive and active needles to improve navigation and targeting inside tissue. This work introduces a novel active steerable biopsy needle capable of bending inside tissue in multiple directions. The needle is equipped with a biopsy mechanism to extract suspicious tissue. A motorized manipulation system is developed and programmed to pull the cable tendons and control the needle deflection inside tissue. To show the feasibility of the design concept, active needle manipulation is evaluated in air and in a tissue-mimicking phantom. An average angular deflection of about 12.40° and 11.34° in three principal directions is realized in air and phantom tissue, respectively, which is expected to assist in breast cancer biopsy. A robot-assisted ultrasound tracking method is also proposed to track the active needle tip inside the phantom tissue in real time. It is shown that using this method, the needle tip can be tracked in real time with an average and maximum tracking error of [Formula: see text][Formula: see text]mm and [Formula: see text][Formula: see text]mm, respectively.
Surgical Robot Platform with a Novel Concentric Joint for Minimally Invasive Procedures
Samir Morad, C. Ulbricht, P. Harkin, Justin Chan, K. Parker, R. Vaidyanathan
Pub Date: 2021-02-01 | DOI: 10.1142/s2424905x20500014
In this paper, a surgical robot platform with a novel concentric connector joint (CCJ) is presented. The surgical robot is a parallel platform composed of multiple struts, arranged in a geometrically stable array and connected at their end points via CCJs. The CCJ joints have near-perfect concentricity of rotation around the node point, which enables the tension and compression forces in the struts to be resolved in a structurally efficient manner. Preliminary feasibility tests, modelling, and simulations are presented.
Safer Motion Planning of Steerable Needles via a Shaft-to-Tissue Force Model
Michael Bentley, Caleb Rucker, C. Reddy, Oren Salzman, A. Kuntz
Pub Date: 2021-01-06 | DOI: 10.1142/s2424905x23500034
Steerable needles are capable of accurately targeting difficult-to-reach clinical sites in the body. By bending around sensitive anatomical structures, steerable needles have the potential to reduce the invasiveness of many medical procedures. However, inserting these needles along curved trajectories increases the risk of tissue damage due to the perpendicular forces exerted on the surrounding tissue by the needle's shaft, potentially resulting in lateral shearing through tissue. Such forces can cause significant damage to surrounding tissue, negatively affecting patient outcomes. In this work, we derive a tissue and needle force model based on a Cosserat string formulation, which describes the normal and frictional forces along the shaft as a function of the planned needle path, the friction model and its parameters, and the tip piercing force. We propose this new force model and its associated cost function as a safer and more clinically relevant metric than those currently used in motion planning for steerable needles. We fit and validate our model through physical needle robot experiments in a gel phantom. We use this force model to define a bottleneck cost function for motion planning and evaluate it against the commonly used path-length cost function in hundreds of randomly generated 3-D environments. Plans generated with our force-based cost show a 62% reduction in the peak modeled tissue force with only a 0.07% increase in length on average compared to using the path-length cost in planning. Additionally, we demonstrate the ability to plan motions with our force-based cost function in a lung tumor biopsy scenario from a segmented computed tomography (CT) scan. By planning motions for the needle that explicitly aim to minimize the modeled needle-to-tissue force, our method plans needle paths that may reduce the risk of significant tissue damage while still reaching desired targets in the body.
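The planning-objective contrast described in the abstract above can be sketched directly: a path-length cost sums segment lengths, while a bottleneck cost scores a path by its single worst modeled shaft-to-tissue force. The candidate paths and force values below are illustrative assumptions, not outputs of the paper's Cosserat-string model.

```python
# Sketch of path-length cost vs. bottleneck force cost in needle path
# selection. Segment lengths and modeled forces are made-up numbers.

def path_length(path) -> float:
    """Total path length: sum of per-segment lengths (mm)."""
    return sum(seg["length_mm"] for seg in path)

def bottleneck_force(path) -> float:
    """Peak modeled normal force anywhere along the shaft path (N)."""
    return max(seg["force_n"] for seg in path)

# Two hypothetical candidate needle paths to the same target: a shorter
# path that presses hard on tissue at one point, and a slightly longer,
# gentler curve.
straight_ish = [{"length_mm": 40.0, "force_n": 0.9},
                {"length_mm": 42.0, "force_n": 1.1}]
gentler_curve = [{"length_mm": 44.0, "force_n": 0.4},
                 {"length_mm": 41.0, "force_n": 0.5}]

candidates = [straight_ish, gentler_curve]
best_by_length = min(candidates, key=path_length)          # straight_ish
best_by_bottleneck = min(candidates, key=bottleneck_force)  # gentler_curve
```

This is the trade-off the paper quantifies: optimizing the bottleneck accepts a small length increase (0.07% on average in their experiments) in exchange for a large reduction in peak modeled tissue force.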