
Latest Publications in J. Medical Robotics Res.

Evaluation of Pre-Training with the da Vinci Skills Simulator on Motor Skill Acquisition in a Surgical Robotics Curriculum
Pub Date : 2021-11-03 DOI: 10.1142/s2424905x21500069
Edoardo Battaglia, Bradly Mueller, D. Hogg, R. Rege, Daniel Scott, A. M. Fey
Training for robotic surgery can be challenging due to the complexity of the technology, as well as the high demand for robotic systems that must primarily be used for clinical care. While robotic surgical skills are traditionally trained using the robotic hardware coupled with physical simulated tissue models and test-beds, there has been increasing interest in using virtual reality simulators. Use of virtual reality (VR) comes with some advantages, such as the ability to record and track metrics associated with learning. However, evidence of skill transfer from virtual environments to physical robotic tasks has yet to be fully demonstrated. In this work, we evaluate the effect of virtual reality pre-training on performance during a standardized robotic dry-lab training curriculum, where trainees perform a set of tasks and are evaluated with a score based on completion time and errors made during the task. Results show that VR pre-training has a weakly significant effect ([Formula: see text]) in reducing the number of repetitions required to achieve proficiency on the robotic task; however, it does not significantly improve performance on any individual robotic task. This suggests that important skills are learned during physical training with the surgical robotic system that cannot yet be replaced by VR training.
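A minimal sketch of how such a between-group comparison might be run, assuming hypothetical arrays of repetitions-to-proficiency for a VR pre-trained group and a control group; the data, group sizes, and the choice of a Mann-Whitney U test are illustrative assumptions, not details taken from the paper:

```python
# Illustrative sketch: compare repetitions-to-proficiency between a
# VR pre-trained group and a control group. The data below are made up;
# the original study's data and exact statistical test are not shown here.
import numpy as np
from scipy import stats

# Hypothetical repetitions needed to reach proficiency on a dry-lab task
vr_pretrained = np.array([3, 4, 4, 5, 3, 4, 5, 3])
control       = np.array([5, 6, 4, 5, 6, 7, 5, 6])

# Non-parametric two-sample test (suitable for small count data)
u_stat, p_value = stats.mannwhitneyu(vr_pretrained, control, alternative="less")
print(f"U = {u_stat:.1f}, p = {p_value:.3f}")

# A p-value slightly below a relaxed threshold (e.g. 0.1) would be reported
# as "weakly significant"; below 0.05 as significant.
if p_value < 0.05:
    print("Significant reduction in repetitions for the VR group")
elif p_value < 0.1:
    print("Weakly significant reduction in repetitions for the VR group")
else:
    print("No significant difference")
```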
Citations: 1
Design of a Wearable Fingertip Haptic Device: Investigating Materials of Varying Stiffness for Mapping the Variable Compliance Platform
Pub Date : 2021-11-03 DOI: 10.1142/s2424905x21500057
Samir Morad, Zainab Jaffer, S. Dogramadzi
Previously, a pneumatic design of a fingertip haptic device (FHD) was developed for virtual reality applications. In this paper, the feasibility of representing tissues of varying stiffness is investigated. The physical properties, stiffness and Young's modulus, of the variable compliance platform (VCP) were compared with those of a set of bolus materials representing soft tissues. The Young's moduli of the bolus materials were ten times higher than those of the VCP, whereas the stiffness values were fairly similar. Hence, stiffness is the common parameter that can be used to map the FHD to the bolus materials.
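As a rough illustration of the two quantities being compared, a sketch that estimates stiffness and Young's modulus from hypothetical indentation data; the force-displacement values and sample geometry below are assumptions for illustration only, not measurements from the study:

```python
# Illustrative sketch: stiffness vs. Young's modulus for a compliant sample.
# All numbers (forces, displacements, sample geometry) are hypothetical.
import numpy as np

force = np.array([0.0, 0.5, 1.0, 1.5, 2.0])                 # N
displacement = np.array([0.0, 1.0, 2.1, 3.0, 4.1]) * 1e-3   # m

# Stiffness k: slope of the force-displacement curve (N/m)
k = np.polyfit(displacement, force, 1)[0]

# Young's modulus E = stress / strain, for a sample of known geometry
area = 1e-4            # loaded cross-sectional area, m^2 (assumed)
thickness = 10e-3      # undeformed sample thickness, m (assumed)
stress = force / area
strain = displacement / thickness
E = np.polyfit(strain, stress, 1)[0]

print(f"stiffness k = {k:.1f} N/m, Young's modulus E = {E/1e3:.1f} kPa")
# Two materials can share a similar k while differing in E (and vice versa),
# which is why the paper treats stiffness as the mapping parameter.
```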
Citations: 0
Mapping Surgeons Hand/Finger Movements to Surgical Tool Motion During Conventional Microsurgery Using Machine Learning
Pub Date : 2021-10-18 DOI: 10.1142/s2424905x21500045
Mohammad Fattahi Sani, R. Ascione, S. Dogramadzi
Purpose: Recent developments in robotics and artificial intelligence (AI) have led to significant advances in healthcare technologies enhancing robot-assisted minimally invasive surgery (RAMIS) in some surgical specialties. However, current human–robot interfaces lack intuitive teleoperation and cannot mimic the surgeon's hand/finger sensing required for fine-motion micro-surgeries. These limitations make teleoperated robotic surgery less suitable for, e.g., cardiac surgery, and difficult for established surgeons to learn. We report a pilot study showing an intuitive way of recording and mapping the surgeon's gross hand motion and fine synergic motion during cardiac micro-surgery as a way to enhance future intuitive teleoperation. Methods: We set out to develop a prototype system able to train a Deep Neural Network (DNN) by mapping wrist, hand and surgical tool real-time data acquisition (RTDA) inputs during mock-up heart micro-surgery procedures. The trained network was used to estimate the tool poses from refined hand joint angles. The outputs of the network were the surgical tool orientation and jaw angle, acquired by an optical motion capture system. Results: Based on the surgeon's feedback during mock micro-surgery, the developed wearable system with light-weight sensors for motion tracking did not interfere with the surgery or instrument handling. The wearable motion tracking system used 12 finger/thumb/wrist joint angle sensors to generate meaningful datasets representing the inputs of the DNN, with new hand joint angles added as necessary based on comparing the estimated tool poses against the measured tool poses. The DNN architecture was optimized for the highest estimation accuracy and the ability to determine the tool pose with the least mean squared error. This novel approach showed that the surgical instrument's pose, an essential requirement for teleoperation, can be accurately estimated from the recorded surgeon's hand/finger movements with a mean squared error (MSE) of less than 0.3%. Conclusion: We have developed a system to capture fine movements of the surgeon's hand during micro-surgery that could enhance future remote teleoperation of similar surgical tools. More work is needed to refine this approach and confirm its potential role in teleoperation.
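A minimal sketch of the kind of regression network described, assuming 12 joint-angle inputs and a 4-dimensional output (three orientation angles plus jaw angle); the layer sizes, training data, and hyperparameters are illustrative assumptions, not the architecture reported in the paper:

```python
# Illustrative sketch: regress tool orientation (3 angles) + jaw angle
# from 12 hand/wrist joint angles. Architecture and data are hypothetical.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(12, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 4),          # [roll, pitch, yaw, jaw_angle]
)
loss_fn = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Stand-in data: joint angles recorded by the wearable sensors (inputs)
# and tool poses from optical motion capture (targets).
joint_angles = torch.randn(1000, 12)
tool_pose = torch.randn(1000, 4)

for epoch in range(200):
    optimizer.zero_grad()
    pred = model(joint_angles)
    loss = loss_fn(pred, tool_pose)
    loss.backward()
    optimizer.step()

print(f"final training MSE: {loss.item():.4f}")
```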
Citations: 2
Smart Surgical Light: Identification of Surgical Field States Using Time of Flight Sensors
Pub Date : 2021-10-18 DOI: 10.1142/s2424905x21410026
Yuta Itabashi, Fumihiko Nakamura, Hiroki Kajita, H. Saito, M. Sugimoto
This work presents a method for identifying surgical field states using time-of-flight (ToF) sensors mounted on a surgical light. Understanding the surgical field state is important in a smart surgical room. In this study, we aimed to identify surgical field states using 28 ToF sensors installed on a surgical light. In the experimental condition, we obtained a sensor dataset by changing the number of people, and the posture and movement state of a person, under the surgical light. The identification accuracy of the proposed system was evaluated by applying machine learning techniques. The system can be realized simply by attaching ToF sensors to the surface of an existing surgical light.
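A minimal sketch of how such an identification step might look, assuming each sample is a vector of 28 ToF distance readings labelled with a surgical field state; the classifier choice (a random forest), the state labels, and the data are assumptions for illustration, not the paper's reported pipeline:

```python
# Illustrative sketch: classify surgical field states from 28 ToF distance
# readings. Data, labels, and the choice of classifier are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_samples = 600
X = rng.uniform(0.3, 2.0, size=(n_samples, 28))   # metres, one column per sensor
# Hypothetical states: 0 = empty field, 1 = one person working, 2 = two people
y = rng.integers(0, 3, size=n_samples)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```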
Citations: 0
Spatiotemporal Video Highlight by Neural Network Considering Gaze and Hands of Surgeon in Egocentric Surgical Videos
Pub Date : 2021-10-09 DOI: 10.1142/s2424905x21410014
Keitaro Yoshida, Ryo Hachiuma, Hisako Tomita, Jingjing Pan, Kris Kitani, Hiroki Kajita, T. Hayashida, M. Sugimoto
{"title":"Spatiotemporal Video Highlight by Neural Network Considering Gaze and Hands of Surgeon in Egocentric Surgical Videos","authors":"Keitaro Yoshida, Ryo Hachiuma, Hisako Tomita, Jingjing Pan, Kris Kitani, Hiroki Kajita, T. Hayashida, M. Sugimoto","doi":"10.1142/s2424905x21410014","DOIUrl":"https://doi.org/10.1142/s2424905x21410014","url":null,"abstract":"","PeriodicalId":447761,"journal":{"name":"J. Medical Robotics Res.","volume":"555 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-10-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123068433","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 2
Multicamera 3D Viewpoint Adjustment for Robotic Surgery via Deep Reinforcement Learning
Pub Date : 2021-07-10 DOI: 10.1142/S2424905X21400031
Yun-Hsuan Su, Kevin Huang, B. Hannaford
While robot-assisted minimally invasive surgery (RMIS) procedures afford a variety of benefits over open surgery and manual laparoscopic operations (including increased tool dexterity, reduced pati...
Citations: 8
Super Resolution for Improved Positioning of an MRI-Guided Spinal Cellular Injection Robot
Pub Date : 2021-04-12 DOI: 10.1142/S2424905X2140002X
D. E. Martinez, Waiman Meinhold, J. Oshinski, Ai-Ping Hu, J. Ueda
This paper presents the development of a magnetic resonance imaging (MRI)-conditional needle-positioning robot designed for spinal cellular injection. High-accuracy targeting performance is achieved by combining a high-precision, parallel-plane needle-orientation mechanism utilizing linear piezoelectric actuators with an iterative super-resolution (SR) visual navigation algorithm using multi-planar MR imaging. In previous work, the authors developed an MRI-conditional robot with positioning performance exceeding the standard resolution of MRI, rendering the MRI resolution the limit for navigation. This paper further explores the application of SR to images for robot guidance, evaluating positioning performance through simulations and experimentally in benchtop and MRI experiments.
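To give a flavor of why multi-planar acquisitions help, here is a naive baseline sketch that resamples three orthogonal low-resolution stacks onto a common isotropic grid and averages them; this is not the paper's iterative SR navigation algorithm, and the grid size, slice counts, and data are all hypothetical:

```python
# Illustrative sketch: a naive multi-planar "super-resolution" baseline that
# resamples three orthogonal low-resolution stacks to a common isotropic grid
# and averages them. This is NOT the paper's iterative SR algorithm; it only
# illustrates how multi-planar acquisitions carry complementary resolution.
import numpy as np
from scipy.ndimage import zoom

iso = 64  # target isotropic grid size (assumed)

# Hypothetical stacks: fine in-plane, coarse through-plane (factor 4)
axial    = np.random.rand(16, iso, iso)   # coarse along the first axis
coronal  = np.random.rand(iso, 16, iso)   # coarse along the second axis
sagittal = np.random.rand(iso, iso, 16)   # coarse along the third axis

def to_iso(vol):
    """Resample a volume to the common isotropic grid by interpolation."""
    factors = [iso / s for s in vol.shape]
    return zoom(vol, factors, order=1)

sr_estimate = (to_iso(axial) + to_iso(coronal) + to_iso(sagittal)) / 3.0
print(sr_estimate.shape)  # (64, 64, 64)
```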
Citations: 2
3D Steerable Biopsy Needle with a Motorized Manipulation System and Ultrasound Tracking to Navigate inside Tissue
Pub Date : 2021-03-31 DOI: 10.1142/S2424905X21500033
Blayton Padasdao, Z. K. Varnamkhasti, B. Konh
Needle insertion techniques have been used in several minimally invasive procedures for diagnostic and therapeutic purposes. For example, in tissue biopsy, a small sample of suspicious tissue is extracted using percutaneous needles for further analysis. A clinically significant biopsy sample is a definitive factor in cancer diagnosis; therefore, precise placement of the needle tip at the target location is necessary. However, it is often challenging to guide and track the needle along a desired path to reach the target precisely, while avoiding sensitive organs or large arteries. Needle steering has been an active field of research in the past decade. Researchers have introduced passive and active needles to improve navigation and targeting inside the tissue. This work introduces a novel active steerable biopsy needle capable of bending inside the tissue in multiple directions. The needle is equipped with a biopsy mechanism to extract suspicious tissue. A motorized manipulation system is developed and programmed to pull the cable tendons and control the needle deflection inside tissue. To show the feasibility of the design concept, the active needle manipulation in air and in a tissue-mimicking phantom is evaluated. An average angular deflection of about 12.40° and 11.34° in three principal directions is realized in air and phantom tissue, respectively, which is expected to assist in breast cancer biopsy. A robot-assisted ultrasound tracking method is also proposed to track the active needle tip inside the phantom tissue in real time. It is shown that using this method, the needle tip can be tracked in real time with an average and maximum tracking error of [Formula: see text][Formula: see text]mm and [Formula: see text][Formula: see text]mm, respectively.
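A minimal sketch of how the reported tracking error might be quantified, assuming arrays of ultrasound-tracked and reference tip positions; the trajectories and noise levels below are stand-in assumptions, not the study's data:

```python
# Illustrative sketch: mean and maximum Euclidean error between
# ultrasound-tracked needle tip positions and reference positions.
# The positions below are hypothetical stand-ins.
import numpy as np

rng = np.random.default_rng(1)
reference_tip = np.cumsum(rng.normal(0, 0.2, size=(200, 3)), axis=0)  # mm
tracked_tip = reference_tip + rng.normal(0, 0.3, size=(200, 3))       # mm

errors = np.linalg.norm(tracked_tip - reference_tip, axis=1)
print(f"mean tracking error: {errors.mean():.2f} mm")
print(f"max  tracking error: {errors.max():.2f} mm")
```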
Citations: 8
Surgical Robot Platform with a Novel Concentric Joint for Minimally Invasive Procedures
Pub Date : 2021-02-01 DOI: 10.1142/s2424905x20500014
Samir Morad, C. Ulbricht, P. Harkin, Justin Chan, K. Parker, R. Vaidyanathan
In this paper, a surgical robot platform with a novel concentric connector joint (CCJ) is presented. The surgical robot is a parallel robot platform composed of multiple struts, arranged in a geometrically stable array and connected at their end points via the CCJ. The CCJ joints have near-perfect concentricity of rotation around the node point, which enables the tension and compression forces in the struts to be resolved in a structurally efficient manner. Preliminary feasibility tests, modelling, and simulations are presented.
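To illustrate what resolving strut tension and compression forces means for a parallel platform of this kind, a sketch that solves the static equilibrium of a platform supported by axial-force-only struts; the geometry, attachment points, and external wrench below are hypothetical, not the CCJ platform's actual layout:

```python
# Illustrative sketch: resolve axial strut forces for a parallel platform in
# static equilibrium. Each strut carries only tension/compression along its
# unit direction u_i applied at platform point p_i. Geometry and the external
# wrench are hypothetical.
import numpy as np

# Hypothetical attachment points on the platform (m) and strut directions
p = np.array([[ 0.2,  0.0,  0.0],
              [-0.1,  0.17, 0.0],
              [-0.1, -0.17, 0.0],
              [ 0.2,  0.0,  0.0],
              [-0.1,  0.17, 0.0],
              [-0.1, -0.17, 0.0]])
u = np.array([[ 0.1,  0.2, 1.0],
              [-0.2,  0.1, 1.0],
              [ 0.1, -0.2, 1.0],
              [-0.1,  0.2, 1.0],
              [ 0.2, -0.1, 1.0],
              [-0.1, -0.2, 1.0]])
u = u / np.linalg.norm(u, axis=1, keepdims=True)

# Each column of A is the unit wrench [u_i; p_i x u_i] of one strut
A = np.vstack([u.T, np.cross(p, u).T])          # shape (6, 6)

# External wrench on the platform: force (N) and moment (N*m) to be balanced
w_ext = np.array([0.0, 0.0, -50.0, 1.0, 0.0, 0.0])

# Strut axial forces f satisfy A @ f + w_ext = 0
f = np.linalg.lstsq(A, -w_ext, rcond=None)[0]
print("strut forces (N), + = tension, - = compression:", np.round(f, 2))
```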
Citations: 0
Safer Motion Planning of Steerable Needles via a Shaft-to-Tissue Force Model
Pub Date : 2021-01-06 DOI: 10.1142/s2424905x23500034
Michael Bentley, Caleb Rucker, C. Reddy, Oren Salzman, A. Kuntz
Steerable needles are capable of accurately targeting difficult-to-reach clinical sites in the body. By bending around sensitive anatomical structures, steerable needles have the potential to reduce the invasiveness of many medical procedures. However, inserting these needles with curved trajectories increases the risk of tissue damage due to perpendicular forces exerted on the surrounding tissue by the needle's shaft, potentially resulting in lateral shearing through tissue. Such forces can cause significant damage to surrounding tissue, negatively affecting patient outcomes. In this work, we derive a tissue and needle force model based on a Cosserat string formulation, which describes the normal forces and frictional forces along the shaft as a function of the planned needle path, friction model and parameters, and tip piercing force. We propose this new force model and associated cost function as a safer and more clinically relevant metric than those currently used in motion planning for steerable needles. We fit and validate our model through physical needle robot experiments in a gel phantom. We use this force model to define a bottleneck cost function for motion planning and evaluate it against the commonly used path-length cost function in hundreds of randomly generated 3-D environments. Plans generated with our force-based cost show a 62% reduction in the peak modeled tissue force with only a 0.07% increase in length on average compared to using the path-length cost in planning. Additionally, we demonstrate the ability to plan motions with our force-based cost function in a lung tumor biopsy scenario from a segmented computed tomography (CT) scan. By planning motions for the needle that aim to minimize the modeled needle-to-tissue force explicitly, our method plans needle paths that may reduce the risk of significant tissue damage while still reaching desired targets in the body.
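A rough sketch of the flavor of such a shaft-to-tissue force model, reduced to a scalar string/capstan approximation: axial load along the shaft grows from the tip piercing force through Coulomb friction over the curved path, the distributed normal load is the local axial load times the path curvature, and the bottleneck cost is the peak of that load. The discretization, parameter values, and this simplification itself are assumptions for illustration, not the paper's Cosserat string formulation:

```python
# Illustrative sketch: a simplified 1-D string/capstan approximation of
# shaft-to-tissue force along a planned needle path. Parameters, paths, and
# the simplification itself are hypothetical; the paper uses a fuller
# Cosserat string formulation.
import numpy as np

def bottleneck_force(curvature, ds, tip_force, mu):
    """Peak normal load per unit length along the shaft.

    curvature : array of path curvatures kappa(s) [1/m], ordered tip to base
    ds        : arc-length step [m]
    tip_force : axial load at the needle tip from piercing [N]
    mu        : Coulomb friction coefficient (dimensionless)
    """
    T = tip_force
    peak_w = 0.0
    for kappa in curvature:
        w = T * kappa              # normal load per length from bending the path
        peak_w = max(peak_w, w)
        T += mu * w * ds           # friction on the curved segment grows the axial load
    return peak_w

# Two hypothetical 20 cm paths: gentle bend vs. sharp bend near the target
s = np.linspace(0.0, 0.2, 200)
ds = s[1] - s[0]
gentle = np.full_like(s, 5.0)                  # constant curvature 5 1/m
sharp = np.where(s > 0.15, 40.0, 2.0)          # sharp 40 1/m bend at the end

for name, kappa in [("gentle", gentle), ("sharp", sharp)]:
    w = bottleneck_force(kappa, ds, tip_force=1.0, mu=0.3)
    print(f"{name} path: peak normal load = {w:.2f} N/m")
```

In this simplified picture, a bottleneck (peak-force) cost penalizes the sharp-bend path much more heavily than a path-length cost would, which is the intuition behind preferring it for tissue safety.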
Citations: 0