Ultrasound-Guided Involuntary Motion Compensation of Kidney Stones in Percutaneous Nephrolithotomy Surgery
Ishara Paranawithana, Hsieh-Yu Li, S. Foong, U-Xuan Tan, Liangjing Yang, Terence Sey Kiat Lim, F. Ng
2018 IEEE 14th International Conference on Automation Science and Engineering (CASE), pp. 1123-1129, August 2018. DOI: 10.1109/COASE.2018.8560358
Abstract
Percutaneous Nephrolithotomy (PCNL) is a minimally invasive percutaneous surgical procedure used for the removal of large kidney stones under ultrasound and fluoroscopy guidance. During the surgery, precise control of the handheld 2D ultrasound probe is required but highly challenging, as it depends on the operator's experience, judgement and dexterity. To complicate the problem, the kidney stone moves away from the 2D ultrasound image plane due to the patient's respiratory motion. This makes locating the kidney stone extremely challenging, if not impossible, further limiting the success of the initial needle puncture. There is therefore a need to bring automation into the intraoperative workflow to compensate for the out-of-plane motion of the kidney stone. Maintaining simultaneous control of an appropriate contact force during visual tracking is also essential to ensure accurate percutaneous access to the target calyx. This work proposes a visual servoing framework to address these problems. The proposed framework consists of two stages: pre-scan and real-time visual servoing. A probe-holding robotic manipulator first scans a small region around the target to construct 3D volume data, and the out-of-plane position of the target is then tracked using an image correlation-based block-matching algorithm. A position-based admittance control scheme is developed to maintain an appropriate contact force between the probe and the patient's body during visual servoing. Experimental results show that the proposed framework is able to track the out-of-plane motion of the kidney stone with a position error of only one frame while regulating the contact force with a maximum error of 0.2 N. By incorporating automation into the existing surgical workflow, we hope to positively impact the way minimally invasive surgeries are performed.
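The abstract describes the out-of-plane tracking step only at a high level. The following Python sketch illustrates one way a correlation-based block-matching search against a pre-scanned volume could work; it is not the authors' implementation, and the names (prescan_volume, find_out_of_plane_index, the block layout) are hypothetical assumptions for illustration.

import numpy as np

def normalized_cross_correlation(a, b):
    # Zero-mean normalized cross-correlation between two equally sized blocks.
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def find_out_of_plane_index(prescan_volume, live_frame, block):
    # prescan_volume: (N, H, W) stack of B-mode frames from the pre-scan sweep,
    #                 ordered by elevational (out-of-plane) probe position.
    # live_frame:     (H, W) current B-mode frame from the probe.
    # block:          (row, col, height, width) region of interest around the stone.
    r, c, h, w = block
    template = live_frame[r:r + h, c:c + w]
    scores = [normalized_cross_correlation(frame[r:r + h, c:c + w], template)
              for frame in prescan_volume]
    # The best-matching slice index estimates the stone's out-of-plane position,
    # which the manipulator can then servo toward.
    return int(np.argmax(scores))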
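The position-based admittance control scheme is likewise not detailed in the abstract. Below is a minimal sketch of a generic one-degree-of-freedom admittance law that converts the contact-force error into a position offset for the probe; the class name, gains, and time step are illustrative assumptions rather than values from the paper.

class PositionBasedAdmittanceController:
    # Integrates m*x'' + b*x' + k*x = f_err and returns a position offset that is
    # added to the commanded probe position, so the robot yields when the measured
    # contact force deviates from the desired force.
    def __init__(self, mass=1.0, damping=20.0, stiffness=300.0, dt=0.01):
        self.m, self.b, self.k, self.dt = mass, damping, stiffness, dt
        self.x = 0.0  # position offset (m)
        self.v = 0.0  # offset velocity (m/s)

    def update(self, measured_force, desired_force):
        f_err = measured_force - desired_force  # force error (N)
        a = (f_err - self.b * self.v - self.k * self.x) / self.m
        self.v += a * self.dt
        self.x += self.v * self.dt
        return self.x

In use, the offset would simply be added to the planned probe position each control cycle, e.g. probe_command = planned_position + controller.update(f_measured, f_desired).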