Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems: Latest Publications
Visual place recognition (VPR) is critical not only for localization and mapping in autonomous driving vehicles, but also for assistive navigation for the visually impaired. To enable a long-term, large-scale VPR system, several challenges need to be addressed. First, different applications may require different image view directions, such as front views for self-driving cars but side views for people with low vision. Second, VPR in metropolitan scenes can raise privacy concerns because pedestrian and vehicle identities are captured, calling for data anonymization before VPR queries and database construction. Both factors could lead to VPR performance variations that are not yet well understood. To study their influence, we present the NYU-VPR dataset, which contains more than 200,000 images over a 2 km × 2 km area near the New York University campus, taken throughout 2016. We present benchmark results for several popular VPR algorithms, showing that side views are significantly more challenging for current VPR methods while the influence of data anonymization is almost negligible, together with our hypothetical explanations and in-depth analysis.
{"title":"NYU-VPR: Long-Term Visual Place Recognition Benchmark with View Direction and Data Anonymization Influences.","authors":"Diwei Sheng, Yuxiang Chai, Xinru Li, Chen Feng, Jianzhe Lin, Claudio Silva, John-Ross Rizzo","doi":"10.1109/iros51168.2021.9636640","DOIUrl":"https://doi.org/10.1109/iros51168.2021.9636640","url":null,"abstract":"<p><p>Visual place recognition (VPR) is critical in not only localization and mapping for autonomous driving vehicles, but also assistive navigation for the visually impaired population. To enable a long-term VPR system on a large scale, several challenges need to be addressed. First, different applications could require different image view directions, such as front views for self-driving cars while side views for the low vision people. Second, VPR in metropolitan scenes can often cause privacy concerns due to the imaging of pedestrian and vehicle identity information, calling for the need for data anonymization before VPR queries and database construction. Both factors could lead to VPR performance variations that are not well understood yet. To study their influences, we present the NYU-VPR dataset that contains more than 200,000 images over a 2km×2km area near the New York University campus, taken within the whole year of 2016. We present benchmark results on several popular VPR algorithms showing that side views are significantly more challenging for current VPR methods while the influence of data anonymization is almost negligible, together with our hypothetical explanations and in-depth analysis.</p>","PeriodicalId":74523,"journal":{"name":"Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE/RSJ International Conference on Intelligent Robots and Systems","volume":" ","pages":"9773-9779"},"PeriodicalIF":0.0,"publicationDate":"2021-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9394449/pdf/nihms-1827810.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"40633245","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Phase-Variable Control of a Powered Knee-Ankle Prosthesis over Continuously Varying Speeds and Inclines
T Kevin Best, Kyle R Embry, Elliott J Rouse, Robert D Gregg
Pub Date: 2021-09-01. DOI: 10.1109/iros51168.2021.9636180
Most controllers for lower-limb robotic prostheses require individually tuned parameter sets for every combination of speed and incline that the device is designed for. Because ambulation occurs over a continuum of speeds and inclines, this design paradigm requires tuning of a potentially prohibitively large number of parameters. This limitation motivates an alternative control framework that enables walking over a range of speeds and inclines while requiring only a limited number of tunable parameters. In this work, we present the implementation of a continuously varying kinematic controller on a custom powered knee-ankle prosthesis. The controller uses a phase variable derived from the residual thigh angle, along with real-time estimates of ground inclination and walking speed, to compute the appropriate knee and ankle joint angles from a continuous model of able-bodied kinematic data. We modify an existing phase variable architecture to allow for changes in speeds and inclines, quantify the closed-loop accuracy of the speed and incline estimation algorithms for various references, and experimentally validate the controller by observing that it replicates kinematic trends seen in able-bodied gait as speed and incline vary.
{"title":"Phase-Variable Control of a Powered Knee-Ankle Prosthesis over Continuously Varying Speeds and Inclines.","authors":"T Kevin Best, Kyle R Embry, Elliott J Rouse, Robert D Gregg","doi":"10.1109/iros51168.2021.9636180","DOIUrl":"https://doi.org/10.1109/iros51168.2021.9636180","url":null,"abstract":"<p><p>Most controllers for lower-limb robotic prostheses require individually tuned parameter sets for every combination of speed and incline that the device is designed for. Because ambulation occurs over a continuum of speeds and inclines, this design paradigm requires tuning of a potentially prohibitively large number of parameters. This limitation motivates an alternative control framework that enables walking over a range of speeds and inclines while requiring only a limited number of tunable parameters. In this work, we present the implementation of a continuously varying kinematic controller on a custom powered knee-ankle prosthesis. The controller uses a phase variable derived from the residual thigh angle, along with real-time estimates of ground inclination and walking speed, to compute the appropriate knee and ankle joint angles from a continuous model of able-bodied kinematic data. We modify an existing phase variable architecture to allow for changes in speeds and inclines, quantify the closed-loop accuracy of the speed and incline estimation algorithms for various references, and experimentally validate the controller by observing that it replicates kinematic trends seen in able-bodied gait as speed and incline vary.</p>","PeriodicalId":74523,"journal":{"name":"Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE/RSJ International Conference on Intelligent Robots and Systems","volume":"2021 ","pages":"6182-6189"},"PeriodicalIF":0.0,"publicationDate":"2021-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8890507/pdf/nihms-1726048.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10351178","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Autonomous Scanning Target Localization for Robotic Lung Ultrasound Imaging
Xihan Ma, Ziming Zhang, Haichong K Zhang
Pub Date: 2021-09-01. DOI: 10.1109/iros51168.2021.9635902
During the ongoing global COVID-19 pandemic, lung ultrasound (LUS) has emerged as an effective tool for the diagnosis and severity evaluation of respiratory diseases. However, close physical contact is unavoidable in conventional clinical ultrasound, increasing the infection risk for healthcare workers. Hence, a scanning approach involving minimal physical contact between an operator and a patient is vital to maximize the safety of clinical ultrasound procedures. A robotic ultrasound platform can satisfy this need by remotely manipulating the ultrasound probe with a robotic arm. This paper proposes a robotic LUS system that automatically identifies and executes the ultrasound probe placement pose without manual input. An RGB-D camera is used to recognize the scanning targets on the patient through a learning-based human pose estimation algorithm and to solve for the landing pose that places the probe perpendicular to the tissue surface; a position/force controller handles intraoperative probe pose adjustment to maintain the contact force. We evaluated the scanning-area localization accuracy, motion execution accuracy, and ultrasound image acquisition capability using an upper-torso mannequin and a realistic lung ultrasound phantom with healthy and COVID-19-infected lung anatomy. Results demonstrated an overall scanning target localization accuracy of 19.67 ± 4.92 mm and a probe landing pose estimation accuracy of 6.92 ± 2.75 mm in translation and 10.35 ± 2.97 deg in rotation. The contact-force-controlled robotic scanning allowed successful ultrasound image collection, capturing pathological landmarks.
{"title":"Autonomous Scanning Target Localization for Robotic Lung Ultrasound Imaging.","authors":"Xihan Ma, Ziming Zhang, Haichong K Zhang","doi":"10.1109/iros51168.2021.9635902","DOIUrl":"https://doi.org/10.1109/iros51168.2021.9635902","url":null,"abstract":"<p><p>Under the ceaseless global COVID-19 pandemic, lung ultrasound (LUS) is the emerging way for effective diagnosis and severeness evaluation of respiratory diseases. However, close physical contact is unavoidable in conventional clinical ultrasound, increasing the infection risk for health-care workers. Hence, a scanning approach involving minimal physical contact between an operator and a patient is vital to maximize the safety of clinical ultrasound procedures. A robotic ultrasound platform can satisfy this need by remotely manipulating the ultrasound probe with a robotic arm. This paper proposes a robotic LUS system that incorporates the automatic identification and execution of the ultrasound probe placement pose without manual input. An RGB-D camera is utilized to recognize the scanning targets on the patient through a learning-based human pose estimation algorithm and solve for the landing pose to attach the probe vertically to the tissue surface; A position/force controller is designed to handle intraoperative probe pose adjustment for maintaining the contact force. We evaluated the scanning area localization accuracy, motion execution accuracy, and ultrasound image acquisition capability using an upper torso mannequin and a realistic lung ultrasound phantom with healthy and COVID-19-infected lung anatomy. Results demonstrated the overall scanning target localization accuracy of 19.67 ± 4.92 mm and the probe landing pose estimation accuracy of 6.92 ± 2.75 mm in translation, 10.35 ± 2.97 deg in rotation. The contact force-controlled robotic scanning allowed the successful ultrasound image collection, capturing pathological landmarks.</p>","PeriodicalId":74523,"journal":{"name":"Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE/RSJ International Conference on Intelligent Robots and Systems","volume":"2021 ","pages":"9467-9474"},"PeriodicalIF":0.0,"publicationDate":"2021-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9373068/pdf/nihms-1822595.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10351719","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
StereoCNC: A Stereovision-guided Robotic Laser System
Guangshen Ma, Weston Ross, Patrick J Codd
Pub Date: 2021-09-01. Epub Date: 2021-12-16. DOI: 10.1109/iros51168.2021.9636050
This paper proposes an end-to-end stereovision-guided laser surgery system, referred to as StereoCNC, that can perform laser ablation on targets selected by human operators in the color image. Two digital cameras are integrated into a previously developed robotic laser system to add a color sensing modality and form the stereovision setup. A calibration method is implemented to register the coordinate frames of the stereo cameras and the laser system, modeled as a 3D-to-3D least-squares problem. The calibration reprojection errors are used to characterize a 3D error field via Gaussian Process Regression (GPR). This error field can make predictions for new point cloud data to identify an optimal position with lower calibration errors. A stereovision-guided laser ablation pipeline is proposed to optimize the positioning of the surgical site within the error field, achieved with a Genetic Algorithm search; mechanical stages then move the site to the low-error region. The pipeline is validated by experiments on phantoms with color texture and various geometric shapes. The overall targeting accuracy of the system achieved an average RMSE of 0.13 ± 0.02 mm and a maximum error of 0.34 ± 0.06 mm, as measured by pre- and post-laser-ablation images. The results show potential applications of the developed stereovision-guided robotic system for superficial laser surgery, including dermatologic applications or removal of exposed tumorous tissue in neurosurgery.
{"title":"StereoCNC: A Stereovision-guided Robotic Laser System.","authors":"Guangshen Ma, Weston Ross, Patrick J Codd","doi":"10.1109/iros51168.2021.9636050","DOIUrl":"https://doi.org/10.1109/iros51168.2021.9636050","url":null,"abstract":"<p><p>This paper proposes an End-to-End stereovision-guided laser surgery system that can conduct laser ablation on targets selected by human operators in the color image, referred as <i>StereoCNC</i>. Two digital cameras are integrated into a previously developed robotic laser system to add a color sensing modality and formulate the stereovision. A calibration method is implemented to register the coordinate frames between stereo cameras and the laser system, modelled as a 3D-to-3D least-squares problem. The calibration reprojection errors are used to characterize a 3D error field by Gaussian Process Regression (GPR). This error field can make predictions for new point cloud data to identify an optimal position with lower calibration errors. A stereovision-guided laser ablation pipeline is proposed to optimize the positioning of the surgical site within the error field, which is achieved with a Genetic Algorithm search; mechanical stages move the site to the low-error region. The pipeline is validated by the experiments on phantoms with color texture and various geometric shapes. The overall targeting accuracy of the system achieved an average RMSE of 0.13 ± 0.02 <i>mm</i> and maximum error of 0.34 ± 0.06 <i>mm</i>, as measured by pre- and post-laser ablation images. The results show potential applications of using the developed stereovision-guided robotic system for superficial laser surgery, including dermatologic applications or removal of exposed tumorous tissue in neurosurgery.</p>","PeriodicalId":74523,"journal":{"name":"Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE/RSJ International Conference on Intelligent Robots and Systems","volume":" ","pages":"540-547"},"PeriodicalIF":0.0,"publicationDate":"2021-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9358620/pdf/nihms-1814504.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"40685547","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Towards Safe In Situ Needle Manipulation for Robot Assisted Lumbar Injection in Interventional MRI
Yanzhou Wang, Gang Li, Ka-Wai Kwok, Kevin Cleary, Russell H Taylor, Iulian Iordachita
Pub Date: 2021-09-01. Epub Date: 2021-12-16. DOI: 10.1109/iros51168.2021.9636220
Lumbar injection is an image-guided procedure performed manually for the diagnosis and treatment of lower back and leg pain. Previously, we developed and verified an MR-conditional robotic solution to assist the needle insertion process. Drawing on our clinical experience, a virtual remote center of motion (RCM) constraint is implemented to enable our robot to mimic a clinician's hand motion when adjusting the needle tip position in situ. Force and image data are collected to study the needle behavior in gel phantoms during this motion, and a mechanics-based needle-tissue interaction model is proposed and evaluated to further examine the underlying physics. This work extends the commonly adopted notion of an RCM to flexible needles and introduces new motion parameters to describe the needle behavior. The model parameters can be tuned to match the experimental results to sub-millimeter accuracy, and the proposed needle manipulation method presents a safer alternative to laterally translating the needle during in situ needle adjustments.
{"title":"Towards Safe In Situ Needle Manipulation for Robot Assisted Lumbar Injection in Interventional MRI.","authors":"Yanzhou Wang, Gang Li, Ka-Wai Kwok, Kevin Cleary, Russell H Taylor, Iulian Iordachita","doi":"10.1109/iros51168.2021.9636220","DOIUrl":"https://doi.org/10.1109/iros51168.2021.9636220","url":null,"abstract":"<p><p>Lumbar injection is an image-guided procedure performed manually for diagnosis and treatment of lower back pain and leg pain. Previously, we have developed and verified an MR-Conditional robotic solution to assisting the needle insertion process. Drawing on our clinical experiences, a virtual remote center of motion (RCM) constraint is implemented to enable our robot to mimic a clinician's hand motion to adjust the needle tip position <i>in situ</i>. Force and image data are collected to study the needle behavior in gel phantoms during this motion, and a mechanics-based needle-tissue interaction model is proposed and evaluated to further examine the underlying physics. This work extends the commonly-adopted notion of an RCM for flexible needles, and introduces new motion parameters to describe the needle behavior. The model parameters can be tuned to match the experimental result to sub-millimeter accuracy, and this proposed needle manipulation method presents a safer alternative to laterally translating the needle during <i>in situ</i> needle adjustments.</p>","PeriodicalId":74523,"journal":{"name":"Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE/RSJ International Conference on Intelligent Robots and Systems","volume":" ","pages":"1835-1842"},"PeriodicalIF":0.0,"publicationDate":"2021-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8845499/pdf/nihms-1777043.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"39627523","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Analysis of Contact Stability and Contact Safety of a Robotic Intravascular Cardiac Catheter under Blood Flow Disturbances
Ran Hao, Nate Lombard Poirot, M Cenk Çavuşoğlu
Pub Date: 2020-10-01. Epub Date: 2021-02-10. DOI: 10.1109/iros45743.2020.9341527
This paper studies the contact stability and contact safety of a robotic intravascular cardiac catheter under blood flow disturbances while it is in contact with the tissue surface. A probabilistic blood flow disturbance model, in which the blood flow drag forces on the catheter body are approximated using a quasi-static model, is introduced. Using this disturbance model, probabilistic contact stability and contact safety metrics, which employ a sample-based representation of the blood flow velocity distribution, are proposed. Finally, the contact stability and contact safety of an MRI-actuated robotic catheter are analyzed with these models in an example scenario under left pulmonary inferior vein (LIV) blood flow disturbances.
{"title":"Analysis of Contact Stability and Contact Safety of a Robotic Intravascular Cardiac Catheter under Blood Flow Disturbances.","authors":"Ran Hao, Nate Lombard Poirot, M Cenk Çavuşoğlu","doi":"10.1109/iros45743.2020.9341527","DOIUrl":"10.1109/iros45743.2020.9341527","url":null,"abstract":"<p><p>This paper studies the contact stability and contact safety of a robotic intravascular cardiac catheter under blood flow disturbances while in contact with tissue surface. A probabilistic blood flow disturbance model, where the blood flow drag forces on the catheter body are approximated using a quasi-static model, is introduced. Using this blood flow disturbance model, probabilistic contact stability and contact safety metrics, employing a sample based representation of the blood flow velocity distribution, are proposed. Finally, the contact stability and contact safety of a MRI-actuated robotic catheter are analyzed using these models in a specific example scenario under left pulmonary inferior vein (LIV) blood flow disturbances.</p>","PeriodicalId":74523,"journal":{"name":"Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE/RSJ International Conference on Intelligent Robots and Systems","volume":"2020 ","pages":"3216-3223"},"PeriodicalIF":0.0,"publicationDate":"2020-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8165756/pdf/nihms-1705038.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"38975419","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Differential Image Based Robot to MRI Scanner Registration with Active Fiducial Markers for an MRI-Guided Robotic Catheter System
E Erdem Tuna, Nate Lombard Poirot, Juana Barrera Bayona, Dominique Franson, Sherry Huang, Julian Narvaez, Nicole Seiberlich, Mark Griswold, M Cenk Çavuşoğlu
Pub Date: 2020-10-01. Epub Date: 2021-02-10. DOI: 10.1109/iros45743.2020.9341043
In magnetic resonance imaging (MRI) guided robotic catheter ablation procedures, reliable tracking of the catheter within the MRI scanner is needed to navigate the catheter safely. This requires accurate registration of the catheter to the scanner. This paper presents a differential, multi-slice, image-based registration approach utilizing active fiducial coils. The proposed method would be used to preoperatively register the MRI image space with the physical catheter space. In the proposed scheme, the registration is performed with the help of a registration frame, which has a set of embedded electromagnetic coils designed to actively create MRI image artifacts. These coils are detected in the MRI scanner's coordinate system by background subtraction. The detected coil locations in each slice are weighted by artifact size and then registered to known ground-truth coil locations in the catheter's coordinate system via least-squares fitting. The proposed approach is validated using a set of target coils placed within the workspace, employing the multi-planar capabilities of the MRI scanner. The average registration and validation errors are 1.97 mm and 2.49 mm, respectively. The multi-slice approach is also compared to the single-slice method and shown to improve registration and validation accuracy by 0.45 mm and 0.66 mm, respectively.
{"title":"Differential Image Based Robot to MRI Scanner Registration with Active Fiducial Markers for an MRI-Guided Robotic Catheter System.","authors":"E Erdem Tuna, Nate Lombard Poirot, Juana Barrera Bayona, Dominique Franson, Sherry Huang, Julian Narvaez, Nicole Seiberlich, Mark Griswold, M Cenk Çavuşoğlu","doi":"10.1109/iros45743.2020.9341043","DOIUrl":"10.1109/iros45743.2020.9341043","url":null,"abstract":"<p><p>In magnetic resonance imaging (MRI) guided robotic catheter ablation procedures, reliable tracking of the catheter within the MRI scanner is needed to safely navigate the catheter. This requires accurate registration of the catheter to the scanner. This paper presents a differential, multi-slice image-based registration approach utilizing active fiducial coils. The proposed method would be used to preoperatively register the MRI image space with the physical catheter space. In the proposed scheme, the registration is performed with the help of a registration frame, which has a set of embedded electromagnetic coils designed to actively create MRI image artifacts. These coils are detected in the MRI scanner's coordinate system by background subtraction. The detected coil locations in each slice are weighted by the artifact size and then registered to known ground truth coil locations in the catheter's coordinate system via least-squares fitting. The proposed approach is validated by using a set of target coils placed withing the workspace, employing multi-planar capabilities of the MRI scanner. The average registration and validation errors are respectively computed as 1.97 mm and 2.49 mm. The multi-slice approach is also compared to the single-slice method and shown to improve registration and validation by respectively 0.45 mm and 0.66 mm.</p>","PeriodicalId":74523,"journal":{"name":"Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE/RSJ International Conference on Intelligent Robots and Systems","volume":"2020 ","pages":"2958-2964"},"PeriodicalIF":0.0,"publicationDate":"2020-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1109/iros45743.2020.9341043","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"39238889","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Auditory Feedback Effectiveness for Enabling Safe Sclera Force in Robot-Assisted Vitreoretinal Surgery: a Multi-User Study
Ali Ebrahimi, Marina Roizenblatt, Niravkumar Patel, Peter Gehlbach, Iulian Iordachita
Pub Date: 2020-10-01. Epub Date: 2021-02-10. DOI: 10.1109/iros45743.2020.9341350
Robot-assisted retinal surgery has become increasingly prevalent in recent years, in part due to the potential for robots to help surgeons improve the safety of an immensely delicate and difficult set of tasks. However, the integration of robots into retinal surgery has diminished surgeons' perception of tool-to-tissue interaction forces because of the robot's stiffness. The tactile perception of these interaction forces (sclera force) has long been a crucial source of feedback for surgeons, who rely on it to guide surgical maneuvers and to prevent damaging forces from being applied to the eye. This problem is exacerbated when unfavorable sclera forces arise from patient movements (dynamic eyeball manipulation) during surgery, which can cause the sclera forces to increase drastically. In this study we evaluate the efficacy of providing warning auditory feedback based on the level of sclera force measured by force-sensing instruments, with the intent of enhancing safety during dynamic eye manipulation in robot-assisted retinal surgery. The disturbances caused by lateral movement of the patient's head are simulated using a piezo-actuated linear stage, and the Johns Hopkins Steady-Hand Eye Robot (SHER) is then used in a multi-user experiment. Twelve participants are asked to perform a mock retinal surgery by following painted vessels inside an eye phantom using a force-sensing instrument while auditory feedback is provided. The results indicate that, when audio feedback is provided, the users are able to handle the eye motion disturbances while maintaining the sclera forces within safe boundaries.
{"title":"Auditory Feedback Effectiveness for Enabling Safe Sclera Force in Robot-Assisted Vitreoretinal Surgery: a Multi-User Study.","authors":"Ali Ebrahimi, Marina Roizenblatt, Niravkumar Patel, Peter Gehlbach, Iulian Iordachita","doi":"10.1109/iros45743.2020.9341350","DOIUrl":"https://doi.org/10.1109/iros45743.2020.9341350","url":null,"abstract":"<p><p>Robot-assisted retinal surgery has become increasingly prevalent in recent years in part due to the potential for robots to help surgeons improve the safety of an immensely delicate and difficult set of tasks. The integration of robots into retinal surgery has resulted in diminished surgeon perception of tool-to-tissue interaction forces due to robot's stiffness. The tactile perception of these interaction forces (sclera force) has long been a crucial source of feedback for surgeons who rely on them to guide surgical maneuvers and to prevent damaging forces from being applied to the eye. This problem is exacerbated when there are unfavorable sclera forces originating from patient movements (dynamic eyeball manipulation) during surgery which may cause the sclera forces to increase even drastically. In this study we aim at evaluating the efficacy of providing warning auditory feedback based on the level of sclera force measured by force sensing instruments. The intent is to enhance safety during dynamic eye manipulations in robot-assisted retinal surgery. The disturbances caused by lateral movement of patient's head are simulated using a piezo-actuated linear stage. The Johns Hopkins Steady-Hand Eye Robot (SHER), is then used in a multi-user experiment. Twelve participants are asked to perform a mock retinal surgery by following painted vessels inside an eye phantom using a force sensing instrument while auditory feedback is provided. The results indicate that the users are able to handle the eye motion disturbances while maintaining the sclera forces within safe boundaries when audio feedback is provided.</p>","PeriodicalId":74523,"journal":{"name":"Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE/RSJ International Conference on Intelligent Robots and Systems","volume":"2020 ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2020-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1109/iros45743.2020.9341350","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"39265151","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
An Optimized Tilt Mechanism for a New Steady-Hand Eye Robot
Jiahao Wu, Gang Li, Muller Urias, Niravkumar A Patel, Yun-Hui Liu, Peter Gehlbach, Russell H Taylor, Iulian Iordachita
Pub Date: 2020-10-01. Epub Date: 2021-02-10. DOI: 10.1109/iros45743.2020.9340741
Robot-assisted vitreoretinal surgery can filter surgeons' hand tremors and provide safe, accurate tool manipulation. In this paper, we report the design, optimization, and evaluation of a novel tilt mechanism for a new Steady-Hand Eye Robot (SHER). The new tilt mechanism features a four-bar linkage design with a compact structure. Its kinematic configuration is optimized to minimize the linear range of motion (LRM) required to implement a virtual remote center of motion (V-RCM) while tilting a surgical tool. Because the optimization constraints differ for the robots at the left and right sides of the human head, two configurations of this tilt mechanism are proposed. Experimental results show that the optimized tilt mechanism requires a significantly smaller LRM (e.g., 5.08 mm along the Z direction and 8.77 mm along the Y direction for the left-side robot) than the slider-crank tilt mechanism used in the previous SHER (32.39 mm along Z and 21.10 mm along Y). The feasibility of the proposed tilt mechanism is verified in a mock bilateral robot-assisted vitreoretinal surgery, and the ergonomically acceptable robot postures needed to access the surgical field are determined.
{"title":"An Optimized Tilt Mechanism for a New Steady-Hand Eye Robot.","authors":"Jiahao Wu, Gang Li, Muller Urias, Niravkumar A Patel, Yun-Hui Liu, Peter Gehlbach, Russell H Taylor, Iulian Iordachita","doi":"10.1109/iros45743.2020.9340741","DOIUrl":"10.1109/iros45743.2020.9340741","url":null,"abstract":"<p><p>Robot-assisted vitreoretinal surgery can filter surgeons' hand tremors and provide safe, accurate tool manipulation. In this paper, we report the design, optimization, and evaluation of a novel tilt mechanism for a new Steady-Hand Eye Robot (SHER). The new tilt mechanism features a four-bar linkage design and has a compact structure. Its kinematic configuration is optimized to minimize the required linear range of motion (LRM) for implementing a virtual remote center-of-motion (V-RCM) while tilting a surgical tool. Due to the different optimization constraints for the robots at the left and right sides of the human head, two configurations of this tilt mechanism are proposed. Experimental results show that the optimized tilt mechanism requires a significantly smaller LRM (e.g. 5.08 <i>mm</i> along Z direction and 8.77 <i>mm</i> along Y direction for left side robot) as compared to the slider-crank tilt mechanism used in the previous SHER (32.39 <i>mm</i> along Z direction and 21.10 <i>mm</i> along Y direction). The feasibility of the proposed tilt mechanism is verified in a mock bilateral robot-assisted vitreoretinal surgery. The ergonomically acceptable robot postures needed to access the surgical field is also determined.</p>","PeriodicalId":74523,"journal":{"name":"Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE/RSJ International Conference on Intelligent Robots and Systems","volume":"2020 ","pages":"3105-3111"},"PeriodicalIF":0.0,"publicationDate":"2020-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8130837/pdf/nihms-1700974.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"39000672","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Towards Autonomous Control of Magnetic Suture Needles
Matthew Fan, Xiaolong Liu, Kamakshi Jain, Daniel Lerner, Lamar O Mair, Irving N Weinberg, Yancy Diaz-Mercado, Axel Krieger
Pub Date: 2020-01-01. Epub Date: 2021-02-10. DOI: 10.1109/iros45743.2020.9341425
This paper proposes a magnetic needle steering controller that manipulates mesoscale magnetic suture needles to execute planned suturing motions. This is an initial step towards our research objective: enabling autonomous control of magnetic suture needles for suturing tasks in minimally invasive surgery. To demonstrate the feasibility of accurate motion control, we employ a cardinally arranged four-coil electromagnetic system and control magnetic suture needles in a two-dimensional environment, i.e., a Petri dish filled with viscous liquid. Unlike approaches that only use magnetic field gradients to control small magnetic agents under high damping conditions, the dynamics of the magnetic suture needle are investigated and encoded in the controller. Based on mathematical formulations of the magnetic force and torque applied to the needle, we develop a kinematically constrained dynamic model that controls the needle to rotate and to translate only along its central axis, mimicking the behavior of surgical sutures. A current controller for the electromagnetic system, combined with closed-loop control schemes, is designed to command the magnetic suture needles to achieve desired linear and angular velocities. To evaluate control performance, we conduct experiments including needle rotation control, needle position control using discretized trajectories, and velocity control using a time-varying circular trajectory. The experimental results demonstrate that the proposed needle steering controller can perform accurate motion control of mesoscale magnetic suture needles.
{"title":"Towards Autonomous Control of Magnetic Suture Needles.","authors":"Matthew Fan, Xiaolong Liu, Kamakshi Jain, Daniel Lerner, Lamar O Mair, Irving N Weinberg, Yancy Diaz-Mercado, Axel Krieger","doi":"10.1109/iros45743.2020.9341425","DOIUrl":"10.1109/iros45743.2020.9341425","url":null,"abstract":"<p><p>This paper proposes a magnetic needle steering controller to manipulate mesoscale magnetic suture needles for executing planned suturing motion. This is an initial step towards our research objective: enabling autonomous control of magnetic suture needles for suturing tasks in minimally invasive surgery. To demonstrate the feasibility of accurate motion control, we employ a cardinally-arranged four-coil electromagnetic system setup and control magnetic suture needles in a 2-dimensional environment, i.e., a Petri dish filled with viscous liquid. Different from only using magnetic field gradients to control small magnetic agents under high damping conditions, the dynamics of a magnetic suture needle are investigated and encoded in the controller. Based on mathematical formulations of magnetic force and torque applied on the needle, we develop a kinematically constrained dynamic model that controls the needle to rotate and only translate along its central axis for mimicking the behavior of surgical sutures. A current controller of the electromagnetic system combining with closed-loop control schemes is designed for commanding the magnetic suture needles to achieve desired linear and angular velocities. To evaluate control performance of magnetic suture needles, we conduct experiments including needle rotation control, needle position control by using discretized trajectories, and velocity control by using a time-varying circular trajectory. The experiment results demonstrate our proposed needle steering controller can perform accurate motion control of mesoscale magnetic suture needles.</p>","PeriodicalId":74523,"journal":{"name":"Proceedings of the ... IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE/RSJ International Conference on Intelligent Robots and Systems","volume":"2020 ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2020-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8389736/pdf/nihms-1721263.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"39365210","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}