{"title":"Label-Free Adaptive Gaussian Sample Consensus Framework for Learning From Perfect and Imperfect Demonstrations","authors":"Yi Hu;Zahra Samadikhoshkho;Jun Jin;Mahdi Tavakoli","doi":"10.1109/TMRB.2024.3422652","DOIUrl":null,"url":null,"abstract":"Autonomous robotic surgery represents one of the most groundbreaking advancements in medical technology. Learning from human demonstrations is promising in this domain, which facilitates the transfer of skills from humans to robots. However, the practical application of this method is challenged by the difficulty of acquiring high-quality demonstrations. Surgical tasks often involve complex manipulations and stringent precision requirements, leading to frequent errors in the demonstrations. These imperfect demonstrations adversely affect the performance of controller policies learned from the data. Unlike existing methods that rely on extensive human labeling of demonstrated trajectories, we present a novel label-free adaptive Gaussian sample consensus framework to progressively refine the control policy. We demonstrate the efficacy and practicality of our approach through two experimental studies: a handwriting classification task, providing reproducible ground-truth labels for evaluation, and an endoscopy scanning task, demonstrating the feasibility of our method in a real-world clinical context. Both experiments highlight our method’s capacity to efficiently adapt to and learn from an ongoing stream of imperfect demonstrations.","PeriodicalId":73318,"journal":{"name":"IEEE transactions on medical robotics and bionics","volume":"6 3","pages":"1093-1103"},"PeriodicalIF":3.4000,"publicationDate":"2024-07-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on medical robotics and bionics","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10582909/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, BIOMEDICAL","Score":null,"Total":0}
Abstract
Autonomous robotic surgery represents one of the most groundbreaking advancements in medical technology. Learning from human demonstrations, which facilitates the transfer of skills from humans to robots, is promising in this domain. However, the practical application of this method is challenged by the difficulty of acquiring high-quality demonstrations. Surgical tasks often involve complex manipulations and stringent precision requirements, leading to frequent errors in the demonstrations. These imperfect demonstrations adversely affect the performance of controller policies learned from the data. Unlike existing methods, which rely on extensive human labeling of demonstrated trajectories, we present a novel label-free adaptive Gaussian sample consensus framework that progressively refines the control policy. We demonstrate the efficacy and practicality of our approach through two experimental studies: a handwriting classification task, which provides reproducible ground-truth labels for evaluation, and an endoscopy scanning task, which demonstrates the feasibility of our method in a real-world clinical context. Both experiments highlight our method’s capacity to efficiently adapt to and learn from an ongoing stream of imperfect demonstrations.
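The abstract does not spell out the algorithm, but the core idea it names — an adaptive Gaussian sample consensus over demonstrations — can be illustrated with a minimal, hypothetical sketch. The function name, the residual-based inlier test, and the thresholds below are our assumptions for illustration, not the authors' implementation: a Gaussian is iteratively fit to per-demonstration residuals against the current policy, consensus inliers are kept, and the policy would then be re-fit on those inliers as new (possibly imperfect) demonstrations arrive.

```python
# Illustrative sketch only; not the authors' published code.
import numpy as np

def gaussian_consensus_filter(residuals, n_iters=10, chi2_cutoff=9.0):
    """Iteratively fit a Gaussian to per-demonstration residuals and keep inliers.

    residuals : (N,) array of per-trajectory errors w.r.t. the current policy
                (e.g., mean tracking error of each demonstration).
    Returns a boolean mask of consensus (inlier) demonstrations.
    """
    inliers = np.ones(len(residuals), dtype=bool)
    for _ in range(n_iters):
        # Fit a 1-D Gaussian to the residuals of the current consensus set.
        mu = residuals[inliers].mean()
        sigma = residuals[inliers].std() + 1e-9
        # Keep demonstrations whose squared normalized residual is small,
        # i.e., those consistent with the fitted Gaussian.
        new_inliers = ((residuals - mu) / sigma) ** 2 < chi2_cutoff
        if np.array_equal(new_inliers, inliers):
            break
        inliers = new_inliers
    return inliers

# Example: noisy demonstrations with a few gross outliers.
errs = np.concatenate([np.random.normal(0.1, 0.02, 40), [0.8, 1.2, 0.9]])
mask = gaussian_consensus_filter(errs)
print(mask.sum(), "of", len(errs), "demonstrations kept")
```

In a streaming, label-free setting of the kind the abstract describes, such a filter would be re-run each time new demonstrations arrive, with the retained subset used to update the control policy; the adaptive element lies in re-estimating the Gaussian parameters from the data rather than fixing them by hand.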