Hands Collaboration Evaluation for Surgical Skills Assessment: An Information Theoretical Approach

Abed Soleymani; Mahdi Tavakoli; Farzad Aghazadeh; Yafei Ou; Hossein Rouhani; Bin Zheng; Xingyu Li

IEEE Transactions on Medical Robotics and Bionics
DOI: 10.1109/TMRB.2024.3464110
Published: 2024-09-19
URL: https://ieeexplore.ieee.org/document/10684295/
Abstract
Bimanual tasks such as needle passing and tissue cutting, in which the brain must simultaneously plan and control the movements of both hands, are common in surgery, e.g., robot-assisted minimally invasive surgery. In this study, we present a novel approach for quantifying the quality of hand coordination and correspondence in bimanual tasks, using information-theoretic concepts to build a mathematical framework that measures the collaboration strength between the two hands. The introduced method makes no assumptions about the dynamics and couplings of the robotic platform, the executed task, or human motor control. We implemented the proposed approach on the MEELS and JIGSAWS datasets, which correspond to conventional minimally invasive surgery (MIS) and robot-assisted MIS, respectively, and analyzed the advantages of hand-collaboration features for skills assessment and style recognition in robotic surgery tasks. Furthermore, we demonstrate that incorporating intuitive domain knowledge of bimanual tasks potentially paves the way for other complex applications, including, but not limited to, autonomous surgery with a high level of model explainability and interpretability. Finally, we present preliminary results suggesting that incorporating hand-collaboration features into deep learning-based classifiers reduces uncertainty, improves accuracy, and enhances the out-of-distribution robustness of the final model.
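The abstract does not spell out the information-theoretic machinery, but a natural instance of the idea is to estimate the mutual information between the two hands' motion signals as a proxy for collaboration strength. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' actual method: it computes a histogram-based mutual-information estimate between two per-frame hand-speed signals, with synthetic data standing in for real kinematics (e.g., a JIGSAWS trial).

```python
import numpy as np

def mutual_information(x, y, bins=32):
    """Histogram-based estimate of mutual information I(X; Y) in bits
    between two 1-D signals (e.g., per-frame speed of each hand).
    This is a generic estimator, not the paper's specific formulation."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()            # joint distribution p(x, y)
    px = pxy.sum(axis=1, keepdims=True)  # marginal p(x)
    py = pxy.sum(axis=0, keepdims=True)  # marginal p(y)
    nz = pxy > 0                         # skip empty bins to avoid log(0)
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

# Hypothetical example: synthetic left/right tool-tip speeds with an
# artificial coupling, standing in for real bimanual kinematics.
rng = np.random.default_rng(0)
left_speed = rng.normal(size=2000)
right_speed = 0.7 * left_speed + 0.3 * rng.normal(size=2000)
print(f"I(left; right) = {mutual_information(left_speed, right_speed):.3f} bits")
```

Under this reading, strongly coordinated hands yield a high mutual-information score, while independent hand movements yield a score near zero, which is consistent with the abstract's claim that the measure requires no model of the platform, task, or motor-control dynamics.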