Development of a flexible endoscopic robot with autonomous tracking control ability using machine vision and deep learning
Sen Qian, Jianxi Zhang, Zongkun Pei, Xiantao Sun, Zhe Wu
Mechanical Sciences, 2 April 2024. https://doi.org/10.5194/ms-15-223-2024
Abstract
A flexible endoscopic robot is designed to address the difficulty that assisting surgeons have in maintaining a stable visual field during traditional endoscopic surgery. Based on a geometric derivation, a motion control method is established under the remote center of motion (RCM) constraint of the robot system, and a set of circular trajectories is planned for it. The robot's RCM error during operation and the actual three-dimensional trajectory of the robot end are measured with a motion capture system. The robot end is controlled by a heterogeneous primary–secondary teleoperation algorithm based on position increments. Finally, the RTMDet deep-learning object detection algorithm is selected through comparative experiments to identify and locate surgical instruments, and autonomous tracking control is achieved under visual guidance. During autonomous tracking, the RCM error remains below 1 mm, which meets actual surgical requirements.
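The abstract reports that the RCM error stays below 1 mm during autonomous tracking. As a rough illustration of how such an error can be quantified, the sketch below treats the RCM error as the distance from the fixed incision (trocar) point to the instrument shaft axis; the point names and the NumPy helper are assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

def rcm_error(p_rcm, p_tip, shaft_dir):
    """Distance from the fixed RCM (trocar) point to the instrument shaft axis.

    p_rcm     : (3,) fixed remote-center-of-motion point in the world frame
    p_tip     : (3,) current position of the instrument tip
    shaft_dir : (3,) vector along the instrument shaft
    """
    shaft_dir = shaft_dir / np.linalg.norm(shaft_dir)
    # Vector from the tip to the RCM point, with its along-shaft component removed.
    v = p_rcm - p_tip
    perpendicular = v - np.dot(v, shaft_dir) * shaft_dir
    return np.linalg.norm(perpendicular)

# Example: a shaft passing 0.5 mm beside the trocar point (units in mm).
print(rcm_error(np.array([0.0, 0.0, 0.0]),
                np.array([0.0, 0.5, -80.0]),
                np.array([0.0, 0.0, 1.0])))  # ~0.5
```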
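The robot end is commanded through a position-increment teleoperation scheme. A minimal sketch of that idea follows, assuming a simple Cartesian mapping with a hypothetical motion-scaling factor; the paper's actual heterogeneous primary–secondary mapping is not reproduced here.

```python
import numpy as np

def increment_command(master_prev, master_now, slave_now, scale=0.5):
    """Map a primary-device displacement to a secondary (robot-end) target position.

    Only the increment of the primary pose is transferred, so the two devices
    can have different workspaces and zero positions (heterogeneous mapping).
    scale is a hypothetical motion-scaling factor.
    """
    delta = np.asarray(master_now, dtype=float) - np.asarray(master_prev, dtype=float)
    return np.asarray(slave_now, dtype=float) + scale * delta

# Example: the operator moves the primary device 4 mm along x;
# the robot end moves 2 mm in the same direction.
target = increment_command([0, 0, 0], [4.0, 0, 0], [100.0, 20.0, 50.0])
print(target)  # [102.  20.  50.]
```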
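RTMDet is distributed with the MMDetection toolbox, so instrument detection can be run through its high-level inference API. The sketch below is a generic MMDetection 3.x inference call; the config and checkpoint paths, the confidence threshold, and the use of a generic pretrained model rather than the authors' instrument-trained weights are assumptions for illustration.

```python
from mmdet.apis import init_detector, inference_detector

# Hypothetical paths: in practice these would point to an RTMDet config and
# a checkpoint fine-tuned on surgical-instrument images.
config_file = 'configs/rtmdet/rtmdet_tiny_8xb32-300e_coco.py'
checkpoint_file = 'checkpoints/rtmdet_tiny.pth'

model = init_detector(config_file, checkpoint_file, device='cuda:0')
result = inference_detector(model, 'endoscope_frame.jpg')

# MMDetection 3.x returns a DetDataSample; pred_instances holds boxes, scores, labels.
instances = result.pred_instances
keep = instances.scores > 0.5
boxes = instances.bboxes[keep]  # (N, 4) xyxy boxes of detected instruments
# The box centre can then feed the visual-guidance loop that keeps the
# instrument near the middle of the endoscopic view.
```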