Sudip Hazra, Abdul Hafiz Abdul Rahaman, P. Shiakolas
{"title":"An Affordable Telerobotic System Architecture for Grasp Training and Object Grasping for Human-machine Interaction","authors":"Sudip Hazra, Abdul Hafiz Abdul Rahaman, P. Shiakolas","doi":"10.1115/1.4063072","DOIUrl":null,"url":null,"abstract":"\n Due to mobility impairment, a person might rely on wheelchairs, canes, and crutches for assistance but could face challenges when performing tasks such as grasping and manipulating objects due to limitations in reach and capability. To overcome these challenges, a multi-degree-of-freedom robotic arm with an anthropomorphic robotic hand (ARH) could be used. In this research, we propose an architecture and then implement it towards the development of an assistive system to assist a person with object grasping. The architecture interlinks three functional modules to provide three operation modes to calibrate the system, train a user on how to execute a grasp, synthesize grasps, and execute a grasp. The developed system consists of a user input and feedback glove capable of capturing user inputs and providing grasp-related vibrotactile feedback, a CoppeliaSim-based virtual environment emulating the motions of the ARH, and an underactuated ARH capable of executing grasps while sensing grasp contact locations. The operation of the developed system is evaluated to determine the ability of a person to operate it and perform a grasp using two control methods; using a synthesized grasp or under real-time continuous control. The successful evaluation validates the architecture and the developed system to provide the ability to perform a grasp. The results of the evaluation provide confidence in expanding the system capabilities and use it to develop a database of grasp trajectories of objects with different geometries.","PeriodicalId":73734,"journal":{"name":"Journal of engineering and science in medical diagnostics and therapy","volume":"100 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2023-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of engineering and science in medical diagnostics and therapy","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1115/1.4063072","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
A person with mobility impairment might rely on wheelchairs, canes, or crutches for assistance but could still face challenges when performing tasks such as grasping and manipulating objects because of limitations in reach and capability. To overcome these challenges, a multi-degree-of-freedom robotic arm with an anthropomorphic robotic hand (ARH) could be used. In this research, we propose an architecture and implement it to develop an assistive system that assists a person with object grasping. The architecture interlinks three functional modules to provide three operation modes that allow calibrating the system, training a user on how to execute a grasp, synthesizing grasps, and executing a grasp. The developed system consists of a user input-and-feedback glove that captures user inputs and provides grasp-related vibrotactile feedback, a CoppeliaSim-based virtual environment emulating the motions of the ARH, and an underactuated ARH capable of executing grasps while sensing grasp contact locations. The operation of the developed system is evaluated to determine the ability of a person to operate it and perform a grasp using two control methods: executing a synthesized grasp or operating under real-time continuous control. The successful evaluation validates that the architecture and the developed system provide the ability to perform a grasp. The results of the evaluation provide confidence for expanding the system capabilities and using it to develop a database of grasp trajectories for objects with different geometries.
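To illustrate the kind of glove-to-virtual-hand coupling the abstract describes for the real-time continuous control mode, the following is a minimal sketch only, not the authors' implementation. It assumes a CoppeliaSim scene with a simulated anthropomorphic hand and uses CoppeliaSim's ZeroMQ remote API for Python; the glove reader and the joint paths (`/hand/thumbJoint`, etc.) are hypothetical placeholders.

```python
# Hypothetical sketch: map normalized finger-flexion readings from an input
# glove to joint targets of a simulated hand in CoppeliaSim (ZeroMQ remote API).
import time
import random

from coppeliasim_zmqremoteapi_client import RemoteAPIClient


def read_glove_flexion():
    """Placeholder for the glove interface: returns one normalized
    flexion value in [0, 1] per finger (thumb through little)."""
    return [random.random() for _ in range(5)]


def main():
    client = RemoteAPIClient()   # connects to a running CoppeliaSim instance
    sim = client.require('sim')

    # Assumed joint paths of the simulated anthropomorphic hand (placeholders).
    finger_joint_paths = [
        '/hand/thumbJoint', '/hand/indexJoint', '/hand/middleJoint',
        '/hand/ringJoint', '/hand/littleJoint',
    ]
    joints = [sim.getObject(p) for p in finger_joint_paths]
    max_flexion_rad = 1.4        # assumed full-flexion joint angle

    sim.startSimulation()
    try:
        for _ in range(200):     # roughly 10 s of continuous control at 20 Hz
            flexion = read_glove_flexion()
            for handle, value in zip(joints, flexion):
                sim.setJointTargetPosition(handle, value * max_flexion_rad)
            time.sleep(0.05)
    finally:
        sim.stopSimulation()


if __name__ == '__main__':
    main()
```

In the synthesized-grasp mode described in the abstract, the same joint-target interface could instead replay a precomputed grasp trajectory rather than streaming live glove readings.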