{"title":"Toward Highly Flexible Inter-User Calibration of Myoelectric Control Models With User-Defined Hand Gestures","authors":"Yangyang Yuan;Zihao Chen;Jionghui Liu;ChihHong Chou;Chenyun Dai;Xinyu Jiang","doi":"10.1109/TMRB.2024.3504737","DOIUrl":null,"url":null,"abstract":"Myoelectric control models enabling accurate hand gesture recognition via electromyography (EMG) have attracted increasing attentions in rehabilitation robotics. Adapting pre-trained models to new users is a main challenge in real world applications due to the inter-user different EMG characteristics. Most previous transfer learning approaches employed a rigid model calibration process, usually in a supervised manner with ground truth labels, or in an unsupervised manner but still requiring users to perform pre-defined hand gestures to update model parameters. We argue that such a rigid model calibration process lacks flexibility and limit the translation of myoelectric control into real world practice. In this work, we gradually “flexibilize” the standard model calibration process toward a highly flexible version, which does not require the labels of calibration data, and can be performed on only a subset of pre-defined hand gestures or even unknown user-defined hand gestures. We identify those key components contributing to the performance difference along the way. Compared with the supervised method, the unsupervised model calibration even contributed to a 10% improvement (<inline-formula> <tex-math>${p}\\lt 0.05$ </tex-math></inline-formula>) in case where only a subset of gesture categories were available for model calibration. 
Moreover, the unsupervised model calibration achieved a highest recognition accuracy of 86.57% using unknown user-defined gestures, with no significant difference compared to the accuracy with pre-defined gestures (<inline-formula> <tex-math>${p}\\gt 0.05$ </tex-math></inline-formula>).","PeriodicalId":73318,"journal":{"name":"IEEE transactions on medical robotics and bionics","volume":"7 1","pages":"359-367"},"PeriodicalIF":3.4000,"publicationDate":"2024-11-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on medical robotics and bionics","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10764714/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, BIOMEDICAL","Score":null,"Total":0}
Toward Highly Flexible Inter-User Calibration of Myoelectric Control Models With User-Defined Hand Gestures
Myoelectric control models that enable accurate hand gesture recognition via electromyography (EMG) have attracted increasing attention in rehabilitation robotics. Adapting pre-trained models to new users is a major challenge in real-world applications due to inter-user differences in EMG characteristics. Most previous transfer learning approaches employed a rigid model calibration process, usually in a supervised manner with ground-truth labels, or in an unsupervised manner that still required users to perform pre-defined hand gestures to update model parameters. We argue that such a rigid calibration process lacks flexibility and limits the translation of myoelectric control into real-world practice. In this work, we gradually “flexibilize” the standard model calibration process toward a highly flexible version that does not require labels for the calibration data and can be performed on only a subset of pre-defined hand gestures, or even on unknown user-defined hand gestures. Along the way, we identify the key components contributing to the performance differences. Compared with the supervised method, unsupervised model calibration even contributed a 10% improvement ($p < 0.05$) in the case where only a subset of gesture categories was available for calibration. Moreover, unsupervised calibration achieved a recognition accuracy of up to 86.57% using unknown user-defined gestures, with no significant difference from the accuracy obtained with pre-defined gestures ($p > 0.05$).