Design of User-Independent Hand Gesture Recognition Using Multilayer Perceptron Networks and Sensor Fusion Techniques

J. G. Colli-Alfaro, Anas Ibrahim, Ana Luisa Trejos

2019 IEEE 16th International Conference on Rehabilitation Robotics (ICORR), June 2019
DOI: 10.1109/ICORR.2019.8779533
Citations: 13
Abstract
According to the World Health Organization, stroke is the third leading cause of disability. A common consequence of stroke is hemiparesis, which impairs one side of the body and affects the performance of activities of daily living. It has been shown that targeting motor impairments as early as possible with wearable mechatronic devices in robot-assisted therapy, while letting the patient remain in control of the robotic system, can improve rehabilitation outcomes. However, despite continued progress in control methods for wearable mechatronic devices, the need remains for a more natural interface that allows better control. In this work, a user-independent gesture classification method is presented, based on sensor fusion of surface electromyography (EMG) and inertial measurement unit (IMU) data. The Myo Armband was used to collect EMG and IMU data from healthy subjects. Participants performed 10 types of gestures in 4 different arm positions while wearing the Myo on their dominant limb. Data from 14 participants were used to classify the gestures with a multilayer perceptron network. Finally, the classification algorithm was tested on 5 novel users, obtaining an average accuracy of 78.94%. These results demonstrate that the proposed approach can achieve a more natural human-machine interface, allowing better control of wearable mechatronic devices during robot-assisted therapies.
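The pipeline the abstract describes, feature-level fusion of EMG and IMU signals fed to a multilayer perceptron that outputs one of 10 gesture classes, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature counts per sensor, the hidden-layer size, and the random stand-in weights are all assumptions; only the 10-class output matches the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (assumptions, not from the paper): the Myo Armband
# has 8 EMG channels; here we assume 4 time-domain features per channel and a
# 9-value IMU feature vector (accelerometer, gyroscope, orientation means).
N_EMG_FEATURES = 8 * 4
N_IMU_FEATURES = 9
N_GESTURES = 10  # 10 gesture classes, as in the study

def fuse_features(emg_feats, imu_feats):
    """Feature-level sensor fusion: concatenate EMG and IMU feature vectors."""
    return np.concatenate([emg_feats, imu_feats], axis=-1)

def mlp_forward(x, w1, b1, w2, b2):
    """One hidden layer with ReLU, softmax over the gesture classes."""
    h = np.maximum(0.0, x @ w1 + b1)
    logits = h @ w2 + b2
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))  # stable softmax
    return e / e.sum(axis=-1, keepdims=True)

# Randomly initialised weights as stand-ins for a trained network.
d_in, d_hid = N_EMG_FEATURES + N_IMU_FEATURES, 64
w1 = rng.normal(0, 0.1, (d_in, d_hid)); b1 = np.zeros(d_hid)
w2 = rng.normal(0, 0.1, (d_hid, N_GESTURES)); b2 = np.zeros(N_GESTURES)

# Classify one fused feature vector (synthetic data in place of real signals).
x = fuse_features(rng.normal(size=N_EMG_FEATURES),
                  rng.normal(size=N_IMU_FEATURES))
probs = mlp_forward(x, w1, b1, w2, b2)
predicted_gesture = int(np.argmax(probs))
```

Concatenating the per-sensor feature vectors before the network is the simplest fusion strategy; the classifier then learns cross-sensor interactions directly, which is what lets IMU-derived arm-position information compensate for position-dependent shifts in the EMG patterns.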