{"title":"Electromyography and Speech Controlled Prototype Robotic Car using CNN Based Classifier for EMG","authors":"Zahid Ul Hassan, Nouman Bashir, Afaq Iltaf","doi":"10.1109/ETECTE55893.2022.10007092","DOIUrl":null,"url":null,"abstract":"Wearable electronic equipment is continually improving and becoming more integrated with technology for prosthesis control. These devices, which come in a variety of shapes and sizes, can detect, quantify, and perhaps use signals generated by the human body's physiological and muscular changes to control machinery. One such gadget, the MYO gesture/arm band, collects information from our forearm in the form of electromyographic (EMG) Signal, which is based on the measurement of small electrical impulses caused by ion exchange between muscle membranes, utilize these myoelectric impulses and converts them into input signals by using pre-defined motions. There is a range of tasks that may be carried out with this device and use of this device can give better results in a combination with another controlling modality. This paper addresses the use of several input modalities, including speech and myoelectric signals recorded through microphone and MYO band, respectively to control a robotic car. Hand gestures are used to control the car through MYO armband. The complete process is done by using Raspberry Pi. Classification of EMG signals is done by using Convolution Neural Network (CNN) classifier. 
Experimental results obtained as well as their accuracies for performance analysis are also presented.","PeriodicalId":131572,"journal":{"name":"2022 International Conference on Emerging Trends in Electrical, Control, and Telecommunication Engineering (ETECTE)","volume":"16 10 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-12-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 International Conference on Emerging Trends in Electrical, Control, and Telecommunication Engineering (ETECTE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ETECTE55893.2022.10007092","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Wearable electronic equipment is continually improving and becoming more integrated with technology for prosthesis control. These devices, which come in a variety of shapes and sizes, can detect, quantify, and potentially use signals generated by the human body's physiological and muscular changes to control machinery. One such device, the MYO gesture/arm band, collects information from the forearm in the form of an electromyographic (EMG) signal, based on measuring the small electrical impulses caused by ion exchange across muscle membranes; it converts these myoelectric impulses into input signals corresponding to pre-defined motions. A range of tasks can be carried out with this device, and combining it with another control modality can give better results. This paper addresses the use of several input modalities, namely speech and myoelectric signals recorded through a microphone and the MYO band, respectively, to control a robotic car. Hand gestures control the car through the MYO armband, and the complete process is handled by a Raspberry Pi. Classification of the EMG signals is performed with a Convolutional Neural Network (CNN) classifier. Experimental results and their classification accuracies for performance analysis are also presented.
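The abstract names a CNN classifier for windowed EMG data but gives no architectural details. As a minimal sketch only (this is not the authors' implementation; the 8 MYO channels, 200-sample window length, 5 gesture classes, and all layer sizes are assumptions), a 1-D CNN over multi-channel EMG windows might look like:

```python
import torch
import torch.nn as nn

class EMGCNN(nn.Module):
    """Sketch of a 1-D CNN for EMG gesture classification.

    Assumptions (not from the paper): 8 MYO channels, windows of
    200 samples, 5 gesture classes.
    """
    def __init__(self, n_channels=8, n_classes=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # collapse the time axis to one value per filter
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):
        # x: (batch, channels, samples)
        h = self.features(x).squeeze(-1)   # -> (batch, 64)
        return self.classifier(h)          # -> (batch, n_classes)

model = EMGCNN()
window = torch.randn(1, 8, 200)   # one hypothetical 200-sample EMG window
logits = model(window)            # one score per gesture class
```

The predicted gesture would then be `logits.argmax(dim=1)`, which could be mapped to a drive command for the robotic car; the actual window length, channel count, and class set depend on the authors' MYO band configuration.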