{"title":"Dynamic Two Hand Gesture Recognition using CNN-LSTM based networks","authors":"Vaidehi Sharma, Mohita Jaiswal, Abhishek Sharma, Sandeep Saini, Raghuvir Tomar","doi":"10.1109/iSES52644.2021.00059","DOIUrl":null,"url":null,"abstract":"Millions of speech-hearing disabled persons routinely use various signs like hand shapes, movement of hands, lip, and facial expressions to communicate. Indian Sign Language(ISL) is a language that has a large vocabulary of words, which changes from region to region. Generally, there is no dataset publicly available on sequential gestures for ISL. Therefore, the authors have presented the dynamic hand gesture dataset having 33 categories, including months of the year, days of the week, and those used in day-today life. This dataset is collected with a technique called burst shots. To enable speedy evaluation, a smaller subset of the dataset is used with 12 classes represents the months of the year. It is pretty complex because of its signs, and each category contains an average of 5 to 6 gestures per class. The proposed model is designed to work on low-power embedded hardware and this paper also discusses the workflow for the deployment of the particular neural network on embedded hardware. Furthermore, the proposed model is compared with different sequential architectures to find the most suited model for dynamic hand gesture recognition.","PeriodicalId":293167,"journal":{"name":"2021 IEEE International Symposium on Smart Electronic Systems (iSES) (Formerly iNiS)","volume":"11 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE International Symposium on Smart Electronic Systems (iSES) (Formerly iNiS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/iSES52644.2021.00059","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2
Abstract
Millions of people with speech and hearing disabilities routinely communicate using signs such as hand shapes, hand movements, lip movements, and facial expressions. Indian Sign Language (ISL) has a large vocabulary of words, which varies from region to region. There is generally no publicly available dataset of sequential gestures for ISL. Therefore, the authors present a dynamic hand gesture dataset with 33 categories, including the months of the year, the days of the week, and gestures used in day-to-day life. The dataset was collected using a technique called burst shots. To enable speedy evaluation, a smaller subset of the dataset with 12 classes, representing the months of the year, is used. This subset is fairly complex because of its signs, and each class contains an average of 5 to 6 gestures. The proposed model is designed to run on low-power embedded hardware, and the paper also discusses the workflow for deploying the neural network on such hardware. Furthermore, the proposed model is compared with different sequential architectures to find the model best suited for dynamic hand gesture recognition.
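To illustrate the general shape of a CNN-LSTM gesture classifier of the kind the abstract describes, the sketch below shows a minimal Keras model: a per-frame CNN feature extractor wrapped in TimeDistributed layers, followed by an LSTM that aggregates features over the gesture sequence. It is not the authors' exact architecture; the clip length, frame size, layer widths, and 12-class output (the months-of-the-year subset) are assumptions for illustration.

```python
# Minimal CNN-LSTM sketch for dynamic hand-gesture classification.
# Assumed input: fixed-length clips of RGB frames; 12 output classes
# (months-of-the-year subset). All hyperparameters are illustrative.

import tensorflow as tf
from tensorflow.keras import layers, models

SEQ_LEN, IMG_H, IMG_W, CHANNELS = 20, 64, 64, 3   # assumed clip shape
NUM_CLASSES = 12                                   # assumed subset size

def build_cnn_lstm():
    inputs = layers.Input(shape=(SEQ_LEN, IMG_H, IMG_W, CHANNELS))

    # Per-frame CNN feature extractor, shared across all time steps.
    x = layers.TimeDistributed(layers.Conv2D(32, 3, activation="relu"))(inputs)
    x = layers.TimeDistributed(layers.MaxPooling2D())(x)
    x = layers.TimeDistributed(layers.Conv2D(64, 3, activation="relu"))(x)
    x = layers.TimeDistributed(layers.MaxPooling2D())(x)
    x = layers.TimeDistributed(layers.Flatten())(x)

    # LSTM aggregates the per-frame features over the gesture sequence.
    x = layers.LSTM(128)(x)
    x = layers.Dropout(0.5)(x)
    outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)

    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_cnn_lstm()
model.summary()
```

For deployment on low-power embedded hardware, a model of this form would typically be exported and compressed (for example via post-training quantization) before being run on the target device; the paper discusses its own deployment workflow in detail.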