N. E. AL-Qaisy, Bilal R. Al-Kaseem, Yousif Al-Dunainawi
Title: AI-Based Portable Gesture Recognition System for Hearing Impaired People Using Wearable Sensors
Published in: 2023 15th International Conference on Developments in eSystems Engineering (DeSE)
DOI: 10.1109/DeSE58274.2023.10099999
Publication date: 2023-01-09
Citations: 2
Abstract
Recently, there has been remarkable interest in sign language recognition techniques, especially sensor-based approaches, alongside the extensive use of open-source platforms in research and development testbeds. Sign language recognition has attracted considerable attention from academia and industry because deafness is recognized as a severe, worldwide health concern. However, most recognition studies have focused on vision- or image-based systems, which are unsuitable for outdoor use and lack mobility. This paper introduces a smart glove based on wearable sensors that provides a portable, standalone system working in real time with a user-friendly interface. The presented system used modern approaches to collect and generate a new dataset using only two kinds of sensors. This dataset was employed to develop an artificial neural network (ANN) model capable of predicting alphabetic letters from hand gestures and orientation. The ANN model was trained using the Scaled Conjugate Gradient (SCG) algorithm. The obtained results showed remarkable ANN accuracy for both Arabic Sign Language (ArSL) and American Sign Language (ASL): 96% and 98%, respectively. The performance of the developed ANN model ensured its usability in real-time scenarios.
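To make the pipeline described in the abstract concrete, below is a minimal, illustrative sketch (not the authors' code) of a one-hidden-layer feedforward ANN that maps a feature vector of glove readings to a letter class. The feature layout (flex-sensor channels plus orientation angles), the layer sizes, the toy data, and the three-class reduction are all assumptions for illustration; the paper trains with the Scaled Conjugate Gradient (SCG) algorithm, for which plain backpropagation with gradient descent is substituted here to keep the example dependency-free.

```python
# Hypothetical sketch: small feedforward ANN for gesture -> letter classification.
# Assumed feature layout: flex-sensor channels + orientation angles (not from the paper).
# SCG training from the paper is replaced by plain gradient descent for simplicity.
import math
import random

random.seed(0)

N_FEATURES = 8   # assumed: e.g. 5 flex-sensor channels + 3 orientation angles
N_HIDDEN = 6     # assumed hidden-layer size
N_CLASSES = 3    # reduced from the full alphabet for illustration

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Randomly initialised weight matrices (no biases, to keep the sketch short).
W1 = [[random.uniform(-0.5, 0.5) for _ in range(N_FEATURES)] for _ in range(N_HIDDEN)]
W2 = [[random.uniform(-0.5, 0.5) for _ in range(N_HIDDEN)] for _ in range(N_CLASSES)]

def forward(x):
    """Forward pass: returns hidden activations and output activations."""
    h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in W1]
    o = [sigmoid(sum(w * hi for w, hi in zip(row, h))) for row in W2]
    return h, o

def train_step(x, target, lr=0.5):
    """One backpropagation step; returns the squared error before the update."""
    h, o = forward(x)
    err = sum((t - oi) ** 2 for t, oi in zip(target, o))
    # Output-layer deltas (sigmoid derivative is o * (1 - o)).
    d_out = [(oi - t) * oi * (1 - oi) for oi, t in zip(o, target)]
    # Hidden-layer deltas, backpropagated through W2.
    d_hid = [hi * (1 - hi) * sum(d_out[k] * W2[k][j] for k in range(N_CLASSES))
             for j, hi in enumerate(h)]
    for k in range(N_CLASSES):
        for j in range(N_HIDDEN):
            W2[k][j] -= lr * d_out[k] * h[j]
    for j in range(N_HIDDEN):
        for i in range(N_FEATURES):
            W1[j][i] -= lr * d_hid[j] * x[i]
    return err

# Toy dataset: three synthetic "gestures" with one-hot letter labels.
data = [
    ([1, 0, 0, 0, 1, 0.2, 0.1, 0.0], [1, 0, 0]),
    ([0, 1, 1, 0, 0, 0.0, 0.5, 0.1], [0, 1, 0]),
    ([0, 0, 0, 1, 1, 0.9, 0.0, 0.3], [0, 0, 1]),
]

errors = []
for epoch in range(2000):
    errors.append(sum(train_step(x, t) for x, t in data))

# The predicted letter is the index of the largest output activation.
predictions = [max(range(N_CLASSES), key=lambda k: forward(x)[1][k]) for x, _ in data]
print(f"error: {errors[0]:.3f} -> {errors[-1]:.3f}, predictions: {predictions}")
```

In the real system, `data` would be replaced by the recorded sensor dataset, the output layer would cover the full ArSL or ASL alphabet, and SCG (which adapts the step size via second-order information instead of a fixed learning rate) would replace the fixed-rate updates shown here.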