Development of a Wearable Sensor Glove for Real-Time Sign Language Translation
Radzi Ambar, Safyzan Salim, Mohd Helmy Abd Wahab, Muhammad Mahadi Abdul Jamil, Tan Ching Phing
Annals of Emerging Technologies in Computing, published 5 October 2023. DOI: 10.33166/aetic.2023.05.003 (https://doi.org/10.33166/aetic.2023.05.003)
Abstract
This article describes the development of a wearable sensor glove for sign language translation, together with an Android-based application that displays words and produces speech for the translated gestures in real time. The objective of this project is to enable a conversation between a deaf person and another person who does not know sign language. The glove is composed of five (5) flexible sensors and an inertial sensor. This article also elaborates on the development of the Android-based application, built with the MIT App Inventor software, which produces the words and speech of the translated gestures in real time. The sign language gestures were measured by the sensors and transmitted to an Arduino Nano microcontroller to be translated into words. The processed data were then transmitted to the Android application via Bluetooth, which displayed the words and produced the corresponding speech. Preliminary experimental results demonstrated that the glove, via the developed application, successfully displayed the words and produced the speech of thirteen (13) translated sign language gestures. In the future, it is hoped that further upgrades can produce a device that assists a deaf person in communicating with hearing people without over-reliance on sign language interpreters.
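To illustrate the data path described above (flex sensors read by the Arduino Nano, a gesture translated into a word, and the word sent to the Android app over Bluetooth), the following is a minimal Arduino-style sketch. It is not the authors' code: the pin assignments, bend threshold, gesture table, and the use of a SoftwareSerial-connected Bluetooth module are assumptions made for illustration, and the inertial-sensor handling is omitted.

```cpp
// Illustrative sketch only (assumed wiring and thresholds, not the published design):
// read five flex sensors, match the bend pattern against a small gesture table,
// and send the translated word to the Android app over a Bluetooth serial link.
#include <SoftwareSerial.h>

const int FLEX_PINS[5] = {A0, A1, A2, A3, A4};  // one flex sensor per finger (assumed pins)
const int BEND_THRESHOLD = 600;                 // ADC value above which a finger counts as bent (assumed)

SoftwareSerial bt(10, 11);  // RX, TX for an HC-05-style Bluetooth module (assumed wiring)

// Example gesture table: finger bend pattern (thumb..pinky) -> word.
// The actual 13 gestures and their patterns are not published in the abstract.
struct Gesture {
  bool bent[5];
  const char *word;
};

const Gesture GESTURES[] = {
  {{false, false, false, false, false}, "HELLO"},  // open hand (example mapping)
  {{true,  true,  true,  true,  true }, "YES"},    // closed fist (example mapping)
  {{true,  false, false, true,  true }, "THANKS"}, // index and middle extended (example mapping)
};

void setup() {
  Serial.begin(9600);
  bt.begin(9600);  // baud rate assumed
}

void loop() {
  // Sample all five flex sensors and reduce each reading to bent / not bent.
  bool bent[5];
  for (int i = 0; i < 5; i++) {
    bent[i] = analogRead(FLEX_PINS[i]) > BEND_THRESHOLD;
  }

  // Look for a matching gesture and transmit its word to the Android app,
  // which then displays the word and speaks it via text-to-speech.
  for (const Gesture &g : GESTURES) {
    bool match = true;
    for (int i = 0; i < 5; i++) {
      if (bent[i] != g.bent[i]) { match = false; break; }
    }
    if (match) {
      bt.println(g.word);
      break;
    }
  }

  delay(200);  // simple pause between readings to avoid flooding the app
}
```

On the phone side, an MIT App Inventor application can receive each line over its Bluetooth client component, show the word in a label, and pass it to the built-in text-to-speech component, which matches the display-and-speak behaviour described in the abstract.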