Deaf and Mute Sign Language Translator on Static Alphabets Gestures using MobileNet

Venkatesh Kandukuri, Srujal Reddy Gundedi, V. Kamble, V. Satpute

2023 2nd International Conference on Paradigm Shifts in Communications Embedded Systems, Machine Learning and Signal Processing (PCEMS), published 2023-04-05. DOI: 10.1109/PCEMS58491.2023.10136074
Sign language is the language used by deaf and mute people to communicate with others; they express their thoughts and ideas through hand movements, facial expressions, and gestures. However, interpreting sign language can be challenging for individuals who are not fluent in it. Current sign language recognition methods often rely on expensive hardware such as depth cameras or specialized gloves, which can be a barrier to widespread adoption. In this paper, we propose a low-cost solution for sign language recognition using MobileNet, a lightweight convolutional neural network architecture. This paper deals with the static letters of the American Sign Language alphabet (J and Z are excluded because they are dynamic gestures). The proposed model extracts features from hand images and classifies them, successfully predicting the letter corresponding to each sign. A fingerspelling dataset is used to train and test the model, and the proposed model achieved a recognition accuracy of 99.93%. The obtained results and graphs show that the system is able to predict the signs correctly.
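The approach described in the abstract — a MobileNet backbone classifying the 24 static ASL letters — can be sketched roughly as below. This is a minimal illustration under assumed settings (Keras/TensorFlow API, 224×224 input, a frozen backbone with a softmax head), not the authors' actual implementation; the paper does not specify these details. In practice one would load pretrained weights (`weights="imagenet"`) and fine-tune on the fingerspelling dataset.

```python
# Hypothetical sketch of a MobileNet classifier for the 24 static ASL letters.
# Hyperparameters and architecture details are illustrative assumptions,
# not taken from the paper.
import tensorflow as tf

NUM_CLASSES = 24        # 26 letters minus the dynamic J and Z
IMG_SIZE = (224, 224)   # MobileNet's default input resolution

def build_model() -> tf.keras.Model:
    # MobileNet as a feature extractor (weights=None here to keep the
    # sketch self-contained; real training would use weights="imagenet").
    base = tf.keras.applications.MobileNet(
        input_shape=IMG_SIZE + (3,), include_top=False, weights=None)
    base.trainable = False  # freeze the backbone for transfer learning

    model = tf.keras.Sequential([
        base,
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_model()
```

Training would then be a standard `model.fit(...)` call on image batches labeled with the 24 letter classes, followed by `model.predict(...)` at inference time to map a hand image to its most probable letter.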