Sign Language Interpreter System: An alternative system for machine learning
Salma A. Essam El-Din, Mohamed A. Abd El-Ghany
2020 2nd Novel Intelligent and Leading Emerging Sciences Conference (NILES), 24 October 2020
DOI: 10.1109/NILES50944.2020.9257958
Losing the ability to speak has psychological and social impacts on affected people because of the resulting lack of proper communication. Sign Language (SL) is therefore a boon to people with hearing and speech impairments. SL has developed into a handy means of communication that forms the core of local deaf cultures. It is a visual-spatial language based on positional and visual components, such as the shape of the fingers and hands, their location and orientation, and arm and body movements. The problem is that SL is not understood by everyone, creating a communication gap between people who sign and those who do not. Multiple systematic scholarly interventions, varying according to context, have been implemented to overcome disability-related difficulties. Sign language recognition (SLR) systems based on sensory gloves, such as the proposed system, are significant innovations that capture data on the shape and movement of the human hand to bridge this communication gap. The proposed model is a glove equipped with five flex sensors interfacing with a control unit fixed on the arm; it translates American Sign Language (ASL) and Arabic Sign Language (ArSL) into both text and speech, displayed on a simple Graphical User Interface (GUI). The proposed system aims to provide an affordable and user-friendly SL translator based on Machine Learning (ML); rather than using a generic data set, it adapts to each person's hand. The system achieved a 95% recognition rate on static gestures and up to 88% on dynamic gestures.
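The abstract describes per-user adaptation: instead of training on a generic data set, the system calibrates to each wearer's hand. The paper does not publish its code, so the following is only a minimal sketch of how such calibration could work, assuming the five flex sensors yield one numeric reading each and that a nearest-template classifier stands in for the (unspecified) ML model; all function names and sensor values here are hypothetical.

```python
import math

def calibrate(samples):
    """Average repeated 5-sensor readings per gesture label into templates.

    samples: {label: [[s1..s5], [s1..s5], ...]} collected from one user.
    """
    templates = {}
    for label, readings in samples.items():
        n = len(readings)
        templates[label] = [sum(r[i] for r in readings) / n for i in range(5)]
    return templates

def classify(reading, templates):
    """Return the label whose template is nearest (Euclidean distance)."""
    return min(templates, key=lambda lbl: math.dist(reading, templates[lbl]))

# Hypothetical per-user calibration: a few samples of each static gesture,
# with raw flex-sensor values on an arbitrary ADC scale.
user_samples = {
    "A": [[820, 810, 805, 800, 790], [825, 805, 810, 795, 785]],
    "B": [[300, 310, 295, 305, 300], [310, 305, 300, 295, 310]],
}
templates = calibrate(user_samples)
print(classify([818, 808, 802, 799, 788], templates))  # -> A
```

Dynamic gestures, which the paper reports at lower accuracy, would additionally require a sequence of readings over time rather than a single snapshot.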