{"title":"基于手势检测的聋哑人行为分析","authors":"Nirmala M.S.","doi":"10.58346/jowua.2023.i3.010","DOIUrl":null,"url":null,"abstract":"Deaf and mute people have unique communication and social challenges that make it hard to express their thoughts, needs, and ideas. Understanding people's behavior is more important to protect them and help them integrate into society. This study discusses the critical need for behavioral analysis on deaf and mute people and introduces the Automatic Behavioral Analysis Employing Gesture Detection Framework (ABA-GDF). Gesture detection technology has gained popularity recently. This emphasis may be due to its ability to overcome communication hurdles and illuminate nonverbal communication. Current methods have various challenges, including limited accuracy and adaptability. The ABA-GDF architecture comprises three phases: dataset collection, modeling, and deployment. The data collection technique includes hand signals used by deaf and quiet people. The material is then processed to partition and normalize the hand area for consistent analysis. During Modelling, feature descriptor attributes are developed to extract relevant motion information. A classifier learns and predicts using the feature vectors, enabling the framework to recognize and interpret motions and actions. Large-scale simulations of ABA-GDF showed promising results. The ABA-GDF framework achieved 92% gesture recognition accuracy on the dataset. The system's robustness is demonstrated by its capacity to understand non-verbal messages. The research showed a 15% reduction in false positives compared to earlier methods, demonstrating its real-world usefulness.","PeriodicalId":38235,"journal":{"name":"Journal of Wireless Mobile Networks, Ubiquitous Computing, and Dependable Applications","volume":"98 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-09-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Behavioural Analysis of Deaf and Mute People Using Gesture Detection\",\"authors\":\"Nirmala M.S.\",\"doi\":\"10.58346/jowua.2023.i3.010\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Deaf and mute people have unique communication and social challenges that make it hard to express their thoughts, needs, and ideas. Understanding people's behavior is more important to protect them and help them integrate into society. This study discusses the critical need for behavioral analysis on deaf and mute people and introduces the Automatic Behavioral Analysis Employing Gesture Detection Framework (ABA-GDF). Gesture detection technology has gained popularity recently. This emphasis may be due to its ability to overcome communication hurdles and illuminate nonverbal communication. Current methods have various challenges, including limited accuracy and adaptability. The ABA-GDF architecture comprises three phases: dataset collection, modeling, and deployment. The data collection technique includes hand signals used by deaf and quiet people. The material is then processed to partition and normalize the hand area for consistent analysis. During Modelling, feature descriptor attributes are developed to extract relevant motion information. A classifier learns and predicts using the feature vectors, enabling the framework to recognize and interpret motions and actions. Large-scale simulations of ABA-GDF showed promising results. The ABA-GDF framework achieved 92% gesture recognition accuracy on the dataset. 
The system's robustness is demonstrated by its capacity to understand non-verbal messages. The research showed a 15% reduction in false positives compared to earlier methods, demonstrating its real-world usefulness.\",\"PeriodicalId\":38235,\"journal\":{\"name\":\"Journal of Wireless Mobile Networks, Ubiquitous Computing, and Dependable Applications\",\"volume\":\"98 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-09-30\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Wireless Mobile Networks, Ubiquitous Computing, and Dependable Applications\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.58346/jowua.2023.i3.010\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"Computer Science\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Wireless Mobile Networks, Ubiquitous Computing, and Dependable Applications","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.58346/jowua.2023.i3.010","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"Computer Science","Score":null,"Total":0}
Behavioural Analysis of Deaf and Mute People Using Gesture Detection
Deaf and mute people face unique communication and social challenges that make it hard for them to express their thoughts, needs, and ideas. Understanding their behavior is essential to protecting them and helping them integrate into society. This study discusses the critical need for behavioral analysis of deaf and mute people and introduces the Automatic Behavioral Analysis Employing Gesture Detection Framework (ABA-GDF). Gesture detection technology has gained popularity recently, largely because it can overcome communication hurdles and illuminate nonverbal communication. Current methods face several challenges, including limited accuracy and adaptability. The ABA-GDF architecture comprises three phases: dataset collection, modeling, and deployment. Data collection gathers hand signals used by deaf and mute people. The data is then preprocessed to segment and normalize the hand region for consistent analysis. During modeling, feature-descriptor attributes are computed to extract relevant motion information. A classifier learns from and predicts with these feature vectors, enabling the framework to recognize and interpret gestures and actions. Large-scale simulations of ABA-GDF showed promising results: the framework achieved 92% gesture recognition accuracy on the dataset. Its capacity to understand non-verbal messages demonstrates the system's robustness. The research showed a 15% reduction in false positives compared to earlier methods, demonstrating its real-world usefulness.
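The abstract outlines a three-phase pipeline (hand segmentation and normalization, feature-descriptor extraction, classification) without naming concrete methods. The sketch below shows one minimal way such a pipeline could be assembled in Python; the skin-color thresholding, HOG descriptor, SVM classifier, and all function names are illustrative assumptions, not the authors' implementation.

# Illustrative sketch only: skin-color segmentation, HOG features, and an SVM
# are assumptions; the paper does not specify the descriptor or classifier.
import cv2
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score


def segment_and_normalize(frame, size=(64, 64)):
    """Isolate the hand region via a rough skin-tone mask and rescale it."""
    ycrcb = cv2.cvtColor(frame, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, (0, 135, 85), (255, 180, 135))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        # Crop to the largest skin-colored blob, assumed to be the hand.
        x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
        mask = mask[y:y + h, x:x + w]
    return cv2.resize(mask, size)  # normalized binary hand patch


def describe(patch):
    """Compute a HOG feature vector for a 64x64 normalized hand patch."""
    hog = cv2.HOGDescriptor((64, 64), (16, 16), (8, 8), (8, 8), 9)
    return hog.compute(patch).ravel()


def train_gesture_classifier(frames, labels):
    """Fit an SVM on descriptor vectors and report held-out accuracy."""
    X = np.stack([describe(segment_and_normalize(f)) for f in frames])
    X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.2,
                                              random_state=0)
    clf = SVC(kernel="rbf").fit(X_tr, y_tr)
    return clf, accuracy_score(y_te, clf.predict(X_te))

In this sketch, the held-out accuracy returned by train_gesture_classifier plays the role of the recognition accuracy reported in the abstract; any real reproduction would depend on the dataset and descriptor choices the paper actually used.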
Journal Introduction:
JoWUA is an online peer-reviewed journal that aims to provide an international forum for researchers, professionals, and industrial practitioners on all topics related to wireless mobile networks, ubiquitous computing, and their dependable applications. JoWUA publishes high-quality technical manuscripts on advances in the state of the art of wireless mobile networks, ubiquitous computing, and their dependable applications; both theoretical and practical contributions are encouraged. As an open-access journal, all articles published in JoWUA are freely accessible on its website. JoWUA publishes four issues per year (March, June, September, December), with special issues covering specific research areas organized by guest editors.