G. Aldehim, Radwa Marzouk, M. Al-Hagery, A. Hilal, Amani A. Alneil
{"title":"Design of Information Feedback Firefly Algorithm with a Nested Deep Learning Model for Intelligent Gesture Recognition of Visually Disabled People","authors":"G. Aldehim, Radwa Marzouk, M. Al-Hagery, A. Hilal, Amani A. Alneil","doi":"10.57197/jdr-2023-0025","DOIUrl":null,"url":null,"abstract":"Gesture recognition is a developing topic in current technologies. The focus is to detect human gestures by utilizing mathematical methods for human–computer interaction. Some modes of human–computer interaction are touch screens, keyboard, mouse, etc. All these gadgets have their merits and demerits while implementing versatile hardware in computers. Gesture detection is one of the vital methods to construct user-friendly interfaces. Generally, gestures are created from any bodily state or motion but typically originate from the hand or face. Therefore, this manuscript designs an Information Feedback Firefly Algorithm with Nested Deep Learning (IFBFFA-NDL) model for intelligent gesture recognition of visually disabled people. The presented IFBFFA-NDL technique exploits the concepts of DL with a metaheuristic hyperparameter tuning strategy for the recognition process. To generate a collection of feature vectors, the IFBFFA-NDL technique uses the NASNet model. For optimal hyperparameter selection of the NASNet model, the IFBFFA algorithm is used. To recognize different types of gestures, a nested long short-term memory classification model was used. For exhibiting the improvised gesture detection efficiency of the IFBFFA-NDL technique, a detailed comparative result analysis was conducted and the outcomes highlighted the improved recognition rate of the IFBFFA-NDL technique as 99.73% compared to recent approaches.","PeriodicalId":46073,"journal":{"name":"Scandinavian Journal of Disability Research","volume":"29 1","pages":""},"PeriodicalIF":1.7000,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Scandinavian Journal of Disability Research","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.57197/jdr-2023-0025","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"REHABILITATION","Score":null,"Total":0}
Citations: 0
Abstract
Gesture recognition is a rapidly developing topic in current technologies. The goal is to detect human gestures using mathematical methods for human–computer interaction. Common modes of human–computer interaction include touch screens, keyboards, and mice; each of these devices has its own merits and drawbacks when integrated into computer hardware. Gesture detection is one of the vital methods for constructing user-friendly interfaces. In general, gestures can be created from any bodily state or motion, but they typically originate from the hand or face. Therefore, this manuscript designs an Information Feedback Firefly Algorithm with Nested Deep Learning (IFBFFA-NDL) model for intelligent gesture recognition for visually disabled people. The presented IFBFFA-NDL technique combines deep learning (DL) with a metaheuristic hyperparameter tuning strategy for the recognition process. To generate a collection of feature vectors, the IFBFFA-NDL technique uses the NASNet model. For optimal hyperparameter selection of the NASNet model, the IFBFFA algorithm is applied. To recognize different types of gestures, a nested long short-term memory (LSTM) classification model is used. To demonstrate the improved gesture detection performance of the IFBFFA-NDL technique, a detailed comparative result analysis was conducted; the outcomes highlight an improved recognition rate of 99.73% for the IFBFFA-NDL technique compared with recent approaches.
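The abstract outlines a pipeline in which NASNet supplies feature vectors and a nested LSTM classifies the gesture. The sketch below is a minimal, hypothetical stand-in for that pipeline, not the authors' implementation: it assumes Keras' NASNetMobile as the backbone, a plain stacked LSTM in place of the nested LSTM, 224x224 RGB frames, 16 frames per gesture clip, and 10 gesture classes. The IFBFFA hyperparameter search (sketched after this block) would sit on top of such a model, selecting values such as the learning rate.

```python
# Minimal sketch of a NASNet-feature + LSTM-classifier pipeline.
# Assumptions (not from the paper): NASNetMobile backbone, stacked LSTM in place
# of the nested LSTM, 224x224 RGB frames, 16 frames per clip, 10 gesture classes.
import numpy as np
from tensorflow.keras import layers, models
from tensorflow.keras.applications import NASNetMobile

NUM_CLASSES = 10          # assumed number of gesture classes
SEQ_LEN = 16              # assumed frames per gesture clip
FRAME_SHAPE = (224, 224, 3)

# 1) NASNet backbone used purely as a per-frame feature extractor (kept frozen here).
backbone = NASNetMobile(include_top=False, pooling="avg",
                        input_shape=FRAME_SHAPE, weights=None)
backbone.trainable = False

# 2) Apply the backbone to every frame, then classify the sequence with stacked LSTMs.
clip_in = layers.Input(shape=(SEQ_LEN, *FRAME_SHAPE))
frame_features = layers.TimeDistributed(backbone)(clip_in)   # (batch, SEQ_LEN, feat_dim)
x = layers.LSTM(128, return_sequences=True)(frame_features)
x = layers.LSTM(64)(x)
out = layers.Dense(NUM_CLASSES, activation="softmax")(x)

model = models.Model(clip_in, out)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Smoke test on random data (replace with a real gesture dataset).
x_dummy = np.random.rand(2, SEQ_LEN, *FRAME_SHAPE).astype("float32")
y_dummy = np.random.randint(0, NUM_CLASSES, size=(2,))
model.fit(x_dummy, y_dummy, epochs=1, verbose=0)
```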
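The hyperparameter tuning step relies on a firefly-type metaheuristic. The abstract does not detail the "information feedback" modification, so the sketch below uses the standard firefly algorithm with a hypothetical smooth objective standing in for validation accuracy over two illustrative hyperparameters (log10 learning rate and a dropout rate); in practice the objective would train and evaluate the model above for each candidate.

```python
# Standard firefly algorithm used as a hyperparameter search (stand-in for IFBFFA).
# The objective is a hypothetical surrogate for validation accuracy.
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    # Hypothetical surrogate with a peak near lr = 1e-3 and dropout = 0.3.
    log_lr, dropout = x
    return np.exp(-((log_lr + 3.0) ** 2) / 2.0 - ((dropout - 0.3) ** 2) / (2 * 0.2 ** 2))

def firefly_search(obj, bounds, n_fireflies=15, n_iter=50,
                   beta0=1.0, gamma=0.1, alpha=0.2):
    dim = len(bounds)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    x = rng.uniform(lo, hi, size=(n_fireflies, dim))    # firefly positions
    brightness = np.array([obj(xi) for xi in x])        # higher = better
    for _ in range(n_iter):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if brightness[j] > brightness[i]:
                    # Move firefly i toward the brighter firefly j;
                    # attractiveness decays with squared distance.
                    r2 = np.sum((x[i] - x[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    x[i] += beta * (x[j] - x[i]) + alpha * rng.uniform(-0.5, 0.5, dim)
                    x[i] = np.clip(x[i], lo, hi)
                    brightness[i] = obj(x[i])
        alpha *= 0.97                                    # gradually damp the random walk
    best = np.argmax(brightness)
    return x[best], brightness[best]

best_x, best_val = firefly_search(objective, bounds=[(-5.0, -1.0), (0.0, 0.6)])
print("best log10(lr), dropout:", best_x, "surrogate score:", best_val)
```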