{"title":"Hand Gesture Recognition Based on Shape Context Analysis","authors":"S. Qaisar, M. Krichen, A. Mihoub","doi":"10.1109/CAIDA51941.2021.9425200","DOIUrl":null,"url":null,"abstract":"The technological advancement is evolving the human–computer interaction (HCI). The goal is to ameliorate the HCI to a level where computers can be interacted in a natural way. It is a demanding aim and keeps the contemporary HCI systems complex and challenging. This paper aims to develop an effective hand gesture identification piloted HCI. It is realizable by three stages of preprocessing, features extraction and classification. The system functionality is studied by using a colored images database. Each incoming instance presents a hand gesture. Firstly it is subtracted from the background template to focus on the intended hand gesture. Afterward the subtracted image is enhanced and then converted into the grayscale one which is then thresholded by converting it in a binary image. This segmented version is further enhanced by using the morphological filters. The features are extracted by using the grayscale pixel values and shape context analysis (SC). Gestures are automatically recognized by using the k-Nearest Neighbor (k-NN) classification algorithm. The system achieves 83.3% of gesture recognition precision. The classification decisions are conveyed to the front-end embedded controller for systematic actuations and actions.","PeriodicalId":272573,"journal":{"name":"2021 1st International Conference on Artificial Intelligence and Data Analytics (CAIDA)","volume":"113 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-04-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 1st International Conference on Artificial Intelligence and Data Analytics (CAIDA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CAIDA51941.2021.9425200","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Technological advancement is continually evolving human–computer interaction (HCI). The goal is to improve HCI to a level where computers can be interacted with in a natural way. This is a demanding aim and keeps contemporary HCI systems complex and challenging. This paper aims to develop an effective HCI system driven by hand gesture identification. It is realized in three stages: preprocessing, feature extraction, and classification. The system functionality is studied using a database of color images, where each incoming instance presents a hand gesture. First, the background template is subtracted from the image to focus on the intended hand gesture. The subtracted image is then enhanced and converted to grayscale, which is thresholded into a binary image. This segmented version is further refined using morphological filters. Features are extracted from the grayscale pixel values and via shape context (SC) analysis. Gestures are recognized automatically with the k-Nearest Neighbor (k-NN) classification algorithm. The system achieves a gesture recognition precision of 83.3%. Classification decisions are conveyed to a front-end embedded controller for systematic actuations and actions.
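The abstract outlines a three-stage pipeline (preprocessing, feature extraction, k-NN classification). The sketch below illustrates one plausible realization of that pipeline, assuming OpenCV and scikit-learn; the specific threshold method, kernel sizes, shape-context bin counts, feature dimensions, and k value are illustrative assumptions, not the authors' reported settings.

```python
# Minimal sketch of a background-subtraction + shape-context + k-NN gesture
# pipeline, loosely following the stages named in the abstract. All parameter
# values are illustrative assumptions.
import cv2
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def preprocess(image_bgr, background_bgr):
    """Background subtraction, grayscale conversion, thresholding,
    and morphological cleanup."""
    # Subtract the background template to isolate the hand region.
    diff = cv2.absdiff(image_bgr, background_bgr)
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    # Threshold the grayscale image into a binary (segmented) image.
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Morphological opening/closing to suppress noise and fill small holes.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    binary = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)
    binary = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)
    return gray, binary

def shape_context_descriptor(binary, n_points=100, r_bins=5, theta_bins=12):
    """Simplified shape-context style descriptor: log-polar histograms over
    sampled contour points, pooled into one fixed-length vector."""
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return np.zeros(r_bins * theta_bins)
    contour = max(contours, key=cv2.contourArea).reshape(-1, 2).astype(float)
    idx = np.linspace(0, len(contour) - 1, n_points).astype(int)
    pts = contour[idx]
    # Pairwise distances and angles between the sampled contour points.
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    ang = np.arctan2(pts[:, None, 1] - pts[None, :, 1],
                     pts[:, None, 0] - pts[None, :, 0])
    mean_d = d[d > 0].mean()
    r_edges = np.logspace(np.log10(0.125), np.log10(2.0), r_bins + 1) * mean_d
    hist = np.zeros((r_bins, theta_bins))
    for i in range(n_points):
        mask = np.arange(n_points) != i
        r_idx = np.digitize(d[i, mask], r_edges) - 1
        t_idx = ((ang[i, mask] + np.pi) / (2 * np.pi) * theta_bins).astype(int) % theta_bins
        valid = (r_idx >= 0) & (r_idx < r_bins)
        np.add.at(hist, (r_idx[valid], t_idx[valid]), 1)
    hist /= hist.sum() + 1e-9
    return hist.ravel()

def extract_features(image_bgr, background_bgr, size=(32, 32)):
    """Concatenate down-sampled grayscale pixel values with the shape descriptor."""
    gray, binary = preprocess(image_bgr, background_bgr)
    pixels = cv2.resize(gray, size).ravel() / 255.0
    return np.concatenate([pixels, shape_context_descriptor(binary)])

# Usage sketch: X_train / y_train would be built by calling extract_features
# on the gesture image database described in the paper.
# knn = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
# predictions = knn.predict(X_test)
```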