Title: A Framework for 3D Hand Tracking and Gesture Recognition using Elements of Genetic Programming
Authors: A. El-Sawah, C. Joslin, N. Georganas, E. Petriu
Published in: Fourth Canadian Conference on Computer and Robot Vision (CRV '07), 2007-05-28
DOI: 10.1109/CRV.2007.3 (https://doi.org/10.1109/CRV.2007.3)
Citations: 33
Abstract
In this paper we present a framework for 3D hand tracking and dynamic gesture recognition using a single camera. Hand tracking is performed in a two-step process: we first generate 3D hand posture hypotheses using geometric and kinematic inverse transformations, and then validate the hypotheses by projecting the postures onto the image plane and comparing the projected model with the ground truth using a probabilistic observation model. Dynamic gesture recognition is performed using a Dynamic Bayesian Network model. The framework utilizes elements of soft computing to resolve the ambiguity inherent in vision-based tracking: the hand tracking module produces a fuzzy hand posture output, and the gesture recognition module feeds back potential posture hypotheses.
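The hypothesize-project-validate loop described in the abstract can be sketched in a minimal form: generate candidate 3D postures, project them onto the image plane, and weight each candidate by a probabilistic observation model, yielding a fuzzy (weighted) posture output rather than a single hard decision. The sketch below is illustrative only; the camera intrinsics, the point-based hand representation, and the Gaussian likelihood are assumptions, not the paper's actual model.

```python
import math

def project(points3d, focal=500.0, cx=320.0, cy=240.0):
    """Pinhole projection of 3D model points onto the image plane.
    The intrinsics (focal, cx, cy) are hypothetical placeholder values."""
    return [(focal * x / z + cx, focal * y / z + cy) for x, y, z in points3d]

def observation_likelihood(projected, observed, sigma=5.0):
    """Gaussian observation model: compare projected model points
    against observed image features (stand-in for the paper's
    probabilistic observation model)."""
    log_lik = 0.0
    for (px, py), (ox, oy) in zip(projected, observed):
        d2 = (px - ox) ** 2 + (py - oy) ** 2
        log_lik += -d2 / (2.0 * sigma ** 2)
    return math.exp(log_lik / len(projected))

def track_step(hypotheses, observed):
    """Score each 3D posture hypothesis by projecting it and evaluating
    the observation model; return normalized (posture, weight) pairs,
    i.e. a fuzzy posture output."""
    scored = [(h, observation_likelihood(project(h), observed))
              for h in hypotheses]
    total = sum(w for _, w in scored) or 1.0
    return [(h, w / total) for h, w in scored]

# Toy usage: the hypothesis matching the observation dominates the weights.
true_posture = [(0.0, 0.0, 10.0), (0.1, 0.0, 10.0)]
observed = project(true_posture)
wrong_posture = [(0.5, 0.5, 10.0), (0.6, 0.5, 10.0)]
fuzzy = track_step([true_posture, wrong_posture], observed)
```

In the full framework, the gesture recognition module's Dynamic Bayesian Network would feed its predicted postures back into the hypothesis set passed to `track_step`, closing the loop the abstract describes.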