{"title":"基于主成分分析的手势检测与识别","authors":"Nasser H. Dardas, E. Petriu","doi":"10.1109/CIMSA.2011.6059935","DOIUrl":null,"url":null,"abstract":"This paper presents a real time system, which includes detecting and tracking bare hand in cluttered background using skin detection and hand postures contours comparison algorithm after face subtraction, and recognizing hand gestures using Principle Components Analysis (PCA). In the training stage, a set of hand postures images with different scales, rotation and lighting conditions are trained. Then, the most eigenvectors of training images are determined, and the training weights are calculated by projecting each training image onto the most eigenvectors. In the testing stage, for every frame captured from a webcam, the hand gesture is detected using our algorithm, then the small image that contains the detected hand gesture is projected onto the most eigenvectors of training images to form its test weights. Finally, the minimum Euclidean distance is determined between the test weights and the training weights of each training image to recognize the hand gesture.","PeriodicalId":422972,"journal":{"name":"2011 IEEE International Conference on Computational Intelligence for Measurement Systems and Applications (CIMSA) Proceedings","volume":"25 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2011-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"67","resultStr":"{\"title\":\"Hand gesture detection and recognition using principal component analysis\",\"authors\":\"Nasser H. Dardas, E. Petriu\",\"doi\":\"10.1109/CIMSA.2011.6059935\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper presents a real time system, which includes detecting and tracking bare hand in cluttered background using skin detection and hand postures contours comparison algorithm after face subtraction, and recognizing hand gestures using Principle Components Analysis (PCA). In the training stage, a set of hand postures images with different scales, rotation and lighting conditions are trained. Then, the most eigenvectors of training images are determined, and the training weights are calculated by projecting each training image onto the most eigenvectors. In the testing stage, for every frame captured from a webcam, the hand gesture is detected using our algorithm, then the small image that contains the detected hand gesture is projected onto the most eigenvectors of training images to form its test weights. 
Finally, the minimum Euclidean distance is determined between the test weights and the training weights of each training image to recognize the hand gesture.\",\"PeriodicalId\":422972,\"journal\":{\"name\":\"2011 IEEE International Conference on Computational Intelligence for Measurement Systems and Applications (CIMSA) Proceedings\",\"volume\":\"25 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2011-10-27\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"67\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2011 IEEE International Conference on Computational Intelligence for Measurement Systems and Applications (CIMSA) Proceedings\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CIMSA.2011.6059935\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2011 IEEE International Conference on Computational Intelligence for Measurement Systems and Applications (CIMSA) Proceedings","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CIMSA.2011.6059935","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
This paper presents a real-time system that detects and tracks a bare hand against a cluttered background, using skin detection and a hand-posture contour comparison algorithm after face subtraction, and recognizes hand gestures using Principal Component Analysis (PCA). In the training stage, a set of hand-posture images with different scales, rotations, and lighting conditions is used for training. The most significant eigenvectors of the training images are then determined, and the training weights are calculated by projecting each training image onto these eigenvectors. In the testing stage, for every frame captured from a webcam, the hand gesture is detected using our algorithm; the small image containing the detected hand gesture is then projected onto the most significant eigenvectors of the training images to form its test weights. Finally, the hand gesture is recognized by finding the minimum Euclidean distance between the test weights and the training weights of each training image.
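The recognition stage the abstract describes follows the familiar eigenface-style PCA pipeline: project training images onto the leading eigenvectors to obtain training weights, project the detected hand crop the same way, and classify by minimum Euclidean distance. The sketch below illustrates that idea only; it assumes the detection stage has already produced fixed-size grayscale crops of the hand region, and the image size, number of retained eigenvectors, class labels, and function names are illustrative, not taken from the paper.

```python
# Minimal PCA ("eigen-posture") recognition sketch, assuming flattened grayscale
# hand crops are already available. Parameters and labels are illustrative.
import numpy as np

IMG_SIZE = 32 * 32        # each hand-posture crop flattened to a vector (assumption)
NUM_COMPONENTS = 20       # number of principal eigenvectors to keep (assumption)

def train_pca(train_images):
    """train_images: (N, IMG_SIZE) array of flattened training postures."""
    mean = train_images.mean(axis=0)
    centered = train_images - mean
    # SVD of the centered data gives the eigenvectors of the covariance matrix
    # (rows of Vt), ordered by decreasing variance.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    eigvecs = vt[:NUM_COMPONENTS]                  # (K, IMG_SIZE)
    train_weights = centered @ eigvecs.T           # project each training image
    return mean, eigvecs, train_weights

def recognize(test_image, mean, eigvecs, train_weights, train_labels):
    """Project the detected hand crop and return the label of the nearest
    training image in eigenspace (minimum Euclidean distance)."""
    test_weights = (test_image - mean) @ eigvecs.T
    distances = np.linalg.norm(train_weights - test_weights, axis=1)
    return train_labels[int(np.argmin(distances))]

if __name__ == "__main__":
    # Synthetic stand-in data: 40 training crops of 4 hypothetical posture classes.
    rng = np.random.default_rng(0)
    train = rng.random((40, IMG_SIZE)).astype(np.float32)
    labels = np.repeat(["fist", "palm", "point", "little"], 10)
    mean, eigvecs, weights = train_pca(train)
    print(recognize(train[5], mean, eigvecs, weights, labels))  # -> "fist"
```

In practice the training crops would come from the skin-detection and contour-comparison stage the abstract mentions, not from random data as in this stand-in example.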