Fu-Hsiang Chan, Duan-Yu Chen, J. Hsieh, Chi-Hung Chuang
{"title":"Wrinkle of fingers based robust person identification","authors":"Fu-Hsiang Chan, Duan-Yu Chen, J. Hsieh, Chi-Hung Chuang","doi":"10.1109/ICMLC.2014.7009724","DOIUrl":null,"url":null,"abstract":"This paper proposed a novel biometric identification system through 2D fingers' geometry measurements. First, the right middle finger and index finger are captured using a CCD camera. Then the fingers are segmented out from background based on skin colors. Splitting out two fingers based on their contours, multiple features such as the length, mean width, finger shape vector and wrinkle texture are computed and consequently are used for person identification. Experiments show that our proposed method can perform well in real time with the recognition rate being up to 97%.","PeriodicalId":335296,"journal":{"name":"2014 International Conference on Machine Learning and Cybernetics","volume":"48 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2014-07-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2014 International Conference on Machine Learning and Cybernetics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICMLC.2014.7009724","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1
Abstract
This paper proposes a novel biometric identification system based on 2D finger geometry measurements. First, the right middle finger and index finger are captured with a CCD camera. The fingers are then segmented from the background using skin color. After the two fingers are separated along their contours, multiple features, including finger length, mean width, a finger shape vector, and wrinkle texture, are computed and used for person identification. Experiments show that the proposed method runs in real time and achieves a recognition rate of up to 97%.
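The abstract outlines a pipeline of skin-color segmentation followed by contour-based geometric feature extraction. The following is a minimal sketch, not the authors' implementation, illustrating that kind of pipeline with OpenCV; the HSV skin thresholds, the input filename, and the length/width proxy are illustrative assumptions rather than values from the paper.

```python
# Illustrative sketch of skin-color segmentation and simple finger geometry
# features; thresholds and feature choices are assumptions, not from the paper.
import cv2
import numpy as np

def segment_skin(image_bgr):
    """Return a binary mask of skin-colored pixels (assumed HSV range)."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    lower = np.array([0, 40, 60], dtype=np.uint8)     # assumed lower skin bound
    upper = np.array([25, 180, 255], dtype=np.uint8)  # assumed upper skin bound
    mask = cv2.inRange(hsv, lower, upper)
    # Remove small speckles so contours correspond to whole fingers/hand.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    return mask

def finger_geometry_features(mask):
    """Compute crude length/width features from the largest skin contour."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    finger = max(contours, key=cv2.contourArea)
    # Oriented bounding box: its longer side approximates finger length,
    # its shorter side approximates mean width.
    (_, _), (w, h), _ = cv2.minAreaRect(finger)
    length, mean_width = max(w, h), min(w, h)
    return np.array([length, mean_width])

if __name__ == "__main__":
    img = cv2.imread("finger.jpg")  # hypothetical input image
    features = finger_geometry_features(segment_skin(img))
    print(features)
```

A real system in the spirit of the paper would additionally split the two fingers by their contours and append a shape vector and wrinkle-texture descriptor before matching against enrolled templates.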