{"title":"Palmprint Phenotype Feature Extraction and Classification Based on Deep Learning.","authors":"Yu Fan, Jinxi Li, Shaoying Song, Haiguo Zhang, Sijia Wang, Guangtao Zhai","doi":"10.1007/s43657-022-00063-0","DOIUrl":null,"url":null,"abstract":"<p><p>Palmprints are of long practical and cultural interest. Palmprint principal lines, also called primary palmar lines, are one of the most dominant palmprint features and do not change over the lifespan. The existing methods utilize filters and edge detection operators to get the principal lines from the palm region of interest (ROI), but can not distinguish the principal lines from fine wrinkles. This paper proposes a novel deep-learning architecture to extract palmprint principal lines, which could greatly reduce the influence of fine wrinkles, and classify palmprint phenotypes further from 2D palmprint images. This architecture includes three modules, ROI extraction module (REM) using pre-trained hand key point location model, principal line extraction module (PLEM) using deep edge detection model, and phenotype classifier (PC) based on ResNet34 network. Compared with the current ROI extraction method, our extraction is competitive with a success rate of 95.2%. For principal line extraction, the similarity score between our extracted lines and ground truth palmprint lines achieves 0.813. And the proposed architecture achieves a phenotype classification accuracy of 95.7% based on our self-built palmprint dataset CAS_Palm.</p>","PeriodicalId":74435,"journal":{"name":"Phenomics (Cham, Switzerland)","volume":"2 4","pages":"219-229"},"PeriodicalIF":3.7000,"publicationDate":"2022-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9590507/pdf/43657_2022_Article_63.pdf","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Phenomics (Cham, Switzerland)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1007/s43657-022-00063-0","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"GENETICS & HEREDITY","Score":null,"Total":0}
引用次数: 1
Abstract
Palmprints have long been of practical and cultural interest. Palmprint principal lines, also called primary palmar lines, are among the most dominant palmprint features and do not change over the lifespan. Existing methods use filters and edge-detection operators to extract the principal lines from the palm region of interest (ROI), but they cannot distinguish principal lines from fine wrinkles. This paper proposes a novel deep-learning architecture that extracts palmprint principal lines while greatly reducing the influence of fine wrinkles, and further classifies palmprint phenotypes from 2D palmprint images. The architecture comprises three modules: a ROI extraction module (REM) using a pre-trained hand key-point localization model, a principal line extraction module (PLEM) using a deep edge-detection model, and a phenotype classifier (PC) based on the ResNet34 network. Compared with the current ROI extraction method, our extraction is competitive, with a success rate of 95.2%. For principal line extraction, the similarity score between our extracted lines and ground-truth palmprint lines reaches 0.813, and the proposed architecture achieves a phenotype classification accuracy of 95.7% on our self-built palmprint dataset CAS_Palm.
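The abstract decomposes the system into three sequential modules (REM, PLEM, PC). Below is a minimal PyTorch sketch of how such a pipeline could be wired together; it is an assumption-laden illustration, not the authors' implementation. The hand key-point ROI extractor and the deep edge-detection line extractor are left as injected callables because the abstract does not fix the specific models, and only the ResNet34 classifier head is taken directly from the text.

```python
import torch
import torch.nn as nn
from torchvision import models


class PalmprintPipeline(nn.Module):
    """Hypothetical REM -> PLEM -> PC wiring, following the abstract's module split."""

    def __init__(self, num_phenotypes: int, roi_extractor, line_extractor):
        super().__init__()
        # REM (assumed interface): any pre-trained hand key-point locator wrapped as a
        # callable that maps a hand image tensor to an aligned palm-ROI tensor.
        self.roi_extractor = roi_extractor
        # PLEM (assumed interface): any deep edge-detection model returning a
        # principal-line map for the ROI, with fine wrinkles suppressed.
        self.line_extractor = line_extractor
        # PC: ResNet34 backbone with the final layer resized to the phenotype classes,
        # as named in the abstract.
        self.classifier = models.resnet34(weights=None)
        self.classifier.fc = nn.Linear(self.classifier.fc.in_features, num_phenotypes)

    def forward(self, image: torch.Tensor) -> torch.Tensor:
        roi = self.roi_extractor(image)        # crop/align the palm region (REM)
        lines = self.line_extractor(roi)       # extract principal lines (PLEM)
        if lines.shape[1] == 1:                # ResNet34 expects 3 input channels
            lines = lines.repeat(1, 3, 1, 1)
        return self.classifier(lines)          # phenotype logits (PC)
```

In this sketch the classifier operates on the extracted line map rather than the raw ROI, which mirrors the abstract's rationale that removing fine wrinkles before classification improves phenotype discrimination.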