{"title":"基于智能手机眼图像的卷积神经网络年龄分类","authors":"A. Rattani, N. Reddy, R. Derakhshani","doi":"10.1109/BTAS.2017.8272766","DOIUrl":null,"url":null,"abstract":"Automated age classification has drawn significant interest in numerous applications such as marketing, forensics, human-computer interaction, and age simulation. A number of studies have demonstrated that age can be automatically deduced from face images. However, few studies have explored the possibility of computational estimation of age information from other modalities such as fingerprint or ocular region. The main challenge in age classification is that age progression is person-specific which depends on many factors such as genetics, health conditions, life style, and stress level. In this paper, we investigate age classification from ocular images acquired using smart-phones. Age information, though not unique to the individual, can be combined along with ocular recognition system to improve authentication accuracy or invariance to the ageing effect. To this end, we propose a convolutional neural network (CNN) architecture for the task. We evaluate our proposed CNN model on the ocular crops of the recent large-scale Adience benchmark for gender and age classification captured using smart-phones. The obtained results establish a baseline for deep learning approaches for age classification from ocular images captured by smart-phones.","PeriodicalId":372008,"journal":{"name":"2017 IEEE International Joint Conference on Biometrics (IJCB)","volume":"17 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"32","resultStr":"{\"title\":\"Convolutional neural network for age classification from smart-phone based ocular images\",\"authors\":\"A. Rattani, N. Reddy, R. Derakhshani\",\"doi\":\"10.1109/BTAS.2017.8272766\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Automated age classification has drawn significant interest in numerous applications such as marketing, forensics, human-computer interaction, and age simulation. A number of studies have demonstrated that age can be automatically deduced from face images. However, few studies have explored the possibility of computational estimation of age information from other modalities such as fingerprint or ocular region. The main challenge in age classification is that age progression is person-specific which depends on many factors such as genetics, health conditions, life style, and stress level. In this paper, we investigate age classification from ocular images acquired using smart-phones. Age information, though not unique to the individual, can be combined along with ocular recognition system to improve authentication accuracy or invariance to the ageing effect. To this end, we propose a convolutional neural network (CNN) architecture for the task. We evaluate our proposed CNN model on the ocular crops of the recent large-scale Adience benchmark for gender and age classification captured using smart-phones. 
The obtained results establish a baseline for deep learning approaches for age classification from ocular images captured by smart-phones.\",\"PeriodicalId\":372008,\"journal\":{\"name\":\"2017 IEEE International Joint Conference on Biometrics (IJCB)\",\"volume\":\"17 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"32\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2017 IEEE International Joint Conference on Biometrics (IJCB)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/BTAS.2017.8272766\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 IEEE International Joint Conference on Biometrics (IJCB)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/BTAS.2017.8272766","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Automated age classification has drawn significant interest for numerous applications such as marketing, forensics, human-computer interaction, and age simulation. A number of studies have demonstrated that age can be automatically deduced from face images. However, few studies have explored computational estimation of age from other modalities such as the fingerprint or the ocular region. The main challenge in age classification is that age progression is person-specific, depending on many factors such as genetics, health conditions, lifestyle, and stress level. In this paper, we investigate age classification from ocular images acquired using smart-phones. Age information, though not unique to an individual, can be combined with an ocular recognition system to improve authentication accuracy or to provide invariance to ageing effects. To this end, we propose a convolutional neural network (CNN) architecture for the task. We evaluate the proposed CNN model on the ocular crops of the recent large-scale Adience benchmark for gender and age classification, which was captured using smart-phones. The results establish a baseline for deep learning approaches to age classification from ocular images captured by smart-phones.
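As a rough illustration of the kind of model the abstract describes, below is a minimal PyTorch sketch of a CNN that classifies an ocular crop into one of the eight Adience age groups. The layer configuration, input size (64x64 RGB), and layer widths are assumptions made for illustration only; the abstract does not specify the authors' actual architecture.

```python
# Illustrative sketch only: the paper's exact CNN is not described in the abstract.
# Assumptions: 64x64 RGB ocular crops and the eight Adience age-group labels.
import torch
import torch.nn as nn

ADIENCE_AGE_GROUPS = ["0-2", "4-6", "8-13", "15-20", "25-32", "38-43", "48-53", "60+"]

class OcularAgeCNN(nn.Module):
    def __init__(self, num_classes: int = len(ADIENCE_AGE_GROUPS)):
        super().__init__()
        # Three conv blocks; each max-pool halves the spatial size (64 -> 32 -> 16 -> 8).
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # Classifier head over the flattened 128 x 8 x 8 feature map.
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, 256), nn.ReLU(), nn.Dropout(0.5),
            nn.Linear(256, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = OcularAgeCNN()
    dummy_batch = torch.randn(4, 3, 64, 64)   # stand-in for a batch of ocular crops
    logits = model(dummy_batch)               # shape: (4, 8)
    predicted = logits.argmax(dim=1)
    print([ADIENCE_AGE_GROUPS[i] for i in predicted])
```

In practice such a model would be trained with a cross-entropy loss over the eight age-group labels of the Adience ocular crops; the sketch above only shows the forward pass.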