
Latest publications: 2015 IEEE 7th International Conference on Biometrics Theory, Applications and Systems (BTAS)

Attribute-based continuous user authentication on mobile devices
Pouya Samangouei, Vishal M. Patel, R. Chellappa
We present a method using facial attributes for continuous authentication of smartphone users. The binary attribute classifiers are trained using the PubFig dataset and provide compact visual descriptions of faces. The learned classifiers are applied to the image of the current user of a mobile device to extract the attributes, and authentication is then done by simply comparing the acquired attributes with the enrolled attributes of the original user. Extensive experiments on two publicly available unconstrained mobile face video datasets show that our method is able to capture meaningful attributes of faces and performs better than the previously proposed LBP-based authentication method.
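As a rough illustration of the matching step described in this abstract, the sketch below compares a probe attribute vector against an enrolled template using a simple distance and threshold. The attribute names, the averaging-based enrollment, and the distance/threshold choice are assumptions for illustration only, not the authors' implementation.

```python
import numpy as np

# Hypothetical attribute names; the paper trains binary attribute classifiers
# (e.g. on PubFig) that each emit a score per face image.
ATTRIBUTES = ["male", "eyeglasses", "round_face", "dark_hair", "beard"]

def enroll(frame_scores):
    """Average attribute scores over enrollment frames to form a template."""
    return np.mean(np.asarray(frame_scores, dtype=float), axis=0)

def authenticate(template, current_scores, threshold=0.5):
    """Accept the current user if the attribute vectors are close enough.

    Distance metric and threshold are illustrative assumptions; the paper
    simply compares the acquired attributes with the enrolled ones.
    """
    current = np.asarray(current_scores, dtype=float)
    distance = np.linalg.norm(template - current) / np.sqrt(len(template))
    return distance <= threshold, distance

if __name__ == "__main__":
    # Toy enrollment: classifier outputs in [0, 1], one column per attribute.
    enrolled = enroll([[0.9, 0.1, 0.7, 0.8, 0.2],
                       [0.8, 0.2, 0.6, 0.9, 0.1]])
    accepted, d = authenticate(enrolled, [0.85, 0.15, 0.65, 0.85, 0.15])
    print(f"accepted={accepted}, distance={d:.3f}")
```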
{"title":"Attribute-based continuous user authentication on mobile devices","authors":"Pouya Samangouei, Vishal M. Patel, R. Chellappa","doi":"10.1109/BTAS.2015.7358748","DOIUrl":"https://doi.org/10.1109/BTAS.2015.7358748","url":null,"abstract":"We present a method using facial attributes for continuous authentication of smartphone users. The binary attribute classifiers are trained using PubFig dataset and provide compact visual descriptions of faces. The learned classifiers are applied to the image of the current user of a mobile device to extract the attributes and then authentication is done by simply comparing the difference between the acquired attributes and the enrolled attributes of the original user. Extensive experiments on two publicly available unconstrained mobile face video datasets show that our method is able to capture meaningful attributes of faces and performs better than the previously proposed LBP-based authentication method.","PeriodicalId":404972,"journal":{"name":"2015 IEEE 7th International Conference on Biometrics Theory, Applications and Systems (BTAS)","volume":"125 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132665052","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 49
Robust face recognition based on saliency maps of sigma sets
Ramya Srinivasan, A. Roy-Chowdhury
We propose a robust unsupervised method for face recognition wherein saliency maps of second-order statistics are employed as image descriptors. In particular, we leverage region covariance matrices (RCM) and their enhancement based on sigma sets for constructing saliency maps of face images. Sigma sets are of low dimension, robust to rotation and illumination changes, and efficient in distance evaluation. Further, they provide a natural way to combine multiple features and hence facilitate a simple mechanism for building otherwise tedious saliency maps. Using saliency maps thus constructed as the face descriptors brings the additional advantage of emphasizing the most discriminative regions of a face, thereby improving recognition performance. We demonstrate the effectiveness of the proposed method for face photo-sketch recognition, wherein we achieve performance comparable to the state of the art without having to do sketch synthesis.
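The sigma-set construction referenced above can be sketched as follows: a region covariance matrix is built from per-pixel features and represented by the scaled columns of its Cholesky factor, yielding a low-dimensional vector that can be compared with a Euclidean distance. The feature set and the scaling constant are illustrative assumptions, and the saliency-map construction itself is omitted.

```python
import numpy as np

def region_features(patch):
    """Per-pixel feature vectors of a grayscale patch: (x, y, I, |Ix|, |Iy|).

    The exact feature set is an assumption; region covariance descriptors are
    commonly built from coordinates, intensity, and gradient magnitudes.
    """
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w]
    gy, gx = np.gradient(patch.astype(float))
    feats = np.stack([xs, ys, patch, np.abs(gx), np.abs(gy)], axis=-1)
    return feats.reshape(-1, feats.shape[-1])

def sigma_set(patch, alpha=None):
    """Sigma-set descriptor of a patch: scaled Cholesky columns of the RCM.

    With C = L L^T, the sigma set is the point set {+alpha*L_i, -alpha*L_i};
    the positive half already determines it, so we return alpha * L flattened.
    """
    F = region_features(patch)
    C = np.cov(F, rowvar=False) + 1e-6 * np.eye(F.shape[1])  # regularize
    L = np.linalg.cholesky(C)
    alpha = np.sqrt(C.shape[0]) if alpha is None else alpha
    return (alpha * L).ravel()  # low-dimensional, Euclidean-comparable vector

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a = sigma_set(rng.random((16, 16)))
    b = sigma_set(rng.random((16, 16)))
    print("descriptor length:", a.size, "distance:", np.linalg.norm(a - b))
```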
{"title":"Robust face recognition based on saliency maps of sigma sets","authors":"Ramya Srinivasan, A. Roy-Chowdhury","doi":"10.1109/BTAS.2015.7358793","DOIUrl":"https://doi.org/10.1109/BTAS.2015.7358793","url":null,"abstract":"We propose a robust unsupervised method for face recognition wherein saliency maps of second order statistics are employed as image descriptors. In particular, we leverage upon region covariance matrices (RCM) and their enhancement based on sigma sets for constructing saliency maps of face images. Sigma sets are of low dimension, robust to rotation and illumination changes and are efficient in distance evaluation. Further, they provide a natural way to combine multiple features and hence facilitate a simple mechanism for building otherwise tedious saliency maps. Using saliency maps thus constructed as the face descriptors brings in an additional advantage of emphasizing the most discriminative regions of a face and thereby improve recognition performance. We demonstrate the effectiveness of the proposed method for face photo-sketch recognition, wherein we achieve performance comparable to state-of-the-art without having to do sketch synthesis.","PeriodicalId":404972,"journal":{"name":"2015 IEEE 7th International Conference on Biometrics Theory, Applications and Systems (BTAS)","volume":"53 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130913209","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 5
Exploiting polarization-state information for cross-spectrum face recognition
Nathan J. Short, Shuowen Hu, Prudhvi K. Gurram, K. Gurton
Face recognition research has primarily focused on the visible spectrum, due to the prevalence and low cost of visible cameras. However, face recognition in the visible spectrum is sensitive to illumination variations, and is infeasible in low-light or nighttime settings. In contrast, thermal imaging acquires naturally emitted radiation from facial skin tissue, and is therefore ideal for nighttime surveillance and intelligence gathering operations. However, conventional thermal face imagery lacks textural and geometric details that are present in visible spectrum face signatures. In this work, we further explore the impact of polarimetric imaging in the LWIR spectrum for face recognition. Polarization-state information provides textural and geometric facial details unavailable with conventional thermal imaging. Since the frequency content of the conventional thermal, polarimetric thermal, and visible images is quite different, we propose a spatial correlation based procedure to optimize the filtering of polarimetric thermal and visible face images to further facilitate cross-spectrum face recognition. Additionally, we use a more extensive gallery database to more robustly demonstrate an improvement in the performance of cross-spectrum face recognition using polarimetric thermal imaging.
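One way to read the spatial-correlation-based filter optimization mentioned above is as a search over band-pass parameters that maximizes the correlation between the filtered visible and polarimetric-thermal face images. The difference-of-Gaussians filter and the parameter grid below are assumptions for illustration only, not the authors' procedure.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def bandpass(img, low_sigma, high_sigma):
    """Difference-of-Gaussians band-pass filter (illustrative choice)."""
    return gaussian_filter(img, low_sigma) - gaussian_filter(img, high_sigma)

def correlation(a, b):
    """Normalized spatial correlation between two filtered images."""
    a = (a - a.mean()) / (a.std() + 1e-8)
    b = (b - b.mean()) / (b.std() + 1e-8)
    return float(np.mean(a * b))

def select_filter(visible, polarimetric, sigma_pairs):
    """Pick the band-pass parameters that maximize cross-spectrum correlation."""
    return max(sigma_pairs,
               key=lambda p: correlation(bandpass(visible, *p),
                                         bandpass(polarimetric, *p)))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    vis = rng.random((64, 64))
    pol = gaussian_filter(vis, 1.0) + 0.1 * rng.random((64, 64))  # toy pair
    grid = [(0.5, 2.0), (1.0, 3.0), (1.0, 4.0), (2.0, 6.0)]
    print("selected (low, high) sigmas:", select_filter(vis, pol, grid))
```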
{"title":"Exploiting polarization-state information for cross-spectrum face recognition","authors":"Nathan J. Short, Shuowen Hu, Prudhvi K. Gurram, K. Gurton","doi":"10.1109/BTAS.2015.7358758","DOIUrl":"https://doi.org/10.1109/BTAS.2015.7358758","url":null,"abstract":"Face recognition research has primarily focused on the visible spectrum, due to the prevalence and low cost of visible cameras. However, face recognition in the visible spectrum is sensitive to illumination variations, and is infeasible in low-light or nighttime settings. In contrast, thermal imaging acquires naturally emitted radiation from facial skin tissue, and is therefore ideal for nighttime surveillance and intelligence gathering operations. However, conventional thermal face imagery lacks textural and geometrics details that are present in visible spectrum face signatures. In this work, we further explore the impact of polarimetric imaging in the LWIR spectrum for face recognition. Polarization-state information provides textural and geometric facial details unavailable with conventional thermal imaging. Since the frequency content of the conventional thermal, polarimetric thermal, and visible images is quite different, we propose a spatial correlation based procedure to optimize the filtering of polarimetric thermal and visible face images to further facilitate cross-spectrum face recognition. Additionally, we use a more extensive gallery database to more robustly demonstrate an improvement in the performance of cross-spectrum face recognition using polarimetric thermal imaging.","PeriodicalId":404972,"journal":{"name":"2015 IEEE 7th International Conference on Biometrics Theory, Applications and Systems (BTAS)","volume":"142 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131930059","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 12
BioEye 2015: Competition on biometrics via eye movements
Oleg V. Komogortsev, Ioannis Rigas
Biometric recognition via eye movement-driven features is an emerging field of research. Eye movement cues are characterized by their non-static nature, the encapsulation of physical and behavioral traits, and the possibility of being recorded in tandem with other modalities, e.g. the iris. The BioEye 2015 competition was organized with the aim of boosting the evolution of the eye movement biometrics field. The competition was implemented with a particular focus on the issues facing researchers in the domain of eye movement recognition, e.g. the quality of the eye movement recordings, different visual stimulus types, and the effect of template aging on the resulting recognition accuracy. This paper describes the details and the results of the BioEye 2015 competition, which provided the largest biometric database of this kind to date, containing records from 306 subjects, stimuli of two types, and recordings separated by short and long time intervals.
{"title":"BioEye 2015: Competition on biometrics via eye movements","authors":"Oleg V. Komogortsev, Ioannis Rigas","doi":"10.1109/BTAS.2015.7358750","DOIUrl":"https://doi.org/10.1109/BTAS.2015.7358750","url":null,"abstract":"Biometric recognition via eye movement-driven features is an emerging field of research. Eye movement cues are characterized by their non-static nature, the encapsulation of physical and behavioral traits, and the possibility to be recorded in tandem with other modalities, e.g. the iris. The BioEye 2015 competition was organized with the aim to boost the evolution of the eye movement biometrics field. The competition was implemented with a particular focus on the issues facing the researchers in the domain of the eye movement recognition, e.g. quality of the eye movement recordings, different visual stimulus types, and the effect of template aging on the resulting recognition accuracy. This paper describes the details and the results of the BioEye 2015 competition, which provided the largest to date biometric database containing records from 306 subjects, stimulus of two types, and recordings separated by short-time and long-time intervals.","PeriodicalId":404972,"journal":{"name":"2015 IEEE 7th International Conference on Biometrics Theory, Applications and Systems (BTAS)","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134180645","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 21
Statistical evaluation of up-to-three-attempt iris recognition
A. Czajka, K. Bowyer
Real-world biometric applications often operate in the context of an identity transaction that allows up to three attempts. That is, if a biometric sample is acquired and if it does not result in a match, the user is allowed to acquire a second sample, and if it again does not result in a match, the user is allowed to acquire a third sample. If the third sample does not result in a match, then the transaction is ended with no match. We report results of an experiment to determine whether or not successive attempts can be considered as independent samples from the same distribution, and whether and how the quality of a biometric sample changes in successive attempts. To our knowledge, this is the first published research to investigate the statistics of multi-attempt biometric transactions. We find that the common assumption that the attempt outcomes come from independent and identically distributed random variables in multi-attempt biometric transactions is incorrect.
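For context, under the independence assumption that the paper tests (and ultimately rejects), the transaction-level false reject rate of an up-to-three-attempt protocol would simply be the per-attempt rate cubed, since the transaction fails only when all attempts fail. The sketch below computes this baseline analytically and checks it by simulation; the per-attempt rate is an arbitrary illustrative value.

```python
import numpy as np

def transaction_frr_iid(per_attempt_frr, attempts=3):
    """Transaction-level false reject rate under the i.i.d. assumption.

    The paper questions exactly this assumption; this is only the analytical
    baseline: a transaction is rejected only if every attempt is rejected.
    """
    return per_attempt_frr ** attempts

def simulate(per_attempt_frr, attempts=3, n_transactions=100_000, seed=0):
    """Monte Carlo check of the analytical value, assuming independence."""
    rng = np.random.default_rng(seed)
    fails = rng.random((n_transactions, attempts)) < per_attempt_frr
    return fails.all(axis=1).mean()

if __name__ == "__main__":
    frr = 0.05  # illustrative per-attempt false reject rate
    print("analytic :", transaction_frr_iid(frr))
    print("simulated:", simulate(frr))
```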
{"title":"Statistical evaluation of up-to-three-attempt iris recognition","authors":"A. Czajka, K. Bowyer","doi":"10.1109/BTAS.2015.7358797","DOIUrl":"https://doi.org/10.1109/BTAS.2015.7358797","url":null,"abstract":"Real-world biometric applications often operate in the context of an identity transaction that allows up to three attempts. That is, if a biometric sample is acquired and if it does not result in a match, the user is allowed to acquire a second sample, and if it again does not result in a match, the user is allowed to acquire a third sample. If the third sample does not result in a match, then the transaction is ended with no match. We report results of an experiment to determine whether or not successive attempts can be considered as independent samples from the same distribution, and whether and how the quality of a biometric sample changes in successive attempts. To our knowledge, this is the first published research to investigate the statistics of multi-attempt biometric transactions. We find that the common assumption that the attempt outcomes come from independent and identically distributed random variables in multi-attempt biometric transactions is incorrect.","PeriodicalId":404972,"journal":{"name":"2015 IEEE 7th International Conference on Biometrics Theory, Applications and Systems (BTAS)","volume":"60 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126647723","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 3
Person verification via eye movement-driven text reading model
Evgeniy Abdulin, Oleg V. Komogortsev
The paper presents a reading-based eye movement biometrics model. The model is able to process passages of text and extract metrics that represent the physiological and behavioral aspects of eye movements during reading. When tested on a database of eye movements from 103 individuals, the model yielded an Equal Error Rate of 10.2%. The proposed method performed better in the template-aging scenario than comparable eye movement-driven biometrics methods.
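The Equal Error Rate reported above is the standard operating point where the false accept and false reject rates coincide. A minimal sketch of how an EER can be estimated from genuine and impostor similarity scores (with toy, synthetic scores) is shown below.

```python
import numpy as np

def equal_error_rate(genuine, impostor):
    """Estimate the Equal Error Rate (EER) from similarity scores.

    Higher score = more similar. The EER is the point where the false accept
    rate (FAR) and false reject rate (FRR) are equal; here we take the
    closest crossing over all candidate thresholds.
    """
    genuine, impostor = np.asarray(genuine, float), np.asarray(impostor, float)
    thresholds = np.unique(np.concatenate([genuine, impostor]))
    best_gap, eer = np.inf, 1.0
    for t in thresholds:
        far = np.mean(impostor >= t)  # impostors wrongly accepted
        frr = np.mean(genuine < t)    # genuine users wrongly rejected
        if abs(far - frr) < best_gap:
            best_gap, eer = abs(far - frr), (far + frr) / 2.0
    return eer

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gen = rng.normal(1.0, 0.4, 500)   # toy genuine scores
    imp = rng.normal(0.0, 0.4, 500)   # toy impostor scores
    print(f"EER ~ {equal_error_rate(gen, imp):.3f}")
```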
{"title":"Person verification via eye movement-driven text reading model","authors":"Evgeniy Abdulin, Oleg V. Komogortsev","doi":"10.1109/BTAS.2015.7358786","DOIUrl":"https://doi.org/10.1109/BTAS.2015.7358786","url":null,"abstract":"The paper presents a reading-based eye movement biometrics model. The model is able to process passages of text and extract metrics that represent the physiological and behavioral aspects of the eye movements in reading. When tested on a database of eye movements from 103 individuals, the model yielded the Equal Error Rate of 10.2%. The proposed method performed better in the template-aging scenario than comparable eye movement-driven biometrics methods.","PeriodicalId":404972,"journal":{"name":"2015 IEEE 7th International Conference on Biometrics Theory, Applications and Systems (BTAS)","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121043634","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 12
Fusing binary templates for multi-biometric cryptosystems
Guangcan Mai, M. Lim, P. Yuen
Biometric cryptosystems have been proven to be one of the promising approaches to template protection. Since most methods in this approach require binary input, extending it to multiple modalities requires binary template fusion. This paper addresses the issues of multi-biometric performance and security, and proposes a new binary template fusion method that maximizes the discriminability and entropy of the fused template by reducing bit dependency. Three publicly available datasets are used for the experiments. Experimental results show that the proposed method outperforms state-of-the-art methods.
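A minimal sketch of the general idea, fusing binary templates by concatenation and then reducing bit dependency, is given below. The greedy, correlation-based bit selection is only an illustrative stand-in; the paper's actual criterion also accounts for discriminability and entropy.

```python
import numpy as np

def fuse_templates(template_a, template_b):
    """Concatenate binary templates from two modalities into one bit string."""
    return np.concatenate([template_a, template_b]).astype(np.uint8)

def select_low_dependency_bits(training_templates, n_bits):
    """Greedily pick bits with low pairwise correlation over training data.

    This only sketches the idea of reducing bit dependency to raise the
    entropy of the fused template; it is not the paper's selection criterion.
    """
    X = np.asarray(training_templates, dtype=float)      # rows: templates
    corr = np.abs(np.corrcoef(X, rowvar=False))          # bit-bit correlation
    np.fill_diagonal(corr, 0.0)
    selected = [int(np.argmin(corr.sum(axis=1)))]        # least-correlated bit
    while len(selected) < n_bits:
        penalty = corr[:, selected].max(axis=1)          # worst correlation
        penalty[selected] = np.inf                       # skip chosen bits
        selected.append(int(np.argmin(penalty)))
    return np.array(sorted(selected))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    train = rng.integers(0, 2, size=(200, 64))  # toy fused training templates
    idx = select_low_dependency_bits(train, n_bits=16)
    probe = fuse_templates(rng.integers(0, 2, 32), rng.integers(0, 2, 32))
    print("selected bit positions:", idx)
    print("reduced template:", probe[idx])
```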
{"title":"Fusing binary templates for multi-biometric cryptosystems","authors":"Guangcan Mai, M. Lim, P. Yuen","doi":"10.1109/BTAS.2015.7358764","DOIUrl":"https://doi.org/10.1109/BTAS.2015.7358764","url":null,"abstract":"Biometric cryptosystem has been proven to be one of the promising approaches for template protection. Since most methods in this approach require binary input, to extend it for multiple modalities, binary template fusion is required. This paper addresses the issues of multi-biometrics' performance and security, and proposes a new binary template fusion method which could maximize the fused template discriminability and its entropy by reducing the bits dependency. Three publicly available datasets are used for experiments. Experimental results show that the proposed method outperforms the state of the art methods.","PeriodicalId":404972,"journal":{"name":"2015 IEEE 7th International Conference on Biometrics Theory, Applications and Systems (BTAS)","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126644588","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 8
fNIRS: A new modality for brain activity-based biometric authentication
Abdul Serwadda, V. Phoha, Sujit Poudel, Leanne M. Hirshfield, Danushka Bandara, Sarah E. Bratt, Mark R. Costa
There is a rapidly increasing amount of research on the use of brain activity patterns as a basis for biometric user verification. The vast majority of this research is based on Electroencephalogram (EEG), a technology which measures the electrical activity along the scalp. In this paper, we evaluate Functional Near-Infrared Spectroscopy (fNIRS) as an alternative approach to brain activity-based user authentication. fNIRS is centered around the measurement of light absorbed by blood and, compared to EEG, has a higher signal-to-noise ratio, is more suited for use during normal working conditions, and has a much higher spatial resolution which enables targeted measurements of specific brain regions. Based on a dataset of 50 users that was analysed using an SVM and a Naïve Bayes classifier, we show fNIRS to respectively give EERs of 0.036 and 0.046 when using our best channel configuration. Further, we present some results on the areas of the brain which demonstrated highest discriminative power. Our findings indicate that fNIRS has significant promise as a biometric authentication modality.
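As a rough analogue of the classifier evaluation described above, the sketch below trains a one-vs-rest linear SVM and a Gaussian Naive Bayes verifier on synthetic per-user feature vectors and produces verification scores. The feature extraction from raw fNIRS channels, the data layout, and the one-vs-rest setup are assumptions, not the authors' pipeline.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB

def verification_scores(features, labels, target_user):
    """Train one-vs-rest verifiers for one user and return their scores.

    Training and scoring on the same toy data is for illustration only; a
    real evaluation would use separate enrollment and test sessions.
    """
    y = (labels == target_user).astype(int)
    svm = SVC(kernel="linear").fit(features, y)
    nb = GaussianNB().fit(features, y)
    return svm.decision_function(features), nb.predict_proba(features)[:, 1]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_users, per_user, n_feat = 10, 20, 16
    labels = np.repeat(np.arange(n_users), per_user)
    # Toy per-user fNIRS feature vectors: user-specific mean plus noise.
    means = rng.normal(0, 1, (n_users, n_feat))
    X = means[labels] + 0.5 * rng.normal(0, 1, (len(labels), n_feat))
    svm_scores, nb_scores = verification_scores(X, labels, target_user=0)
    print("SVM score range:", svm_scores.min().round(2), svm_scores.max().round(2))
    print("NB genuine mean prob:", nb_scores[labels == 0].mean().round(2))
```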
{"title":"fNIRS: A new modality for brain activity-based biometric authentication","authors":"Abdul Serwadda, V. Phoha, Sujit Poudel, Leanne M. Hirshfield, Danushka Bandara, Sarah E. Bratt, Mark R. Costa","doi":"10.1109/BTAS.2015.7358763","DOIUrl":"https://doi.org/10.1109/BTAS.2015.7358763","url":null,"abstract":"There is a rapidly increasing amount of research on the use of brain activity patterns as a basis for biometric user verification. The vast majority of this research is based on Electroencephalogram (EEG), a technology which measures the electrical activity along the scalp. In this paper, we evaluate Functional Near-Infrared Spectroscopy (fNIRS) as an alternative approach to brain activity-based user authentication. fNIRS is centered around the measurement of light absorbed by blood and, compared to EEG, has a higher signal-to-noise ratio, is more suited for use during normal working conditions, and has a much higher spatial resolution which enables targeted measurements of specific brain regions. Based on a dataset of 50 users that was analysed using an SVM and a Naïve Bayes classifier, we show fNIRS to respectively give EERs of 0.036 and 0.046 when using our best channel configuration. Further, we present some results on the areas of the brain which demonstrated highest discriminative power. Our findings indicate that fNIRS has significant promise as a biometric authentication modality.","PeriodicalId":404972,"journal":{"name":"2015 IEEE 7th International Conference on Biometrics Theory, Applications and Systems (BTAS)","volume":"24 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116258894","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 21
Pace independent mobile gait biometrics
Yu Zhong, Yunbin Deng, Geoffrey S. Meltzner
Accelerometers embedded in mobile devices have shown great potential for non-obtrusive gait biometrics by directly capturing a user's characteristic locomotion. Although gait analysis using these sensors has achieved highly accurate authentication and identification performance under controlled experimental settings, the robustness of such algorithms in the presence of assorted variations typical in real world scenarios remains a major challenge. In this paper, we propose a novel pace independent mobile gait biometrics algorithm that is insensitive to variability in walking speed. Our approach also exploits recent advances in invariant mobile gait representation to be independent of sensor rotation. Performance evaluations on a realistic mobile gait dataset containing 51 subjects confirm the merits of the proposed algorithm toward practical mobile gait authentication.
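Two ingredients mentioned above, rotation invariance and pace independence, can be illustrated with a simple sketch: the 3-axis acceleration is reduced to its magnitude (which does not depend on sensor orientation), and each detected gait cycle is resampled to a fixed length so that walking speed no longer changes the representation. Peak-based cycle detection and linear resampling are illustrative choices, not the paper's algorithm.

```python
import numpy as np
from scipy.signal import find_peaks

def magnitude(acc_xyz):
    """Rotation-invariant signal: magnitude of the 3-axis acceleration."""
    return np.linalg.norm(np.asarray(acc_xyz, float), axis=1)

def pace_normalized_cycles(mag, fs, cycle_len=100):
    """Split the signal into gait cycles and resample each to a fixed length.

    Resampling every cycle to the same number of samples removes differences
    in walking speed from the representation.
    """
    peaks, _ = find_peaks(mag, distance=int(0.5 * fs))  # >= 0.5 s between steps
    cycles = []
    for start, end in zip(peaks[:-1], peaks[1:]):
        cycle = mag[start:end]
        resampled = np.interp(np.linspace(0, len(cycle) - 1, cycle_len),
                              np.arange(len(cycle)), cycle)
        cycles.append(resampled)
    return np.array(cycles)

if __name__ == "__main__":
    fs = 50.0                     # Hz, a typical phone accelerometer rate
    t = np.arange(0, 10, 1 / fs)
    # Toy walk: ~2 steps per second plus gravity and noise.
    acc = np.stack([np.sin(2 * np.pi * 2 * t),
                    0.2 * np.cos(2 * np.pi * 2 * t),
                    9.8 + 0.1 * np.random.default_rng(0).normal(size=t.size)],
                   axis=1)
    cycles = pace_normalized_cycles(magnitude(acc), fs)
    print("cycles:", cycles.shape)  # (n_cycles, cycle_len)
```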
{"title":"Pace independent mobile gait biometrics","authors":"Yu Zhong, Yunbin Deng, Geoffrey S. Meltzner","doi":"10.1109/BTAS.2015.7358784","DOIUrl":"https://doi.org/10.1109/BTAS.2015.7358784","url":null,"abstract":"Accelerometers embedded in mobile devices have shown great potential for non-obtrusive gait biometrics by directly capturing a user's characteristic locomotion. Although gait analysis using these sensors has achieved highly accurate authentication and identification performance under controlled experimental settings, the robustness of such algorithms in the presence of assorted variations typical in real world scenarios remains a major challenge. In this paper, we propose a novel pace independent mobile gait biometrics algorithm that is insensitive to variability in walking speed. Our approach also exploits recent advances in invariant mobile gait representation to be independent of sensor rotation. Performance evaluations on a realistic mobile gait dataset containing 51 subjects confirm the merits of the proposed algorithm toward practical mobile gait authentication.","PeriodicalId":404972,"journal":{"name":"2015 IEEE 7th International Conference on Biometrics Theory, Applications and Systems (BTAS)","volume":"115 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114492642","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 27
Exploiting stable and discriminative iris weight map for iris recognition under less constrained environment
Yang Hu, K. Sirlantzis, G. Howells
In this paper, we address the problem of iris recognition in less constrained environments. We propose a novel iris weight map for the iris matching stage to improve the robustness of iris recognition to the noise and degradations present in such environments. The proposed iris weight map is class specific, considering both the bit stability and bit discriminability of iris codes. It is the combination of a stability map and a discriminability map. The stability map focuses on intra-class bit stability, aiming to improve intra-class matching. It assigns more weight to the bits that are highly consistent with their noiseless estimations, which are sought via low-rank approximation. The discriminability map models the inter-class bit discriminability. It emphasizes the more discriminative bits in iris codes to improve inter-class separation via a 1-to-N strategy. The experimental results demonstrate that the proposed iris weight map achieves improved identification and verification performance compared to state-of-the-art algorithms on publicly available datasets.
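The matching stage described above amounts to a weighted comparison of iris codes. The sketch below combines a stability map and a discriminability map into a single weight map (by a simple element-wise product, which is an assumption, not the paper's construction) and uses it in a weighted fractional Hamming distance over unmasked bits.

```python
import numpy as np

def iris_weight_map(stability, discriminability):
    """Combine per-bit stability and discriminability into one weight map.

    An element-wise product is used here as an illustrative combination; the
    paper defines its own class-specific construction of the two maps.
    """
    w = np.asarray(stability, float) * np.asarray(discriminability, float)
    return w / (w.sum() + 1e-12)

def weighted_hamming(code_a, code_b, mask, weights):
    """Weighted fractional Hamming distance over the valid (unmasked) bits."""
    valid = mask.astype(bool)
    disagree = (code_a != code_b) & valid
    return weights[disagree].sum() / (weights[valid].sum() + 1e-12)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 2048
    enrolled = rng.integers(0, 2, n)
    probe = enrolled.copy()
    probe[rng.choice(n, 200, replace=False)] ^= 1  # flip 200 bits
    mask = np.ones(n, dtype=bool)                  # all bits valid in this toy
    w = iris_weight_map(rng.random(n), rng.random(n))
    print(f"weighted HD: {weighted_hamming(enrolled, probe, mask, w):.3f}")
```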
{"title":"Exploiting stable and discriminative iris weight map for iris recognition under less constrained environment","authors":"Yang Hu, K. Sirlantzis, G. Howells","doi":"10.1109/BTAS.2015.7358759","DOIUrl":"https://doi.org/10.1109/BTAS.2015.7358759","url":null,"abstract":"In this paper, we address the problem of iris recognition under less constrained environment. We propose a novel iris weight map for iris matching stage to improve the robustness of iris recognition to the noise and degradations in less constrained environment. The proposed iris weight map is class specific considering both the bit stability and bit discriminability of iris codes. It is the combination of a stability map and a discriminability map. The stability map focuses on intra-class bit stability, aiming to improve the intra-class matching. It assigns more weight to the bits that are highly consistent with their noiseless estimations which are sought via low rank approximation. The discriminability map models the inter-class bit discriminability. It emphasizes more discriminative bits in iris codes to improve the inter-class separation via a 1-to-N strategy. The experimental results demonstrate that the proposed iris weight map achieves improved identification and verification performance compared to state-of-the-art algorithms on publicly available datasets.","PeriodicalId":404972,"journal":{"name":"2015 IEEE 7th International Conference on Biometrics Theory, Applications and Systems (BTAS)","volume":"90 1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128308071","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 17