Presentation Attack Detection with Advanced CNN Models for Noncontact-based Fingerprint Systems

Sandip Purnapatra, Conor Miller-Lynch, Stephen Miner, Yu Liu, Keivan Bahmani, Soumyabrata Dey, S. Schuckers
{"title":"Presentation Attack Detection with Advanced CNN Models for Noncontact-based Fingerprint Systems","authors":"Sandip Purnapatra, Conor Miller-Lynch, Stephen Miner, Yu Liu, Keivan Bahmani, Soumyabrata Dey, S. Schuckers","doi":"10.1109/IWBF57495.2023.10157605","DOIUrl":null,"url":null,"abstract":"Touch-based fingerprint biometrics is one of the most popular biometric modalities with applications in several fields. Problems associated with touch-based techniques such as the presence of latent fingerprints and hygiene issues due to many people touching the same surface motivated the community to look for non-contact-based solutions. For the last few years, contactless fingerprint systems are on the rise and in demand because of the ability to turn any device with a camera into a fingerprint reader. Yet, before we can fully utilize the benefit of noncontact-based methods, the biometric community needs to resolve a few concerns such as the resiliency of the system against presentation attacks. One of the major obstacles is the limited publicly available data sets with inadequate spoof and live data. In this publication, we have developed a Presentation attack detection (PAD) dataset of more than 7500 four-finger images and more than 14,000 manually segmented single-fingertip images, and 10,000 synthetic fingertips (deepfakes). The PAD dataset was collected from six different Presentation Attack Instruments (PAI) of three different difficulty levels according to FIDO protocols, with five different types of PAI materials, and different smartphone cameras with manual focusing. We have utilized DenseNet-121 and NasNetMobile models and our proposed dataset to develop PAD algorithms and achieved PAD accuracy of Attack presentation classification error rate (APCER) 0.14% and Bonafide presentation classification error rate (BPCER) 0.18%. We have also reported the test results of the models against unseen spoof types to replicate uncertain real-world testing scenarios.","PeriodicalId":273412,"journal":{"name":"2023 11th International Workshop on Biometrics and Forensics (IWBF)","volume":"61 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 11th International Workshop on Biometrics and Forensics (IWBF)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IWBF57495.2023.10157605","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 5

Abstract

Touch-based fingerprint biometrics is one of the most popular biometric modalities, with applications in several fields. Problems associated with touch-based techniques, such as the presence of latent fingerprints and hygiene concerns arising from many people touching the same surface, have motivated the community to look for contactless solutions. Over the last few years, contactless fingerprint systems have been on the rise and in demand because of their ability to turn any device with a camera into a fingerprint reader. Yet, before the benefits of noncontact-based methods can be fully realized, the biometric community needs to resolve a few concerns, such as the resiliency of such systems against presentation attacks. One of the major obstacles is the limited number of publicly available datasets, with inadequate spoof and live data. In this publication, we have developed a presentation attack detection (PAD) dataset of more than 7,500 four-finger images, more than 14,000 manually segmented single-fingertip images, and 10,000 synthetic fingertips (deepfakes). The PAD dataset was collected from six different Presentation Attack Instruments (PAIs) spanning three difficulty levels according to FIDO protocols, using five different types of PAI materials and different smartphone cameras with manual focusing. We have used DenseNet-121 and NASNetMobile models with our proposed dataset to develop PAD algorithms and achieved an Attack Presentation Classification Error Rate (APCER) of 0.14% and a Bona Fide Presentation Classification Error Rate (BPCER) of 0.18%. We also report the test results of the models against unseen spoof types to replicate uncertain real-world testing scenarios.
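The abstract does not describe the training configuration, so the following is only a minimal sketch of how a DenseNet-121 backbone can be fine-tuned as a binary bona fide vs. presentation-attack classifier in Keras; the input size, classification head, optimizer settings, and the hypothetical `pad_dataset/train` directory are illustrative assumptions, not the authors' setup.

```python
# Minimal sketch (not the authors' code): fine-tuning DenseNet-121 as a
# binary bona fide vs. presentation-attack classifier with Keras.
# Input size, head design, and optimizer settings are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import DenseNet121

def build_pad_model(input_shape=(224, 224, 3)):
    # ImageNet-pretrained backbone with the original classification head removed.
    backbone = DenseNet121(include_top=False, weights="imagenet",
                           input_shape=input_shape, pooling="avg")
    backbone.trainable = True  # fine-tune all layers; early blocks could also be frozen
    x = layers.Dropout(0.2)(backbone.output)
    # Single sigmoid unit: output is interpreted as the probability of an attack.
    out = layers.Dense(1, activation="sigmoid")(x)
    model = models.Model(backbone.input, out)
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

# Hypothetical usage with directories of segmented fingertip crops:
# train_ds = tf.keras.utils.image_dataset_from_directory(
#     "pad_dataset/train", image_size=(224, 224), batch_size=32, label_mode="binary")
# model = build_pad_model()
# model.fit(train_ds, epochs=20)
```

The same recipe applies to NASNetMobile by swapping the backbone; the paper compares both architectures on the proposed dataset.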
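For reference, APCER and BPCER follow the ISO/IEC 30107-3 definitions: APCER is the proportion of attack presentations (reported per PAI species in the full standard) incorrectly classified as bona fide, and BPCER is the proportion of bona fide presentations incorrectly classified as attacks. Below is a small sketch of how these rates can be computed from model scores; the label convention (1 = attack) and the fixed 0.5 threshold are assumptions, not values from the paper.

```python
# Sketch: computing APCER and BPCER from model scores, assuming label 1 = attack,
# label 0 = bona fide, and a fixed decision threshold (threshold choice is ours).
import numpy as np

def apcer_bpcer(scores, labels, threshold=0.5):
    scores = np.asarray(scores)
    labels = np.asarray(labels)
    predicted_attack = scores >= threshold
    attacks = labels == 1
    bona_fide = labels == 0
    # APCER: fraction of attack presentations classified as bona fide.
    apcer = float(np.mean(~predicted_attack[attacks])) if attacks.any() else 0.0
    # BPCER: fraction of bona fide presentations classified as attacks.
    bpcer = float(np.mean(predicted_attack[bona_fide])) if bona_fide.any() else 0.0
    return apcer, bpcer

# Example: apcer, bpcer = apcer_bpcer(model.predict(test_ds).ravel(), test_labels)
```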