Exploring Earable-Based Passive User Authentication via Interpretable In-Ear Breathing Biometrics
Feiyu Han; Panlong Yang; Yuanhao Feng; Haohua Du; Xiang-Yang Li
IEEE Transactions on Mobile Computing, vol. 23, no. 12, pp. 15238-15255, published 2024-09-03
DOI: 10.1109/TMC.2024.3453412 (https://ieeexplore.ieee.org/document/10663927/)
Abstract
As earable devices have become indispensable in people's daily lives, earable-based user authentication has attracted widespread attention. In this work, we explore novel in-ear breathing biometrics and design an earable-based authentication approach, named BreathSign, which takes advantage of the inward-facing microphones on commercial earphones to capture in-ear breathing sounds for passive authentication. To amplify the differences among individuals, we model the process of breathing-sound generation, transmission, and reception. Based on this model, we derive hard-to-forge physical-level features from in-ear breathing sounds as biometrics. Furthermore, to eliminate the impact of breathing behavioral patterns (e.g., duration and intensity), we design a triplet network model to extract breathing-behavior-independent features, and we design an online user-template update mechanism for long-term authentication. Extensive experiments with 35 healthy subjects were conducted to evaluate the performance of BreathSign. The results show that the system achieves average authentication accuracies of 93.15%, 98.06%, and 99.74% using one, five, and nine breathing cycles, respectively. Regarding resistance to spoofing attacks, BreathSign achieves an average equal error rate (EER) of approximately 3.5%. Compared with other behavior-based authentication schemes, BreathSign does not require users to perform complex movements or postures; only effortless breathing is needed for authentication, and it can be easily implemented on commercial earphones with high usability and enhanced security.
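The core idea in the abstract is a triplet network that embeds breathing sounds so that recordings of the same user cluster together regardless of breathing duration or intensity. A minimal sketch of that training setup follows; the encoder layout, embedding dimension, input shape, and margin are illustrative assumptions, not BreathSign's published design.

```python
# Sketch of a triplet-style embedding network for breathing sounds.
# Layer sizes, the 64-d embedding, and the margin are hypothetical choices;
# BreathSign's actual architecture and features are defined in the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BreathEncoder(nn.Module):
    """Maps a spectrogram of one breathing cycle to an L2-normalized embedding."""
    def __init__(self, emb_dim: int = 64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # collapse time/frequency axes
        )
        self.fc = nn.Linear(32, emb_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.conv(x).flatten(1)
        return F.normalize(self.fc(z), dim=1)  # unit-norm embeddings

encoder = BreathEncoder()
triplet_loss = nn.TripletMarginLoss(margin=0.5)  # margin is an assumption

# anchor/positive: same user, different breathing behavior; negative: another user
anchor   = torch.randn(8, 1, 40, 100)  # (batch, channel, mel bins, frames)
positive = torch.randn(8, 1, 40, 100)
negative = torch.randn(8, 1, 40, 100)

loss = triplet_loss(encoder(anchor), encoder(positive), encoder(negative))
loss.backward()
```

Drawing anchor/positive pairs from the same user but different breathing styles (e.g., deep vs. shallow breaths) is what would push the embedding toward behavior-independent biometric content rather than behavioral patterns.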
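The accuracy gain from one cycle (93.15%) to nine cycles (99.74%) suggests that per-cycle match scores are combined before the accept/reject decision, and the abstract also mentions an online user-template update mechanism. The cosine-similarity matcher, score averaging, and moving-average update below are hedged assumptions sketching how such a pipeline could look, not the paper's verified decision rule.

```python
import torch
import torch.nn.functional as F

def authenticate(template: torch.Tensor, cycle_embs: torch.Tensor,
                 threshold: float = 0.8) -> bool:
    """Accept if the mean cosine similarity over breathing cycles passes a threshold.

    template:   (emb_dim,) enrolled user embedding
    cycle_embs: (n_cycles, emb_dim) embeddings of observed breathing cycles
    The 0.8 threshold is a placeholder; in practice it would be tuned on a
    development set to the desired operating point (e.g., near the EER).
    """
    scores = F.cosine_similarity(cycle_embs, template.unsqueeze(0), dim=1)
    return scores.mean().item() >= threshold

def update_template(template: torch.Tensor, new_emb: torch.Tensor,
                    alpha: float = 0.1) -> torch.Tensor:
    """One possible online template update: an exponential moving average applied
    after a confident accept, so the template tracks gradual drift over time."""
    return F.normalize((1 - alpha) * template + alpha * new_emb, dim=0)
```

Averaging scores over more breathing cycles suppresses per-cycle noise, which is consistent with the reported accuracy improving as the number of cycles grows.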
Journal Introduction
IEEE Transactions on Mobile Computing addresses key technical issues related to various aspects of mobile computing. This includes (a) architectures, (b) support services, (c) algorithm/protocol design and analysis, (d) mobile environments, (e) mobile communication systems, (f) applications, and (g) emerging technologies. Topics of interest span a wide range, covering aspects like mobile networks and hosts, mobility management, multimedia, operating system support, power management, online and mobile environments, security, scalability, reliability, and emerging technologies such as wearable computers, body area networks, and wireless sensor networks. The journal serves as a comprehensive platform for advancements in mobile computing research.