3D Facial Tracking and User Authentication Through Lightweight Single-Ear Biosensors

Yi Wu; Xiande Zhang; Tianhao Wu; Bing Zhou; Phuc Nguyen; Jian Liu

IEEE Transactions on Mobile Computing, vol. 24, no. 2, pp. 749-762, October 2024. DOI: 10.1109/TMC.2024.3470339. https://ieeexplore.ieee.org/document/10701549/
Facial landmark tracking and 3D reconstruction have gained considerable attention due to their numerous applications, such as human-computer interaction, facial expression analysis, and emotion recognition. Traditional approaches require users to be confined to a particular location and to face a camera under constrained recording conditions, which prevents them from being deployed in many application scenarios involving human motion. In this paper, we propose the first single-earpiece lightweight biosensing system, BioFace-3D, that can unobtrusively, continuously, and reliably sense entire facial movements, track 2D facial landmarks, and further render 3D facial animations. Our single-earpiece biosensing system leverages cross-modal transfer learning to transfer the knowledge embodied in a high-grade visual facial landmark detection model to the low-grade biosignal domain. After training, BioFace-3D can directly perform continuous 3D facial reconstruction from the biosignals, without any visual input. Additionally, by utilizing biosensors, we also showcase the potential for capturing both behavioral aspects, such as facial gestures, and distinctive individual physiological traits, establishing a comprehensive two-factor authentication/identification framework. Extensive experiments involving 16 participants demonstrate that BioFace-3D can accurately track 53 major facial landmarks with only 1.85 mm average error and 3.38% normalized mean error, comparable to most state-of-the-art camera-based solutions. Experiments also show that the system can authenticate users with high accuracy (e.g., over 99.8% within two trials for three gestures in series) and a low false positive rate (e.g., less than 0.24%), and is robust to various types of attacks.
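To make the cross-modal transfer learning idea concrete, the sketch below shows one way a frozen visual landmark detector (teacher) could supervise a biosignal network (student) on synchronized recordings, so that the student later predicts landmarks from biosignals alone. This is a minimal PyTorch illustration under assumed channel counts, window lengths, and network shapes; it is not the authors' architecture.

```python
# Minimal sketch of cross-modal transfer: a pretrained visual landmark
# detector labels synchronized video frames, and a biosignal network learns
# to predict the same landmarks from earpiece biosignals. Channel count,
# window length, and layer sizes below are illustrative assumptions.
import torch
import torch.nn as nn

NUM_LANDMARKS = 53          # the paper tracks 53 major facial landmarks
BIOSIGNAL_CHANNELS = 8      # assumed channel count for the ear biosensor
WINDOW_LEN = 128            # assumed samples per biosignal window

class BiosignalLandmarkNet(nn.Module):
    """Student network: biosignal window -> 2D landmark coordinates."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv1d(BIOSIGNAL_CHANNELS, 64, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.Conv1d(64, 128, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.head = nn.Linear(128, NUM_LANDMARKS * 2)

    def forward(self, x):                      # x: (batch, channels, time)
        feat = self.encoder(x).squeeze(-1)     # (batch, 128)
        return self.head(feat).view(-1, NUM_LANDMARKS, 2)

def distillation_step(student, teacher, frames, biosignals, optimizer):
    """One training step: the frozen teacher labels the video frame, and the
    student regresses those landmarks from the synchronized biosignals.
    `teacher` is assumed to return a (batch, 53, 2) landmark tensor."""
    with torch.no_grad():
        target = teacher(frames)               # visual pseudo-labels
    pred = student(biosignals)
    loss = nn.functional.mse_loss(pred, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

After training against the teacher's outputs, only the biosignal branch is needed at inference time, which matches the abstract's claim that no visual input is required after training.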
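The reported normalized mean error can be read as the average per-landmark Euclidean error divided by a reference distance on the face. The sketch below shows that conventional computation, assuming an interocular-style normalization that may differ from the paper's exact choice.

```python
# Conventional normalized mean error (NME) for facial landmarks: mean
# per-landmark Euclidean error divided by a normalizing distance (commonly
# the interocular distance). The paper's exact normalization is not stated
# here, so this is the standard formulation, not necessarily the authors'.
import numpy as np

def normalized_mean_error(pred, gt, norm_dist):
    """pred, gt: (num_landmarks, 2) arrays in the same units;
    norm_dist: scalar reference distance (e.g., interocular distance)."""
    per_landmark = np.linalg.norm(pred - gt, axis=1)   # error per landmark
    return per_landmark.mean() / norm_dist

# Illustrative only: a 1.85 mm mean error against a ~55 mm reference
# distance gives an NME of roughly 3.4%, in line with the reported 3.38%.
```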
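The two-factor framework combines a behavioral factor (how a facial gesture is performed) with a physiological factor (user-specific traits in the biosignal). The toy sketch below illustrates that fusion rule only; the similarity functions, thresholds, and templates are placeholders and do not reproduce the paper's pipeline.

```python
# Toy illustration of two-factor authentication: an attempt passes only if
# both the behavioral and the physiological factor match the enrolled user.
# All scores, thresholds, and templates here are placeholder assumptions.
from dataclasses import dataclass

@dataclass
class EnrolledProfile:
    gesture_template: list[float]        # behavioral template (illustrative)
    physio_template: list[float]         # physiological template (illustrative)

def similarity(a, b):
    """Placeholder similarity in (0, 1]; a real system would compare learned
    embeddings rather than use this toy inverse-L2 score."""
    dist = sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return 1.0 / (1.0 + dist)

def authenticate(sample_gesture, sample_physio, profile,
                 gesture_threshold=0.8, physio_threshold=0.8):
    behavioral_ok = similarity(sample_gesture, profile.gesture_template) >= gesture_threshold
    physiological_ok = similarity(sample_physio, profile.physio_template) >= physio_threshold
    return behavioral_ok and physiological_ok   # both factors must agree
```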
Journal overview:
IEEE Transactions on Mobile Computing addresses key technical issues related to various aspects of mobile computing. This includes (a) architectures, (b) support services, (c) algorithm/protocol design and analysis, (d) mobile environments, (e) mobile communication systems, (f) applications, and (g) emerging technologies. Topics of interest span a wide range, covering aspects like mobile networks and hosts, mobility management, multimedia, operating system support, power management, online and mobile environments, security, scalability, reliability, and emerging technologies such as wearable computers, body area networks, and wireless sensor networks. The journal serves as a comprehensive platform for advancements in mobile computing research.