Who is Alyx? A new behavioral biometric dataset for user identification in XR
Rack, Christian; Fernando, Tamara; Yalcin, Murat; Hotho, Andreas; Latoschik, Marc Erich
Frontiers in Virtual Reality, published 2023-11-10
DOI: 10.3389/frvir.2023.1272234
Abstract
This article presents a new dataset containing motion and physiological data of users playing the game "Half-Life: Alyx". The dataset specifically targets behavioral and biometric identification of XR users. It includes motion and eye-tracking data captured by an HTC Vive Pro from 71 users, each playing the game for 45 minutes on two separate days. Additionally, we collected physiological data from 31 of these users. We provide benchmark performance for the task of motion-based identification of XR users with two prominent state-of-the-art deep learning architectures (a GRU and a CNN). After training on each user's first session, the best model identifies the 71 users in the second session with a mean accuracy of 95% within 2 minutes. The dataset is freely available at https://github.com/cschell/who-is-alyx
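
To make the benchmark setup more concrete, the sketch below shows one way a GRU-based identifier could be trained on windows of per-frame motion features from session 1 and then applied to session 2. This is a minimal illustration under stated assumptions, not the authors' implementation: the window length, feature count, hidden size, and the preprocessing of the repository's files into NumPy arrays are all placeholders.

```python
# Hypothetical sketch: GRU-based user identification on windows of motion features.
# Assumes the dataset has already been preprocessed into arrays
#   X_train, y_train (session 1) and X_test (session 2),
# where X has shape (num_windows, window_len, num_features) and y holds user IDs.
# All hyperparameters and feature counts below are illustrative, not from the paper.

import numpy as np
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

NUM_USERS = 71      # users in the dataset
NUM_FEATURES = 21   # e.g. positions/rotations of HMD and controllers (assumed)
WINDOW_LEN = 150    # frames per window (assumed)

class GRUIdentifier(nn.Module):
    """Maps a window of motion frames to logits over the 71 user identities."""
    def __init__(self, num_features, hidden_size, num_users):
        super().__init__()
        self.gru = nn.GRU(num_features, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, num_users)

    def forward(self, x):            # x: (batch, time, features)
        _, h = self.gru(x)           # h: (num_layers, batch, hidden)
        return self.head(h[-1])      # logits: (batch, num_users)

def train(model, X_train, y_train, epochs=10, lr=1e-3):
    """Train on session-1 windows with a standard cross-entropy objective."""
    ds = TensorDataset(torch.tensor(X_train, dtype=torch.float32),
                       torch.tensor(y_train, dtype=torch.long))
    loader = DataLoader(ds, batch_size=64, shuffle=True)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for xb, yb in loader:
            opt.zero_grad()
            loss = loss_fn(model(xb), yb)
            loss.backward()
            opt.step()

@torch.no_grad()
def identify(model, X_test):
    """Predict a user ID per window; downstream code can majority-vote over
    consecutive windows (e.g. the 2-minute horizon mentioned in the abstract)."""
    model.eval()
    logits = model(torch.tensor(X_test, dtype=torch.float32))
    return logits.argmax(dim=1).numpy()

if __name__ == "__main__":
    # Random placeholder data standing in for the real, preprocessed motion windows.
    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(200, WINDOW_LEN, NUM_FEATURES)).astype(np.float32)
    y_train = rng.integers(0, NUM_USERS, size=200)
    X_test = rng.normal(size=(50, WINDOW_LEN, NUM_FEATURES)).astype(np.float32)

    model = GRUIdentifier(NUM_FEATURES, hidden_size=128, num_users=NUM_USERS)
    train(model, X_train, y_train, epochs=1)
    print(identify(model, X_test)[:10])
```

With real data, per-window predictions would be aggregated over each user's session-2 stream (e.g. by majority vote) before computing identification accuracy.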