Title: Ultra-Low-Power Gaze Tracking for Virtual Reality
Authors: Tianxing Li, Qiang Liu, Xia Zhou
Journal: GetMobile: Mobile Computing and Communications
Published: 2019-01-17 (Journal Article)
DOI: 10.1145/3308755.3308765 (https://doi.org/10.1145/3308755.3308765)
Citations: 0
Abstract
LiGaze is a low-cost, low-power approach to gaze tracking tailored to virtual reality (VR). It relies on a few low-cost photodiodes, eliminating the need for cameras and active infrared emitters. Reusing light from the VR screen, LiGaze leverages photodiodes around a VR lens to measure reflected screen light in different directions. It then infers gaze direction by exploiting the pupil's light-absorption property. The core of LiGaze is to deal with screen-light dynamics and extract the changes in reflected light related to pupil movement. LiGaze infers a 3D gaze vector on the fly using a lightweight regression algorithm. Compared to the eye tracker of an existing VR headset (FOVE), LiGaze achieves mean within-user and cross-user accuracies of 6.3° and 10.1°, respectively. Its sensing and computation consume 791 μW in total and can thus be completely powered by a credit-card-sized solar cell harvesting energy from indoor lighting. LiGaze's simplicity and ultra-low power make it applicable to a wide range of VR headsets, helping to better unleash VR's potential.
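The abstract describes mapping photodiode readings of reflected screen light to a 3D gaze vector with a "lightweight regression algorithm". The sketch below illustrates one plausible shape of that pipeline; the sensor count, the per-frame intensity normalization (to suppress screen-brightness dynamics), and the choice of closed-form ridge regression are all assumptions for illustration, not the authors' exact method.

```python
# Hypothetical sketch of a LiGaze-style gaze regressor.
# Assumptions (not from the paper): 16 photodiodes, per-frame intensity
# normalization, and ridge regression as the "lightweight" model.
import numpy as np


class GazeRegressor:
    def __init__(self, alpha=1.0):
        self.alpha = alpha  # ridge regularization strength
        self.W = None       # learned weights, shape (n_features + 1, 3)

    def _features(self, readings):
        # Normalize each frame by its total intensity so that global
        # screen-brightness changes cancel out, keeping only the spatial
        # pattern of reflected light altered by the pupil's absorption.
        readings = np.asarray(readings, dtype=float)
        totals = readings.sum(axis=1, keepdims=True)
        x = readings / np.where(totals == 0, 1.0, totals)
        return np.hstack([x, np.ones((x.shape[0], 1))])  # append bias term

    def fit(self, readings, gaze_vectors):
        # Closed-form ridge regression: W = (X^T X + alpha*I)^-1 X^T Y
        X = self._features(readings)
        Y = np.asarray(gaze_vectors, dtype=float)
        A = X.T @ X + self.alpha * np.eye(X.shape[1])
        self.W = np.linalg.solve(A, X.T @ Y)
        return self

    def predict(self, readings):
        # Regress, then renormalize to unit-length 3D gaze directions.
        g = self._features(readings) @ self.W
        norms = np.linalg.norm(g, axis=1, keepdims=True)
        return g / np.where(norms == 0, 1.0, norms)
```

A model like this is cheap at inference time (one matrix-vector product per frame), which is consistent with the sub-milliwatt compute budget the abstract reports.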