Ultra-Low Power Gaze Tracking for Virtual Reality

Tianxing Li, Qiang Liu, Xia Zhou
{"title":"Ultra-Low Power Gaze Tracking for Virtual Reality","authors":"Tianxing Li, Qiang Liu, Xia Zhou","doi":"10.1145/3131672.3131682","DOIUrl":null,"url":null,"abstract":"Tracking user's eye fixation direction is crucial to virtual reality (VR): it eases user's interaction with the virtual scene and enables intelligent rendering to improve user's visual experiences and save system energy. Existing techniques commonly rely on cameras and active infrared emitters, making them too expensive and power-hungry for VR headsets (especially mobile VR headsets). We present LiGaze, a low-cost, low-power approach to gaze tracking tailored to VR. It relies on a few low-cost photodiodes, eliminating the need for cameras and active infrared emitters. Reusing light emitted from the VR screen, LiGaze leverages photodiodes around a VR lens to measure reflected screen light in different directions. It then infers gaze direction by exploiting pupil's light absorption property. The core of LiGaze is to deal with screen light dynamics and extract changes in reflected light related to pupil movement. LiGaze infers a 3D gaze vector on the fly using a lightweight regression algorithm. We design and fabricate a LiGaze prototype using off-the-shelf photodiodes. Our comparison to a commercial VR eye tracker (FOVE) shows that LiGaze achieves 6.3° and 10.1° mean within-user and cross-user accuracy. Its sensing and computation consume 791μW in total and thus can be completely powered by a credit-card sized solar cell harvesting energy from indoor lighting. 
LiGaze's simplicity and ultra-low power make it applicable in a wide range of VR headsets to better unleash VR's potential.","PeriodicalId":424262,"journal":{"name":"Proceedings of the 15th ACM Conference on Embedded Network Sensor Systems","volume":"98 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"29","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 15th ACM Conference on Embedded Network Sensor Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3131672.3131682","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 29

Abstract

Tracking a user's eye fixation direction is crucial to virtual reality (VR): it eases the user's interaction with the virtual scene and enables intelligent rendering to improve the user's visual experience and save system energy. Existing techniques commonly rely on cameras and active infrared emitters, making them too expensive and power-hungry for VR headsets (especially mobile VR headsets). We present LiGaze, a low-cost, low-power approach to gaze tracking tailored to VR. It relies on a few low-cost photodiodes, eliminating the need for cameras and active infrared emitters. Reusing light emitted from the VR screen, LiGaze leverages photodiodes around a VR lens to measure reflected screen light in different directions. It then infers gaze direction by exploiting the pupil's light-absorption property. The core of LiGaze is to deal with screen-light dynamics and extract the changes in reflected light that relate to pupil movement. LiGaze infers a 3D gaze vector on the fly using a lightweight regression algorithm. We design and fabricate a LiGaze prototype using off-the-shelf photodiodes. Our comparison to a commercial VR eye tracker (FOVE) shows that LiGaze achieves 6.3° and 10.1° mean within-user and cross-user accuracy, respectively. Its sensing and computation consume 791 μW in total and can thus be completely powered by a credit-card-sized solar cell harvesting energy from indoor lighting. LiGaze's simplicity and ultra-low power make it applicable to a wide range of VR headsets to better unleash VR's potential.
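The pipeline the abstract describes — photodiodes around the lens sense reflected screen light, the readings are normalized to cancel screen-light dynamics, and a lightweight regression maps them to a 3D gaze vector — can be illustrated with a minimal sketch. Everything here (photodiode count, the normalization scheme, the use of plain linear least squares, and the synthetic calibration data) is an assumption for illustration, not the paper's actual algorithm or parameters.

```python
import numpy as np

# Hypothetical sketch of a LiGaze-style pipeline. Assumptions (not from
# the paper): 8 photodiodes, sum-normalization to cancel global screen
# brightness, and ordinary linear least squares as the "lightweight
# regression" from readings to a 3D gaze vector.

rng = np.random.default_rng(0)
N_DIODES = 8       # assumed number of photodiodes around the lens
N_SAMPLES = 500    # assumed number of calibration samples

def normalize(readings):
    # Divide out total intensity so that global screen-brightness
    # changes cancel, leaving only the spatial pattern caused by the
    # pupil absorbing light in its direction.
    return readings / readings.sum(axis=-1, keepdims=True)

# Synthetic calibration data: raw readings X and ground-truth unit
# gaze vectors Y (generated from a random linear map, for illustration).
X = rng.uniform(0.2, 1.0, size=(N_SAMPLES, N_DIODES))
Y = normalize(X) @ rng.normal(size=(N_DIODES, 3))
Y /= np.linalg.norm(Y, axis=1, keepdims=True)

# Fit the regression on normalized readings plus a bias term.
A = np.c_[normalize(X), np.ones(N_SAMPLES)]
W, *_ = np.linalg.lstsq(A, Y, rcond=None)

def infer_gaze(readings):
    # Map one photodiode reading vector to a unit 3D gaze vector.
    v = np.append(normalize(readings), 1.0) @ W
    return v / np.linalg.norm(v)

g = infer_gaze(X[0])
print(g.shape, np.linalg.norm(g))  # a unit-length 3D gaze estimate
```

On real hardware the calibration pairs would come from a short per-user session (look at known on-screen targets), and the per-frame inference cost is just one small matrix-vector product, which is consistent with the sub-milliwatt compute budget the paper reports.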