Ultra-Low Power Gaze Tracking for Virtual Reality
Tianxing Li, Emmanuel S. Akosah, Qiang Liu, Xia Zhou
Proceedings of the 15th ACM Conference on Embedded Network Sensor Systems, November 6, 2017. DOI: 10.1145/3131672.3136989
We present LiGaze, a low-power approach to gaze tracking tailored to VR. It relies on a few low-cost photodiodes, eliminating the need for cameras and active infrared emitters. Reusing light emitted from the VR screen, LiGaze places photodiodes around a VR lens to measure reflected screen light in different directions. It then infers gaze direction by exploiting the pupil's light-absorption property. The core of LiGaze is to handle screen-light dynamics and extract the changes in reflected light that relate to pupil movement. We design and fabricate a LiGaze prototype using off-the-shelf photodiodes. Its sensing and computation consume 791 μW in total.
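To make the sensing idea concrete, below is a minimal sketch (not the authors' implementation) of how a few photodiode readings could be turned into a gaze estimate: each reading is normalized by the current screen-light level to remove screen dynamics, and the residual reflection pattern is mapped to a 2D gaze direction with a lightweight linear model. The sensor count, the normalization scheme, and the ridge-regression mapping are all assumptions made for illustration; the paper's actual inference pipeline may differ.

# Sketch of photodiode-based gaze inference under the assumptions stated above.
import numpy as np

NUM_PHOTODIODES = 8          # assumed ring of 8 photodiodes around the VR lens
rng = np.random.default_rng(0)

def normalize_reflections(readings, screen_level):
    """Remove screen-light dynamics by scaling each photodiode reading
    by the overall screen brightness (assumed known from the rendered frame)."""
    return readings / max(screen_level, 1e-6)

def fit_gaze_model(features, gaze_xy, reg=1e-3):
    """Closed-form ridge regression from normalized reflection patterns to
    2D gaze coordinates (a stand-in for the paper's inference step)."""
    X = np.hstack([features, np.ones((len(features), 1))])   # add bias term
    A = X.T @ X + reg * np.eye(X.shape[1])
    return np.linalg.solve(A, X.T @ gaze_xy)

def predict_gaze(W, readings, screen_level):
    x = np.append(normalize_reflections(readings, screen_level), 1.0)
    return x @ W

# Synthetic calibration data: the pupil absorbs light, so photodiodes closest
# to the gaze point see a dimmer reflection, scaled by screen brightness.
angles = 2 * np.pi * np.arange(NUM_PHOTODIODES) / NUM_PHOTODIODES
diode_pos = np.stack([np.cos(angles), np.sin(angles)], axis=1)

def simulate_readings(gaze, screen_level):
    dist = np.linalg.norm(diode_pos - gaze, axis=1)
    absorption = np.exp(-dist)                 # stronger absorption near the pupil
    return screen_level * (1.0 - 0.5 * absorption)

gazes = rng.uniform(-0.5, 0.5, size=(200, 2))
levels = rng.uniform(0.3, 1.0, size=200)
features = np.array([normalize_reflections(simulate_readings(g, s), s)
                     for g, s in zip(gazes, levels)])
W = fit_gaze_model(features, gazes)

test_gaze = np.array([0.2, -0.1])
print("predicted gaze:", predict_gaze(W, simulate_readings(test_gaze, 0.7), 0.7))

The key point the sketch illustrates is that dividing out the screen-light level lets a very small model, and hence a very small power budget, account for the fact that the same gaze direction produces different absolute reflection intensities as the displayed content changes.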