Self-Calibrating Head-Mounted Eye Trackers Using Egocentric Visual Saliency

Yusuke Sugano, A. Bulling
{"title":"Self-Calibrating Head-Mounted Eye Trackers Using Egocentric Visual Saliency","authors":"Yusuke Sugano, A. Bulling","doi":"10.1145/2807442.2807445","DOIUrl":null,"url":null,"abstract":"Head-mounted eye tracking has significant potential for gaze-based applications such as life logging, mental health monitoring, or the quantified self. A neglected challenge for the long-term recordings required by these applications is that drift in the initial person-specific eye tracker calibration, for example caused by physical activity, can severely impact gaze estimation accuracy and thus system performance and user experience. We first analyse calibration drift on a new dataset of natural gaze data recorded using synchronised video-based and Electrooculography-based eye trackers of 20 users performing everyday activities in a mobile setting. Based on this analysis we present a method to automatically self-calibrate head-mounted eye trackers based on a computational model of bottom-up visual saliency. 
Through evaluations on the dataset we show that our method 1) is effective in reducing calibration drift in calibrated eye trackers and 2) given sufficient data, can achieve gaze estimation accuracy competitive with that of a calibrated eye tracker, without any manual calibration.","PeriodicalId":103668,"journal":{"name":"Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology","volume":"64 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-11-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"80","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2807442.2807445","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 80

Abstract

Head-mounted eye tracking has significant potential for gaze-based applications such as life logging, mental health monitoring, or the quantified self. A neglected challenge for the long-term recordings required by these applications is that drift in the initial person-specific eye tracker calibration, for example caused by physical activity, can severely impact gaze estimation accuracy and thus system performance and user experience. We first analyse calibration drift on a new dataset of natural gaze data recorded using synchronised video-based and Electrooculography-based eye trackers of 20 users performing everyday activities in a mobile setting. Based on this analysis we present a method to automatically self-calibrate head-mounted eye trackers based on a computational model of bottom-up visual saliency. Through evaluations on the dataset we show that our method 1) is effective in reducing calibration drift in calibrated eye trackers and 2) given sufficient data, can achieve gaze estimation accuracy competitive with that of a calibrated eye tracker, without any manual calibration.
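The core idea of saliency-based self-calibration can be sketched as follows: treat each frame's bottom-up saliency map as a probability distribution over where the user is likely looking, take its expectation as a pseudo ground-truth gaze target, and fit a mapping from raw eye features to scene coordinates over many such frames. The sketch below is a minimal illustration of that principle, not the paper's actual method: the affine eye-to-scene model, the centroid aggregation, and all function names are assumptions for demonstration, and the paper's approach involves a full computational saliency model and more robust aggregation.

```python
import numpy as np

def saliency_centroid(saliency):
    # Treat the saliency map as a probability distribution over pixels
    # and return its expected (x, y) location as a pseudo gaze target.
    h, w = saliency.shape
    p = saliency / saliency.sum()
    ys, xs = np.mgrid[0:h, 0:w]
    return np.array([(xs * p).sum(), (ys * p).sum()])

def fit_affine(eye_feats, gaze_targets):
    # Least-squares affine map from 2D eye features to 2D scene coordinates.
    X = np.hstack([eye_feats, np.ones((len(eye_feats), 1))])  # (N, 3)
    A, *_ = np.linalg.lstsq(X, gaze_targets, rcond=None)      # (3, 2)
    return A

def self_calibrate(eye_feats, saliency_maps):
    # Pair each raw eye measurement with the saliency expectation of the
    # synchronized scene frame, then fit the mapping -- no manual targets.
    targets = np.array([saliency_centroid(s) for s in saliency_maps])
    return fit_affine(eye_feats, targets)

def apply_map(A, eye_feats):
    X = np.hstack([eye_feats, np.ones((len(eye_feats), 1))])
    return X @ A
```

Given enough frames in which the saliency peak coincides with the true fixation, the least-squares fit averages out frames where it does not, which is why the abstract notes that accuracy becomes competitive with manual calibration only "given sufficient data".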