The eye of the typer: a benchmark and analysis of gaze behavior during typing

Alexandra Papoutsaki, Aaron Gokaslan, James Tompkin, Yuze He, Jeff Huang
{"title":"The eye of the typer: a benchmark and analysis of gaze behavior during typing","authors":"Alexandra Papoutsaki, Aaron Gokaslan, J. Tompkin, Yuze He, Jeff Huang","doi":"10.1145/3204493.3204552","DOIUrl":null,"url":null,"abstract":"We examine the relationship between eye gaze and typing, focusing on the differences between touch and non-touch typists. To enable typing-based research, we created a 51-participant benchmark dataset for user input across multiple tasks, including user input data, screen recordings, webcam video of the participant's face, and eye tracking positions. There are patterns of eye movements that differ between the two types of typists, representing glances at the keyboard, which can be used to identify touch-.typed strokes with 92% accuracy. Then, we relate eye gaze with cursor activity, aligning both pointing and typing to eye gaze. One demonstrative application of the work is in extending WebGazer, a real-time web-browser-based webcam eye tracker. We show that incorporating typing behavior as a secondary signal improves eye tracking accuracy by 16% for touch typists, and 8% for non-touch typists.","PeriodicalId":237808,"journal":{"name":"Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-06-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"21","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3204493.3204552","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 21

Abstract

We examine the relationship between eye gaze and typing, focusing on the differences between touch and non-touch typists. To enable typing-based research, we created a 51-participant benchmark dataset for user input across multiple tasks, including user input data, screen recordings, webcam video of the participant's face, and eye tracking positions. There are patterns of eye movements that differ between the two types of typists, representing glances at the keyboard, which can be used to identify touch-typed strokes with 92% accuracy. Then, we relate eye gaze with cursor activity, aligning both pointing and typing to eye gaze. One demonstrative application of the work is in extending WebGazer, a real-time web-browser-based webcam eye tracker. We show that incorporating typing behavior as a secondary signal improves eye tracking accuracy by 16% for touch typists and 8% for non-touch typists.
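
To make the keyboard-glance idea concrete, the sketch below shows one way a per-keystroke decision could be made from gaze data alone. It is an illustrative heuristic, not the classifier reported in the paper: the GazeSample shape, the 300 ms window, and the 90%-of-screen-height threshold are all assumptions introduced here.

```typescript
// Illustrative heuristic (not the paper's classifier): a keystroke is labeled
// touch-typed when gaze stays on the screen content around the keypress;
// a dip toward the bottom edge or a lost gaze estimate is read as a keyboard glance.
interface GazeSample {
  t: number;        // timestamp in ms
  y: number | null; // vertical gaze position in px; null when tracking is lost
}

function isTouchTypedStroke(
  gaze: GazeSample[],
  keyTime: number,       // timestamp of the keystroke in ms
  screenHeight: number,  // display height in px
  windowMs = 300,        // assumed window around the keypress
  glanceFraction = 0.9   // assumed threshold: gaze below 90% of screen height = glance
): boolean {
  const around = gaze.filter(s => Math.abs(s.t - keyTime) <= windowMs);
  if (around.length === 0) return false; // no gaze data: cannot rule out a glance
  const glanced = around.some(
    s => s.y === null || s.y > glanceFraction * screenHeight
  );
  return !glanced; // touch-typed strokes are the ones without a keyboard glance
}
```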
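
Similarly, the following sketch illustrates how typing could act as a secondary calibration signal in a WebGazer-style tracker: keystrokes in a focused text field are assumed to coincide with gaze near the caret, and that screen position is handed to the tracker as an extra sample alongside the clicks and cursor movements it already uses. This is not the paper's implementation; the caretScreenPosition helper and the #typing-task selector are hypothetical, and the call to recordScreenPosition assumes the loaded webgazer.js build exposes that function.

```typescript
// Minimal sketch of typing as a secondary signal for a webcam eye tracker.
// Assumes the webgazer.js browser build is loaded as a global and exposes
// recordScreenPosition(x, y, eventType); '#typing-task' is a hypothetical input.
declare const webgazer: any;

// Crude caret estimate: the centre of the focused text field. A real system
// would measure the caret itself (e.g. with a hidden mirror element).
function caretScreenPosition(input: HTMLInputElement): { x: number; y: number } {
  const r = input.getBoundingClientRect();
  return { x: r.left + r.width / 2, y: r.top + r.height / 2 };
}

function attachTypingSignal(input: HTMLInputElement): void {
  input.addEventListener('keydown', () => {
    const { x, y } = caretScreenPosition(input);
    // Feed the keystroke-time position to the tracker as an extra calibration
    // sample, treating it like the click samples WebGazer already collects.
    webgazer.recordScreenPosition(x, y, 'click');
  });
}

webgazer
  .setGazeListener((data: { x: number; y: number } | null, _elapsedMs: number) => {
    if (data) {
      // data.x / data.y: current gaze prediction in page coordinates.
    }
  })
  .begin();

const field = document.querySelector<HTMLInputElement>('#typing-task');
if (field) attachTypingSignal(field);
```

Whether keystroke-time samples should carry the same weight as clicks, or be down-weighted for non-touch typists who glance away from the screen, is a design choice this sketch leaves at the simplest setting.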