Identifying Lines and Interpreting Vertical Jumps in Eye Tracking Studies of Reading Text and Code

ACM Transactions on Applied Perception · Impact Factor 1.9 · CAS Tier 4 (Computer Science) · JCR Q3 (Computer Science, Software Engineering) · Pub Date: 2023-04-06 · DOI: https://dl.acm.org/doi/10.1145/3579357
Mor Shamy, Dror G. Feitelson
{"title":"Identifying Lines and Interpreting Vertical Jumps in Eye Tracking Studies of Reading Text and Code","authors":"Mor Shamy, Dror G. Feitelson","doi":"https://dl.acm.org/doi/10.1145/3579357","DOIUrl":null,"url":null,"abstract":"<p>Eye tracking studies have shown that reading code, in contradistinction to reading text, includes many vertical jumps. As different lines of code may have quite different functions (e.g., variable definition, flow control, or computation), it is important to accurately identify the lines being read. We design experiments that require a specific line of text to be scrutinized. Using the distribution of gazes around this line, we then calculate how the precision with which we can identify the line being read depends on the font size and spacing. The results indicate that, even after correcting for systematic bias, unnaturally large fonts and spacing may be required for reliable line identification.</p><p>Interestingly, during the experiments, the participants also repeatedly re-checked their task and if they were looking at the correct line, leading to vertical jumps similar to those observed when reading code. This suggests that observed reading patterns may be “inefficient,” in the sense that participants feel the need to repeat actions beyond the minimal number apparently required for the task. This may have implications regarding the interpretation of reading patterns. In particular, reading does not reflect only the extraction of information from the text or code. Rather, reading patterns may also reflect other types of activities, such as getting a general orientation, and searching for specific locations in the context of performing a particular task.</p>","PeriodicalId":50921,"journal":{"name":"ACM Transactions on Applied Perception","volume":null,"pages":null},"PeriodicalIF":1.9000,"publicationDate":"2023-04-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACM Transactions on Applied Perception","FirstCategoryId":"94","ListUrlMain":"https://doi.org/https://dl.acm.org/doi/10.1145/3579357","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, SOFTWARE ENGINEERING","Score":null,"Total":0}
Citations: 0

Abstract

Eye tracking studies have shown that reading code, in contradistinction to reading text, includes many vertical jumps. As different lines of code may have quite different functions (e.g., variable definition, flow control, or computation), it is important to accurately identify the lines being read. We design experiments that require a specific line of text to be scrutinized. Using the distribution of gazes around this line, we then calculate how the precision with which we can identify the line being read depends on the font size and spacing. The results indicate that, even after correcting for systematic bias, unnaturally large fonts and spacing may be required for reliable line identification.
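The paper itself does not include code, but the line-identification setup described above can be illustrated with a short sketch. The following Python snippet is an assumption-laden illustration rather than the authors' analysis: it assigns simulated fixation y-coordinates to the nearest line center, estimates a constant vertical bias as the median signed offset to the nearest center, and reports the fraction of fixations that land on the intended line before and after correction. All function names, the simulated data, and the bias-correction rule are assumptions made for illustration only.

```python
# Minimal sketch (not the authors' code) of assigning gaze fixations to text
# lines and correcting for a systematic vertical bias. Everything here is an
# illustrative assumption: line geometry, noise levels, and the bias rule.

import numpy as np

def line_centers(n_lines: int, top_y: float, line_spacing: float) -> np.ndarray:
    """Vertical centers (in pixels) of n_lines lines of text."""
    return top_y + line_spacing * np.arange(n_lines)

def assign_to_lines(gaze_y: np.ndarray, centers: np.ndarray) -> np.ndarray:
    """Assign each fixation's y-coordinate to the nearest line center."""
    return np.argmin(np.abs(gaze_y[:, None] - centers[None, :]), axis=1)

def estimate_bias(gaze_y: np.ndarray, centers: np.ndarray) -> float:
    """Estimate a constant vertical offset (e.g., calibration drift) as the
    median signed distance from each fixation to its nearest line center."""
    nearest = centers[assign_to_lines(gaze_y, centers)]
    return float(np.median(gaze_y - nearest))

def identification_precision(gaze_y, true_line, centers):
    """Fraction of fixations assigned to the line actually being read,
    before and after subtracting the estimated systematic bias."""
    raw = np.mean(assign_to_lines(gaze_y, centers) == true_line)
    corrected_y = gaze_y - estimate_bias(gaze_y, centers)
    corrected = np.mean(assign_to_lines(corrected_y, centers) == true_line)
    return raw, corrected

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    spacing = 30.0                       # pixels between line centers (assumed)
    centers = line_centers(n_lines=20, top_y=100.0, line_spacing=spacing)
    true_line = np.full(500, 7)          # participants scrutinize line 7
    # Simulated fixations: vertical noise comparable to the spacing,
    # plus a constant upward/downward bias of 12 px.
    gaze_y = centers[true_line] + rng.normal(12.0, 15.0, size=500)
    print(identification_precision(gaze_y, true_line, centers))
```

In this synthetic setup, where the gaze noise is comparable to the line spacing, the corrected precision improves over the raw one but remains well below 1, which echoes the abstract's point that unusually large fonts and spacing may be needed for reliable line identification.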

Interestingly, during the experiments, the participants also repeatedly re-checked their task and whether they were looking at the correct line, leading to vertical jumps similar to those observed when reading code. This suggests that observed reading patterns may be “inefficient,” in the sense that participants feel the need to repeat actions beyond the minimal number apparently required for the task. This may have implications regarding the interpretation of reading patterns. In particular, reading does not reflect only the extraction of information from the text or code. Rather, reading patterns may also reflect other types of activities, such as getting a general orientation and searching for specific locations in the context of performing a particular task.
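The re-checking behavior described above can be quantified once fixations have been mapped to lines (e.g., with assign_to_lines from the sketch above). The convention below, counting any transition that skips or revisits lines more than one apart as a "vertical jump", is an assumption for illustration, not the paper's definition.

```python
# Follow-on sketch (illustrative convention, not the authors' analysis):
# count "vertical jumps" as transitions between consecutive fixations that
# move at least `min_skip` lines up or down.

import numpy as np

def vertical_jumps(line_seq: np.ndarray, min_skip: int = 2) -> int:
    """Count transitions that skip or revisit lines, i.e., movements of at
    least `min_skip` lines, as opposed to ordinary next-line reading."""
    diffs = np.diff(line_seq)
    return int(np.sum(np.abs(diffs) >= min_skip))

# Example: a reader who keeps returning from line 7 to the task statement on line 0.
seq = np.array([0, 1, 7, 7, 0, 7, 8, 0, 7])
print(vertical_jumps(seq))  # prints 5
```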

Source journal
ACM Transactions on Applied Perception (Engineering & Technology; Computer Science: Software Engineering)
CiteScore: 3.70
Self-citation rate: 0.00%
Articles published: 22
Review time: 12 months
About the journal: ACM Transactions on Applied Perception (TAP) aims to strengthen the synergy between computer science and psychology/perception by publishing top-quality papers that help to unify research in these fields. The journal publishes interdisciplinary research of significant and lasting value in any topic area that spans both Computer Science and Perceptual Psychology. All papers must incorporate both perceptual and computer science components.