
Eye Tracking Research & Application: Latest Publications

Auramirror: reflections on attention
Pub Date : 2004-03-22 DOI: 10.1145/968363.968385
Alexander W. Skaburskis, Roel Vertegaal, Jeffrey S. Shell
As ubiquitous computing becomes more prevalent, greater consideration will have to be taken on how devices interrupt us and vie for our attention. This paper describes Auramirror, an interactive art piece that raises questions of how computers use our attention. By measuring attention and visualizing the results for the audience in real-time, Auramirror brings the subject matter to the forefront of the audience's consideration. Finally, some ways of using the Auramirror system to help in the design of attention sensitive devices are discussed.
Citations: 7
Eye tracking system model with easy calibration
Pub Date : 2004-03-22 DOI: 10.1145/968363.968372
A. Villanueva, R. Cabeza, Sonia Porta
Calibration is one of the most tedious and often annoying aspects of many eye tracking systems. It normally consists in looking at several marks on a screen in order to collect enough data to modify the parameters of an adjustable model. Unfortunately this step is unavoidable if a competent tracking system is desired. Many efforts have been made to achieve more competent and improved eye tracking systems. Maybe the search for an accurate mathematical model is one of the least researched fields. The lack of a parametric description of the gaze estimation problem makes it difficult to find the most suitable model, and therefore generic expressions in calibration and tracking sessions are employed instead. In other words, a model based on parameters describing the elements involved in the tracking system would provide a stronger basis and robustness. The aim of this work is to build up a mathematical model totally based in realistic variables describing elements taking part in an eye tracking system employing the well known bright pupil technique i.e. user, camera, illumination and screen. The model is said to be defined when the expression relating the point the user is looking at with the extracted features of the image (glint position and center of the pupil) is found. The desired model would have to be simple, realistic, accurate and easy to calibrate.
Citations: 29
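The "generic expressions" this paper argues against are typically low-order polynomial fits from extracted image features (pupil-glint vector) to screen coordinates, calibrated by least squares. A minimal sketch of such a generic fit for contrast with the paper's parametric approach (the function names and the 2nd-order feature set are illustrative assumptions, not the authors' model):

```python
import numpy as np

def fit_gaze_mapping(pg_vectors, screen_points):
    """Least-squares fit of a 2nd-order polynomial mapping from
    pupil-glint vectors (x, y) to 2-D screen coordinates."""
    x, y = pg_vectors[:, 0], pg_vectors[:, 1]
    # Design matrix with terms [1, x, y, x*y, x^2, y^2]
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coeffs, *_ = np.linalg.lstsq(A, screen_points, rcond=None)
    return coeffs  # shape (6, 2): one column per screen axis

def predict_gaze(coeffs, pg_vector):
    """Map a single pupil-glint vector to a screen point."""
    x, y = pg_vector
    phi = np.array([1.0, x, y, x * y, x**2, y**2])
    return phi @ coeffs
```

Calibration data would come from having the user fixate known screen marks; the fit quality then depends entirely on how well the chosen polynomial approximates the true eye-camera-screen geometry, which is exactly the limitation the paper's parametric model is meant to remove.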
Eye movements as reflections of perceptual and cognitive processes (abstract only)
Pub Date : 2004-03-22 DOI: 10.1145/968363.968365
K. Rayner
Some historical issues regarding the use of eye movements to study cognitive processes will initially be discussed. The development of eye contingent display change experiments will be reviewed and examples will be presented regarding how the development of the technique provided answers to interesting questions. For the most part, examples will be taken from the psychology of reading, but other tasks will also be discussed. More recently, sophisticated models of eye movement control in the context of reading have been developed, and these models will be discussed. Some thoughts on future directions of eye movement research will also be presented.
Citations: 3
Robust clustering of eye movement recordings for quantification of visual interest
Pub Date : 2004-03-22 DOI: 10.1145/968363.968368
A. Santella, D. DeCarlo
Characterizing the location and extent of a viewer's interest, in terms of eye movement recordings, informs a range of investigations in image and scene viewing. We present an automatic data-driven method for accomplishing this, which clusters visual point-of-regard (POR) measurements into gazes and regions-of-interest using the mean shift procedure. Clusters produced using this method form a structured representation of viewer interest, and at the same time are replicable and not heavily influenced by noise or outliers. Thus, they are useful in answering fine-grained questions about where and how a viewer examined an image.
Citations: 191
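The abstract names the mean shift procedure as the clustering engine for point-of-regard (POR) samples. A minimal flat-kernel sketch of mean shift over 2-D POR points (function names, the bandwidth value, and the cluster-merging heuristic are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def mean_shift(points, bandwidth, n_iter=50, tol=1e-5):
    """Flat-kernel mean shift: each point iteratively moves to the mean
    of the original samples within `bandwidth`, then converged points
    are merged into clusters (regions of interest)."""
    points = np.asarray(points, dtype=float)
    shifted = points.copy()
    for _ in range(n_iter):
        moved = 0.0
        for i, p in enumerate(shifted):
            d = np.linalg.norm(points - p, axis=1)
            new_p = points[d <= bandwidth].mean(axis=0)
            moved = max(moved, np.linalg.norm(new_p - p))
            shifted[i] = new_p
        if moved < tol:
            break
    # Merge converged points that landed on (nearly) the same mode
    labels = -np.ones(len(points), dtype=int)
    centers = []
    for i, p in enumerate(shifted):
        for j, c in enumerate(centers):
            if np.linalg.norm(p - c) < bandwidth / 2:
                labels[i] = j
                break
        else:
            centers.append(p)
            labels[i] = len(centers) - 1
    return np.array(centers), labels
```

Unlike k-means, mean shift needs no preset cluster count, which matches the abstract's emphasis on replicable, noise-robust groupings: isolated outliers simply fail to gather mass and form their own small modes.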
Frequency analysis of task evoked pupillary response and eye-movement
Pub Date : 2004-03-22 DOI: 10.1145/968363.968381
M. Nakayama, Y. Shimizu
This paper describes the influence of eye blinks on frequency analysis and power spectrum difference for task-evoked pupillography and eye-movement during an experiment which consisted of ocular following tasks and oral calculation tasks with three levels of task difficulty: control, 1×1,and 1×2 digit oral calculation.The compensation model for temporal pupil size based on MLP (multi layer perceptron) was trained to detect a blink and to estimate pupil size by using blinkless pupillary change and artificial blink patterns. The PSD (power spectrum density) measurements from the estimated pupillography during oral calculation tasks show significant differences, and the PSD increased with task difficulty in the area of 0.1 - 0.5Hz and 1.6 - 3.5Hz, as did the average pupil size.The eye-movement during blinks was corrected manually, to remove irregular eye-movements such as saccades. The CSD (cross spectrum density) was achieved from horizontal and vertical eye-movement coordinates. Significant differences in CSDs among experimental conditions were examined in the area of 0.6 - 1.5 Hz. These differences suggest that the task difficulty affects the relationship between horizontal and vertical eye-movement coordinates in the frequency domain.
Citations: 50
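The PSD comparisons in this abstract (e.g. more power in the 0.1 - 0.5 Hz and 1.6 - 3.5 Hz bands under harder tasks) rest on estimating a power spectral density from the blink-compensated pupil signal and integrating it over a band. A periodogram-style sketch with NumPy (window choice and normalization are assumptions here, not the paper's method):

```python
import numpy as np

def periodogram_psd(signal, fs):
    """One-sided periodogram PSD of a uniformly sampled signal
    (e.g. pupil diameter sampled at `fs` Hz)."""
    n = len(signal)
    window = np.hanning(n)
    windowed = (signal - signal.mean()) * window
    spectrum = np.fft.rfft(windowed)
    # Normalize by fs and window power so units are signal^2 / Hz
    psd = (np.abs(spectrum) ** 2) / (fs * np.sum(window ** 2))
    psd[1:-1] *= 2  # fold in the negative frequencies
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, psd

def band_power(freqs, psd, lo, hi):
    """Integrate the PSD over the band [lo, hi] Hz."""
    mask = (freqs >= lo) & (freqs <= hi)
    df = freqs[1] - freqs[0]
    return psd[mask].sum() * df
```

Comparing `band_power(...)` across task-difficulty conditions is the kind of statistic the abstract reports; the cross-spectrum (CSD) analysis of horizontal versus vertical eye position follows the same pattern with `rfft` of one channel multiplied by the conjugate `rfft` of the other.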
A free-head, simple calibration, gaze tracking system that enables gaze-based interaction
Pub Date : 2004-03-22 DOI: 10.1145/968363.968387
Takehiko Ohno, N. Mukawa
Human eye gaze is a strong candidate to create a new application area based on human-computer interaction. To implement a really practical gaze-based interaction system, gaze detection must be realized without placing any restriction on the user's behavior or comfort. This paper describes a gaze tracking system that offers free-head, simple personal calibration. It does not require the user to wear anything on her head, and she can move her head freely. Personal calibration takes only a very short time; the user is asked to look at two markers on the screen. An experiment shows that the accuracy of the implemented system is about 1.0 degrees (view angle).
Citations: 184
An eye for an eye: a performance evaluation comparison of the LC technologies and Tobii eye trackers 以眼还眼:LC技术与Tobii眼动仪的性能评估比较
Pub Date : 2004-03-22 DOI: 10.1145/968363.968378
D. Cheng, Roel Vertegaal
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from Permissions Dept, ACM Inc., fax +1 (212) 869-0481 or e-mail permissions@acm.org. © 2004 ACM 1-58113-825-3/04/0003 $5.00 An Eye for an Eye: A Performance Evaluation Comparison of the LC Technologies and Tobii Eye Trackers
允许制作部分或全部作品的数字或硬拷贝供个人或课堂使用,但不收取任何费用,前提是副本不是出于商业利益而制作或分发的,并且副本在第一页上带有本通知和完整的引用。本作品组件的版权归ACM以外的其他人所有,必须得到尊重。允许有信用的摘要。以其他方式复制、重新发布、在服务器上发布或重新分发到列表,需要事先获得特定许可和/或付费。请联系ACM公司权限部,传真+1(212)8669 -0481或发邮件至permissions@acm.org。©2004 ACM 1-58113-825-3/04/0003 $5.00一眼对一眼:LC技术和Tobii眼动仪的性能评估比较
{"title":"An eye for an eye: a performance evaluation comparison of the LC technologies and Tobii eye trackers","authors":"D. Cheng, Roel Vertegaal","doi":"10.1145/968363.968378","DOIUrl":"https://doi.org/10.1145/968363.968378","url":null,"abstract":"Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from Permissions Dept, ACM Inc., fax +1 (212) 869-0481 or e-mail permissions@acm.org. © 2004 ACM 1-58113-825-3/04/0003 $5.00 An Eye for an Eye: A Performance Evaluation Comparison of the LC Technologies and Tobii Eye Trackers","PeriodicalId":127538,"journal":{"name":"Eye Tracking Research & Application","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2004-03-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121692697","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 21
Gaze typing compared with input by head and hand 凝视输入与头和手输入的比较
Pub Date : 2004-03-22 DOI: 10.1145/968363.968389
J. P. Hansen, K. Tørning, A. Johansen, K. Itoh, Hirotaka Aoki
This paper investigates the usability of gaze-typing systems for disabled people in a broad perspective that takes into account the usage scenarios and the particular users that these systems benefit. Design goals for a gaze-typing system are identified: productivity above 25 words per minute, robust tracking, high availability, and support of multimodal input. A detailed investigation of the efficiency and user satisfaction with a Danish and a Japanese gaze-typing system compares it to head- and mouse (hand) - typing. We found gaze typing to be more erroneous than the other two modalities. Gaze typing was just as fast as head typing, and both were slower than mouse (hand-) typing. Possibilities for design improvements are discussed.
本文从一个广泛的角度研究了残疾人注视输入系统的可用性,该系统考虑了使用场景和这些系统受益的特定用户。确定了注视输入系统的设计目标:每分钟超过25个单词的生产率、可靠的跟踪、高可用性和支持多模式输入。一项对丹麦和日本注视打字系统的效率和用户满意度的详细调查将其与头鼠标(手)打字系统进行了比较。我们发现凝视输入比其他两种方式更容易出错。凝视打字和头部打字一样快,两者都比鼠标(手)打字慢。讨论了设计改进的可能性。
{"title":"Gaze typing compared with input by head and hand","authors":"J. P. Hansen, K. Tørning, A. Johansen, K. Itoh, Hirotaka Aoki","doi":"10.1145/968363.968389","DOIUrl":"https://doi.org/10.1145/968363.968389","url":null,"abstract":"This paper investigates the usability of gaze-typing systems for disabled people in a broad perspective that takes into account the usage scenarios and the particular users that these systems benefit. Design goals for a gaze-typing system are identified: productivity above 25 words per minute, robust tracking, high availability, and support of multimodal input. A detailed investigation of the efficiency and user satisfaction with a Danish and a Japanese gaze-typing system compares it to head- and mouse (hand) - typing. We found gaze typing to be more erroneous than the other two modalities. Gaze typing was just as fast as head typing, and both were slower than mouse (hand-) typing. Possibilities for design improvements are discussed.","PeriodicalId":127538,"journal":{"name":"Eye Tracking Research & Application","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2004-03-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130452125","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 143
Building a lightweight eyetracking headgear 打造一个轻量级的眼球追踪头盔
Pub Date : 2004-03-22 DOI: 10.1145/968363.968386
J. Babcock, J. Pelz
Eyetracking systems that use video-based cameras to monitor the eye and scene can be made significantly smaller thanks to tiny micro-lens video cameras. Pupil detection algorithms are generally implemented in hardware, allowing for real-time eyetracking. However, it is likely that real-time eyetracking will soon be fully accomplished in software alone. This paper encourages an "open-source" approach to eyetracking by providing practical tips on building a lightweight eyetracker from commercially available micro-lens cameras and other parts. While the headgear described here can be used with any dark-pupil eyetracking controller, it also opens the door to open-source software solutions that could be developed by the eyetracking and image-processing communities. Such systems could be optimized without concern for real-time performance because the systems could be run offline.
由于微型微镜头摄像机的出现,使用基于视频的摄像机来监控眼睛和场景的眼球追踪系统可以做得更小。瞳孔检测算法通常在硬件中实现,允许实时眼球跟踪。然而,实时眼球追踪很可能很快就会完全在软件中完成。这篇论文鼓励采用“开源”的方法来进行眼球追踪,提供了一些实用的技巧,可以用市面上可以买到的微镜头相机和其他部件来制造轻量级的眼球追踪器。虽然这里描述的头盔可以与任何暗瞳眼球追踪控制器一起使用,但它也为眼球追踪和图像处理社区开发的开源软件解决方案打开了大门。这样的系统可以在不考虑实时性能的情况下进行优化,因为系统可以离线运行。
{"title":"Building a lightweight eyetracking headgear","authors":"J. Babcock, J. Pelz","doi":"10.1145/968363.968386","DOIUrl":"https://doi.org/10.1145/968363.968386","url":null,"abstract":"Eyetracking systems that use video-based cameras to monitor the eye and scene can be made significantly smaller thanks to tiny micro-lens video cameras. Pupil detection algorithms are generally implemented in hardware, allowing for real-time eyetracking. However, it is likely that real-time eyetracking will soon be fully accomplished in software alone. This paper encourages an \"open-source\" approach to eyetracking by providing practical tips on building a lightweight eyetracker from commercially available micro-lens cameras and other parts. While the headgear described here can be used with any dark-pupil eyetracking controller, it also opens the door to open-source software solutions that could be developed by the eyetracking and image-processing communities. Such systems could be optimized without concern for real-time performance because the systems could be run offline.","PeriodicalId":127538,"journal":{"name":"Eye Tracking Research & Application","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2004-03-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125691184","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 166
Eye gaze patterns differentiate novice and experts in a virtual laparoscopic surgery training environment 眼睛注视模式区分新手和专家在虚拟腹腔镜手术训练环境
Pub Date : 2004-03-22 DOI: 10.1145/968363.968370
Benjamin Law, S. Fraser, Stella Atkins, A. Kirkpatrick, A. Lomax, C. MacKenzie
Visual information is important in surgeons' manipulative performance especially in laparoscopic surgery where tactual feedback is less than in open surgery. The study of surgeons' eye movements is an innovative way of assessing skill, in that a comparison of the eye movement strategies between expert surgeons and novices may show important differences that could be used in training. We conducted a preliminary study comparing the eye movements of 5 experts and 5 novices performing a one-handed aiming task on a computer-based laparoscopic surgery simulator. The performance results showed that experts were quicker and generally committed fewer errors than novices. We investigated eye movements as a possible factor for experts performing better than novices. The results from eye gaze analysis showed that novices needed more visual feedback of the tool position to complete the task than did experts. In addition, the experts tended to maintain eye gaze on the target while manipulating the tool, whereas novices were more varied in their behaviours. For example, we found that on some trials, novices tracked the movement of the tool until it reached the target.
视觉信息对外科医生的操作表现很重要,特别是在腹腔镜手术中,触觉反馈比开放手术少。对外科医生眼球运动的研究是评估技能的一种创新方式,因为对专家外科医生和新手外科医生的眼球运动策略进行比较可能会显示出可用于培训的重要差异。我们进行了一项初步研究,比较了5名专家和5名新手在计算机腹腔镜手术模拟器上进行单手瞄准任务的眼球运动。性能结果表明,专家的速度更快,而且通常比新手犯的错误更少。我们调查了眼动作为专家比新手表现更好的可能因素。眼睛注视分析的结果表明,新手比专家需要更多的工具位置视觉反馈来完成任务。此外,专家在操作工具时倾向于保持眼睛盯着目标,而新手的行为则更加多样化。例如,我们发现在一些试验中,新手跟踪工具的运动,直到它到达目标。
{"title":"Eye gaze patterns differentiate novice and experts in a virtual laparoscopic surgery training environment","authors":"Benjamin Law, S. Fraser, Stella Atkins, A. Kirkpatrick, A. Lomax, C. MacKenzie","doi":"10.1145/968363.968370","DOIUrl":"https://doi.org/10.1145/968363.968370","url":null,"abstract":"Visual information is important in surgeons' manipulative performance especially in laparoscopic surgery where tactual feedback is less than in open surgery. The study of surgeons' eye movements is an innovative way of assessing skill, in that a comparison of the eye movement strategies between expert surgeons and novices may show important differences that could be used in training. We conducted a preliminary study comparing the eye movements of 5 experts and 5 novices performing a one-handed aiming task on a computer-based laparoscopic surgery simulator. The performance results showed that experts were quicker and generally committed fewer errors than novices. We investigated eye movements as a possible factor for experts performing better than novices. The results from eye gaze analysis showed that novices needed more visual feedback of the tool position to complete the task than did experts. In addition, the experts tended to maintain eye gaze on the target while manipulating the tool, whereas novices were more varied in their behaviours. 
For example, we found that on some trials, novices tracked the movement of the tool until it reached the target.","PeriodicalId":127538,"journal":{"name":"Eye Tracking Research & Application","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2004-03-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116849001","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 251
期刊
Eye Tracking Research & Application
全部 Acc. Chem. Res. ACS Applied Bio Materials ACS Appl. Electron. Mater. ACS Appl. Energy Mater. ACS Appl. Mater. Interfaces ACS Appl. Nano Mater. ACS Appl. Polym. Mater. ACS BIOMATER-SCI ENG ACS Catal. ACS Cent. Sci. ACS Chem. Biol. ACS Chemical Health & Safety ACS Chem. Neurosci. ACS Comb. Sci. ACS Earth Space Chem. ACS Energy Lett. ACS Infect. Dis. ACS Macro Lett. ACS Mater. Lett. ACS Med. Chem. Lett. ACS Nano ACS Omega ACS Photonics ACS Sens. ACS Sustainable Chem. Eng. ACS Synth. Biol. Anal. Chem. BIOCHEMISTRY-US Bioconjugate Chem. BIOMACROMOLECULES Chem. Res. Toxicol. Chem. Rev. Chem. Mater. CRYST GROWTH DES ENERG FUEL Environ. Sci. Technol. Environ. Sci. Technol. Lett. Eur. J. Inorg. Chem. IND ENG CHEM RES Inorg. Chem. J. Agric. Food. Chem. J. Chem. Eng. Data J. Chem. Educ. J. Chem. Inf. Model. J. Chem. Theory Comput. J. Med. Chem. J. Nat. Prod. J PROTEOME RES J. Am. Chem. Soc. LANGMUIR MACROMOLECULES Mol. Pharmaceutics Nano Lett. Org. Lett. ORG PROCESS RES DEV ORGANOMETALLICS J. Org. Chem. J. Phys. Chem. J. Phys. Chem. A J. Phys. Chem. B J. Phys. Chem. C J. Phys. Chem. Lett. Analyst Anal. Methods Biomater. Sci. Catal. Sci. Technol. Chem. Commun. Chem. Soc. Rev. CHEM EDUC RES PRACT CRYSTENGCOMM Dalton Trans. Energy Environ. Sci. ENVIRON SCI-NANO ENVIRON SCI-PROC IMP ENVIRON SCI-WAT RES Faraday Discuss. Food Funct. Green Chem. Inorg. Chem. Front. Integr. Biol. J. Anal. At. Spectrom. J. Mater. Chem. A J. Mater. Chem. B J. Mater. Chem. C Lab Chip Mater. Chem. Front. Mater. Horiz. MEDCHEMCOMM Metallomics Mol. Biosyst. Mol. Syst. Des. Eng. Nanoscale Nanoscale Horiz. Nat. Prod. Rep. New J. Chem. Org. Biomol. Chem. Org. Chem. Front. PHOTOCH PHOTOBIO SCI PCCP Polym. Chem.
×
引用
GB/T 7714-2015
复制
MLA
复制
APA
复制
导出至
BibTeX EndNote RefMan NoteFirst NoteExpress
×
0
微信
客服QQ
Book学术公众号 扫码关注我们
反馈
×
意见反馈
请填写您的意见或建议
请填写您的手机或邮箱
×
提示
您的信息不完整,为了账户安全,请先补充。
现在去补充
×
提示
您因"违规操作"
具体请查看互助需知
我知道了
×
提示
现在去查看 取消
×
提示
确定
Book学术官方微信
Book学术文献互助
Book学术文献互助群
群 号:481959085
Book学术
文献互助 智能选刊 最新文献 互助须知 联系我们:info@booksci.cn
Book学术提供免费学术资源搜索服务,方便国内外学者检索中英文文献。致力于提供最便捷和优质的服务体验。
Copyright © 2023 Book学术 All rights reserved.
ghs 京公网安备 11010802042870号 京ICP备2023020795号-1