
Latest publications: Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications

Pupil responses signal less inhibition for own relative to other names
Pub Date : 2018-06-14 DOI: 10.1145/3204493.3204576
Lukas Greiter, Christoph Strauch, A. Huckauf
Previous research suggests that self-relevant stimuli, such as one's own name, attract more attention than stimuli that are not self-relevant. In two experiments, we examined to what extent one's own name is also less prone to inhibition than other names, using a Go/NoGo approach. Pupil diameter was employed as a psychophysiological indicator of attention. A total of 36 subjects performed various categorization tasks with their own name and other names. Whereas in Go trials pupil dilation for own and other names did not differ, in NoGo trials significantly larger pupil dilations were obtained for subjects' own names compared to other names. This difference was especially pronounced at longer intervals after stimulus onset, suggesting that inhibitory processing was less effective with one's own name.
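The core NoGo comparison can be sketched in a few lines; all numbers below are invented for illustration and are not data from the study:

```python
from statistics import mean

# Hypothetical per-subject mean pupil dilation (mm above baseline) in NoGo trials.
own_name = [0.42, 0.38, 0.51, 0.47, 0.44]
other_names = [0.31, 0.29, 0.36, 0.33, 0.30]

# Paired, per-subject effect: a positive value means the own name was
# harder to inhibit (larger dilation) than other names.
effects = [own - other for own, other in zip(own_name, other_names)]
print(f"mean own-name effect: {mean(effects):+.3f} mm")  # prints "mean own-name effect: +0.126 mm"
```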
Citations: 2
Scene perception while listening to music: an eye-tracking study
Pub Date : 2018-06-14 DOI: 10.1145/3204493.3204582
J. Petruzálek, Denis Sefara, M. Franěk, M. Kabelác
Previous studies have observed longer fixations and fewer saccades while viewing various outdoor scenes and listening to music compared to a no-music condition. There is also evidence that musical tempo can modulate the speed of eye movements. However, recent investigations from environmental psychology demonstrated differences in eye movement behavior while viewing natural and urban outdoor scenes. The first goal of this study was to replicate the observed effect of music listening while viewing outdoor scenes with different musical stimuli. Next, the effect of a fast and a slow musical tempo on eye movement speed was investigated. Finally, the effect of the type of outdoor scene (natural vs. urban scenes) was explored. The results revealed shorter fixation durations in the no-music condition compared to both music conditions, but these differences were non-significant. Moreover, we did not find differences in eye movements between music conditions with fast and slow tempo. Although significantly shorter fixations were found for viewing urban scenes compared with natural scenes, we did not find a significant interaction between the type of scene and music conditions.
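Comparing fixation durations across listening conditions, as in this design, amounts to grouping a fixation log by condition; a minimal sketch (conditions and durations are invented, not the study's data):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical fixation log: (condition, fixation duration in ms).
fixations = [
    ("no_music", 210), ("no_music", 195), ("no_music", 220),
    ("fast_tempo", 230), ("fast_tempo", 240),
    ("slow_tempo", 235), ("slow_tempo", 245),
]

# Group durations by condition, then compare the condition means.
by_condition = defaultdict(list)
for condition, duration in fixations:
    by_condition[condition].append(duration)

for condition, durations in sorted(by_condition.items()):
    print(condition, round(mean(durations), 1))
```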
Citations: 0
Enhanced representation of web pages for usability analysis with eye tracking
Pub Date : 2018-06-14 DOI: 10.1145/3204493.3214308
Raphael Menges, Hanadi Tamimi, C. Kumar, T. Walber, Christoph Schaefer, Steffen Staab
Eye tracking as a tool to quantify user attention plays a major role in research and application design. For Web page usability, it has become a prominent measure of which sections of a Web page are read, glanced at, or skipped. Such assessments primarily depend on mapping gaze data onto a representation of the Web page. However, current representation methods (a virtual screenshot of the Web page, or a video recording of the complete interaction session) suffer from either accuracy or scalability issues. We present a method that identifies fixed elements on Web pages and stitches user viewport screenshots relative to those fixed elements into an enhanced representation of the page. We conducted an experiment with 10 participants, and the results indicate that analysis with our method is more efficient than with a video recording, an essential criterion for large-scale Web studies.
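The gaze-mapping step such a representation rests on can be sketched as follows; the function name, the fixed-region representation, and all coordinates are illustrative assumptions, not the authors' implementation. Gaze landing on a fixed element (e.g. a sticky header) stays in viewport coordinates, while gaze on scrolling content is shifted by the scroll offset onto the full-page representation:

```python
def map_gaze_to_page(gaze_x, gaze_y, scroll_x, scroll_y, fixed_regions):
    """Map a viewport-relative gaze sample to page coordinates.

    fixed_regions: (x, y, w, h) viewport rectangles occupied by fixed
    elements; gaze inside them is NOT shifted by the scroll offset,
    because those elements never move with the page content.
    """
    for (x, y, w, h) in fixed_regions:
        if x <= gaze_x <= x + w and y <= gaze_y <= y + h:
            return ("fixed", gaze_x, gaze_y)
    return ("content", gaze_x + scroll_x, gaze_y + scroll_y)

# A sticky 80 px header across the top of a 1280 px viewport:
header = [(0, 0, 1280, 80)]
print(map_gaze_to_page(640, 40, 0, 500, header))   # ('fixed', 640, 40)
print(map_gaze_to_page(640, 400, 0, 500, header))  # ('content', 640, 900)
```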
Citations: 7
Implementing innovative gaze analytic methods in clinical psychology: a study on eye movements in antisocial violent offenders
Pub Date : 2018-06-14 DOI: 10.1145/3204493.3204543
Nina A. Gehrer, M. Schönenberg, A. Duchowski, Krzysztof Krejtz
A variety of psychological disorders, such as antisocial personality disorder, have been linked to impairments in facial emotion recognition. Exploring eye movements during the categorization of emotional faces is a promising approach with the potential to reveal differences in the cognitive processes underlying these deficits. Based on this premise, we investigated whether antisocial violent offenders exhibit different scan patterns than a matched healthy control group while categorizing emotional faces. Group differences were analyzed in terms of attention to the eyes, extent of exploration behavior, and structure of switching patterns between Areas of Interest. While we were not able to show clear group differences, the present study is one of the first to demonstrate the feasibility and utility of incorporating recently developed eye movement metrics, such as gaze transition entropy, into clinical psychology.
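Gaze transition entropy, one of the metrics the study brings into clinical use, is the Shannon entropy of the first-order transition matrix over Areas of Interest, weighted by how often each AOI is left. A minimal sketch (the AOI labels are invented, and the distribution over source AOIs is estimated from the observed transitions):

```python
from collections import Counter
from math import log2

def transition_entropy(aoi_sequence):
    """First-order gaze transition entropy over an AOI fixation sequence.

    H_t = -sum_i p(i) * sum_j p(j|i) * log2 p(j|i), where p(i) is the
    share of transitions leaving AOI i and p(j|i) the probability of
    moving from AOI i to AOI j. Higher values mean less regular scanning.
    """
    transitions = list(zip(aoi_sequence, aoi_sequence[1:]))
    if not transitions:
        return 0.0
    from_counts = Counter(src for src, _ in transitions)
    pair_counts = Counter(transitions)
    h = 0.0
    for (src, dst), n in pair_counts.items():
        p_i = from_counts[src] / len(transitions)
        p_j_given_i = n / from_counts[src]
        h -= p_i * p_j_given_i * log2(p_j_given_i)
    return h

# Perfectly regular alternation carries zero transition entropy:
print(transition_entropy(["eyes", "mouth", "eyes", "mouth", "eyes"]))  # 0.0
```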
Citations: 14
Semantic fovea: real-time annotation of ego-centric videos with gaze context
Pub Date : 2018-06-14 DOI: 10.1145/3204493.3208349
C. Auepanwiriyakul, Alex Harston, Pavel Orlov, A. Shafti, A. Faisal
Visual context plays a crucial role in understanding human visual attention in natural, unconstrained tasks: the objects we look at during everyday tasks indicate our ongoing attention. Collecting, interpreting, and studying visual behaviour in unconstrained environments is therefore necessary, but it presents many challenges and has typically required painstaking hand-coding. Here we demonstrate a proof-of-concept system that enables real-time annotation of objects in an egocentric video stream from head-mounted eye-tracking glasses, while concurrently obtaining a live stream of user gaze vectors within their own visual field. Even during dynamic, fast-paced interactions, our system was able to recognise all objects in the user's field of view with moderate accuracy. To validate the concept, the system was used to annotate an in-lab breakfast scenario in real time.
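Once an object detector has produced bounding boxes for a frame, annotating the gaze point reduces to a point-in-box lookup. A minimal sketch with invented labels and coordinates; the smallest-box tie-breaking rule is an assumption of this sketch, not the authors' method:

```python
def fixated_object(gaze, detections):
    """Return the label of the detected object the gaze point falls in.

    gaze: (x, y) in frame coordinates; detections: list of
    (label, x_min, y_min, x_max, y_max) boxes from any object detector.
    When the gaze falls in several overlapping boxes, the smallest box
    wins, on the assumption that the most specific object is fixated.
    """
    gx, gy = gaze
    hits = [
        (label, (x2 - x1) * (y2 - y1))
        for label, x1, y1, x2, y2 in detections
        if x1 <= gx <= x2 and y1 <= gy <= y2
    ]
    if not hits:
        return None
    return min(hits, key=lambda item: item[1])[0]

frame = [("table", 0, 300, 640, 480), ("mug", 200, 320, 260, 400)]
print(fixated_object((230, 350), frame))  # mug (inside both boxes; smaller wins)
print(fixated_object((500, 350), frame))  # table
```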
Citations: 7
Gaze typing in virtual reality: impact of keyboard design, selection method, and motion
Pub Date : 2018-06-14 DOI: 10.1145/3204493.3204541
Vijay Rajanna, J. P. Hansen
Gaze tracking in virtual reality (VR) allows for hands-free text entry, but gaze typing in VR has not yet been well explored. We investigate how keyboard design, selection method, and motion in the field of view may impact typing performance and user experience. We present two studies of people (n = 32) typing with gaze+dwell and gaze+click inputs in VR. In study 1, the typing keyboard was flat and entirely within view; in study 2, it was larger than the view but curved. Both studies included a stationary and a dynamic motion condition in the user's field of view. Our findings suggest that 1) gaze typing in VR is viable but constrained; 2) users perform best (10.15 WPM) when the entire keyboard is within view, whereas the larger-than-view keyboard (9.15 WPM) induces physical strain due to increased head movements; 3) motion in the field of view impacts performance: users perform better while stationary than while in motion; and 4) gaze+click is better than dwell-only (fixed at 550 ms) interaction.
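WPM figures like those above conventionally follow the standard text-entry definition of one "word" per five characters, including spaces. A quick sketch (the sample text and timing are invented, not the study's data):

```python
def words_per_minute(transcribed_text, seconds):
    """Standard text-entry rate: one 'word' = 5 characters, spaces included."""
    return (len(transcribed_text) / 5) / (seconds / 60)

# 33 characters entered in 39 seconds:
print(round(words_per_minute("the quick brown fox jumps over it", 39), 2))  # 10.15
```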
Citations: 76
Ocular reactions in response to impressions of emotion-evoking pictures
Pub Date : 2018-06-14 DOI: 10.1145/3204493.3204574
M. Nakayama
Oculomotor indices in response to emotional stimuli were analysed chronologically in order to investigate the relationships between eye behaviour and emotional activity in human visual perception. Seven participants classified visual stimuli into two emotional groups using subjective ratings of images, such as "Pleasant" and "Unpleasant". Changes in both eye movements and pupil diameters between the two groups of images were compared. Both the mean saccade lengths and the cross power spectra of eye movements for "Unpleasant" ratings were significantly higher than for the other ratings, with regard to the durations for which certain pictures were shown. Also, both mean pupil diameters and their power spectral densities were significantly higher when the durations of the pictures presented were lengthened. When comparing response latencies, pupil reactions followed the appearance of changes in the direction of eye movements. The results suggest that, at specific latencies, "Unpleasant" images activate both eye movements and pupil dilations.
Citations: 3
An implementation of eye movement-driven biometrics in virtual reality
Pub Date : 2018-06-14 DOI: 10.1145/3204493.3208333
D. Lohr, S. Berndt, Oleg V. Komogortsev
As eye tracking can reduce the computational burden of virtual reality devices through a technique known as foveated rendering, we believe not only that eye tracking will be implemented in all virtual reality devices, but that eye tracking biometrics will become the standard method of authentication in virtual reality. Thus, we have created a real-time eye movement-driven authentication system for virtual reality devices. In this work, we describe the architecture of the system and provide a specific implementation that is done using the FOVE head-mounted display. We end with an exploration into future topics of research to spur thought and discussion.
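Foveated rendering, the motivation the authors cite for ubiquitous eye tracking in VR, reduces render resolution with angular distance from the gaze point. A minimal sketch; the thresholds and resolution fractions are illustrative assumptions, and real renderers tune them to the display's field of view and the tracker's latency:

```python
def shading_rate(eccentricity_deg):
    """Pick a render-resolution fraction from the angle to the gaze point."""
    if eccentricity_deg < 5.0:
        return 1.0   # fovea: full resolution
    if eccentricity_deg < 15.0:
        return 0.5   # parafovea: half resolution
    return 0.25      # periphery: quarter resolution

print([shading_rate(e) for e in (2.0, 10.0, 30.0)])  # [1.0, 0.5, 0.25]
```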
Citations: 22
Substantiating reading teachers with scanpaths
Pub Date : 2018-06-14 DOI: 10.1145/3204493.3208329
Sigrid Klerke, J. Madsen, E. Jacobsen, J. P. Hansen
We present a tool that allows reading teachers to record and replay students' voice and gaze behavior during reading. The tool replays scanpaths to reading professionals without prior gaze data experience. On the basis of test experiences with 147 students, we share our initial observations on how teachers make use of the tool to create a dialog with their students.
Citations: 2
An inconspicuous and modular head-mounted eye tracker
Pub Date : 2018-06-14 DOI: 10.1145/3204493.3208345
Shaharam Eivazi, Thomas C. Kübler, Thiago Santini, Enkelejda Kasneci
State-of-the-art head-mounted eye trackers employ glasses-like frames, making them uncomfortable or even impossible to use for prescription eyewear users. Nonetheless, these users represent a notable portion of the population (e.g. the Prevent Blindness America organization reports that about half of the USA population use corrective eyewear for refractive errors alone). Thus, making eye tracking accessible to eyewear users is paramount not only for usability, but also for the ecological validity of eye-tracking studies. In this work, we report on a novel approach to eye-tracker design in the form of a modular and inconspicuous device that can easily be attached to glasses; for users without glasses, we also provide a 3D-printable frame blueprint. Our prototypes include both low-cost Commercial Off-The-Shelf (COTS) and more expensive Original Equipment Manufacturer (OEM) cameras, with sampling rates between 30 and 120 fps and multiple pixel resolutions.
Citations: 6