
Latest publications in Eye Tracking Research & Application

Task Classification using Eye Movements and Graph Neural Networks
Pub Date : 2024-06-04 DOI: 10.1145/3649902.3655097
Jarod P. Hartley
{"title":"Task Classification using Eye Movements and Graph Neural Networks","authors":"Jarod P. Hartley","doi":"10.1145/3649902.3655097","DOIUrl":"https://doi.org/10.1145/3649902.3655097","url":null,"abstract":"","PeriodicalId":127538,"journal":{"name":"Eye Tracking Research & Application","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2024-06-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141267441","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Characterizing Learners' Complex Attentional States During Online Multimedia Learning Using Eye-tracking, Egocentric Camera, Webcam, and Retrospective recalls
Pub Date : 2024-06-04 DOI: 10.1145/3649902.3653939
Prasanth Chandran, Yifeng Huang, J. Munsell, Brian Howatt, Brayden Wallace, Lindsey Wilson, Sidney K. D’Mello, Minh Hoai, N. S. Rebello, Lester C. Loschky
{"title":"Characterizing Learners' Complex Attentional States During Online Multimedia Learning Using Eye-tracking, Egocentric Camera, Webcam, and Retrospective recalls","authors":"Prasanth Chandran, Yifeng Huang, J. Munsell, Brian Howatt, Brayden Wallace, Lindsey Wilson, Sidney K. D’Mello, Minh Hoai, N. S. Rebello, Lester C. Loschky","doi":"10.1145/3649902.3653939","DOIUrl":"https://doi.org/10.1145/3649902.3653939","url":null,"abstract":"","PeriodicalId":127538,"journal":{"name":"Eye Tracking Research & Application","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2024-06-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141267606","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Joint Attention on the Future: Pro-Ecological Attitudes Change In Collaboration
Pub Date : 2024-06-04 DOI: 10.1145/3649902.3655100
Iga Szwoch
{"title":"Joint Attention on the Future: Pro-Ecological Attitudes Change In Collaboration","authors":"Iga Szwoch","doi":"10.1145/3649902.3655100","DOIUrl":"https://doi.org/10.1145/3649902.3655100","url":null,"abstract":"","PeriodicalId":127538,"journal":{"name":"Eye Tracking Research & Application","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2024-06-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141268007","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Impact of reward expectation on pupillary change during an adaptive two player card game
Pub Date : 2024-06-04 DOI: 10.1145/3649902.3655649
Ryo Yasuda, Minoru Nakayama
{"title":"Impact of reward expectation on pupillary change during an adaptive two player card game","authors":"Ryo Yasuda, Minoru Nakayama","doi":"10.1145/3649902.3655649","DOIUrl":"https://doi.org/10.1145/3649902.3655649","url":null,"abstract":"","PeriodicalId":127538,"journal":{"name":"Eye Tracking Research & Application","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2024-06-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141268099","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Attempts on detecting Alzheimer's disease by fine-tuning pre-trained model with Gaze Data
Pub Date : 2024-06-04 DOI: 10.1145/3649902.3656360
Junichi Nagasawa, Yuichi Nakata, Mamoru Hiroe, Yujia Zheng, Yutaka Kawaguchi, Yuji Maegawa, Naoki Hojo, Tetsuya Takiguchi, Minoru Nakayama, Maki Uchimura, Yuma Sonoda, Hisatomo Kowa, Takashi Nagamatsu
{"title":"Attempts on detecting Alzheimer's disease by fine-tuning pre-trained model with Gaze Data","authors":"Junichi Nagasawa, Yuichi Nakata, Mamoru Hiroe, Yujia Zheng, Yutaka Kawaguchi, Yuji Maegawa, Naoki Hojo, Tetsuya Takiguchi, Minoru Nakayama, Maki Uchimura, Yuma Sonoda, Hisatomo Kowa, Takashi Nagamatsu","doi":"10.1145/3649902.3656360","DOIUrl":"https://doi.org/10.1145/3649902.3656360","url":null,"abstract":"","PeriodicalId":127538,"journal":{"name":"Eye Tracking Research & Application","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2024-06-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141266328","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Analyzing and Interpreting Eye Movements in C++: Using Holistic Models of Image Perception
Pub Date : 2024-06-04 DOI: 10.1145/3649902.3655093
Florian Hauser, Lisa Grabinger, Timur Ezer, J. Mottok, Hans Gruber
{"title":"Analyzing and Interpreting Eye Movements in C++: Using Holistic Models of Image Perception","authors":"Florian Hauser, Lisa Grabinger, Timur Ezer, J. Mottok, Hans Gruber","doi":"10.1145/3649902.3655093","DOIUrl":"https://doi.org/10.1145/3649902.3655093","url":null,"abstract":"","PeriodicalId":127538,"journal":{"name":"Eye Tracking Research & Application","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2024-06-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141266582","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
The role of stress in silent reading
Pub Date : 2024-06-04 DOI: 10.1145/3649902.3656492
Kristina Cergol, M. Palmović
{"title":"The role of stress in silent reading","authors":"Kristina Cergol, M. Palmović","doi":"10.1145/3649902.3656492","DOIUrl":"https://doi.org/10.1145/3649902.3656492","url":null,"abstract":"","PeriodicalId":127538,"journal":{"name":"Eye Tracking Research & Application","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2024-06-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141266654","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Visual feature extraction via eye tracking for saliency driven 2D/3D registration
Pub Date : 2004-03-22 DOI: 10.1145/968363.968371
A. Chung, F. Deligianni, Xiao-Peng Hu, Guang-Zhong Yang
This paper presents a new technique for extracting visual saliency from experimental eye tracking data. An eye-tracking system is employed to determine which features a group of human observers considered salient when viewing a set of video images. With this information, a biologically inspired saliency map is derived by transforming each observed video image into a feature space representation. By using a feature normalisation process based on the relative abundance of visual features within the background image and of those dwelled upon along eye-tracking scan paths, features related to visual attention are determined. These features are then back-projected to the image domain to determine spatial areas of interest for unseen video images. The strengths and weaknesses of the method are demonstrated with feature correspondence for 2D to 3D image registration of endoscopy videos with computed tomography data. The biologically derived saliency map is employed to provide an image similarity measure that forms the heart of the 2D/3D registration method. It is shown that by processing only the regions of interest determined by the saliency map, rendering overhead can be greatly reduced. Significant improvements in pose estimation efficiency can be achieved without apparent reduction in registration accuracy when compared with a non-saliency-based similarity measure.
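The feature-normalisation step described in this abstract lends itself to a compact illustration. The sketch below is a minimal, assumption-laden reading of the idea rather than the authors' implementation: it uses grayscale intensity and edge magnitude as stand-in features, a Gaussian-smoothed fixation map as the dwell estimate, and histogram binning to compare how often each feature value is fixated versus how abundant it is in the background image. The feature set, bin count, and smoothing width are all choices made here for illustration.

```python
# Minimal sketch of saliency extraction from fixation data via feature
# normalisation (illustrative only; the features, bin count, and Gaussian
# widths are assumptions, not the authors' implementation).
import numpy as np
from scipy import ndimage


def feature_maps(image):
    """Compute simple per-pixel feature maps: intensity and edge magnitude."""
    intensity = image.astype(float)
    gx = ndimage.sobel(intensity, axis=1)
    gy = ndimage.sobel(intensity, axis=0)
    return {"intensity": intensity, "edges": np.hypot(gx, gy)}


def fixation_mask(shape, fixations, sigma=15.0):
    """Turn (row, col) fixation points into a smoothed dwell-density map."""
    mask = np.zeros(shape, dtype=float)
    for r, c in fixations:
        mask[int(r), int(c)] += 1.0
    return ndimage.gaussian_filter(mask, sigma)


def normalised_saliency(image, fixations, n_bins=32):
    """Weight each feature bin by how over-represented it is at fixated
    locations relative to its abundance in the whole image, then back-project
    the weights to the image domain and combine across features."""
    maps = feature_maps(image)
    dwell = fixation_mask(image.shape, fixations)
    saliency = np.zeros(image.shape, dtype=float)
    for fmap in maps.values():
        bin_edges = np.histogram_bin_edges(fmap, bins=n_bins)
        idx = np.clip(np.digitize(fmap, bin_edges) - 1, 0, n_bins - 1)
        # Abundance of each feature value in the background image ...
        background = np.bincount(idx.ravel(), minlength=n_bins) + 1e-9
        # ... versus how much dwell density landed on that feature value.
        attended = np.bincount(idx.ravel(), weights=dwell.ravel(),
                               minlength=n_bins)
        weights = attended / background        # relative over-representation
        saliency += weights[idx]               # back-projection to pixels
    return saliency / np.maximum(saliency.max(), 1e-9)
```

In this reading, the resulting map could be thresholded to pick the regions of interest over which the 2D/3D similarity measure is evaluated, which is where the reported savings in rendering overhead would come from.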
Citations: 13
Mental imagery in problem solving: an eye tracking study
Pub Date : 2004-03-22 DOI: 10.1145/968363.968382
Daesub Yoon, N. Hari Narayanan
Cognitive models and empirical studies of problem solving in visuo-spatial and causal domains suggest that problem solving tasks in such domains invoke cognitive processes involving mental animation and imagery. If these internal processes are externally manifested in the form of eye movements, such tasks present situations in which the trajectory of a user's visual attention can provide an Attentive User Interface [Vertegaal 2002] with clues about his or her information needs. In this paper, we briefly review research on problem solving that involves mental imagery, and describe an experiment that looked for evidence and effects of an imagery strategy in problem solving. We eye-tracked 90 subjects solving two causal reasoning problems: one in which a diagram of the problem appeared on the stimulus display, and a second, related problem posed on a blank display. Results indicated that 42% of the subjects employed mental imagery and visually scanned the display in a correspondingly systematic fashion. This suggests that information displays that respond to a user's visual attention trajectory, a kind of Attentive User Interface, are more likely to benefit this class of users.
Citations: 50
Effects of feedback on eye typing with a short dwell time
Pub Date : 2004-03-22 DOI: 10.1145/968363.968390
P. Majaranta, A. Aula, Kari-Jouko Räihä
Eye typing provides a means of communication, especially for people with severe disabilities. Recent research indicates that the type of feedback impacts typing speed, error rate, and the user's need to switch her gaze between the on-screen keyboard and the typed text field. The current study focuses on feedback issues when a short dwell time (450 ms, vs. 900 ms in a previous study) is used. Results show that the findings obtained using longer dwell times only partly apply to shorter dwell times. For example, with a short dwell time, spoken feedback results in slower text entry and double-entry errors. A short dwell time requires sharp and clear feedback that supports the typing rhythm.
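The dwell-based selection mechanism at the heart of eye typing can be sketched in a few lines. The following is a minimal illustration assuming the 450 ms threshold from the study; the per-sample update interface, the key-hit test, and the reset behaviour after a selection are assumptions of this sketch, not details from the paper.

```python
# Minimal sketch of dwell-time key selection for eye typing (illustrative;
# the 450 ms threshold follows the study, everything else is an assumption).
from dataclasses import dataclass
from typing import Optional


@dataclass
class DwellSelector:
    dwell_ms: float = 450.0              # short dwell time used in the study
    _current_key: Optional[str] = None   # key currently being dwelled on
    _elapsed_ms: float = 0.0             # accumulated dwell on that key

    def update(self, key_under_gaze: Optional[str], dt_ms: float) -> Optional[str]:
        """Feed the key under the gaze point once per gaze sample.
        Returns the selected key once gaze has dwelled long enough,
        otherwise None."""
        if key_under_gaze != self._current_key:
            self._current_key = key_under_gaze   # gaze moved: restart timer
            self._elapsed_ms = 0.0
            return None
        if key_under_gaze is None:
            return None
        self._elapsed_ms += dt_ms
        if self._elapsed_ms >= self.dwell_ms:
            self._elapsed_ms = 0.0               # reset so the key is not
            self._current_key = None             # retyped without looking away
            return key_under_gaze
        return None


# Usage: call update() once per gaze sample (e.g. every 16.7 ms at 60 Hz).
selector = DwellSelector()
typed = [k for k in (selector.update("A", 16.7) for _ in range(30)) if k]
print(typed)   # ['A'] after roughly 450 ms of continuous gaze on "A"
```

In such a design, the feedback discussed in the abstract (visual, auditory, or spoken) would be triggered as the accumulated dwell approaches and then crosses the threshold, which is where a short dwell time leaves little room for slow feedback such as speech.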
Citations: 77