Detecting Task Difficulty of Learners in Colonoscopy: Evidence from Eye-Tracking.

Impact Factor: 1.3 · JCR Q3 (Ophthalmology) · CAS Region 4 (Psychology) · Journal of Eye Movement Research · Published: 2021-07-13 · eCollection Date: 2021-01-01 · DOI: 10.16910/jemr.14.2.5
Liu Xin, Zheng Bin, Duan Xiaoqin, He Wenjing, Li Yuandong, Zhao Jinyu, Zhao Chen, Wang Lin
Journal of Eye Movement Research, 14(2). Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8327395/pdf/ · Cited by: 3

Abstract

Eye-tracking can help decode the intricate control mechanisms underlying human performance. In healthcare, physicians-in-training require extensive practice to develop their clinical skills. When a trainee encounters difficulty during practice, they need feedback from experts to improve their performance. Personal feedback is time-consuming and subject to bias. In this study, we tracked the eye movements of trainees during simulated colonoscopy. We examined changes in eye movement behavior during moments of navigation loss (MNL), a signature of task difficulty during colonoscopy, and tested whether deep learning algorithms can detect MNLs from eye-tracking data. Human gaze and pupil characteristics were learned and verified by deep convolutional generative adversarial networks (DCGANs); the generated data were fed to Long Short-Term Memory (LSTM) networks under three different data-feeding strategies to classify MNLs within the entire colonoscopic procedure. Outputs from deep learning were compared against expert judgments of MNLs based on the colonoscopic videos. The best classification outcome was achieved when human eye data were combined with 1,000 synthesized eye-data samples, yielding optimal accuracy (91.80%), sensitivity (90.91%), and specificity (94.12%). This study builds an important foundation for our work on developing an education system for training healthcare skills in simulation.
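The accuracy, sensitivity, and specificity figures reported above are standard confusion-matrix metrics for a binary classifier (MNL vs. non-MNL segments). As a minimal sketch of how such figures are derived — with hypothetical counts, not the study's actual confusion matrix:

```python
def classification_metrics(tp, tn, fp, fn):
    """Compute accuracy, sensitivity (recall), and specificity
    from binary confusion-matrix counts.

    tp: MNL segments correctly detected
    tn: non-MNL segments correctly rejected
    fp: non-MNL segments flagged as MNL
    fn: MNL segments missed
    """
    total = tp + tn + fp + fn
    accuracy = (tp + tn) / total        # fraction of all segments classified correctly
    sensitivity = tp / (tp + fn)        # true-positive rate over actual MNLs
    specificity = tn / (tn + fp)        # true-negative rate over actual non-MNLs
    return accuracy, sensitivity, specificity


# Illustrative counts only: 10/11 actual MNLs detected, 16/17 non-MNLs rejected
acc, sens, spec = classification_metrics(tp=10, tn=16, fp=1, fn=1)
```

With these hypothetical counts, sensitivity is 10/11 ≈ 90.91% and specificity is 16/17 ≈ 94.12%; the paper's reported values would correspond to its own (unpublished here) confusion-matrix counts.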

Source journal metrics — CiteScore: 2.90 · Self-citation rate: 33.30% · Articles published: 10 · Review time: 10 weeks
About the journal: The Journal of Eye Movement Research is an open-access, peer-reviewed scientific periodical devoted to all aspects of oculomotor functioning, including methodology of eye recording, neurophysiological and cognitive models, attention, and reading, as well as applications in neurology, ergonomics, media research, and other areas.
Latest articles from this journal:
- Intelligent Evaluation Method for Design Education and Comparison Research between Visualizing Heat-Maps of Class Activation and Eye-Movement
- The Level of Skills Involved in an Observation-Based Gait Analysis
- Effect of Action Video Games in Eye Movement Behavior: A Systematic Review
- Persistence of Primitive Reflexes Associated with Asymmetries in Fixation and Ocular Motility Values
- The Observer's Lens: The Impact of Personality Traits and Gaze on Facial Impression Inferences