David J Harris, Mark R Wilson, Martin I Jones, Toby de Burgh, Daisy Mundy, Tom Arthur, Mayowa Olonilua, Samuel J Vine
An investigation of feed-forward and feedback eye movement training in immersive virtual reality.
DOI: 10.16910/jemr.15.3.7
Journal: Journal of Eye Movement Research (Q3, Ophthalmology; Impact Factor 1.3)
Published: 2023-06-12 (journal article, eCollection 2022)
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11229047/pdf/
Citations: 0
Abstract
The control of eye gaze is critical to the execution of many skills. The observation that task experts in many domains exhibit more efficient control of eye gaze than novices has led to the development of gaze training interventions that teach these behaviours. We aimed to extend this literature by i) examining the relative benefits of feed-forward (observing an expert's eye movements) versus feed-back (observing your own eye movements) training, and ii) automating this training within virtual reality. Serving personnel from the British Army and Royal Navy were randomised to either feed-forward or feed-back training within a virtual reality simulation of a room search and clearance task. Eye movement metrics - including visual search, saccade direction, and entropy - were recorded to quantify the efficiency of visual search behaviours. Feed-forward and feed-back eye movement training produced distinct learning benefits, but both accelerated the development of efficient gaze behaviours. However, we found no evidence that these more efficient search behaviours transferred to better decision making in the room clearance task. Our results suggest that integrating eye movement training principles within virtual reality training simulations may be effective, but further work is needed to understand the learning mechanisms.
About the journal:
The Journal of Eye Movement Research is an open-access, peer-reviewed scientific periodical devoted to all aspects of oculomotor functioning, including the methodology of eye recording, neurophysiological and cognitive models, attention, and reading, as well as applications in neurology, ergonomics, media research, and other areas.