Real-time eye tracking analysis for training in a dynamic task
R. P. Fraga, Ziho Kang, Junehyung Lee, J. Crutchfield
2021 4th International Conference on Bio-Engineering for Smart Technologies (BioSMART), December 8, 2021
DOI: 10.1109/BioSMART54244.2021.9677680
Citations: 1
Abstract
A dynamic task is a task in which the state of the system changes dynamically as a user interacts with its components. For example, when an air traffic controller detects two aircraft on converging flight paths, the controller can select from multiple altitude, heading, and speed clearances to maintain safe separation between them. Some clearance options for those two aircraft, however, may lead to losses of separation with other aircraft. One viable, non-intrusive approach to characterizing a user's interaction with a system is real-time analysis of eye movements at the moments when the system state is changing. Presenting data from such analyses could be an effective way to enhance user training techniques. In this article, we provide a framework for analyzing eye-tracking data to identify useful characteristics, along with associated algorithms, followed by a simple case study to validate the framework.
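To make the core idea concrete, the sketch below illustrates one way such an analysis could be organized: extract gaze samples in a time window around each system state change (e.g., when a clearance is issued) and detect fixations within that window using a simple dispersion-threshold (I-DT) method. This is not the authors' implementation; the data format, function names, and thresholds are illustrative assumptions only.

# Minimal sketch (assumed data format and thresholds), not the paper's algorithm.
from dataclasses import dataclass
from typing import List

@dataclass
class GazeSample:
    t: float   # timestamp in seconds
    x: float   # horizontal gaze position in pixels
    y: float   # vertical gaze position in pixels

@dataclass
class Fixation:
    start: float
    end: float
    x: float
    y: float

def detect_fixations(samples: List[GazeSample],
                     dispersion_px: float = 30.0,
                     min_duration_s: float = 0.1) -> List[Fixation]:
    """Dispersion-threshold (I-DT) fixation detection on a list of gaze samples."""
    fixations, i, n = [], 0, len(samples)
    while i < n:
        j = i
        # Grow the window while the spread of points stays within the dispersion threshold.
        while j < n:
            xs = [s.x for s in samples[i:j + 1]]
            ys = [s.y for s in samples[i:j + 1]]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > dispersion_px:
                break
            j += 1
        window = samples[i:j]
        if window and window[-1].t - window[0].t >= min_duration_s:
            # Record the fixation as the centroid of the window.
            fixations.append(Fixation(
                start=window[0].t, end=window[-1].t,
                x=sum(s.x for s in window) / len(window),
                y=sum(s.y for s in window) / len(window)))
            i = j
        else:
            i += 1
    return fixations

def fixations_around_event(samples: List[GazeSample],
                           event_time: float,
                           window_s: float = 2.0) -> List[Fixation]:
    """Fixations within +/- window_s of a system state change (e.g., a clearance issued)."""
    nearby = [s for s in samples if abs(s.t - event_time) <= window_s]
    return detect_fixations(nearby)

In a training context, the fixations returned for each state-change window could then be summarized (e.g., dwell time on the converging aircraft versus other display areas) and presented to trainees or instructors; the specific metrics and presentation format are beyond what the abstract specifies.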