{"title":"Self-Directed Learning using Eye-Tracking: A Comparison between Wearable Head-worn and Webcam-based Technologies","authors":"Sara Khosravi, A. Khan, A. Zoha, R. Ghannam","doi":"10.1109/EDUCON52537.2022.9766468","DOIUrl":null,"url":null,"abstract":"The COVID-19 pandemic has accelerated our transition to an online and self-directed learning environment. In an effort to design better e-learning materials, we investigated the effectiveness of collecting psychophysiological eye-tracking data from participants in response to visual stimuli. In particular, we focused on collecting fixation data since this is closely related to human attention. Current wearable devices allow the measurement of visual data unobtrusively and in real-time, leading to new applications in wearable technology. Despite their accuracy, head-mounted eye trackers are too expensive for deployment on large-scale deployment. Therefore, we developed a low-cost, webcam-based eye tracking solution and compared its performance with a commercial head-mounted eye tracker. Four-minute lecture slides on the 3rd year electronic engineering course were presented as stimuli to eight learners for data collection. Their eye movement was collected within the pre-defined area of interest (AOI). Our results demonstrate that a low-cost webcam-based eye-tracking solution, combined with machine learning algorithms, can achieve similar accuracy to the head-worn tracker. Based on these results, learners can use the eye tracker for attention guidance. Our work also demonstrates that these webcam-based eye trackers can be scaled up and used in large classrooms to provide real-time information to instructors regarding student attention and behaviour.","PeriodicalId":416694,"journal":{"name":"2022 IEEE Global Engineering Education Conference (EDUCON)","volume":"56 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-03-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE Global Engineering Education Conference (EDUCON)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/EDUCON52537.2022.9766468","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 4
Abstract
The COVID-19 pandemic has accelerated our transition to an online and self-directed learning environment. In an effort to design better e-learning materials, we investigated the effectiveness of collecting psychophysiological eye-tracking data from participants in response to visual stimuli. In particular, we focused on collecting fixation data, since fixations are closely related to human attention. Current wearable devices allow visual data to be measured unobtrusively and in real time, leading to new applications in wearable technology. Despite their accuracy, head-mounted eye trackers are too expensive for large-scale deployment. Therefore, we developed a low-cost, webcam-based eye-tracking solution and compared its performance with a commercial head-mounted eye tracker. Four minutes of lecture slides from a third-year electronic engineering course were presented as stimuli to eight learners for data collection, and their eye movements were recorded within pre-defined areas of interest (AOIs). Our results demonstrate that a low-cost, webcam-based eye-tracking solution, combined with machine learning algorithms, can achieve accuracy similar to that of the head-worn tracker. Based on these results, learners can use the eye tracker for attention guidance. Our work also demonstrates that webcam-based eye trackers can be scaled up and used in large classrooms to provide instructors with real-time information about student attention and behaviour.
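The abstract does not give implementation details, but the core analysis it describes, checking whether recorded gaze samples fall inside a pre-defined area of interest, can be illustrated with a minimal sketch. The example below is an assumption-based illustration rather than the authors' code: the sample format, the AOI coordinates, and the gaze_in_aoi_ratio helper are hypothetical. It computes the fraction of gaze samples landing inside a rectangular AOI, a simple proxy for attention on a slide region.

```python
from dataclasses import dataclass
from typing import Iterable, List, Tuple


@dataclass(frozen=True)
class GazeSample:
    """One gaze estimate in screen pixel coordinates."""
    t: float  # timestamp in seconds
    x: float
    y: float


# A rectangular area of interest: (x_min, y_min, x_max, y_max) in pixels.
AOI = Tuple[float, float, float, float]


def in_aoi(sample: GazeSample, aoi: AOI) -> bool:
    """Return True if the gaze sample falls inside the AOI rectangle."""
    x_min, y_min, x_max, y_max = aoi
    return x_min <= sample.x <= x_max and y_min <= sample.y <= y_max


def gaze_in_aoi_ratio(samples: Iterable[GazeSample], aoi: AOI) -> float:
    """Fraction of gaze samples that land inside the AOI (0.0 to 1.0)."""
    samples = list(samples)
    if not samples:
        return 0.0
    hits = sum(in_aoi(s, aoi) for s in samples)
    return hits / len(samples)


if __name__ == "__main__":
    # Hypothetical example: a slide diagram occupying the top-right quarter
    # of a 1920x1080 screen, and a few gaze samples from one learner.
    diagram_aoi: AOI = (960, 0, 1920, 540)
    samples: List[GazeSample] = [
        GazeSample(0.00, 1200, 300),  # inside the AOI
        GazeSample(0.03, 1250, 310),  # inside the AOI
        GazeSample(0.06, 400, 800),   # outside (e.g. looking at the caption)
        GazeSample(0.09, 1100, 250),  # inside the AOI
    ]
    print(f"Time on diagram: {gaze_in_aoi_ratio(samples, diagram_aoi):.0%}")
```

Since both the head-mounted and the webcam-based trackers produce streams of gaze samples of this kind, the same AOI analysis could be applied to each device's output, which is one way such a like-for-like comparison can be framed.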