{"title":"Gaze Tracking Accuracy Maintenance using Traffic Sign Detection","authors":"Shaohua Jia, Do Hyong Koh, Marc Pomplun","doi":"10.1145/3239092.3265947","DOIUrl":null,"url":null,"abstract":"Eye tracking technology is becoming an important component of Advanced Driver Assistance Systems. Unfortunately, eye tracking systems require calibration to correctly associate pupil positions with gaze directions, and periodic calibration would be necessary because the accuracy will deteriorate overtime. This routine reduces the usability and practicability of in-vehicle eye tracking technology. We propose an approach to automatically perform real-time eye tracking calibration. We apply an object detection algorithm to continually detect objects that would likely attract the drivers' attention, such as traffic signs and lights. Those are, in turn, used as moving stimuli for the gaze accuracy maintenance procedure. The error vectors between recorded fixations and moving targets are calculated immediately and the weighted average of them is used to compensate for the offset of fixations in real-time. We evaluated our method both on laboratory data and real driving data. The results show that we can effectively reduce the gaze tracking errors.","PeriodicalId":313474,"journal":{"name":"Adjunct Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications","volume":"65 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Adjunct Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3239092.3265947","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
Eye tracking technology is becoming an important component of Advanced Driver Assistance Systems. Unfortunately, eye tracking systems require calibration to correctly associate pupil positions with gaze directions, and periodic recalibration is necessary because accuracy deteriorates over time. This routine reduces the usability and practicality of in-vehicle eye tracking technology. We propose an approach that performs eye tracking calibration automatically and in real time. We apply an object detection algorithm to continually detect objects that are likely to attract the driver's attention, such as traffic signs and traffic lights. These detections are, in turn, used as moving stimuli for the gaze accuracy maintenance procedure. The error vectors between recorded fixations and the moving targets are computed immediately, and their weighted average is used to compensate for the fixation offset in real time. We evaluated our method on both laboratory data and real driving data. The results show that it effectively reduces gaze tracking errors.
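The following is a minimal sketch of the offset-compensation step the abstract describes: error vectors between fixations and detected targets are averaged with weights and applied to incoming gaze samples. The weighting scheme, the smoothing factor, and all function and parameter names are illustrative assumptions; the abstract does not specify the authors' actual implementation.

```python
import numpy as np

def update_offset(fixations, targets, weights, prev_offset, alpha=0.5):
    """Estimate a gaze-offset correction from fixation/target pairs.

    fixations, targets: (N, 2) arrays of screen coordinates in pixels.
    weights: (N,) array, e.g. detection confidence or recency weights
             (an assumed choice; the weighting is not given in the abstract).
    prev_offset: (2,) previous offset estimate.
    alpha: smoothing factor blending the new and old estimates (assumed).
    """
    errors = np.asarray(targets, float) - np.asarray(fixations, float)  # per-pair error vectors
    w = np.asarray(weights, dtype=float)
    w /= w.sum()
    new_offset = (w[:, None] * errors).sum(axis=0)  # weighted average error vector
    return alpha * new_offset + (1.0 - alpha) * prev_offset

def correct_gaze(raw_gaze, offset):
    """Apply the current offset estimate to a raw gaze sample."""
    return np.asarray(raw_gaze, float) + offset

# Example usage with made-up numbers: three fixation/target pairs
# update the offset, which then corrects subsequent gaze samples.
fix = np.array([[400.0, 300.0], [410.0, 310.0], [405.0, 295.0]])
tgt = np.array([[412.0, 305.0], [420.0, 318.0], [418.0, 301.0]])
conf = np.array([0.9, 0.7, 0.8])

offset = np.zeros(2)
offset = update_offset(fix, tgt, conf, offset)
corrected = correct_gaze(np.array([812.0, 430.0]), offset)
```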