{"title":"基于平滑追踪眼球运动的交互式应用中对象选择的相似性度量","authors":"Herlina, S. Wibirama, I. Ardiyanto","doi":"10.1109/ICOIACT.2018.8350701","DOIUrl":null,"url":null,"abstract":"Gaze-based interaction in various digital technologies is a rapidly growing research area. Eye tracking provides an alternative input modality to control interactive contents in computers. Nowadays, eye tracking is not only expected to be a personal assistive technology, but also to be a controller for interactive contents in a public display. Instead of fixational eye movement, smooth pursuit eye movement has been used for object selection in gaze-based interactive applications. However, previous works did not consider various similarity measures for spontaneous object selection. Hence, no information on how different similarity measures affect performance of object selection. To fill this gap, we compared two similarity measures — Euclidean distance and Pearson's product moment coefficient — for object selection. We presented simple interactive applications containing four dynamic objects, each of which was presented subsequently or simultaneously. The participants were asked to select the objects by gazing and following the trajectory of the moving objects. Our results show that object selection with Euclidean distance achieved superior accuracy (78.65%) compared with object selection with Pearson's product moment coefficient (57.38%). In future, our results maybe used as a guideline for development of spontaneous gaze-based interactive application.","PeriodicalId":6660,"journal":{"name":"2018 International Conference on Information and Communications Technology (ICOIACT)","volume":"72 1","pages":"639-644"},"PeriodicalIF":0.0000,"publicationDate":"2018-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"10","resultStr":"{\"title\":\"Similarity measures of object selection in interactive applications based on smooth pursuit eye movements\",\"authors\":\"Herlina, S. Wibirama, I. Ardiyanto\",\"doi\":\"10.1109/ICOIACT.2018.8350701\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Gaze-based interaction in various digital technologies is a rapidly growing research area. Eye tracking provides an alternative input modality to control interactive contents in computers. Nowadays, eye tracking is not only expected to be a personal assistive technology, but also to be a controller for interactive contents in a public display. Instead of fixational eye movement, smooth pursuit eye movement has been used for object selection in gaze-based interactive applications. However, previous works did not consider various similarity measures for spontaneous object selection. Hence, no information on how different similarity measures affect performance of object selection. To fill this gap, we compared two similarity measures — Euclidean distance and Pearson's product moment coefficient — for object selection. We presented simple interactive applications containing four dynamic objects, each of which was presented subsequently or simultaneously. The participants were asked to select the objects by gazing and following the trajectory of the moving objects. Our results show that object selection with Euclidean distance achieved superior accuracy (78.65%) compared with object selection with Pearson's product moment coefficient (57.38%). 
In future, our results maybe used as a guideline for development of spontaneous gaze-based interactive application.\",\"PeriodicalId\":6660,\"journal\":{\"name\":\"2018 International Conference on Information and Communications Technology (ICOIACT)\",\"volume\":\"72 1\",\"pages\":\"639-644\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-03-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"10\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2018 International Conference on Information and Communications Technology (ICOIACT)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICOIACT.2018.8350701\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 International Conference on Information and Communications Technology (ICOIACT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICOIACT.2018.8350701","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Similarity measures of object selection in interactive applications based on smooth pursuit eye movements
Gaze-based interaction with various digital technologies is a rapidly growing research area. Eye tracking provides an alternative input modality for controlling interactive content on computers. Nowadays, eye tracking is expected to serve not only as a personal assistive technology, but also as a controller for interactive content on public displays. Instead of fixational eye movements, smooth pursuit eye movements have been used for object selection in gaze-based interactive applications. However, previous works did not consider various similarity measures for spontaneous object selection, so there is no information on how different similarity measures affect the performance of object selection. To fill this gap, we compared two similarity measures for object selection: Euclidean distance and Pearson's product moment coefficient. We presented simple interactive applications containing four dynamic objects, shown either sequentially or simultaneously. Participants were asked to select the objects by gazing at and following the trajectories of the moving objects. Our results show that object selection with Euclidean distance achieved higher accuracy (78.65%) than object selection with Pearson's product moment coefficient (57.38%). In the future, our results may be used as a guideline for the development of spontaneous gaze-based interactive applications.
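The abstract names the two similarity measures but does not spell out how they are applied to trajectories. Below is a minimal, hypothetical Python/NumPy sketch of how a gaze trajectory might be matched against candidate object trajectories with each measure; the window length, example trajectories, and function names are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch (not the paper's code) of smooth-pursuit object selection
# using the two similarity measures compared in the paper.
import numpy as np

def euclidean_match(gaze, objects):
    """Select the object whose trajectory has the smallest mean
    Euclidean distance to the gaze samples.

    gaze:    (N, 2) array of gaze points over a time window
    objects: dict mapping object id -> (N, 2) array of object positions
    """
    distances = {
        obj_id: np.mean(np.linalg.norm(gaze - traj, axis=1))
        for obj_id, traj in objects.items()
    }
    return min(distances, key=distances.get)

def pearson_match(gaze, objects):
    """Select the object whose trajectory correlates most strongly with
    the gaze samples (Pearson's r averaged over x and y components)."""
    correlations = {}
    for obj_id, traj in objects.items():
        r_x = np.corrcoef(gaze[:, 0], traj[:, 0])[0, 1]
        r_y = np.corrcoef(gaze[:, 1], traj[:, 1])[0, 1]
        correlations[obj_id] = (r_x + r_y) / 2.0
    return max(correlations, key=correlations.get)

# Toy example: noisy gaze roughly follows object "B".
t = np.linspace(0.0, 1.0, 60)  # 60 samples in an illustrative time window
objects = {
    "A": np.column_stack([100 + 200 * t, 300 - 200 * t]),  # moves down-right
    "B": np.column_stack([100 + 200 * t, 100 + 200 * t]),  # moves up-right
}
gaze = objects["B"] + np.random.normal(scale=5.0, size=objects["B"].shape)

print(euclidean_match(gaze, objects))  # expected: "B"
print(pearson_match(gaze, objects))    # expected: "B"
```

In this sketch, the distance-based rule rewards spatial closeness of gaze and object, while the correlation-based rule rewards similarity of motion direction regardless of offset, which is one plausible reason the two measures can yield different selection accuracy.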