Previous research suggests that self-relevant stimuli, such as one's own name, attract more attention than stimuli that are not self-relevant. In two experiments, we examined the extent to which one's own name is also less prone to inhibition than other names, using a Go/NoGo approach. Pupil diameter was employed as a psychophysiological indicator of attention. A total of 36 subjects performed various categorization tasks with their own name and other names. Whereas pupil dilation for own and other names did not differ in Go trials, significantly larger pupil dilations were obtained for subjects' own names compared to other names in NoGo trials. This difference was especially pronounced at longer intervals after stimulus onset, suggesting that inhibitory processing was less effective for one's own name.
{"title":"Pupil responses signal less inhibition for own relative to other names","authors":"Lukas Greiter, Christoph Strauch, A. Huckauf","doi":"10.1145/3204493.3204576","DOIUrl":"https://doi.org/10.1145/3204493.3204576","url":null,"abstract":"Previous research suggests that self-relevant stimuli, such as one's own name, attract more attention than stimuli that are not self-relevant. In two experiments, we examined to which extent the own name is also less prone to inhibition than other names using a Go/NoGo approach. The pupil diameter was employed as psychophysiological indicator of attention. A total of 36 subjects performed various categorization tasks, with their own name and other names. Whereas in Go-trials, pupil dilation for own and other names did not differ, in NoGo-trials, significant larger pupil dilations were obtained for subjects' own names compared to other names. This difference was especially pronounced at larger intervals after stimulus onset, suggesting that inhibitory processing was less effective with one's own name.","PeriodicalId":237808,"journal":{"name":"Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications","volume":"54 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124972461","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
J. Petruzálek, Denis Sefara, M. Franěk, M. Kabelác
Previous studies have observed longer fixations and fewer saccades while viewing various outdoor scenes and listening to music, compared to a no-music condition. There is also evidence that musical tempo can modulate the speed of eye movements. In addition, recent investigations in environmental psychology have demonstrated differences in eye movement behavior while viewing natural versus urban outdoor scenes. The first goal of this study was to replicate the previously observed effect of music listening on eye movements while viewing outdoor scenes, using different musical stimuli. Next, the effect of a fast versus a slow musical tempo on eye movement speed was investigated. Finally, the effect of the type of outdoor scene (natural vs. urban) was explored. The results revealed shorter fixation durations in the no-music condition compared to both music conditions, but these differences were not significant. Moreover, we did not find differences in eye movements between the fast- and slow-tempo music conditions. Although significantly shorter fixations were found when viewing urban scenes compared with natural scenes, we did not find a significant interaction between the type of scene and the music conditions.
{"title":"Scene perception while listening to music: an eye-tracking study","authors":"J. Petruzálek, Denis Sefara, M. Franěk, M. Kabelác","doi":"10.1145/3204493.3204582","DOIUrl":"https://doi.org/10.1145/3204493.3204582","url":null,"abstract":"Previous studies have observed longer fixations and fewer saccades while viewing various outdoor scenes and listening to music compared to a no-music condition. There is also evidence that musical tempo can modulate the speed of eye movements. However, recent investigations from environmental psychology demonstrated differences in eye movement behavior while viewing natural and urban outdoor scenes. The first goal of this study was to replicate the observed effect of music listening while viewing outdoor scenes with different musical stimuli. Next, the effect of a fast and a slow musical tempo on eye movement speed was investigated. Finally, the effect of the type of outdoor scene (natural vs. urban scenes) was explored. The results revealed shorter fixation durations in the no-music condition compared to both music conditions, but these differences were non-significant. Moreover, we did not find differences in eye movements between music conditions with fast and slow tempo. Although significantly shorter fixations were found for viewing urban scenes compared with natural scenes, we did not find a significant interaction between the type of scene and music conditions.","PeriodicalId":237808,"journal":{"name":"Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128511669","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Raphael Menges, Hanadi Tamimi, C. Kumar, T. Walber, Christoph Schaefer, Steffen Staab
Eye tracking as a tool to quantify user attention plays a major role in research and application design. For Web page usability, it has become a prominent measure to assess which sections of a Web page are read, glanced at, or skipped. Such assessments primarily depend on the mapping of gaze data to a representation of the Web page. However, current representation methods, namely a virtual screenshot of the Web page or a video recording of the complete interaction session, suffer from either accuracy or scalability issues. We present a method that identifies fixed elements on Web pages and combines user viewport screenshots in relation to these fixed elements to form an enhanced representation of the page. We conducted an experiment with 10 participants, and the results indicate that analysis with our method is more efficient than with a video recording, which is an essential criterion for large-scale Web studies.
{"title":"Enhanced representation of web pages for usability analysis with eye tracking","authors":"Raphael Menges, Hanadi Tamimi, C. Kumar, T. Walber, Christoph Schaefer, Steffen Staab","doi":"10.1145/3204493.3214308","DOIUrl":"https://doi.org/10.1145/3204493.3214308","url":null,"abstract":"Eye tracking as a tool to quantify user attention plays a major role in research and application design. For Web page usability, it has become a prominent measure to assess which sections of a Web page are read, glanced or skipped. Such assessments primarily depend on the mapping of gaze data to a Web page representation. However, current representation methods, a virtual screenshot of the Web page or a video recording of the complete interaction session, suffer either from accuracy or scalability issues. We present a method that identifies fixed elements on Web pages and combines user viewport screenshots in relation to fixed elements for an enhanced representation of the page. We conducted an experiment with 10 participants and the results signify that analysis with our method is more efficient than a video recording, which is an essential criterion for large scale Web studies.","PeriodicalId":237808,"journal":{"name":"Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications","volume":"86 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127132131","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Nina A. Gehrer, M. Schönenberg, A. Duchowski, Krzysztof Krejtz
A variety of psychological disorders, such as antisocial personality disorder, have been linked to impairments in facial emotion recognition. Exploring eye movements during the categorization of emotional faces is a promising approach with the potential to reveal possible differences in the cognitive processes underlying these deficits. Based on this premise, we investigated whether antisocial violent offenders exhibit different scan patterns than a matched healthy control group while categorizing emotional faces. Group differences were analyzed in terms of attention to the eyes, extent of exploration behavior, and structure of switching patterns between Areas of Interest. While we were not able to show clear group differences, the present study is one of the first to demonstrate the feasibility and utility of incorporating recently developed eye movement metrics, such as gaze transition entropy, into clinical psychology.
{"title":"Implementing innovative gaze analytic methods in clinical psychology: a study on eye movements in antisocial violent offenders","authors":"Nina A. Gehrer, M. Schönenberg, A. Duchowski, Krzysztof Krejtz","doi":"10.1145/3204493.3204543","DOIUrl":"https://doi.org/10.1145/3204493.3204543","url":null,"abstract":"A variety of psychological disorders like antisocial personality disorder have been linked to impairments in facial emotion recognition. Exploring eye movements during categorization of emotional faces is a promising approach with the potential to reveal possible differences in cognitive processes underlying these deficits. Based on this premise we investigated whether antisocial violent offenders exhibit different scan patterns compared to a matched healthy control group while categorizing emotional faces. Group differences were analyzed in terms of attention to the eyes, extent of exploration behavior and structure of switching patterns between Areas of Interest. While we were not able to show clear group differences, the present study is one of the first that demonstrates the feasibility and utility of incorporating recently developed eye movement metrics such as gaze transition entropy into clinical psychology.","PeriodicalId":237808,"journal":{"name":"Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications","volume":"65 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129128762","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
C. Auepanwiriyakul, Alex Harston, Pavel Orlov, A. Shafti, A. Faisal
Visual context plays a crucial role in understanding human visual attention in natural, unconstrained tasks: the objects we look at during everyday tasks provide an indicator of our ongoing attention. Collecting, interpreting, and studying visual behaviour in unconstrained environments is therefore necessary, but it presents many challenges and typically requires painstaking hand-coding. Here we demonstrate a proof-of-concept system that enables real-time annotation of objects in an egocentric video stream from head-mounted eye-tracking glasses. We concurrently obtain a live stream of user gaze vectors with respect to their own visual field. Even during dynamic, fast-paced interactions, our system was able to recognise all objects in the user's field of view with moderate accuracy. To validate our concept, the system was used to annotate an in-lab breakfast scenario in real time.
{"title":"Semantic fovea: real-time annotation of ego-centric videos with gaze context","authors":"C. Auepanwiriyakul, Alex Harston, Pavel Orlov, A. Shafti, A. Faisal","doi":"10.1145/3204493.3208349","DOIUrl":"https://doi.org/10.1145/3204493.3208349","url":null,"abstract":"Visual context plays a crucial role in understanding human visual attention in natural, unconstrained tasks - the objects we look at during everyday tasks provide an indicator of our ongoing attention. Collection, interpretation, and study of visual behaviour in unconstrained environments therefore is necessary, however presents many challenges, requiring painstaking hand-coding. Here we demonstrate a proof-of-concept system that enables real-time annotation of objects in an egocentric video stream from head-mounted eye-tracking glasses. We concurrently obtain a live stream of user gaze vectors with respect to their own visual field. Even during dynamic, fast-paced interactions, our system was able to recognise all objects in the user's field-of-view with moderate accuracy. To validate our concept, our system was used to annotate an in-lab breakfast scenario in real time.","PeriodicalId":237808,"journal":{"name":"Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications","volume":"45 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127947164","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Gaze tracking in virtual reality (VR) allows for hands-free text entry, but this form of text entry has not yet been explored in VR. We investigate how keyboard design, selection method, and motion in the field of view may impact typing performance and user experience. We present two studies of people (n = 32) typing with gaze+dwell and gaze+click inputs in VR. In study 1, the keyboard was flat and within view; in study 2, it was larger than the view but curved. Both studies included a stationary and a dynamic motion condition in the user's field of view. Our findings suggest that 1) gaze typing in VR is viable but constrained, 2) users perform best (10.15 WPM) when the entire keyboard is within view, whereas the larger-than-view keyboard (9.15 WPM) induces physical strain due to increased head movements, 3) motion in the field of view impacts performance: users perform better while stationary than when in motion, and 4) gaze+click is better than dwell-only interaction (dwell time fixed at 550 ms).
{"title":"Gaze typing in virtual reality: impact of keyboard design, selection method, and motion","authors":"Vijay Rajanna, J. P. Hansen","doi":"10.1145/3204493.3204541","DOIUrl":"https://doi.org/10.1145/3204493.3204541","url":null,"abstract":"Gaze tracking in virtual reality (VR) allows for hands-free text entry, but it has not yet been explored. We investigate how the keyboard design, selection method, and motion in the field of view may impact typing performance and user experience. We present two studies of people (n = 32) typing with gaze+dwell and gaze+click inputs in VR. In study 1, the typing keyboard was flat and within-view; in study 2, it was larger-than-view but curved. Both studies included a stationary and a dynamic motion conditions in the user's field of view. Our findings suggest that 1) gaze typing in VR is viable but constrained, 2) the users perform best (10.15 WPM) when the entire keyboard is within-view; the larger-than-view keyboard (9.15 WPM) induces physical strain due to increased head movements, 3) motion in the field of view impacts the user's performance: users perform better while stationary than when in motion, and 4) gaze+click is better than dwell only (fixed at 550 ms) interaction.","PeriodicalId":237808,"journal":{"name":"Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123351356","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Oculomotor indices in response to emotional stimuli were analysed chronologically in order to investigate the relationships between eye behaviour and emotional activity in human visual perception. Seven participants classified visual stimuli into two emotional groups, "Pleasant" and "Unpleasant", using subjective ratings of the images. Changes in both eye movements and pupil diameters between the two groups of images were compared. Both the mean saccade lengths and the cross power spectra of eye movements for "Unpleasant" ratings were significantly higher than for other ratings at certain picture presentation durations. Also, both mean pupil diameters and their power spectral densities were significantly higher when the presentation durations of the pictures were lengthened. When comparing response latencies, pupil reactions followed the appearance of changes in the direction of eye movements. The results suggest that, at specific latencies, "Unpleasant" images activate both eye movements and pupil dilations.
{"title":"Ocular reactions in response to impressions of emotion-evoking pictures","authors":"M. Nakayama","doi":"10.1145/3204493.3204574","DOIUrl":"https://doi.org/10.1145/3204493.3204574","url":null,"abstract":"Oculomotor indicies in response to emotional stimuli were analysed chronologically in order to investigate the relationships between eye behaviour and emotional activity in human visual perception. Seven participants classified visual stimuli into two emotional groups using subjective ratings of images, such as \"Pleasant\" and \"Unpleasant\". Changes in both eye movements and pupil diameters between the two groups of images were compared. Both the mean saccade lengths and the cross power spectra of eye movements for \"Unpleasant\" ratings were significantly higher than for other ratings of eye movements in regards to certain the duration of certain pictures shown. Also, both mean pupil diameters and their power spectrum densities were significantly higher when the durations of pictures presented were lengthened. When comparing the response latencies, pupil reactions followed the appearance of changes in the direction of eye movements. The results suggest that at specific latencies, \"Unpleasant\" images activate both eye movements and pupil dilations.","PeriodicalId":237808,"journal":{"name":"Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126384166","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
As eye tracking can reduce the computational burden of virtual reality devices through a technique known as foveated rendering, we believe not only that eye tracking will be implemented in all virtual reality devices, but also that eye tracking biometrics will become the standard method of authentication in virtual reality. Thus, we have created a real-time eye movement-driven authentication system for virtual reality devices. In this work, we describe the architecture of the system and provide a specific implementation using the FOVE head-mounted display. We end with an exploration of future topics of research to spur thought and discussion.
{"title":"An implementation of eye movement-driven biometrics in virtual reality","authors":"D. Lohr, S. Berndt, Oleg V. Komogortsev","doi":"10.1145/3204493.3208333","DOIUrl":"https://doi.org/10.1145/3204493.3208333","url":null,"abstract":"As eye tracking can reduce the computational burden of virtual reality devices through a technique known as foveated rendering, we believe not only that eye tracking will be implemented in all virtual reality devices, but that eye tracking biometrics will become the standard method of authentication in virtual reality. Thus, we have created a real-time eye movement-driven authentication system for virtual reality devices. In this work, we describe the architecture of the system and provide a specific implementation that is done using the FOVE head-mounted display. We end with an exploration into future topics of research to spur thought and discussion.","PeriodicalId":237808,"journal":{"name":"Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115858360","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Sigrid Klerke, J. Madsen, E. Jacobsen, J. P. Hansen
We present a tool that allows reading teachers to record and replay students' voice and gaze behavior during reading. The tool replays scanpaths to reading professionals without prior gaze data experience. On the basis of test experiences with 147 students, we share our initial observations on how teachers make use of the tool to create a dialog with their students.
{"title":"Substantiating reading teachers with scanpaths","authors":"Sigrid Klerke, J. Madsen, E. Jacobsen, J. P. Hansen","doi":"10.1145/3204493.3208329","DOIUrl":"https://doi.org/10.1145/3204493.3208329","url":null,"abstract":"We present a tool that allows reading teachers to record and replay students' voice and gaze behavior during reading. The tool replays scanpaths to reading professionals without prior gaze data experience. On the basis of test experiences with 147 students, we share our initial observations on how teachers make use of the tool to create a dialog with their students.","PeriodicalId":237808,"journal":{"name":"Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications","volume":"191 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121736201","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Shaharam Eivazi, Thomas C. Kübler, Thiago Santini, Enkelejda Kasneci
State-of-the-art head-mounted eye trackers employ glasses-like frames, making their usage uncomfortable or even impossible for prescription eyewear users. Nonetheless, these users represent a notable portion of the population (e.g., the Prevent Blindness America organization reports that about half of the USA population uses corrective eyewear for refractive errors alone). Thus, making eye tracking accessible for eyewear users is paramount not only to improve usability, but also to ensure the ecological validity of eye tracking studies. In this work, we report on a novel approach for eye tracker design in the form of a modular and inconspicuous device that can be easily attached to glasses; for users without glasses, we also provide a 3D printable frame blueprint. Our prototypes include both low-cost Commercial Off-The-Shelf (COTS) and more expensive Original Equipment Manufacturer (OEM) cameras, with sampling rates ranging between 30 and 120 fps and multiple pixel resolutions.
{"title":"An inconspicuous and modular head-mounted eye tracker","authors":"Shaharam Eivazi, Thomas C. Kübler, Thiago Santini, Enkelejda Kasneci","doi":"10.1145/3204493.3208345","DOIUrl":"https://doi.org/10.1145/3204493.3208345","url":null,"abstract":"State of the art head mounted eye trackers employ glasses like frames, making their usage uncomfortable or even impossible for prescription eyewear users. Nonetheless, these users represent a notable portion of the population (e.g. the Prevent Blindness America organization reports that about half of the USA population use corrective eyewear for refractive errors alone). Thus, making eye tracking accessible for eyewear users is paramount to not only improve usability, but is also key for the ecological validity of eye tracking studies. In this work, we report on a novel approach for eye tracker design in the form of a modular and inconspicuous device that can be easily attached to glasses; for users without glasses, we also provide a 3D printable frame blueprint. Our prototypes include both low cost Commerical Out of The Shelf (COTS) and more expensive Original Equipment manufacturer (OEM) cameras, with sampling rates ranging between 30 and 120 fps and multiple pixel resolutions.","PeriodicalId":237808,"journal":{"name":"Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications","volume":"3 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133168560","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}