Designing gaze-supported multimodal interactions for the exploration of large image collections
S. Stellmach, S. Stober, A. Nürnberger, Raimund Dachselt
Conference on Novel Gaze-Controlled Applications, May 26, 2011. DOI: 10.1145/1983302.1983303
While eye tracking is becoming increasingly relevant as a promising input channel, diverse applications using gaze control in a more natural way are still rather limited. Although several researchers have indicated the particularly high potential of gaze-based interaction for pointing tasks, gaze-only approaches are often investigated. However, time-consuming dwell-time activations limit this potential. To overcome this, we present a gaze-supported fisheye lens in combination with (1) a keyboard and (2) a tilt-sensitive mobile multi-touch device. Following a user-centered design approach, we elicited how users would use the aforementioned input combinations. Based on this feedback, we designed a prototype system for interacting with a remote display using gaze and a touch-and-tilt device. This eliminates gaze dwell-time activations and the well-known Midas Touch problem (unintentionally issuing an action via gaze). A formative user study testing our prototype provided further insights into how well the elaborated gaze-supported interaction techniques were experienced by users.
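The core interaction idea in the abstract can be illustrated with a short sketch: gaze continuously positions a fisheye lens (pointing only), while explicit touch and tilt input on a handheld device handles confirmation and lens control. The Python code below is a minimal, hypothetical illustration of this division of labor, not the authors' implementation; all names (GazeSample, FisheyeLens, GazeTouchController, the tilt-to-magnification mapping) are invented for this example.

```python
# Minimal sketch of gaze-supported pointing with touch-based confirmation.
# Gaze only points; touch confirms; tilt adjusts the lens. Because gaze never
# issues actions by itself, no dwell timer is needed and the Midas Touch
# problem (accidental activation by looking) cannot occur.
# All names here are hypothetical, not from the paper.

from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float  # gaze position on the remote display, in pixels
    y: float

@dataclass
class FisheyeLens:
    cx: float = 0.0             # lens center, driven by gaze
    cy: float = 0.0
    magnification: float = 2.0  # adjusted via device tilt

class GazeTouchController:
    """Combines gaze (pointing) with touch and tilt (action and control)."""

    def __init__(self, lens: FisheyeLens):
        self.lens = lens

    def on_gaze(self, sample: GazeSample) -> None:
        # Gaze moves the lens but never triggers actions.
        self.lens.cx, self.lens.cy = sample.x, sample.y

    def on_tilt(self, angle_deg: float) -> None:
        # Assumed mapping: 1x magnification at 0 degrees up to 4x at 45 degrees.
        clamped = min(max(angle_deg, 0.0), 45.0)
        self.lens.magnification = 1.0 + 3.0 * clamped / 45.0

    def on_touch_tap(self) -> str:
        # Only an explicit touch issues an action, at the current gaze point.
        return f"select image under lens at ({self.lens.cx:.0f}, {self.lens.cy:.0f})"

if __name__ == "__main__":
    ctrl = GazeTouchController(FisheyeLens())
    ctrl.on_gaze(GazeSample(512, 384))  # user looks at an image thumbnail
    ctrl.on_tilt(30)                    # tilts the device to zoom the lens
    print(ctrl.on_touch_tap())          # taps to confirm the selection
```

The design point of this separation is that each modality does what it is best at: gaze is fast for pointing but poor as a trigger, while a touch tap is an unambiguous, deliberate confirmation.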