{"title":"基于凝视的命令激活技术,使用停留然后手势抵抗意外激活","authors":"Toshiya Isomoto, Shota Yamanaka, B. Shizuki","doi":"10.20380/GI2020.26","DOIUrl":null,"url":null,"abstract":"We show a gaze-based command activation technique that is robust to unintentional command activations using a series of manipulation of dwelling on a target and performing a gesture (dwell-then-gesture manipulation). The gesture we adopt is a simple two-level stroke, which consists of a sequence of two orthogonal strokes. To achieve robustness against unintentional command activations, we design and fine-tune a gesture detection system based on how users move their gaze revealed through three experiments. Although our technique seems to just combine well-known dwell-based and gesture-based manipulations and to not be enough success rate, our work will be the first work that enriches the vocabulary, which is as much as mouse-based interaction.","PeriodicalId":93493,"journal":{"name":"Proceedings. Graphics Interface (Conference)","volume":"1 1","pages":"256-266"},"PeriodicalIF":0.0000,"publicationDate":"2020-04-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":"{\"title\":\"Gaze-based Command Activation Technique Robust Against Unintentional Activation using Dwell-then-Gesture\",\"authors\":\"Toshiya Isomoto, Shota Yamanaka, B. Shizuki\",\"doi\":\"10.20380/GI2020.26\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We show a gaze-based command activation technique that is robust to unintentional command activations using a series of manipulation of dwelling on a target and performing a gesture (dwell-then-gesture manipulation). The gesture we adopt is a simple two-level stroke, which consists of a sequence of two orthogonal strokes. To achieve robustness against unintentional command activations, we design and fine-tune a gesture detection system based on how users move their gaze revealed through three experiments. Although our technique seems to just combine well-known dwell-based and gesture-based manipulations and to not be enough success rate, our work will be the first work that enriches the vocabulary, which is as much as mouse-based interaction.\",\"PeriodicalId\":93493,\"journal\":{\"name\":\"Proceedings. Graphics Interface (Conference)\",\"volume\":\"1 1\",\"pages\":\"256-266\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-04-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"5\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings. Graphics Interface (Conference)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.20380/GI2020.26\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings. Graphics Interface (Conference)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.20380/GI2020.26","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Gaze-based Command Activation Technique Robust Against Unintentional Activation using Dwell-then-Gesture
We present a gaze-based command activation technique that is robust against unintentional command activations. The technique combines dwelling on a target with subsequently performing a gesture (dwell-then-gesture manipulation). The gesture we adopt is a simple two-level stroke, which consists of a sequence of two orthogonal strokes. To achieve robustness against unintentional command activations, we design and fine-tune a gesture detection system based on how users move their gaze, as revealed through three experiments. Although our technique may appear to merely combine well-known dwell-based and gesture-based manipulations, and its success rate may not seem high enough, it is the first to enrich the gaze-interaction command vocabulary to a level comparable to mouse-based interaction.
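The abstract does not specify the detection logic or thresholds the authors use, so the following Python sketch is only an illustration of the general dwell-then-gesture idea it describes: detect a dwell on a target, then check whether the subsequent gaze trace contains two roughly orthogonal strokes. All names and thresholds here (dwell_time, min_stroke_len, turn_thresh_deg, angle_tol_deg) and the greedy stroke segmentation are assumptions for illustration, not the paper's implementation.

    import math
    from dataclasses import dataclass

    @dataclass
    class GazeSample:
        x: float  # screen x in pixels
        y: float  # screen y in pixels
        t: float  # timestamp in seconds

    def _direction(a, b):
        """Direction of the segment a->b in degrees."""
        return math.degrees(math.atan2(b.y - a.y, b.x - a.x))

    def _length(a, b):
        return math.hypot(b.x - a.x, b.y - a.y)

    def segment_strokes(samples, min_stroke_len=60.0, turn_thresh_deg=45.0):
        """Greedily split a gaze trace into straight strokes at sharp turns.
        Thresholds are illustrative, not from the paper."""
        strokes = []
        if len(samples) < 2:
            return strokes
        start = 0
        for i in range(2, len(samples)):
            prev_dir = _direction(samples[start], samples[i - 1])
            step_dir = _direction(samples[i - 1], samples[i])
            turn = abs((step_dir - prev_dir + 180) % 360 - 180)
            if turn > turn_thresh_deg and _length(samples[start], samples[i - 1]) >= min_stroke_len:
                strokes.append((samples[start], samples[i - 1]))
                start = i - 1
        if _length(samples[start], samples[-1]) >= min_stroke_len:
            strokes.append((samples[start], samples[-1]))
        return strokes

    def detect_dwell_then_gesture(samples, target_rect, dwell_time=0.4,
                                  min_stroke_len=60.0, angle_tol_deg=30.0):
        """Return True if the gaze trace shows a dwell on target_rect
        followed by two roughly orthogonal strokes (hypothetical criteria)."""
        # 1. Dwell: gaze stays inside the target for at least dwell_time seconds.
        x0, y0, x1, y1 = target_rect
        dwell_start = None
        dwell_end_idx = None
        for i, s in enumerate(samples):
            inside = x0 <= s.x <= x1 and y0 <= s.y <= y1
            if inside:
                if dwell_start is None:
                    dwell_start = s.t
                elif s.t - dwell_start >= dwell_time:
                    dwell_end_idx = i
                    break
            else:
                dwell_start = None
        if dwell_end_idx is None:
            return False

        # 2. Gesture: the trace after the dwell must contain two strokes
        #    whose directions differ by roughly 90 degrees.
        strokes = segment_strokes(samples[dwell_end_idx:], min_stroke_len)
        if len(strokes) < 2:
            return False
        a1 = _direction(*strokes[0])
        a2 = _direction(*strokes[1])
        diff = abs((a1 - a2 + 180) % 360 - 180)  # smallest angle between strokes
        return abs(diff - 90) <= angle_tol_deg

Requiring both a completed dwell and an orthogonal two-stroke gesture is what would make a technique of this kind resistant to accidental activation: a dwell alone can occur during ordinary reading, and a single stroke alone can occur during saccades, but the full sequence is unlikely to happen unintentionally.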