{"title":"EyeExpress: Expanding Hands-free Input Vocabulary using Eye Expressions","authors":"Pin-Sung Ku, Te-Yen Wu, Mike Y. Chen","doi":"10.1145/3266037.3266123","DOIUrl":null,"url":null,"abstract":"The muscles surrounding the human eye are capable of performing a wide range of expressions such as squinting, blinking, frowning, and raising eyebrows. This work explores the use of these ocular expressions to expand the input vocabularies of hands-free interactions. We conducted a series of user studies: 1) to understand which eye expressions users could consistently perform among all possible expressions, 2) to explore how these expressions can be used for hands-free interactions through a user-defined design process. Our study results showed that most participants could consistently perform 9 of the 18 possible eye expressions. Also, in the user define study the participants used the eye expressions to create hands-free interactions for the state-of-the-art augmented reality (AR) head-mounted displays.","PeriodicalId":208006,"journal":{"name":"Adjunct Proceedings of the 31st Annual ACM Symposium on User Interface Software and Technology","volume":"19 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-10-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Adjunct Proceedings of the 31st Annual ACM Symposium on User Interface Software and Technology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3266037.3266123","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
The muscles surrounding the human eye are capable of performing a wide range of expressions, such as squinting, blinking, frowning, and raising the eyebrows. This work explores the use of these ocular expressions to expand the input vocabulary of hands-free interactions. We conducted a series of user studies: 1) to understand which eye expressions users could consistently perform among all possible expressions, and 2) to explore how these expressions could be used for hands-free interactions through a user-defined design process. Our results showed that most participants could consistently perform 9 of the 18 possible eye expressions. Furthermore, in the user-defined design study, participants used the eye expressions to create hands-free interactions for state-of-the-art augmented reality (AR) head-mounted displays.
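As a purely illustrative sketch of the idea of using an eye-expression vocabulary to drive hands-free AR interaction, the snippet below maps a few of the expressions named in the abstract (blink, squint, frown, raised eyebrows) to placeholder head-mounted-display commands. The expression names, command strings, and `dispatch` helper are assumptions made for this example; they are not the paper's recognition method or the user-defined mappings reported in its elicitation study.

```python
# Hypothetical sketch: routing recognized eye expressions to
# placeholder AR commands. Not taken from the paper.

from enum import Enum, auto


class EyeExpression(Enum):
    """Illustrative subset of the eye expressions mentioned in the abstract."""
    BLINK = auto()
    SQUINT = auto()
    FROWN = auto()
    RAISE_EYEBROWS = auto()


# Placeholder command mapping; the actual user-defined mappings are
# reported in the paper's study, not reproduced here.
COMMAND_MAP = {
    EyeExpression.BLINK: "select",
    EyeExpression.SQUINT: "zoom_in",
    EyeExpression.FROWN: "cancel",
    EyeExpression.RAISE_EYEBROWS: "open_menu",
}


def dispatch(expression: EyeExpression) -> str:
    """Return the AR command associated with a recognized eye expression."""
    return COMMAND_MAP.get(expression, "no_op")


if __name__ == "__main__":
    # Example: a detected blink triggers the "select" command.
    print(dispatch(EyeExpression.BLINK))
```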