Aaron L. Gardony, Kana Okano, Gregory I. Hughes, Alex J. Kim, Kai T. Renshaw, Aldis Sipolins, Andrew B. Whitig, Feiyu Lu, Doug A. Bowman
{"title":"在视线自适应增强现实界面中描述信息访问需求:对快节奏和动态使用环境的影响","authors":"Aaron L. Gardony, Kana Okano, Gregory I. Hughes, Alex J. Kim, Kai T. Renshaw, Aldis Sipolins, Andrew B. Whitig, Feiyu Lu, Doug A. Bowman","doi":"10.1080/07370024.2023.2260788","DOIUrl":null,"url":null,"abstract":"Gaze-adaptive interfaces can enable intuitive hands-free augmented reality (AR) interaction but unintentional selection (i.e. “Midas Touch”) can have serious consequences during high-stakes real-world AR use. In the present study, we assessed how simulated gaze-adaptive AR interfaces, implementing single and dual gaze inputs, influence Soldiers’ human performance and user experience (UX) in a fast-paced virtual reality marksmanship task. In Experiment 1, we investigated 1- and 2-stage dwell-based interfaces, finding confirmatory dual gaze dwell input effectively reduced Midas Touch but also reduced task performance and UX compared to an always-on (AO) interface. In Experiment 2, we investigated gaze depth-based interfaces, finding similar negative impacts of confirmatory dwell on Midas Touch, task performance, and UX. Overall, compared to the AO interface, single gaze input interfaces (e.g. single dwell or gaze depth threshold) reduced viewing of task-irrelevant information and yielded similar task performance and UX despite being prone to Midas Touch. 
Broadly, our findings demonstrate that AR users performing fast-paced dynamic tasks can tolerate some unintentional activation of AR displays if reliable and rapid information access is maintained and point to the need to develop and refine gaze depth estimation algorithms and novel gaze depth-based interfaces that provide rapid access to AR display content.","PeriodicalId":56306,"journal":{"name":"Human-Computer Interaction","volume":"9 1","pages":"0"},"PeriodicalIF":4.5000,"publicationDate":"2023-10-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Characterizing information access needs in gaze-adaptive augmented reality interfaces: implications for fast-paced and dynamic usage contexts\",\"authors\":\"Aaron L. Gardony, Kana Okano, Gregory I. Hughes, Alex J. Kim, Kai T. Renshaw, Aldis Sipolins, Andrew B. Whitig, Feiyu Lu, Doug A. Bowman\",\"doi\":\"10.1080/07370024.2023.2260788\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Gaze-adaptive interfaces can enable intuitive hands-free augmented reality (AR) interaction but unintentional selection (i.e. “Midas Touch”) can have serious consequences during high-stakes real-world AR use. In the present study, we assessed how simulated gaze-adaptive AR interfaces, implementing single and dual gaze inputs, influence Soldiers’ human performance and user experience (UX) in a fast-paced virtual reality marksmanship task. In Experiment 1, we investigated 1- and 2-stage dwell-based interfaces, finding confirmatory dual gaze dwell input effectively reduced Midas Touch but also reduced task performance and UX compared to an always-on (AO) interface. In Experiment 2, we investigated gaze depth-based interfaces, finding similar negative impacts of confirmatory dwell on Midas Touch, task performance, and UX. Overall, compared to the AO interface, single gaze input interfaces (e.g. 
single dwell or gaze depth threshold) reduced viewing of task-irrelevant information and yielded similar task performance and UX despite being prone to Midas Touch. Broadly, our findings demonstrate that AR users performing fast-paced dynamic tasks can tolerate some unintentional activation of AR displays if reliable and rapid information access is maintained and point to the need to develop and refine gaze depth estimation algorithms and novel gaze depth-based interfaces that provide rapid access to AR display content.\",\"PeriodicalId\":56306,\"journal\":{\"name\":\"Human-Computer Interaction\",\"volume\":\"9 1\",\"pages\":\"0\"},\"PeriodicalIF\":4.5000,\"publicationDate\":\"2023-10-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Human-Computer Interaction\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1080/07370024.2023.2260788\",\"RegionNum\":2,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, CYBERNETICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Human-Computer Interaction","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/07370024.2023.2260788","RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, CYBERNETICS","Score":null,"Total":0}
Characterizing information access needs in gaze-adaptive augmented reality interfaces: implications for fast-paced and dynamic usage contexts
Gaze-adaptive interfaces can enable intuitive hands-free augmented reality (AR) interaction, but unintentional selection (i.e., "Midas Touch") can have serious consequences during high-stakes real-world AR use. In the present study, we assessed how simulated gaze-adaptive AR interfaces, implementing single and dual gaze inputs, influence Soldiers' human performance and user experience (UX) in a fast-paced virtual reality marksmanship task. In Experiment 1, we investigated 1- and 2-stage dwell-based interfaces, finding that confirmatory dual gaze dwell input effectively reduced Midas Touch but also reduced task performance and UX compared to an always-on (AO) interface. In Experiment 2, we investigated gaze depth-based interfaces, finding similar negative impacts of confirmatory dwell on Midas Touch, task performance, and UX. Overall, compared to the AO interface, single gaze input interfaces (e.g., single dwell or gaze depth threshold) reduced viewing of task-irrelevant information and yielded similar task performance and UX despite being prone to Midas Touch. Broadly, our findings demonstrate that AR users performing fast-paced dynamic tasks can tolerate some unintentional activation of AR displays if reliable and rapid information access is maintained, and they point to the need to develop and refine gaze depth estimation algorithms and novel gaze depth-based interfaces that provide rapid access to AR display content.
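To make the dwell-based selection schemes in the abstract concrete, the following is a minimal illustrative sketch (not the authors' implementation; all names and thresholds are assumptions) of 1-stage vs. 2-stage dwell selection. In the 1-stage case, fixating a target for the dwell threshold activates it immediately; the 2-stage case adds a confirmatory dwell, which is the trade-off the paper studies: fewer Midas Touch activations at the cost of slower information access.

```python
import time

class DwellSelector:
    """Illustrative gaze-dwell selector (hypothetical, for exposition only).

    1-stage: fixating a target for `dwell_s` seconds selects it.
    2-stage: a first dwell "arms" the target; a second (confirmatory)
    dwell of the same duration is required before selection fires.
    """

    def __init__(self, dwell_s=0.5, two_stage=False, now=time.monotonic):
        self.dwell_s = dwell_s
        self.two_stage = two_stage
        self.now = now            # injectable clock (eases testing)
        self._target = None       # target id currently fixated
        self._since = None        # fixation start time
        self._armed = False       # first-stage dwell completed

    def update(self, gaze_target):
        """Feed the currently fixated target id (or None) each frame.

        Returns the id of a newly selected target, or None.
        """
        t = self.now()
        if gaze_target != self._target:
            # Gaze moved: restart the dwell timer on the new target.
            self._target, self._since, self._armed = gaze_target, t, False
            return None
        if gaze_target is None or self._since is None:
            return None
        if t - self._since < self.dwell_s:
            return None  # still dwelling
        if not self.two_stage or self._armed:
            # Single dwell (or confirmatory dwell) complete: select once.
            self._since, self._armed = None, False
            return gaze_target
        # First-stage dwell complete: arm, then await confirmatory dwell.
        self._armed, self._since = True, t
        return None
```

A single-dwell selector like this is fast but fires on any sustained glance (Midas Touch); setting `two_stage=True` models the confirmatory input that Experiment 1 found reduced unintentional selection but hurt task performance and UX.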
About the journal:
Human-Computer Interaction (HCI) is a multidisciplinary journal defining and reporting
on fundamental research in human-computer interaction. The goal of HCI is to be a journal
of the highest quality that combines the best research and design work to extend our
understanding of human-computer interaction. The target audience is the research
community with an interest in both the scientific implications and practical relevance of
how interactive computer systems should be designed and how they are actually used. HCI is
concerned with the theoretical, empirical, and methodological issues of interaction science
and system design as it affects the user.