Characterizing information access needs in gaze-adaptive augmented reality interfaces: implications for fast-paced and dynamic usage contexts

IF 4.5 | CAS Tier 2 (Engineering & Technology) | Q1 (Computer Science, Cybernetics) | Human-Computer Interaction | Pub Date: 2023-10-13 | DOI: 10.1080/07370024.2023.2260788
Aaron L. Gardony, Kana Okano, Gregory I. Hughes, Alex J. Kim, Kai T. Renshaw, Aldis Sipolins, Andrew B. Whitig, Feiyu Lu, Doug A. Bowman
{"title":"Characterizing information access needs in gaze-adaptive augmented reality interfaces: implications for fast-paced and dynamic usage contexts","authors":"Aaron L. Gardony, Kana Okano, Gregory I. Hughes, Alex J. Kim, Kai T. Renshaw, Aldis Sipolins, Andrew B. Whitig, Feiyu Lu, Doug A. Bowman","doi":"10.1080/07370024.2023.2260788","DOIUrl":null,"url":null,"abstract":"Gaze-adaptive interfaces can enable intuitive hands-free augmented reality (AR) interaction but unintentional selection (i.e. “Midas Touch”) can have serious consequences during high-stakes real-world AR use. In the present study, we assessed how simulated gaze-adaptive AR interfaces, implementing single and dual gaze inputs, influence Soldiers’ human performance and user experience (UX) in a fast-paced virtual reality marksmanship task. In Experiment 1, we investigated 1- and 2-stage dwell-based interfaces, finding confirmatory dual gaze dwell input effectively reduced Midas Touch but also reduced task performance and UX compared to an always-on (AO) interface. In Experiment 2, we investigated gaze depth-based interfaces, finding similar negative impacts of confirmatory dwell on Midas Touch, task performance, and UX. Overall, compared to the AO interface, single gaze input interfaces (e.g. single dwell or gaze depth threshold) reduced viewing of task-irrelevant information and yielded similar task performance and UX despite being prone to Midas Touch. Broadly, our findings demonstrate that AR users performing fast-paced dynamic tasks can tolerate some unintentional activation of AR displays if reliable and rapid information access is maintained and point to the need to develop and refine gaze depth estimation algorithms and novel gaze depth-based interfaces that provide rapid access to AR display content.","PeriodicalId":56306,"journal":{"name":"Human-Computer Interaction","volume":"9 1","pages":"0"},"PeriodicalIF":4.5000,"publicationDate":"2023-10-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Human-Computer Interaction","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/07370024.2023.2260788","RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, CYBERNETICS","Score":null,"Total":0}
Citations: 0

Abstract

Gaze-adaptive interfaces can enable intuitive hands-free augmented reality (AR) interaction, but unintentional selection (i.e., "Midas Touch") can have serious consequences during high-stakes real-world AR use. In the present study, we assessed how simulated gaze-adaptive AR interfaces, implementing single and dual gaze inputs, influence Soldiers' human performance and user experience (UX) in a fast-paced virtual reality marksmanship task. In Experiment 1, we investigated 1- and 2-stage dwell-based interfaces, finding that confirmatory dual gaze dwell input effectively reduced Midas Touch but also reduced task performance and UX compared to an always-on (AO) interface. In Experiment 2, we investigated gaze depth-based interfaces, finding similar negative impacts of confirmatory dwell on Midas Touch, task performance, and UX. Overall, compared to the AO interface, single gaze input interfaces (e.g., single dwell or gaze depth threshold) reduced viewing of task-irrelevant information and yielded similar task performance and UX despite being prone to Midas Touch. Broadly, our findings demonstrate that AR users performing fast-paced dynamic tasks can tolerate some unintentional activation of AR displays if reliable and rapid information access is maintained. They also point to the need to develop and refine gaze depth estimation algorithms and novel gaze depth-based interfaces that provide rapid access to AR display content.
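The interaction techniques the abstract names (single dwell, confirmatory 2-stage dwell, and gaze-depth thresholding) follow well-known gaze-input patterns. The sketch below is a minimal per-frame illustration of how such selectors are commonly structured; it is not the paper's implementation, and the class names, the 0.5 s/0.3 s dwell times, and the 1.0 m depth threshold are hypothetical placeholders.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class DwellSelector:
    """Single (1-stage) dwell: select once gaze rests on one target long enough."""
    dwell_time_s: float = 0.5                   # hypothetical threshold
    _target: Optional[str] = None
    _elapsed: float = 0.0

    def update(self, gazed: Optional[str], dt: float) -> Optional[str]:
        """Feed one frame of gaze data; returns a target ID on the frame it fires."""
        if gazed != self._target:
            # Gaze moved to a new target (or away): restart the timer. Longer
            # dwell times resist Midas Touch but delay information access.
            self._target, self._elapsed = gazed, 0.0
            return None
        if gazed is None:
            return None
        self._elapsed += dt
        if self._elapsed >= self.dwell_time_s:
            self._elapsed = 0.0
            return gazed                        # selection event
        return None


@dataclass
class TwoStageDwellSelector:
    """Confirmatory (2-stage) dwell: a first dwell arms the selection and a
    second, shorter dwell commits it. Looking away while armed cancels."""
    arm: Optional[DwellSelector] = None
    confirm: Optional[DwellSelector] = None
    _armed: Optional[str] = None

    def __post_init__(self):
        self.arm = self.arm or DwellSelector(dwell_time_s=0.5)
        self.confirm = self.confirm or DwellSelector(dwell_time_s=0.3)

    def update(self, gazed: Optional[str], dt: float) -> Optional[str]:
        if self._armed is None:
            self._armed = self.arm.update(gazed, dt)    # stage 1: arm
            return None
        # Stage 2: in a real UI the confirming dwell would target a dedicated
        # confirmation widget; re-dwelling on the same target stands in here.
        if self.confirm.update(gazed, dt) == self._armed:
            fired, self._armed = self._armed, None
            return fired                                # confirmed selection
        if gazed != self._armed:
            self._armed = None                          # looked away: cancel
        return None


def depth_activated(gaze_depth_m: float,
                    display_depth_m: float = 1.0,       # hypothetical plane depth
                    tolerance_m: float = 0.25) -> bool:
    """Gaze-depth threshold: treat AR content as attended only while the
    estimated gaze depth (e.g. from vergence) is near the display plane."""
    return abs(gaze_depth_m - display_depth_m) <= tolerance_m
```

In a render loop one would call `selector.update(hit_test(gaze_ray), dt)` each frame. The trade-off the study measures falls directly out of these parameters: the confirmatory stage suppresses Midas Touch at the cost of roughly one extra dwell period of information-access latency.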
Source Journal

Human-Computer Interaction (Engineering & Technology; Computer Science, Cybernetics)
CiteScore: 12.20
Self-citation rate: 3.80%
Annual articles: 15
Review time: >12 weeks
Journal Description

Human-Computer Interaction (HCI) is a multidisciplinary journal defining and reporting on fundamental research in human-computer interaction. The goal of HCI is to be a journal of the highest quality that combines the best research and design work to extend our understanding of human-computer interaction. The target audience is the research community with an interest in both the scientific implications and practical relevance of how interactive computer systems should be designed and how they are actually used. HCI is concerned with the theoretical, empirical, and methodological issues of interaction science and system design as they affect the user.
Latest Articles in This Journal

File hyper-searching explained
Social fidelity in cooperative virtual reality maritime training
The future of PIM: pragmatics and potential
Clarifying and differentiating discoverability
Design and evaluation of a versatile text input device for virtual and immersive workspaces