Optical See-Through Head-Mounted Display With Mitigated Parallax-Related Registration Errors: A User Study Validation

IEEE Transactions on Human-Machine Systems (Impact Factor 3.5, JCR Q2 in Computer Science, Artificial Intelligence; CAS Tier 3, Computer Science)
Publication date: 2024-10-15 | DOI: 10.1109/THMS.2024.3468019 | Volume 54, Issue 6, pp. 668-677
Nadia Cattari; Fabrizio Cutolo; Vincenzo Ferrari
Full text: https://ieeexplore.ieee.org/document/10718696/
Citations: 0

Abstract

For an optical see-through (OST) augmented reality (AR) head-mounted display (HMD) to assist in performing high-precision activities in the peripersonal space, a fundamental requirement is correct spatial registration between the virtual information and the real environment. This registration can be achieved through a calibration procedure that parameterizes the virtual rendering camera by means of an eye-replacement camera observing a calibration pattern rendered onto the OST display. In a previous feasibility study, we showed, using the same eye-replacement camera employed for the calibration, that for an OST display with a focal plane close to the user's working distance there is no need for prior-to-use viewpoint-specific calibration refinements obtained through eye-tracking cameras or additional alignment-based calibration steps: the viewpoint parallax-related AR registration error remains submillimetric within a reasonable range of depths around the display focal plane. This article confirms, through a user study based on a monocular virtual-to-real alignment task, that this finding is accurate and usable. In addition, we found that performing the alignment-free calibration procedure with a high-resolution camera substantially improves the AR registration accuracy compared with other state-of-the-art approaches, with an error lower than 1 mm over a notable range of distances. These results demonstrate the safe usability of OST HMDs for high-precision task guidance in the peripersonal space.
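The claim that the parallax-related error stays small near the display focal plane follows from simple pinhole geometry: a virtual point aligned with a real target from the calibrated viewpoint re-projects, for a laterally shifted eye, with an error proportional to the eye offset and to the target's depth offset from the focal plane. The following Python sketch is only an illustration of that relationship; the focal-plane distance and eye offset used below are assumed values, not figures reported by the authors.

```python
# Minimal geometric sketch (not the authors' method): a pinhole model of the
# viewpoint parallax-related registration error for an OST display whose
# virtual content sits on a fixed focal plane. All numeric values below are
# illustrative assumptions.

def parallax_registration_error(eye_offset_mm: float,
                                focal_plane_mm: float,
                                target_depth_mm: float) -> float:
    """Lateral mismatch, measured at the real target depth, between a real point
    and the virtual point that was aligned with it from the calibrated viewpoint.

    The virtual point lies on the display focal plane; shifting the eye laterally
    by eye_offset_mm re-projects it onto the target depth with an error that
    grows with |target depth - focal plane distance| (similar triangles).
    """
    return abs(eye_offset_mm) * abs(target_depth_mm - focal_plane_mm) / focal_plane_mm


if __name__ == "__main__":
    focal_plane = 400.0  # assumed display focal distance in the peripersonal space (mm)
    eye_offset = 5.0     # assumed lateral offset between pupil and calibration camera (mm)
    for depth in (350.0, 380.0, 400.0, 420.0, 450.0):
        err = parallax_registration_error(eye_offset, focal_plane, depth)
        print(f"target at {depth:.0f} mm -> registration error ~ {err:.2f} mm")
```

Under these assumed values the modeled error stays below roughly 0.6 mm within about 50 mm of the focal plane, which is consistent in spirit with the submillimetric finding, although the actual figures depend on the specific display and calibration setup studied in the paper.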
Source journal
IEEE Transactions on Human-Machine Systems (Computer Science, Artificial Intelligence; Computer Science, Cybernetics)
CiteScore: 7.10
Self-citation rate: 11.10%
Annual publications: 136
Journal description: The scope of the IEEE Transactions on Human-Machine Systems includes the field of human-machine systems. It covers human systems and human-organizational interactions, including cognitive ergonomics, system test and evaluation, and human information processing concerns in systems and organizations.