ManiLoco: A VR-Based Locomotion Method for Concurrent Object Manipulation.

IF 1.4 · Q3 · Computer Science, Software Engineering
Proceedings of the ACM on Computer Graphics and Interactive Techniques · Pub Date: 2023-05-01 · DOI: 10.1145/3585502
Dayu Wan, Xiaolei Guo, Jiahui Dong, Christos Mousas, Yingjie Chen
{"title":"ManiLoco:一种基于vr的并发对象操作的运动方法。","authors":"Dayu Wan,&nbsp;Xiaolei Guo,&nbsp;Jiahui Dong,&nbsp;Christos Mousas,&nbsp;Yingjie Chen","doi":"10.1145/3585502","DOIUrl":null,"url":null,"abstract":"<p><p>The use of virtual reality (VR) in laboratory skill training is rapidly increasing. In such applications, users often need to explore a large virtual environment within a limited physical space while completing a series of hand-based tasks (e.g., object manipulation). However, the most widely used controller-based teleport methods may conflict with the users' hand operation and result in a higher cognitive load, negatively affecting their training experiences. To alleviate these limitations, we designed and implemented a locomotion method called ManiLoco to enable hands-free interaction and thus avoid conflicts and interruptions from other tasks. Users can teleport to a remote object's position by taking a step toward the object while looking at it. We evaluated ManiLoco and compared it with state-of-the-art Point & Teleport in a within-subject experiment with 16 participants. The results confirmed the viability of our foot- and head-based approach and better support concurrent object manipulation in VR training tasks. Furthermore, our locomotion method does not require any additional hardware. It solely relies on the VR head-mounted display (HMD) and our implementation of detecting the user's stepping activity, and it can be easily applied to any VR application as a plugin.</p>","PeriodicalId":74536,"journal":{"name":"Proceedings of the ACM on computer graphics and interactive techniques","volume":"6 1","pages":""},"PeriodicalIF":1.4000,"publicationDate":"2023-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10249678/pdf/nihms-1905180.pdf","citationCount":"0","resultStr":"{\"title\":\"ManiLoco: A VR-Based Locomotion Method for Concurrent Object Manipulation.\",\"authors\":\"Dayu Wan,&nbsp;Xiaolei Guo,&nbsp;Jiahui Dong,&nbsp;Christos Mousas,&nbsp;Yingjie Chen\",\"doi\":\"10.1145/3585502\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>The use of virtual reality (VR) in laboratory skill training is rapidly increasing. In such applications, users often need to explore a large virtual environment within a limited physical space while completing a series of hand-based tasks (e.g., object manipulation). However, the most widely used controller-based teleport methods may conflict with the users' hand operation and result in a higher cognitive load, negatively affecting their training experiences. To alleviate these limitations, we designed and implemented a locomotion method called ManiLoco to enable hands-free interaction and thus avoid conflicts and interruptions from other tasks. Users can teleport to a remote object's position by taking a step toward the object while looking at it. We evaluated ManiLoco and compared it with state-of-the-art Point & Teleport in a within-subject experiment with 16 participants. The results confirmed the viability of our foot- and head-based approach and better support concurrent object manipulation in VR training tasks. Furthermore, our locomotion method does not require any additional hardware. 
It solely relies on the VR head-mounted display (HMD) and our implementation of detecting the user's stepping activity, and it can be easily applied to any VR application as a plugin.</p>\",\"PeriodicalId\":74536,\"journal\":{\"name\":\"Proceedings of the ACM on computer graphics and interactive techniques\",\"volume\":\"6 1\",\"pages\":\"\"},\"PeriodicalIF\":1.4000,\"publicationDate\":\"2023-05-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10249678/pdf/nihms-1905180.pdf\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the ACM on computer graphics and interactive techniques\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3585502\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"COMPUTER SCIENCE, SOFTWARE ENGINEERING\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the ACM on computer graphics and interactive techniques","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3585502","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, SOFTWARE ENGINEERING","Score":null,"Total":0}
Citations: 0

Abstract

The use of virtual reality (VR) in laboratory skill training is rapidly increasing. In such applications, users often need to explore a large virtual environment within a limited physical space while completing a series of hand-based tasks (e.g., object manipulation). However, the most widely used controller-based teleport methods may conflict with the users' hand operations and result in a higher cognitive load, negatively affecting their training experiences. To alleviate these limitations, we designed and implemented a locomotion method called ManiLoco that enables hands-free interaction and thus avoids conflicts with and interruptions from other tasks. Users can teleport to a remote object's position by taking a step toward the object while looking at it. We evaluated ManiLoco and compared it with the state-of-the-art Point & Teleport method in a within-subject experiment with 16 participants. The results confirmed the viability of our foot- and head-based approach and its better support for concurrent object manipulation in VR training tasks. Furthermore, our locomotion method does not require any additional hardware: it relies solely on the VR head-mounted display (HMD) and our implementation of stepping detection, and it can easily be applied to any VR application as a plugin.
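The abstract describes the interaction only at a high level (step toward an object while looking at it, using HMD pose alone). The following is a minimal, illustrative Python sketch of that general idea: detecting a forward step toward a gazed-at object from consecutive head-pose samples and returning a teleport target. This is not the authors' implementation; all function names, thresholds, and the object representation are hypothetical assumptions for illustration.

```python
import numpy as np

# Illustrative sketch only (not the paper's implementation): trigger a teleport
# when the head translates forward by a step-sized amount toward an object that
# currently lies within a narrow cone around the gaze direction.

STEP_DISTANCE = 0.25      # hypothetical: metres of forward head motion treated as a step
GAZE_ANGLE_DEG = 15.0     # hypothetical: gaze cone half-angle for selecting a target object

def gazed_object(head_pos, gaze_dir, objects):
    """Return the object closest to the gaze ray, or None if none lies within the cone."""
    best, best_angle = None, np.radians(GAZE_ANGLE_DEG)
    for obj in objects:
        to_obj = obj["pos"] - head_pos
        dist = np.linalg.norm(to_obj)
        if dist < 1e-6:
            continue
        angle = np.arccos(np.clip(np.dot(to_obj / dist, gaze_dir), -1.0, 1.0))
        if angle < best_angle:
            best, best_angle = obj, angle
    return best

def check_step_teleport(prev_head_pos, head_pos, gaze_dir, objects):
    """If the head moved forward by at least STEP_DISTANCE toward a gazed object,
    return that object's position as the teleport target; otherwise return None."""
    target = gazed_object(head_pos, gaze_dir, objects)
    if target is None:
        return None
    displacement = head_pos - prev_head_pos
    forward = np.dot(displacement, gaze_dir)   # head translation along the gaze direction
    return target["pos"] if forward >= STEP_DISTANCE else None

if __name__ == "__main__":
    objects = [{"name": "beaker", "pos": np.array([0.0, 1.2, 3.0])}]
    prev = np.array([0.0, 1.6, 0.0])
    curr = np.array([0.0, 1.6, 0.3])            # user stepped roughly 0.3 m forward
    gaze = np.array([0.0, -0.1, 1.0]); gaze /= np.linalg.norm(gaze)
    print(check_step_teleport(prev, curr, gaze, objects))  # prints the beaker position
```

In practice the paper's method would additionally need to distinguish a deliberate step from ordinary head sway; the fixed displacement threshold above is only a stand-in for whatever stepping-detection logic the authors implemented.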
