NeuroDog

Proceedings of the ACM on Computer Graphics and Interactive Techniques · IF 1.4 · JCR Q3 (Computer Science, Software Engineering) · Published: 2023-08-16 · DOI: 10.1145/3606936 · Pages: 1-19
Donald E. Egan, D. Cosker, R. McDonnell
{"title":"NeuroDog","authors":"Donald E. Egan, D. Cosker, R. Mcdonnell","doi":"10.1145/3606936","DOIUrl":null,"url":null,"abstract":"Virtual reality (VR) allows us to immerse ourselves in alternative worlds in which we can embody avatars to take on new identities. Usually, these avatars are humanoid or possess very strong anthropomorphic qualities. Allowing users of VR to embody non-humanoid virtual characters or animals presents additional challenges. Extreme morphological differences and the complexities of different characters' motions can make the construction of a real-time mapping between input human motion and target character motion a difficult challenge. Previous animal embodiment work has focused on direct mapping of human motion to the target animal via inverse kinematics. This can lead to the target animal moving in a way which is inappropriate or unnatural for the animal type. We present a novel real-time method, incorporating two neural networks, for mapping human motion to realistic quadruped motion. Crucially, the output quadruped motions are realistic, while also being faithful to the input user motions. We incorporate our mapping into a VR embodiment system in which users can embody a virtual quadruped from a first person perspective. Further, we evaluate our system via a perceptual experiment in which we investigate the quality of the synthesised motion, the system's response to user input and the sense of embodiment experienced by users. The main findings of the study are that the system responds as well as traditional embodiment systems to user input, produces higher quality motion and users experience a higher sense of body ownership when compared to a baseline method in which the human to quadruped motion mapping relies solely on inverse kinematics. Finally, our embodiment system relies solely on consumer-grade hardware, thus making it appropriate for use in applications such as VR gaming or VR social platforms.","PeriodicalId":74536,"journal":{"name":"Proceedings of the ACM on computer graphics and interactive techniques","volume":" ","pages":"1 - 19"},"PeriodicalIF":1.4000,"publicationDate":"2023-08-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the ACM on computer graphics and interactive techniques","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3606936","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, SOFTWARE ENGINEERING","Score":null,"Total":0}
Citations: 0

Abstract

Virtual reality (VR) allows us to immerse ourselves in alternative worlds in which we can embody avatars to take on new identities. Usually, these avatars are humanoid or possess very strong anthropomorphic qualities. Allowing users of VR to embody non-humanoid virtual characters or animals presents additional challenges. Extreme morphological differences and the complexities of different characters' motions make constructing a real-time mapping between input human motion and target character motion difficult. Previous animal embodiment work has focused on directly mapping human motion to the target animal via inverse kinematics, which can lead to the target animal moving in a way that is inappropriate or unnatural for the animal type. We present a novel real-time method, incorporating two neural networks, for mapping human motion to realistic quadruped motion. Crucially, the output quadruped motions are realistic while also being faithful to the input user motions. We incorporate our mapping into a VR embodiment system in which users can embody a virtual quadruped from a first-person perspective. Further, we evaluate our system via a perceptual experiment investigating the quality of the synthesised motion, the system's response to user input, and the sense of embodiment experienced by users. The main findings are that, compared to a baseline method in which the human-to-quadruped motion mapping relies solely on inverse kinematics, our system responds to user input as well as traditional embodiment systems do, produces higher-quality motion, and gives users a stronger sense of body ownership. Finally, our embodiment system relies solely on consumer-grade hardware, making it appropriate for applications such as VR gaming or VR social platforms.
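To make the two-network idea concrete, the sketch below shows one plausible shape for such a pipeline in PyTorch: a first network encodes the sparse tracking signal available from consumer VR hardware (headset plus two controllers), and a second network decodes that encoding, conditioned on the quadruped's previous pose, into the next quadruped pose. Every class name, layer size, and the exact division of labour between the two networks is an assumption for illustration, not the authors' architecture.

```python
# A minimal sketch (not the authors' implementation) of a two-network
# human-to-quadruped motion mapping. All dimensions and names are
# hypothetical, chosen only to illustrate the overall data flow.
import torch
import torch.nn as nn

class HumanEncoder(nn.Module):
    """Hypothetical first network: encodes sparse human tracking input
    (e.g. headset + two controllers) into a latent motion feature."""
    def __init__(self, tracker_dim: int = 27, latent_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(tracker_dim, 128), nn.ReLU(),
            nn.Linear(128, latent_dim),
        )

    def forward(self, trackers: torch.Tensor) -> torch.Tensor:
        return self.net(trackers)

class QuadrupedDecoder(nn.Module):
    """Hypothetical second network: decodes the latent feature together
    with the quadruped's previous pose into the next quadruped pose, so
    the output motion can stay plausible for the animal rather than
    being a direct IK retarget of the human skeleton."""
    def __init__(self, latent_dim: int = 64, pose_dim: int = 120):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim + pose_dim, 256), nn.ReLU(),
            nn.Linear(256, pose_dim),
        )

    def forward(self, latent: torch.Tensor, prev_pose: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([latent, prev_pose], dim=-1))

# Per-frame, real-time use: feed the current tracker reading and the
# previous quadruped pose through both networks.
encoder, decoder = HumanEncoder(), QuadrupedDecoder()
trackers = torch.randn(1, 27)    # placeholder head/hand tracking frame
prev_pose = torch.zeros(1, 120)  # placeholder quadruped rest pose
next_pose = decoder(encoder(trackers), prev_pose)
print(next_pose.shape)  # torch.Size([1, 120])
```

In a real-time setting this forward pass would run once per rendered frame, with each predicted pose fed back in as `prev_pose` for the next frame; this autoregressive conditioning is what would let the quadruped's motion remain animal-like while still tracking the user's input.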