Analysis of Detection Thresholds for Hand Redirection during Mid-Air Interactions in Virtual Reality
Judith Hartfill, Jenny Gabel, Lucie Kruse, S. Schmidt, Kevin Riebandt, Simone Kühn, Frank Steinicke
Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology
DOI: 10.1145/3489849.3489866
Published: 2021-12-08
Citations: 9
Abstract
Avatars in virtual reality (VR) with fully articulated hands enable users to naturally interact with the virtual environment (VE). Interactions are often performed in a one-to-one mapping between the movements of the user’s real body, for instance, the hands, and the displayed body of the avatar. However, VR also allows manipulating this mapping to introduce non-isomorphic techniques. In this context, research on manipulations of virtual hand movements typically focuses on increasing the user’s interaction space to improve the overall efficiency of hand-based interactions. In this paper, we investigate a hand retargeting method for decelerated hand movements. With this technique, users need to perform larger movements to reach for an object in the VE, which can be utilized, for example, in therapeutic applications. If these gain-based redirections of virtual hand movements are small enough, users become unable to reliably detect them due to the dominance of the visual sense. In a psychophysical experiment, we analyzed detection thresholds for six different motion paths in mid-air for both hands. We found significantly different detection thresholds between movement directions on each spatial axis. To verify our findings, we applied the identified gains in a playful application in a confirmatory study.
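For readers unfamiliar with gain-based hand redirection, the following minimal sketch (in Python with NumPy, not taken from the paper) illustrates the general idea: the virtual hand's displacement from a reference origin is the real hand's displacement scaled by a gain, and gains below 1 force larger real movements to reach a virtual target, as in the decelerated condition described above. The function and variable names are illustrative assumptions only.

```python
import numpy as np

def redirect_hand(real_pos, origin, gain):
    """Map a tracked hand position to a virtual hand position by scaling
    the displacement from a reference origin with a (per-axis) gain.

    A gain below 1.0 slows the virtual hand relative to the real hand,
    so the user must move farther in the real world to reach a virtual
    object (the decelerated case)."""
    real_pos = np.asarray(real_pos, dtype=float)
    origin = np.asarray(origin, dtype=float)
    gain = np.asarray(gain, dtype=float)  # scalar or per-axis (x, y, z)
    return origin + gain * (real_pos - origin)

# Hypothetical example: a uniform gain of 0.8 applied to all axes.
origin = [0.0, 1.0, 0.3]        # reach origin in metres
real_hand = [0.25, 1.0, 0.55]   # tracked hand position
virtual_hand = redirect_hand(real_hand, origin, gain=0.8)
print(virtual_hand)             # displacement scaled by 0.8 -> [0.2, 1.0, 0.5]
```

In practice the gain would be chosen per movement direction and kept within the detection thresholds the study estimates, so that the manipulation remains unnoticeable to the user.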