Investigating Noticeable Hand Redirection in Virtual Reality using Physiological and Interaction Data
Martin Feick, K. P. Regitz, Anthony Tang, Tobias Jungbluth, Maurice Rekrut, Antonio Krüger
2023 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), March 2023. DOI: 10.1109/VR55154.2023.00035
{"title":"利用生理和交互数据研究虚拟现实中明显的手重定向","authors":"Martin Feick, K. P. Regitz, Anthony Tang, Tobias Jungbluth, Maurice Rekrut, Antonio Krüger","doi":"10.1109/VR55154.2023.00035","DOIUrl":null,"url":null,"abstract":"Hand redirection is effective so long as the introduced offsets are not noticeably disruptive to users. In this work we investigate the use of physiological and interaction data to detect movement discrepancies between a user's real and virtual hand, pushing towards a novel approach to identify discrepancies which are too large and therefore can be noticed. We ran a study with 22 participants, collecting EEG, ECG, EDA, RSP, and interaction data. Our results suggest that EEG and interaction data can be reliably used to detect visuo-motor discrepancies, whereas ECG and RSP seem to suffer from inconsistencies. Our findings also show that participants quickly adapt to large discrepancies, and that they constantly attempt to establish a stable mental model of their environment. Together, these findings suggest that there is no absolute threshold for possible non-detectable discrepancies; instead, it depends primarily on participants' most recent experience with this kind of interaction.","PeriodicalId":346767,"journal":{"name":"2023 IEEE Conference Virtual Reality and 3D User Interfaces (VR)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Investigating Noticeable Hand Redirection in Virtual Reality using Physiological and Interaction Data\",\"authors\":\"Martin Feick, K. P. Regitz, Anthony Tang, Tobias Jungbluth, Maurice Rekrut, Antonio Krüger\",\"doi\":\"10.1109/VR55154.2023.00035\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Hand redirection is effective so long as the introduced offsets are not noticeably disruptive to users. In this work we investigate the use of physiological and interaction data to detect movement discrepancies between a user's real and virtual hand, pushing towards a novel approach to identify discrepancies which are too large and therefore can be noticed. We ran a study with 22 participants, collecting EEG, ECG, EDA, RSP, and interaction data. Our results suggest that EEG and interaction data can be reliably used to detect visuo-motor discrepancies, whereas ECG and RSP seem to suffer from inconsistencies. Our findings also show that participants quickly adapt to large discrepancies, and that they constantly attempt to establish a stable mental model of their environment. 
Together, these findings suggest that there is no absolute threshold for possible non-detectable discrepancies; instead, it depends primarily on participants' most recent experience with this kind of interaction.\",\"PeriodicalId\":346767,\"journal\":{\"name\":\"2023 IEEE Conference Virtual Reality and 3D User Interfaces (VR)\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-03-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2023 IEEE Conference Virtual Reality and 3D User Interfaces (VR)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/VR55154.2023.00035\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 IEEE Conference Virtual Reality and 3D User Interfaces (VR)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/VR55154.2023.00035","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Hand redirection is effective as long as the introduced offsets do not noticeably disrupt users. In this work, we investigate the use of physiological and interaction data to detect movement discrepancies between a user's real and virtual hand, working toward a novel approach for identifying discrepancies that are large enough to be noticed. We ran a study with 22 participants, collecting electroencephalography (EEG), electrocardiography (ECG), electrodermal activity (EDA), respiration (RSP), and interaction data. Our results suggest that EEG and interaction data can reliably detect visuo-motor discrepancies, whereas ECG and RSP appear to suffer from inconsistencies. Our findings also show that participants quickly adapt to large discrepancies and constantly attempt to establish a stable mental model of their environment. Together, these findings suggest that there is no absolute threshold for non-detectable discrepancies; instead, the threshold depends primarily on participants' most recent experience with this kind of interaction.
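The abstract names hand redirection but does not spell out the warping scheme used in the study, so the following is only a minimal illustrative sketch. It assumes a body-warping-style redirection in which the virtual hand is displaced from the real hand in proportion to the user's progress from a start point toward a reach target; the function and parameter names (redirect_hand, max_offset, and so on) are hypothetical and not taken from the paper.

```python
import numpy as np

def redirect_hand(real_hand, start, target, max_offset):
    """Body-warping-style hand redirection (illustrative sketch only).

    The virtual hand is displaced from the real hand by an offset that
    grows with the user's progress from `start` toward `target`, so the
    full offset is reached only at the target. All positions are 3D
    points in the same tracking space (metres).
    """
    reach = target - start
    total_sq = float(np.dot(reach, reach))
    if total_sq == 0.0:
        return real_hand.copy()
    # Progress of the real hand along the start->target direction, clamped to [0, 1].
    progress = np.clip(np.dot(real_hand - start, reach) / total_sq, 0.0, 1.0)
    # Virtual hand = real hand plus a progressively blended offset.
    return real_hand + progress * max_offset

if __name__ == "__main__":
    start = np.array([0.0, 0.0, 0.0])
    target = np.array([0.0, 0.0, 0.5])        # 50 cm reach straight ahead
    max_offset = np.array([0.05, 0.0, 0.0])   # 5 cm lateral offset at the target
    real_hand = np.array([0.0, 0.0, 0.25])    # real hand halfway through the reach
    print(redirect_hand(real_hand, start, target, max_offset))  # lateral offset of ~2.5 cm
```

In a setup like this, the visuo-motor discrepancy that the physiological and interaction signals would need to reveal is simply the current vector between the real and virtual hand positions, which grows as the reach progresses.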