Pub Date: 2022-03-01 | DOI: 10.1109/VRW55335.2022.00152
Prithvi Kohli, David R. Walton, R. K. D. Anjos, A. Steed, Tobias Ritschel
Ventral metamers, pairs of images that may differ substantially in the periphery but are perceptually identical, offer exciting new possibilities in foveated rendering and image compression, as well as insights into the human visual system. However, existing literature has mainly focused on creating metamers of static images. In this work, we develop a method for creating sequences of metameric frames, specifically light fields, with enforced consistency along the temporal, or angular, dimension. This greatly expands the potential applications for these metamers, and extending metamers along the third dimension offers further new potential for compression.
Title: Beyond Flicker, Beyond Blur: View-coherent Metameric Light Fields for Foveated Display (2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW))
Pub Date: 2022-03-01 | DOI: 10.1109/VRW55335.2022.00034
Krzysztof Szczurowski, Matt Smith
This manuscript describes challenges that we experienced during experiments employing virtual reality headsets. Its main focus is on technical issues that occurred when participants were transported from the environment they were physically occupying to its virtual replica displayed on a virtual reality headset. A range of hardware, software, and experimental procedure issues and recommendations are presented, including the impact of donning the virtual reality headset on participants' skin conductance and heart rate, and unexpected virtual reality headset features accidentally turned on by participants. The method employed to create the virtual replica of the physical environment used in this study is described. We also discuss alternative methods that were considered, which in some circumstances might produce better results than we achieved. This paper shares the lessons that we learned during the beta-testing process and the analysis of data collected during our experiments using virtual reality headsets. The goal of describing these lessons is to aid other researchers in adopting techniques that improve the quality and replicability of experiments involving virtual reality headsets and biometric devices.
Title: Challenges of experimenting with Virtual Reality
Pub Date: 2022-03-01 | DOI: 10.1109/VRW55335.2022.00056
Dominic Lesaca, Henry Cheung, Tapaswini Jena, D. Cliburn
The number of everyday virtual reality (VR) applications is increasing at a remarkable pace. Perhaps the most fundamental interaction in these applications is the ability to travel through the virtual environments in which users find themselves immersed. Teleportation is often used to support travel in VR applications. While many methods exist for implementing teleportation, relatively little research has compared them. In this paper, we describe an experiment comparing four teleportation methods for travel in everyday virtual reality. We found that, for general use, experienced VR users prefer to control a virtual arc with their hand to indicate the location and orientation to which they want to teleport. However, teleporting a single step at a time in the direction of view may support more natural movement and encourage shorter travel paths, at the expense of longer travel times.
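The hand-controlled arc that the preferred method relies on is typically a ballistic curve traced from the controller until it intersects the floor. A minimal sketch of that idea (our illustration, not the study's implementation; the speed, gravity, and step-size values are assumptions):

```python
import numpy as np

def arc_landing_point(origin, direction, speed=8.0, gravity=9.81,
                      dt=0.02, floor_y=0.0, max_steps=500):
    # Trace a ballistic arc from the controller and return the point
    # where it first crosses the floor plane: the teleport target.
    pos = np.asarray(origin, dtype=float)
    vel = np.asarray(direction, dtype=float)
    vel = vel / np.linalg.norm(vel) * speed
    for _ in range(max_steps):
        vel = vel - np.array([0.0, gravity * dt, 0.0])
        nxt = pos + vel * dt
        if nxt[1] <= floor_y:
            # Interpolate within the step for the exact floor crossing.
            t = (pos[1] - floor_y) / (pos[1] - nxt[1])
            return pos + (nxt - pos) * t
        pos = nxt
    return pos

# Pointing slightly upward and forward from chest height lands the
# target some distance ahead on the floor.
target = arc_landing_point((0.0, 1.5, 0.0), (0.0, 0.5, 1.0))
```

In a real application the arc samples are also rendered as a curve and tested against scene geometry, not just a flat floor plane.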
Title: Comparing Teleportation Methods for Travel in Everyday Virtual Reality
Pub Date: 2022-03-01 | DOI: 10.1109/VRW55335.2022.00027
Weiqiang Wang, Hantao Zhao, Jinyuan Jia
In the robotics industry, traditional robot simulation platforms require users to install specific software and to be equipped with high-performance computers. This paper presents a lightweight collision detection algorithm that runs smoothly in a web browser, as part of a novel simulation platform for a humanoid robot. The proposed algorithm builds bounding volume hierarchies using Morton codes and applies a triangular-patch intersection test to improve detection accuracy. With the presented method, the platform can be used to edit the robot on any device with a supporting browser, without installing extra plug-ins.
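The Morton-code technique the abstract refers to can be illustrated with a short sketch (not the authors' implementation; the 10-bit quantization and the sample centroids are assumptions): primitives are sorted by the Morton code of their centroids, which interleaves coordinate bits so that spatially nearby primitives receive nearby codes, the basis of fast linear BVH construction.

```python
def expand_bits(v):
    # Insert two zero bits between each of the low 10 bits of v,
    # using the standard bit-spreading magic constants.
    v = (v * 0x00010001) & 0xFF0000FF
    v = (v * 0x00000101) & 0x0F00F00F
    v = (v * 0x00000011) & 0xC30C30C3
    v = (v * 0x00000005) & 0x49249249
    return v

def morton3d(x, y, z):
    # Map a point in [0,1]^3 to a 30-bit Morton code by interleaving
    # the bits of its coordinates, each quantized to 10 bits.
    def quantize(c):
        return int(min(max(c * 1024.0, 0.0), 1023.0))
    return (expand_bits(quantize(x)) << 2) | \
           (expand_bits(quantize(y)) << 1) | \
            expand_bits(quantize(z))

# Sorting triangle centroids by Morton code places spatially nearby
# primitives next to each other in the sorted order.
centroids = [(0.1, 0.1, 0.1), (0.9, 0.9, 0.9), (0.12, 0.1, 0.11)]
order = sorted(range(len(centroids)), key=lambda i: morton3d(*centroids[i]))
```

Once sorted, a BVH can be built over the array in linear time, which suits the low-overhead browser setting the paper targets.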
Title: Lightweight Collision Detection Algorithm in Web3D Robot Simulation Platform
Pub Date: 2022-03-01 | DOI: 10.1109/VRW55335.2022.00036
S. Alves, A. Uribe-Quevedo, Delun Chen, Jon Morris, Sina Radmard
Care for seniors and adults with Developmental Disabilities (DD) has seen increased use and development of assistive technologies, including service robots. Such robots ease challenges associated with care, companionship, medication intake, and fall prevention, among others. Research and development in this field rely on in-person data collection to ensure proper robot navigation, interactions, and service. However, the COVID-19 pandemic led to physical distancing and access restrictions at long-term care facilities, making data collection very difficult. The traditional alternative of working from recorded videos poses numerous challenges, as videos may not be representative of the population in terms of how people move, interact with the environment, or fall. In this paper, we present the development of a VR simulator for robot navigation and fall detection that uses digital twins to test the virtual robot without access to the real physical location or real people. The development process required virtual sensors that generate LIDAR data for the virtual robot to navigate and detect obstacles. Preliminary testing has yielded promising results for using the virtual simulator to train a service robot to navigate and detect falls. Our results include virtual maps, robot navigation, and fall detection.
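The paper's virtual LIDAR sensors are not specified in detail; as an illustration only, a planar virtual LIDAR can be emulated by raycasting from the robot pose against simplified scene geometry (the circular obstacles, ray count, and range here are all assumptions):

```python
import math

def lidar_scan(pose, obstacles, n_rays=360, max_range=10.0):
    # Cast n_rays rays from pose = (x, y, heading) and return one range
    # reading per ray, like a planar LIDAR. Obstacles are circles
    # (cx, cy, radius); a miss reads max_range.
    x, y, heading = pose
    readings = []
    for i in range(n_rays):
        ang = heading + 2.0 * math.pi * i / n_rays
        dx, dy = math.cos(ang), math.sin(ang)
        best = max_range
        for cx, cy, r in obstacles:
            # Ray-circle intersection: solve |o + t*d - c|^2 = r^2 for t.
            ox, oy = x - cx, y - cy
            b = ox * dx + oy * dy
            c = ox * ox + oy * oy - r * r
            disc = b * b - c
            if disc >= 0.0:
                t = -b - math.sqrt(disc)  # nearer root: the entry point
                if 0.0 <= t < best:
                    best = t
        readings.append(best)
    return readings

# A robot at the origin facing +x, with one pillar 5 m ahead of
# radius 1 m, should read 4 m straight ahead.
scan = lidar_scan((0.0, 0.0, 0.0), [(5.0, 0.0, 1.0)])
```

In a game engine such a sensor would instead raycast against the scene's collision meshes, but the data it feeds the navigation stack has the same shape.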
Title: Developing a VR Simulator for Robotics Navigation and Human Robot Interactions employing Digital Twins
Pub Date: 2022-03-01 | DOI: 10.1109/VRW55335.2022.00296
Yobbahim J. Vite, Yaoping Hu
This paper studied the feasibility of mapping an engagement ratio onto levels of task complexity when human participants undertook interactive tasks within a virtual reality (VR) environment. Each participant used a haptic device to push a ball-shaped object through a pipe. There were three pipes, whose shapes corresponded mathematically to three levels of task complexity. An electroencephalogram (EEG) device recorded the participant's brain activity during the task. The outcomes of the study confirmed the feasibility of mapping the engagement ratio onto the levels of task complexity.
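The abstract does not state how the engagement ratio is computed; a common choice in the EEG literature is the engagement index of beta power divided by the sum of alpha and theta power. A sketch under that assumption (the periodogram estimator and the exact band limits are our choices, not the paper's):

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    # Spectral power in [lo, hi) Hz from a simple periodogram.
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    return psd[(freqs >= lo) & (freqs < hi)].sum()

def engagement_ratio(signal, fs):
    # Engagement index: beta / (alpha + theta) over standard EEG bands.
    theta = band_power(signal, fs, 4.0, 8.0)
    alpha = band_power(signal, fs, 8.0, 13.0)
    beta = band_power(signal, fs, 13.0, 30.0)
    return beta / (alpha + theta)

# Synthetic check: a 20 Hz (beta-band) dominated trace should score
# higher than a 10 Hz (alpha-band) dominated one.
fs = 256
t = np.arange(0, 2, 1.0 / fs)
beta_heavy = np.sin(2 * np.pi * 20 * t) + 0.1 * np.sin(2 * np.pi * 10 * t)
alpha_heavy = np.sin(2 * np.pi * 10 * t) + 0.1 * np.sin(2 * np.pi * 20 * t)
```

Real EEG pipelines would add artifact rejection and windowed (e.g. Welch) spectral estimates before computing such a ratio.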
Title: Feasibility of Mapping Engagement Ratios to Levels of Task Complexity within VR Environments
Pub Date: 2022-03-01 | DOI: 10.1109/VRW55335.2022.00283
Liangding Li, D. Fontes, Carsten Neumann, M. Kinzel, D. Reiners, C. Cruz-Neira
One key factor in stopping the spread of COVID-19 is practicing social distancing. Visualizing the possible transmission routes of sneeze droplets in front of an infected person might be an effective way to help people understand the importance of social distancing. This paper presents a mobile virtual reality (VR) interface that helps people visualize droplet dispersion from the target person's point of view. We implemented a VR application to visualize and interact with the sneeze simulation data immersively. Our application provides an easy way to communicate the correlation between social distance and exposure to infected droplets, which is difficult to achieve in the real world.
Title: Immersive Visualization of Sneeze Simulation Data on Mobile Devices
Pub Date: 2022-03-01 | DOI: 10.1109/VRW55335.2022.00246
Diego González Morín, Francisco Pereira, Ester González, P. Pérez, Á. Villegas
Video pass-through Extended Reality (XR) is rapidly gaining interest from developers and researchers. However, the ecosystem of video pass-through-enabled XR devices is still bound to expensive hardware. In this paper, we describe our custom hardware and software setup for providing effective video pass-through capabilities to inexpensive commercial Virtual Reality (VR) devices. The proposed hardware setup incorporates a low-cost HD stereo camera rigidly attached to the VR device using a custom 3D-printed mount. Our software solution, implemented in Unity, overcomes hardware-specific limitations, such as camera delays, in a simple yet effective manner.
Title: Democratic Video Pass-Through for Commercial Virtual Reality Devices
Pub Date: 2022-03-01 | DOI: 10.1109/VRW55335.2022.00059
S. Bialkova, Chloe Barr
Although augmented reality (AR) adoption has increased in the cosmetic industry, the technology is not always accepted and used by consumers. This challenging issue calls for further investigation. The present paper addresses this challenge in an attempt to provide the needed understanding of the key drivers of AR experience and how these might enhance the consumer purchase experience itself. After trying a product via virtual try-on, participants in the current study evaluated both the AR experience and the purchase experience. The results clearly show that interactivity, realism, ease of use, and immersion modulate the evaluation of the AR experience and thus user satisfaction. Purchase experience correlated with utilitarian and hedonic values, predetermined by aesthetic and information quality. Utilitarian and hedonic values also affected product knowledge. These relationships were pronounced for participants reporting high satisfaction with the experience. The outcomes are discussed within a single framework combining these factors to provide the much-needed understanding of AR environments and their effect on consumer experience evaluation. The results could find direct application in designing AR environments that augment consumer satisfaction and thus enhance the purchase journey.
Title: Virtual Try-On: How to Enhance Consumer Experience?
Pub Date: 2022-03-01 | DOI: 10.1109/VRW55335.2022.00085
Shahin Doroudian, Zekun Wu, Weichao Wang, Alexia Galati, Aidong Lu
Search and rescue (SAR) is an essential skill for many first responders, including firefighters. With new developments in environmental sensing technologies, we are now able to use real-time information from a fire scene to assist SAR tasks. This work explores the effects of immersive maps, which automatically collect information from the environment, including people and building structures, on user behaviors during SAR operations. We developed a VR prototype system for SAR training with controlled fire scenes and information collection. We also designed and performed a user study focusing on two factors: the amount of information in the immersive map and the degree of danger of the SAR tasks. We summarize a set of user behaviors observed in our study and capture their features with statistical data analysis. Our results confirm the advantages of real-time information for SAR tasks and show differences in user behaviors under dangerous situations. Our results also demonstrate the potential of studying user behaviors through virtual training and deriving insights to design effective training programs.
Title: A Study of Real-time Information on User Behaviors during Search and Rescue (SAR) Training of Firefighters