Developing augmented reality filters to display visual cues on diverse skin tones
Pub Date: 2024-07-03  DOI: 10.3389/frvir.2024.1363193
J. Stuart, Anita Stephen, Karen Aul, Michael D. Bumbach, Shari Huffman, Brooke Russo, Benjamin Lok
Introduction: Variations in skin tone can significantly alter the appearance of symptoms such as rashes or bruises. Unfortunately, previous works utilizing Augmented Reality (AR) in simulating visual symptoms have often failed to consider this critical aspect, potentially leading to inadequate training and education. This study seeks to address this gap by integrating generative artificial intelligence (AI) into the AR filter design process. Methods: We conducted a 2 × 5 within-subjects study with second-year nursing students (N = 117) from the University of Florida. The study manipulated two factors: symptom generation style and skin tone. Symptom generation style was manipulated using a filter based on a real symptom image or a filter based on a computer-generated symptom image. Skin tone variations were created by applying AR filters to computer-generated images of faces with five skin tones ranging from light to dark. To control for factors like lighting or 3D tracking, 101 pre-generated images were created for each condition, representing a range of filter transparency levels (0–100). Participants used visual analog scales on a computer screen to adjust the symptom transparency in the images until they observed image changes and distinct symptom patterns. Participants also rated the realism of each condition and provided feedback on how the symptom style and skin tone impacted their perceptions. Results: Students rated the symptoms displayed by the computer-generated AR filters as marginally more realistic than those displayed by the real-image AR filters. However, students identified symptoms earlier with the real-image filters. Additionally, SET-M and Theory of Planned Behavior questions indicate that the activity increased students’ feelings of confidence and self-efficacy. Finally, we found that, similar to the real world, where symptoms on dark skin tones are identified at later stages of development, students identified symptoms at later stages as skin tone darkened, regardless of cue type. Conclusion: This work implemented a novel approach to develop AR filters that display time-based visual cues on diverse skin tones. Additionally, this work provides evidence-based recommendations on how and when generative AI-based AR filters can be effectively used in healthcare education.
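To make the transparency manipulation concrete, here is a minimal illustrative sketch (not code from the study) of how a symptom overlay can be alpha-blended onto a face image at a transparency level from 0 to 100, mirroring the 101 pre-generated stimulus images per condition; the file names and the blending approach are assumptions for the example.

```python
# Illustrative sketch only: blend a symptom overlay onto a face image at a
# chosen transparency level (0 = invisible, 100 = fully opaque), mirroring the
# pre-generated stimulus levels described in the abstract. File names are hypothetical.
from PIL import Image

def render_symptom(face_path: str, symptom_path: str, transparency: int) -> Image.Image:
    """Composite an RGBA symptom overlay onto a face image.

    transparency: 0-100, where 0 hides the symptom and 100 shows it fully.
    """
    face = Image.open(face_path).convert("RGBA")
    symptom = Image.open(symptom_path).convert("RGBA").resize(face.size)

    # Scale the overlay's alpha channel by the requested transparency level.
    alpha = symptom.getchannel("A").point(lambda a: int(a * transparency / 100))
    symptom.putalpha(alpha)

    return Image.alpha_composite(face, symptom)

if __name__ == "__main__":
    # Pre-generate the 101 stimulus images (levels 0-100) for one face/symptom pair.
    for level in range(101):
        img = render_symptom("face_tone3.png", "bruise_overlay.png", level)
        img.convert("RGB").save(f"stimulus_level_{level:03d}.jpg")
```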
{"title":"Developing augmented reality filters to display visual cues on diverse skin tones","authors":"J. Stuart, Anita Stephen, Karen Aul, Michael D. Bumbach, Shari Huffman, Brooke Russo, Benjamin Lok","doi":"10.3389/frvir.2024.1363193","DOIUrl":"https://doi.org/10.3389/frvir.2024.1363193","url":null,"abstract":"Introduction: Variations in skin tone can significantly alter the appearance of symptoms such as rashes or bruises. Unfortunately, previous works utilizing Augmented Reality (AR) in simulating visual symptoms have often failed to consider this critical aspect, potentially leading to inadequate training and education. This study seeks to address this gap by integrating generative artificial intelligence (AI) into the AR filter design process.Methods: We conducted a 2 × 5 within-subjects study with second-year nursing students (N = 117) from the University of Florida. The study manipulated two factors: symptom generation style and skin tone. Symptom generation style was manipulated using a filter based on a real symptom image or a filter based on a computer-generated symptom image. Skin tone variations were created by applying AR filters to computer-generated images of faces with five skin tones ranging from light to dark. To control for factors like lighting or 3D tracking, 101 pre-generated images were created for each condition, representing a range of filter transparency levels (0–100). Participants used visual analog scales on a computer screen to adjust the symptom transparency in the images until they observed image changes and distinct symptom patterns. Participants also rated the realism of each condition and provided feedback on how the symptom style and skin tone impacted their perceptions.Results: Students rated the symptoms displayed by the computer-generated AR filters as marginally more realistic than those displayed by the real image AR filters. However, students identified symptoms earlier with the real-image filters. Additionally, SET-M and Theory of Planned Behavior questions indicate that the activity increased students’ feelings of confidence and self-efficacy. Finally, we found that similar to the real world, where symptoms on dark skin tones are identified at later stages of development, students identified symptoms at later stages as skin tone darkened regardless of cue type.Conclusion: This work implemented a novel approach to develop AR filters that display time-based visual cues on diverse skin tones. Additionally, this work provides evidence-based recommendations on how and when generative AI-based AR filters can be effectively used in healthcare education.","PeriodicalId":502489,"journal":{"name":"Frontiers in Virtual Reality","volume":"7 10","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-07-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141681371","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Biomarkers in exposure-based treatment of anxiety in virtual reality: a systematic review
Pub Date: 2024-06-13  DOI: 10.3389/frvir.2024.1355082
M. Ernst, Jakob Hyldig Nielsen, Eik Runge, S. Bouchard, L. Clemmensen
A large proportion of individuals with anxiety-related disorders refrain from seeking treatment. This may be because traditional exposure treatments induce anxiety. However, advances in exposure using virtual reality (VR) technology may encourage more individuals to seek treatment. Furthermore, using biomarkers with VR-based exposure may enable clinicians to assess anxiety levels objectively and collect data in a naturalistic setting. Here, we conduct a systematic review of the literature on the use of biomarkers in VR-based exposure treatment for anxiety. Twenty-seven studies were included, with a total of 1046 participants. We found that heart rate was the only biomarker that could tentatively identify changes within sessions (75% of instances) and between sessions (60% of instances). Synchrony between overall biomarker findings and questionnaire results was inconclusive. Among individual biomarkers, only skin conductance level was highly synchronous with questionnaire results for differences between groups (87% of instances). Based on the present review, biomarkers cannot yet be used reliably to distinguish differences in self-reported symptoms of anxiety in VR-based exposure treatments.
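To illustrate what a "% of instances" synchrony figure can mean, the sketch below tallies agreement between biomarker findings and questionnaire findings across hypothetical instances; this is an assumed coding scheme for illustration, not the review's actual procedure.

```python
# Hypothetical illustration of tallying synchrony between biomarker findings
# and questionnaire findings across reviewed instances (not the review's code).
from typing import List, Tuple

def synchrony_rate(instances: List[Tuple[bool, bool]]) -> float:
    """Percentage of instances where biomarker and questionnaire agree.

    Each instance is (biomarker_detected_change, questionnaire_detected_change).
    """
    agreements = sum(1 for bio, quest in instances if bio == quest)
    return 100.0 * agreements / len(instances)

# Example: 3 of 4 hypothetical instances agree -> 75.0
print(synchrony_rate([(True, True), (False, False), (True, False), (True, True)]))
```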
{"title":"Biomarkers in exposure-based treatment of anxiety in virtual reality: a systematic review","authors":"M. Ernst, Jakob Hyldig Nielsen, Eik Runge, S. Bouchard, L. Clemmensen","doi":"10.3389/frvir.2024.1355082","DOIUrl":"https://doi.org/10.3389/frvir.2024.1355082","url":null,"abstract":"A large proportion of individuals with anxiety-related disorders refrain from seeking treatment. This may be because traditional exposure treatments induce anxiety. However, advances in exposure using virtual reality technology may encourage more individuals to seek treatment. Furthermore, using biomarkers with VR-based exposure may enable clinicians to assess anxiety levels objectively and collect data in a naturalistic setting.Here, we conduct a systematic review of the literature on the use of biomarkers in VR-based exposure treatment for anxiety. Twenty-seven studies were included, with a total of 1046 participants.We found that heart rate was the only biomarker that tentatively could identify changes within (75% of instances) and between sessions (60% of instances). The levels of synchrony between the findings for overall biomarkers and the results from questionnaires showed inconclusive results. Regarding the levels of synchrony between the findings for particular biomarkers and the results from questionnaires, only skin conductance level was highly synchronous for differences between groups (87% of instances).Based on the present review, biomarkers cannot yet be used reliably to distinguish differences in self-reported symptoms of anxiety in VR-based exposure treatments.","PeriodicalId":502489,"journal":{"name":"Frontiers in Virtual Reality","volume":"11 9","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-06-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141346674","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Investigating body perception of multiple virtual hands in synchronized and asynchronized conditions
Pub Date: 2024-06-13  DOI: 10.3389/frvir.2024.1383957
Yuki Mashiyama, Ryota Kondo, M. Fukuoka, Theophilus Teo, Maki Sugimoto
Research on human augmentation has explored the use of multiple bodies in virtual environments. For example, one study of multiple partial bodies used up to 64 hands and showed that having multiple hands reduced the distance each hand had to travel. However, how such setups affect body perception has yet to be verified. In this study, we investigated how body perception changes when nine hands (partial bodies) are moved synchronously in a virtual environment, compared to a single hand. In addition, we examined whether the sense of body ownership was elicited for all nine hands simultaneously or only for some of the hands while switching between them. Participants performed a reaching task using one or nine hands presented in a virtual environment. After the reaching task, a threat stimulus was presented, and hand movements in response to the threat were measured. After each condition, the subjective sense of body ownership and sense of agency were assessed using a Likert scale. The results indicated that, in the nine-hand condition, users felt body ownership over several of the hands and manipulated them by switching their attention among the multiple bodies.
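As a rough illustration of a synchronized condition (not the study's implementation), the sketch below drives nine virtual hands from a single tracked hand pose by adding a fixed offset per copy; the 3 × 3 grid layout and spacing are assumptions made for the example.

```python
# Illustrative sketch (not the study's implementation): drive nine virtual hands
# from one tracked hand pose in a synchronized condition by applying a fixed
# positional offset per copy. The 3 x 3 grid spacing is an assumption.
import numpy as np

GRID_SPACING = 0.25  # metres between neighbouring hand copies (assumed)

# Fixed offsets for a 3 x 3 arrangement of hand copies around the tracked hand.
OFFSETS = np.array([[x, y, 0.0]
                    for y in (-GRID_SPACING, 0.0, GRID_SPACING)
                    for x in (-GRID_SPACING, 0.0, GRID_SPACING)])

def synchronized_hand_positions(tracked_position: np.ndarray) -> np.ndarray:
    """Return the positions of the nine virtual hands for one tracked hand."""
    return tracked_position[np.newaxis, :] + OFFSETS

# Example frame update: the tracked hand is at (0.1, 1.2, -0.4) in metres.
positions = synchronized_hand_positions(np.array([0.1, 1.2, -0.4]))
print(positions.shape)  # (9, 3): one row per virtual hand
```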
{"title":"Investigating body perception of multiple virtual hands in synchronized and asynchronized conditions","authors":"Yuki Mashiyama, Ryota Kondo, M. Fukuoka, Theophilus Teo, Maki Sugimoto","doi":"10.3389/frvir.2024.1383957","DOIUrl":"https://doi.org/10.3389/frvir.2024.1383957","url":null,"abstract":"As part of research on human augmentation, multiple bodies are used in a virtual environment. For example, a study on multiple partial body parts has been conducted using up to 64 hands and showed that multiple hands reduced the distance traveled by one hand. However, body perception has yet to be verified. In this study, we investigated how body perception changes when nine hands, partial bodies, are moved synchronously in a virtual environment, compared to a single hand. In addition, we examined whether the sense of body ownership for all nine hands was elicited simultaneously or whether it was elicited for some of the hands while switching between them. Participants performed a reaching task using one or nine hands presented in a virtual environment. After the reaching task, a threat stimulus was given, and hand movements in response to the threat were measured. After completion of each condition, the subjective sense of body ownership and sense of agency was investigated using a Likert scale. The results indicated that users felt the sense of body ownership of several hands for the nine hands and manipulated them by switching their attention to multiple bodies.","PeriodicalId":502489,"journal":{"name":"Frontiers in Virtual Reality","volume":"40 25","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-06-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141346134","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Visual perception and user satisfaction in video see-through head-mounted displays: a mixed-methods evaluation
Pub Date: 2024-06-10  DOI: 10.3389/frvir.2024.1368721
Jessica de Souza, Robert Tartz
Our study addresses the challenges limiting the adoption of Extended Reality (XR) Head-Mounted Displays (HMDs), focusing mainly on device quality and cybersickness. We aim to investigate the impact of hardware and software on user experience and task performance while wearing Video See-Through (VST) HMDs. We employ a novel methodology designed to bridge the gaps identified in previous research. This study uses a convergent mixed-methods approach, combining qualitative and quantitative data in a within-subjects evaluation involving 20 participants. This comprehensive evaluation examines visual perception, visual quality, and user experience through a range of tasks. Usability, comfort, and cybersickness are assessed, with insights derived from both user performance metrics and subjective measures collected through in-depth interviews and comments. The study includes three distinct HMDs, two prototypes (PD1 and PD2) and one commercial device (CD1), to provide a broad analysis of the technology. Our findings reveal that while participants were generally satisfied with VST mode, their preferences varied across devices. CD1 was preferred for its realistic color representation and superior reading task performance due to its high-resolution display and camera. However, visual disturbances and temporal issues differed across devices, with CD1 exhibiting fewer artifacts when stationary but more disturbances when participants were moving. Participants found PD1 and PD2 more comfortable for extended use and reported fewer cybersickness symptoms with them, but they highlighted color and display resolution issues. These variations underscore the importance of considering both qualitative and quantitative measures in HMD evaluations. This mixed-methods evaluation emphasizes the limitations of relying solely on visual perception performance measures for VST HMDs. By integrating both quantitative and qualitative insights, we offer a more detailed evaluation framework to identify design flaws and user experience issues that quantitative metrics alone might miss. This methodology contributes to the field by illustrating how a mixed-methods approach provides a broader perspective on XR technology, guiding future improvements and enhancing VST adoption in future applications.
{"title":"Visual perception and user satisfaction in video see-through head-mounted displays: a mixed-methods evaluation","authors":"Jessica de Souza, Robert Tartz","doi":"10.3389/frvir.2024.1368721","DOIUrl":"https://doi.org/10.3389/frvir.2024.1368721","url":null,"abstract":"Our study addresses the challenges limiting the adoption of Extended Reality (XR) Head-Mounted Displays (HMDs), mainly focusing on device quality and cybersickness. We aim to investigate the impact of hardware and software on user experience and task performance while wearing Video See-Through (VST) HMDs. We employ a novel methodology designed to bridge the gaps identified in previous research.This study uses a convergent mixed-methods approach, combining qualitative and quantitative data in a within-subjects evaluation involving 20 participants. This comprehensive evaluation examines visual perception, visual quality, and user experience through a range of tasks. Usability, comfort, and cybersickness are assessed, with insights derived from both user performance metrics and subjective measures collected through in-depth interviews and comments. The study includes three distinct HMDs—two prototypes (PD1 and PD2) and one commercial device (CD1)—to provide a broad analysis of the technology.Our findings reveal that while participants were generally satisfied with VST mode, their preferences varied across devices. CD1 was preferred for its realistic color representation and superior reading task performance due to its high-resolution display and camera. However, visual disturbances and temporal issues differed across devices, with CD1 exhibiting fewer artifacts when stationary but showing more disturbances when participants were moving. Participants found PD1 and PD2 more comfortable for extended use and fewer cybersickness symptoms, but they highlighted color and display resolution issues. These variations underscore the importance of considering both qualitative and quantitative measures in HMD evaluations.This mixed-methods evaluation emphasizes the limitations of relying solely on visual perception performance measures for VST HMDs. By integrating both quantitative and qualitative insights, we offer a more detailed evaluation framework to identify design flaws and user experience issues that quantitative metrics alone might miss. This methodology contributes to the field by illustrating how a mixed-methods approach provides a broader perspective on XR technology, guiding future improvements and enhancing VST adoption in future applications.","PeriodicalId":502489,"journal":{"name":"Frontiers in Virtual Reality","volume":" 1268","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-06-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141363792","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Predicting the effect of headphones on the time to localize a target in an auditory-guided visual search task
Pub Date: 2024-06-07  DOI: 10.3389/frvir.2024.1359987
P. Lladó, R. Barumerli, R. Baumgartner, P. Majdak
In augmented reality scenarios, headphones obstruct the direct path of the sound to the ears, affecting users’ abilities to localize surrounding sound sources and compromising the immersive experience. Unfortunately, assessing the perceptual implications of wearing headphones on localization in ecologically valid scenarios is costly and time-consuming. Here, we propose a model-based tool for automatic assessment of the dynamic localization degradation (DLD) introduced by headphones, where the DLD describes the time required to find a target in an auditory-guided visual search task. First, we introduce the DLD scores obtained for twelve headphones together with the search times of actual listeners. Then, we describe the predictions of the headphone-induced DLD score obtained by an auditory model designed to simulate the listener’s search time. Our results indicate that our tool can predict the degradation score of unseen headphones. Thus, our tool can be applied to automatically assess the impact of headphones on listener experience in augmented reality applications.
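The abstract does not spell out how the DLD score is computed, so the sketch below shows one plausible, hypothetical way to summarize headphone-induced degradation from measured search times: the relative increase in median search time compared to an open-ear reference. This is an assumption for illustration, not the authors' definition.

```python
# Illustrative sketch only: one plausible way to summarise headphone-induced
# dynamic localization degradation (DLD) from search times. The exact score
# definition used by the authors is not reproduced here; this assumes DLD is
# the relative increase in median search time compared to open ears.
from statistics import median
from typing import Sequence

def dld_score(search_times_headphones: Sequence[float],
              search_times_open_ears: Sequence[float]) -> float:
    """Relative increase in median search time introduced by the headphones."""
    ref = median(search_times_open_ears)
    return (median(search_times_headphones) - ref) / ref

# Hypothetical search times in seconds for one headphone model.
print(dld_score([4.1, 5.0, 6.3, 4.8], [3.0, 3.4, 2.9, 3.6]))  # ~0.53: 53% slower
```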
{"title":"Predicting the effect of headphones on the time to localize a target in an auditory-guided visual search task","authors":"P. Lladó, R. Barumerli, R. Baumgartner, P. Majdak","doi":"10.3389/frvir.2024.1359987","DOIUrl":"https://doi.org/10.3389/frvir.2024.1359987","url":null,"abstract":"In augmented reality scenarios, headphones obstruct the direct path of the sound to the ears, affecting the users’ abilities to localize surrounding sound sources and compromising the immersive experience. Unfortunately, the assessment of the perceptual implications of wearing headphones on localization in ecologically valid scenarios is costly and time-consuming. Here, we propose a model-based tool for automatic assessment of the dynamic localization degradation (DLD) introduced by headphones describing the time required to find a target in an auditory-guided visual search task. First, we introduce the DLD score obtained for twelve headphones and the search times with actual listeners. Then, we describe the predictions of the headphone-induced DLD score obtained by an auditory model designed to simulate the listener’s search time. Our results indicate that our tool can predict the degradation score of unseen headphones. Thus, our tool can be applied to automatically assess the impact of headphones on listener experience in augmented reality applications.","PeriodicalId":502489,"journal":{"name":"Frontiers in Virtual Reality","volume":" 40","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-06-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141371975","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The overview effect and nature-relatedness
Pub Date: 2024-06-05  DOI: 10.3389/frvir.2024.1196312
Niall McKeever, Annahita Nezami, Dimitrios Kourtis
Climate scientists increasingly agree that human behavior significantly contributes to global warming and biodiversity decline. Recent research emphasizes the importance of human-nature connectedness as a reliable predictor of psychological wellbeing and increased engagement in pro-environmental behavior. While evidence supports a positive correlation between human-nature connectedness and pro-environmental behavior, establishing causation remains elusive. Nevertheless, exploring this link is crucial, given its potential to enhance pro-environmental behavior. Armed with this understanding, stakeholders can design and implement successful sustainability interventions that promote wellbeing on individual and collective levels. One psychological phenomenon believed to have a strong effect on human-nature connectedness and pro-environmental behavior is “The Overview Effect,” a term used to describe the shift in awareness some astronauts experience when viewing Earth from outside its atmosphere. This pilot study explored whether a 180-degree virtual reality Overview Effect experience created by EarthscapeVR® influences human-nature connectedness and whether a correlation exists between participants’ average human-nature connectedness scores and openness to experience scores. Sixty student participants took part in the study. The results showed significant increases in human-nature connectedness (p < 0.0021) in the experimental condition compared to the control group (p = 0.97), with no correlation (r = 0.137) between participants’ average human-nature connectedness scores and openness to experience scores. While these results are not conclusive and further research is necessary, the initial findings support translating the Overview Effect into virtual reality to promote human-nature connectedness in people.
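For readers unfamiliar with the statistics reported here, the sketch below runs the same kind of analysis on simulated data (not the study's data): a between-group comparison of human-nature connectedness change scores and a Pearson correlation with openness-to-experience scores.

```python
# Hypothetical analysis sketch (simulated data, not the study's): compare
# human-nature connectedness (HNC) change scores between conditions and
# correlate HNC with openness-to-experience scores.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated change scores (post minus pre) for 30 participants per group.
hnc_change_vr = rng.normal(loc=0.8, scale=0.6, size=30)       # VR Overview Effect
hnc_change_control = rng.normal(loc=0.0, scale=0.6, size=30)  # control condition

t_stat, p_value = stats.ttest_ind(hnc_change_vr, hnc_change_control)
print(f"Between-group difference: t = {t_stat:.2f}, p = {p_value:.4f}")

# Correlation between average HNC and openness scores (simulated as unrelated).
hnc_scores = rng.normal(loc=3.5, scale=0.5, size=60)
openness_scores = rng.normal(loc=3.2, scale=0.6, size=60)
r, p_r = stats.pearsonr(hnc_scores, openness_scores)
print(f"HNC vs. openness: r = {r:.3f}, p = {p_r:.3f}")
```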
{"title":"The overview effect and nature-relatedness","authors":"Niall McKeever, Annahita Nezami, Dimitrios Kourtis","doi":"10.3389/frvir.2024.1196312","DOIUrl":"https://doi.org/10.3389/frvir.2024.1196312","url":null,"abstract":"Climate scientists increasingly agree that human behavior significantly contributes to global warming and biodiversity decline. Recent research emphasizes the importance of human-nature connectedness as a reliable predictor of psychological wellbeing and increased engagement in pro-environmental behavior. While evidence supports a positive correlation between human-nature connectedness and pro-environmental behavior, establishing causation remains elusive. Nevertheless, exploring this link is crucial, given its potential to enhance pro-environmental behavior. Armed with this understanding, stakeholders can design and implement successful sustainability interventions that promote wellbeing on individual and collective levels. One psychological phenomenon believed to have a strong effect on human-nature connectedness and pro-environmental behavior is “The Overview Effect,” a term used to describe the shift in awareness some astronauts experience when viewing Earth from outside its atmosphere. This pilot study explored whether a 180-degree virtual reality Overview Effect experience created by EarthscapeVR® influences human-nature connectedness and whether a correlation exists between participants’ average human-nature connectedness scores and openness to experience scores. 60 student participants took part in the study. The results showed significant increases on human-nature connectedness (p < 0.0021) in the experimental condition compared to the control group (p = 0.97), with no correlation (r = 0.137) between participants’ average human-nature connectedness scores and openness to experience scores. While these results are not conclusive and further research is necessary, the initial findings support translating the Overview Effect into virtual reality to promote human-nature connectedness in people.","PeriodicalId":502489,"journal":{"name":"Frontiers in Virtual Reality","volume":"7 2","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-06-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141385069","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Investigating the efficacy of virtual reality exposure for crowd management: a real-world application
Pub Date: 2024-06-05  DOI: 10.3389/frvir.2024.1332794
Abdulaziz Alshaer
Introduction: Crowded spaces, especially during significant events or rituals, pose challenges in terms of safety and management. This study introduces a novel approach to address these challenges by leveraging Virtual Reality Exposure (VRE) as a potential solution. Using the Tawaf ritual around the Kaaba and the specific act of touching the Black Stone as a case study, this research explores how VRE can be employed to alter individual behaviors and perceptions in crowded spaces. Methods: Participants completed questionnaires assessing their behaviors and perceptions before and after exposure to two types of Virtual Reality environments: visual-only Virtual Reality (VR) and multi-sensory Virtual Reality (MVR). Results: Participants showed a marked decrease in their eagerness to physically interact with the Black Stone after the VR exposure. However, this eagerness saw a minor resurgence in the MVR setting, suggesting a more profound sense of immersion. Additionally, the MVR environment significantly enhanced participants' overall sense of presence and emotional intensity compared to the visual-only VR. Discussion: This research underscores the potential of VRE as a broader tool for crowd management in various settings, emphasizing its generalizability and contribution to the field. By harnessing the immersive capabilities of VRE, stakeholders can mitigate risks and enhance the experience in crowded scenarios.
{"title":"Investigating the efficacy of virtual reality exposure for crowd management: a real-world application","authors":"Abdulaziz Alshaer","doi":"10.3389/frvir.2024.1332794","DOIUrl":"https://doi.org/10.3389/frvir.2024.1332794","url":null,"abstract":"Introduction: Crowded spaces, especially during significant events or rituals, pose challenges in terms of safety and management. This study introduces a novel approach to address these challenges by leveraging Virtual Reality Exposure (VRE) as a potential solution. Using the Tawaf ritual around the Kaaba and the specific act of touching the Black Stone as a case study, this research explores how VRE can be employed to alter individual behaviors and perceptions in crowded spaces. Methods: Participants completed questionnaires assessing their behaviors and perceptions before and after exposure to two types of Virtual Reality environments: visual-only Virtual Reality (VR) and multi-sensory Virtual Reality (MVR).Results: Results indicated a marked decrease in participants' eagerness to physically interact with the Black Stone after the VR exposure. However, this eagerness saw a minor resurgence in the MVR setting, suggesting a more profound sense of immersion. Additionally, the MVR environment significantly enhanced the participants' overall sense of presence and emotional intensity compared to the visual-only VR.Discussion: This research underscores the potential of VRE as a broader tool for crowd management in various settings, emphasizing its generalizability and contribution to the field. By harnessing the immersive capabilities of VRE, stakeholders can mitigate risks and enhance the experience in crowded scenarios.","PeriodicalId":502489,"journal":{"name":"Frontiers in Virtual Reality","volume":"11 S1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-06-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141382571","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Efficacy of a remote virtual reality and EEG enabled psychotherapy system for the treatment of depressive symptoms
Pub Date: 2024-05-24  DOI: 10.3389/frvir.2024.1281017
Christopher Tacca, Barbara A. Kerr, Christopher McLamb, Kaylie Lyons Ridgway, Elizabeth Friis
More than 40% of the U.S. population have experienced mental health disorders since the COVID-19 pandemic, and 40% of this group received no treatment for their mental illness. Barriers to treatment include stigma, prohibitive cost, and a belief that treatment is inaccessible, particularly in isolated or rural communities. A novel remote, EEG-enhanced VR psychotherapy system was assessed for presence, restorativeness, and therapeutic efficacy in improving mood within a single positive, solution-focused session. Thirty adults experiencing depressive symptoms were randomly assigned to either a single-session Positive Solutions Focused counseling treatment via Zoom videoconferencing or the EEG-enabled VR psychotherapy system. Participants rated the environment in the VR-EEG therapy as more restorative than Zoom counseling, t = 2.928, p < .004, Cohen’s d = .259, and comparable to the Zoom session in presence. The VR-EEG system performed comparably to Zoom online counseling in clients’ session ratings of depth and smoothness, as well as in client reactions, positivity, and arousal. For a treatment to be considered empirically supported, and therefore valid for use in psychotherapy, it must have efficacy equal to or greater than that of a standard treatment or format. VR-EEG therefore shows promise as a positive, solution-focused, brief therapy for isolated clients with depressive symptoms.
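As a generic illustration of the kind of group comparison reported above (simulated ratings, not the study's data or exact analysis), the sketch below compares restorativeness ratings between two conditions using an independent-samples t-test and a pooled-standard-deviation Cohen's d.

```python
# Hypothetical sketch with simulated data (not the study's): restorativeness
# ratings for VR-EEG vs. Zoom sessions, compared with an independent-samples
# t-test and Cohen's d based on the pooled standard deviation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
vr_eeg = rng.normal(loc=5.6, scale=1.0, size=15)  # simulated ratings, 1-7 scale
zoom = rng.normal(loc=4.9, scale=1.0, size=15)

t_stat, p_value = stats.ttest_ind(vr_eeg, zoom)

def cohens_d(a: np.ndarray, b: np.ndarray) -> float:
    """Cohen's d using the pooled standard deviation."""
    pooled_var = (((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1))
                  / (len(a) + len(b) - 2))
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

print(f"t = {t_stat:.3f}, p = {p_value:.3f}, d = {cohens_d(vr_eeg, zoom):.3f}")
```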
{"title":"Efficacy of a remote virtual reality and EEG enabled psychotherapy system for the treatment of depressive symptoms","authors":"Christopher Tacca, Barbara A. Kerr, Christopher McLamb, Kaylie Lyons Ridgway, Elizabeth Friis","doi":"10.3389/frvir.2024.1281017","DOIUrl":"https://doi.org/10.3389/frvir.2024.1281017","url":null,"abstract":"More than 40% of the U.S. population have experienced mental health disorders since the COVID-19 pandemic. 40% of this group received no treatment for their mental illness. Barriers to treatment include stigma, prohibitive cost, and a belief that treatment is inaccessible, particularly in isolated or rural communities. A novel remote, EEG-enhanced VR psychotherapy system was assessed for its presence and restorativeness, and therapeutic efficacy in improving mood with a single session positive solution-focused session. Thirty adults experiencing depressive symptoms were randomly assigned to either a single session Positive Solutions Focused counseling treatment via Zoom videoconferencing, or the EEG enabled VR psychotherapy system. Participants rated the environment in the VR-EEG therapy as more restorative than Zoom counseling, t = 2.928, p < .004, Cohen’s d = .259, and comparable to the Zoom session in presence. The VR-EEG system performed comparably to Zoom online counseling in clients’ session ratings of depth and smoothness and client reactions, positivity, and arousal. For a treatment to be considered empirically supported, and therefore valid for use in psychotherapy, it must have equal or greater efficacy than a standard treatment or format. VR-EEG, therefore, has promise as a positive, solution-focused, brief therapy for isolated clients with depressive symptoms.","PeriodicalId":502489,"journal":{"name":"Frontiers in Virtual Reality","volume":"20 6","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-05-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141100020","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Towards believable and educational conversations with virtual patients
Pub Date: 2024-05-21  DOI: 10.3389/frvir.2024.1377210
Linda Graf, Philipp Sykownik, Gertraud Gradl-Dietsch, Maic Masuch
Virtual Reality (VR) technology allows the design and application of realistic yet adaptive learning environments in medical education. In particular, virtual patient systems have logistical and methodological advantages compared to non-computerized interventions. However, evidence for their effectiveness is fragmented, as each educational domain introduces its own requirements regarding learning goals, measurement of learning outcomes, and application design. In this context, we present preliminary results of evaluating a VR training application in which learners conduct a clinical interview with virtual patients to diagnose mental disorders in children and adolescents. The evaluation focuses on design elements related to the virtual patient’s appearance and natural language capabilities. Our results indicate that our virtual patient design is highly believable and that our dialog system is satisfying, although conversational flow requires optimization. We discuss design directions and potential enhancements for learner-virtual patient interactions in VR and outline future work to evaluate the effectiveness of our approach.
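As a toy illustration of a virtual patient's natural language capabilities (not the dialog system evaluated in the paper), the sketch below matches a clinician's question to a canned patient response via keyword intents; the intents and responses are invented for the example.

```python
# Minimal, hypothetical illustration of an intent-matched dialog turn for a
# virtual patient (not the authors' dialog system). Keyword sets and responses
# are invented for the example.
from typing import Dict, Tuple

INTENTS: Dict[str, Tuple[Tuple[str, ...], str]] = {
    "sleep": (("sleep", "tired", "awake"),
              "I lie awake for hours most nights."),
    "mood": (("feel", "mood", "sad"),
             "Most days I just feel flat and don't enjoy things anymore."),
    "school": (("school", "class", "grades"),
               "I've been skipping classes because I can't concentrate."),
}

FALLBACK = "I'm not sure what you mean. Could you ask that differently?"

def virtual_patient_reply(clinician_utterance: str) -> str:
    """Return the virtual patient's reply for a clinician's question."""
    text = clinician_utterance.lower()
    for keywords, response in INTENTS.values():
        if any(keyword in text for keyword in keywords):
            return response
    return FALLBACK

print(virtual_patient_reply("How have you been sleeping lately?"))
```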
{"title":"Towards believable and educational conversations with virtual patients","authors":"Linda Graf, Philipp Sykownik, Gertraud Gradl-Dietsch, Maic Masuch","doi":"10.3389/frvir.2024.1377210","DOIUrl":"https://doi.org/10.3389/frvir.2024.1377210","url":null,"abstract":"Virtual Reality (VR) technology allows the design and application of realistic but adaptive learning environments in medical education. In particular, virtual patient systems have logistical and methodological advantages compared to non-computerized interventions. However, evidence for their effectiveness is fragmented as any educational domain introduces its requirements regarding learning goals, measurements of learning outcomes, and application design. In this context, we present preliminary results of evaluating a VR training application for conducting a clinical interview to diagnose mental disorders in children and adolescents using virtual patients. The evaluation focuses on design elements related to the virtual patient’s appearance and natural language capabilities. Our results indicate that our virtual patient design is highly believable and that our dialog system is satisfying. However, conversational flow requires optimization. We discuss design directions and potential enhancements for learner-virtual patient interactions in VR and address future operations to evaluate the effectiveness of our approach.","PeriodicalId":502489,"journal":{"name":"Frontiers in Virtual Reality","volume":"74 3","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-05-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141116750","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The curse of employee privilege: harnessing virtual reality technology to inhibit workplace envy
Pub Date: 2024-03-28  DOI: 10.3389/frvir.2024.1260910
Anand van Zelderen, Nicky Dries, Jochen Menges
In many workplaces, managers provide some employees with unique privileges that support their professional development and stimulate productivity and creativity. Yet when some employees are deemed more deserving of a privileged status than others, co-workers who feel left out of the inner circle may begin to experience envy. Because workplace envy and intergroup conflict go hand in hand, the question arises whether co-worker acceptance of employee privileges, in which conflict is constrained through an affirmative re-evaluation of co-workers’ privileged status, may lower the envy experienced by employees. Using virtual reality technology, 112 employees participated in a virtual employee meeting at a virtual organization where they were exposed to a new workforce differentiation practice. Our experiment shows that co-worker acceptance of employee privileges reduces workplace envy, an effect that was partially mediated by employees’ anticipated ostracism. Moreover, this effect is found only for employees with privileges, who worry more about being ostracized than their non-privileged co-workers. We anticipate that our findings will enable managers to conscientiously differentiate between their employees, using virtual reality simulations to steer employees’ thoughts and feelings in a direction that benefits both employees and organizations.
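To clarify what "partially mediated" means here, the sketch below runs a simple regression-based mediation check on simulated data (not the study's data or analysis): the effect of co-worker acceptance on envy shrinks once anticipated ostracism enters the model.

```python
# Hypothetical sketch of a regression-based partial-mediation check (simulated
# data, not the study's): does anticipated ostracism carry part of the effect
# of co-worker acceptance on workplace envy?
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 112
acceptance = rng.normal(size=n)
ostracism = -0.5 * acceptance + rng.normal(scale=0.8, size=n)          # a-path
envy = 0.6 * ostracism - 0.2 * acceptance + rng.normal(scale=0.8, size=n)

df = pd.DataFrame({"acceptance": acceptance, "ostracism": ostracism, "envy": envy})

total = smf.ols("envy ~ acceptance", data=df).fit()
direct = smf.ols("envy ~ acceptance + ostracism", data=df).fit()

print(f"Total effect of acceptance:  {total.params['acceptance']:.3f}")
print(f"Direct effect (mediator in): {direct.params['acceptance']:.3f}")
# A direct effect noticeably smaller than the total effect (with a significant
# mediator coefficient) is consistent with partial mediation.
```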
{"title":"The curse of employee privilege: harnessing virtual reality technology to inhibit workplace envy","authors":"Anand van Zelderen, Nicky Dries, Jochen Menges","doi":"10.3389/frvir.2024.1260910","DOIUrl":"https://doi.org/10.3389/frvir.2024.1260910","url":null,"abstract":"In many workplaces, managers provide some employees with unique privileges that support their professional development and stimulate productivity and creativity. Yet with some employees more deserving of a privileged status than others, co-workers feeling left out of the inner circle may begin to exhibit feelings of envy. With workplace envy and intergroup conflicts going hand in hand, the question arises whether co-worker acceptance of employee privileges—where conflict can be constrained through an affirmative re-evaluation of co-workers’ privileged status—may lower the envy experienced by employees. Using virtual reality technology, 112 employees participated in a virtual employee meeting at a virtual organization where they were exposed to a new workforce differentiation practice. We show through our experiment that co-worker acceptance of employee privileges negatively influences workplace envy, which was partially mediated by the anticipated ostracism of employees. Moreover, we show that this effect is only found for employees with privileges, who worry more about being ostracized than their non-privileged co-workers. We anticipate that our findings will enable managers to conscientiously differentiate between their employees, using virtual reality simulations to steer employees’ thoughts and feelings in a direction that benefits both employees and organizations.","PeriodicalId":502489,"journal":{"name":"Frontiers in Virtual Reality","volume":"95 9","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-03-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140370997","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}