Pub Date: 2023-06-29  DOI: 10.3389/frvir.2023.1207397
Feedback in augmented and virtual reality piano tutoring systems: a mini review
Katie Wilson, Phillip E. Pfeiffer
Researchers in music education are exploring the use of virtual reality (VR) and augmented reality (AR) to support piano instruction. Beginner piano students tend to receive short, infrequent lessons and otherwise practice on their own. This lack of instructor feedback creates opportunities for students to develop improper technique. Current strategies for using AR and VR to guide solo practice use moving shapes to help students identify which notes to play. Improvements in commercial AR/VR technology will be needed to provide more detailed real-time feedback.
{"title":"Feedback in augmented and virtual reality piano tutoring systems: a mini review","authors":"Katie Wilson, Phillip E. Pfeiffer","doi":"10.3389/frvir.2023.1207397","DOIUrl":"https://doi.org/10.3389/frvir.2023.1207397","url":null,"abstract":"Researchers in music education are exploring the use of virtual reality (VR) and augmented reality (AR) to support piano instruction. Beginner piano students tend to receive short, infrequent lessons, which they practice on their own. This lack of instructor feedback creates opportunities for students to develop improper technique. Current strategies for using AR and VR to guide solo practice use moving shapes to help students to identify what notes to play. Improvements in commercial AR/VR technology will be needed to provide more detailed real-time feedback.","PeriodicalId":73116,"journal":{"name":"Frontiers in virtual reality","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2023-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45342560","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-06-28  DOI: 10.3389/frvir.2023.1172381
MetaReality: enhancing tactile experiences using actuated 3D-printed metamaterials in Virtual Reality
Martin Feick, Donald Degraen, Fabian Hupperich, A. Krüger
During interaction with objects in Virtual Reality, haptic feedback plays a crucial role in creating convincing immersive experiences. Recent work building upon passive haptic feedback has looked toward fabrication processes for designing and creating proxy objects able to communicate objects’ properties and characteristics. However, such approaches remain limited in scalability, as a corresponding object must be fabricated for each material. To create more flexible 3D-printed proxies, we explore the potential of metamaterials. To this end, we designed metamaterial structures able to alter their tactile surface properties, e.g., their hardness and roughness, upon lateral compression. In this work, we designed five different metamaterial patterns based on features known to affect tactile properties. We evaluated whether our samples could successfully convey different levels of roughness and hardness sensations at varying levels of compression. While we found that roughness was significantly affected by compression state, hardness did not seem to follow the same pattern. In a second study, we focused on two metamaterial patterns showing promise for roughness perception and investigated their visuo-haptic perception in Virtual Reality. Here, eight different compression states of our two selected metamaterials were overlaid with six visual material textures. Our results suggest that, especially at low compression states, our metamaterials were promising matches for the textures displayed to the participants. Additionally, when asked which material they perceived, participants used adjectives such as “broken” and “damaged.” This indicates that metamaterial surface textures could be able to simulate different object states. Our results underline that metamaterial design can extend the gamut of tactile experiences of 3D-printed surface structures, as a single sample can reconfigure its haptic sensation through compression. Graphical Abstract: the six visual material textures used in the main experiment (concrete, wood, plastic, fabric, glass, and metal), with an example of how participants interacted with the samples.
{"title":"MetaReality: enhancing tactile experiences using actuated 3D-printed metamaterials in Virtual Reality","authors":"Martin Feick, Donald Degraen, Fabian Hupperich, A. Krüger","doi":"10.3389/frvir.2023.1172381","DOIUrl":"https://doi.org/10.3389/frvir.2023.1172381","url":null,"abstract":"During interaction with objects in Virtual Reality haptic feedback plays a crucial role for creating convincing immersive experiences. Recent work building upon passive haptic feedback has looked towards fabrication processes for designing and creating proxy objects able to communicate objects’ properties and characteristics. However, such approaches remain limited in terms of scalability as for each material a corresponding object needs to be fabricated. To create more flexible 3D-printed proxies, we explore the potential of metamaterials. To this aim, we designed metamaterial structures able to alter their tactile surface properties, e.g., their hardness and roughness, upon lateral compression. In this work, we designed five different metamaterial patterns based on features that are known to affect tactile properties. We evaluated whether our samples were able to successfully convey different levels of roughness and hardness sensations at varying levels of compression. While we found that roughness was significantly affected by compression state, hardness did not seem to follow the same pattern. In a second study, we focused on two metamaterial patterns showing promise for roughness perception and investigated their visuo-haptic perception in Virtual Reality. Here, eight different compression states of our two selected metamaterials were overlaid with six visual material textures. Our results suggest that, especially at low compression states, our metamaterials were the most promising ones to match the textures displayed to the participants. Additionally, when asked which material participants perceived, adjectives, such as “broken” and “damaged” were used. This indicates that metamaterial surface textures could be able to simulate different object states. Our results underline that metamaterial design is able to extend the gamut of tactile experiences of 3D-printed surfaces structures, as a single sample is able to reconfigure its haptic sensation through compression. Graphical Abstract The six visual material textures: concrete, wood, plastic, fabric, glass and metal used in the main experiment. In addition, an example how participants interacted with the samples.","PeriodicalId":73116,"journal":{"name":"Frontiers in virtual reality","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2023-06-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"46830642","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-06-28  DOI: 10.3389/frvir.2023.1106061
Head-mounted augmented reality to support reassurance and social interaction for autistic children with severe learning disabilities
Valentin Bauer, Tifanie Bouchara, Olivier Duris, Charlotte Labossière, Marie-Noëlle Clément, P. Bourdot
Augmented Reality (AR) shows promise for complementing autism support approaches, but work so far has mainly focused on training socio-emotional abilities in autistic children with mild learning disabilities. To better consider autistic children with severe learning disabilities and complex needs (SLN), stakeholders advise using collaborative AR sensory-based mediation approaches. Magic Bubbles is a multisensory AR environment created on the basis of stakeholder interviews, then adapted for a day hospital setting in collaboration with practitioners, and finally validated in terms of acceptability and usability for autistic children with SLN. In this paper, we report on our latest study, which explores three main research questions: 1) To what extent can Magic Bubbles help autistic children with SLN feel secure? 2) To what extent can Magic Bubbles prompt the dyadic relationship between an autistic child with SLN and a practitioner? 3) What is the overall quality of experience for autistic children with SLN when using Magic Bubbles? To answer these questions, seven autistic children with SLN participated in at least six weekly sessions over three months in a day hospital setting. Data collection and analysis used qualitative and quantitative methods, mainly drawing upon grounded theory, to evaluate their experiences. Findings affirm all three research questions, offer a detailed account of children’s experiences with AR, and outline future directions.
{"title":"Head-mounted augmented reality to support reassurance and social interaction for autistic children with severe learning disabilities","authors":"Valentin Bauer, Tifanie Bouchara, Olivier Duris, Charlotte Labossière, Marie-Noëlle Clément, P. Bourdot","doi":"10.3389/frvir.2023.1106061","DOIUrl":"https://doi.org/10.3389/frvir.2023.1106061","url":null,"abstract":"Augmented Reality (AR) is promising to complement autism approaches, but so far has mainly focused on training socio-emotional abilities for autistic children with mild learning disabilities. To better consider autistic children with severe learning disabilities and complex needs (SLN), stakeholders advise using collaborative AR sensory-based mediation approaches. Magic Bubbles is a multisensory AR environment created based on stakeholders’ interviews, then adapted for a day hospital setting in collaboration with practitioners, and finally validated in terms of acceptability and usability for autistic children with SLN. In this paper, we report on our latest study that explores three main research questions: 1) To what extent can Magic Bubbles secure autistic children with SLN? 2) To what extent can Magic Bubbles prompt the dyadic relationship between an autistic child with SLN and a practitioner? 3) What is the overall quality of experience for autistic children with SLN when using Magic Bubbles? To answer these questions, seven autistic children with SLN participated in at least six weekly sessions over three months in a day hospital setting. Data collection and analysis used qualitative and quantitative methods, mainly drawing upon grounded theory to evaluate their experiences. Findings validate the three research questions, offer a detailed account of children’s experiences with AR, and outline future directions.","PeriodicalId":73116,"journal":{"name":"Frontiers in virtual reality","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2023-06-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"46584656","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-06-27  DOI: 10.3389/frvir.2023.940536
Bronchoscopy using a head-mounted mixed reality device—a phantom study and a first in-patient user experience
Arne Kildahl-Andersen, E. F. Hofstad, Hanne Sorger, T. Amundsen, T. Langø, H. O. Leira, G. Kiss
Background: Bronchoscopy for peripheral lung lesions may involve image sources such as computed tomography (CT), fluoroscopy, radial endobronchial ultrasound (R-EBUS), and virtual/electromagnetic navigation bronchoscopy. Our objective was to evaluate the feasibility of replacing these multiple monitors with a head-mounted display (HMD) that keeps relevant image data in the bronchoscopist’s line of sight at all times. Methods: A total of 17 pulmonologists wearing an HMD (Microsoft® HoloLens 2) performed bronchoscopy with electromagnetic navigation in a lung phantom. The bronchoscopists first conducted an endobronchial inspection and navigation to the target, followed by an endobronchial ultrasound bronchoscopy. The HMD experience was evaluated using a questionnaire. Finally, the HMD was used in bronchoscopy inspection and electromagnetic navigation of two patients presenting with hemoptysis. Results: In the phantom study, the perceived quality of video and ultrasound images was assessed using a visual analog scale, with 100% representing optimal image quality. The score for video quality was 58% (95% confidence interval [CI] 48%–68%); for ultrasound image quality, the score was 43% (95% CI 30%–56%). Contrast, color rendering, and resolution were all considered suboptimal. Despite adjusting the brightness settings, video image rendering was considered too dark. Navigation to the target for biopsy sampling was accomplished by all participants, with no significant difference in procedure time between experienced and less experienced bronchoscopists. The overall system latency for the image stream was 0.33–0.35 s. Fifteen of the pulmonologists would consider using HoloLens for navigation in the periphery; two would not consider using HoloLens in bronchoscopy at all. In the human study, bronchoscopy inspection was feasible for both patients. Conclusion: Bronchoscopy using an HMD was feasible in a lung phantom and in two patients. Video and ultrasound image quality was considered inferior to that of video monitors. HoloLens 2 was suboptimal for airway and mucosa inspection but may be adequate for virtual bronchoscopy navigation.
{"title":"Bronchoscopy using a head-mounted mixed reality device—a phantom study and a first in-patient user experience","authors":"Arne Kildahl-Andersen, E. F. Hofstad, Hanne Sorger, T. Amundsen, T. Langø, H. O. Leira, G. Kiss","doi":"10.3389/frvir.2023.940536","DOIUrl":"https://doi.org/10.3389/frvir.2023.940536","url":null,"abstract":"Background: Bronchoscopy for peripheral lung lesions may involve image sources such as computed tomography (CT), fluoroscopy, radial endobronchial ultrasound (R-EBUS), and virtual/electromagnetic navigation bronchoscopy. Our objective was to evaluate the feasibility of replacing these multiple monitors with a head-mounted display (HMD), always providing relevant image data in the line of sight of the bronchoscopist. Methods: A total of 17 pulmonologists wearing a HMD (Microsoft® HoloLens 2) performed bronchoscopy with electromagnetic navigation in a lung phantom. The bronchoscopists first conducted an endobronchial inspection and navigation to the target, followed by an endobronchial ultrasound bronchoscopy. The HMD experience was evaluated using a questionnaire. Finally, the HMD was used in bronchoscopy inspection and electromagnetic navigation of two patients presenting with hemoptysis. Results: In the phantom study, the perceived quality of video and ultrasound images was assessed using a visual analog scale, with 100% representing optimal image quality. The score for video quality was 58% (95% confidence interval [CI] 48%–68%) and for ultrasound image quality, the score was 43% (95% CI 30%–56%). Contrast, color rendering, and resolution were all considered suboptimal. Despite adjusting the brightness settings, video image rendering was considered too dark. Navigation to the target for biopsy sampling was accomplished by all participants, with no significant difference in procedure time between experienced and less experienced bronchoscopists. The overall system latency for the image stream was 0.33–0.35 s. Fifteen of the pulmonologists would consider using HoloLens for navigation in the periphery, and two would not consider using HoloLens in bronchoscopy at all. In the human study, bronchoscopy inspection was feasible for both patients. Conclusion: Bronchoscopy using an HMD was feasible in a lung phantom and in two patients. Video and ultrasound image quality was considered inferior to that of video monitors. HoloLens 2 was suboptimal for airway and mucosa inspection but may be adequate for virtual bronchoscopy navigation.","PeriodicalId":73116,"journal":{"name":"Frontiers in virtual reality","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2023-06-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"48824176","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-06-27  DOI: 10.3389/frvir.2023.1130156
The interactive medical simulation toolkit (iMSTK): an open source platform for surgical simulation
Jacob Moore, Harald Scheirich, Shreeraj Jadhav, A. Enquobahrie, B. Paniagua, Andrew Wilson, Aaron Bray, G. Sankaranarayanan, Rachel B. Clipp
Introduction: Human error is one of the leading causes of medical error. It is estimated that human error leads to between 250,000 and 440,000 deaths each year. Medical simulation has been shown to improve the skills and confidence of clinicians and reduce medical errors. Surgical simulation is critical for training surgeons in complicated procedures and can be particularly effective for skill retention. Methods: The interactive Medical Simulation Toolkit (iMSTK) is an open-source platform offering position-based dynamics, continuous collision detection, smoothed-particle hydrodynamics, integrated haptics, and compatibility with Unity and Unreal, among others. iMSTK provides a wide range of real-time simulation capabilities under a flexible open-source license (Apache 2.0) that encourages adoption across the research and commercial simulation communities. iMSTK uses extended position-based dynamics and established collision and constraint implementations to model biological tissues and their interactions with medical tools and other tissues. Results: The platform demonstrates performance compatible with real-time simulation incorporating both visualization and haptics. iMSTK has been used in a variety of virtual simulations, including laparoscopic hiatal hernia surgery, laparoscopic cholecystectomy, osteotomy procedures, and kidney biopsy procedures. Discussion: iMSTK currently supports building simulations for a wide range of surgical scenarios. Future work includes expanding Unity support to make the toolkit easier to use and improving computational speed to allow larger scenes and finer meshes for larger surgical procedures.
{"title":"The interactive medical simulation toolkit (iMSTK): an open source platform for surgical simulation","authors":"Jacob Moore, Harald Scheirich, Shreeraj Jadhav, A. Enquobahrie, B. Paniagua, Andrew Wilson, Aaron Bray, G. Sankaranarayanan, Rachel B. Clipp","doi":"10.3389/frvir.2023.1130156","DOIUrl":"https://doi.org/10.3389/frvir.2023.1130156","url":null,"abstract":"Introduction: Human error is one of the leading causes of medical error. It is estimated that human error leads to between 250,000 and 440,000 deaths each year. Medical simulation has been shown to improve the skills and confidence of clinicians and reduce medical errors. Surgical simulation is critical for training surgeons in complicated procedures and can be particularly effective in skill retention. Methods: The interactive Medical Simulation Toolkit (iMSTK) is an open source platform with position-based dynamics, continuous collision detection, smooth particle hydrodynamics, integrated haptics, and compatibility with Unity and Unreal, among others. iMSTK provides a wide range of real-time simulation capabilities with a flexible open-source license (Apache 2.0) that encourages adoption across the research and commercial simulation communities. iMSTK uses extended position-based dynamics and an established collision and constraint implementations to model biological tissues and their interactions with medical tools and other tissues. Results: The platform demonstrates performance, that is, compatible with real-time simulation that incorporates both visualization and haptics. iMSTK has been used in a variety of virtual simulations, including for laparoscopic hiatal hernia surgery, laparoscopic cholecystectomy, osteotomy procedures, and kidney biopsy procedures. Discussion: iMSTK currently supports building simulations for a wide range of surgical scenarios. Future work includes expanding Unity support to make it easier to use and improving the speed of the computation to allow for larger scenes and finer meshes for larger surgical procedures.","PeriodicalId":73116,"journal":{"name":"Frontiers in virtual reality","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2023-06-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45835208","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-06-22  DOI: 10.3389/frvir.2023.1155700
RealityMedia: immersive technology and narrative space
So-youn Jang, Jisook Park, M. Engberg, B. MacIntyre, J. Bolter
In this paper, we treat VR as a new writing space in the long tradition of inscription. Constructing Virtual Reality (VR) narratives can then be understood as a process of inscribing text in space, and consuming them as a process of “reading” the space. Our research objective is to explore the meaning-making process afforded by spatial narratives—to test whether VR facilitates traditional ways of weaving complex, multiple narrative strands and provides new opportunities for leveraging space. We argue that, as opposed to the linear space of a printed book, a VR narrative space is similar to the physical space of a museum and can be analyzed on three distinct levels: (1) the architecture of the space itself, (2) the collection, and (3) the individual artifacts. To provide a deeper context for designing VR narratives, we designed and implemented a testbed called RealityMedia to explore digital remediations of traditional narrative devices and the spatial, immersive, and interactive affordances of VR. We conducted a task-based user study using a VR headset and follow-up qualitative interviews with 20 participants. Our results highlight how the three semantic levels (space, collection, and artifacts) can work together to constitute meaningful narrative experiences in VR.
{"title":"RealityMedia: immersive technology and narrative space","authors":"So-youn Jang, Jisook Park, M. Engberg, B. MacIntyre, J. Bolter","doi":"10.3389/frvir.2023.1155700","DOIUrl":"https://doi.org/10.3389/frvir.2023.1155700","url":null,"abstract":"In this paper, we treat VR as a new writing space in the long tradition of inscription. Constructing Virtual Reality (VR) narratives can then be understood as a process of inscribing text in space, and consuming them as a process of “reading” the space. Our research objective is to explore the meaning-making process afforded by spatial narratives—to test whether VR facilitates traditional ways of weaving complex, multiple narrative strands and provides new opportunities for leveraging space. We argue that, as opposed to the linear space of a printed book, a VR narrative space is similar to the physical space of a museum and can be analyzed on three distinct levels: (1) the architecture of the space itself, (2) the collection, and (3) the individual artifacts. To provide a deeper context for designing VR narratives, we designed and implemented a testbed called RealityMedia to explore digital remediations of traditional narrative devices and the spatial, immersive, and interactive affordances of VR. We conducted task-based user study using a VR headset and follow-up qualitative interviews with 20 participants. Our results highlight how the three semantic levels (space, collection, and artifacts) can work together to constitute meaningful narrative experiences in VR.","PeriodicalId":73116,"journal":{"name":"Frontiers in virtual reality","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2023-06-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"47399850","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-06-14  DOI: 10.3389/frvir.2023.1167051
Being where, with whom, and when it happens: spatial, interpersonal, and temporal presence while viewing live streaming of collegiate sports in virtual reality
Andrew Vincent, P. Frewen
Introduction: Although virtual reality (VR) is best known for its gaming applications, other entertainment applications are increasingly being explored, including in the sports media industry; however, little research has examined the experiences induced by VR viewing of a live sporting event. Materials and methods: Participants (n = 93) were university students approached in a field study at a community eatery area on the university campus. They watched brief segments of a 360° live stream of their university volleyball and basketball teams’ home games, both while wearing and while not wearing an inexpensive smartphone-based head-mounted display (HMD). Immediately afterward, participants reported on their relative experience of spatial, interpersonal, and temporal presence, as well as their satisfaction with and preference for each of the two viewing modalities, in response to brief face-valid screening questions. Results: The majority of participants experienced greater presence while wearing the VR headset, and approximately one in every two reported preferring to watch the games in VR. Participants’ experience of spatial presence independently correlated with preferring to watch the games in VR. Discussion: Media vendors should offer VR viewing of sports, including via inexpensive smartphone-mediated VR, as an additional, cost-effective means of heightening fans’ experience of virtual presence at games they cannot attend in person.
{"title":"Being where, with whom, and when it happens: spatial, interpersonal, and temporal presence while viewing live streaming of collegiate sports in virtual reality","authors":"Andrew Vincent, P. Frewen","doi":"10.3389/frvir.2023.1167051","DOIUrl":"https://doi.org/10.3389/frvir.2023.1167051","url":null,"abstract":"Introduction: Although virtual reality (VR) is most popularly known for its applications to gaming, other entertainment applications are increasingly being explored including in the sports media industry, but little research has so far examined the experiences induced by VR viewing of a live sporting event. Materials and methods: Participants (n = 93) were university students who were approached in the context of a field study from a nearby community eatery area on the university campus to watch brief segments of a 360° live stream of the home games of their university volleyball and basketball teams both while wearing and not wearing an inexpensive smart-phone based head-mounted display (HMD). Immediately afterward, participants then reported on their relative experience of spatial, interpersonal, and temporal presence, as well as their satisfaction-preference with each of the two viewing modalities, in response to brief face-valid screening questions. Results: The majority of participants experienced greater presence while wearing the VR headset, and approximately one in every two reported preferring to watch the games in VR. Participants’ experience of spatial presence independently correlated with preferring to watch the games in VR. Discussion: Media vendors should offer VR viewing of sports including via inexpensive, smart-phone mediated VR as an additional, cost-effective alternative means of heightening fans’ experience of virtual presence at the games when fans are unable to go to the games in person.","PeriodicalId":73116,"journal":{"name":"Frontiers in virtual reality","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2023-06-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"46984360","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-06-13  DOI: 10.3389/frvir.2023.1151190
Effects of virtual reality and test environment on user experience, usability, and mental workload in the evaluation of a blood pressure monitor
N. Hinricher, Simon König, Chris Schröer, C. Backhaus
User experience and user acceptance of a product are essential for the product’s success. Virtual reality (VR) technology has the potential to assess these parameters early in the development process. However, research is scarce on whether evaluating the user experience and user acceptance of prototypes in VR, along with simulating the usage environment, yields results comparable to reality. To investigate this, a digital twin of a blood pressure monitor (BPM) was created in VR. In a 2 × 2 factorial between-subjects design, 48 participants tested the real or the virtual BPM, either in a low-detail room at a desk or in a detailed operating room (OR) environment. Participants executed three use scenarios with the BPM and rated their user experience and acceptance with standardized questionnaires. A test leader evaluated the performance of the participants’ actions using a three-point scheme. The number of user interactions, task time, and perceived workload were assessed. Participants rated the user experience of the BPM significantly better in VR (p < .05). User acceptance was significantly higher when the device was tested in VR and in a detailed OR environment. Participant performance and time on task did not differ significantly between VR and reality, but there was significantly less interaction with the device in VR (p < .001). Participants who tested the device in a detailed OR environment rated their own performance significantly worse. In reality, participants were able to experience the device haptically and thus better assess its quality. Overall, this study shows that user evaluations in VR should focus on objective criteria, such as user errors; subjective criteria, such as user experience, are significantly biased by VR.
{"title":"Effects of virtual reality and test environment on user experience, usability, and mental workload in the evaluation of a blood pressure monitor","authors":"N. Hinricher, Simon König, Chris Schröer, C. Backhaus","doi":"10.3389/frvir.2023.1151190","DOIUrl":"https://doi.org/10.3389/frvir.2023.1151190","url":null,"abstract":"User experience and user acceptance of a product are essential for the product’s success. Virtual reality (VR) technology has the potential to assess these parameters early in the development process. However, research is scarce on whether the evaluation of the user experience and user acceptance of prototypes in VR, as well as the simulation of the usage environment, lead to comparable results to reality. To investigate this, a digital twin of a blood pressure monitor (BPM) was created using VR. In a 2 × 2 factorial between-subjects design, 48 participants tested the real or VR BPM. The tests were performed either in a low-detail room at a desk or in a detailed operating room (OR) environment. Participants executed three use scenarios with the BPM and rated their user experience and acceptance with standardized questionnaires. A test leader evaluated the performance of the participants’ actions using a three-point scheme. The number of user interactions, task time, and perceived workload were assessed. The participants rated the user experience of the BPM significantly (p < .05) better in VR. User acceptance was significantly higher when the device was tested in VR and in a detailed OR environment. Participant performance and time on task did not significantly differ between VR and reality. However, there was significantly less interaction with the VR device (p < .001). Participants who tested the device in a detailed OR environment rated their performance significantly worse. In reality, the participants were able to haptically experience the device and thus better assess its quality. Overall, this study shows that user evaluations in VR should focus on objective criteria, such as user errors. Subjective criteria, such as user experience, are significantly biased by VR.","PeriodicalId":73116,"journal":{"name":"Frontiers in virtual reality","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2023-06-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"42587491","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2023-06-06  DOI: 10.3389/frvir.2023.952741
Scoping review of the hardware and software features of virtual reality exposure therapy for social anxiety disorder, agoraphobia, and specific phobia
B. Arnfred, Johanna Kvist Svendsen, Ali Adjourlu, Carsten Horthøj
Introduction: The use of virtual reality technology to deliver exposure therapy in the treatment of phobic anxiety (i.e., social anxiety disorder, agoraphobia, and specific phobia) has been proposed to be advantageous compared with in-vivo exposure therapy. These supposed advantages depend on the features of the virtual reality technology and how it is used therapeutically. The aim of this study was therefore to provide a comprehensive overview of the features of the hardware and software used in studies of virtual reality exposure therapy for phobic anxiety disorders. Methods: Seventy studies using virtual reality exposure therapy to treat social anxiety disorder, agoraphobia, and/or specific phobia were systematically reviewed for 46 data points relating to these features. Results: We found that studies generally did not utilize contemporary virtual reality technology and that hardware and software features were inconsistently delineated. Discussion: These findings imply that modern virtual reality technology represents a relevant frontier in anxiety treatment and that a framework for reporting the technical features of virtual reality exposure interventions would benefit the field.
{"title":"Scoping review of the hardware and software features of virtual reality exposure therapy for social anxiety disorder, agoraphobia, and specific phobia","authors":"B. Arnfred, Johanna Kvist Svendsen, Ali Adjourlu, Carsten Horthøj","doi":"10.3389/frvir.2023.952741","DOIUrl":"https://doi.org/10.3389/frvir.2023.952741","url":null,"abstract":"Introduction: The use of virtual reality technology to deliver exposure therapy in the treatment of phobic anxiety (i.e., social anxiety disorder, agoraphobia, and specific phobia) has been proposed to be advantageous compared with in-vivo exposure therapy. These supposed advantages depend on the features of the virtual reality technology and how it is used therapeutically. Therefore, the aim of this study was to provide a comprehensive overview of the features of the hardware and software used in studies examining virtual reality exposure therapy studies for phobic anxiety disorders. Methods: 70 studies using virtual reality exposure therapy to treat social anxiety disorder, agoraphobia and/or specific phobia, were systematically reviewed for 46 data points relating to these features. Results: We found that studies generally did not utilize contemporary virtual reality technology and that hardware and software features were inconsistently delineated. Discussion: The implications of these findings are that the use of modern virtual reality technology represents a relevant frontier in anxiety treatment and that a framework for reporting technical features of virtual reality exposure interventions would benefit the field.","PeriodicalId":73116,"journal":{"name":"Frontiers in virtual reality","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2023-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"48192848","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}