Design and Realization of Production Monitoring System for Seamless Steel Pipe Based on Virtual Reality
Pub Date: 2023-05-12 | DOI: 10.1109/ICVR57957.2023.10169632
Yuanbin Shi, Y. Sun, Xiaochen Wang, Quan‐wen Yang, Jiaqi Chen
A production monitoring system for seamless steel pipe is critical to the quality of steel pipe products. However, because of the changeable production environment, scattered production information, difficult observation conditions, and potential safety hazards, monitoring the production status of seamless steel pipe is inefficient. To address these problems, a new monitoring system based on virtual reality technology is designed and developed, which reconstructs the real production scene in a 3D virtual space. The 3D model is used to reproduce the on-site production process and realize three functions: production scene roaming, production status monitoring, and personnel training. The system effectively improves the visibility and convenience of on-site monitoring and enhances the efficiency of production management.
{"title":"Design and Realization of Production Monitoring System for Seamless Steel Pipe Based on Virtual Reality","authors":"Yuanbin Shi, Y. Sun, Xiaochen Wang, Quan‐wen Yang, Jiaqi Chen","doi":"10.1109/ICVR57957.2023.10169632","DOIUrl":"https://doi.org/10.1109/ICVR57957.2023.10169632","url":null,"abstract":"The production monitoring system of seamless steel pipe is particularly critical to the quality of steel pipe products. However, due to the changeable production environment, scattered production information, difficult observation condition, and potential safety hazards, the production status monitoring of seamless steel pipe is inefficient. To solve the above problems, a new monitoring system is designed and developed based on virtual reality technology, which establishes the real production scene in a 3D virtual space. The 3D model is used to reproduce the on-site production process and realize the three functions of production scene roaming, production status monitoring, and personnel training. The system effectively improves the visibility and convenience of on-site monitoring and enhances the efficiency of production management.","PeriodicalId":439483,"journal":{"name":"2023 9th International Conference on Virtual Reality (ICVR)","volume":"302 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-05-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123120974","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
A Playable 3D Virtual Tour for an Interactive Campus Visit Experience: Showcasing School Facilities to Attract Potential Enrollees
Pub Date: 2023-05-12 | DOI: 10.1109/ICVR57957.2023.10169768
Manuel B. Garcia, Danna May C. Mansul, Eymard Pempina, M. R. L. Perez, Rossana Adao
Advances in technology have revolutionized the student recruitment strategies employed by educational institutions. These innovations have led to the adoption of virtual campus tours that give prospective students an immersive visit to school facilities replicated in a digital environment. However, existing virtual tour technologies pose challenges, including cybersickness in virtual reality and limited interactivity in 360-degree videos. In this study, we fill these research gaps by developing a playable and interactive campus virtual tour through which potential enrollees can visit and tour the campuses remotely. In addition to a series of beta tests with enrolled students, we recruited students specializing in game development and their associates to evaluate the application using an extended Technology Acceptance Model (TAM) framework. In this evaluation, we found that the application was well received by prospective students and was regarded as useful in delivering an immersive campus visit experience. From the TAM perspective, there was a significant difference in how enrolled and potential students assessed the application in terms of perceived usefulness and behavioral intention. The positive acceptance of the application leads to the recommendation of playable campus virtual tours as a tool for improving student recruitment strategies.
{"title":"A Playable 3D Virtual Tour for an Interactive Campus Visit Experience: Showcasing School Facilities to Attract Potential Enrollees","authors":"Manuel B. Garcia, Danna May C. Mansul, Eymard Pempina, M. R. L. Perez, Rossana Adao","doi":"10.1109/ICVR57957.2023.10169768","DOIUrl":"https://doi.org/10.1109/ICVR57957.2023.10169768","url":null,"abstract":"Advances in technology have revolutionized student recruitment strategies employed by educational institutions. These innovations led to the adoption of virtual campus tours to provide prospective students with an immersive expedition into the school facilities replicated in a digital environment. However, the existing virtual tour technologies pose challenges, including cybersickness in virtual reality and limited interactivity in 360-degree videos. In this study, we fill these research gaps by developing a playable and interactive campus virtual tour where potential enrollees can visit and tour the campuses remotely. In addition to a series of beta tests with enrolled students, we recruited students specializing in game development and their associates to evaluate the application using an extended Technology Acceptance Model (TAM) framework. In this evaluation, we found that the application was well-received by prospective students and was regarded as useful in delivering an immersive campus visit experience. From the TAM perspective, it was evident that there was a significant difference in how enrolled and potential students assess the application in terms of perceived usefulness and behavioral intention. The positive acceptance of the application led to the recommendation of playable campus virtual tours as a tool for improving student recruitment strategies.","PeriodicalId":439483,"journal":{"name":"2023 9th International Conference on Virtual Reality (ICVR)","volume":"21 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-05-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116628168","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Use of Mixed Reality in HVAC System Equipment Fault Detection and Diagnosis Method
Pub Date: 2023-05-12 | DOI: 10.1109/ICVR57957.2023.10169760
Jiang Haigang, Ling Rui, Tang Linfeng
In traditional building HVAC system operation and maintenance, some equipment is installed in concealed locations within the building, so many repair and maintenance operations are performed in confined spaces with poor visibility, and fault diagnosis of building air-conditioning systems is often hampered by on-site work coordination issues, resulting in low fault repair efficiency. To address these problems, this paper proposes superimposing Mixed Reality (MR) technology on the Building Information Model (BIM) to develop a BIM+MR-based fault diagnosis system for building HVAC equipment, using digital twin technology to improve immersive and remote visualization and interaction during HVAC equipment maintenance. Technical verification shows that, after applying MR technology, the efficiency of air-conditioning equipment fault diagnosis in the pilot project increased by a factor of 1.88.
{"title":"Use of Mixed Reality in HVAC System Equipment Fault Detection and Diagnosis Method","authors":"Jiang Haigang, Ling Rui, Tang Linfeng","doi":"10.1109/ICVR57957.2023.10169760","DOIUrl":"https://doi.org/10.1109/ICVR57957.2023.10169760","url":null,"abstract":"In the traditional building HVAC system equipment operation and maintenance process, some of the HVAC system equipment installation locations are hidden in the building space, so many HVAC equipment repair and maintenance operations have limited maintenance space and low visualization, and the building air conditioning system fault diagnosis process is often plagued by on-site work collaboration factors resulting in low fault repair efficiency. Based on the above problems, this paper proposes to superimpose Mixed Reality (MR) technology on top of the Building Information Model (BIM) to develop a BIM+MR-based building HVAC system equipment fault diagnosis system to improve the immersive and remote visualization interaction capability in the HVAC system equipment maintenance process through the digital twin technology. The BIM+MR fault diagnosis system is developed to improve the immersive and remote visualization interaction capability in the HVAC system equipment maintenance process using digital twin technology. The technical verification shows that the efficiency of HVAC’s air conditioning equipment fault diagnosis in the pilot project has increased by 1.88 times after applying MR technology.","PeriodicalId":439483,"journal":{"name":"2023 9th International Conference on Virtual Reality (ICVR)","volume":"56 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-05-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125474754","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Using Augmented Reality to Enhance Learning and Understanding of Abstract Programming Concepts
Pub Date: 2023-05-12 | DOI: 10.1109/ICVR57957.2023.10169459
O. E. Cinar, K. Rafferty, David Cutting, Hui Wang
This paper presents the implementation of an Augmented Reality (AR) application to enhance the understanding of Python collection data types. AR is a technology that has gained popularity in recent years and has the potential to transform how we learn, work and think; its importance in engineering education has grown accordingly. One challenge is that students often struggle with abstract programming concepts and find programming hard to conceptualise. More specifically, first-year electrical and electronic engineering students may have difficulty understanding Python collection data types (List, Tuple and Dictionary) and their main differences. AR is one possible solution to this problem. An AR application was implemented to make this software concept more visible and easier to comprehend through AR and 3D visualisation. A user study was defined in which two groups were respectively provided the AR application and the same information in a printed booklet. Comprehension tests taken before and after each group used its learning resource were used to gauge how effective each intervention was, and the results were analysed with a t-test. Using the AR application (post-test score mean: 8.8) rather than reading the booklet (post-test score mean: 7.5) resulted in a higher mean test score.
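As a hedged illustration of the analysis described in this abstract, the sketch below shows how an independent-samples t-test over post-test comprehension scores might be run. The score arrays and group sizes are hypothetical (chosen only so the means match the reported 8.8 and 7.5); they are not the study's data.

```python
# Illustrative sketch only: hypothetical scores, not the study's data.
from scipy import stats

# Post-test comprehension scores (out of 10) for each group.
ar_group = [9, 8, 10, 9, 8, 9, 10, 8, 9, 8]       # used the AR application
booklet_group = [7, 8, 7, 6, 8, 7, 9, 7, 8, 8]    # read the printed booklet

# Independent-samples t-test comparing the two post-test means.
t_stat, p_value = stats.ttest_ind(ar_group, booklet_group)

print(f"AR mean: {sum(ar_group) / len(ar_group):.1f}")
print(f"Booklet mean: {sum(booklet_group) / len(booklet_group):.1f}")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```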
{"title":"Using Augmented Reality to Enhance Learning and Understanding of Abstract Programming Concepts","authors":"O. E. Cinar, K. Rafferty, David Cutting, Hui Wang","doi":"10.1109/ICVR57957.2023.10169459","DOIUrl":"https://doi.org/10.1109/ICVR57957.2023.10169459","url":null,"abstract":"This paper presents the implementation of an Augmented Reality (AR) application in order to enhance the understanding of Python collection data types. AR is a technology which has gained popularity in recent years. This technology has the potential of completely transforming how we learn, work and think. In this context, the importance of this technology has increased visibly in terms of engineering education. One of the challenges encountered is that students often struggle with abstract programming concepts and find programming hard to conceptualise. More specifically, first-year electrical and electronic engineering students may have difficulty in understanding and learning Python collection data types (List, Tuple and Dictionary) and their main differences. AR is one of the solutions to solve this problem. An AR application has been implemented by making this software concept more visible and easier to comprehend through AR and 3D visualisation. A user study was defined in which two groups were respectively provided the AR application and the same information contained in a printed booklet. Comprehension tests on the subject before and after they used the learning resource were used to gauge how effective each intervention was. The t-test method was used to analyse the user study results. It was concluded that using the AR application (post-test score mean: 8.8) rather than reading the booklet (post-test score mean: 7.5) resulted in a higher test score mean.","PeriodicalId":439483,"journal":{"name":"2023 9th International Conference on Virtual Reality (ICVR)","volume":"54 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-05-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115438661","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The Application of Digital Virtual Man in the Cultural Communication of Grand Canal — Taking “Huai Xiaobu” and “Huai Xiaomei” as Examples
Pub Date: 2023-05-12 | DOI: 10.1109/ICVR57957.2023.10169666
Gao Xiang, Mu Xiaomin, Qin Zhuangyan, Seo Eun Kyeong, Wu Qitao, Deng Bangkun
This paper studies the official virtual characters “Huai Xiaobu” and “Huai Xiaomei” of Huai’an City and how virtual reality technology can enable the transmission of canal culture. Against the background of the metaverse, and taking the transmission of Huai’an Canal culture as the entry point, virtual reality technology is analyzed and summarized: first, on the basis of the traditional IP, 3D design software is used to create the digital virtual humans and the associated VR designs. Based on the images of “Huai Xiaobu” and “Huai Xiaomei”, a series of derivative designs and videos combining the virtual images with reality are produced, and the communication forms of the virtual character “Huai Xiaobu” are realized in three cases (virtual reality, augmented reality, and mixed reality) to further enhance the communication power and influence of Huai’an new media. The paper introduces the basic principles of the various methods and the specific technical difficulties overcome in each case, and compares and summarizes the advantages and disadvantages of the various forms, ultimately raising the transmission rate of canal cultural information and meeting the emerging needs of the audience. This highly comprehensive presentation of the city’s history and humanity not only directly expresses the cultural and economic characteristics of Huai’an, broadens the depth and breadth of communication, and helps build a multi-level, multi-dimensional communication ecology, but can also become a calling card of Huai’an that, combined with the city’s characteristics, improves Huai’an’s internal cohesion and external reputation.
{"title":"The Application of Digital Virtual Man in the Cultural Communication of Grand Canal — Taking “Huai Xiaobu” and “Huai Xiaomei” as Examples","authors":"Gao Xiang, Mu Xiaomin, Qin Zhuangyan, Seo Eun Kyeong, Wu Qitao, Deng Bangkun","doi":"10.1109/ICVR57957.2023.10169666","DOIUrl":"https://doi.org/10.1109/ICVR57957.2023.10169666","url":null,"abstract":"This paper aims to study the official virtual characters “ Huai Xiaobu” and “ Huai Xiaomei” in Huai’an City, how to rely on virtual reality technology to enable canal culture transmission. Under the background of metauniverse, Huai’an Canal culture transmission as the entry point, through the analysis and induction of virtual reality technology, first of all, on the basis of traditional IP, the use of 3D design software to further digital virtual people and VR design creation. Based on the image of “Huai Xiaobu” and “Huai Xiaomei”, a series of derivative design, virtual image and reality combined video design, the virtual character “Huai Xiaobu” communication form is divided into virtual reality, augmented reality, mixed reality three case forms to achieve, further enhance the communication power and influence of Huai’an new media. This article introduces the basic principles of various methods, overcome the specific technology of the case, compare and summarize the advantages and disadvantages of various forms, and finally achieve the promotion of the canal cultural information transmission rate, to meet the emerging needs of the audience, it is a highly comprehensive city history and humanity, not only can directly express the cultural and economic characteristics of Huai’an, broaden the depth and breadth of communication, Cooperatively build multi-level and multi-dimensional communication ecological pattern, but also can become the name card of Huai’an, combined with the characteristics of Huai’an, improve the internal cohesion of Huai’an, external reputation.","PeriodicalId":439483,"journal":{"name":"2023 9th International Conference on Virtual Reality (ICVR)","volume":"228 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-05-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122150887","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Chinese Traditional Wheelbarrow Restoration and Game Design Based on Virtual Reality Technology
Pub Date: 2023-05-12 | DOI: 10.1109/ICVR57957.2023.10169813
Yin Guojun, Fan Jinyu, Liu Yang, Li Xin
The digital protection of the cultural heritage of traditional agricultural and water-conservancy equipment, as well as the integration and dissemination of culture and tourism, are important issues currently facing the relevant academic circles. Taking the representative traditional Chinese wheelbarrow as the research object, this paper proposes a path for cultural heritage protection and inheritance through historical research, design analysis and virtual reality technology development. That is, the shape, function, material, technology and working principle of the artifacts are verified through historical methods, and their original appearance is restored by combining 3D digital scanning and modeling. On this basis, an immersive interactive game platform is built with a virtual reality engine, realizing a practical exploration of interdisciplinary digital protection of cultural heritage and the integrated transmission of culture and tourism.
{"title":"Chinese Traditional Wheelbarrow Restoration and Game Design Based on Virtual Reality Technology","authors":"Yin Guojun, Fan Jinyu, Liu Yang, Li Xin","doi":"10.1109/ICVR57957.2023.10169813","DOIUrl":"https://doi.org/10.1109/ICVR57957.2023.10169813","url":null,"abstract":"The digital protection of the cultural heritage of traditional agriculture and water conservancy equipment, as well as the integration and dissemination of culture and tourism, are important issues facing the relevant academic circles at present. Taking the representative Chinese traditional wheelbarrow as the research object, this paper proposes a cultural heritage protection and inheritance path through historical research, design analysis and virtual reality technology development. That is, the shape, function, material, technology and principle of the artifacts are verified by historical methods, and the original appearance is restored by combining 3D digital scanning and modeling. On this basis, the immersive interactive experience game platform is built by virtual reality engine, so as to realize the effective practical exploration of interdisciplinary digital protection of cultural heritage and the integrated transmission of culture and tourism","PeriodicalId":439483,"journal":{"name":"2023 9th International Conference on Virtual Reality (ICVR)","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-05-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114801683","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Impact of Physical Tool Designs on User Embodiment of Tools in Virtual Reality
Pub Date: 2023-05-12 | DOI: 10.1109/ICVR57957.2023.10169686
Jingjing Zhang, Chuanzhi Su, Mengjie Huang, Liwen Liang, Rui Yang
Virtual reality (VR) users typically interact with the virtual world in two main ways: through hand-held controllers or body motion tracking. With the development of tracking techniques, it is a promising direction for designers and developers to enrich interaction by combining physical tool designs with trackers in VR. The virtual representation of a tool can be varied in VR applications, but the real-time haptic interaction that physical tools provide is usually limited. Little is known about users’ embodied perception of tools in VR, and research has not yet determined how different physical tool designs affect users’ sense of tool embodiment. This research explores the effect of physical tools on tool embodiment in VR. Six physical tools with different hand-grasping gestures were designed and applied in the user study experiment. The findings reveal that the one physical tool that differed significantly from its virtual representation was rated lowest for self-reported tool embodiment and received negative feedback from users. This study offers design recommendations for physical tools with trackers for VR applications and expands existing research on tool embodiment.
{"title":"Impact of Physical Tool Designs on User Embodiment of Tools in Virtual Reality","authors":"Jingjing Zhang, Chuanzhi Su, Mengjie Huang, Liwen Liang, Rui Yang","doi":"10.1109/ICVR57957.2023.10169686","DOIUrl":"https://doi.org/10.1109/ICVR57957.2023.10169686","url":null,"abstract":"Virtual reality (VR) users typically interact with the virtual world in two main ways: through the hand-held controller or body motion tracking. With the development of tracker technique, it is a promising direction for designers and developers to enrich the interaction by adding more possibilities to physical tool designs combined with the tracker in VR. The virtual representation of the tool can be varied in VR applications, but the real-time haptic interaction that physical tools provide is usually limited. Little is known about users’ embodied perception of the tool in VR, and research has not yet determined how different physical tool designs affect users’ sense of tool embodiment. This research explores the effect of physical tools on tool embodiment in VR. Six physical tools with different hand grasping gestures were designed and applied in the experiment of the user study. The findings reveal that one physical tool significantly differed from the virtual representation was rated the lowest self-reported tool embodiment and negative feedback from users. This study offers the design recommendation for physical tools with trackers for VR applications and expands existing research on tool embodiment.","PeriodicalId":439483,"journal":{"name":"2023 9th International Conference on Virtual Reality (ICVR)","volume":"42 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-05-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114834067","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
AR-Based Surgical Navigation System Based on Dual Plane Calibration and 2D/3D Registration
Pub Date: 2023-05-12 | DOI: 10.1109/ICVR57957.2023.10169295
Li Yin, Hang Fu, Demin Yang, Xingqi Fan, Puxun Tu, Xiaojun Chen
Surgical procedures are often associated with a range of potential risks, including nerve tissue damage resulting from complex lesion structures, narrow surgical spaces, or limited accuracy on the part of the operating surgeon. The rise of augmented reality technology has attracted wide attention in assistive medical applications: a safe, low-latency augmented-reality-assisted surgery system can effectively guide accurate operation, and augmented reality can make surgical sites more intuitive to view. In this work, we propose a novel surgical navigation system that leverages coordinate system transformation and optical navigation to improve precision and safety. Our approach integrates medical images and optical tracking devices before and during surgery to achieve coordinate transformation between the patient’s position, the surgical instruments, and the pre-surgery CT scan; this process involves calibration, registration, and other critical procedures. The study also introduces an augmented reality function: doctors wearing augmented reality glasses can use the system to see more vivid views of the surgical scene. In the calibration phase, we use biplanar and interpolation algorithms to obtain the conversion relationship between the C-arm emission source and the X-ray image coordinate system. In the registration stage, we convert the preoperative 3D CT into a DRR image using digitally reconstructed radiograph technology and register it with the intraoperative 2D X-ray, employing an improved Powell algorithm to optimize the iteration. The algorithm achieves average angle and position errors of 0.62° and 0.56 mm, respectively. Through the transformation of spatial position relationships, the transformation between the preoperative CT and the intraoperative patient position is obtained. A user-friendly, integrated software system has been developed that allows users to view the relative relationship between surgical instruments and patient position in the CT in real time during surgery. An augmented reality module has also been introduced, allowing operators to see the relevant surgical models in the glasses. This study realizes the establishment and model testing of the algorithm, providing auxiliary functions for high-precision surgical operations.
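The intensity-based 2D/3D registration step described above can be illustrated with a minimal sketch: a pose is iterated with a Powell-type optimizer until the rendered DRR matches the intraoperative X-ray. This is an assumption-laden illustration, not the authors' implementation; `render_drr` is a hypothetical projector stand-in, and plain normalized cross-correlation is used in place of whatever cost their improved Powell variant optimizes.

```python
# Minimal sketch of intensity-based 2D/3D registration (illustrative only).
import numpy as np
from scipy.optimize import minimize

def ncc(a, b):
    """Normalized cross-correlation between two images (higher = more similar)."""
    a = (a - a.mean()) / (a.std() + 1e-8)
    b = (b - b.mean()) / (b.std() + 1e-8)
    return float((a * b).mean())

def register(ct_volume, xray, render_drr, init_pose):
    """Optimize a 6-DoF pose (rx, ry, rz, tx, ty, tz) so the DRR matches the X-ray.

    `render_drr(ct_volume, pose)` is a hypothetical function returning a 2D DRR;
    a real system would implement it with digitally reconstructed radiography.
    """
    def cost(pose):
        drr = render_drr(ct_volume, pose)
        return -ncc(drr, xray)          # minimize negative similarity

    result = minimize(cost, np.asarray(init_pose, dtype=float), method="Powell")
    return result.x, -result.fun        # recovered pose and final similarity
```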
{"title":"AR-Based Surgical Navigation System Based on Dual Plane Calibration and 2D/3D Registration","authors":"Li Yin, Hang Fu, Demin Yang, Xingqi Fan, Puxun Tu, Xiaojun Chen","doi":"10.1109/ICVR57957.2023.10169295","DOIUrl":"https://doi.org/10.1109/ICVR57957.2023.10169295","url":null,"abstract":"Surgical procedures are often associated with a range of potential risks, including nerve tissue damage resulting from complex lesion structures, narrow surgical spaces, or the low accuracy of the operating surgeon. The rise of augmented reality technology has attracted wide attention in the field of complementary medicine. The safe and low delay augmented-reality assisted surgery system can effectively guide the accurate operation. At the same time, augmented reality technology can make surgical sites more intuitive. In this work, we propose a novel surgical operating system that leverages coordinate system transformation and optical navigation to improve precision and safety. Our approach involves the integration of medical images and optical tracking devices before and during surgery to achieve coordinate transformation between the patient’s position, surgical instruments, and the pre-surgery CT scan. This process involves calibration, registration, and other critical procedures. At the same time, the study also introduced the function of augmented reality. Doctors wearing augmented reality glasses can use the system to see more vivid scenes of surgery. During the calibration phase, we utilize biplanar and interpolation algorithms to obtain the conversion relationship between the C-arm emission source and the X-ray image coordinate system. In the registration stage, we convert the preoperative 3D CT into a DRR image using digital reconstruction image technology, which we then register with the 2D X-ray during the operation. To optimize the iteration, we employ the improved Powell algorithm. The algorithm demonstrates an average angle and position error of 0.62° and 0.56mm, respectively. Through the transformation of spatial position relationship, the transformation relationship between preoperative CT and intraoperative patient position was obtained. A user-friendly, integrated software system has been developed that allows users to view the relative relationship between surgical instruments and patient position in CT in real time during surgery. An augmented reality module has also been introduced, allowing operators to see specific surgical models in specific glasses. In this study, the establishment and model test of the algorithm were realized, thus providing auxiliary functions for high precision surgical operation.","PeriodicalId":439483,"journal":{"name":"2023 9th International Conference on Virtual Reality (ICVR)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-05-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130576025","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Application of Virtual Reality in Dance/Movement Therapy
Pub Date: 2023-05-12 | DOI: 10.1109/ICVR57957.2023.10169506
Alyssa Bigbee
This article presents an informed practice and approach to the application of virtual reality (VR) in Dance/Movement Therapy (DMT) within a telehealth setting, exploring how VR can be incorporated into Dance/Movement Therapy. VR has been shown to be effective in other forms of therapy, including exposure therapy, music therapy, and art therapy; however, no existing literature shows how effective it can be in Dance/Movement Therapy. As the use of telehealth continues to increase, there is a need for interventions that are engaging and offer opportunities for connection between clinician and patient. This practice was developed at a university in a large city for adolescents aged 18-25 during the COVID-19 pandemic shutdown and continues to be implemented in virtual DMT. The approach explores which theoretical approaches and existing DMT interventions could be used in VR within a telehealth experience. Current literature was reviewed on relevant topics, including virtual reality history and design, therapeutic VR interventions, medical applications, and VR in the creative arts therapies. Because there are no existing VR applications for Dance/Movement Therapy, existing applications such as YouTube VR, In Mind VR, vTime XR, and Rec Room were used during this practice. Dance/Movement Therapy VR interventions were provided over a six-week period to three participants in accordance with their presenting concerns. Themes were extracted and discussed for future applications. The results indicated potential interventions for addressing grief, depression, and chronic pain, and participants felt more engaged when VR and Dance/Movement Therapy were combined in sessions. There were also opportunities to enhance exposure therapy by incorporating DMT interventions to address symptoms produced by anxiety-inducing phobias.
{"title":"Application of Virtual Reality in Dance/Movement Therapy","authors":"Alyssa Bigbee","doi":"10.1109/ICVR57957.2023.10169506","DOIUrl":"https://doi.org/10.1109/ICVR57957.2023.10169506","url":null,"abstract":"This article is an informed practice and approach to the application of virtual reality (VR) in Dance/Movement Therapy (DMT) within a telehealth setting. It explores how VR can be incorporated into Dance/Movement Therapy. VR has been proven to be affective in other forms of therapy including: exposure therapy, music therapy, and art therapy. However, there is no existing literature showing how effective it can be in Dance/Movement Therapy. As the use of telehealth continues to increase, there is a need for interventions that are engaging and offering opportunities for connection between clinicians and the patient. This practice was developed at a university in a large city for adolescents aged 18-25 during the COVID-19 pandemic shut down and continues to be implemented in virtual DMT. This approach explores what theoretical approaches and existing DMT interventions could be used in VR within a telehealth experience. Current literature was reviewed from relevant topics including virtual reality history & design, therapeutic VR interventions, medical applications, and VR in creative arts therapies. Existing VR applications such as YouTube VR, In Mind VR, vTime XR, and Rec Room were used during this practice as there is no existing VR applications for Dance/Movement Therapy. Dance/Movement Therapy VR interventions were provided over a six-week period to three participants in accordance to their presenting concerns. Themes were extracted and discussed for future applications. The results displayed potential interventions for addressing grief, depression, and chronic pain. Participants also felt more engaged using the VR & Dance/Movement Therapy in sessions. There were also opportunities to enhance exposure therapy by incorporating DMT interventions to address symptoms produced by anxiety inducing phobias.","PeriodicalId":439483,"journal":{"name":"2023 9th International Conference on Virtual Reality (ICVR)","volume":"49 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-05-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126321380","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Dense Voxel 3D Reconstruction Using a Monocular Event Camera
Pub Date: 2023-05-12 | DOI: 10.1109/ICVR57957.2023.10169359
Haodong Chen, Yuk Ying Chung, Li Tan, Xiaoming Chen
Event cameras are sensors inspired by biological systems that specialize in capturing changes in brightness. These emerging cameras offer many advantages over conventional frame-based cameras, including high dynamic range, high frame rates, and extremely low power consumption. Due to these advantages, event cameras have increasingly been adopted in various fields, such as frame interpolation, semantic segmentation, odometry, and SLAM. However, their application to 3D reconstruction for VR is underexplored. Previous methods in this field have mainly focused on 3D reconstruction through depth map estimation. Methods that produce dense 3D reconstructions generally require multiple cameras, while methods that use a single event camera can only produce semi-dense results. Other single-camera methods that can produce dense 3D reconstructions rely on pipelines that incorporate either the aforementioned methods or existing Structure from Motion (SfM) or Multi-View Stereo (MVS) methods. In this paper, we propose a novel approach to dense 3D reconstruction using only a single event camera; to the best of our knowledge, this work is the first attempt in this regard. Our preliminary results demonstrate that the proposed method can produce visually distinguishable dense 3D reconstructions directly, without requiring pipelines like those used by existing methods. Additionally, we have created a synthetic dataset with 39,739 object scans using an event camera simulator, which will help accelerate other relevant research in this field.
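For context on the input data this abstract refers to, an event stream is a list of (x, y, timestamp, polarity) tuples, and a common way to turn it into a dense tensor (e.g. for a learning-based reconstruction model) is a temporal voxel grid. The sketch below is a generic illustration of that standard representation, not the authors' reconstruction method.

```python
# Generic event-to-voxel-grid conversion (illustrative; not the paper's method).
import numpy as np

def events_to_voxel_grid(events, num_bins, height, width):
    """Accumulate events into a (num_bins, height, width) voxel grid.

    `events` is a dict of equal-length arrays: pixel coordinates 'x' and 'y',
    timestamps 't', and polarities 'p' in {+1, -1}. Each event's polarity is
    split linearly between the two nearest temporal bins.
    """
    grid = np.zeros((num_bins, height, width), dtype=np.float32)
    x = np.asarray(events["x"], dtype=int)
    y = np.asarray(events["y"], dtype=int)
    t = np.asarray(events["t"], dtype=float)
    p = np.asarray(events["p"], dtype=float)

    # Normalize timestamps to the range [0, num_bins - 1].
    t_norm = (t - t.min()) / max(t.max() - t.min(), 1e-9) * (num_bins - 1)
    lower = np.floor(t_norm).astype(int)
    upper = np.clip(lower + 1, 0, num_bins - 1)
    frac = t_norm - lower

    # Accumulate signed polarities, bilinearly weighted in time.
    np.add.at(grid, (lower, y, x), p * (1.0 - frac))
    np.add.at(grid, (upper, y, x), p * frac)
    return grid
```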
{"title":"Dense Voxel 3D Reconstruction Using a Monocular Event Camera","authors":"Haodong Chen, Yuk Ying Chung, Li Tan, Xiaoming Chen","doi":"10.1109/ICVR57957.2023.10169359","DOIUrl":"https://doi.org/10.1109/ICVR57957.2023.10169359","url":null,"abstract":"Event cameras are sensors inspired by biological systems that specialize in capturing changes in brightness. These emerging cameras offer many advantages over conventional frame-based cameras, including high dynamic range, high frame rates, and extremely low power consumption. Due to these advantages, event cameras have increasingly been adapted in various fields, such as frame interpolation, semantic segmentation, odometry, and SLAM. However, their application in 3D reconstruction for VR applications is underexplored. Previous methods in this field mainly focused on 3D reconstruction through depth map estimation. Methods that produce dense 3D reconstruction generally require multiple cameras, while methods that utilize a single event camera can only produce a semi-dense result. Other single-camera methods that can produce dense 3D reconstruction rely on creating a pipeline that either incorporates the aforementioned methods or other existing Structure from Motion (SfM) or Multi-view Stereo (MVS) methods. In this paper, we propose a novel approach for solving dense 3D reconstruction using only a single event camera. To the best of our knowledge, our work is the first attempt in this regard. Our preliminary results demonstrate that the proposed method can produce visually distinguishable dense 3D reconstructions directly without requiring pipelines like those used by existing methods. Additionally, we have created a synthetic dataset with 39, 739 object scans using an event camera simulator. This dataset will help accelerate other relevant research in this field.","PeriodicalId":439483,"journal":{"name":"2023 9th International Conference on Virtual Reality (ICVR)","volume":"41 5","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-05-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120990362","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}