In ball games, it is important that players are able to estimate the positions of the other players from a bird's-eye view based on the information obtained from their first-person view. We have developed a training system for improving this ability. The user wears a head-mounted display and can simulate ball games in 360° from the first-person view. The system allows the user to rearrange all players and the ball from the bird's-eye view. The user can then track the other players from the first-person viewpoint and perform actions specific to the ball game, such as passing, receiving the ball, and (when playing defense) following offensive players.
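Purely as an illustrative sketch (not the authors' implementation), the core geometric step of such a system can be thought of as converting bird's-eye player positions into distances and bearings relative to the tracked user's first-person pose; the function name and values below are hypothetical.

```python
import math

def to_first_person(observer_xy, observer_yaw_deg, target_xy):
    """Convert a bird's-eye (top-down) position into (distance, relative bearing)
    as seen from the observer's first-person viewpoint.

    observer_yaw_deg: heading of the observer on the pitch, 0 deg = +x axis.
    With x right and y up in the top-down view, a positive bearing is to the left.
    """
    dx = target_xy[0] - observer_xy[0]
    dy = target_xy[1] - observer_xy[1]
    distance = math.hypot(dx, dy)
    absolute_bearing = math.degrees(math.atan2(dy, dx))
    relative = (absolute_bearing - observer_yaw_deg + 180.0) % 360.0 - 180.0
    return distance, relative

# Hypothetical bird's-eye arrangement: user at midfield facing +x,
# a teammate up-field and slightly to the left.
user_pos, user_yaw = (0.0, 0.0), 0.0
teammate = (10.0, 3.0)
dist, bearing = to_first_person(user_pos, user_yaw, teammate)
print(f"teammate: {dist:.1f} m away, {bearing:.1f} deg to the left")
```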
{"title":"Sports Training System for Visualizing Bird's-Eye View from First-Person View","authors":"Shunki Shimizu, Kaoru Surni","doi":"10.1109/VR.2019.8798227","DOIUrl":"https://doi.org/10.1109/VR.2019.8798227","url":null,"abstract":"In ball games, it is important that the players are able to estimate the position of the other players from a bird's-eye view based on the information obtained from their first-person view. We have developed a training system for improving this ability. The user wears a head-mounted display and can simulate ball games in 360° from the first-person view. The system allows the user to rearrange all players, and a ball from the bird's-eye view. The user can then track the other players from the first-person viewpoint and perform actions specific to the ball game such as passing, receiving a ball, and (if a defense player) following offense players.","PeriodicalId":315935,"journal":{"name":"2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"61 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128145025","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Hyeongyeop Kang, Geonsun Lee, D. Kang, O. Kwon, Jun Yeup Cho, Ho-Jung Choi, JungHyun Han
In a cable-driven suspension system developed to simulate the reduced gravity of lunar or Martian surfaces, we propose to manipulate/reduce the physical cues of forward jumps so as to overcome the limited-workspace problem. The physical cues should be manipulated such that the discrepancy from the visual cues provided through the HMD is not noticeable to users. We identified the extent to which forward jumps can be manipulated naturally and combined it with visual gains, which can scale visual cues without being noticed by users. The test results obtained in a prototype application show that both trajectory manipulation and visual gains can be used to overcome the spatial limit. We also investigated the user experience when making very high and far jumps. The results will be helpful in designing astronaut-training systems and various VR entertainment content.
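A minimal sketch of the visual-gain idea described above, assuming a simple linear scaling of the physical forward displacement before rendering (the gain value and the linear model are illustrative assumptions, not the paper's calibrated parameters):

```python
def rendered_forward_offset(physical_offset_m, visual_gain=1.4):
    """Scale the user's physical forward displacement before rendering it
    in the HMD. With gain > 1 the virtual jump lands farther than the
    physical one, letting a short physical jump cover a larger virtual
    distance inside a limited workspace.
    """
    return physical_offset_m * visual_gain

# Hypothetical example: the suspension system lets the user jump 1.5 m
# forward physically; with a visual gain of 1.4 the avatar travels 2.1 m.
physical_jump = 1.5
print(f"virtual landing offset: {rendered_forward_offset(physical_jump):.2f} m")
```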
{"title":"Jumping Further: Forward Jumps in a Gravity-reduced Immersive Virtual Environment","authors":"Hyeongyeop Kang, Geonsun Lee, D. Kang, O. Kwon, Jun Yeup Cho, Ho-Jung Choi, JunaHvun Han","doi":"10.1109/VR.2019.8798251","DOIUrl":"https://doi.org/10.1109/VR.2019.8798251","url":null,"abstract":"In a cable-driven suspension system developed to simulate the reduced gravity of lunar or Martian surfaces, we propose to manipu-late/reduce the physical cues of forward jumps so as to overcome the limited workspace problem. The physical cues should be manipulated in a way that the discrepancy from the visual cues provided through the HMD is not noticeable by users. We identified the extent to which forward jumps can be manipulated naturally. We combined it with visual gains, which can scale visual cues without being noticed by users. The test results obtained in a prototype application show that we can use both trajectory manipulation and visual gains to overcome the spatial limit. We also investigated the user experiences when making significantly high and far jumps. The results will be helpful in designing astronaut-training systems and various VR entertainment content.","PeriodicalId":315935,"journal":{"name":"2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"88 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121498781","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Velko Vechev, J. Zárate, David Lindlbauer, R. Hinchet, H. Shea, Otmar Hilliges
We introduce TacTiles, light (1.8 g), low-power (130 mW), and small form-factor (1 cm³) electromagnetic actuators that can form a flexible haptic array to provide localized tactile feedback. Our novel hardware design uses a custom 8-layer PCB, dampening materials, and asymmetric latching, enabling two distinct modes of actuation: contact and pulse mode. We leverage these modes in Virtual Reality (VR) to render continuous contact with objects and the exploration of object surfaces and volumes with spatial haptic patterns. Results from a series of experiments show that users are able to localize feedback, discriminate between modes with high accuracy, and differentiate objects from haptic surfaces and volumes even without looking at them.
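As a hedged sketch of how the two actuation modes might be selected from a VR contact event (the tile interface, thresholds, and command structure below are hypothetical placeholders rather than the TacTiles driver):

```python
from dataclasses import dataclass

@dataclass
class TileCommand:
    tile_id: int
    mode: str      # "contact" (sustained) or "pulse" (brief impulse)

def render_haptics(penetration_depth_mm, crossed_pattern_edge, tile_id):
    """Pick an actuation mode for one tile of the array.

    contact mode: sustained feedback while the hand rests on a virtual surface.
    pulse mode:   short impulse when sweeping across a spatial pattern edge.
    """
    if crossed_pattern_edge:
        return TileCommand(tile_id, "pulse")
    if penetration_depth_mm > 0.0:
        return TileCommand(tile_id, "contact")
    return None  # no feedback for this tile

# Hypothetical frame: the fingertip rests on a virtual surface over tile 3
# while sweeping over a texture edge under tile 5.
for cmd in (render_haptics(1.2, False, 3), render_haptics(0.5, True, 5)):
    if cmd:
        print(cmd)
```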
{"title":"TacTiles: Dual-Mode Low-Power Electromagnetic Actuators for Rendering Continuous Contact and Spatial Haptic Patterns in VR","authors":"Velko Vechev, J. Zárate, David Lindlbauer, R. Hinchet, H. Shea, Otmar Hilliges","doi":"10.1109/VR.2019.8797921","DOIUrl":"https://doi.org/10.1109/VR.2019.8797921","url":null,"abstract":"We introduce TacTiles, light (1.8g), low-power (130 mW), and small form-factor (1 cm3) electromagnetic actuators that can form a flexible haptic array to provide localized tactile feedback. Our novel hardware design uses a custom 8-layer PCB, dampening materials, and asymmetric latching, enabling two distinct modes of actuation: contact and pulse mode. We leverage these modes in Virtual Reality (VR) to render continuous contact with objects and the exploration of object surfaces and volumes with spatial haptic patterns. Results from a series of experiments show that users are able to localize feedback, discriminate between modes with high accuracy, and differentiate objects from haptic surfaces and volumes even without looking at them.","PeriodicalId":315935,"journal":{"name":"2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"42 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121857755","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Kosuke Hiratani, D. Iwai, Parinya Punpongsanon, Kosuke Sato
Shadowless Projector is a projection mapping system in which shadows (more specifically, umbrae) do not degrade the projected result. Typical shadow removal techniques use multiple overlapping projectors. In this paper, we propose a shadowless projection method with a single projector. Inspired by surgical light systems that do not cast shadows on patients' bodies in clinical practice, we apply a special optical system consisting of methodically positioned vertical mirrors. This optical system works as a large-aperture lens, so it is impossible to block all projected rays with a small object such as a hand. Consequently, only a penumbra is cast, which leads to shadowless projection.
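A small toy model of the geometric argument above, assuming a 1-D extended aperture sampled by discrete rays (not the actual mirror-array optics): with a large, distributed aperture, a small occluder blocks only some of the rays converging on a surface point, so the point falls in penumbra rather than umbra.

```python
import numpy as np

def illumination_fraction(surface_x, aperture_half_width, occluder_x, occluder_half_width,
                          aperture_y=2.0, occluder_y=1.0, n_rays=201):
    """Fraction of rays from a 1-D extended aperture (at height aperture_y)
    that reach a surface point at (surface_x, 0) despite a small occluder
    segment at height occluder_y. 0.0 = umbra, 1.0 = fully lit.
    """
    sources = np.linspace(-aperture_half_width, aperture_half_width, n_rays)
    # x-coordinate where each ray crosses the occluder plane
    t = (aperture_y - occluder_y) / aperture_y
    crossing_x = sources + (surface_x - sources) * t
    blocked = np.abs(crossing_x - occluder_x) <= occluder_half_width
    return 1.0 - blocked.mean()

# A 5 cm-wide "hand" held in front of the same surface point:
print("small aperture:", illumination_fraction(0.0, 0.03, 0.0, 0.025))  # -> 0.0 (umbra)
print("large aperture:", illumination_fraction(0.0, 0.50, 0.0, 0.025))  # -> ~0.9 (penumbra only)
```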
{"title":"Shadowless Projector: Suppressing Shadows in Projection Mapping with Micro Mirror Array Plate","authors":"Kosuke Hiratani, D. Iwai, Parinya Punpongsanon, Kosuke Sato","doi":"10.1109/VR.2019.8798245","DOIUrl":"https://doi.org/10.1109/VR.2019.8798245","url":null,"abstract":"Shadowless Projector is projection mapping system in which a shadow (more specifically, umbra) does not suffer the projected result. A typical shadow removal technique used a multiple overlapping projection system. In this paper, we propose a shadow-less projection method with single projector. Inspired by a surgical light system that does not cast shadows on patients' bodies in clinical practice, we apply a special optical system that consists of methodically positioned vertical mirrors. This optical system works as a large aperture lens, it is impossible to block all projected ray by a small object such as a hand. Consequently, only penumbra is caused, which leads to a shadow-less projection.","PeriodicalId":315935,"journal":{"name":"2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"52 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116993867","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Sinhwa Kang, Jake Chanenson, P. Ghate, Peter Cowal, M. Weaver, D. Krum
Virtual reality (VR) has been widely utilized for training and education because of its pedagogical, safety, and economic benefits. The investigation of moral judgment is a particularly interesting training-related VR application. For this study, we designed a within-subject experiment manipulating the role of study participants in a Trolley Dilemma scenario: either victim or driver. We conducted a pilot study with four participants and describe preliminary results and implications in this poster.
{"title":"Advancing Ethical Decision Making in Virtual Reality","authors":"Sinhwa Kang, Jake Chanenson, P. Ghate, Peter Cowal, M. Weaver, D. Krum","doi":"10.1109/VR.2019.8798151","DOIUrl":"https://doi.org/10.1109/VR.2019.8798151","url":null,"abstract":"Virtual reality (VR) has been widely utilized for training and education purposes because of pedagogical, safety, and economic benefits. The investigation of moral judgment is a particularly interesting VR application, related to training. For this study, we designed a within-subject experiment manipulating the role of study participants in a Trolley Dilemma scenario: either victim or driver. We conducted a pilot study with four participants and describe preliminary results and implications in this poster.","PeriodicalId":315935,"journal":{"name":"2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"3 3 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123730757","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Brian M. Williamson, E. Taranta, Pat Garrity, R. Sottilare, J. Laviola
Accurate tracking of a user in a marker-less environment can be difficult, even more so when agile head or hand movements are expected. When relying on feature detection as part of a SLAM algorithm, a large rotational delta causes previously tracked features to be lost. One approach to overcoming this problem is to use multiple sensors to increase the horizontal field of view. In this paper, we perform a systematic evaluation of tracking accuracy by recording several agile movements and evaluating different camera configurations against them. We begin with four sensors in a square configuration and test the resulting output of a chosen SLAM algorithm. We then systematically remove cameras from the feed, covering all permutations, to determine the level of accuracy and tracking loss. We cover some of the lessons learned in this preliminary experiment and how they may guide researchers in tracking extremely agile movements.
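A hedged sketch of the evaluation protocol described above; the camera labels, error metric, and run_slam placeholder are assumptions, not the authors' pipeline. The idea is to enumerate every camera subset, run the SLAM system on the corresponding feeds, and record accuracy and tracking loss per configuration.

```python
from itertools import combinations

CAMERAS = ("front", "right", "back", "left")   # hypothetical square rig

def run_slam(active_cameras, recording):
    """Placeholder: feed only the selected cameras to the SLAM algorithm
    under test and return (mean_position_error_m, tracking_loss_events)."""
    raise NotImplementedError("plug in the SLAM system under test here")

def evaluate_all_configurations(recording):
    """Evaluate every non-empty camera subset on one recorded agile movement."""
    results = {}
    for k in range(1, len(CAMERAS) + 1):
        for subset in combinations(CAMERAS, k):
            results[subset] = run_slam(subset, recording)
    return results

if __name__ == "__main__":
    configs = [s for k in range(1, len(CAMERAS) + 1) for s in combinations(CAMERAS, k)]
    print(f"{len(configs)} camera configurations to evaluate")  # 15 for a four-camera rig
```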
{"title":"A Systematic Evaluation of Multi-Sensor Array Configurations for SLAM Tracking with Agile Movements","authors":"Brian M. Williamson, E. Taranta, Pat Garrity, R. Sottilare, J. Laviola","doi":"10.1109/VR.2019.8798007","DOIUrl":"https://doi.org/10.1109/VR.2019.8798007","url":null,"abstract":"Accurate tracking of a user in a marker-less environment can be difficult, even more so when agile head or hand movements are expected. When relying on feature detection as part of a SLAM algorithm the issue arises that a large rotational delta causes previously tracked features to become lost. One approach to overcome this problem is with multiple sensors increasing the horizontal field of view. In this paper, we perform a systematic evaluation of tracking accuracy by recording several agile movements and providing different camera configurations to evaluate against. We begin with four sensors in a square configuration and test the resulting output from a chosen SLAM algorithm. We then systematically remove a camera from the feed covering all permutations to determine the level of accuracy and tracking loss. We cover some of the lessons learned in this preliminary experiment and how it may guide researchers in tracking extremely agile movements.","PeriodicalId":315935,"journal":{"name":"2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"82 11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123421031","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
This paper presents an experiment assessing the feeling of spatial presence in both real and remote environments (respectively, the so-called "natural presence" and "telepresence"). Twenty-eight (28) participants performed a 3D-pointing task while located in a real office and in the same office remotely rendered through an HMD. Spatial presence was evaluated by means of the ITC-SOPI questionnaire and an analysis of users' behaviour (head trajectories during the task). The analysis also included the effect of different levels of immersion of the system (visual-only versus visual and audio rendering) in such environments. The results show a higher sense of spatial presence for the remote condition, regardless of the degree of immersion, and for the "visual and audio" condition, regardless of the environment. Additionally, trajectory analysis of users' heads reveals that participants behaved similarly in both environments.
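As an illustrative sketch of one way head-trajectory behaviour could be compared across the two environments (the path-length metric, the paired t-test, and the synthetic data are assumptions, not the authors' analysis):

```python
import numpy as np
from scipy import stats

def path_length(head_positions):
    """Total distance travelled by the head, given an (N, 3) array of
    positions sampled during the pointing task."""
    return np.linalg.norm(np.diff(head_positions, axis=0), axis=1).sum()

# Hypothetical per-participant head traces for the real and remote conditions.
rng = np.random.default_rng(0)
real_lengths   = [path_length(rng.normal(0, 0.05, size=(500, 3)).cumsum(axis=0))
                  for _ in range(28)]
remote_lengths = [path_length(rng.normal(0, 0.05, size=(500, 3)).cumsum(axis=0))
                  for _ in range(28)]

# Paired comparison across the 28 participants (within-subject design).
t, p = stats.ttest_rel(real_lengths, remote_lengths)
print(f"t = {t:.2f}, p = {p:.3f}")
```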
{"title":"Spatial Presence in Real and Remote Immersive Environments","authors":"Nawel Khenak, J. Vézien, David Thery, P. Bourdot","doi":"10.1109/VR.2019.8797801","DOIUrl":"https://doi.org/10.1109/VR.2019.8797801","url":null,"abstract":"This paper presents an experiment assessing the feeling of spatial presence in both real and remote environments (respectively the socalled “natural presence” and “telepresence”). Twenty-eight (28) participants performed a 3D-pointing task while being located in a real office and the same office remotely rendered over HMD. The spatial presence was evaluated by means of the ITC-SOPI questionnaire and users' behaviour analysis (trajectories of head during the task). The analysis also included the effect of different levels of immersion of the system - visual-only versus visual and audio - rendering in such environments. The results show a higher sense of spatial presence for the remote condition, regardless of the degree of immersion, and for the “visual and audio” condition regardless of the environment. Additionally, trajectory analysis of users' heads reveals that participants behaved similarly in both environments.","PeriodicalId":315935,"journal":{"name":"2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128883562","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
David Heidrich, S. Oberdörfer, Marc Erich Latoschik
Slot machines are among the games most played by pathological gamblers. New technologies, e.g., immersive Virtual Reality (VR), offer more possibilities to exploit erroneous beliefs in the context of gambling. However, the risk potential of VR-based gambling has not yet been researched. Higher immersion might increase harmful aspects, making VR realizations more dangerous. Measuring harm-inducing factors reveals the risk potential of virtual gambling. In a user study, we analyze a slot machine realized as a desktop 3D version and as an immersive VR version. Both versions are compared with respect to their effects on dissociation, urge to gamble, dark flow, and illusion of control. Our study shows significantly higher values of dissociation, dark flow, and urge to gamble in the VR version. Presence significantly correlates with all measured harm-inducing factors. We demonstrate that VR-based gambling has a higher risk potential, which underlines the importance of regulating it.
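A minimal sketch of the kind of correlation analysis mentioned above, using synthetic scores and Pearson's r purely for illustration (the study's actual measures and statistics may differ):

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n = 30                                        # hypothetical sample size
presence = rng.uniform(1, 7, n)               # e.g. presence questionnaire scores
harm_factors = {
    "dissociation":   presence * 0.6 + rng.normal(0, 0.8, n),
    "urge_to_gamble": presence * 0.5 + rng.normal(0, 0.9, n),
    "dark_flow":      presence * 0.7 + rng.normal(0, 0.7, n),
}

for name, scores in harm_factors.items():
    r, p = pearsonr(presence, scores)
    print(f"presence vs {name:14s} r = {r:.2f}, p = {p:.3f}")
```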
{"title":"The Effects of Immersion on Harm-inducing Factors in Virtual Slot Machines","authors":"David Heidrich, S. Oberdörfer, Marc Erich Latoschik","doi":"10.1109/VR.2019.8798021","DOIUrl":"https://doi.org/10.1109/VR.2019.8798021","url":null,"abstract":"Slot machines are one of the most played games by pathological gamblers. New technologies, e.g. immersive Virtual Reality (VR), offer more possibilities to exploit erroneous beliefs in the context of gambling. However, the risk potential of VR-based gambling has not been researched, yet. A higher immersion might increase harmful aspects, thus making VR realizations more dangerous. Measuring harm-inducing factors reveals the risk potential of virtual gambling. In a user study, we analyze a slot machine realized as a desktop 3D and as an immersive VR version. Both versions are compared in respect to effects on dissociation, urge to gamble, dark flow, and illusion of control. Our study shows significantly higher values of dissociation, dark flow, and urge to gamble in the VR version. Presence significantly correlates with all measured harm-inducing factors. We demonstrate that VR-based gambling has a higher risk potential. This creates the importance of regulating VR-based gambling.","PeriodicalId":315935,"journal":{"name":"2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"48 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124806609","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Coretet is a virtual reality instrument that explores the translation of performance gestures and mechanics from traditional bowed string instruments into an inherently non-physical implementation. Built using the Unreal Engine 4 and Pure Data, Coretet offers musicians a flexible and articulate musical instrument to play, as well as a networked performance environment capable of supporting and presenting a traditional four-member string quartet. This paper discusses the technical implementation of Coretet and explores the musical and performative possibilities opened up by translating physical instrument design into virtual reality.
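Coretet itself is built in Unreal Engine 4 with Pure Data handling sound; purely as a hedged illustration of the kind of networked bow-gesture message such an instrument might exchange, the sketch below uses the python-osc package, with a hypothetical address pattern, port, and parameter set.

```python
from pythonosc.udp_client import SimpleUDPClient

# Hypothetical sound-engine endpoint (e.g. a Pure Data patch listening for OSC).
client = SimpleUDPClient("127.0.0.1", 9000)

def send_bow_stroke(string_index, bow_position, bow_pressure, bow_velocity):
    """Send one frame of bow-gesture data for a virtual string."""
    client.send_message("/coretet/bow", [string_index, bow_position,
                                         bow_pressure, bow_velocity])

# One frame: bowing string 2 near the bridge with moderate pressure.
send_bow_stroke(2, 0.85, 0.4, 0.12)
```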
{"title":"Coretet: A 21st Century Virtual Reality Musical Instrument for Solo and Networked Ensemble Performance","authors":"Rob Hamilton","doi":"10.1109/VR.2019.8797825","DOIUrl":"https://doi.org/10.1109/VR.2019.8797825","url":null,"abstract":"Coretet is a virtual reality instrument that explores the translation of performance gesture and mechanic from traditional bowed string instruments into an inherently non-physical implementation. Built using the Unreal Engine 4 and Pure Data, Coretet offers musicians a flexible and articulate musical instrument to play as well as a networked performance environment capable of supporting and presenting a traditional four-member string quartet. This paper discusses the technical implementation of Coretet and explores the musical and performative possibilities through the translation of physical instrument design into virtual reality.","PeriodicalId":315935,"journal":{"name":"2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"416 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124176012","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Immersive technologies have the potential to overcome physical limitations and virtually deliver field site experiences, for example, into the classroom. Yet, little is known about the features of immersive technologies that contribute to successful place-based learning. Immersive technologies afford embodied experiences by mimicking natural embodied interactions through a user's egocentric perspective. Additionally, they allow for beyond-reality experiences integrating contextual information that cannot be provided at actual field sites. The current study singles out one aspect of place-based learning: scale. In an empirical evaluation, scale was manipulated as part of two immersive virtual field trip (iVFT) experiences in order to disentangle its effect on place-based learning. Students either attended an actual field trip (AFT) or experienced one of two iVFTs using a head-mounted display. The iVFTs either mimicked the actual field trip or provided beyond-reality experiences offering access to the field site from an elevated perspective using pseudo-aerial 360° imagery. Results show that students with access to the elevated perspective had significantly better scores, for example, on their spatial situation model (SSM). Our findings provide initial evidence of how an increased (geographic) scale, made accessible through an elevated perspective, boosts the development of SSMs. The reported study is part of a larger immersive education effort. Encouraged by the positive results, we discuss our plan for a more rigorous assessment of scale effects on both self-assessed and objectively assessed measures of spatial learning.
{"title":"Scale - Unexplored Opportunities for Immersive Technologies in Place-based Learning","authors":"Jiayan Zhao, A. Klippel","doi":"10.1109/VR.2019.8797867","DOIUrl":"https://doi.org/10.1109/VR.2019.8797867","url":null,"abstract":"Immersive technologies have the potential to overcome physical limitations and virtually deliver field site experiences, for example, into the classroom. Yet, little is known about the features of immersive technologies that contribute to successful place-based learning. Immersive technologies afford embodied experiences by mimicking natural embodied interactions through a user's egocentric perspective. Additionally, they allow for beyond reality experiences integrating contextual information that cannot be provided at actual field sites. The current study singles out one aspect of place-based learning: Scale. In an empirical evaluation, scale was manipulated as part of two immersive virtual field trip (iVFT) experiences in order to disentangle its effect on place-based learning. Students either attended an actual field trip (AFT) or experienced one of two iVFTs using a head-mounted display. The iVFTs either mimicked the actual field trip or provided beyond reality experiences offering access to the field site from an elevated perspective using pseudo-aerial 360° imagery. Results show that students with access to the elevated perspective had significantly better scores, for example, on their spatial situation model (SSM). Our findings provide first results on how an increased (geographic) scale, which is accessible through an elevated perspective, boosts the development of SSMs. The reported study is part of a larger immersive education effort. Inspired by the positive results, we discuss our plan for a more rigorous assessment of scale effects on both self- and objectively assessed performance measures of spatial learning.","PeriodicalId":315935,"journal":{"name":"2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-03-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126826561","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}