eEyes – an Integrated Aid System for the Blind and People with Low Vision
Jinhyun Park, Jun Park. DOI: https://doi.org/10.1145/3359997.3365755
Proceedings of the 17th International Conference on Virtual-Reality Continuum and its Applications in Industry, 14 November 2019.

eEyes is an integrated aid system for people with low vision. It is a wearable device that attaches to glasses and consists of a wearable unit and a self-developed standalone computing unit. eEyes employs Natural User Interface (NUI) technology based on recognition of the user's hand gestures: a user interacts with the system through static hand gestures (open hand, pointing, V sign, etc.). The gesture recognition was developed using skin-color-based techniques. This NUI layer gives users enhanced usability when accessing eEyes's services, and its capabilities can also be applied to other fields such as mobile VR and AR interaction.
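The skin-color-based recognition the abstract mentions can be sketched as a chrominance threshold followed by a coarse classifier. This is a minimal illustration, not eEyes's actual pipeline: the YCbCr ranges are the textbook skin thresholds, and the coverage-based gesture rule is a placeholder for a real contour/finger-counting stage.

```python
import numpy as np

def skin_mask(rgb):
    """Return a boolean mask of likely skin pixels.

    Uses the classic YCbCr chrominance thresholds (Cb in [77, 127],
    Cr in [133, 173]); the exact ranges are illustrative.
    """
    rgb = rgb.astype(np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # ITU-R BT.601 RGB -> YCbCr chrominance components
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return (cb >= 77) & (cb <= 127) & (cr >= 133) & (cr <= 173)

def classify_gesture(mask):
    """Toy static-gesture classifier deciding by skin-pixel coverage.

    A real recognizer would extract the hand contour and count
    fingers; these thresholds are placeholders.
    """
    coverage = mask.mean()
    if coverage > 0.30:
        return "open hand"
    if coverage > 0.10:
        return "pointing"
    return "none"
```

A webcam frame would be passed to `skin_mask` per image; here a flat skin-toned patch classifies as "open hand" while a pure-green patch yields no skin pixels.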
Conformal Redirected Walking for Shared Indoor Spaces
Yashvardhan Tomar, Ayushi Srivastava, Arindam Dey, Ojaswa Sharma. DOI: https://doi.org/10.1145/3359997.3365702

In this work we present a redirected walking scheme suitable for shared spaces in a virtual reality environment. We demonstrate it for the case of two physical spaces, a host and a guest, merged into a single virtual host space. The redirection warps the guest space into the host space using a conformal mapping that preserves local shape and features. We compare our technique with state-of-the-art indoor redirection schemes and show its efficiency: our method yields better task performance, higher social presence, and less simulator sickness.
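The property the warp relies on is that a conformal map preserves local angles, which is why shapes stay locally undistorted. A minimal numerical check, using the analytic map f(z) = z² as a stand-in for the paper's computed guest-to-host mapping:

```python
import numpy as np

def f(z):
    # Any analytic function with nonzero derivative is conformal;
    # z**2 stands in for the actual room-to-room warp.
    return z ** 2

def angle_between(u, v):
    """Signed angle from direction u to direction v (complex plane)."""
    return np.angle(v / u)

def local_angle_after(fn, z0, u, v, h=1e-6):
    """Angle between the images of directions u and v at z0 under fn,
    estimated with finite differences."""
    fu = (fn(z0 + h * u) - fn(z0)) / h
    fv = (fn(z0 + h * v) - fn(z0)) / h
    return angle_between(fu, fv)
```

At any point where f'(z) ≠ 0, two perpendicular walking directions remain perpendicular after the warp, up to finite-difference error.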
An Eye-Tracking Dataset for Visual Attention Modelling in a Virtual Museum Context
Yunzhan Zhou, Tian Feng, Shihui Shuai, Xiangdong Li, Ling-yun Sun, H. Duh. DOI: https://doi.org/10.1145/3359997.3365738

Predicting the user’s visual attention enables a virtual reality (VR) environment to provide a context-aware and interactive user experience. Researchers have attempted to understand visual attention using eye-tracking data in a 2D plane. In this poster, we propose the first 3D eye-tracking dataset for visual attention modelling in the context of a virtual museum. It comprises about 7 million records and may facilitate visual attention modelling in a 3D VR space.
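One common first step in modelling attention from such 3D gaze records is to accumulate gaze hit points into a voxel histogram over the scene. This sketch assumes records reduced to (x, y, z) intersection points; the dataset's actual schema may differ.

```python
import numpy as np

def attention_voxels(points, bounds_min, bounds_max, res=16):
    """Accumulate 3D gaze hit points into a res**3 voxel histogram.

    points: (N, 3) array of gaze intersection points in scene
    coordinates (a hypothetical reduction of the raw records).
    """
    pts = np.asarray(points, dtype=np.float64)
    lo = np.asarray(bounds_min, dtype=np.float64)
    hi = np.asarray(bounds_max, dtype=np.float64)
    # Normalise into [0, 1) and bucket into voxel indices.
    idx = ((pts - lo) / (hi - lo) * res).astype(int)
    idx = np.clip(idx, 0, res - 1)
    grid = np.zeros((res, res, res), dtype=np.int64)
    # Unbuffered accumulation so repeated indices all count.
    np.add.at(grid, (idx[:, 0], idx[:, 1], idx[:, 2]), 1)
    return grid
```

The resulting grid can be normalised into an attention density or used as a training target for a 3D saliency model.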
Water Bodies: VR Interactive Narrative and Gameplay for Social Impact
Kay Vasey, Olivier Bos, Fadhil Nasser, Adeline Tan, Benjamin Li JunTing, Khoo Eng Tat, T. Marsh. DOI: https://doi.org/10.1145/3359997.3365746

In partnership with UN Environment, we produced an interactive and immersive virtual reality (VR) experience that takes participants on a journey through the human stomach to raise awareness of the microplastics we unknowingly consume in daily life. The creative practice and research addresses UN Sustainable Development Goal #12: Responsible Consumption and Production. Through novel, exciting, and engaging interactive narrative and gameplay mechanics, Water Bodies delivers a serious underlying message, encouraging participants to question their use of plastic and to drive towards a positive environmental impact.
Möbiusschleife: Beyond the Bounds of a Closed-Loop VR System
Koki Toda, Sayuki Hayashi. DOI: https://doi.org/10.1145/3359997.3365720

We propose “Möbiusschleife”, a VR system that lets the VR and real worlds interact directly and enlarges the boundary of the VR player’s field. Conventional VR systems generally adopt a head-mounted display (HMD) to present experiences altered from reality, but external audiences cannot share them. In this paper, we present a virtual “window” that provides bidirectional face-to-face interaction between the VR and real worlds through a touch display and a webcam. We also provide a simple stereoscopic display that lets the VR player and objects move as if part of the VR world leaps into the real world. Finally, we integrate all of the above into a simple chat application with a virtual character.
Investigating the use of Different Visual Cues to Improve Social Presence within a 360 Mixed Reality Remote Collaboration
Theophilus Teo, Gun A. Lee, M. Billinghurst, Matt Adcock. DOI: https://doi.org/10.1145/3359997.3365687

In this paper, we investigate the social aspects and effects of various visual cues in a 360 panorama-based Mixed Reality remote collaboration system. We conducted a series of user studies with a prototype system to compare the effects of different combinations of visual cues, including hand gestures, ray pointing, and drawing, in a remote collaborative task. We found that adding ray pointing and drawing can enhance the social aspects of a remote collaborative experience and can also lower the mental effort the task requires. We discuss the findings and suggest directions for future research.
Integrating Mixed Reality and Internet of Things as an Assistive Technology for Elderly People Living in a Smart Home
Ryan Anthony J. de Belen, T. Bednarz, D. D. Favero. DOI: https://doi.org/10.1145/3359997.3365742

In the last few decades, demand has grown significantly for Wearable Assistive Technologies (WATs) that help individuals overcome functional limitations. Although advances in Computer Graphics (CG), Computer Vision (CV), and Artificial Intelligence (AI) have the potential to address a wide range of human needs, fully integrated systems that consider age-related changes in elderly people are still uncommon. In this work, we present a WAT that follows interaction design guidelines to ensure reliability, usability, and suitability for everyday use. The WAT enables elderly people to interact more effectively with Mixed Reality (MR) and Internet of Things (IoT) technologies, assisting them in daily activities such as analysing the environment, recognising and searching for objects, wayfinding, and navigation. We believe this technology can help independent elderly people who are blind, have low vision, or are hearing-impaired to improve their quality of life while maintaining their independence.
Grammar of VR Storytelling: Narrative Immersion and Experiential Fidelity in VR Cinema
Jayesh S. Pillai, Manvi Verma. DOI: https://doi.org/10.1145/3359997.3365680

As the grammar of VR storytelling evolves, we must look beyond the technical capabilities of the medium and the associated perceptual immersion in order to better understand the effect of narrative on users. This paper presents a qualitative analysis of the experience of VR Cinema and the experiencer’s connection to the narrative. The study illustrates the significance of narrative immersion in the 360° medium of storytelling in VR, and examines how the various elements of such a narrative lead to experiential fidelity. We believe the insights gathered will help VR filmmakers create effective narrative experiences.
How to VizSki: Visualizing Captured Skier Motion in a VR Ski Training Simulator
Erwin Wu, Florian Perteneder, H. Koike, Takayuki Nozawa. DOI: https://doi.org/10.1145/3359997.3365698

Alpine ski training is restricted by environmental requirements and by the incremental, cyclical way movement and form are taught. We therefore propose a virtual reality ski training system based on an indoor ski simulator. The system uses two trackers to capture the motion of the skis so that users can control them on a virtual ski slope. For training, we captured the motion of professional athletes and replay it to users to help them improve their skills. In two studies, we explore the utility of visual cues in helping users learn motion patterns from the pro skier effectively, and we examine the impact of feedback on this learning process. This work provides a basis for developing and understanding the possibilities and limitations of VR ski simulators, which have the potential to support skiers in their learning process.
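Comparing a user's captured motion against the pro skier's reference can be sketched as resampling both trajectories to a common length and scoring the per-frame deviation. This RMSE metric is an illustrative stand-in; the paper's actual feedback computation may differ.

```python
import numpy as np

def resample(traj, n):
    """Linearly resample a (T, D) trajectory to n frames."""
    traj = np.asarray(traj, dtype=np.float64)
    t_old = np.linspace(0.0, 1.0, len(traj))
    t_new = np.linspace(0.0, 1.0, n)
    return np.stack([np.interp(t_new, t_old, traj[:, d])
                     for d in range(traj.shape[1])], axis=1)

def motion_error(user, pro, n=100):
    """RMSE between user and reference motion (e.g. tracker
    positions per frame) after resampling both to n frames."""
    u, p = resample(user, n), resample(pro, n)
    return float(np.sqrt(((u - p) ** 2).mean()))
```

A score of zero means the user reproduced the reference exactly; a constant positional offset shows up directly as that offset.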
Situated Storytelling with SLAM Enabled Augmented Reality
Sarah Ketchell, W. Chinthammit, U. Engelke. DOI: https://doi.org/10.1145/3359997.3365681

This paper addresses the feasibility of situated storytelling using Simultaneous Localisation and Mapping (SLAM) enabled augmented reality (AR) on a mobile phone. We focus on storytelling in the heritage context, as it provides a rich environment for stories to be told in. We conducted expert interviews with several museums and heritage sites to identify major themes for storytelling in the heritage context. These themes informed the development of an AR-based storytelling application for mobile phones. We evaluated the application in a user study and gained further insight into the factors users appreciate in AR-based storytelling. From these insights we derive several high-level design guidelines that may inform future system development for situated storytelling, especially in the heritage context.