Guidelines for the Interface Design of AR Systems for Manual Assembly
Nor Farzana Syaza Jeffri, D. R. A. Rambli
DOI: 10.1145/3385378.3385389
Proceedings of the 2020 4th International Conference on Virtual and Augmented Reality Simulations, published 14 February 2020

Augmented Reality (AR) allows relevant in-situ information to be presented in real time, making it an effective assistive tool for supporting and enhancing task performance. Manual assembly is a promising area in which AR can assist users. Findings from user studies evaluating AR-based support systems for manual assembly show that AR can reduce mental workload and improve task performance, and that its usability is strongly influenced by user interface (UI) design. Many prototypes have been developed to identify the optimal information presentation, with each study implementing its own interface design approach. The challenge lies in determining a generalized, standardized approach to interface design: what information should be present, and how should it be presented? This paper reviews the types of visual features implemented in existing systems and how they have assisted users in performing tasks. These two questions guide the discussion, from which a set of guidelines is formulated with the aid of theories from cognitive science. The paper proposes the following guidelines for the interface design of AR systems for manual assembly: include the information present in traditional paper-based instructions; use exogenous cues for the tasks of identifying, locating, and picking parts; and use endogenous cues to guide the assembly of parts. The paper contributes to the design of AR systems for manual assembly by proposing a standardized approach to interface design.
Augmented Reality in Retail - A Case Study: Technology implications to Utilitarian, Aesthetic and Enjoyment values
Nageswaran Vaidyanathan
DOI: 10.1145/3385378.3385383

The emergence of Augmented Reality (AR) in retail to enhance online and offline shopping experiences is fueling new opportunities. Mastercard Labs, with ODG and Qualcomm, implemented a proof of concept in Saks Fifth Avenue retail stores: an AR smart glass with iris authentication and digital wallet integration. The purpose was to understand customers' perceived ease of use of the setup in terms of utilitarian and hedonic values. A qualitative case study was conducted using the Technology Acceptance Model as the theoretical basis, combining the utilitarian and hedonic values associated with the perceived usefulness of this technology. The outcomes demonstrated a place for AR in in-store retail; however, the technology needs to mature to enable frictionless functional integration of capabilities, improved usability of smartglasses, longer battery life, better heat dissipation, and improved network bandwidth.
Factors Affecting Acceptance of a Mobile Augmented Reality Application for Cybersecurity Awareness
Hamed Alqahtani, Manolya Kavakli-Thorne
DOI: 10.1145/3385378.3385382

Human behavior is considered to be the weakest link in the field of cybersecurity. Despite the development of a wide range of Augmented Reality (AR) applications in various domains, no AR application is available to educate users and increase their awareness of cybersecurity issues. We therefore developed an AR-based game as an Android app, called CybAR. Since there have been few acceptance studies in the field of AR, it was particularly important to identify the factors that affect user acceptance of AR technology. Technology acceptance studies typically predict behavioral adoption by investigating the relationship between attitudes and intentions, even though intention may not be the best predictor of actual behavior. Personality constructs and dimensions of cultural difference have recently been found to explain even more variance in behavior and to provide insights into user behavior. The objective of this study is to identify the personality traits that affect users' acceptance of CybAR and increase their cybersecurity awareness. The study also aims to identify cultural factors that influence acceptance of CybAR by comparing Saudi Arabian and Australian users according to Hofstede's cultural value dimensions. The potential predictors of CybAR app usage were thus derived from the extended unified theory of acceptance and use of technology (UTAUT2), personality traits, and cultural moderators.
Virtual Falls: Application of VR in Fall Detection
Vinh T. Bui, Minh Bui
DOI: 10.1145/3385378.3385388

In this paper, we present an innovative application of Virtual Reality in human fall detection. Fall detection is a challenging problem in the public healthcare domain. Despite significant efforts by researchers and engineers to develop reliable and effective fall detection algorithms and devices, success has been limited. The lack of recorded fall data, and the quality of the available data, have been identified as major obstacles. To address this issue, we propose a framework for generating fall data in virtual environments. Our initial results indicate that the virtual fall data generated using the proposed framework are of sufficient quality and could be used to improve fall detection algorithms. Although the proposed approach targets fall detection, it is fully applicable to other domains that require training data.
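The abstract does not detail how virtual fall data are generated or consumed; as a purely illustrative sketch (all names and thresholds here are hypothetical, not the paper's framework), a simulator might emit labeled accelerometer-magnitude traces with the characteristic free-fall dip followed by an impact spike, which a simple detector can then be checked against:

```python
import random

def synthesize_trace(is_fall, n=100, noise=0.05, seed=0):
    """Toy accelerometer-magnitude trace (in g): ~1 g at rest, a
    free-fall dip toward 0 g then an impact spike for falls.
    Illustrative only, not the paper's actual framework."""
    rng = random.Random(seed)
    trace = []
    for i in range(n):
        a = 1.0 + rng.gauss(0, noise)          # gravity + sensor noise
        if is_fall and 40 <= i < 55:
            a = 0.1 + rng.gauss(0, noise)      # free-fall phase (~0 g)
        elif is_fall and 55 <= i < 58:
            a = 3.0 + rng.gauss(0, noise * 6)  # impact spike
        trace.append(a)
    return trace

def detect_fall(trace, low=0.4, high=2.5):
    """Threshold detector: a free-fall dip followed by an impact."""
    dip = next((i for i, a in enumerate(trace) if a < low), None)
    return dip is not None and any(a > high for a in trace[dip:])
```

Synthetic traces like these are cheap to label because the ground truth (fall or not) is known by construction, which is the core advantage the paper claims for virtual data.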
Interaction in Augmented Reality: Challenges to Enhance User Experience
Yahya Ghazwani, Shamus P. Smith
DOI: 10.1145/3385378.3385384

Augmented reality (AR) is an emerging technology that has yet to become a mature consumer product. Given that user experience plays a significant role in the success of new technologies, the development of appropriate AR user interfaces is needed. As such, defining the challenges facing current AR user interfaces is a stepping stone to enhancing user experience. There are three principal components of interaction in AR systems: the user, the user interface, and the virtual content. Examples from the literature are provided to identify the challenges to each interaction component. Both the layout of the AR interaction components and the identified challenges aim to help researchers and AR software developers choose the AR interaction aspects that best serve their purpose, while also contributing to the enhancement of user experience.
An Evaluation of the Effectiveness of Virtual Reality in Air Traffic Control
Yemon Lee, S. Marks, A. Connor
DOI: 10.1145/3385378.3385380

This exploratory study evaluates the potential of a three-dimensional (3D) virtual reality headset in air traffic control scenarios by considering whether it offers advantages over traditional two-dimensional (2D) displays in identifying potential in-flight collisions. Presenting large volumes of data on 2D displays may limit the speed and efficiency of air traffic control work. By comparison, virtual reality (VR) allows users to experience immersion within a virtual environment, which facilitates different modes of interaction with large and complex datasets. Fifteen participants took part in the study, none of whom were trained air traffic controllers. Each participant observed a number of simulated flight scenarios using both a 2D display and a 3D VR headset. A combination of quantitative and qualitative data was collected using simulation event logs and post-observation questionnaires. The quantitative data from the simulation logs generally show that potential collisions are detected more quickly using VR. Despite this, participants did not feel as able to detect potential collisions using virtual reality.
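The abstract does not specify the criterion by which a pair of aircraft counts as a "potential collision" in the simulation. A common criterion in conflict-detection work (assumed here for illustration; `potential_conflict`, the horizon, and the 5 NM separation are not taken from the paper) is the closest point of approach (CPA) between two constant-velocity aircraft:

```python
def closest_point_of_approach(p1, v1, p2, v2):
    """Time and distance of closest approach for two constant-velocity
    aircraft. p1, p2: positions (m); v1, v2: velocities (m/s), as 3-tuples."""
    dp = [a - b for a, b in zip(p1, p2)]   # relative position
    dv = [a - b for a, b in zip(v1, v2)]   # relative velocity
    dv2 = sum(c * c for c in dv)
    # Identical velocities: separation never changes, so CPA is now.
    t = 0.0 if dv2 == 0 else max(0.0, -sum(a * b for a, b in zip(dp, dv)) / dv2)
    sep = [a + t * b for a, b in zip(dp, dv)]
    return t, sum(c * c for c in sep) ** 0.5

def potential_conflict(p1, v1, p2, v2, horizon=120.0, min_sep=9260.0):
    """Flag a conflict if the pair comes within min_sep metres
    (about 5 NM) during the look-ahead horizon (seconds)."""
    t, d = closest_point_of_approach(p1, v1, p2, v2)
    return t <= horizon and d < min_sep
```

For example, two aircraft approaching head-on at 100 m/s from 20 km apart reach zero separation after 100 s, so they are flagged; a parallel pair 20 km abeam is not.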
Exploring the Vulnerabilities and Advantages of SWIPE or Pattern authentication in Virtual Reality (VR)
Ilesanmi Olade, Hai-Ning Liang, Charles Fleming, Christopher Champion
DOI: 10.1145/3385378.3385385

Virtual reality applications are carving out a new niche within the entertainment and business spheres; therefore, reliable security and usability are essential to achieving consumer confidence. In this paper, we explore (1) the suitability of porting the popular SWIPE mobile device authentication system to virtual reality (VR), observing its advantages and vulnerabilities, and (2) the effects of interaction devices such as the hand-held controller (HHC), the LeapMotion sensor, the EyeTracker, and the head-mounted display (HMD). Our study is threefold: a web study (N=219) to collect and analyze possible SWIPE password patterns, followed by a mobile device study (N=15) and a VR study (N=15) to compare the speed, login errors, and usability of the SWIPE authentication system in both environments. We are also interested in the effectiveness of shoulder-surfing within VR, as it is a known weakness of mobile devices.
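For readers unfamiliar with the SWIPE scheme being ported, the Android-style 3x3 pattern lock has a simple validity rule: a dot may be used once, and a straight segment that passes over an unvisited intermediate dot implicitly captures it. A sketch of that standard rule (assumed here as the baseline the study's patterns would follow; `is_valid_pattern` is illustrative, not the authors' code):

```python
def is_valid_pattern(pattern):
    """Validate an Android-style 3x3 swipe pattern (dots 0-8, row-major).
    Rules: at least 4 dots, no repeats, and a segment may not jump over
    an unvisited intermediate dot (the lock screen would capture it)."""
    if len(pattern) < 4 or len(set(pattern)) != len(pattern):
        return False
    for a, b in zip(pattern, pattern[1:]):
        r1, c1, r2, c2 = a // 3, a % 3, b // 3, b % 3
        # An intermediate dot exists when both row and column gaps are even.
        if (r1 + r2) % 2 == 0 and (c1 + c2) % 2 == 0:
            mid = ((r1 + r2) // 2) * 3 + (c1 + c2) // 2
            # Jumping over mid is allowed only if it was already visited.
            if mid != a and mid != b and mid not in pattern[: pattern.index(a) + 1]:
                return False
    return True
```

Enumerating patterns under this rule is how pattern-space size (and hence guessing resistance) is usually estimated, which is relevant to the shoulder-surfing question the study raises.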
Multi-Device Collaboration in Virtual Environments
S. Marks, David White
DOI: 10.1145/3385378.3385381

We present a multi-device collaboration principle for virtual environments, using a combination of virtual and augmented reality (VR/AR) technology, in the context of two educational applications: a virtual nasal cavity and a visualisation of earthquake data. A head-mounted display (HMD) and a 3D-tracked tablet create two views of a shared virtual space. This allows two users to collaborate, utilising the strengths of each of the two technologies, e.g., intuitive spatial navigation and interaction in VR, and touch control of the visualisation parameters via the AR tablet. Touch gestures on the tablet are translated into a pointer ray in VR, so the users can easily indicate spatial features. The underlying networking infrastructure allows for an extension of this application to more than two users and across different rendering platforms.
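The abstract describes translating touch gestures on the tracked tablet into a pointer ray in the shared space. One way to do this, sketched here under a pinhole-camera assumption (the function name, field of view, and parameters are hypothetical, not the paper's implementation), is to unproject the normalized touch coordinates through the tablet's tracked pose:

```python
import math

def touch_to_ray(cam_pos, forward, right, up, touch_x, touch_y,
                 fov_deg=60.0, aspect=16 / 9):
    """Map a normalized touch point (0..1 per axis, origin bottom-left)
    on a tracked tablet screen to a world-space pointer ray, assuming a
    pinhole camera model for the tablet's AR view. forward/right/up are
    the tablet's orientation basis vectors in world space."""
    half_h = math.tan(math.radians(fov_deg) / 2)
    half_w = half_h * aspect
    # Offset of the touch from the view centre, in camera space.
    dx = (touch_x - 0.5) * 2 * half_w
    dy = (touch_y - 0.5) * 2 * half_h
    d = [f + dx * r + dy * u for f, r, u in zip(forward, right, up)]
    norm = math.sqrt(sum(c * c for c in d))
    return cam_pos, [c / norm for c in d]  # ray origin, unit direction
```

The resulting origin and direction are what would be replicated over the network so the HMD user sees the same pointer ray.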
Mixed Reality Guidance System for Motherboard Assembly Using Tangible Augmented Reality
Arvin Christopher C. Reyes, N. P. Del Gallego, J. A. Deja
DOI: 10.1145/3385378.3385379

We developed a mixed reality guidance system (MRGS) for motherboard assembly with the purpose of exploring the usability and viability of using mixed reality and tangible augmented reality (TAR) to simulate hands-on manual assembly tasks. TAR was used to remove the need for real-world parts and to provide a natural interaction medium for our system. To evaluate the system, we conducted two usability studies involving 25 participants (10 experienced and 15 naive). In the first study, participants rated only the proposed interaction technique; both experienced and naive participants gave acceptable scores, with experienced users giving significantly higher ratings. In the second study, participants partially assembled the motherboard using the MRGS. Participants who used the MRGS were able to correctly determine the orientation and location of the motherboard parts, in contrast to the control group. Observations of users performing the tasks, as well as user feedback gathered through survey questionnaires and interviews, are presented and discussed in this paper.
Indoor Navigation Using Augmented Reality
Prashant Verma, Kushal Agrawal, V. Sarasvathi
DOI: 10.1145/3385378.3385387

Navigation systems are essential for en-route assistance, indoor positioning, and similar applications. Navigation in outdoor environments is well served compared with complex indoor environments. In this work, we focus on building an indoor navigation application that uses augmented reality to assist people in navigating complex buildings, together with a cloud platform (Content Management System) through which the administrator of a particular building can modify and manage the navigation paths. We used the Unity 3D framework to develop the AR-based mobile application, which runs on smartphones. This augmented reality-based application provides a better interface and experience than the traditional 2D maps or the paper maps displayed outside buildings to assist navigation. To evaluate the proposed concept, technical evaluations were performed in a hospital building.
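The abstract does not describe how routes are computed from the CMS-managed paths. A plausible backend sketch (assumed, not taken from the paper) stores each building as a graph of named waypoints with edge distances, which the administrator edits in the CMS, and computes the route the AR client renders with Dijkstra's algorithm:

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra over a building waypoint graph. `graph` maps a node to a
    list of (neighbour, distance) pairs, e.g. as edited in the CMS.
    Returns the waypoint list to render as AR guidance, or None if the
    goal is unreachable."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            path = [goal]
            while path[-1] != start:
                path.append(prev[path[-1]])
            return path[::-1]
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for nb, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nb, float("inf")):
                dist[nb] = nd
                prev[nb] = node
                heapq.heappush(heap, (nd, nb))
    return None
```

In a hospital deployment like the one evaluated, the graph nodes would correspond to corridor junctions and room entrances, so editing the CMS means editing this adjacency structure rather than redrawing a map.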