Preservation and Reproduction of Real Soundscapes in Virtual Space for the "100 Best Soundscapes in Japan". Yoshihiko Okubo, Y. Oishi, Yasuo Kawai. doi:10.1145/3489849.3489902
In this study, we developed a soundscape system that reproduces real soundscapes in 3D within a virtual space, based on the "100 Best Soundscapes in Japan" selected by the Ministry of the Environment. Using the Unity game engine, we propose a method that embeds recorded sound sources into virtual objects placed in the virtual space and plays them back sequentially as the user walks or turns around. With this system, the same sound field as in the real space can be reproduced in the virtual space, and the approach can be applied to a variety of locations.
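The core idea, attaching recorded sources to virtual objects and mixing them by the listener's position and heading, can be sketched as a per-source gain computation. The following is an illustrative Python sketch under assumed conventions (listener on the x-z plane, yaw measured from +z); it is not the authors' Unity implementation, where `AudioSource` spatialization would do this work.

```python
import math

def spatial_gain(listener_pos, listener_yaw, source_pos, ref_dist=1.0):
    """Stereo (left, right) gain for one embedded sound source.

    Illustrative only: inverse-distance rolloff plus sine-law panning
    from the source's bearing relative to the listener's heading.
    """
    dx = source_pos[0] - listener_pos[0]
    dz = source_pos[1] - listener_pos[1]
    dist = max(math.hypot(dx, dz), ref_dist)
    gain = ref_dist / dist                      # inverse-distance rolloff
    angle = math.atan2(dx, dz) - listener_yaw   # bearing vs. heading
    pan = math.sin(angle)                       # -1 hard left, +1 hard right
    return gain * (1.0 - pan) / 2.0, gain * (1.0 + pan) / 2.0
```

For example, a source two metres straight ahead of a listener facing +z yields equal left and right gains; as the listener turns, the pan term redistributes energy between the channels.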
Designing Obstacle Reminder for Safe AR Navigation. Xinyi Su, Xuechen Zhao, C. Cao. doi:10.1145/3489849.3489905
AR navigation is widely used on mobile devices. However, users sometimes become immersed in the navigation interface and overlook dangers in the real environment, so it is necessary to remind them of potential hazards and prevent accidents. Most existing work focuses on how to guide users effectively in AR; few studies address the design of danger reminders. In this paper, we build a virtual AR navigation system and compare the user experience of different types of obstacle reminders. Furthermore, we compare the influence of color, motion, and appearance distance on reminder effectiveness. Results show that red and bi-color reminders are more noticeable than blue ones, and that motion such as a flickering effect enhances reminder effectiveness.
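The two variables the study manipulates, appearance distance and flicker, can be modeled as a small state function. This is a hypothetical sketch (the threshold and frequency values are invented, not taken from the paper): the reminder stays hidden beyond a distance threshold and, once in range, blinks as a square wave.

```python
def reminder_alpha(t, obstacle_dist, appear_dist=3.0, flicker_hz=2.0):
    """Opacity of the obstacle reminder at time t (seconds).

    Hypothetical parameters: hidden beyond appear_dist metres,
    blinking on/off at flicker_hz once the obstacle is in range.
    """
    if obstacle_dist > appear_dist:
        return 0.0                      # too far away: no reminder yet
    phase = (t * flicker_hz) % 1.0
    return 1.0 if phase < 0.5 else 0.0  # square-wave flicker
```

A steady (non-flickering) variant would simply return 1.0 whenever the obstacle is within range.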
ProMVR - Protein Multiplayer Virtual Reality Tool. Tianshu Xu, V. V. Yallapragada, M. Tangney, S. Tabirca. doi:10.1145/3489849.3489935
Due to the restrictions of the Covid-19 pandemic, people have had to work from home and hold meetings virtually, and virtual meeting tools have proliferated. These tools allow users to see each other through screens and cameras, chat by voice and text, and share content and ideas via screen sharing. However, sharing protein models over a virtual meeting is difficult, because 3D (three-dimensional) protein structures are hard to view on a 2D (two-dimensional) screen, and interactions with a protein are also limited. ProMVR is a tool the authors developed to address the limitations protein designers face when working in a traditional 2D or 3D environment and communicating their ideas to other designers. As a VR tool, ProMVR allows users to "jump into" a virtual environment, examine protein models up close, and interact with them intuitively.
Multi-View AR Streams for Interactive 3D Remote Teaching. Andrew Zhang, Jennifer Jacobs, Misha Sra, Tobias Höllerer. doi:10.1145/3489849.3489950
In this work, we present a system that adds augmented reality interaction and 3D-space utilization to educational videoconferencing for a more engaging distance-learning experience. We developed infrastructure and user interfaces that turn an instructor's physical 3D space into a teaching stage, promote student interaction, and take advantage of the flexibility of adding virtual content to the physical world. The system is implemented with hand-held mobile augmented reality to maximize device availability, scalability, and ready deployment, elevating traditional video lectures to immersive mixed reality experiences. Multiple devices on the teacher's end provide different simultaneous views of the teaching space for a better understanding of its 3D structure.
The Influence of in-VR Questionnaire Design on the User Experience. S. Safikhani, M. Holly, Alexander Kainz, J. Pirker. doi:10.1145/3489849.3489884
Researchers typically study the user experience in Virtual Reality (VR) by collecting sensory data or administering questionnaires. While traditional questionnaire formats are presented through web-based survey tools (out-VR), recent studies have investigated presenting questionnaires directly in the virtual environment (in-VR). An in-VR questionnaire can be defined as a user-interface object that allows interaction with questionnaires in VR without breaking immersion. Integrating questionnaires directly into the virtual environment, however, also raises design challenges. Most previous research presents in-VR questionnaires as 2D panels in the virtual environment; we investigate how such traditional formats differ from a questionnaire presented as an interactive object that is part of the environment. Accordingly, we evaluate and compare two in-VR questionnaire designs and a traditional web-based form (out-VR) with respect to user experience, the effect on presence, questionnaire completion time, and users' preferences. To this end, we developed an immersive questionnaire toolkit that provides a general solution for implementing in-VR questionnaires and exchanging data with popular survey services, enabling us to run our study both on-site and remotely. In a first small study, 16 users, either on-site or remote, completed the System Usability Scale, NASA TLX, and the iGroup Presence Questionnaire after a playful activity. The first results indicate no significant difference in usability or presence between the design layouts. For task load, we found no significant differences except between the 2D and web-based layouts for mental demand and frustration, as well as for completion time. The results also indicate that users generally prefer in-VR questionnaire designs over traditional ones. The study can be expanded to more participants to obtain more conclusive results, and additional questionnaire design alternatives could help identify a more usable and accurate questionnaire design in VR.
VR Rehearse & Perform - A platform for rehearsing in Virtual Reality. V. Lalioti, Sophia Ppali, Andrew J. Thomas, Ragnar Hrafnkelsson, M. Grierson, C. Ang, Bea S. Wohl, A. Covaci. doi:10.1145/3489849.3489896
In this paper, we propose VR Rehearse & Perform, a Virtual Reality application that enhances performers' rehearsals by giving them access to accurate visual and acoustic recreations of iconic concert venues.
PAIR: Phone as an Augmented Immersive Reality Controller. Arda Ege Unlu, R. Xiao. doi:10.1145/3489849.3489878
Immersive head-mounted augmented reality allows users to overlay 3D digital content on their view of the world. Current-generation devices primarily support interaction modalities such as gesture, gaze, and voice, which are readily available to most users yet lack precision and tactility, making them fatiguing for extended interactions. We propose using smartphones, which are also readily available, as companion devices that complement existing AR interaction modalities. We leverage users' familiarity with smartphone interactions, coupled with their support for precise, tactile touch input, to unlock a broad range of interaction techniques and applications - for instance, turning the phone into an interior-design palette, a touch-enabled catapult, or an AR-rendered sword. We describe a prototype implementation of our interaction techniques using an off-the-shelf AR headset and smartphone, demonstrate applications, and report the results of a positional accuracy study.
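Techniques like the AR sword or catapult hinge on mapping the phone's tracked pose to a world-space pointing ray. A minimal sketch of that mapping, assuming unit quaternions in (w, x, y, z) order and a +z forward axis (conventions chosen for illustration, not taken from PAIR):

```python
def rotate_by_quat(v, q):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    # v' = v + 2*w*(qv x v) + 2*qv x (qv x v), with qv = (x, y, z)
    cx1 = y * v[2] - z * v[1]
    cy1 = z * v[0] - x * v[2]
    cz1 = x * v[1] - y * v[0]
    cx2 = y * cz1 - z * cy1
    cy2 = z * cx1 - x * cz1
    cz2 = x * cy1 - y * cx1
    return (v[0] + 2 * (w * cx1 + cx2),
            v[1] + 2 * (w * cy1 + cy2),
            v[2] + 2 * (w * cz1 + cz2))

def phone_ray(phone_pos, phone_quat, forward=(0.0, 0.0, 1.0)):
    """World-space ray (origin, direction) along the phone's forward axis."""
    return phone_pos, rotate_by_quat(forward, phone_quat)
```

Intersecting this ray with virtual objects then lets touch input on the phone act on whatever the phone is aimed at.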
opticARe - Augmented Reality Mobile Patient Monitoring in Intensive Care Units. S. Kimmel, Vanessa Cobus, Wilko Heuten. doi:10.1145/3489849.3489852
German Intensive Care Units (ICUs) are in crisis, struggling with a growing shortage of skilled workers that ultimately puts patient safety at risk. To counteract this, researchers are increasingly seeking digital solutions that support healthcare professionals by making recurring critical care tasks more efficient and thus improving working conditions. In this context, this paper evaluates the application of Augmented Reality (AR) to patient monitoring in critical care nursing. Based on an observational study, semi-structured interviews, and a quantitative analysis, mobile patient-monitoring scenarios, present particularly during patient transport, were identified as an innovative context of use for AR in the field. In addition, user requirements such as high wearability, hands-free operability, and clear data representation were derived from the study results. To validate these requirements and identify further ones, three prototypes differing in their data-presentation format were subsequently developed and evaluated both quantitatively and qualitatively through an online survey. The results show that future AR patient-monitoring systems should in particular integrate context-dependent data presentation, as this combines high navigability with availability of the required data. Identifying patient monitoring during patient transport as a potential context of use and establishing a context-dependent design approach as favorable are the two key contributions of this work, providing a foundation for future implementations of AR systems in the nursing domain and related contexts.
An Evaluation of Methods for Manipulating Virtual Objects at Different Scales. Jesper Gaarsdal, Sune Wolff, C. Madsen. doi:10.1145/3489849.3489907
Immersive Virtual Reality enables users to experience 3D models and other virtual content in ways that cannot be achieved on a flat screen, and several modern Virtual Reality applications now let users include or create their own content and objects. With user-generated content, however, objects may come in all shapes and sizes, which necessitates manipulation methods that are effective regardless of object size. In this work, we evaluate two methods for manipulating virtual objects of varying sizes. World Pull lets the user directly manipulate and scale the virtual environment, while Pivot Manipulation lets the user rotate objects around a set of predefined pivot points. The methods were compared to a traditional six-degree-of-freedom manipulation method in a user study; the results showed that World Pull achieved better precision for both small and large objects, while Pivot Manipulation performed better for large objects.
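Both techniques reduce to standard affine transforms about a reference point: Pivot Manipulation rotates geometry about a chosen pivot, while World Pull rescales the environment about a grab point. A minimal 2D sketch of those two transforms (illustrative only; the paper's methods operate on full 3D transforms):

```python
import math

def rotate_about_pivot(p, pivot, angle):
    """Rotate 2D point p about a pivot by angle (radians)."""
    dx, dy = p[0] - pivot[0], p[1] - pivot[1]
    c, s = math.cos(angle), math.sin(angle)
    return (pivot[0] + c * dx - s * dy,
            pivot[1] + s * dx + c * dy)

def scale_about_anchor(p, anchor, factor):
    """Uniformly scale point p about an anchor, as when pulling
    the world toward or away from the user's grab point."""
    return tuple(a + factor * (v - a) for v, a in zip(p, anchor))
```

Translating to the pivot, applying the rotation or scale, and translating back is what keeps the pivot (or grab point) fixed during the manipulation.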
Exploring Emotion Brushes for a Virtual Reality Painting Tool. Jungah Son, Misha Sra. doi:10.1145/3489849.3489925
We present emoPaint, a virtual reality application that lets users create paintings with expressive emotion-based brushes and shapes. While previous systems have introduced painting in 3D space, emoPaint focuses on supporting emotional expression by letting users paint with brushes corresponding to specific emotions, or create their own emotion brushes and paint with the corresponding visual elements. The system provides a variety of line textures, shape representations, and color palettes for each emotion, giving users control over how emotions are expressed in their paintings. In this work, we describe our implementation and illustrate paintings created using emoPaint.