AiRType: An Air-tapping Keyboard for Augmented Reality Environments
Necip Fazil Yildiran, Ülkü Meteriz-Yildiran, David A. Mohaisen
2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)
Pub Date: 2022-03-01 | DOI: 10.1109/VRW55335.2022.00189
Abstract: We present AiRType, a bare-hand text-entry technique for AR/VR head-mounted displays that affords a more natural typing experience. Hand models in the virtual environment mirror the user's hand movements, and the user targets and selects keys through these models. AiRType fully leverages the additional third dimension without restricting the interaction space to the user's arm length: the keyboard can be attached anywhere and scaled freely. We evaluated AiRType against a baseline, the built-in keyboard of the Magic Leap 1. AiRType shows a 27% decrease in error rate, a 3.3% increase in characters per second, and a 9.4% increase in user satisfaction.
Exploring Empathy with Digital Humans
Kate Loveys, Mark Sagar, M. Billinghurst, Nastaran Saffaryazdi, E. Broadbent
Pub Date: 2022-03-01 | DOI: 10.1109/VRW55335.2022.00055
Abstract: Digital humans are autonomously-animated virtual people whose social interactions are driven by artificial intelligence. They are increasingly being deployed in applications such as healthcare, customer service, and education, and they may have a place in the metaverse. For digital humans to have effective social relationships with users, it is important that they are capable of empathetic interactions. This research aims to evaluate and build upon the autonomous empathy system of a digital human through five experimental studies. Psychological and physiological data will be collected, and the effects will be compared in an Augmented Reality environment and cross-culturally. This paper presents the research agenda and discusses considerations and challenges for empathetic interactions with digital humans.
[DC] Mixed Reality Interaction for Mobile Knowledge Work
Verena Biener
Pub Date: 2022-03-01 | DOI: 10.1109/VRW55335.2022.00315
Abstract: Knowledge workers typically work on some kind of computer and, other than an internet connection, rarely need access to specific work environments or devices. This makes it feasible for them to work in different environments or in mobile settings, such as public transportation. In such spaces, comfort and productivity can suffer from hardware limitations, such as small screens and input devices, and from environmental clutter. Mixed reality (MR) has the potential to address these issues: it can provide additional display space that extends into the third dimension, open up new ways of interacting with virtual content through gestures or spatially tracked devices, and let users adapt the work environment to their personal preferences. This doctoral thesis explores the challenges of using MR for mobile knowledge work and how to effectively support knowledge-worker tasks through appropriate interaction techniques.
AR-Assisted Surgical Guidance System for Ventriculostomy
Sang-Ho Eom, S. Kim, S. Rahimpour, M. Gorlatova
Pub Date: 2022-03-01 | DOI: 10.1109/VRW55335.2022.00087
Abstract: Augmented Reality (AR) is increasingly used in medical applications for visualizing medical information. In this paper, we present an AR-assisted surgical guidance system that aims to improve the accuracy of catheter placement in ventriculostomy, a common neurosurgical procedure. We build upon previous work on neurosurgical AR, which has focused on enabling the surgeon to visualize a patient's ventricular anatomy, to additionally integrate surgical tool tracking and contextual guidance. Specifically, using accurate tracking of optical markers via an external multi-camera OptiTrack system, we enable Microsoft HoloLens 2-based visualizations of ventricular anatomy, catheter placement, and the distance from the catheter tip to its target. We describe the system we developed, present initial hologram registration results, and comment on the next steps that will prepare our system for clinical evaluations.
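The guidance cue described in this abstract (how far the catheter tip is from its target) reduces, in the simplest case, to a Euclidean distance between two tracked 3D positions in a shared coordinate frame. The sketch below is illustrative only; the paper does not publish its implementation, and the function name and coordinate values are assumptions.

```python
import math

def tip_to_target_distance(tip, target):
    """Euclidean distance between the tracked catheter tip and the planned
    target point, with both positions expressed in the same world frame."""
    return math.dist(tip, target)

# Hypothetical positions (in mm) reported by an external tracker such as
# OptiTrack, already transformed into the headset's coordinate frame.
tip = (12.0, 4.0, -3.0)
target = (9.0, 0.0, -3.0)
print(f"{tip_to_target_distance(tip, target):.1f} mm from target")
```

In a real system the interesting work is not this distance but the calibration that brings the tracker's frame and the HoloLens' frame into registration; the hologram registration results the authors mention address exactly that step.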
[DC] Immersive Analytics for Understanding Ecosystem Services Tradeoffs
B. Powley
Pub Date: 2022-03-01 | DOI: 10.1109/VRW55335.2022.00319
Abstract: Existing immersive systems for analysing geospatial data relating to ecosystem services are not designed for all groups involved in land use decision making. Land management scientists have different requirements than non-experts, as the tasks they perform are different. Land use decision making needs better tools for assisting the analysis and exploration of land use decisions and their effect on ecosystem services. In this research, a user-centred design process is applied to develop and evaluate an immersive VR visualization tool to assist with better decision making around land use. Interviews with experts found issues with how their current tool presents analysis results and problems with communicating those results to stakeholders. A literature review found no pre-existing immersive VR systems specifically for analysing tradeoffs among ecosystem services.
Virtual Reality in Small and Medium-Sized Enterprises: A Systematic Literature Review
S. Brettschuh, M. Holly, Maria Hulla, Patrick Herstätter, J. Pirker
Pub Date: 2022-03-01 | DOI: 10.1109/VRW55335.2022.00112
Abstract: Industry is currently in the fourth industrial revolution, also called Industry 4.0, the central element of which is the digitization of companies. One promising implementation of Industry 4.0 is the use of virtual reality (VR). Thanks to improved affordability and accessibility in recent years, VR has gained increasing popularity. This creates new opportunities, especially for small and medium-sized enterprises (SMEs), to use VR to boost business and optimize their product development and processes. In this paper, we identify the potentials and challenges of using VR in SMEs by means of a systematic literature review. We focus on analyzing the current state of the art, including use cases, the technology used, opportunities, and challenges faced by the companies.
A Location-Triggered Augmented Reality Walking Tour Using Snap Spectacles 2021
AadilMehdi J. Sanchawala, Mara Dimofte, Steven K. Feiner
Pub Date: 2022-03-01 | DOI: 10.1109/VRW55335.2022.00271
Abstract: We present an on-site 3D-animated audiovisual tour guide augmented reality application developed for Snap Spectacles 2021. The primary goal of this project is to explore how to use this experimental product to create an augmented reality tour guide. In addition, we present the design considerations for the user interface and the underlying system architecture. We illustrate the workflow of the tour application and discuss our experience working with Spectacles 2021 and its experimental API. We also present our design choices and directions for future work.
Priority-Dependent Display of Notifications in the Peripheral Field of View of Smart Glasses
Anja K. Faulhaber, Moritz Hoppe, Ludger Schmidt
Pub Date: 2022-03-01 | DOI: 10.1109/VRW55335.2022.00144
Abstract: We propose a concept for displaying notifications in the peripheral field of view of smart glasses, aiming to achieve a balance between perception and distraction depending on the priority of the notification. We designed three different visualizations for notifications of low, medium, and high priority. To evaluate this concept, we conducted a study with 24 participants who reacted to the notifications while performing a primary task. Reaction times for the low-priority notification were significantly higher. The medium- and high-priority notifications did not show a clear difference.
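The core idea of the concept above, that each notification priority maps to its own visualization, can be illustrated as a small lookup from priority to display parameters. This is a hypothetical sketch; the abstract does not specify what the three visualizations look like, so every parameter name and value below is invented for illustration.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class NotificationStyle:
    eccentricity_deg: float  # how far into the periphery the cue appears
    animated: bool           # whether the cue pulses to draw extra attention
    duration_s: float        # how long the cue stays visible

# Hypothetical policy: higher priority -> more salient, longer-lived cue,
# placed closer to the central field of view.
STYLES = {
    "low":    NotificationStyle(eccentricity_deg=40.0, animated=False, duration_s=2.0),
    "medium": NotificationStyle(eccentricity_deg=30.0, animated=False, duration_s=4.0),
    "high":   NotificationStyle(eccentricity_deg=25.0, animated=True,  duration_s=6.0),
}

def style_for(priority: str) -> NotificationStyle:
    """Return the display parameters for a notification of the given priority."""
    return STYLES[priority]
```

The study's finding (slower reactions to the low-priority cue, little difference between medium and high) is consistent with such a policy: the least salient style is the one users notice last.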
The Effect of Spatial Audio on the Virtual Representation of Personal Space
Lauren E. Buck, Mauricio Flores Vargas, R. Mcdonnell
Pub Date: 2022-03-01 | DOI: 10.1109/VRW55335.2022.00079
Abstract: Auditory feedback is an important element in the perception of one's surroundings. Vocal intonation betrays emotion, step frequency conveys urgency, and volume signals proximity. While it is widely understood that sound is an important element in game design and film, research on how auditory feedback affects interactions taking place in virtual reality experiences is lacking. In this work, we propose an experiment in which we will begin to uncover the effect of auditory feedback, particularly spatial audio, on the preservation of personal space in an immersive virtual environment. Users will be exposed to visual and auditory stimuli and asked to report both interpersonal and peripersonal space. Personal space is dynamically responsive to circumstance and is notably malleable by the presence of auditory stimuli in the real world. Consequently, we expect personal space to dynamically change in the presence of auditory stimuli in an immersive virtual environment. To our knowledge, there is no prior study of how auditory feedback affects the maintenance of this space in virtual reality.
Clean the Ocean: An Immersive VR Experience Proposing New Modifications to Go-Go and WiM Techniques
Lee Lisle, Feiyu Lu, Shakiba Davari, Ibrahim Asadullah Tahmid, Alexander Giovannelli, Cory Ilo, Leonardo Pavanatto, Lei Zhang, Luke Schlueter, D. Bowman
Pub Date: 2022-03-01 | DOI: 10.1109/VRW55335.2022.00311
Abstract: In this paper we present our solution to the 2022 3DUI Contest challenge. We aim to provide an immersive VR experience that raises players' awareness of trash pollution in the ocean while improving current interaction techniques in virtual environments. To achieve these objectives, we adapted two classic interaction techniques, Go-Go and World in Miniature (WiM), into an engaging minigame in which the user collects trash from the ocean. To improve precision and address occlusion issues in the traditional Go-Go technique, we propose ReX Go-Go. We also propose an adaptation of WiM, referred to as Rabbit-Out-of-the-Hat, which allows exocentric interaction for easier object retrieval.
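The classic Go-Go technique that ReX Go-Go modifies maps real hand distance to virtual hand distance with a non-linear extension once the hand passes a threshold distance from the body, so nearby manipulation stays 1:1 while distant objects become reachable. The sketch below shows that baseline mapping only; the threshold and gain values are illustrative, and the abstract does not describe ReX Go-Go's actual modifications.

```python
def gogo_reach(real_dist: float, threshold: float = 0.3, gain: float = 6.0) -> float:
    """Classic Go-Go arm-extension mapping.

    Within `threshold` metres of the body the virtual hand tracks the real
    hand 1:1; beyond it, reach grows quadratically with the overshoot.
    """
    if real_dist < threshold:
        return real_dist
    return real_dist + gain * (real_dist - threshold) ** 2

# Near the body: identity mapping, precise local manipulation.
assert gogo_reach(0.2) == 0.2
# Beyond the threshold the virtual hand overshoots the real one,
# e.g. a real reach of 0.5 m maps to roughly 0.74 m.
```

A known drawback of this mapping, visible in the code, is that small real-hand movements produce large virtual displacements at long range, hurting precision; that is the kind of issue a refinement like ReX Go-Go would target.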