A Drone Teacher: Designing Physical Human-Drone Interactions for Movement Instruction
Nialah Jenae Wilson-Small, D. Goedicke, Kirstin H. Petersen, Shiri Azenkot
DOI: 10.1145/3568162.3576985
Drones (micro unmanned aerial vehicles) are becoming more prevalent in applications that bring them into close human spaces. This is made possible in part by clear drone-to-human communication strategies. However, current auditory and visual communication methods work only under constrained environmental conditions. To continue expanding the possibilities for drones to be useful in human spaces, we explore ways to overcome these limitations through physical touch. We present a new application for drones: physical instructive feedback. To do this, we designed three different physical interaction modes for a drone. We then conducted a user study (N=12) to answer fundamental questions of where and how people want to physically interact with drones, and what people naturally infer the physical touch is communicating. We used these insights to conduct a second user study (N=14) to understand the best way for a drone to communicate instructions to a human in a movement task. We found that continuous physical feedback is both preferred and more effective at providing instruction than incremental feedback.
{"title":"A Drone Teacher: Designing Physical Human-Drone Interactions for Movement Instruction","authors":"Nialah Jenae Wilson-Small, D. Goedicke, Kirstin H. Petersen, Shiri Azenkot","doi":"10.1145/3568162.3576985","DOIUrl":"https://doi.org/10.1145/3568162.3576985","url":null,"abstract":"Drones (micro unmanned aerial vehicles) are becoming more prevalent in applications that bring them into close human spaces. This is made possible in part by clear drone-to-human communication strategies. However, current auditory and visual communication methods only work with strict environmental settings. To continue expanding the possibilities for drones to be useful in human spaces, we explore ways to overcome these limitations through physical touch. We present a new application for drones--physical instructive feedback. To do this we designed three different physical interaction modes for a drone. We then conducted a user study (N=12) to answer fundamental questions of where and how people want to physically interact with drones, and what people naturally infer the physical touch is communicating. We then used these insights to conduct a second user study (N=14) to understand the best way for a drone to communicate instructions to a human in a movement task. We found that continuous physical feedback is both the preferred mode and is more effective at providing instruction than incremental feedback.","PeriodicalId":36515,"journal":{"name":"ACM Transactions on Human-Robot Interaction","volume":"32 1","pages":""},"PeriodicalIF":5.1,"publicationDate":"2023-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"76158836","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Examining the State of Robot Identity
Lux Miranda, Ginevra Castellano, Katie Winkle
DOI: 10.1145/3568294.3580168
Human-robot interaction has the power to influence human norms and culture. While there is potential benefit in using this power to create positive social change, so too is there risk in merely reinforcing existing social biases which uphold systems of oppression. As the most salient forms of oppression arise along lines of social identity, it stands to reason that we must take utmost care in leveraging human-like identity cues when designing social robots and other agentic embodiments. Yet, the understanding of how to do this is not well-developed. Towards forming an ethics of robot identity, we begin by surveying the state of thought on the topic in human-robot interaction. We do this by conducting a structured review of HRI conference proceedings analyzed from a feminist, intersectional perspective. Our initial findings suggest that existing literature has not fully engaged with intersectionality, embodies an alarming pathologization of neurodivergence, and almost wholly neglects the examination of race.
Get SMART: Collaborative Goal Setting with Cognitively Assistive Robots
A. Kubota, Rainee Pei, Ethan Sun, Dagoberto Cruz-Sandoval, Soyon Kim, L. Riek
DOI: 10.1145/3568162.3576993
Many robot-delivered health interventions aim to support people longitudinally at home to complement or replace in-clinic treatments. However, there is little guidance on how robots can support collaborative goal setting (CGS). CGS is the process in which a person works with a clinician to set and modify their goals for care; it can improve treatment adherence and efficacy. However, for home-deployed robots, clinicians will have limited availability to help set and modify goals over time, which necessitates that robots support CGS on their own. In this work, we explore how robots can facilitate CGS in the context of our robot CARMEN (Cognitively Assistive Robot for Motivation and Neurorehabilitation), which delivers neurorehabilitation to people with mild cognitive impairment (PwMCI). We co-designed robot behaviors for supporting CGS with clinical neuropsychologists and PwMCI, and prototyped them on CARMEN. We present feedback on how PwMCI envision these behaviors supporting goal progress and motivation during an intervention. We report insights on how to support this process with home-deployed robots and propose a framework to support HRI researchers interested in exploring this both in the context of cognitively assistive robots and beyond. This work supports designing and implementing CGS on robots, which will ultimately extend the efficacy of robot-delivered health interventions.
{"title":"Get SMART: Collaborative Goal Setting with Cognitively Assistive Robots","authors":"A. Kubota, Rainee Pei, Ethan Sun, Dagoberto Cruz-Sandoval, Soyon Kim, L. Riek","doi":"10.1145/3568162.3576993","DOIUrl":"https://doi.org/10.1145/3568162.3576993","url":null,"abstract":"Many robot-delivered health interventions aim to support people longitudinally at home to complement or replace in-clinic treatments. However, there is little guidance on how robots can support collaborative goal setting (CGS). CGS is the process in which a person works with a clinician to set and modify their goals for care; it can improve treatment adherence and efficacy. However, for home-deployed robots, clinicians will have limited availability to help set and modify goals over time, which necessitates that robots support CGS on their own. In this work, we explore how robots can facilitate CGS in the context of our robot CARMEN (Cognitively Assistive Robot for Motivation and Neurorehabilitation), which delivers neurorehabilitation to people with mild cognitive impairment (PwMCI). We co-designed robot behaviors for supporting CGS with clinical neuropsychologists and PwMCI, and prototyped them on CARMEN. We present feedback on how PwMCI envision these behaviors supporting goal progress and motivation during an intervention. We report insights on how to support this process with home-deployed robots and propose a framework to support HRI researchers interested in exploring this both in the context of cognitively assistive robots and beyond. This work supports designing and implementing CGS on robots, which will ultimately extend the efficacy of robot-delivered health interventions.","PeriodicalId":36515,"journal":{"name":"ACM Transactions on Human-Robot Interaction","volume":"33 1","pages":""},"PeriodicalIF":5.1,"publicationDate":"2023-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"81563767","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Robot-Supported Information Search: Which Conversational Interaction Style do Children Prefer?
Suyash Sharma, T. Beelen, K. Truong
DOI: 10.1145/3568294.3580128
Searching via speech with a robot can better support children in expressing their information needs. We report on an exploratory study where children (N=35) worked on search tasks with two robots using different interaction styles. One system posed closed, yes/no questions and was more system-driven, while the other used open-ended questions and was more user-driven. We studied children's preferences and experiences of these interaction styles using questionnaires and semi-structured interviews. We found no overall strong preference between the interaction styles. However, some children reported task-dependent preferences. We further report on children's interpretations of and reasoning about interaction styles for robots supporting information search.
I Learn Better Alone! Collaborative and Individual Word Learning With a Child and Adult Robot
Alireza M. Kamelabad, G. Skantze
DOI: 10.1145/3568162.3577004
The use of social robots as a tool for language learning has been studied extensively in recent years. Although their effectiveness and comparison with other technologies are well studied, the effects of the robot's appearance and the interaction setting have received less attention. As educational robots are envisioned to appear in household or school environments, it is important to investigate how their designed persona or interaction dynamics affect learning outcomes. In such environments, children may do the activities together or alone, or perform them in the presence of an adult or another child. In this regard, we have identified two novel factors to investigate: the robot's perceived age (adult or child) and the number of learners interacting with the robot simultaneously (one or two). We designed an incidental word learning card game with the Furhat robot and ran a between-subjects experiment with 75 middle school participants. We investigated the interactions and effects of children's word learning outcomes, speech activity, and perception of the robot's role. The results show that children who played alone with the robot had better word retention and anthropomorphized the robot more, compared to those who played in pairs. Furthermore, unlike previous findings from human-human interactions, children did not show different behaviors in the presence of a robot designed as an adult or a child. We discuss these factors in detail and make a novel contribution to the direct comparison of collaborative versus individual learning and the new concept of the robot's age.
Nudging or Waiting?: Automatically Synthesized Robot Strategies for Evacuating Noncompliant Users in an Emergency Situation
Yuhan Hu, Jin Ryu, David Gundana, Kirstin H. Petersen, H. Kress-Gazit, G. Hoffman
DOI: 10.1145/3568162.3576955
Robots have the potential to assist in emergency evacuation tasks, but it is not clear how robots should behave to evacuate people who are not fully compliant, perhaps due to panic or other priorities in an emergency. In this paper, we compare two robot strategies: an actively nudging robot that initiates evacuation and pulls toward the exit, and a passively waiting robot that stays near users and waits for instruction. Both strategies were automatically synthesized from a description of the desired behavior. We conducted a within-participants study (N=20) in a simulated environment to compare the evacuation effectiveness of the two robot strategies. Our results indicate an advantage of the nudging robot for effective evacuation when participants encounter the evacuation scenario for the first time. The waiting robot results in lower efficiency, higher mental load, and more physical conflicts. However, participants liked the waiting robot equally or slightly more when they repeated the evacuation scenario and were more familiar with the situation. Our qualitative analysis of the participants' feedback suggests several design implications for future emergency evacuation robots.
{"title":"Nudging or Waiting?: Automatically Synthesized Robot Strategies for Evacuating Noncompliant Users in an Emergency Situation","authors":"Yuhan Hu, Jin Ryu, David Gundana, Kirstin H. Petersen, H. Kress-Gazit, G. Hoffman","doi":"10.1145/3568162.3576955","DOIUrl":"https://doi.org/10.1145/3568162.3576955","url":null,"abstract":"Robots have the potential to assist in emergency evacuation tasks, but it is not clear how robots should behave to evacuate people who are not fully compliant, perhaps due to panic or other priorities in an emergency. In this paper, we compare two robot strategies: an actively nudging robot that initiates evacuation and pulls toward the exit and a passively waiting robot that stays around users and waits for instruction. Both strategies were automatically synthesized from a description of the desired behavior. We conduct a within participant study ( = 20) in a simulated environment to compare the evacuation effectiveness between the two robot strategies. Our results indicate an advantage of the nudging robot for effective evacuation when being exposed to the evacuation scenario for the first time. The waiting robot results in lower efficiency, higher mental load, and more physical conflicts. However, participants like the waiting robots equally or slightly more when they repeat the evacuation scenario and are more familiar with the situation. Our qualitative analysis of the participants' feedback suggests several design implications for future emergency evacuation robots.","PeriodicalId":36515,"journal":{"name":"ACM Transactions on Human-Robot Interaction","volume":"11 1","pages":""},"PeriodicalIF":5.1,"publicationDate":"2023-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"81976013","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The Eye of the Robot Beholder: Ethical Risks of Representation, Recognition, and Reasoning over Identity Characteristics in Human-Robot Interaction
T. Williams
DOI: 10.1145/3568294.3580031
Significant segments of the HRI literature rely on or promote the ability to reason about human identity characteristics, including age, gender, and cultural background. However, attempting to handle identity characteristics raises a number of critical ethical concerns, especially given the spatiotemporal dynamics of these characteristics. In this paper I question whether human identity characteristics can and should be represented, recognized, or reasoned about by robots, with special attention paid to the construct of race, due to its relative lack of consideration within the HRI community. As I will argue, while there are a number of well-warranted reasons why HRI researchers might want to enable robotic consideration of identity characteristics, these reasons are outweighed by a number of key ontological, perceptual, and deployment-oriented concerns. This argument raises troubling questions as to whether robots should even be able to understand or generate descriptions of people, and how they would do so while avoiding these ethical concerns. Finally, I conclude with a discussion of what this means for the HRI community, in terms of both algorithm and robot design, and speculate as to possible paths forward.
Cogui
Adrian Anhuaman, Carlos Granados, William Meza, Roberto Raez
DOI: 10.1145/3568294.3580202
Autistic children often have difficulty communicating with others and learning new things in an academic environment. Cogui is a robot designed for children with autism spectrum disorder (ASD). It converses with children in a reciprocal way in order to empathize with them and support their learning process while they have fun.
{"title":"Cogui","authors":"Adrian Anhuaman, Carlos Granados, William Meza, Roberto Raez","doi":"10.1145/3568294.3580202","DOIUrl":"https://doi.org/10.1145/3568294.3580202","url":null,"abstract":"Autistic kids have difficulties communicating with others and learning new things in an academic environment. Cogui is a robot designed for ASD children. It converses with children in a reciprocal way in order to emphasize with the kid and help them in their learning process while having fun.","PeriodicalId":36515,"journal":{"name":"ACM Transactions on Human-Robot Interaction","volume":"43 1","pages":""},"PeriodicalIF":5.1,"publicationDate":"2023-03-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"74138499","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Social Robotics meets Sociolinguistics: Investigating Accent Bias and Social Context in HRI
M. Foster, J. Stuart-Smith
DOI: 10.1145/3568294.3580063
Deploying a social robot in the real world means that it must interact with speakers from diverse backgrounds, who in turn are likely to show substantial accent and dialect variation. Linguistic variation in social context has been well studied in human-human interaction; however, the influence of these factors on human interactions with digital agents, especially embodied agents such as robots, has received less attention. Here we present an ongoing project where the goal is to develop a social robot that is suitable for deployment in ethnically-diverse areas with distinctive regional accents. To help in developing this robot, we carried out an online survey of Scottish adults to understand their expectations for conversational interaction with a robot. The results confirm that social factors constraining accent and dialect are likely to be significant issues for human-robot interaction in this context, and so must be taken into account in the design of the system at all levels.
Bridging the Gap: Using a Game-based Approach to Raise Lay People's Awareness About Care Robots
Katharina Brunnmayr, A. Weiss
DOI: 10.1145/3568294.3580125
As people's expectations regarding robots are still mostly shaped by the media and science fiction, there exists a gap between imaginaries of robots and the state of the art of robotic technologies. Care robots are one example of existing robots that the general public has little awareness of. In this report, we introduce a card-based game prototype developed with the goal of bridging this gap and exploring how people conceive of existing care robots as a part of their daily lives. Based on the trial game runs, we conclude that a game-based approach is effective as a device to inform participants in a playful setting about existing care robots and to elicit conversations about the role such robots could play in their lives. In the future, we plan to adapt the prototype into a design game for developing novel use cases for care robots.