Pub Date: 2019-03-01 | DOI: 10.1145/3371382.3374850
"Virtual, Augmented, and Mixed Reality for Human-Robot Interaction (VAM-HRI)" (pp. 671-672)
T. Williams, D. Szafir, T. Chakraborti, H. B. Amor
The 2nd International Workshop on Virtual, Augmented, and Mixed Reality for Human-Robot Interactions (VAM-HRI) will bring together HRI, Robotics, and Mixed Reality researchers to identify challenges in mixed reality interactions between humans and robots. Topics relevant to the workshop include the development of robots that can interact with humans in mixed reality, the use of virtual reality for developing interactive robots, the design of new augmented reality interfaces that mediate communication between humans and robots, comparisons of the capabilities and perceptions of robots and virtual agents, and best design practices. VAM-HRI was held for the first time at HRI 2018, where it was the first workshop of its kind at an academic AI or Robotics conference and served as a timely call to arms to the academic community in response to the growing promise of this emerging field. VAM-HRI 2019 builds on the success of VAM-HRI 2018 and presents new opportunities for expanding this nascent research community. Website: http://vam-hri.xyz/
{"title":"Virtual, Augmented, and Mixed Reality for Human-Robot Interaction (VAM-HRI)","authors":"T. Williams, D. Szafir, T. Chakraborti, H. B. Amor","doi":"10.1145/3371382.3374850","DOIUrl":"https://doi.org/10.1145/3371382.3374850","url":null,"abstract":"The 2 nd International Workshop on Virtual, Augmented, and Mixed Reality for Human-Robot Interactions (VAM-HRI) will bring together HRI, Robotics, and Mixed Reality researchers to identify challenges in mixed reality interactions between humans and robots. Topics relevant to the workshop include development of robots that can interact with humans in mixed reality, use of virtual reality for developing interactive robots, the design of new augmented reality interfaces that mediate communication between humans and robots, comparisons of the capabilities and perceptions of robots and virtual agents, and best design practices. VAM-HRI was held for the first time at HRI 2018, where it served as the first workshop of its kind at an academic AI or Robotics conference, and served as a timely call to arms to the academic community in response to the growing promise of this emerging field. VAM-HRI 2019 will follow on the success of VAM-HRI 2018, and present new opportunities for expanding this nascent research community. Website http://vam-hri.xyz/","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"20 1","pages":"671-672"},"PeriodicalIF":0.0,"publicationDate":"2019-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"82813143","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2019-03-01 | DOI: 10.1109/HRI.2019.8673287
"Learning from Corrective Demonstrations" (pp. 712-714)
R. Gutierrez, Elaine Schaertl Short, S. Niekum, A. Thomaz
Robots deployed in human environments will inevitably encounter unmodeled scenarios which are likely to result in execution failures. To address this issue, we would like to allow co-present naive users to correct and improve the robot's behavior as these edge cases are encountered over time.
{"title":"Learning from Corrective Demonstrations","authors":"R. Gutierrez, Elaine Schaertl Short, S. Niekum, A. Thomaz","doi":"10.1109/HRI.2019.8673287","DOIUrl":"https://doi.org/10.1109/HRI.2019.8673287","url":null,"abstract":"Robots deployed in human environments will inevitably encounter unmodeled scenarios which are likely to result in execution failures. To address this issue, we would like to allow co-present naive users to correct and improve the robot's behavior as these edge cases are encountered over time.","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"28 1","pages":"712-714"},"PeriodicalIF":0.0,"publicationDate":"2019-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"83604735","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2019-03-01 | DOI: 10.1109/HRI.2019.8673328
"Supplementary Material for Characterizing Input Methods for Human-to-Robot Demonstrations" (pp. 1-3)
Pragathi Praveena, G. Subramani, Bilge Mutlu, Michael Gleicher
In this section, we discuss some extensions of Section III and expand on the limitations mentioned in Section VI of the main article.
{"title":"Supplementary Material for Characterizing Input Methods for Human-to-Robot Demonstrations","authors":"Pragathi Praveena, G. Subramani, Bilge Mutlu, Michael Gleicher","doi":"10.1109/HRI.2019.8673328","DOIUrl":"https://doi.org/10.1109/HRI.2019.8673328","url":null,"abstract":"In this section, we discuss some extensions of Section III and expand on the limitations mentioned in Section VI of the main article.","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"53 1","pages":"1-3"},"PeriodicalIF":0.0,"publicationDate":"2019-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"83609994","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2019-03-01 | DOI: 10.1109/HRI.2019.8673136
"Analysis of User Interaction While Walking on Slopes Using Robotic Rollators – Pilot Study" (pp. 662-663)
Sungyong Lee, Suncheol Kwon
Robotic rollators (motorized walkers) require appropriate assist performance to improve the quality of life of elderly and disabled users. In this pilot study, the performance of commercial robotic rollators was evaluated by analyzing the effect of load condition on muscle activity and on knee and hip joint angles while walking on a slope. Results showed that the highest muscle activity occurred under a 5 kg load, but there was no difference in joint angles across load conditions. In future studies, we plan to test performance with a larger number of subjects. In addition, we will perform motion and muscle activity analyses across changes in assist, brake, and speed levels.
{"title":"Analysis of User Interaction While Walking on Slopes Using Robotic Rollators – Pilot Study","authors":"Sungyong Lee, Suncheol Kwon","doi":"10.1109/HRI.2019.8673136","DOIUrl":"https://doi.org/10.1109/HRI.2019.8673136","url":null,"abstract":"Robotic rollators (walkers with motor) require appropriate assist performance to improve the quality of life of the elderly and disabled. In this pilot study, the performance of commercial robotic rollators was evaluated by analyzing the effect of load condition on muscle activities and knee and hip joint angle when walking on a slope. Results showed that, the highest muscle activity occurred under a load of 5 kg but there was no difference in joint angle according to load condition. In future studies, we plan to test the performance with many subjects. In addition, we will perform motion analysis and muscle activity analysis according to changes in assist, brake, and speed levels.","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"28 1","pages":"662-663"},"PeriodicalIF":0.0,"publicationDate":"2019-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"83852935","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2019-03-01 | DOI: 10.1109/HRI.2019.8673095
"Measurement of Moral Concern for Robots" (pp. 540-541)
T. Nomura, T. Kanda, Sachie Yamada
We developed a self-report measure, the Moral Concern for Robots Scale (MCRS), which measures whether people believe that a robot has moral standing, deserves moral care, and merits protection. The results of an online survey ($N=200$) confirmed the concurrent and predictive validity of the scale: scale scores successfully predicted people's intentions for prosocial behaviors.
{"title":"Measurement of Moral Concern for Robots","authors":"T. Nomura, T. Kanda, Sachie Yamada","doi":"10.1109/HRI.2019.8673095","DOIUrl":"https://doi.org/10.1109/HRI.2019.8673095","url":null,"abstract":"We developed a self-report measurement, Moral Concern for Robots Scale (MCRS), which measures whether people believe that a robot has moral standing, deserves moral care, and merits protection. The results of an online survey ($pmb{N}= 200$) confirmed the concurrent validity and predictive validity of the scale in the sense that the scale scores are successfully used to predict people's intentions for prosocial behaviors.","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"14 1","pages":"540-541"},"PeriodicalIF":0.0,"publicationDate":"2019-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"89841895","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2019-03-01 | DOI: 10.1109/HRI.2019.8673100
"Remote Supervision of an Unmanned Surface Vessel - A Comparison of Interfaces" (pp. 546-547)
M. Lager, E. A. Topp, J. Malec
We compared three graphical user interfaces (GUIs) that we designed and implemented to enable human supervision of an unmanned ship. Our findings indicate that a 3D GUI, presented either on a screen or in a Virtual Reality (VR) setting, provides several objective and subjective benefits over a Baseline GUI representing traditional tools.
{"title":"Remote Supervision of an Unmanned Surface Vessel - A Comparison of Interfaces","authors":"M. Lager, E. A. Topp, J. Malec","doi":"10.1109/HRI.2019.8673100","DOIUrl":"https://doi.org/10.1109/HRI.2019.8673100","url":null,"abstract":"We compared three different Graphical User Interfaces (GUI) that we have designed and implemented to enable human supervision of an unmanned ship. Our findings indicate that a 3D GUI presented either on a screen or in a Virtual Reality (VR) setting provides several objective and subjective benefits compared to a Baseline GUI representing traditional tools.","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"40 2","pages":"546-547"},"PeriodicalIF":0.0,"publicationDate":"2019-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91472870","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2019-03-01 | DOI: 10.1109/HRI.2019.8673177
"Collaborative User Responses in Multiparty Interaction with a Couples Counselor Robot" (pp. 294-303)
Dina Utami, T. Bickmore
Intimate relationships are integral parts of human societies, yet many relationships are in distress. Couples counseling has been shown to be effective in preventing and alleviating relationship distress, yet many couples do not seek professional help due to cost, logistics, and discomfort in disclosing private problems. In this paper, we describe our efforts toward the development of a fully automated couples counselor robot, focusing specifically on the problem of identifying and processing “collaborative responses”, in which a human couple co-constructs a response to a query from the robot. We present an analysis of collaborative responses obtained from a pilot study, then develop a data-driven model that detects the end of a collaborative response in order to regulate turn-taking during a counseling session. Our model uses a combination of multimodal features and achieves an offline weighted F-score of 0.81. Finally, we present findings from a quasi-experimental study in which a robot facilitated a counseling session to promote intimacy with romantic couples. Our findings suggest that the session improves couples' intimacy and positive affect. An online evaluation of the end-of-collaborative-response model yields an F-score of 0.72.
{"title":"Collaborative User Responses in Multiparty Interaction with a Couples Counselor Robot","authors":"Dina Utami, T. Bickmore","doi":"10.1109/HRI.2019.8673177","DOIUrl":"https://doi.org/10.1109/HRI.2019.8673177","url":null,"abstract":"Intimate relationships are integral parts of human societies, yet many relationships are in distress. Couples counseling has been shown to be effective in preventing and alleviating relationship distress, yet many couples do not seek professional help, due to cost, logistic, and discomfort in disclosing private problems. In this paper, we describe our efforts towards the development a fully automated couples counselor robot, and focus specifically on the problem of identifying and processing “collaborative responses”, in which a human couple co-construct a response to a query from the robot. We present an analysis of collaborative responses obtained from a pilot study, then develop a data-driven model to detect end of collaborative responses for regulating turn taking during a counseling session. Our model uses a combination of multimodal features, and achieves an offline weighted F-score of 0.81. Finally, we present findings from a quasi-experimental study with a robot facilitating a counseling session to promote intimacy with romantic couples. Our findings suggest that the session improves couples intimacy and positive affect. An online evaluation of the end-of-collaborative-response model demonstrates an F-score of 0.72.","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"22 1","pages":"294-303"},"PeriodicalIF":0.0,"publicationDate":"2019-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"83924172","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2019-03-01 | DOI: 10.1109/hri.2019.8673267
"Heterogeneous Learning from Demonstration" (pp. 730-732)
Rohan R. Paleja, M. Gombolay
The development of human-robot systems able to leverage the strengths of both humans and their robotic counterparts has been greatly sought after because of its foreseen, broad-ranging impact across industry and research. We believe the true potential of these systems cannot be reached unless the robot is able to act with a high level of autonomy, reducing the burden of manual tasking or teleoperation. To achieve this level of autonomy, robots must be able to work fluidly with their human partners, inferring their needs without explicit commands. This inference requires the robot to detect and classify the heterogeneity of its partners. We propose a framework for learning from heterogeneous demonstrations based upon Bayesian inference and evaluate a suite of approaches on a real-world dataset of StarCraft II gameplay. This evaluation provides evidence that our Bayesian approach can outperform conventional methods by up to 12.8%.
{"title":"Heterogeneous Learning from Demonstration","authors":"Rohan R. Paleja, M. Gombolay","doi":"10.1109/hri.2019.8673267","DOIUrl":"https://doi.org/10.1109/hri.2019.8673267","url":null,"abstract":"The development of human-robot systems able to leverage the strengths of both humans and their robotic counterparts has been greatly sought after because of the foreseen, broad-ranging impact across industry and research. We believe the true potential of these systems cannot be reached unless the robot is able to act with a high level of autonomy, reducing the burden of manual tasking or teleoperation. To achieve this level of autonomy, robots must be able to work fluidly with its human partners, inferring their needs without explicit commands. This inference requires the robot to be able to detect and classify the heterogeneity of its partners. We propose a framework for learning from heterogeneous demonstration based upon Bayesian inference and evaluate a suite of approaches on a real-world dataset of gameplay from StarCraft II. This evaluation provides evidence that our Bayesian approach can outperform conventional methods by up to 12.8%.","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"1 1","pages":"730-732"},"PeriodicalIF":0.0,"publicationDate":"2019-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"88671787","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2019-03-01 | DOI: 10.1109/HRI.2019.8673116
"Comparing Human-Robot Proxemics Between Virtual Reality and the Real World" (pp. 431-439)
Rui Li, Marc van Almkerk, S. V. Waveren, E. Carter, Iolanda Leite
Virtual Reality (VR) can greatly benefit Human-Robot Interaction (HRI) as a tool for iterating quickly across robot designs. However, system limitations of VR could influence the results such that they do not fully reflect real-life encounters with robots. To better deploy VR in HRI, we need a basic understanding of the differences between HRI studies run in the real world and in VR. This paper investigates the differences between real life and VR with a focus on proxemic preferences, while also exploring the effects of visual familiarity and spatial sound within the VR experience. Results suggested that people prefer closer interaction distances with a real, physical robot than with a virtual robot in VR. Additionally, the virtual robot was perceived as more discomforting than the real robot, which could account for the differences in proxemics. Overall, these results indicate that perceptions of the robot have to be evaluated before the interaction can be studied. However, the results also suggested that VR settings with different visual familiarities affect HRI proxemics and virtual-robot perceptions consistently, indicating the freedom to study HRI in various VR scenarios. The effect of spatial sound in VR drew a more complex picture and thus calls for more in-depth research into its influence on HRI in VR.
{"title":"Comparing Human-Robot Proxemics Between Virtual Reality and the Real World","authors":"Rui Li, Marc van Almkerk, S. V. Waveren, E. Carter, Iolanda Leite","doi":"10.1109/HRI.2019.8673116","DOIUrl":"https://doi.org/10.1109/HRI.2019.8673116","url":null,"abstract":"Virtual Reality (VR) can greatly benefit Human-Robot Interaction (HRI) as a tool to effectively iterate across robot designs. However, possible system limitations of VR could influence the results such that they do not fully reflect real-life encounters with robots. In order to better deploy VR in HRI, we need to establish a basic understanding of what the differences are between HRI studies in the real world and in VR. This paper investigates the differences between the real life and VR with a focus on proxemic preferences, in combination with exploring the effects of visual familiarity and spatial sound within the VR experience. Results suggested that people prefer closer interaction distances with a real, physical robot than with a virtual robot in VR. Additionally, the virtual robot was perceived as more discomforting than the real robot, which could result in the differences in proxemics. Overall, these results indicate that the perception of the robot has to be evaluated before the interaction can be studied. However, the results also suggested that VR settings with different visual familiarities are consistent with each other in how they affect HRI proxemics and virtual robot perceptions, indicating the freedom to study HRI in various scenarios in VR. The effect of spatial sound in VR drew a more complex picture and thus calls for more in-depth research to understand its influence on HRI in VR.","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"102 6","pages":"431-439"},"PeriodicalIF":0.0,"publicationDate":"2019-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91473930","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Pub Date: 2019-03-01 | DOI: 10.1109/HRI.2019.8673088
"Children's Responding to Humanlike Agents Reflects an Uncanny Valley" (pp. 506-515)
M. Strait, Heather L. Urry, P. Muentener
Both perceptual mechanisms (e.g., threat detection/avoidance) and social mechanisms (e.g., fears fostered via negative media) may explain the existence of the uncanny valley; however, existing literature lacks sufficient evidence to decide whether one, the other, or a combination best accounts for the valley's effects. As perceptually oriented explanations imply the valley should be evident early in development, we investigated whether it presents in the responding of children ($N=80$; ages 5–10) to agents of varying human similarity. We found that, like adults, children were most averse to highly humanlike robots (relative to less humanlike robots and humans). But, unlike adults, children's aversion did not translate to avoidance. The findings thus indicate, consistent with perceptual explanations, that the valley effect manifests well before adulthood. However, further research is needed to understand the emergence of the valley's behavioral consequences.
{"title":"Children's Responding to Humanlike Agents Reflects an Uncanny Valley","authors":"M. Strait, Heather L. Urry, P. Muentener","doi":"10.1109/HRI.2019.8673088","DOIUrl":"https://doi.org/10.1109/HRI.2019.8673088","url":null,"abstract":"Both perceptual mechanisms (e.g., threat detection/avoidance) and social mechanisms (e.g., fears fostered via negative media) may explain the existence of the uncanny valley; however, existing literature lacks sufficient evidence to decide whether one, the other, or a combination best accounts for the valley's effects. As perceptually oriented explanations imply the valley should be evident early in development, we investigated whether it presents in the responding of children ($N=80$; ages 5–10) to agents of varying human similarity. We found that, like adults, children were most averse to highly humanlike robots (relative to less humanlike robots and humans). But, unlike adults, children's aversion did not translate to avoidance. The findings thus indicate, consistent with perceptual explanations, that the valley effect manifests well before adulthood. However, further research is needed to understand the emergence of the valley's behavioral consequences.","PeriodicalId":6600,"journal":{"name":"2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI)","volume":"41 1","pages":"506-515"},"PeriodicalIF":0.0,"publicationDate":"2019-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"81872402","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}