This article discusses the relevance of the motion behavior and adaptation of a collaborative robot for human-robot cooperation. Two experiments on cooperative assembly are presented. First, a human-human experiment with defined test conditions identifies distance, nearest body part, and predictability as significant factors. Second, a human-robot experiment shows that fixed trajectories and conservative dynamic parameters lead to a quick gain in the participants' confidence. In addition, the data show that a realistic use case with complex tasks is key to evaluating the impact of motion parameters.
{"title":"Motion Analysis of Human-Human and Human-Robot Cooperation During Industrial Assembly Tasks","authors":"J. Höcherl, B. Wrede, T. Schlegl","doi":"10.1145/3125739.3132615","DOIUrl":"https://doi.org/10.1145/3125739.3132615","url":null,"abstract":"This article discusses the relevance of the motion behavior and adaptation of a collaborative robot for human-robot cooperation. Two experiments on cooperative assembly are shown. First, a human-human experiment with defined test conditions evaluates the aspects of distance, nearest body part, and predictability as significant. Second, a human-robot experiment shows that fixed trajectories and conservative dynamic parameters lead to a quick gain of confidence of the participants. Besides, the data shows that a realistic use case with complex tasks is key to evaluate the impact of motion parameters.","PeriodicalId":346669,"journal":{"name":"Proceedings of the 5th International Conference on Human Agent Interaction","volume":"56 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122924578","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Jiong Sun, S. Redyuk, E. Billing, D. Högberg, Paul E. Hemeren
This paper presents an ongoing study on affective human-robot interaction. In our previous research, touch type was shown to be informative for communicated emotion. Here, a soft matrix array sensor is used to capture the tactile interaction between human and robot, and six machine learning methods, including CNN, RNN, and C3D, are implemented to classify different touch types, constituting a pre-stage to recognizing emotional tactile interaction. Results show an average recognition rate of 95% by C3D for the classified touch types, which provides stable classification results for developing social touch technology.
{"title":"Tactile Interaction and Social Touch: Classifying Human Touch Using a Soft Tactile Sensor","authors":"Jiong Sun, S. Redyuk, E. Billing, D. Högberg, Paul E. Hemeren","doi":"10.1145/3125739.3132614","DOIUrl":"https://doi.org/10.1145/3125739.3132614","url":null,"abstract":"This paper presents an ongoing study on affective human-robot interaction. In our previous research, touch type is shown to be informative for communicated emotion. Here, a soft matrix array sensor is used to capture the tactile interaction between human and robot and 6 machine learning methods including CNN, RNN and C3D are implemented to classify different touch types, constituting a pre-stage to recognizing emotional tactile interaction. Results show an average recognition rate of 95% by C3D for classified touch types, which provide stable classification results for developing social touch technology.","PeriodicalId":346669,"journal":{"name":"Proceedings of the 5th International Conference on Human Agent Interaction","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128679316","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
In this paper, we verified two kinds of two-dimensional mind perception models of humanoid virtual agents and investigated the relationship between the models and the effect of emotional contagion. To verify the two models, we used questionnaires from prior works and our own questionnaire. From these questionnaires, we constructed an "agency"-"experience" model and a "familiarity"-"reality" model via exploratory factor analysis (EFA). These two models are valid for distinguishing humanoid agents and predicting the effect of emotional contagion. The factor scores of "experience" and "familiarity" have a high correlation coefficient with the effect of emotional contagion. This result suggests a method for designing humanoid agents that have a high emotional contagion ability.
{"title":"Two-Dimensional Mind Perception Model of Humanoid Virtual Agent","authors":"T. Matsui, S. Yamada","doi":"10.1145/3125739.3125761","DOIUrl":"https://doi.org/10.1145/3125739.3125761","url":null,"abstract":"In this paper, we verified two kinds of two-dimensional mind perception models of humanoid virtual agents and investigated the relationship between the models and the effect of emotional contagion. To verify the two models, we used questionnaires from prior works and our own questionnaire. From these questionnaires, we constructed an \"agency\"-\"experience\" model and a \"familiarity\"-\"reality\" model via exploratory factor analysis (EFA). These two models are valid for distinguishing humanoid agents and predicting the effect of emotional contagion. The factor scores of \"experience\" and \"familiarity\" have a high correlation coefficient with the effect of emotional contagion. This result suggests a method for designing humanoid agents that have a high emotional contagion ability.","PeriodicalId":346669,"journal":{"name":"Proceedings of the 5th International Conference on Human Agent Interaction","volume":"43 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124117050","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Haruka Kasuga, Daisuke Sakamoto, N. Munekata, T. Ono
Pets have been humanity's oldest friends since ancient times. People have been living with them since then, and the relationship between people and pets as family members at home is well researched. Recently, social robots are entering family lives, and a new research field is being born, namely, the triad relationship between people, pets, and social robots. To investigate how a social robot affects a human-animal relationship in the home, an exploratory field experiment was conducted. In this experiment, a robot, called NAO, was introduced into the homes of 10 families, and 22 participants (with 12 pets: 4 dogs and 8 cats), called 'owners' hereafter, were asked to interact with NAO. NAO was operated under two conditions: speaking positively to the pets, and speaking negatively to them. Only five sentences that NAO spoke to the pets and two sentences spoken to the owners differed between the conditions. The results of the study show that changing NAO's attitude to the pets affected both the owners' impression of the robot and the pets' impression of the robot (as perceived by the owners).
{"title":"A Social Robot in a Human-Animal Relationship at Home: A Field Study","authors":"Haruka Kasuga, Daisuke Sakamoto, N. Munekata, T. Ono","doi":"10.1145/3125739.3125759","DOIUrl":"https://doi.org/10.1145/3125739.3125759","url":null,"abstract":"Pets have been humanity's oldest friends since ancient times. People have been living with them since then, and the relationship between people and pets as family members at home is well researched. Recently, social robots are entering family lives, and a new research field is being born, namely, the triad relationship between people, pets, and social robots. To investigate how a social robot affects a human-animal relationship in the home, an exploratory field experiment was conducted. In this experiment, a robot, called NAO, was introduced into the homes of 10 families, and 22 participants (with 12 pets: 4 dogs and 8 cats), called 'owners' hereafter, were asked to interact with NAO. NAO was operated under two conditions: speaking positively to the pets, and speaking negatively to them. Only five sentences that NAO spoke to the pets and two sentences spoken to the owners differed between the conditions. The results of the study show that changing NAO's attitude to the pets affected both the owners' impression of the robot and the pets' impression of the robot (as perceived by the owners).","PeriodicalId":346669,"journal":{"name":"Proceedings of the 5th International Conference on Human Agent Interaction","volume":"66 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126205797","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Yuki Yamashita, H. Ishihara, Takashi Ikeda, M. Asada
Personality impressions of robots have been regarded as one of the crucial factors in human-robot interaction. To design the personality impressions, we should know how the visual, auditory, and tactile impressions determine the personality impressions. In this study, we investigated the relationships between touch sensations and personality impressions with a child-type android robot in two conditions in which 40 Japanese participants touched a part of the robot with a different facial appearance. Factor and path analyses were conducted on the evaluation scores of the sensations and impressions provided by the participants. As a result, two significant positive causal relationships (p < 0.001) were found between the Preference and Resilience touch sensations, and the Likability and Capability personality impressions, respectively, in both robot conditions. On the other hand, several other causal relationships were found only in one condition. This suggests that there are appearance-dependent and appearance-independent relationships between touch sensations and personality impressions.
{"title":"Appearance of a Robot Influences Causal Relationship between Touch Sensation and the Personality Impression","authors":"Yuki Yamashita, H. Ishihara, Takashi Ikeda, M. Asada","doi":"10.1145/3125739.3132587","DOIUrl":"https://doi.org/10.1145/3125739.3132587","url":null,"abstract":"Personality impressions of robots have been regarded as one of the crucial factors in human-robot interaction. To design the personality impressions, we should know how the visual, auditory, and tactile impressions determine the personality impressions. In this study, we investigated the relationships between touch sensations and personality impressions with a child-type android robot in two conditions in which 40 Japanese participants touched a part of the robot with a different facial appearance. Factor and path analyses were conducted on the evaluation scores of the sensations and impressions provided by the participants. As a result, two significant positive causal relationships (p < 0.001) were found between the Preference and Resilience touch sensations, and the Likability and Capability personality impressions, respectively, in both robot conditions. On the other hand, several other causal relationships were found only in one condition. This suggests that there are appearance-dependent and appearance-independent relationships between touch sensations and personality impressions.","PeriodicalId":346669,"journal":{"name":"Proceedings of the 5th International Conference on Human Agent Interaction","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126578150","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Lilian Schröder, Victoria Buchholz, Victoria Helmich, L. Hindemith, B. Wrede, Lars Schillingmann
Interactive storytelling is a social situation that places a number of demands on a system when realized by an artificial agent. It can be used as a means of teaching or entertainment by a social robot or agent. To be successful, the storytelling has to be interesting and responsive. An agent needs to be aware of the user's state and coordinate the course of the story with the user input. Thus, this is an attractive scenario for the examination of HAI topics by user studies. We implemented an interactive storytelling system using the anthropomorphic robot head Flobi that presents the story with multimodal output and, at the same time, is responsive to multimodal input. The system is designed to serve as a basis for future experiments.
{"title":"A Multimodal Interactive Storytelling Agent Using the Anthropomorphic Robot Head Flobi","authors":"Lilian Schröder, Victoria Buchholz, Victoria Helmich, L. Hindemith, B. Wrede, Lars Schillingmann","doi":"10.1145/3125739.3132600","DOIUrl":"https://doi.org/10.1145/3125739.3132600","url":null,"abstract":"Interactive storytelling is a social situation that places a number of demands on a system when realized by an artificial agent. It can be used as a means of teaching or entertainment by a social robot or agent. To be successful, the storytelling has to be interesting and responsive. An agent needs to be aware of the user's state and coordinate the course of the story with the user input. Thus, this is an attractive scenario for the examination of HAI topics by user studies. We implemented an interactive storytelling system using the anthropomorphic robot head Flobi that presents the story with multimodal output and, at the same time, is responsive to multimodal input. The system is designed to serve as a basis for future experiments.","PeriodicalId":346669,"journal":{"name":"Proceedings of the 5th International Conference on Human Agent Interaction","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125117030","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
K. Karjalainen, Anna Elisabeth Sofia Romell, P. Ratsamee, A. Yantaç, M. Fjeld, M. Obaid
Recent research has focused on how to facilitate interaction between humans and robots, giving rise to the field of human robot interaction. A related research area is human-drone interaction (HDI), investigating how interaction between humans and drones can be expanded in novel and meaningful ways. In this work, we explore the use of drones as companions in a home environment. We present three consecutive studies addressing user requirements and design space of companion drones. Following a user-centered approach, the three stages include online questionnaire, design workshops, and simulated virtual reality (VR) home environment. Our results show that participants preferred the idea of a drone companion at home, particularly for tasks such as fetching items and cleaning. The participants were also positive towards a drone companion that featured anthropomorphic features.
{"title":"Social Drone Companion for the Home Environment: a User-Centric Exploration","authors":"K. Karjalainen, Anna Elisabeth Sofia Romell, P. Ratsamee, A. Yantaç, M. Fjeld, M. Obaid","doi":"10.1145/3125739.3125774","DOIUrl":"https://doi.org/10.1145/3125739.3125774","url":null,"abstract":"Recent research has focused on how to facilitate interaction between humans and robots, giving rise to the field of human robot interaction. A related research area is human-drone interaction (HDI), investigating how interaction between humans and drones can be expanded in novel and meaningful ways. In this work, we explore the use of drones as companions in a home environment. We present three consecutive studies addressing user requirements and design space of companion drones. Following a user-centered approach, the three stages include online questionnaire, design workshops, and simulated virtual reality (VR) home environment. Our results show that participants preferred the idea of a drone companion at home, particularly for tasks such as fetching items and cleaning. The participants were also positive towards a drone companion that featured anthropomorphic features.","PeriodicalId":346669,"journal":{"name":"Proceedings of the 5th International Conference on Human Agent Interaction","volume":"188 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115551968","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
H. Tokushige, Takuji Narumi, Sayaka Ono, Y. Fuwamoto, T. Tanikawa, M. Hirose
As intelligent agents learn to behave increasingly autonomously and simulate a high level of intelligence, human interaction with them will become increasingly unpredictable. Would you accept an unexpected and sometimes irrational but actually correct recommendation by an agent you trust? We performed two experiments in which participants played a game. In this game, the participants chose a path by referring to a recommendation from the agent in one of two experimental conditions: the correct or the faulty condition. After interactions with the agent, the participants received an unexpected recommendation from the agent. The results showed that, while the trust measured by a questionnaire in the correct condition was higher than that in the faulty condition, there was no significant difference in the number of people who accepted the recommendation. Furthermore, trust in the agent made decision time significantly longer when the recommendation was not rational.
{"title":"Trust Lengthens Decision Time on Unexpected Recommendations in Human-agent Interaction","authors":"H. Tokushige, Takuji Narumi, Sayaka Ono, Y. Fuwamoto, T. Tanikawa, M. Hirose","doi":"10.1145/3125739.3125751","DOIUrl":"https://doi.org/10.1145/3125739.3125751","url":null,"abstract":"As intelligent agents learn to behave increasingly autonomously and simulate a high level of intelligence, human interaction with them will be increasingly unpredictable. Would you accept an unexpected and sometimes irrational but actually correct recommendation by an agent you trust? We performed two experiments in which participants played a game. In this game, the participants chose a path by referring to a recommendation from the agent in one of two experimental conditions:the correct or the faulty condition. After interactions with the agent, the participants received an unexpected recommendation by the agent. The results showed that, while the trust measured by a questionnaire in the correct condition was higher than that in the faulty condition, there was no significant difference in the number of people who accepted the recommendation. Furthermore, the trust in the agent made decision time significantly longer when the recommendation was not rational.","PeriodicalId":346669,"journal":{"name":"Proceedings of the 5th International Conference on Human Agent Interaction","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125762092","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Assuming that a mind is a source of variety in behavior is important in a non-zero-sum situation in which cooperators and competitors (free riders) are mixed. In such a context, one should rapidly differentiate between competitors and cooperators and avoid useless battles against competitors. Once an actor's mind (intentions) and current situation have been identified, the strategy of assuming a mind enables one to infer the actor's future behavior even if the situation is different. In the present study, we conducted an experiment to assess whether humans predict the future behavior of an agent (a 1 degree of freedom (DOF) stick) in a different situation by attributing malice or benevolence. We developed a stick-mediated interactive hole that provides minimum modal interaction in a non-zero-sum game situation. Participants were asked to insert as many sticks as they could into the hole within two minutes. The motor behind the hole produced cooperative or obstructive actions. The results show that participants who performed the task with a cooperative hole attributed benevolence and predicted future cooperative behavior in a different task and that participants who performed the task with an obstructive hole attributed malice but did not predict future obstructive behavior.
{"title":"A Study of Good and Evil Using 1 DOF Sticks","authors":"E. Ishikawa, K. Terada","doi":"10.1145/3125739.3132605","DOIUrl":"https://doi.org/10.1145/3125739.3132605","url":null,"abstract":"Assuming that a mind is a source of variety in behavior is important in a non-zero-sum situation in which cooperators and competitors (free riders) are mixed. In such a context, one should rapidly differentiate between competitors and cooperators and avoid useless battles against competitors. Once an actor's mind (intentions) and current situation have been identified, the strategy of assuming a mind enables one to infer the actor's future behavior even if the situation is different. In the present study, we conducted an experiment to assess whether humans predict the future behavior of an agent (a 1 degree of freedom (DOF) stick) in a different situation by attributing malice or benevolence. We developed a stick-mediated interactive hole that provides minimum modal interaction in a non-zero-sum game situation. Participants were asked to insert as many sticks as they could into the hole within two minutes. The motor behind the hole produced cooperative or obstructive actions. The results show that participants who performed the task with a cooperative hole attributed benevolence and predicted future cooperative behavior in a different task and that participants who performed the task with an obstructive hole attributed malice but did not predict future obstructive behavior.","PeriodicalId":346669,"journal":{"name":"Proceedings of the 5th International Conference on Human Agent Interaction","volume":"81 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127190864","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Sam Thellman, Jacob Lundberg, Mattias Arvola, T. Ziemke
Several Wizard-of-Oz techniques have been developed to make robots appear autonomous and more social in human-robot interaction. Many of the existing solutions use control interfaces that introduce significant time delays and hamper the robot operator's ability to produce socially appropriate responses in real-time interactions. We present work in progress on a novel wizard control interface designed to overcome these limitations: a motion tracking-based system that allows the wizard to act as if he or she is the robot. The wizard sees the other through the robot's perspective and uses his or her own bodily movements to control it. We discuss potential applications and extensions of this system, and conclude by discussing possible methodological advantages and disadvantages.
{"title":"What Is It Like to Be a Bot?: Toward More Immediate Wizard-of-Oz Control in Social Human-Robot Interaction","authors":"Sam Thellman, Jacob Lundberg, Mattias Arvola, T. Ziemke","doi":"10.1145/3125739.3132580","DOIUrl":"https://doi.org/10.1145/3125739.3132580","url":null,"abstract":"Several Wizard-of-Oz techniques have been developed to make robots appear autonomous and more social in human-robot interaction. Many of the existing solutions use control interfaces that introduce significant time delays and hamper the robot operator's ability to produce socially appropriate responses in real time interactions. We present work in progress on a novel wizard control interface designed to overcome these limitations:a motion tracking-based system which allows the wizard to act as if he or she is the robot. The wizard sees the other through the robot's perspective, and uses his or her own bodily movements to control it. We discuss potential applications and extensions of this system, and conclude by discussing possible methodological advantages and disadvantages.","PeriodicalId":346669,"journal":{"name":"Proceedings of the 5th International Conference on Human Agent Interaction","volume":"197 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114870251","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}