Christopher Deligianis, C. Stanton, C. McGarty, C. Stevens
Journal of Human-Robot Interaction, published 2017-12-01. DOI: 10.5898/JHRI.6.3.Deligianis. Cited 22 times.
The impact of intergroup bias on trust and approach behaviour towards a humanoid robot
As robots become commonplace, successful human-robot interaction will require that people trust them. Two experiments were conducted using the "minimal group paradigm" to explore whether social identity theory influences trust formation and impressions of a robot. In Experiment 1, participants were allocated to either a "robot" or a "computer" group, and then played a cooperative visual tracking game with an Aldebaran Nao humanoid robot as a partner. We hypothesised that participants in the "robot group" would demonstrate intergroup bias by sitting closer to the robot (proxemics) and trusting the robot's suggested answers more frequently than their "computer group" counterparts. Experiment 2 used an almost identical procedure with a different set of participants; however, all participants were assigned to the "robot group", and three different levels of anthropomorphic robot movement were manipulated. Our results suggest that intergroup bias and humanlike movement can significantly affect human-robot approach behaviour. Significant effects were found for trusting the robot's suggested answers with respect to task difficulty, but not for group membership or robot movement.