When it is ok to give the Robot Less: Children's Fairness Intuitions Towards Robots

Oshrat Ayalon, Hannah Hok, Alex Shaw, Goren Gordon

International Journal of Social Robotics · Published 2023-09-20 · Impact Factor 3.8 · JCR Q2 (Robotics)
DOI: 10.1007/s12369-023-01047-4 (https://doi.org/10.1007/s12369-023-01047-4)
Citations: 0
Abstract
Children develop intuitions about fairness relatively early in development. While we know that children believe other humans care about distributional fairness, considerably less is known about whether they believe other agents, such as robots, do as well. In two experiments (N = 273) we investigated 4- to 9-year-old children's intuitions about whether robots would be as upset about unfair treatment as human children. Children were told about a scenario in which resources were split between a human child and a target recipient; across two conditions, the target recipient was either another child or a robot. The target recipient (child or robot) received less than the other child. Children were then asked to evaluate how fair the distribution was and whether the target recipient would be upset. Experiments 1 and 2 used the same design, but Experiment 2 also included a video demonstrating the robot's mechanistic "robotic" movements. Our results show that children thought it was more fair to share unequally when the disadvantaged recipient was a robot rather than a child (Experiments 1 and 2). Furthermore, children thought that the child would be more upset than the robot (Experiment 2). Finally, we found that this tendency to treat the two conditions differently became stronger with age (Experiment 2). These results suggest that young children treat robots and children similarly in resource allocation tasks, but increasingly differentiate between them with age. Specifically, children evaluate inequality as less unfair when the disadvantaged recipient is a robot, and think that robots will be less upset about inequality.
Journal Introduction
Social Robotics is the study of robots that are able to interact and communicate among themselves, with humans, and with the environment, within the social and cultural structures attached to their roles. The journal covers a broad spectrum of topics related to the latest technologies, new research results, and developments in the area of social robotics at all levels, from advances in core enabling technologies to system integration, aesthetic design, applications, and social implications. It provides a platform for like-minded researchers to present their findings and latest developments in social robotics, covering relevant advances in engineering, computing, the arts, and the social sciences.
The journal publishes original, peer-reviewed articles by leading researchers and developers: contributions on innovative ideas and concepts, new discoveries and improvements, and novel applications; reports on the latest fundamental advances in the core technologies that form the backbone of social robotics; distinguished developmental projects in the area; and seminal works in aesthetic design, ethics and philosophy, and studies on social impact and influence pertaining to social robotics.