{"title":"Facets of Trust and Distrust in Collaborative Robots at the Workplace: Towards a Multidimensional and Relational Conceptualisation","authors":"Tobias Kopp","doi":"10.1007/s12369-023-01082-1","DOIUrl":null,"url":null,"abstract":"<p>The relevance of trust on the road to successful human-robot interaction is widely acknowledged. Thereby, trust is commonly understood as a monolithic concept characterising dyadic relations between a human and a robot. However, this conceptualisation seems oversimplified and neglects the specific interaction context. In a multidisciplinary approach, this conceptual analysis synthesizes sociological notions of trust and distrust, psychological trust models, and ideas of philosophers of technology in order to pave the way for a multidimensional, relational and context-sensitive conceptualisation of human-robot trust and distrust. In this vein, trust is characterised functionally as a mechanism to cope with environmental complexity when dealing with ambiguously perceived hybrid robots such as collaborative robots, which enable human-robot interactions without physical separation in the workplace context. Common definitions of trust in the HRI context emphasise that trust is based on concrete expectations regarding individual goals. Therefore, I propose a three-dimensional notion of trust that binds trust to a reference object and accounts for various coexisting goals at the workplace. Furthermore, the assumption that robots represent trustees in a narrower sense is challenged by unfolding influential relational networks of trust within the organisational context. In terms of practical implications, trust is distinguished from acceptance and actual technology usage, which may be promoted by trust, but are strongly influenced by contextual moderating factors. In addition, theoretical arguments for considering distrust not only as the opposite of trust, but as an alternative and coexisting complexity reduction mechanism are outlined. Finally, the article presents key conclusions and future research avenues.</p>","PeriodicalId":14361,"journal":{"name":"International Journal of Social Robotics","volume":"23 1","pages":""},"PeriodicalIF":3.8000,"publicationDate":"2024-01-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Social Robotics","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1007/s12369-023-01082-1","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ROBOTICS","Score":null,"Total":0}
Abstract
The relevance of trust on the road to successful human-robot interaction is widely acknowledged. In this context, trust is commonly understood as a monolithic concept characterising the dyadic relation between a human and a robot. However, this conceptualisation seems oversimplified and neglects the specific interaction context. In a multidisciplinary approach, this conceptual analysis synthesises sociological notions of trust and distrust, psychological trust models, and ideas of philosophers of technology in order to pave the way for a multidimensional, relational and context-sensitive conceptualisation of human-robot trust and distrust. In this vein, trust is characterised functionally as a mechanism for coping with environmental complexity when dealing with ambiguously perceived hybrid robots such as collaborative robots, which enable human-robot interaction without physical separation in the workplace context. Common definitions of trust in the HRI context emphasise that trust is based on concrete expectations regarding individual goals. Therefore, I propose a three-dimensional notion of trust that binds trust to a reference object and accounts for various coexisting goals at the workplace. Furthermore, the assumption that robots represent trustees in a narrower sense is challenged by unfolding the influential relational networks of trust within the organisational context. In terms of practical implications, trust is distinguished from acceptance and actual technology usage, which may be promoted by trust but are strongly influenced by contextual moderating factors. In addition, theoretical arguments are outlined for considering distrust not merely as the opposite of trust, but as an alternative and coexisting complexity-reduction mechanism. Finally, the article presents key conclusions and future research avenues.
Journal Introduction
Social Robotics is the study of robots that are able to interact and communicate among themselves, with humans, and with the environment, within the social and cultural structure attached to their roles. The journal covers a broad spectrum of topics related to the latest technologies, new research results and developments in the area of social robotics at all levels, from developments in core enabling technologies to system integration, aesthetic design, applications and social implications. It provides a platform for like-minded researchers to present their findings and latest developments in social robotics, covering relevant advances in engineering, computing, the arts and the social sciences.
The journal publishes original, peer-reviewed articles and contributions on innovative ideas and concepts, new discoveries and improvements, and novel applications by leading researchers and developers. It covers the latest fundamental advances in the core technologies that form the backbone of social robotics, distinguished developmental projects in the area, and seminal works in aesthetic design, ethics and philosophy, as well as studies on social impact and influence pertaining to social robotics.