Extended artificial intelligence aversion: People deny humanness to artificial intelligence users.

IF 6.4 | Region 1 (Psychology) | Q1 (PSYCHOLOGY, SOCIAL) | Journal of personality and social psychology | Pub Date: 2024-11-11 | DOI: 10.1037/pspi0000480
Jianning Dang, Li Liu
{"title":"扩展的人工智能厌恶:人们否认人工智能用户的人性。","authors":"Jianning Dang, Li Liu","doi":"10.1037/pspi0000480","DOIUrl":null,"url":null,"abstract":"<p><p>Artificial intelligence (AI) tools are often perceived as lacking humanlike qualities, leading to a preference for human experts over AI assistance. Extending prior research on AI aversion, the current research explores the potential aversion toward those using AI to seek advice. Through eight preregistered studies (total <i>N</i> = 2,317) across multiple AI use scenarios, we found that people denied humanness, especially emotional capacity and human nature traits, to AI advice seekers in comparison to human advice seekers (Studies 1-5 and S1-S3). This is because people perceived less similarity between themselves and AI advice seekers (vs. human advice seekers), with a stronger mediating role of perceived similarity among individuals with greater aversion to AI (Studies 2 and S1). Dehumanization of AI advice seekers predicted less behavioral support for (Study 3) and helping intention toward (Studies S2 and S3) them and could be alleviated through anthropomorphism-related interventions, such as perceiving humanlike qualities in AI or utilizing generative AI (Studies 4 and 5). These findings represent an important theoretical step in advancing research on AI aversion and add to the ongoing discussion on the potential adverse outcomes of AI, focusing on AI users. (PsycInfo Database Record (c) 2024 APA, all rights reserved).</p>","PeriodicalId":16691,"journal":{"name":"Journal of personality and social psychology","volume":" ","pages":""},"PeriodicalIF":6.4000,"publicationDate":"2024-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Extended artificial intelligence aversion: People deny humanness to artificial intelligence users.\",\"authors\":\"Jianning Dang, Li Liu\",\"doi\":\"10.1037/pspi0000480\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Artificial intelligence (AI) tools are often perceived as lacking humanlike qualities, leading to a preference for human experts over AI assistance. Extending prior research on AI aversion, the current research explores the potential aversion toward those using AI to seek advice. Through eight preregistered studies (total <i>N</i> = 2,317) across multiple AI use scenarios, we found that people denied humanness, especially emotional capacity and human nature traits, to AI advice seekers in comparison to human advice seekers (Studies 1-5 and S1-S3). This is because people perceived less similarity between themselves and AI advice seekers (vs. human advice seekers), with a stronger mediating role of perceived similarity among individuals with greater aversion to AI (Studies 2 and S1). Dehumanization of AI advice seekers predicted less behavioral support for (Study 3) and helping intention toward (Studies S2 and S3) them and could be alleviated through anthropomorphism-related interventions, such as perceiving humanlike qualities in AI or utilizing generative AI (Studies 4 and 5). These findings represent an important theoretical step in advancing research on AI aversion and add to the ongoing discussion on the potential adverse outcomes of AI, focusing on AI users. 
(PsycInfo Database Record (c) 2024 APA, all rights reserved).</p>\",\"PeriodicalId\":16691,\"journal\":{\"name\":\"Journal of personality and social psychology\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":6.4000,\"publicationDate\":\"2024-11-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of personality and social psychology\",\"FirstCategoryId\":\"102\",\"ListUrlMain\":\"https://doi.org/10.1037/pspi0000480\",\"RegionNum\":1,\"RegionCategory\":\"心理学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"PSYCHOLOGY, SOCIAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of personality and social psychology","FirstCategoryId":"102","ListUrlMain":"https://doi.org/10.1037/pspi0000480","RegionNum":1,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PSYCHOLOGY, SOCIAL","Score":null,"Total":0}
Citations: 0

Abstract

Artificial intelligence (AI) tools are often perceived as lacking humanlike qualities, leading to a preference for human experts over AI assistance. Extending prior research on AI aversion, the current research explores the potential aversion toward those using AI to seek advice. Through eight preregistered studies (total N = 2,317) across multiple AI use scenarios, we found that people denied humanness, especially emotional capacity and human nature traits, to AI advice seekers in comparison to human advice seekers (Studies 1-5 and S1-S3). This is because people perceived less similarity between themselves and AI advice seekers (vs. human advice seekers), with a stronger mediating role of perceived similarity among individuals with greater aversion to AI (Studies 2 and S1). Dehumanization of AI advice seekers predicted less behavioral support for (Study 3) and helping intention toward (Studies S2 and S3) them and could be alleviated through anthropomorphism-related interventions, such as perceiving humanlike qualities in AI or utilizing generative AI (Studies 4 and 5). These findings represent an important theoretical step in advancing research on AI aversion and add to the ongoing discussion on the potential adverse outcomes of AI, focusing on AI users. (PsycInfo Database Record (c) 2024 APA, all rights reserved).

Source journal
CiteScore: 12.70
Self-citation rate: 3.90%
Articles published: 250
Journal description: Journal of personality and social psychology publishes original papers in all areas of personality and social psychology and emphasizes empirical reports, but may include specialized theoretical, methodological, and review papers. Journal of personality and social psychology is divided into three independently edited sections. Attitudes and Social Cognition addresses all aspects of psychology (e.g., attitudes, cognition, emotion, motivation) that take place in significant micro- and macrolevel social contexts.
Latest articles in this journal
Compassionate love and beneficence in the family.
How people (fail to) control the influence of affective stimuli on attitudes.
A contest study to reduce attractiveness-based discrimination in social judgment.
Group information enhances recognition of both learned and unlearned face appearances.
Moderators of test-retest reliability in implicit and explicit attitudes.