Interactive crowdsourcing to fact-check politicians.

Journal of Experimental Psychology: Applied · Impact Factor 2.7 · JCR Q2 (Psychology, Applied) · CAS Tier 3 (Psychology) · Pub Date: 2024-03-01 · Epub Date: 2023-08-31 · DOI: 10.1037/xap0000492
Santos Espina Mairal, Florencia Bustos, Guillermo Solovey, Joaquín Navajas
Citations: 0

Abstract

The discourse of political leaders often contains false information that can mislead the public. Fact-checking agencies around the world try to reduce the negative influence of politicians by verifying their words. However, these agencies face a problem of scalability and require innovative solutions to deal with their growing workload. While previous studies have shown that crowdsourcing is a promising approach to fact-check news at scale, it remains unclear whether crowdsourced judgements are useful for verifying the speech of politicians. This article fills that gap by studying the effect of social influence on the accuracy of collective judgements about the veracity of political speech. We performed two experiments (Study 1: N = 180; Study 2: N = 240) in which participants judged the veracity of 20 politically balanced phrases, were then exposed to social information from politically homogeneous or heterogeneous participants, and finally provided revised individual judgements. We found that only heterogeneous social influence increased participants' accuracy relative to a control condition. Overall, our results uncover the effect of social influence on the accuracy of collective judgements about the veracity of political speech and show how interactive crowdsourcing strategies can help fact-checking agencies. (PsycInfo Database Record (c) 2024 APA, all rights reserved).
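The interactive procedure described above (initial individual judgements, exposure to the group's aggregate opinion, then revised judgements scored collectively) can be illustrated with a toy simulation. This is a hypothetical sketch, not the authors' analysis code: the participant accuracy, the revision rule, and all parameter values are illustrative assumptions.

```python
import random

random.seed(0)

N_PARTICIPANTS = 180   # matches the sample size of Study 1
N_STATEMENTS = 20      # politically balanced phrases, as in the paper

def initial_judgements(truth, accuracy=0.6):
    # Each participant labels each statement true (1) or false (0);
    # a fixed per-judgement accuracy is an illustrative assumption.
    return [[t if random.random() < accuracy else 1 - t for t in truth]
            for _ in range(N_PARTICIPANTS)]

def revise(judgements, weight=0.5):
    # Hypothetical social-influence rule: after seeing the share of the
    # group that judged a statement true, a dissenting participant
    # switches to the majority with probability `weight`.
    n = len(judgements)
    shares = [sum(r[j] for r in judgements) / n
              for j in range(len(judgements[0]))]
    revised = []
    for row in judgements:
        new_row = []
        for j, v in enumerate(row):
            majority = 1 if shares[j] >= 0.5 else 0
            if v != majority and random.random() < weight:
                v = majority
            new_row.append(v)
        revised.append(new_row)
    return revised

def collective_accuracy(judgements, truth):
    # Aggregate by majority vote per statement, score against ground truth.
    n = len(judgements)
    votes = [1 if sum(r[j] for r in judgements) / n >= 0.5 else 0
             for j in range(len(truth))]
    return sum(v == t for v, t in zip(votes, truth)) / len(truth)

truth = [random.randint(0, 1) for _ in range(N_STATEMENTS)]
first = initial_judgements(truth)
second = revise(first)
acc_before = collective_accuracy(first, truth)
acc_after = collective_accuracy(second, truth)
```

Note that this toy model ignores the paper's key manipulation, the political composition (homogeneous vs. heterogeneous) of the group a participant observes; it only sketches the judge-revise-aggregate pipeline.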

Source journal: Journal of Experimental Psychology: Applied
CiteScore: 4.90
Self-citation rate: 3.80%
Articles per year: 110
Journal description: The mission of the Journal of Experimental Psychology: Applied® is to publish original empirical investigations in experimental psychology that bridge practically oriented problems and psychological theory. The journal also publishes research aimed at developing and testing models of cognitive processing or behavior in applied situations, including laboratory and field settings. Occasionally, review articles are considered for publication if they contribute significantly to important topics within applied experimental psychology. Areas of interest include applications of perception, attention, memory, decision making, reasoning, information processing, problem solving, learning, and skill acquisition.
Latest articles in this journal:
- A rate-them-all lineup procedure increases information but reduces discriminability.
- Comparing generating predictions with retrieval practice as learning strategies for primary school children.
- A comparison between numeric confidence ratings and verbal confidence statements.
- Prior knowledge and new learning: An experimental study of domain-specific knowledge.
- Time on task effects during interactive visual search.