Title: The majority of fact-checking labels in the United States are intense and this decreases engagement intention
Authors: Haoning Xue, Jingwen Zhang, Cuihua Shen, Magdalena Wojcieszak
Journal: Human Communication Research (JCR Q1, Communication; Impact Factor 4.4)
DOI: 10.1093/hcr/hqae007
Publication date: 2024-05-14
Publication type: Journal Article
Cited by: 0
Abstract
Fact-checking labels have been widely accepted as an effective misinformation correction method. However, there is limited theoretical understanding of fact-checking labels’ impact. This study theorizes that language intensity influences fact-checking label processing and tests this idea through a multi-method design. We first rely on a large-scale observational dataset of fact-checking labels from 7 U.S. fact-checking organizations (N = 33,755) to examine the labels’ language intensity, and then use a controlled online experiment in the United States (N = 656) to systematically test the causal effects of fact-checking label intensity (low, moderate, or high) and fact-checking source (professional journalists or artificial intelligence) on the perceived credibility of, and the intention to engage with, fact-checking messages. We found that two-thirds of existing labels were intense. Such high-intensity labels had null effects on messages’ perceived credibility, yet decreased engagement intention, especially when labels were attributed to AI. Using more intense labels may not be an effective fact-checking approach.
Journal overview:
Human Communication Research is one of the official journals of the prestigious International Communication Association and concentrates on presenting the best empirical work in the area of human communication. It is a top-ranked communication studies journal and one of the top ten journals in the field of human communication. Major topic areas for the journal include language and social interaction, nonverbal communication, interpersonal communication, organizational communication and new technologies, mass communication, health communication, intercultural communication, and developmental issues in communication.