1. Radkani S, Landau-Wells M, Saxe R. How rational inference about authority debunking can curtail, sustain, or spread belief polarization. PNAS Nexus 2024; 3:pgae393. PMID: 39411098; PMCID: PMC11475407; DOI: 10.1093/pnasnexus/pgae393.
Abstract
In polarized societies, divided subgroups of people have different perspectives on a range of topics. Aiming to reduce polarization, authorities may use debunking to lend support to one perspective over another. Debunking by authorities gives all observers shared information, which could reduce disagreement. In practice, however, debunking may have no effect or could even contribute to further polarization of beliefs. We developed a cognitively inspired model of observers' rational inferences from an authority's debunking. After observing each debunking attempt, simulated observers simultaneously update their beliefs about the perspective underlying the debunked claims and about the authority's motives, using an intuitive causal model of the authority's decision-making process. We varied the observers' prior beliefs and uncertainty systematically. Simulations generated a range of outcomes, from belief convergence (less common) to persistent divergence (more common). In many simulations, observers who initially held shared beliefs about the authority later acquired polarized beliefs about the authority's biases and commitment to truth. These polarized beliefs constrained the authority's influence on new topics, making it possible for belief polarization to spread. We discuss the implications of the model with respect to beliefs about elections.
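The joint updating described in this abstract can be illustrated with a minimal Bayesian sketch. This is not the authors' actual model; all priors and likelihood values below are illustrative assumptions. Two observers who disagree about a perspective but share moderate trust in an authority watch the same repeated debunking, and each applies Bayes' rule over the joint hypothesis space (is perspective A correct? is the authority truth-committed?):

```python
# Minimal sketch of joint Bayesian updating from observed debunking.
# Illustrative only: priors and likelihood values are assumed for the example.

# Joint hypothesis space: (perspective A is correct?, authority is truth-committed?)
hypotheses = [(a, t) for a in (True, False) for t in (True, False)]

def likelihood_debunk_A(a_correct: bool, truthful: bool) -> float:
    """P(authority debunks a perspective-A claim | hypothesis).
    A truth-committed authority mostly debunks claims it believes are false;
    a biased authority debunks perspective A regardless (assumed values)."""
    if truthful:
        return 0.1 if a_correct else 0.9
    return 0.8  # biased against A whether or not A is correct

def update(prior: dict, obs_likelihood) -> dict:
    """One application of Bayes' rule over the joint hypothesis space."""
    post = {h: prior[h] * obs_likelihood(*h) for h in prior}
    z = sum(post.values())
    return {h: p / z for h, p in post.items()}

def joint_prior(p_a: float, p_truthful: float = 0.7) -> dict:
    """Independent priors over the perspective and the authority's motives."""
    return {(a, t): (p_a if a else 1 - p_a) * (p_truthful if t else 1 - p_truthful)
            for a, t in hypotheses}

# Two observers: opposed priors about perspective A, shared trust in the authority.
for label, p_a in [("pro-A observer", 0.9), ("anti-A observer", 0.1)]:
    belief = joint_prior(p_a)
    for _ in range(5):  # five repeated debunking attempts against A
        belief = update(belief, likelihood_debunk_A)
    p_a_post = sum(p for (a, _t), p in belief.items() if a)
    p_truthful_post = sum(p for (_a, t), p in belief.items() if t)
    print(f"{label}: P(A correct) = {p_a_post:.2f}, "
          f"P(authority truthful) = {p_truthful_post:.2f}")
```

With opposed priors about the perspective (0.9 vs. 0.1) but a shared prior of 0.7 that the authority is truthful, this sketch reproduces the qualitative pattern in the abstract: the observers' perspective beliefs remain divergent, and their initially shared beliefs about the authority split, with the pro-A observer coming to see the authority as biased.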
Affiliation(s)
- Setayesh Radkani
  - Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA 02139, USA
- Marika Landau-Wells
  - Travers Department of Political Science, University of California, Berkeley, CA 94705, USA
- Rebecca Saxe
  - Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA 02139, USA
  - McGovern Institute for Brain Research, Massachusetts Institute of Technology, Cambridge, MA 02139, USA
2. Buczel KA, Siwiak A, Szpitalak M, Polczyk R. How do forewarnings and post-warnings affect misinformation reliance? The impact of warnings on the continued influence effect and belief regression. Mem Cognit 2024; 52:1048-1064. PMID: 38261249; DOI: 10.3758/s13421-024-01520-z.
Abstract
People often continue to rely on certain information in their reasoning even after that information has been retracted; this is known as the continued influence effect (CIE) of misinformation. One technique for reducing this effect is to explicitly warn people that they may have been misled. The present study investigated the effectiveness of such warnings depending on when they were given (before or after the misinformation). In two experiments (N = 337), we found that while a forewarning did reduce reliance on misinformation, retrospectively warned participants (whether the warning was placed between the misinformation and the retraction, or just before testing) relied on the misinformation to a similar degree as unwarned participants. However, the protective effect of the forewarning was not durable: reliance on misinformation increased again over the 7 days following the first test, despite continued memory of the retraction.
Affiliation(s)
- Klara Austeja Buczel
  - Institute of Psychology, Jagiellonian University, Kraków, Poland
  - Doctoral School in the Social Sciences, Jagiellonian University, Kraków, Poland
- Adam Siwiak
  - Institute of Psychology, Jagiellonian University, Kraków, Poland
  - Doctoral School in the Social Sciences, Jagiellonian University, Kraków, Poland
- Romuald Polczyk
  - Institute of Psychology, Jagiellonian University, Kraków, Poland
3. Siebert J, Siebert JU. Enhancing misinformation correction: New variants and a combination of awareness training and counter-speech to mitigate belief perseverance bias. PLoS One 2024; 19:e0299139. PMID: 38363785; PMCID: PMC10871482; DOI: 10.1371/journal.pone.0299139.
Abstract
Belief perseverance bias refers to individuals' tendency to persist in biased opinions even after the misinformation that initially shaped those opinions has been retracted. This study contributes to research on reducing the negative impact of misinformation by mitigating the belief perseverance bias. It explores the previously proposed awareness-training and counter-speech debiasing techniques, developing them further by introducing new variants and by combining them. We investigate their effectiveness in mitigating the belief perseverance bias after the retraction of misinformation related to a real-life issue in an experiment involving N = 876 individuals, of whom 364 exhibited the belief perseverance bias. The effectiveness of the debiasing techniques is assessed by measuring the difference between baseline opinions before exposure to misinformation and opinions after exposure to a debiasing technique. Our study confirmed the effectiveness of both the awareness-training and the counter-speech techniques, finding no discernible differences in effectiveness between the previously proposed variants and the new ones. Moreover, we observed that the combination of awareness training and counter-speech mitigates the belief perseverance bias more effectively than either technique alone.
Affiliation(s)
- Jana Siebert
  - Faculty of Arts, Department of Economic and Managerial Studies, Palacky University Olomouc, Olomouc, Czech Republic
4. Blair RA, Gottlieb J, Nyhan B, Paler L, Argote P, Stainfield CJ. Interventions to counter misinformation: Lessons from the Global North and applications to the Global South. Curr Opin Psychol 2024; 55:101732. PMID: 38070207; DOI: 10.1016/j.copsyc.2023.101732.
Abstract
We synthesize evidence from 176 experimental estimates of 11 interventions intended to combat misinformation in the Global North and Global South, which we classify as informational, educational, sociopsychological, or institutional. Among these, we find the most consistent positive evidence for two informational interventions in both Global North and Global South contexts: inoculation/prebunking and debunking. In a complementary survey of 138 misinformation scholars and practitioners, we find that experts tend to be most optimistic about interventions that have been least widely studied or that have been shown to be mostly ineffective. We provide a searchable database of misinformation randomized controlled trials and suggest avenues for future research to close the gap between expert opinion and academic research.
Affiliation(s)
- Robert A Blair
  - Department of Political Science and Watson Institute for International and Public Affairs, Brown University, United States
- Jessica Gottlieb
  - Hobby School of Public Affairs, University of Houston, United States
- Brendan Nyhan
  - Department of Government, Dartmouth College, United States
- Laura Paler
  - Department of Government, School of Public Affairs, American University, United States
- Pablo Argote
  - Department of Political Science and International Relations, University of Southern California, United States
5. Prike T, Blackley P, Swire-Thompson B, Ecker UKH. Examining the replicability of backfire effects after standalone corrections. Cogn Res Princ Implic 2023; 8:39. PMID: 37395864; PMCID: PMC10317933; DOI: 10.1186/s41235-023-00492-z.
Abstract
Corrections are a frequently used and effective tool for countering misinformation. However, concerns have been raised that corrections may introduce false claims to new audiences when the misinformation is novel. This is because boosting the familiarity of a claim can increase belief in that claim, and thus exposing new audiences to novel misinformation, even as part of a correction, may inadvertently increase misinformation belief. Such an outcome can be conceptualized as a familiarity backfire effect, whereby a familiarity boost increases false-claim endorsement above a control-condition or pre-correction baseline. Here, we examined whether standalone corrections, that is, corrections presented without initial misinformation exposure, can backfire and increase participants' reliance on the misinformation in their subsequent inferential reasoning, relative to a no-misinformation, no-correction control condition. Across three experiments (total N = 1156), we found that standalone corrections did not backfire immediately (Experiment 1) or after a one-week delay (Experiment 2). However, there was mixed evidence suggesting that corrections may backfire when there is skepticism regarding the correction (Experiment 3). Specifically, in Experiment 3 the standalone correction backfired in open-ended responses, but only when there was skepticism towards the correction, and this did not replicate with the rating-scale measure. Future research should further examine whether skepticism towards the correction is a replicable mechanism by which backfire effects occur.
Affiliation(s)
- Toby Prike
  - School of Psychological Science, University of Western Australia, Perth, Australia
- Phoebe Blackley
  - School of Psychological Science, University of Western Australia, Perth, Australia
- Briony Swire-Thompson
  - Network Science Institute, Northeastern University, Boston, USA
  - Institute of Quantitative Social Science, Harvard University, Cambridge, USA
- Ullrich K H Ecker
  - School of Psychological Science, University of Western Australia, Perth, Australia
  - Public Policy Institute, University of Western Australia, Perth, Australia
6. Siebert J, Siebert JU. Effective mitigation of the belief perseverance bias after the retraction of misinformation: Awareness training and counter-speech. PLoS One 2023; 18:e0282202. PMID: 36888583; PMCID: PMC9994702; DOI: 10.1371/journal.pone.0282202.
Abstract
The spread and influence of misinformation have become a matter of societal concern, as misinformation can negatively impact individuals' beliefs, opinions and, consequently, decisions. Research has shown that individuals persevere in their biased beliefs and opinions even after the retraction of misinformation, a phenomenon known as the belief perseverance bias. However, research on mitigating this bias after the retraction of misinformation has been limited: only a few debiasing techniques with limited practical applicability have been proposed, and comparisons of techniques in terms of effectiveness have been scarce. This paper contributes to that research by proposing counter-speech (CS) and awareness-training (AT) techniques and comparing their effectiveness to the existing counter-explanation (CE) technique in an experiment with N = 251 participants. To determine changes in opinions, the extent of the belief perseverance bias, and the effectiveness of the debiasing techniques, we measured participants' opinions four times during the experiment using Likert items and phi-coefficient measures. The effectiveness of each debiasing technique is assessed by measuring the difference between baseline opinions before exposure to misinformation and opinions after exposure to the technique. We further discuss the effort required of the providers and recipients of debiasing and the practical applicability of the techniques. The CS technique, with a very large effect size, is the most effective of the three. The CE and AT techniques, with medium effect sizes, are close to equivalent in effectiveness. The CS and AT techniques demand less cognitive and time effort from the recipients of debiasing than the CE technique, while the AT and CE techniques require less effort from the providers of debiasing than the CS technique.
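The effectiveness measure described in this abstract, the distance between baseline opinions and post-debiasing opinions, can be sketched numerically. The Likert data and the pooled-SD Cohen's d computation below are illustrative assumptions, not the paper's dataset or exact analysis:

```python
import statistics

def cohens_d(x, y):
    """Cohen's d for two independent samples, using the pooled standard deviation."""
    nx, ny = len(x), len(y)
    sx2, sy2 = statistics.variance(x), statistics.variance(y)
    pooled_sd = (((nx - 1) * sx2 + (ny - 1) * sy2) / (nx + ny - 2)) ** 0.5
    return (statistics.mean(x) - statistics.mean(y)) / pooled_sd

# Hypothetical 7-point Likert opinions for eight participants.
baseline      = [6, 5, 6, 7, 5, 6, 6, 5]  # before exposure to misinformation
after_debias  = [5, 5, 6, 6, 4, 6, 5, 5]  # after retraction + a debiasing technique
after_control = [3, 2, 4, 3, 2, 3, 4, 2]  # after retraction only (bias perseveres)

# Effectiveness: how far opinions remain from baseline in each condition.
residual_debias  = [abs(b - a) for b, a in zip(baseline, after_debias)]
residual_control = [abs(b - a) for b, a in zip(baseline, after_control)]

# A large positive d means debiasing pulled opinions back toward baseline.
print(f"Cohen's d = {cohens_d(residual_control, residual_debias):.2f}")
```

Under the conventional thresholds the abstract invokes (medium ≈ 0.5, large ≈ 0.8), a technique whose residual distance from baseline is much smaller than the control's yields a very large d, as in this hypothetical sample.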
Affiliation(s)
- Jana Siebert
  - Department of Applied Economics, Faculty of Arts, Palacky University Olomouc, Olomouc, Czech Republic
- Johannes Ulrich Siebert
  - Department of Business and Management, Management Center Innsbruck, Innsbruck, Austria