1. Buczel KA, Siwiak A, Szpitalak M, Polczyk R. How do forewarnings and post-warnings affect misinformation reliance? The impact of warnings on the continued influence effect and belief regression. Mem Cognit 2024. PMID: 38261249. DOI: 10.3758/s13421-024-01520-z.
Abstract
People often continue to rely on certain information in their reasoning even after that information has been retracted; this is known as the continued influence effect (CIE) of misinformation. One technique for reducing this effect is to explicitly warn people that they may have been misled. The present study investigated the effectiveness of such warnings depending on when they were given (before or after the misinformation). In two experiments (N = 337), we found that while a forewarning did reduce reliance on misinformation, retrospectively warned participants (whether the warning was placed between the misinformation and the retraction or just before testing) relied on the misinformation to a similar degree as unwarned participants. However, the protective effect of the forewarning was not durable: reliance on the misinformation increased again over the 7 days following the first test, despite continued memory of the retraction.

Affiliations
- Klara Austeja Buczel: Institute of Psychology, Jagiellonian University, Kraków, Poland; Doctoral School in the Social Sciences, Jagiellonian University, Kraków, Poland.
- Adam Siwiak: Institute of Psychology, Jagiellonian University, Kraków, Poland; Doctoral School in the Social Sciences, Jagiellonian University, Kraków, Poland.
- Romuald Polczyk: Institute of Psychology, Jagiellonian University, Kraków, Poland.

2. Prike T, Ecker UKH. Effective correction of misinformation. Curr Opin Psychol 2023; 54:101712. PMID: 37944323. DOI: 10.1016/j.copsyc.2023.101712.
Abstract
This paper reviews correction effectiveness, highlighting which factors matter, which do not, and where further research is needed. To boost effectiveness, we recommend using detailed corrections and providing an alternative explanation wherever possible. We also recommend providing a reminder of the initial misinformation and repeating the correction. Whether corrections are presented pre-emptively (i.e., prebunking) or after misinformation exposure is unlikely to greatly affect their effectiveness. There is also limited risk in repeating misinformation within a correction, or of a correction inadvertently spreading misinformation to new audiences. Further research is needed into which correction formats are most effective, whether boosting correction memorability can enhance effectiveness, the effectiveness of discrediting a misinformation source, and whether distrusted correction sources can contribute to corrections backfiring.

Affiliations
- Toby Prike: School of Psychological Science, University of Western Australia, Perth, Australia.
- Ullrich K. H. Ecker: School of Psychological Science, University of Western Australia, Perth, Australia.

3. Ospina J, Orosz G, Spencer S. The relation between authoritarian leadership and belief in fake news. Sci Rep 2023; 13:12860. PMID: 37553407. PMCID: PMC10409744. DOI: 10.1038/s41598-023-39807-x.
Abstract
Individual factors such as cognitive capacity matter when one is asked to spot fake news. We suggest, however, that social influence, specifically as exercised by an authoritarian leader, might matter more when one is expected to agree with the fake news. We developed a single-item prototype measure of leadership styles and recruited participants from four Western democratic countries (Australia, Canada, the United Kingdom, and the United States; N = 501) who identified their immediate boss as an autonomous, paternalistic, or authoritarian leader. Participants then evaluated the accuracy of several fake news articles and indicated how likely they would be to agree with their boss about those articles. Employees with authoritarian bosses were less accurate in spotting fake news than employees with autonomous bosses (Cohen's d = 0.32). The larger effect, however, was that employees with authoritarian bosses reported being more likely to agree with their boss about a fake news article shared by that boss than were employees with autonomous (Cohen's d = 1.30) or paternalistic bosses (Cohen's d = 0.70). We argue that, in addition to effects on the perceived accuracy of information, social influence, conformity, and obedience are crucial and unacknowledged factors in how misinformation may be maintained and propagated by authoritarian leaders.

Affiliations
- Juan Ospina: Department of Psychology, The Ohio State University, Columbus, USA.
- Gábor Orosz: ULR 7369-URePSSS-Unité de Recherche Pluridisciplinaire Sport Santé Société, Sherpas, Univ. Lille, Univ. Artois, Univ. Littoral Côte d'Opale, Liévin, France.
- Steven Spencer: Department of Psychology, The Ohio State University, Columbus, USA.

4. Prike T, Blackley P, Swire-Thompson B, Ecker UKH. Examining the replicability of backfire effects after standalone corrections. Cogn Res Princ Implic 2023; 8:39. PMID: 37395864. DOI: 10.1186/s41235-023-00492-z.
Abstract
Corrections are a frequently used and effective tool for countering misinformation. However, concerns have been raised that corrections may introduce false claims to new audiences when the misinformation is novel. This is because boosting the familiarity of a claim can increase belief in that claim, and thus exposing new audiences to novel misinformation-even as part of a correction-may inadvertently increase misinformation belief. Such an outcome could be conceptualized as a familiarity backfire effect, whereby a familiarity boost increases false-claim endorsement above a control-condition or pre-correction baseline. Here, we examined whether standalone corrections-that is, corrections presented without initial misinformation exposure-can backfire and increase participants' reliance on the misinformation in their subsequent inferential reasoning, relative to a no-misinformation, no-correction control condition. Across three experiments (total N = 1156) we found that standalone corrections did not backfire immediately (Experiment 1) or after a one-week delay (Experiment 2). However, there was some mixed evidence suggesting corrections may backfire when there is skepticism regarding the correction (Experiment 3). Specifically, in Experiment 3, we found the standalone correction to backfire in open-ended responses, but only when there was skepticism towards the correction. However, this did not replicate with the rating scales measure. Future research should further examine whether skepticism towards the correction is the first replicable mechanism for backfire effects to occur.

Affiliations
- Toby Prike: School of Psychological Science, University of Western Australia, Perth, Australia.
- Phoebe Blackley: School of Psychological Science, University of Western Australia, Perth, Australia.
- Briony Swire-Thompson: Network Science Institute, Northeastern University, Boston, USA; Institute of Quantitative Social Science, Harvard University, Cambridge, USA.
- Ullrich K. H. Ecker: School of Psychological Science, University of Western Australia, Perth, Australia; Public Policy Institute, University of Western Australia, Perth, Australia.

5. The independent effects of source expertise and trustworthiness on retraction believability: The moderating role of vested interest. Mem Cognit 2022; 51:845-861. PMID: 36460863. PMCID: PMC9718466. DOI: 10.3758/s13421-022-01374-3.
Abstract
Past research suggests that the trustworthiness of a source issuing a retraction of misinformation affects retraction effectiveness, whereas source expertise does not. However, this prior research largely used expert sources that had a vested interest in issuing the retraction, which may have reduced the impact of those expert sources. We predicted that source expertise can affect a retraction's believability independent of trustworthiness, but that this is most likely when the source has no vested interest in issuing the retraction. Study 1 demonstrated that, controlling for perceived trustworthiness, retractions from an expert source were believed more and led to less continued belief in the misinformation than retractions from an inexpert source, but only when the source had no vested interest in issuing the retraction. Study 2 found similar effects using a design that manipulated both expertise and trustworthiness. These results suggest that source expertise can affect retraction effectiveness, and that vested interest is a critical variable to consider when determining when this will occur.

6. Updating false beliefs: The role of misplaced vs. well-placed certainty. Psychon Bull Rev 2022; 30:712-721. PMID: 36266602. DOI: 10.3758/s13423-022-02196-9.
Abstract
People can update their misconceptions or false beliefs by learning from corrective sources. However, research has shown that people vary drastically in the extent to which they learn from feedback and update their false beliefs accordingly. Past work has drawn attention to cognitive and motivational factors such as cognitive rigidity and closed-mindedness as inhibitors of belief updating. Here we examined a novel epistemic structure, misplaced certainty (a subjective sense of certainty despite recognizing uncertainty in oneself or most people, e.g., "I feel certain, although I recognize that X is technically uncertain, or that it is technically uncertain according to most people"), as a unique predictor of lower belief updating. In a preregistered study, we hypothesized that those with high chronic misplaced certainty would be less likely to learn from feedback and revise their misconceptions in a feedback-learning task. In our analyses, we controlled for well-placed certainty, that is, certainty while recognizing no doubt in oneself or most others. We also controlled for variables associated with closed-minded cognition. Consistent with our predictions, those with high misplaced certainty were less likely to revise their false beliefs in response to corrective feedback. In contrast, those with high well-placed certainty were more likely to learn from corrective feedback and revise their false beliefs. By shedding light on the nuances of different forms of subjective certainty, the present work aims to pave the way for further research on epistemic factors in the perseverance and correction of false beliefs.

7. Information misbehaviour: modelling the motivations for the creation, acceptance and dissemination of misinformation. Journal of Documentation 2022. DOI: 10.1108/jd-05-2022-0116.
Abstract
Purpose: Misinformation is a significant phenomenon in today's world; the purpose of this paper is to explore the motivations behind the creation and use of misinformation.
Design/methodology/approach: A literature review was undertaken, covering English- and Russian-language sources. Content analysis was used to identify the different kinds of motivation relating to the stages of creating and communicating misinformation. The authors applied Schutz's analysis of motivational types.
Findings: The main types of motivation for creating and facilitating misinformation were identified as "in-order-to motivations", i.e. seeking to bring about some desired state, whereas the motivations for using and, to a significant extent, sharing misinformation were "because" motivations, i.e. rooted in the individual's personal history.
Originality/value: The general model of the motivations underlying misinformation is original, as is the application of Schutz's typification of motivations to the different stages in the creation, dissemination and use of misinformation.

8.
Abstract
From vaccination refusal to climate change denial, antiscience views are threatening humanity. When different individuals are provided with the same piece of scientific evidence, why do some accept whereas others dismiss it? Building on various emerging data and models that have explored the psychology of being antiscience, we specify four core bases of key principles driving antiscience attitudes. These principles are grounded in decades of research on attitudes, persuasion, social influence, social identity, and information processing. They apply across diverse domains of antiscience phenomena. Specifically, antiscience attitudes are more likely to emerge when a scientific message comes from sources perceived as lacking credibility; when the recipients embrace the social membership or identity of groups with antiscience attitudes; when the scientific message itself contradicts what recipients consider true, favorable, valuable, or moral; or when there is a mismatch between the delivery of the scientific message and the epistemic style of the recipient. Politics triggers or amplifies many principles across all four bases, making it a particularly potent force in antiscience attitudes. Guided by the key principles, we describe evidence-based counteractive strategies for increasing public acceptance of science.

9. Zwaan RA. Conspiracy Thinking as Situation Model Construction. Curr Opin Psychol 2022; 47:101413. DOI: 10.1016/j.copsyc.2022.101413.