1. Garry M, Chan WM, Foster J, Henkel LA. Large language models (LLMs) and the institutionalization of misinformation. Trends Cogn Sci 2024:S1364-6613(24)00221-3. [PMID: 39393958] [DOI: 10.1016/j.tics.2024.08.007]
Abstract
Large language models (LLMs), such as ChatGPT, flood the Internet with true and false information, crafted and delivered with techniques that psychological science suggests will encourage people to think that information is true. What's more, as people feed this misinformation back into the Internet, emerging LLMs will adopt it and feed it back into other models. Such a scenario means we could lose access to information that helps us tell what is real from unreal: to do 'reality monitoring.' If that happens, misinformation will be the new foundation we use to plan, to make decisions, and to vote. We will lose trust in our institutions and each other.
Affiliations
- Maryanne Garry: Psychology, The University of Waikato, Hamilton, New Zealand
- Way Ming Chan: Psychology, The University of Waikato, Hamilton, New Zealand
- Jeffrey Foster: Cybersecurity Studies, Macquarie University, Sydney, Australia
- Linda A Henkel: Psychological and Brain Sciences, Fairfield University, Fairfield, CT, USA
2. Jiang Y, Schwarz N, Reynolds KJ, Newman EJ. Repetition increases belief in climate-skeptical claims, even for climate science endorsers. PLoS One 2024; 19:e0307294. [PMID: 39110668] [PMCID: PMC11305575] [DOI: 10.1371/journal.pone.0307294]
Abstract
Does repeated exposure to climate-skeptic claims influence their acceptance as true, even among climate science endorsers? Research with general knowledge claims shows that repeated exposure to a claim increases its perceived truth when it is encountered again. However, motivated cognition research suggests that people primarily endorse what they already believe. Across two experiments, climate science endorsers were more likely to believe claims that were consistent with their prior beliefs, but repeated exposure increased perceptions of truth for climate-science and climate-skeptic claims to a similar extent. Even counter-attitudinal claims benefit from previous exposure, highlighting the insidious effect of repetition.
Affiliations
- Yangxueqing Jiang: School of Medicine and Psychology, The Australian National University, Canberra, ACT, Australia
- Norbert Schwarz: Mind and Society Center, Department of Psychology, and Marshall School of Business, University of Southern California, Los Angeles, CA, USA
- Katherine J. Reynolds: School of Medicine and Psychology, The Australian National University, Canberra, ACT, Australia; Melbourne Graduate School of Education, The University of Melbourne, Parkville, VIC, Australia
- Eryn J. Newman: School of Medicine and Psychology, The Australian National University, Canberra, ACT, Australia
3. Arcos K, Hausman H, Storm BC. Are you sure? Examining the potential benefits of truth-checking as a learning activity. Q J Exp Psychol (Hove) 2024; 77:1635-1649. [PMID: 37787466] [PMCID: PMC11295426] [DOI: 10.1177/17470218231206813]
Abstract
Learners may be uncertain about whether encountered information is true. Uncertainty may encourage people to critically assess information's accuracy, serving as a kind of desirable difficulty that benefits learning. Uncertainty may also have negative effects, however, leading people to mistrust true information or to later misremember false information as true. In three experiments, participants read history statements. In one condition, all statements were true, and the participants knew it. In the other two conditions, some statements were true and others were false. Participants were either told the statements' accuracy, or they guessed the statements' accuracy prior to feedback, a manipulation we refer to as truth-checking. All participants were then tested on recalling the true information and on recognising true versus false statements. We observed a significant benefit of truth-checking in one of the three experiments, suggesting that truth-checking may have some potential to enhance learning, perhaps by inducing people to encode to-be-learned information more deeply than they would otherwise. Even so, the benefit may come at a cost: truth-checking took significantly longer than study alone, and it led to a greater likelihood of thinking false information was true, suggesting costs of truth-checking may tend to outweigh benefits.
Affiliations
- Karen Arcos: Division of Social Sciences, Department of Psychology, University of California, Santa Cruz, Santa Cruz, CA, USA
- Hannah Hausman: Division of Social Sciences, Department of Psychology, University of California, Santa Cruz, Santa Cruz, CA, USA
- Benjamin C Storm: Division of Social Sciences, Department of Psychology, University of California, Santa Cruz, Santa Cruz, CA, USA
4. Speckmann F, Unkelbach C. Illusions of knowledge due to mere repetition. Cognition 2024; 247:105791. [PMID: 38593568] [DOI: 10.1016/j.cognition.2024.105791]
Abstract
Repeating information increases people's belief that the repeated information is true. This truth effect has been widely researched and is relevant for topics such as fake news and misinformation. Another effect of repetition, which is also relevant to those topics, has not been extensively studied so far: Do people believe they knew something before it was repeated? We used a standard truth effect paradigm in four pre-registered experiments (total N = 773), including a presentation and judgment phase. However, instead of "true"/"false" judgments, participants indicated whether they knew a given trivia statement before participating in the experiment. Across all experiments, participants judged repeated information as "known" more often than novel information. Participants even claimed to have already known repeated false information. In addition, participants generated sources for this knowledge. The inability to distinguish recent information from well-established knowledge in memory adds an explanation for the persistence and strength of repetition effects on truth. The truth effect might be so robust because people believe they know the repeatedly presented information as a matter of fact.
5. Udry J, Barber SJ. The illusory truth effect: A review of how repetition increases belief in misinformation. Curr Opin Psychol 2024; 56:101736. [PMID: 38113667] [DOI: 10.1016/j.copsyc.2023.101736]
Abstract
Repetition increases belief in information, a phenomenon known as the illusory truth effect. In laboratory experiments, the illusory truth effect has often been examined using general trivia statements as stimuli, but repetition also increases belief in misinformation, such as fake news headlines and conspiracy beliefs. Repetition even increases belief in claims that are implausible or that contradict prior knowledge. Repetition also has broader impacts beyond belief, such as increasing sharing intentions of news headlines and decreasing how unethical an act is perceived to be. Although the illusory truth effect is robust, some interventions reduce its magnitude, including instruction to focus on accuracy and awareness of the illusory truth effect. These strategies may be effective for reducing belief in misinformation.
Affiliations
- Jessica Udry: Department of Psychology, Georgia State University, USA
- Sarah J Barber: Department of Psychology, Georgia State University, USA; Gerontology Institute, Georgia State University, Atlanta, GA, USA
6. Ramsey AT, Liu Y, Trueblood JS. Can Invalid Information Be Ignored When It Is Detected? Psychol Sci 2024; 35:328-344. [PMID: 38483515] [DOI: 10.1177/09567976241231571]
Abstract
With the rapid spread of information via social media, individuals are prone to misinformation exposure that they may utilize when forming beliefs. Over five experiments (total N = 815 adults, recruited through Amazon Mechanical Turk in the United States), we investigated whether people could ignore quantitative information when they judged for themselves that it was misreported. Participants viewed sets of values sampled from Gaussian distributions to estimate the underlying means. They attempted to ignore invalid information, namely outlier values inserted into the value sequences. Results indicated participants were able to detect outliers. Nevertheless, participants' estimates were still biased in the direction of the outlier, even when they were most certain that they had detected invalid information. The addition of visual warning cues and different task scenarios did not fully eliminate systematic over- and underestimation. These findings suggest that individuals may incorporate invalid information they meant to ignore when forming beliefs.
Affiliations
- Yanjun Liu: School of Psychology, University of New South Wales; Department of Psychological and Brain Sciences, Indiana University Bloomington
7. Mayo R. Trust or distrust? Neither! The right mindset for confronting disinformation. Curr Opin Psychol 2024; 56:101779. [PMID: 38134524] [DOI: 10.1016/j.copsyc.2023.101779]
Abstract
A primary explanation for why individuals believe disinformation is the truth bias, a predisposition to accept information as true. However, this bias is context-dependent, as research shows that rejection becomes the predominant process in a distrust mindset. Consequently, trust and distrust emerge as pivotal factors in addressing disinformation. The current review offers a more nuanced perspective by illustrating that whereas distrust may act as an antidote to the truth bias, it can also paradoxically serve as a catalyst for belief in disinformation. The review concludes that mindsets other than those rooted solely in trust (or distrust), such as an evaluative mindset, may prove to be more effective in detecting and refuting disinformation.
Affiliations
- Ruth Mayo: The Hebrew University of Jerusalem, Israel
8. Rapp DN, Withall MM. Confidence as a metacognitive contributor to and consequence of misinformation experiences. Curr Opin Psychol 2024; 55:101735. [PMID: 38041918] [DOI: 10.1016/j.copsyc.2023.101735]
Abstract
Exposures to inaccurate information can lead people to become confused about what is true, to doubt their understandings, and to rely on the ideas later. Recent work has begun to investigate the role of metacognition in these effects. We review research foregrounding confidence as an exemplar metacognitive contributor to misinformation experiences. Miscalibrations between confidence about what one knows, and the actual knowledge one possesses, can help explain why people might hold fast to misinformed beliefs even in the face of counterevidence. Miscalibrations can also emerge after brief exposures to new misinformation, allowing even obvious inaccuracies to influence subsequent performance. Evidence additionally suggests confidence may present a useful target for intervention, helping to encourage careful evaluation under the right conditions.
Affiliations
- David N Rapp: Department of Psychology, Northwestern University, Evanston, IL, USA; School of Education and Social Policy, Northwestern University, Evanston, IL, USA
- Mandy M Withall: Department of Psychology, Northwestern University, Evanston, IL, USA
9. Riesthuis P, Woods J. "That's just like, your opinion, man": the illusory truth effect on opinions. Psychological Research 2024; 88:284-306. [PMID: 37300704] [PMCID: PMC10257371] [DOI: 10.1007/s00426-023-01845-5]
Abstract
With the expansion of technology, people are constantly exposed to an abundance of information. Of vital importance is to understand how people assess the truthfulness of such information. One indicator of perceived truthfulness seems to be whether it is repeated. That is, people tend to perceive repeated information, regardless of its veracity, as more truthful than new information, a phenomenon known as the illusory truth effect. In the present study, we examined whether such an effect is also observed for opinions and whether the manner in which the information is encoded influences the illusory truth effect. Across three experiments, participants (n = 552) were presented with a list of true information, misinformation, general opinion, and/or social-political opinion statements. First, participants were either instructed to indicate whether the presented statement was a fact or opinion based on its syntax structure (Exp. 1 & 2) or to assign each statement to a topic category (Exp. 3). Subsequently, participants rated the truthfulness of various new and repeated statements. Results showed that repeated information, regardless of the type of information, received higher subjective truth ratings when participants simply encoded it by assigning each statement to a topic. However, when general and social-political opinions were encoded as opinions, we found no evidence of such an effect. Moreover, we found a reversed illusory truth effect for general opinion statements when considering only information that was encoded as an opinion. These findings suggest that how information is encoded plays a crucial role in evaluating truth.
Affiliations
- Paul Riesthuis: Leuven Institute of Criminology, KU Leuven, Leuven, Belgium; Faculty of Psychology and Neuroscience, Maastricht University, Maastricht, Netherlands
- Josh Woods: Faculty of Psychology, Grand View University, Des Moines, IA, USA
10. Ly DP, Bernstein DM, Newman EJ. An ongoing secondary task can reduce the illusory truth effect. Front Psychol 2024; 14:1215432. [PMID: 38235277] [PMCID: PMC10792064] [DOI: 10.3389/fpsyg.2023.1215432]
Abstract
Introduction: People are more likely to believe repeated information; this is known as the Illusory Truth Effect (ITE). Recent research on the ITE has shown that semantic processing of statements plays a key role. In our day-to-day experience, we are often multi-tasking, which can impact our ongoing processing of information around us. In three experiments, we investigate how asking participants to engage in an ongoing secondary task in the ITE paradigm influences the magnitude of the effect of repetition on belief.
Methods: Using an adapted ITE paradigm, we embedded a secondary task into each trial of the encoding and/or test phase (e.g., having participants count the number of vowels in a target word of each trivia claim) and calculated the overall accuracy on the task.
Results: We found that the overall ITE was larger when participants had no ongoing secondary task during the experiment. Further, we predicted and found that higher accuracy on the secondary task was associated with a larger ITE.
Discussion: These findings provide initial evidence that engaging in an ongoing secondary task may reduce the impact of repetition. Our findings suggest that exploring the impact of secondary tasks on the ITE is a fruitful area for further research.
Affiliations
- Deva P. Ly: School of Medicine and Psychology, Australian National University, Canberra, ACT, Australia
- Daniel M. Bernstein: Department of Psychology, Kwantlen Polytechnic University, Surrey, BC, Canada
- Eryn J. Newman: School of Medicine and Psychology, Australian National University, Canberra, ACT, Australia
11. Brashier NM, Ho CH, Hogue TK, Schacter DL. Retrieval fluency inflates perceived preparation for difficult problems. Memory 2024; 32:83-89. [PMID: 38109129] [PMCID: PMC10865271] [DOI: 10.1080/09658211.2023.2284401]
Abstract
When faced with a difficult problem, people often rely on past experiences. While remembering clearly helps us reach solutions, can retrieval also lead to misperceptions of our abilities? In three experiments, participants encountered "worst case scenarios" they likely had never experienced and that would be difficult to navigate without extensive training (e.g., being bitten by a snake). Learning brief tips improved problem-solving performance later, but retrieval increased feelings of preparation by an even larger margin. This gap occurred regardless of whether people thought that tips came from an expert or another participant in the study, and it did not reflect mere familiarity with the problems themselves. Instead, our results suggest that the ease experienced while remembering, or retrieval fluency, inflated feelings of preparation.
12. Mattavelli S, Béna J, Corneille O, Unkelbach C. People underestimate the influence of repetition on truth judgments (and more so for themselves than for others). Cognition 2024; 242:105651. [PMID: 37871412] [DOI: 10.1016/j.cognition.2023.105651]
Abstract
People judge repeated statements as more truthful than new statements: a truth effect. In three pre-registered experiments (N = 463), we examined whether people expect repetition to influence truth judgments more for others than for themselves: a bias blind spot in the truth effect. In Experiments 1 and 2, using moderately plausible and implausible statements, respectively, the test for the bias blind spot did not pass the significance threshold set for a two-step sequential analysis. Experiment 3 considered moderately plausible statements but with a larger sample of participants. Additionally, it compared actual performance after a two-day delay with participants' predictions for themselves and others. This time, we found clear evidence for a bias blind spot in the truth effect. Experiment 3 also showed that participants underestimated the magnitude of the truth effect, especially so for themselves, and that predictions and actual truth effect scores were not significantly related. Finally, an integrative analysis focusing on a more conservative between-participant approach found clear frequentist and Bayesian evidence for a bias blind spot. Overall, the results indicate that people (1) hold beliefs about the effect of repetition on truth judgments, (2) believe that this effect is larger for others than for themselves, (3) underestimate the effect's magnitude, and (4) do so particularly for themselves.
Affiliations
- Simone Mattavelli: University of Milano-Bicocca, Italy; Vita-Salute San Raffaele University, Italy
- Jérémy Béna: UCLouvain, Belgium; Aix-Marseille Université, France
13. Prike T, Blackley P, Swire-Thompson B, Ecker UKH. Examining the replicability of backfire effects after standalone corrections. Cogn Res Princ Implic 2023; 8:39. [PMID: 37395864] [PMCID: PMC10317933] [DOI: 10.1186/s41235-023-00492-z]
Abstract
Corrections are a frequently used and effective tool for countering misinformation. However, concerns have been raised that corrections may introduce false claims to new audiences when the misinformation is novel. This is because boosting the familiarity of a claim can increase belief in that claim, and thus exposing new audiences to novel misinformation, even as part of a correction, may inadvertently increase misinformation belief. Such an outcome could be conceptualized as a familiarity backfire effect, whereby a familiarity boost increases false-claim endorsement above a control-condition or pre-correction baseline. Here, we examined whether standalone corrections, that is, corrections presented without initial misinformation exposure, can backfire and increase participants' reliance on the misinformation in their subsequent inferential reasoning, relative to a no-misinformation, no-correction control condition. Across three experiments (total N = 1156), we found that standalone corrections did not backfire immediately (Experiment 1) or after a one-week delay (Experiment 2). However, there was some mixed evidence suggesting corrections may backfire when there is skepticism regarding the correction (Experiment 3). Specifically, in Experiment 3, we found the standalone correction to backfire in open-ended responses, but only when there was skepticism towards the correction. However, this did not replicate with the rating-scales measure. Future research should further examine whether skepticism towards the correction is a replicable mechanism for backfire effects to occur.
Affiliations
- Toby Prike: School of Psychological Science, University of Western Australia, Perth, Australia
- Phoebe Blackley: School of Psychological Science, University of Western Australia, Perth, Australia
- Briony Swire-Thompson: Network Science Institute, Northeastern University, Boston, USA; Institute of Quantitative Social Science, Harvard University, Cambridge, USA
- Ullrich K H Ecker: School of Psychological Science, University of Western Australia, Perth, Australia; Public Policy Institute, University of Western Australia, Perth, Australia
14. Vellani V, Zheng S, Ercelik D, Sharot T. The illusory truth effect leads to the spread of misinformation. Cognition 2023; 236:105421. [PMID: 36871397] [PMCID: PMC10636596] [DOI: 10.1016/j.cognition.2023.105421]
Abstract
Misinformation can negatively impact people's lives in domains ranging from health to politics. An important research goal is to understand how misinformation spreads in order to curb it. Here, we test whether and how a single repetition of misinformation fuels its spread. Over two experiments (N = 260), participants indicated which statements they would like to share with other participants on social media. Half of the statements were repeated and half were new. The results reveal that participants were more likely to share statements they had previously been exposed to. Importantly, the relationship between repetition and sharing was mediated by perceived accuracy. That is, repetition of misinformation biased people's judgment of accuracy and as a result fuelled the spread of misinformation. The effect was observed in the domain of health (Exp 1) and general knowledge (Exp 2), suggesting it is not tied to a specific domain.
Affiliations
- Valentina Vellani: Affective Brain Lab, Department of Experimental Psychology, University College London, London WC1H 0AP, UK; Max Planck University College London Centre for Computational Psychiatry and Ageing Research, London WC1B 5EH, UK
- Sarah Zheng: Affective Brain Lab, Department of Experimental Psychology, University College London, London WC1H 0AP, UK; Max Planck University College London Centre for Computational Psychiatry and Ageing Research, London WC1B 5EH, UK
- Dilay Ercelik: Affective Brain Lab, Department of Experimental Psychology, University College London, London WC1H 0AP, UK; Max Planck University College London Centre for Computational Psychiatry and Ageing Research, London WC1B 5EH, UK
- Tali Sharot: Affective Brain Lab, Department of Experimental Psychology, University College London, London WC1H 0AP, UK; Max Planck University College London Centre for Computational Psychiatry and Ageing Research, London WC1B 5EH, UK; Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA, USA
15. Yacoby A, Reggev N, Maril A. Lack of source memory as a potential marker of early assimilation of novel items into current knowledge. Neuropsychologia 2023; 185:108569. [PMID: 37121268] [DOI: 10.1016/j.neuropsychologia.2023.108569]
Abstract
In daily life, humans process a plethora of new information that can be either consistent (familiar) or inconsistent (novel) with prior knowledge. Over time, both types of information can integrate into our accumulated knowledge base via distinct pathways. However, the mnemonic processes supporting the integration of information that is inconsistent with prior knowledge remain under-characterized. In the current study, we used functional magnetic resonance imaging (fMRI) to examine the initial assimilation of novel items into the semantic network. Participants saw three repetitions of adjective-noun word pairs that were either consistent or inconsistent with prior knowledge. Twenty-four hours later, they were presented with the same stimuli again while undergoing fMRI scans. Outside the scanner, participants completed a surprise recognition test. We found that when the episodic context associated with initially inconsistent items was irretrievable, the neural signature of these items was indistinguishable from that of consistent items. In contrast, initially inconsistent items with accessible episodic contexts showed neural signatures that differed from those associated with consistent items. We suggest that, at least one day post encoding, items inconsistent with prior knowledge can show early assimilation into the semantic network only when their episodic contexts become inaccessible during retrieval, thus evoking a sense of familiarity.
Affiliations
- Amnon Yacoby: Department of Cognitive Sciences, The Hebrew University of Jerusalem, Jerusalem, Israel
- Niv Reggev: Department of Psychology and the School of Brain Sciences and Cognition, Ben Gurion University, Beer Sheva, Israel
- Anat Maril: Department of Cognitive Sciences, The Hebrew University of Jerusalem, Jerusalem, Israel; Department of Psychology, The Hebrew University of Jerusalem, Jerusalem, Israel
16. Brashier NM. Do conspiracy theorists think too much or too little? Curr Opin Psychol 2023; 49:101504. [PMID: 36577227] [DOI: 10.1016/j.copsyc.2022.101504]
Abstract
Conspiracy theories explain distressing events as malevolent actions by powerful groups. Why do people believe in secret plots when other explanations are more probable? On the one hand, conspiracy theorists seem to disregard accuracy; they tend to endorse mutually incompatible conspiracies, think intuitively, use heuristics, and hold other irrational beliefs. But by definition, conspiracy theorists reject the mainstream explanation for an event, often in favor of a more complex account. They exhibit a general distrust of others and expend considerable effort to find 'evidence' supporting their beliefs. In searching for answers, conspiracy theorists likely expose themselves to misleading information online and overestimate their own knowledge. Understanding when elaboration and cognitive effort might backfire is crucial, as conspiracy beliefs lead to political disengagement, environmental inaction, prejudice, and support for violence.
Affiliations
- Nadia M Brashier: Department of Psychological Sciences, Purdue University, 703 Third St, West Lafayette, IN 47907, USA
17. That's interesting! The role of epistemic emotions and perceived credibility in the relation between prior beliefs and susceptibility to fake-news. Computers in Human Behavior 2022. [DOI: 10.1016/j.chb.2022.107619]
18
|
Bromme R. [Informed trust in science: lessons from the COVID 19 pandemic for the conceptualization of science literacy]. UNTERRICHTSWISSENSCHAFT 2022; 50:331-345. [PMID: 36320590 PMCID: PMC9610333 DOI: 10.1007/s42010-022-00159-6] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/29/2022] [Revised: 09/27/2022] [Accepted: 10/05/2022] [Indexed: 11/05/2022]
Abstract
Informed trust in science is necessary for the 'interfaces' within the flow of knowledge between citizens' everyday understanding of the pandemic and the dynamically evolving state of knowledge in the sciences. This is the core thesis of this paper. Without science, the COVID-19 pandemic can neither be understood nor controlled, and for this to happen, citizens must engage with science-based knowledge. However, such knowledge is dynamic (evolving and intertwined with normative issues). Furthermore, science-based knowledge competes with pseudoscientific contributions. As non-experts, laypersons must therefore decide whom to trust. The paper describes the concept of functional scientific literacy as a prerequisite of informed trust. The knowledge bases for judgments of informed trust should be taught in school, and judging rationally about the trustworthiness of science-related knowledge claims should be practiced.
Affiliation(s)
- Rainer Bromme
- Institut für Psychologie, Universität Münster, Fliednerstr. 21, 48149 Münster, Germany
|
19
|
Evaluative mindsets can protect against the influence of false information. Cognition 2022; 225:105121. [DOI: 10.1016/j.cognition.2022.105121] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/08/2021] [Revised: 03/29/2022] [Accepted: 04/02/2022] [Indexed: 11/19/2022]
|
20
|
Story stimuli for instantiating true and false beliefs about the world. Behav Res Methods 2022:10.3758/s13428-022-01904-6. [PMID: 35790682 PMCID: PMC9255489 DOI: 10.3758/s13428-022-01904-6] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 06/07/2022] [Indexed: 11/08/2022]
Abstract
We offer short story (“vignette”) materials that have been developed and tested with the intention of influencing people’s true and false beliefs about the world. First, we present norming data on the baseline rates at which participants from both U.S.-census matched and general U.S. online samples were correctly able to classify a selected set of accurate (e.g., aerobic exercise strengthens your heart and lungs) and inaccurate (e.g., aerobic exercise weakens your heart and lungs) assertions as “True” or “False.” Next, we present data which validate that reading vignettes in which people discuss these accurate and inaccurate assertions influences participants’ subsequent judgments of the validity of the asserted claims. These vignettes are brief, easy-to-read, allow for flexible and accountable online data collection, and reflect realistic accurate and inaccurate claims that people routinely encounter (e.g., preventative health behaviors, use of alternative medicines and therapies, etc.). As intended, vignettes containing inaccurate assertions increased participants’ subsequent judgment errors, while vignettes containing accurate assertions decreased participants’ subsequent judgment errors, both relative to participants’ judgments after not reading related information. In an additional experiment, we used the vignette materials to replicate findings from Salovich et al. (2021), wherein participants reported lower confidence in correct judgments and higher confidence in incorrect judgments after having read inaccurate assertions. Overall, these materials are well suited for investigations on the consequences of exposures to accurate and inaccurate information, address limitations in currently available stimuli, and align with trends in research practice (e.g., online sampling) within psychological science.
|
21
|
Henderson EL, Westwood SJ, Simons DJ. A reproducible systematic map of research on the illusory truth effect. Psychon Bull Rev 2022; 29:1065-1088. [PMID: 34708397 PMCID: PMC9166874 DOI: 10.3758/s13423-021-01995-w] [Citation(s) in RCA: 10] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 08/13/2021] [Indexed: 11/10/2022]
Abstract
People believe information more if they have encountered it before, a finding known as the illusory truth effect. But what is the evidence for the generality and pervasiveness of the illusory truth effect? Our preregistered systematic map describes the existing knowledge base and objectively assesses the quality, completeness and interpretability of the evidence provided by empirical studies in the literature. A systematic search of 16 bibliographic and grey literature databases identified 93 reports with a total of 181 eligible studies. All studies were conducted at Western universities, and most used convenience samples. Most studies used verbatim repetition of trivia statements in a single testing session with a minimal delay between exposure and test. The exposure tasks, filler tasks and truth measures varied substantially across studies, with no standardisation of materials or procedures. Many reports lacked transparency, both in terms of open science practices and reporting of descriptive statistics and exclusions. Systematic mapping resulted in a searchable database of illusory truth effect studies (https://osf.io/37xma/). Key limitations of the current literature include the need for greater diversity of materials as stimuli (e.g., political or health contents), more participants from non-Western countries, studies examining effects of multiple repetitions and longer intersession intervals, and closer examination of the dependency of effects on the choice of exposure task and truth measure. These gaps could be investigated using carefully designed multi-lab studies. With a lack of external replications, preregistrations, data and code, verifying replicability and robustness is only possible for a small number of studies.
Affiliation(s)
- Emma L. Henderson
- Faculty of Business and Social Sciences, Kingston University, Kingston Hill Campus, Kingston Hill, Kingston upon Thames, KT2 7LB UK
- Samuel J. Westwood
- Department of Psychology, Institute of Human Sciences, Millennium City Building, University of Wolverhampton, Wolverhampton, WV1 1LY UK
- Department of Child & Adolescent Psychiatry, Institute of Psychiatry, Psychology and Neuroscience, King’s College London, London, SE5 8AB UK
- Daniel J. Simons
- Department of Psychology, University of Illinois, 603 E. Daniel Street, Champaign, IL 61820 USA
|
22
|
The effect of others’ repeated retrieval on the illusion of truth for emotional information. CURRENT PSYCHOLOGY 2022. [DOI: 10.1007/s12144-022-03105-y] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/25/2022]
|
23
|
Prior exposure increases judged truth even during periods of mind wandering. Psychon Bull Rev 2022; 29:1997-2007. [PMID: 35477849 DOI: 10.3758/s13423-022-02101-4] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 04/05/2022] [Indexed: 11/08/2022]
Abstract
Much of our day is spent mind wandering: periods of inattention characterized by a lack of awareness of external stimuli and information. Whether we are paying attention or not, information surrounds us constantly, some true and some false. The proliferation of false information in news and social media highlights the critical need to understand the psychological mechanisms underlying our beliefs about what is true. People often rely on heuristics to judge the truth of information. For example, repeated information is more likely to be judged as true than new information (i.e., the illusory truth effect). However, despite the prevalence of mind wandering in our daily lives, current research on the contributing factors to the illusory truth effect has largely ignored periods of inattention as experimentally informative. Here, we aim to address this gap in our knowledge, investigating whether mind wandering during initial exposure to information affects later belief in the truth of that information. That is, does the illusory truth effect occur even when people report not paying attention to the information at hand? Across three studies we demonstrate that even during periods of mind wandering, the repetition of information increases truth judgments. Further, our results suggest that the severity of mind wandering moderated truth ratings, such that greater levels of mind wandering decreased truth judgments for previously presented information.
|
24
|
Goldberg AE, Ferreira F. Good-enough language production. Trends Cogn Sci 2022; 26:300-311. [PMID: 35241380 PMCID: PMC8956348 DOI: 10.1016/j.tics.2022.01.005] [Citation(s) in RCA: 15] [Impact Index Per Article: 7.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/22/2021] [Revised: 01/10/2022] [Accepted: 01/18/2022] [Indexed: 11/24/2022]
Abstract
Our ability to comprehend and produce language is one of humans' most impressive skills, but it is not flawless. We must convey and interpret messages via a noisy channel in ever-changing contexts and we sometimes fail to access an optimal combination of words and grammatical constructions. Here, we extend the notion of good-enough (GN) comprehension to GN production, which allows us to unify a wide range of phenomena including overly vague word choices, agreement errors, resumptive pronouns, transfer effects, and children's overextensions and regularizations. We suggest these all involve the accessing and production of a 'GN' option when a more-optimal option is inaccessible. The role of accessibility highlights the need to relate memory encoding and retrieval processes to language comprehension and production.
Affiliation(s)
- Adele E Goldberg
- Department of Psychology, Princeton University, Princeton, NJ 08544, USA.
- Fernanda Ferreira
- Department of Psychology, University of California, Davis, Davis, CA 95616, USA.
|
25
|
Jordan K, Zajac R, Bernstein D, Joshi C, Garry M. Trivially informative semantic context inflates people's confidence they can perform a highly complex skill. ROYAL SOCIETY OPEN SCIENCE 2022; 9:211977. [PMID: 35308623 PMCID: PMC8924756 DOI: 10.1098/rsos.211977] [Citation(s) in RCA: 4] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 12/16/2021] [Accepted: 02/22/2022] [Indexed: 05/03/2023]
Abstract
Some research suggests people are overconfident because of personality characteristics, lack of insight, or because overconfidence is beneficial in its own right. But other research fits with the possibility that fluent experience in the moment can rapidly drive overconfidence. For example, fluency can push people to become overconfident in their ability to throw a dart, know how rainbows form, or predict the future value of a commodity. But surely there are limits to overconfidence. That is, even in the face of fluency manipulations known to increase feelings of confidence, reasonable people would reject the thought that they, for example, might be able to land a plane in an emergency. To address this question, we conducted two experiments comprising a total of 780 people. We asked some people (but not others) to watch a trivially informative video of a pilot landing a plane before they rated their confidence in their own ability to land a plane. We found watching the video inflated people's confidence that they could land a plane. Our findings extend prior work by suggesting that increased semantic context creates illusions not just of prior experience or understanding, but also of the ability to actually do something implausible.
Affiliation(s)
- Kayla Jordan
- School of Psychology, The University of Waikato, 1 Knighton Road, Hamilton 3240, New Zealand
- Rachel Zajac
- School of Psychology, University of Otago, 362 Leith Street, Dunedin 9016, New Zealand
- Daniel Bernstein
- Department of Psychology, Kwantlen Polytechnic University, 12666 72 Ave, Surrey, British Columbia V3W2M8, Canada
- Chaitanya Joshi
- School of Psychology, The University of Waikato, 1 Knighton Road, Hamilton 3240, New Zealand
- Maryanne Garry
- School of Psychology, The University of Waikato, 1 Knighton Road, Hamilton 3240, New Zealand
|
26
|
Lacassagne D, Béna J, Corneille O. Is Earth a perfect square? Repetition increases the perceived truth of highly implausible statements. Cognition 2022; 223:105052. [PMID: 35144111 DOI: 10.1016/j.cognition.2022.105052] [Citation(s) in RCA: 8] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/28/2021] [Revised: 01/28/2022] [Accepted: 01/29/2022] [Indexed: 11/03/2022]
Abstract
A single exposure to statements is typically enough to increase their perceived truth. This Truth-by-Repetition (TBR) effect has long been assumed to occur only with statements whose truth value is unknown to participants. Contrary to this hypothesis, recent research has found that statements contradicting participants' prior knowledge (as established from a first sample of participants) show a TBR effect following their repetition (in a second, independent sample of participants). Until now, however, attempts at finding a TBR effect for blatantly false (i.e., highly implausible) statements have failed. Here, we reasoned that highly implausible statements such as Elephants run faster than cheetahs may show repetition effects, provided a sensitive truth measure is used and statements are repeated more than just once. In a preregistered experiment, participants judged on a 100-point scale the truth of highly implausible statements that were either new to them or had been presented five times before judgment. We observed an effect of repetition: repeated statements were judged more true than new ones, although all statements were judged below the scale midpoint. Exploratory analyses additionally show that about half the participants showed no or even a reversed effect of repetition. The results provide the first empirical evidence that repetition can increase perceived truth even for highly implausible statements, although not equally so for all participants and not to the point of making the statements look true.
|
27
|
Unkelbach C, Taşbaş EHO. Repeating stereotypes: Increased belief and subsequent discrimination. EUROPEAN JOURNAL OF SOCIAL PSYCHOLOGY 2022. [DOI: 10.1002/ejsp.2835] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/07/2022]
|
28
|
Saerys-Foy JE, LoCasto PC, Burn D, Ferranti D. Superman Takes a Taxi: Testing Theories of Validation with Inconsistencies in Fantastic Narratives. DISCOURSE PROCESSES 2022. [DOI: 10.1080/0163853x.2021.1994298] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/19/2022]
Affiliation(s)
- David Burn
- Department of Mathematics, Quinnipiac University
|
29
|
Pennycook G. A framework for understanding reasoning errors: From fake news to climate change and beyond. ADVANCES IN EXPERIMENTAL SOCIAL PSYCHOLOGY 2022. [DOI: 10.1016/bs.aesp.2022.11.003] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 12/13/2022]
|
30
|
Abstract
People judge repeated information as more true than novel information. This truth-by-repetition effect is of relevance for explaining belief in fake news, conspiracy theories, or misinformation effects. To ascertain whether increased motivation could reduce this effect, we tested the influence of monetary incentives on participants’ truth judgments. We used a standard truth paradigm, consisting of a presentation and judgment phase with factually true and false information, and incentivized every truth judgment. Monetary incentives may influence truth judgments in two ways. First, participants may rely more on relevant knowledge, leading to better discrimination between true and false statements. Second, participants may rely less on repetition, leading to a lower bias to respond “true.” We tested these predictions in a preregistered and high-powered experiment. However, incentives did not influence the percentage of “true” judgments or correct responses in general, despite participants’ longer response times in the incentivized conditions and evidence for knowledge about the statements. Our findings show that even monetary consequences do not protect against the truth-by-repetition effect, further substantiating its robustness and relevance and highlighting its potential hazardous effects when used in purposeful misinformation.
|
31
|
COVID-19 as infodemic: The impact of political orientation and open-mindedness on the discernment of misinformation in WhatsApp. JUDGMENT AND DECISION MAKING 2021. [DOI: 10.1017/s193029750000855x] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/04/2023]
Abstract
Messaging applications are changing the communication landscape in emerging countries. While offering speed and affordability, these solutions have also opened the way for the spread of misinformation. Aiming to better understand the dynamics of COVID-19 as infodemic, we asked Brazilian participants (n=1007) to report the perceived accuracy of 20 messages (10 true and 10 false). Each message was randomly presented within five fictitious WhatsApp group chats of varying political orientation. Correlational analyses revealed that right-wing participants had lower levels of truth discernment, as did those with greater trust in social media as a reliable source of coronavirus information. Conversely, open-minded thinking about evidence and trust in the WHO and traditional media were positively associated with truth discernment. Familiarity with the content consistently increased perceived truth for both true and false messages. Results point to the nefarious effects of COVID-19 politicization and underline the importance of promoting the ability to recognize and value new evidence as well as enhancing trust in international agencies and traditional media.
|
32
|
Judging fast and slow: The truth effect does not increase under time-pressure conditions. JUDGMENT AND DECISION MAKING 2021. [DOI: 10.1017/s193029750000841x] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/04/2023]
Abstract
Due to the information overload in today’s digital age, people may sometimes feel pressured to process and judge information especially fast. In three experiments, we examined whether time pressure increases the repetition-based truth effect: the tendency to judge repeatedly encountered statements as “true” more often than novel statements. Based on the Heuristic-Systematic Model, a dual-process model in the field of persuasion research, we expected that time pressure would boost the truth effect by increasing reliance on processing fluency as a presumably heuristic cue for truth, and by decreasing knowledge retrieval as a presumably slow and systematic process that determines truth judgments. However, contrary to our expectation, time pressure did not moderate the truth effect. Importantly, this was the case for difficult statements, for which most people lack prior knowledge, as well as for easy statements, for which most people hold relevant knowledge. Overall, the findings clearly speak against the conception of fast, fluency-based truth judgments versus slow, knowledge-based truth judgments. In contrast, the results are compatible with a referential theory of the truth effect that does not distinguish between different types of truth judgments. Instead, it assumes that truth judgments rely on the coherence of localized networks in people’s semantic memory, formed by both repetition and prior knowledge.
|
33
|
Pillai RM, Fazio LK. The effects of repeating false and misleading information on belief. WILEY INTERDISCIPLINARY REVIEWS. COGNITIVE SCIENCE 2021; 12:e1573. [PMID: 34423562 DOI: 10.1002/wcs.1573] [Citation(s) in RCA: 6] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/28/2020] [Revised: 07/20/2021] [Accepted: 07/22/2021] [Indexed: 11/09/2022]
Abstract
False and misleading information is readily accessible in people's environments, oftentimes reaching people repeatedly. This repeated exposure can significantly affect people's beliefs about the world, as has been noted by scholars in political science, communication, and cognitive, developmental, and social psychology. In particular, repetition increases belief in false information, even when the misinformation contradicts prior knowledge. We review work across these disciplines, identifying factors that may heighten, diminish, or have no impact on these adverse effects of repetition on belief. Specifically, we organize our discussion around variations in what information is repeated, to whom the information is repeated, how people interact with this repetition, and how people's beliefs are measured. A key cross-disciplinary theme is that the most influential factor is how carefully or critically people process the false information. However, several open questions remain when comparing findings across different fields and approaches. We conclude by noting a need for more interdisciplinary work to help resolve these questions, as well as a need for more work in naturalistic settings so that we can better understand and combat the effects of repeated circulation of false and misleading information in society. This article is categorized under: Psychology > Memory; Psychology > Reasoning and Decision Making.
Affiliation(s)
- Lisa K Fazio
- Vanderbilt University, Nashville, Tennessee, USA
|
34
|
Confer SV, Diller JW, Danforth JS. A Behavior-Analytic Approach to Antivaccination Practices. BEHAVIOR AND SOCIAL ISSUES 2021; 30:648-665. [PMID: 38624918 PMCID: PMC8186869 DOI: 10.1007/s42822-021-00051-5] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 03/12/2021] [Indexed: 11/07/2022]
Abstract
In recent years, there has been an increase in outbreaks of diseases that are preventable by vaccination. As vaccination involves behavior, behavior analysts are uniquely positioned to contribute solutions to this socially significant problem. The present article explores a behavior-analytic approach to understanding the function of the behavior of both parents who have their children vaccinated and those who do not, and potential interventions to increase vaccination rates. An introduction to the problem is followed by a brief history of the antivaccination movement. In our analysis, a failure to vaccinate is conceptualized as a noncompliance response (i.e., medical nonadherence), and conditions giving rise to that noncompliance are evaluated. In this process, the roles of avoidance, the functional-altering impact of rule-governed behavior, relational frames, and countercontrol are considered. Potential solutions informed by applied behavior-analytic literature, including contingency management and behavioral safety, are discussed.
|
35
|
Henderson EL, Simons DJ, Barr DJ. The Trajectory of Truth: A Longitudinal Study of the Illusory Truth Effect. J Cogn 2021; 4:29. [PMID: 34164597 PMCID: PMC8194981 DOI: 10.5334/joc.161] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/10/2020] [Accepted: 04/19/2021] [Indexed: 11/23/2022] Open
Abstract
Repeated statements are rated as subjectively truer than comparable new statements, even though repetition alone provides no new, probative information (the illusory truth effect). Contrary to some theoretical predictions, the illusory truth effect seems to be similar in magnitude for repetitions occurring after minutes or weeks. This Registered Report describes a longitudinal investigation of the illusory truth effect (n = 608, n = 567 analysed) in which we systematically manipulated intersession interval (immediately, one day, one week, and one month) in order to test whether the illusory truth effect is immune to time. Both our hypotheses were supported: We observed an illusory truth effect at all four intervals (overall effect: χ²(1) = 169.91; M_repeated = 4.52, M_new = 4.14; H1), with the effect diminishing as delay increased (H2). False information repeated over short timescales might have a greater effect on truth judgements than repetitions over longer timescales. Researchers should consider the implications of the choice of intersession interval when designing future illusory truth effect research.
Affiliation(s)
- Emma L. Henderson
- Faculty of Business and Social Sciences, Kingston University, Kingston Hill Campus, Kingston Hill, Kingston upon Thames, KT2 7LB, UK
- School of Psychology, University of Surrey, Guildford, Surrey, GU2 7XH, UK
- Daniel J. Simons
- Department of Psychology, University of Illinois at Urbana-Champaign, US
- Dale J. Barr
- Institute of Neuroscience & Psychology, University of Glasgow, UK
|
36
|
Unkelbach C, Speckmann F. Mere repetition increases belief in factually true COVID-19-related information. JOURNAL OF APPLIED RESEARCH IN MEMORY AND COGNITION 2021. [DOI: 10.1016/j.jarmac.2021.02.001] [Citation(s) in RCA: 10] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/29/2022]
|
37
|
Hassan A, Barber SJ. The effects of repetition frequency on the illusory truth effect. COGNITIVE RESEARCH-PRINCIPLES AND IMPLICATIONS 2021; 6:38. [PMID: 33983553 PMCID: PMC8116821 DOI: 10.1186/s41235-021-00301-5] [Citation(s) in RCA: 22] [Impact Index Per Article: 7.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 07/29/2020] [Accepted: 04/28/2021] [Indexed: 11/10/2022]
Abstract
Repeated information is often perceived as more truthful than new information. This finding is known as the illusory truth effect, and it is typically thought to occur because repetition increases processing fluency. Because fluency and truth are frequently correlated in the real world, people learn to use processing fluency as a marker for truthfulness. Although the illusory truth effect is a robust phenomenon, almost all studies examining it have used three or fewer repetitions. To address this limitation, we conducted two experiments using a larger number of repetitions. In Experiment 1, we showed participants trivia statements up to 9 times and in Experiment 2 statements were shown up to 27 times. Later, participants rated the truthfulness of the previously seen statements and of new statements. In both experiments, we found that perceived truthfulness increased as the number of repetitions increased. However, these truth rating increases were logarithmic in shape. The largest increase in perceived truth came from encountering a statement for the second time, and beyond this were incrementally smaller increases in perceived truth for each additional repetition. These findings add to our theoretical understanding of the illusory truth effect and have applications for advertising, politics, and the propagation of "fake news."
Affiliation(s)
- Aumyo Hassan
- Department of Psychology, San Francisco State University, 1600 Holloway Avenue, Ethnic Studies & Psychology Building, San Francisco, CA 94132, USA
- Sarah J Barber
- Department of Psychology, Georgia State University, P.O. Box 5010, Atlanta, GA, 30302, USA.
|
38
|
Liang S, Dong X, Yan Y, Chang Y. The Influence of the Inconsistent Color Presentation of the Original Price and Sale Price on Purchase Likelihood. Front Psychol 2021; 12:603754. [PMID: 33790827 PMCID: PMC8005512 DOI: 10.3389/fpsyg.2021.603754] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/13/2020] [Accepted: 02/08/2021] [Indexed: 11/13/2022] Open
Abstract
Retailers often use different colors to present the sale price and the original price when advertising a promotion. How does this inconsistent color presentation of prices influence consumers’ purchase likelihood? The extant research has not considered this question, which this article addresses. Drawing on incongruence theory and the persuasion knowledge model (PKM), this article proposes that when the color of the sale price is inconsistent (vs. consistent) with that of the original price, consumers show less preference for the sale price: they perceive the price as less trustworthy, which leads to a lower purchase likelihood. Furthermore, this effect is moderated by the brand awareness of products. Specifically, when products are from less-known brands, the inconsistent (vs. consistent) colors of the sale price and original price lead to a lower purchase likelihood. In contrast, when products are from well-known brands, the inconsistent (vs. consistent) colors lead to a higher purchase likelihood. Four studies are used to verify these hypotheses, and theoretical and practical implications of the present research are discussed.
Affiliation(s)
- Yanling Yan
- Zhengzhou University of Light Industry, Zhengzhou, China
- Yaping Chang
- Huazhong University of Science and Technology, Wuhan, China
|
39
|
Pennycook G, Rand DG. The Psychology of Fake News. Trends Cogn Sci 2021; 25:388-402. [PMID: 33736957 DOI: 10.1016/j.tics.2021.02.007] [Citation(s) in RCA: 210] [Impact Index Per Article: 70.0] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/01/2020] [Revised: 02/13/2021] [Accepted: 02/17/2021] [Indexed: 12/25/2022]
Abstract
We synthesize a burgeoning literature investigating why people believe and share false or highly misleading news online. Contrary to a common narrative whereby politics drives susceptibility to fake news, people are 'better' at discerning truth from falsehood (despite greater overall belief) when evaluating politically concordant news. Instead, poor truth discernment is associated with lack of careful reasoning and relevant knowledge, and the use of heuristics such as familiarity. Furthermore, there is a substantial disconnect between what people believe and what they share on social media. This dissociation is largely driven by inattention, more so than by purposeful sharing of misinformation. Thus, interventions can successfully nudge social media users to focus more on accuracy. Crowdsourced veracity ratings can also be leveraged to improve social media ranking algorithms.
Affiliation(s)
- Gordon Pennycook
- Hill/Levene Schools of Business, University of Regina, Regina, SK S4S 0A2, Canada; Department of Psychology, University of Regina, Regina, SK S4S 0A2, Canada
- David G Rand
- Sloan School of Management, Massachusetts Institute of Technology, Cambridge, MA 02142, USA; Institute for Data, Systems, and Society, Massachusetts Institute of Technology, Cambridge, MA 02142, USA; Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA 02142, USA
|
40
|
Grimes DR. Medical disinformation and the unviable nature of COVID-19 conspiracy theories. PLoS One 2021; 16:e0245900. [PMID: 33711025 PMCID: PMC7954317 DOI: 10.1371/journal.pone.0245900] [Citation(s) in RCA: 47] [Impact Index Per Article: 15.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/09/2020] [Accepted: 01/10/2021] [Indexed: 01/13/2023] Open
Abstract
The coronavirus pandemic has seen a marked rise in medical disinformation across social media. A variety of claims have garnered considerable traction, including the assertion that COVID is a hoax or deliberately manufactured, that 5G frequency radiation causes coronavirus, and that the pandemic is a ruse by big pharmaceutical companies to profiteer off a vaccine. An estimated 30% of some populations subscribe to some form of COVID medico-scientific conspiracy narrative, with detrimental impacts for themselves and others. Consequently, exposing the lack of veracity of these claims is of considerable importance. Previous work has demonstrated that historical medical and scientific conspiracies are highly unlikely to be sustainable. In this article, an expanded model for a hypothetical en masse COVID conspiracy is derived. Analysis suggests that even under ideal circumstances for conspirators, commonly encountered conspiratorial claims are highly unlikely to endure, and would quickly be exposed. This work also explores the spectrum of medico-scientific acceptance, motivations behind propagation of falsehoods, and the urgent need for the medical and scientific community to anticipate and counter the emergence of falsehoods.
Affiliation(s)
- David Robert Grimes
- School of Physical Sciences, Dublin City University, Dublin, Leinster, Ireland
- Department of Oncology, University of Oxford, Roosevelt Drive, Oxford, Oxfordshire, United Kingdom
|
41
|
Jalbert M, Schwarz N, Newman E. Only half of what I’ll tell you is true: Expecting to encounter falsehoods reduces illusory truth. J Appl Res Mem Cogn 2020. [DOI: 10.1016/j.jarmac.2020.08.010] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/27/2022]
|
42
|
Calvillo DP, Smelter TJ. An initial accuracy focus reduces the effect of prior exposure on perceived accuracy of news headlines. Cogn Res Princ Implic 2020; 5:55. [PMID: 33151449 PMCID: PMC7644737 DOI: 10.1186/s41235-020-00257-y] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 05/05/2020] [Accepted: 10/18/2020] [Indexed: 11/10/2022]
Abstract
The illusory truth effect occurs when the repetition of a claim increases its perceived truth. Previous studies have demonstrated the illusory truth effect with true and false news headlines. The present study examined the effects that different ratings made during initial exposure have on the illusory truth effect with news headlines. In two experiments, participants (total N = 575) rated a set of news headlines in one of two conditions. Some participants rated how interesting the headlines were, and others rated how truthful they were. Participants later rated the perceived accuracy of a larger set of headlines that included previously rated and new headlines. In both experiments, prior exposure increased perceived accuracy for participants who made initial interest ratings, but not for participants who made initial truthfulness ratings. The increase in perceived accuracy that accompanies repeated exposure was attenuated when participants considered the accuracy of the headlines at initial exposure. Experiment 2 also found evidence for a political bias: participants rated politically concordant headlines as more accurate than politically discordant headlines. The magnitude of this bias was related to performance on a cognitive reflection test; more analytic participants demonstrated greater political bias. These results highlight challenges that fake news presents and suggest that initially encoding headlines’ perceived truth can serve to combat the illusion that a familiar headline is a truthful one.
Affiliation(s)
- Dustin P Calvillo
- Psychology Department, California State University San Marcos, 333 South Twin Oaks Valley Road, San Marcos, CA, 92096, USA
- Thomas J Smelter
- Psychology Department, California State University San Marcos, 333 South Twin Oaks Valley Road, San Marcos, CA, 92096, USA
|
43
|
Calio F, Nadarevic L, Musch J. How explicit warnings reduce the truth effect: A multinomial modeling approach. Acta Psychol (Amst) 2020; 211:103185. [PMID: 33130489 DOI: 10.1016/j.actpsy.2020.103185] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/04/2019] [Revised: 08/10/2020] [Accepted: 08/29/2020] [Indexed: 11/28/2022] Open
Abstract
The finding that repeating a statement typically increases its perceived truth has been referred to as the truth effect. Previous research has found that warning participants about the truth effect can successfully reduce, but not eliminate the effect. We used a multinomial modeling approach to investigate how warnings affect the cognitive processes that are assumed to underlie judgments of truth. In a laboratory experiment (N = 167), half of the participants were warned about the truth effect before judging the truth of repeated and new statements. Importantly, whereas half of the presented statements were of relatively unknown validity, participants could likely identify the correct truth status for the other half of the statements by drawing on stored knowledge. Multinomial modeling analyses revealed that warning instructions did not affect the retrieval of knowledge or participants' guessing behavior relative to a control condition. Instead, warned participants exhibited a significantly reduced tendency to rely on experiential information such as processing fluency when judging a repeated statement's truth. However, this was only the case for statements for which participants held relevant knowledge. These results are consistent with the notion that it is possible to discount metacognitive experiences such as processing ease when their informational value is questioned. Specifically, our findings suggest that people are less likely to base their judgments of truth on experiential information and metacognitive experiences induced by repetition if (a) they are warned about the deceptive power of repetition, and (b) other valid cues are available to inform their judgments.
Affiliation(s)
- Frank Calio
- Department of Experimental Psychology, University of Düsseldorf, Germany
- Lena Nadarevic
- Department of Psychology, School of Social Sciences, University of Mannheim, Germany
- Jochen Musch
- Department of Experimental Psychology, University of Düsseldorf, Germany
|
44
|
Schnuerch M, Nadarevic L, Rouder JN. The truth revisited: Bayesian analysis of individual differences in the truth effect. Psychon Bull Rev 2020; 28:750-765. [PMID: 33104997 PMCID: PMC8219594 DOI: 10.3758/s13423-020-01814-8] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 09/05/2020] [Indexed: 11/08/2022]
Abstract
The repetition-induced truth effect refers to a phenomenon where people rate repeated statements as more likely true than novel statements. In this paper, we document qualitative individual differences in the effect. While the overwhelming majority of participants display the usual positive truth effect, a minority are the opposite: they reliably discount the validity of repeated statements, what we refer to as a negative truth effect. We examine eight truth-effect data sets where individual-level data are curated. These sets are composed of 1105 individuals performing 38,904 judgments. Through Bayes factor model comparison, we show that reliable negative truth effects occur in five of the eight data sets. The negative truth effect is informative because it seems unreasonable that the mechanisms mediating the positive truth effect are the same ones that lead to a discounting of repeated statements' validity. Moreover, the presence of qualitative differences motivates a different type of analysis of individual differences based on ordinal (i.e., Which sign does the effect have?) rather than metric measures. To our knowledge, this paper reports the first such reliable qualitative differences in a cognitive task.
|
45
|
Doherty JF. When fiction becomes fact: exaggerating host manipulation by parasites. Proc Biol Sci 2020; 287:20201081. [PMID: 33049168 DOI: 10.1098/rspb.2020.1081] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/18/2022] Open
Abstract
In an era where some find fake news around every corner, the use of sensationalism has inevitably found its way into the scientific literature. This is especially the case for host manipulation by parasites, a phenomenon in which a parasite causes remarkable change in the appearance or behaviour of its host. This concept, which has deservedly garnered popular interest throughout the world in recent years, is nearly 50 years old. In the past two decades, the use of scientific metaphors, including anthropomorphisms and science fiction, to describe host manipulation has become more and more prevalent. It is possible that the repeated use of such catchy, yet misleading words in both the popular media and the scientific literature could unintentionally hamper our understanding of the complexity and extent of host manipulation, ultimately shaping its narrative in part or in full. In this commentary, the impacts of exaggerating host manipulation are brought to light by examining trends in the use of embellishing words. By looking at key examples of exaggerated claims from widely reported host-parasite systems found in the recent scientific literature, it would appear that some of the fiction surrounding host manipulation has since become fact.
|
46
|
Swire-Thompson B, DeGutis J, Lazer D. Searching for the Backfire Effect: Measurement and Design Considerations. J Appl Res Mem Cogn 2020; 9:286-299. [PMID: 32905023 PMCID: PMC7462781 DOI: 10.1016/j.jarmac.2020.06.006] [Citation(s) in RCA: 57] [Impact Index Per Article: 14.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/14/2020] [Revised: 06/20/2020] [Accepted: 06/21/2020] [Indexed: 01/14/2023]
Abstract
One of the most concerning notions for science communicators, fact-checkers, and advocates of truth is the backfire effect: when a correction leads to an individual increasing their belief in the very misconception the correction is aiming to rectify. There is currently a debate in the literature as to whether backfire effects exist at all, as recent studies have failed to find the phenomenon, even under theoretically favorable conditions. In this review, we summarize the current state of the worldview and familiarity backfire effect literatures. We subsequently examine barriers to measuring the backfire phenomenon, discuss approaches to improving measurement and design, and conclude with recommendations for fact-checkers. We suggest that backfire effects are not a robust empirical phenomenon, and more reliable measures, powerful designs, and stronger links between experimental design and theory could greatly help move the field ahead.
|
47
|
Fazio LK. Repetition Increases Perceived Truth Even for Known Falsehoods. Collabra Psychol 2020. [DOI: 10.1525/collabra.347] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
Repetition increases belief in false statements. This illusory truth effect occurs with many different types of statements (e.g., trivia facts, news headlines, advertisements), and even occurs when the false statement contradicts participants’ prior knowledge. However, existing studies of the effect of prior knowledge on the illusory truth effect share a common flaw; they measure participants’ knowledge after the experimental manipulation and thus conditionalize responses on posttreatment variables. In the current study, we measure prior knowledge prior to the experimental manipulation and thus provide a cleaner measurement of the causal effect of repetition on belief. We again find that prior knowledge does not protect against the illusory truth effect. Repeated false statements were given higher truth ratings than novel statements, even when they contradicted participants’ prior knowledge.
Affiliation(s)
- Lisa K. Fazio
- Department of Psychology and Human Development, Vanderbilt University, Nashville, Tennessee, USA
|
48
|
Brashier NM, Schacter DL. Aging in an Era of Fake News. Curr Dir Psychol Sci 2020; 29:316-323. [DOI: 10.1177/0963721420915872]
Abstract
Misinformation causes serious harm, from sowing doubt in modern medicine to inciting violence. Older adults are especially susceptible: they shared the most fake news during the 2016 US election. The most intuitive explanation for this pattern blames cognitive deficits. While older adults forget where they learned information, fluency remains intact, and decades of accumulated knowledge help them evaluate claims. Thus, cognitive declines cannot fully explain older adults' engagement with fake news. Late adulthood also involves social changes, including greater general trust, difficulty detecting lies, and less emphasis on accuracy when communicating. In addition, older adults are relative newcomers to social media, who may struggle to spot sponsored content or manipulated images. In a post-truth world, interventions should consider older adults' shifting social goals and gaps in their digital literacy.
|
49
|
Donovan AM, Rapp DN. Look it up: Online search reduces the problematic effects of exposures to inaccuracies. Mem Cognit 2020; 48:1128-1145. [DOI: 10.3758/s13421-020-01047-z] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
|
50
|
Brashier NM, Marsh EJ. Judging Truth. Annu Rev Psychol 2020; 71:499-515. [DOI: 10.1146/annurev-psych-010419-050807]
Abstract
Deceptive claims surround us, embedded in fake news, advertisements, political propaganda, and rumors. How do people know what to believe? Truth judgments reflect inferences drawn from three types of information: base rates, feelings, and consistency with information retrieved from memory. First, people exhibit a bias to accept incoming information, because most claims in our environments are true. Second, people interpret feelings, like ease of processing, as evidence of truth. And third, people can (but do not always) consider whether assertions match facts and source information stored in memory. This three-part framework predicts specific illusions (e.g., truthiness, illusory truth), offers ways to correct stubborn misconceptions, and suggests the importance of converging cues in a post-truth world, where falsehoods travel further and faster than the truth.
Affiliation(s)
- Nadia M. Brashier
- Department of Psychology, Harvard University, Cambridge, Massachusetts 02138, USA
- Elizabeth J. Marsh
- Department of Psychology & Neuroscience, Duke University, Durham, North Carolina 27708, USA
|