1
Johnson A, Madsen JK. Inoculation hesitancy: an exploration of challenges in scaling inoculation theory. R Soc Open Sci 2024; 11:231711. PMID: 39100154; PMCID: PMC11296080; DOI: 10.1098/rsos.231711.
Abstract
Inoculation theory research offers a promising psychological 'vaccination' against misinformation. But are people willing to take it? Expanding on the inoculation metaphor, we introduce the concept of 'inoculation hesitancy' as a framework for exploring reluctance to engage with misinformation interventions. Study 1 investigated whether individuals feel a need for misinformation inoculations. In a comparative self-evaluation, participants assessed their own experiences with misinformation and expectations of inoculation and compared them to those of the average person. Results revealed a better-than-average effect: while participants were concerned about the problem of misinformation, they estimated that they were less likely to be exposed to it, and more skilful at detecting it, than the average person. Their self-described likelihood of engaging with inoculation was moderate, and they believed other people would benefit more from being inoculated. In Study 2, participants evaluated their inclination to watch inoculation videos from sources varying in trustworthiness and political affiliation. Results suggest that participants are significantly less willing to accept inoculations from low-trust sources and from partisan sources antithetical to their own political beliefs. Overall, this research identifies motivational obstacles to reaching herd immunity with inoculation theory, guiding future development of inoculation interventions.
Affiliation(s)
- Alexandra Johnson
- Department of Psychological and Behavioural Sciences, London School of Economics and Political Science, Houghton Street, London, WC2A 2AE, UK
- Jens Koed Madsen
- Department of Psychological and Behavioural Sciences, London School of Economics and Political Science, Houghton Street, London, WC2A 2AE, UK
2
Goebel JT, Susmann MW, Parthasarathy S, El Gamal H, Garrett RK, Wegener DT. Belief-consistent information is most shared despite being the least surprising. Sci Rep 2024; 14:6109. PMID: 38480773; PMCID: PMC10937659; DOI: 10.1038/s41598-024-56086-2.
Abstract
In the classical information-theoretic framework, information "value" is proportional to how novel or surprising the information is. Recent work building on such notions claimed that false news spreads faster than truth online because false news is more novel and therefore more surprising. However, another determinant of surprise, semantic meaning (e.g., information's consistency or inconsistency with prior beliefs), should also influence value and sharing. Examining sharing behavior on Twitter, we observed separate relations of novelty and belief consistency with sharing. Though surprise could not be assessed in those studies, belief consistency should relate to less surprise, suggesting the relevance of semantic meaning beyond novelty. In two controlled experiments, belief-consistent (vs. belief-inconsistent) information was shared more despite consistent information being the least surprising. Manipulated novelty did not predict sharing or surprise. Thus, classical information-theoretic predictions regarding perceived value and sharing would benefit from considering semantic meaning in contexts where people hold pre-existing beliefs.
Affiliation(s)
- Jacob T Goebel
- Department of Psychology, Ohio State University, Columbus, OH, USA.
- Mark W Susmann
- Department of Psychology, Ohio State University, Columbus, OH, USA
- Department of Psychology, Vanderbilt University, Nashville, TN, USA
- S Parthasarathy
- Department of Computer Science and Engineering, Ohio State University, Columbus, OH, USA
- Hesham El Gamal
- Faculty of Engineering, University of Sydney, Sydney, NSW, Australia
- R Kelly Garrett
- School of Communication, Ohio State University, Columbus, OH, USA
- Duane T Wegener
- Department of Psychology, Ohio State University, Columbus, OH, USA
3
Globig LK, Holtz N, Sharot T. Changing the incentive structure of social media platforms to halt the spread of misinformation. eLife 2023; 12:e85767. PMID: 37278047; DOI: 10.7554/eLife.85767.
Abstract
The powerful allure of social media platforms has been attributed to the human need for social rewards. Here, we demonstrate that the spread of misinformation on such platforms is facilitated by existing social 'carrots' (e.g., 'likes') and 'sticks' (e.g., 'dislikes') that are dissociated from the veracity of the information shared. Testing 951 participants over six experiments, we show that a slight change to the incentive structure of social media platforms, such that social rewards and punishments are contingent on information veracity, produces a considerable increase in the discernment of shared information, that is, in the proportion of true relative to false information shared. Computational modeling (drift-diffusion models) revealed that the underlying mechanism of this effect is an increase in the weight participants assign to evidence consistent with discerning behavior. The results offer evidence for an intervention that could be adopted to reduce misinformation spread, which in turn could reduce violence, vaccine hesitancy, and political polarization, without reducing engagement.
Affiliation(s)
- Laura K Globig
- Affective Brain Lab, Department of Experimental Psychology, University College London, London, United Kingdom
- The Max Planck UCL Centre for Computational Psychiatry and Ageing Research, University College London, London, United Kingdom
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, United States
- Nora Holtz
- Affective Brain Lab, Department of Experimental Psychology, University College London, London, United Kingdom
- The Max Planck UCL Centre for Computational Psychiatry and Ageing Research, University College London, London, United Kingdom
- Tali Sharot
- Affective Brain Lab, Department of Experimental Psychology, University College London, London, United Kingdom
- The Max Planck UCL Centre for Computational Psychiatry and Ageing Research, University College London, London, United Kingdom
- Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, United States
4
McPhedran R, Ratajczak M, Mawby M, King E, Yang Y, Gold N. Psychological inoculation protects against the social media infodemic. Sci Rep 2023; 13:5780. PMID: 37031339; PMCID: PMC10082776; DOI: 10.1038/s41598-023-32962-1.
Abstract
Misinformation can have a profound detrimental impact on populations' wellbeing. In this large UK-based online experiment (n = 2430), we assessed the performance of false-tag and inoculation interventions in protecting against different forms of misinformation ('variants'). While previous experiments have used perception- or intention-based outcome measures, we presented participants with real-life misinformation posts in a social media platform simulation and measured their engagement, a more ecologically valid approach. Our pre-registered mixed-effects models indicated that both interventions reduced engagement with misinformation, but inoculation was the most effective. However, random differences analysis revealed that the protection conferred by inoculation differed across posts. Moderation analysis indicated that the immunity provided by inoculation is robust to variation in individuals' cognitive reflection. This study provides novel evidence on the general effectiveness of inoculation interventions over false tags, the current approach of social media platforms. Given inoculation's effect heterogeneity, a concert of interventions will likely be required for future safeguarding efforts.
Affiliation(s)
- Robert McPhedran
- Behavioural Practice, Kantar Public UK, 4 Millbank, London, SW1P 3JA, UK
- Michael Ratajczak
- Behavioural Practice, Kantar Public UK, 4 Millbank, London, SW1P 3JA, UK
- Department of Linguistics and English Language, Lancaster University, Bailrigg, Lancaster, LA1 4YL, UK
- Max Mawby
- Behavioural Practice, Kantar Public UK, 4 Millbank, London, SW1P 3JA, UK
- Emily King
- Behavioural Practice, Kantar Public UK, 4 Millbank, London, SW1P 3JA, UK
- Yuchen Yang
- Behavioural Practice, Kantar Public UK, 4 Millbank, London, SW1P 3JA, UK
- Natalie Gold
- Behavioural Practice, Kantar Public UK, 4 Millbank, London, SW1P 3JA, UK
- Centre for Philosophy of Natural and Social Science (CPNSS), London School of Economics and Political Science, Houghton Street, London, WC2A 2AE, UK