1. Kozyreva A, Lorenz-Spreen P, Herzog SM, Ecker UKH, Lewandowsky S, Hertwig R, Ali A, Bak-Coleman J, Barzilai S, Basol M, Berinsky AJ, Betsch C, Cook J, Fazio LK, Geers M, Guess AM, Huang H, Larreguy H, Maertens R, Panizza F, Pennycook G, Rand DG, Rathje S, Reifler J, Schmid P, Smith M, Swire-Thompson B, Szewach P, van der Linden S, Wineburg S. Toolbox of individual-level interventions against online misinformation. Nat Hum Behav 2024; 8:1044-1052. PMID: 38740990. DOI: 10.1038/s41562-024-01881-0.
Abstract
The spread of misinformation through media and social networks threatens many aspects of society, including public health and the state of democracies. One approach to mitigating the effect of misinformation focuses on individual-level interventions, equipping policymakers and the public with essential tools to curb the spread and influence of falsehoods. Here we introduce a toolbox of individual-level interventions for reducing harm from online misinformation. Comprising an up-to-date account of interventions featured in 81 scientific papers from across the globe, the toolbox provides both a conceptual overview of nine main types of interventions, including their target, scope and examples, and a summary of the empirical evidence supporting the interventions, including the methods and experimental paradigms used to test them. The nine types of interventions covered are accuracy prompts, debunking and rebuttals, friction, inoculation, lateral reading and verification strategies, media-literacy tips, social norms, source-credibility labels, and warning and fact-checking labels.
Affiliation(s)
- Anastasia Kozyreva: Center for Adaptive Rationality, Max Planck Institute for Human Development, Berlin, Germany
- Philipp Lorenz-Spreen: Center for Adaptive Rationality, Max Planck Institute for Human Development, Berlin, Germany
- Stefan M Herzog: Center for Adaptive Rationality, Max Planck Institute for Human Development, Berlin, Germany
- Ullrich K H Ecker: School of Psychological Science & Public Policy Institute, University of Western Australia, Perth, Western Australia, Australia
- Stephan Lewandowsky: School of Psychological Science, University of Bristol, Bristol, UK; Department of Psychology, University of Potsdam, Potsdam, Germany
- Ralph Hertwig: Center for Adaptive Rationality, Max Planck Institute for Human Development, Berlin, Germany
- Ayesha Ali: Department of Economics, Lahore University of Management Sciences, Lahore, Pakistan
- Joe Bak-Coleman: Craig Newmark Center, School of Journalism, Columbia University, New York, NY, USA
- Sarit Barzilai: Department of Learning and Instructional Sciences, University of Haifa, Haifa, Israel
- Melisa Basol: Department of Psychology, University of Cambridge, Cambridge, UK
- Adam J Berinsky: Department of Political Science, Massachusetts Institute of Technology, Cambridge, MA, USA
- Cornelia Betsch: Institute for Planetary Health Behaviour, University of Erfurt, Erfurt, Germany; Bernhard Nocht Institute for Tropical Medicine, Hamburg, Germany
- John Cook: Melbourne Centre for Behaviour Change, University of Melbourne, Melbourne, Victoria, Australia
- Lisa K Fazio: Department of Psychology and Human Development, Vanderbilt University, Nashville, TN, USA
- Michael Geers: Center for Adaptive Rationality, Max Planck Institute for Human Development, Berlin, Germany; Department of Psychology, Humboldt University of Berlin, Berlin, Germany
- Andrew M Guess: Department of Politics and School of Public and International Affairs, Princeton University, Princeton, NJ, USA
- Haifeng Huang: Department of Political Science, Ohio State University, Columbus, OH, USA
- Horacio Larreguy: Departments of Economics and Political Science, Instituto Tecnológico Autónomo de México, Mexico City, Mexico
- Rakoen Maertens: Department of Experimental Psychology, University of Oxford, Oxford, UK
- Gordon Pennycook: Department of Psychology, Cornell University, Ithaca, NY, USA; Department of Psychology, University of Regina, Regina, Saskatchewan, Canada
- David G Rand: Sloan School of Management, Massachusetts Institute of Technology, Cambridge, MA, USA
- Steve Rathje: Department of Psychology, New York University, New York, NY, USA
- Jason Reifler: Department of Politics, University of Exeter, Exeter, UK
- Philipp Schmid: Institute for Planetary Health Behaviour, University of Erfurt, Erfurt, Germany; Bernhard Nocht Institute for Tropical Medicine, Hamburg, Germany; Centre for Language Studies, Radboud University Nijmegen, Nijmegen, the Netherlands
- Mark Smith: Graduate School of Education, Stanford University, Stanford, CA, USA
- Paula Szewach: Department of Politics, University of Exeter, Exeter, UK; Barcelona Supercomputing Center, Barcelona, Spain
- Sam Wineburg: Graduate School of Education, Stanford University, Stanford, CA, USA
2. Kemp PL, Sinclair AH, Adcock RA, Wahlheim CN. Memory and belief updating following complete and partial reminders of fake news. Cogn Res Princ Implic 2024; 9:28. PMID: 38713308. PMCID: PMC11076432. DOI: 10.1186/s41235-024-00546-w. Open access.
Abstract
Fake news can have enduring effects on memory and beliefs. An ongoing theoretical debate has investigated whether corrections (fact-checks) should include reminders of fake news. The familiarity backfire account proposes that reminders hinder correction (increasing interference), whereas integration-based accounts argue that reminders facilitate correction (promoting memory integration). In three experiments, we examined how different types of corrections influenced memory for and belief in news headlines. In the exposure phase, participants viewed real and fake news headlines. In the correction phase, participants viewed reminders of fake news that either reiterated the false details (complete) or prompted recall of missing false details (partial); reminders were followed by fact-checked headlines correcting the false details. Both reminder types led to proactive interference in memory for corrected details, but complete reminders produced less interference than partial reminders (Experiment 1). However, when participants had fewer initial exposures to fake news and experienced a delay between exposure and correction, this effect was reversed; partial reminders led to proactive facilitation, enhancing correction (Experiment 2). This effect occurred regardless of the delay before correction (Experiment 3), suggesting that the effects of partial reminders depend on the number of prior fake news exposures. In all experiments, memory and perceived accuracy were better when fake news and corrections were recollected, implicating a critical role for integrative encoding. Overall, we show that when memories of fake news are weak or less accessible, partial reminders are more effective for correction; when memories of fake news are stronger or more accessible, complete reminders are preferable.
Affiliation(s)
- Paige L Kemp: Department of Psychology, University of North Carolina at Greensboro, 296 Eberhart Building, P. O. Box 26170, Greensboro, NC, 27402-6170, USA
- Alyssa H Sinclair: Department of Psychology and Neuroscience, Duke University, Durham, NC, 27708, USA; Center for Science, Sustainability, and the Media, University of Pennsylvania, Philadelphia, USA
- R Alison Adcock: Department of Psychology and Neuroscience, Duke University, Durham, NC, 27708, USA; Department of Psychiatry and Behavioral Sciences, Duke University, Durham, USA
- Christopher N Wahlheim: Department of Psychology, University of North Carolina at Greensboro, 296 Eberhart Building, P. O. Box 26170, Greensboro, NC, 27402-6170, USA
3. Prike T, Butler LH, Ecker UKH. Source-credibility information and social norms improve truth discernment and reduce engagement with misinformation online. Sci Rep 2024; 14:6900. PMID: 38519569. PMCID: PMC10960008. DOI: 10.1038/s41598-024-57560-7. Open access.
Abstract
Misinformation on social media is a pervasive challenge. In this study (N = 415), a social-media simulation was used to test two potential interventions for countering misinformation: a credibility badge and a social norm. The credibility badge was implemented by associating accounts, including participants', with a credibility score. Participants' credibility score was dynamically updated depending on their engagement with true and false posts. To implement the social-norm intervention, participants were provided with both a descriptive norm (i.e., most people do not share misinformation) and an injunctive norm (i.e., sharing misinformation is the wrong thing to do). Both interventions were effective. The social-norm intervention led to reduced belief in false claims and improved discrimination between true and false claims. It also had some positive impact on social-media engagement, although some effects were not robust to alternative analysis specifications. The presence of credibility badges led to greater belief in true claims, lower belief in false claims, and improved discrimination. The credibility-badge intervention also had robust positive impacts on social-media engagement, leading to increased flagging and decreased liking and sharing of false posts. Cumulatively, the results suggest that both interventions have the potential to combat misinformation and improve the social-media information landscape.
Affiliation(s)
- Toby Prike: School of Psychological Science, University of Western Australia, Perth, Australia; School of Psychology, University of Adelaide, Adelaide, Australia
- Lucy H Butler: School of Psychological Science, University of Western Australia, Perth, Australia
- Ullrich K H Ecker: School of Psychological Science, University of Western Australia, Perth, Australia
4. Siebert J, Siebert JU. Enhancing misinformation correction: New variants and a combination of awareness training and counter-speech to mitigate belief perseverance bias. PLoS One 2024; 19:e0299139. PMID: 38363785. PMCID: PMC10871482. DOI: 10.1371/journal.pone.0299139. Open access.
Abstract
Belief perseverance bias refers to individuals' tendency to persevere in biased opinions even after the misinformation that initially shaped those opinions has been retracted. This study contributes to research on reducing the negative impact of misinformation by mitigating the belief perseverance bias. The study explores the previously proposed awareness-training and counter-speech debiasing techniques, further developing them by introducing new variants and combining them. We investigate their effectiveness in mitigating the belief perseverance bias after the retraction of misinformation related to a real-life issue in an experiment involving N = 876 individuals, of whom 364 exhibit belief perseverance bias. The effectiveness of the debiasing techniques is assessed by measuring the difference between the baseline opinions before exposure to misinformation and the opinions after exposure to a debiasing technique. Our study confirmed the effectiveness of the awareness-training and counter-speech debiasing techniques in mitigating the belief perseverance bias, finding no discernible differences in the effectiveness between the previously proposed and the new variants. Moreover, we observed that the combination of awareness training and counter-speech is more effective in mitigating the belief perseverance bias than the single debiasing techniques.
Affiliation(s)
- Jana Siebert: Faculty of Arts, Department of Economic and Managerial Studies, Palacky University Olomouc, Olomouc, Czech Republic
5. Porter E, Wood TJ. Factual corrections: Concerns and current evidence. Curr Opin Psychol 2024; 55:101715. PMID: 37988954. DOI: 10.1016/j.copsyc.2023.101715.
Abstract
Factual corrections that target misinformation improve belief accuracy. They do so across a wide variety of countries, political beliefs and demographic characteristics. Instances of backfire, wherein exposure to corrections reduces accuracy, are exceedingly rare and may be an artifact of research design. The evidence regarding other common concerns is mixed. While the effects of corrections on belief are not permanent, they are not entirely ephemeral, either. With some exceptions, corrections mostly affect only belief accuracy, with minor to nonexistent influence on downstream attitudes and behaviors. While corrections are not unpopular among the public, limited available evidence suggests that those who see misinformation are exceedingly unlikely to see relevant corrections.
6. Buczel KA, Siwiak A, Szpitalak M, Polczyk R. How do forewarnings and post-warnings affect misinformation reliance? The impact of warnings on the continued influence effect and belief regression. Mem Cognit 2024. PMID: 38261249. DOI: 10.3758/s13421-024-01520-z.
Abstract
People often continue to rely on certain information in their reasoning, even if this information has been retracted; this is called the continued influence effect (CIE) of misinformation. One technique for reducing this effect involves explicitly warning people that there is a possibility that they might have been misled. The present study aimed to investigate these warnings' effectiveness, depending on when they were given (either before or after misinformation). In two experiments (N = 337), we found that while a forewarning did reduce reliance on misinformation, retrospectively warned participants (when the warning was placed either between the misinformation and the retraction or just before testing) relied on the misinformation to a similar degree as unwarned participants. However, the protective effect of the forewarning was not durable, as shown by the fact that reliance on the misinformation increased for over 7 days following the first testing, despite continued memory of the retraction.
Affiliation(s)
- Klara Austeja Buczel: Institute of Psychology, Jagiellonian University, Kraków, Poland; Doctoral School in the Social Sciences, Jagiellonian University, Kraków, Poland
- Adam Siwiak: Institute of Psychology, Jagiellonian University, Kraków, Poland; Doctoral School in the Social Sciences, Jagiellonian University, Kraków, Poland
- Romuald Polczyk: Institute of Psychology, Jagiellonian University, Kraków, Poland
7. Jeffrey S, Ashton L, Ferfolja T, Armour M. Transgender and gender diverse people with endometriosis: A perspective on affirming gynaecological care. Womens Health (Lond) 2024; 20:17455057241251974. PMID: 38742674. PMCID: PMC11095187. DOI: 10.1177/17455057241251974.
Abstract
Transgender and gender diverse people presumed female at birth experience gynaecological conditions such as chronic pelvic pain at elevated rates, estimated to affect between 51% and 72% of this population, compared with rates of up to 26.6% in cisgender women. The negative impact of these conditions is likely amplified by limited access to safe and affirming healthcare. Despite this high prevalence, there is limited research investigating the prevalence, presentation or management options for trans and gender diverse people with endometriosis. Cisgender women with endometriosis report barriers to accessing care, with lengthy times to diagnosis and limited treatment options available; for trans and gender diverse individuals, these barriers are compounded by physician bias and a lack of education in gender-affirming care, reflected in stories of discrimination and denial of basic healthcare. A healthcare environment built on the presumption that gynaecological patients are women others trans and gender diverse patients, which can result in avoidance of needed medical care. A lack of knowledge of gender-affirming care, alongside healthcare provider bias, highlights the need for gender-affirming care and bias-reduction training in undergraduate healthcare provider curricula. Research to date assessing current curricula in Australia and Aotearoa (New Zealand) shows limited inclusion of content on lesbian, gay, bisexual, trans, queer, intersex, asexual and other related identities, with gender-affirming care among the least frequently addressed topics. This review will detail barriers to accessing gender-affirming healthcare specific to gynaecology, interweaving the experiences of a non-binary individual seeking access to gender-affirming endometriosis care.
Affiliation(s)
- Sam Jeffrey: NICM Health Research Institute, Western Sydney University, Penrith, NSW, Australia
- Tania Ferfolja: School of Education, Western Sydney University, Penrith, NSW, Australia
- Mike Armour: NICM Health Research Institute, Western Sydney University, Penrith, NSW, Australia; Translational Health Research Institute, Western Sydney University, Penrith, NSW, Australia
8. Smith R, Chen K, Winner D, Friedhoff S, Wardle C. A systematic review of COVID-19 misinformation interventions: Lessons learned. Health Aff (Millwood) 2023; 42:1738-1746. PMID: 37967291. DOI: 10.1377/hlthaff.2023.00717.
Abstract
Governments, public health authorities, and social media platforms have employed various measures to counter misinformation that emerged during the COVID-19 pandemic. The effectiveness of those misinformation interventions is poorly understood. We analyzed fifty papers published between January 1, 2020, and February 24, 2023, to understand which interventions, if any, were helpful in mitigating COVID-19 misinformation. We found evidence supporting accuracy prompts, debunks, media literacy tips, warning labels, and overlays in mitigating either the spread of or belief in COVID-19 misinformation. However, by mapping the different characteristics of each study, we found levels of variation that weaken the current evidence base. For example, only 18 percent of studies included public health-related measures, such as intent to vaccinate, and the misinformation that interventions were tested against ranged considerably from conspiracy theories (vaccines include microchips) to unproven claims (gargling with saltwater prevents COVID-19). To more clearly discern the impact of various interventions and make evidence actionable for public health, the field urgently needs to include more public health experts in intervention design and to develop a health misinformation typology; agreed-upon outcome measures; and more global, more longitudinal, more video-based, and more platform-diverse studies.
Affiliation(s)
- Rory Smith: Brown University, Providence, Rhode Island
9. Koban D, Abroms LC, Napolitano M, Simmens S, Broniatowski DA. Trust in public health institutions moderates the effectiveness of COVID-19 vaccine discussion groups on Facebook. J Commun Healthc 2023; 16:375-384. PMID: 38095610. DOI: 10.1080/17538068.2023.2283308.
Abstract
Background: Distrust and partisan identity are theorized to undermine health communications. We examined the role of these factors in the efficacy of discussion groups intended to promote vaccine uptake.
Method: We analyzed survey data from unvaccinated Facebook users (N = 371) living in the US between January and April 2022. Participants were randomly assigned to Facebook discussion groups (intervention) or referred to Facebook's COVID-19 Information Center (control). We used analysis of covariance to test whether the intervention was more effective than the control at changing vaccination intentions and beliefs in subgroups based on participants' partisan identity, political views, and information trust views.
Results: We found a significant interaction between the intervention and trust in public health institutions (PHIs) for improving intentions to vaccinate (P = .04), intentions to encourage others to vaccinate (P = .03), and vaccine confidence beliefs (P = .01). Among participants who trusted PHIs, those in the intervention had higher posttest intentions to vaccinate (P = .008) and intentions to encourage others to vaccinate (P = .002) than those in the control. Among non-conservatives, participants in the intervention had higher posttest intentions to vaccinate (P = .048). The intervention was more effective at improving intentions to encourage others to vaccinate within the subgroups of Republicans (P = .03), conservatives (P = .02), and participants who distrusted government (P = .02).
Conclusions: Facebook discussion groups were more effective for people who trusted PHIs and for non-conservatives. Health communicators may need to segment health messaging and develop strategies around trust views.
Affiliation(s)
- Donald Koban: School of Engineering & Applied Science, George Washington University, Washington, DC, USA
- Lorien C Abroms: Milken Institute School of Public Health, George Washington University, Washington, DC, USA; Institute for Data Democracy and Politics, George Washington University, Washington, DC, USA
- Melissa Napolitano: Milken Institute School of Public Health, George Washington University, Washington, DC, USA
- Samuel Simmens: Milken Institute School of Public Health, George Washington University, Washington, DC, USA
- David A Broniatowski: School of Engineering & Applied Science, George Washington University, Washington, DC, USA; Institute for Data Democracy and Politics, George Washington University, Washington, DC, USA
10. Prike T, Ecker UKH. Effective correction of misinformation. Curr Opin Psychol 2023; 54:101712. PMID: 37944323. DOI: 10.1016/j.copsyc.2023.101712.
Abstract
This paper reviews correction effectiveness, highlighting which factors matter, which do not, and where further research is needed. To boost effectiveness, we recommend using detailed corrections and providing an alternative explanation wherever possible. We also recommend providing a reminder of the initial misinformation and repeating the correction. Presenting corrections pre-emptively (i.e., prebunking) or after misinformation exposure is unlikely to greatly impact correction effectiveness. There is also limited risk of repeating misinformation within a correction or that a correction will inadvertently spread misinformation to new audiences. Further research is needed into which correction formats are most effective, whether boosting correction memorability can enhance effectiveness, the effectiveness of discrediting a misinformation source, and whether distrusted correction sources can contribute to corrections backfiring.
Affiliation(s)
- Toby Prike: School of Psychological Science, University of Western Australia, Perth, Australia
- Ullrich K H Ecker: School of Psychological Science, University of Western Australia, Perth, Australia
11. Chan MPS, Albarracín D. A meta-analysis of correction effects in science-relevant misinformation. Nat Hum Behav 2023; 7:1514-1525. PMID: 37322236. DOI: 10.1038/s41562-023-01623-8.
Abstract
Scientifically relevant misinformation, defined as false claims concerning a scientific measurement procedure or scientific evidence, regardless of the author's intent, is illustrated by the fiction that the coronavirus disease 2019 vaccine contained microchips to track citizens. Updating science-relevant misinformation after a correction can be challenging, and little is known about what theoretical factors can influence the correction. Here this meta-analysis examined 205 effect sizes (that is, k, obtained from 74 reports; N = 60,861), which showed that attempts to debunk science-relevant misinformation were, on average, not successful (d = 0.19, P = 0.131, 95% confidence interval -0.06 to 0.43). However, corrections were more successful when the initial science-relevant belief concerned negative topics and domains other than health. Corrections fared better when they were detailed, when recipients were likely familiar with both sides of the issue ahead of the study and when the issue was not politically polarized.
Affiliation(s)
- Man-Pui Sally Chan: Annenberg School for Communication, University of Pennsylvania, Philadelphia, PA, USA
- Dolores Albarracín: Annenberg School for Communication, Annenberg Public Policy Center, School of Arts and Sciences, School of Nursing, Wharton School, University of Pennsylvania, Philadelphia, PA, USA
12. Myneni S, Cuccaro P, Montgomery S, Pakanati V, Tang J, Singh T, Dominguez O, Cohen T, Reininger B, Savas LS, Fernandez ME. Lessons learned from interdisciplinary efforts to combat COVID-19 misinformation: Development of agile integrative methods from behavioral science, data science, and implementation science. JMIR Infodemiology 2023; 3:e40156. PMID: 37113378. PMCID: PMC9987191. DOI: 10.2196/40156.
Abstract
Background: Despite increasing awareness about and advances in addressing social media misinformation, the free flow of false COVID-19 information has continued, affecting individuals' preventive behaviors, including masking, testing, and vaccine uptake.
Objective: In this paper, we describe our multidisciplinary efforts with a specific focus on methods to (1) gather community needs, (2) develop interventions, and (3) conduct large-scale agile and rapid community assessments to examine and combat COVID-19 misinformation.
Methods: We used the Intervention Mapping framework to perform community needs assessment and develop theory-informed interventions. To supplement these rapid and responsive efforts through large-scale online social listening, we developed a novel methodological framework, comprising qualitative inquiry, computational methods, and quantitative network models, to analyze publicly available social media data sets, model content-specific misinformation dynamics, and guide content tailoring efforts. As part of the community needs assessment, we conducted 11 semistructured interviews, 4 listening sessions, and 3 focus groups with community scientists. Further, we used our data repository of 416,927 COVID-19 social media posts to gather information diffusion patterns through digital channels.
Results: Our community needs assessment revealed the complex intertwining of personal, cultural, and social influences of misinformation on individual behaviors and engagement. Our social media interventions resulted in limited community engagement and indicated the need for consumer advocacy and influencer recruitment. Linking theoretical constructs underlying health behaviors to COVID-19-related social media interactions through semantic and syntactic features in our computational models revealed frequent interaction typologies in factual and misleading COVID-19 posts and indicated significant differences in network metrics such as degree. The performance of our deep learning classifiers was reasonable, with an F-measure of 0.80 for speech acts and 0.81 for behavior constructs.
Conclusions: Our study highlights the strengths of community-based field studies and emphasizes the utility of large-scale social media data sets in enabling rapid intervention tailoring to adapt grassroots community interventions to thwart misinformation seeding and spread among minority communities. Implications for consumer advocacy, data governance, and industry incentives are discussed for the sustainable role of social media solutions in public health.
Collapse
Affiliation(s)
- Sahiti Myneni
- School of Biomedical Informatics, The University of Texas Health Science Center, Houston, TX, United States
- Paula Cuccaro
- Department of Health Promotion & Behavioral Sciences, School of Public Health, The University of Texas Health Science Center, Houston, TX, United States
- Center for Health Promotion and Prevention Research, School of Public Health, The University of Texas Health Science Center, Houston, TX, United States
- Sarah Montgomery
- Department of Health Promotion & Behavioral Sciences, School of Public Health, The University of Texas Health Science Center, Houston, TX, United States
- Center for Health Promotion and Prevention Research, School of Public Health, The University of Texas Health Science Center, Houston, TX, United States
- Vivek Pakanati
- The University of Texas Health Science Center, Tyler, TX, United States
- Jinni Tang
- Department of Health Promotion & Behavioral Sciences, School of Public Health, The University of Texas Health Science Center, Houston, TX, United States
- Center for Health Promotion and Prevention Research, School of Public Health, The University of Texas Health Science Center, Houston, TX, United States
- Tavleen Singh
- School of Biomedical Informatics, The University of Texas Health Science Center, Houston, TX, United States
- Olivia Dominguez
- Department of Health Promotion & Behavioral Sciences, School of Public Health, The University of Texas Health Science Center, Houston, TX, United States
- Center for Health Promotion and Prevention Research, School of Public Health, The University of Texas Health Science Center, Houston, TX, United States
- Trevor Cohen
- Department of Biomedical Informatics and Medical Education, The University of Washington, Seattle, WA, United States
- Belinda Reininger
- School of Public Health, Brownsville Regional Campus, The University of Texas Health Science Center, Brownsville, TX, United States
- Lara S Savas
- Department of Health Promotion & Behavioral Sciences, School of Public Health, The University of Texas Health Science Center, Houston, TX, United States
- Center for Health Promotion and Prevention Research, School of Public Health, The University of Texas Health Science Center, Houston, TX, United States
- Maria E Fernandez
- Department of Health Promotion & Behavioral Sciences, School of Public Health, The University of Texas Health Science Center, Houston, TX, United States
- Center for Health Promotion and Prevention Research, School of Public Health, The University of Texas Health Science Center, Houston, TX, United States
13
Greene CM, de Saint Laurent C, Murphy G, Prike T, Hegarty K, Ecker UKH. Best Practices for Ethical Conduct of Misinformation Research. EUROPEAN PSYCHOLOGIST 2022. [DOI: 10.1027/1016-9040/a000491] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/15/2022]
Abstract
Misinformation can have noxious impacts on cognition, fostering the formation of false beliefs, retroactively distorting memory for events, and influencing reasoning and decision-making even after it has been credibly corrected. Researchers investigating the impacts of real-world misinformation therefore face an ethical issue: they must consider the immediate and long-term consequences of exposing participants to false claims. In this paper, we first present an overview of the ethical risks associated with real-world misinformation. We then report results from a scoping review of ethical practices in misinformation research. We investigated (1) the extent to which researchers report the details of their ethical practices, including issues of informed consent and debriefing, and (2) the specific steps that researchers report taking to protect participants from the consequences of misinformation exposure. We found that fewer than 30% of misinformation papers report any debriefing, and almost no authors assessed the effectiveness of their debriefing procedure. Building on the findings from this review, we evaluate the balance of risk versus reward currently operating in this field and propose a set of guidelines for best practices. Our ultimate goal is to allow researchers the freedom to investigate questions of considerable scientific and societal impact while meeting their ethical obligations to participants.
Affiliation(s)
- Gillian Murphy
- School of Applied Psychology, University College Cork, Ireland
- Toby Prike
- School of Psychological Science, University of Western Australia, Perth, Australia
- Karen Hegarty
- School of Psychology, University College Dublin, Ireland
- Ullrich K. H. Ecker
- School of Psychological Science, University of Western Australia, Perth, Australia
14
Schmid P, Betsch C. Benefits and Pitfalls of Debunking Interventions to Counter mRNA Vaccination Misinformation During the COVID-19 Pandemic. SCIENCE COMMUNICATION 2022; 44:531-558. [PMID: 38603361 PMCID: PMC9574536 DOI: 10.1177/10755470221129608] [Citation(s) in RCA: 4] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/28/2023]
Abstract
Misinformation about mRNA vaccination is a barrier in the global fight against the COVID-19 pandemic. Authorities therefore often rely on text-based refutations as a countermeasure. In two experiments (N = 2,444), text-based refutations effectively reduced belief in misinformation and immunized participants against the impact of a misleading social media post. However, a follow-up study (N = 817) calls the longevity of these debunking and prebunking effects into question. Moreover, the studies reveal potential pitfalls in the form of several unintended effects of the refutations: no effect on intentions, backfire effects among religious groups, and biased judgments when information about vaccine side effects was omitted.
Affiliation(s)
- Philipp Schmid
- University of Erfurt, Germany
- Bernhard-Nocht-Institute for Tropical Medicine, Hamburg, Germany
- Cornelia Betsch
- University of Erfurt, Germany
- Bernhard-Nocht-Institute for Tropical Medicine, Hamburg, Germany