1. Bliuc AM, Betts JM, Vergani M, Bouguettaya A, Cristea M. A theoretical framework for polarization as the gradual fragmentation of a divided society. Communications Psychology 2024; 2:75. PMID: 39242900; PMCID: PMC11327288; DOI: 10.1038/s44271-024-00125-1.
Abstract
We propose a framework integrating insights from computational social science and from political and social psychology to explain how extreme polarization can occur in deeply divided societies. Extreme polarization emerges through a dynamic and complex process in which societal, group, and individual factors interact. Dissent at different levels of analysis drives this process: societal-level ideological dissent divides society into opposing camps, each with contrasting collective narratives. Within these opposing camps, further dissent leads to the formation of splinter factions and radical cells, sub-groups with increasingly extreme views. At the group level, the collective narratives underpinning group identity become more extreme as society fragments. At the individual level, the process involves the internalization of an extreme group narrative and of norms sanctioning radical behavior. The intense bonding within these groups and the convergence of personal and group identities through identity fusion increase the likelihood of radical group behavior.
Affiliation(s)
- John M Betts: Department of Data Science & AI, Monash University, Melbourne, VIC, Australia
- Matteo Vergani: School of Humanities & Social Science, Deakin University, Melbourne, VIC, Australia
- Mioara Cristea: Department of Psychology, Heriot-Watt University, Edinburgh, UK
2. Smith LGE, Thomas EF, Bliuc AM, McGarty C. Polarization is the psychological foundation of collective engagement. Communications Psychology 2024; 2:41. PMID: 39242857; PMCID: PMC11332107; DOI: 10.1038/s44271-024-00089-2.
Abstract
The term polarization describes both the division of a society into opposing groups (political polarization) and a social psychological phenomenon (group polarization) whereby people adopt more extreme positions after discussion. We explain how group polarization underpins political polarization: social interaction, for example through social media, enables groups to form in such a way that their beliefs about what should be done to change the world, and about how this differs from the stance of other groups, become integrated as aspects of a new, shared social identity. This provides a basis for mobilization to collective action.
3. Hong CS. Fake news virality: Relational niches and the diffusion of COVID-19 vaccine misinformation. Social Science Research 2024; 120:103004. PMID: 38763539; DOI: 10.1016/j.ssresearch.2024.103004.
Abstract
This study explores why some fake news publishers are able to propagate misinformation while others receive little attention on social media. Using COVID-19 vaccine tweets as a case study, it combines the relational niche framework with pooled and multilevel models that address unobserved heterogeneity. As expected, ties to accounts with more followers were associated with more fake news tweets, retweets, and likes. More surprisingly, embeddedness with fake news publishers had an inverted U-shaped association with diffusion, whereas social proximity to mainstream media was positively associated with it. Although the effect of influential users is in line with opinion leader theory, the newly identified effects of social proximity to reliable sources and of embeddedness suggest that the keys to fake news virality are greater organizational status and moderate, rather than extreme, echo chambers. This study highlights the potential of dynamic media networks to shape the misinformation market.
Affiliation(s)
- Chen-Shuo Hong: Department of Sociology, University of Massachusetts, 200 Hicks Way, 738 Thompson Hall, Amherst, MA 01003, USA
4. Tschofenig F, Reisinger D, Jäger G, Kogler ML, Adam R, Füllsack M. Stochastic modeling of cascade dynamics: A unified approach for simple and complex contagions across homogeneous and heterogeneous threshold distributions on networks. Physical Review E 2024; 109:044307. PMID: 38755926; DOI: 10.1103/physreve.109.044307.
Abstract
The COVID-19 pandemic has underscored the importance of understanding, forecasting, and avoiding infectious processes, as well as the necessity of understanding the diffusion and acceptance of preventative measures. Simple contagions, like virus transmission, can spread with a single encounter, while complex contagions, such as preventive social measures (e.g., wearing masks, social distancing), may require multiple interactions to propagate. This disparity in transmission mechanisms results in differing contagion rates and patterns between viruses and preventive measures. Furthermore, the dynamics of complex contagions are significantly less understood than those of simple contagions. Stochastic models, integrating inherent variability and randomness, offer a way to elucidate complex contagion dynamics. This paper introduces a stochastic model for both simple and complex contagions and assesses its efficacy against ensemble simulations for homogeneous and heterogeneous threshold configurations. The model provides a unified framework for analyzing both types of contagions, demonstrating promising outcomes across various threshold setups on Erdős-Rényi graphs.
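The distinction the abstract draws, between a simple contagion that can spread from a single contact and a complex contagion requiring a critical fraction of active neighbors, can be illustrated with a deterministic fractional-threshold cascade on an Erdős-Rényi graph. This is a minimal sketch, not the authors' stochastic model; the graph generator, function names, and parameter values are illustrative assumptions.

```python
import random

def erdos_renyi(n, p, rng):
    """Generate an undirected Erdős-Rényi G(n, p) graph as an adjacency list."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def threshold_cascade(adj, seeds, threshold):
    """Deterministic complex-contagion cascade: a node activates once the
    fraction of its active neighbours reaches `threshold`. A threshold near
    zero approximates a simple contagion, where one contact suffices."""
    active = set(seeds)
    changed = True
    while changed:
        changed = False
        for node, nbrs in adj.items():
            if node in active or not nbrs:
                continue
            if sum(1 for nb in nbrs if nb in active) / len(nbrs) >= threshold:
                active.add(node)
                changed = True
    return active

rng = random.Random(42)
g = erdos_renyi(200, 0.05, rng)                           # mean degree ~10
simple = threshold_cascade(g, seeds={0, 1, 2}, threshold=0.01)
complex_ = threshold_cascade(g, seeds={0, 1, 2}, threshold=0.4)
```

With these assumed parameters the low-threshold cascade typically saturates the giant component while the 0.4-threshold cascade stalls near its seeds, mirroring the differing contagion patterns described above; because activation is monotone, lowering the threshold can only enlarge the final active set.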
Affiliation(s)
- Fabian Tschofenig, Daniel Reisinger, Georg Jäger, Marie Lisa Kogler, Raven Adam, and Manfred Füllsack: Department of Environmental Systems Sciences, University of Graz, Graz, Styria, Austria
5. Lin MR, Guo X, Azizi A, Fewell JH, Milner F. Mechanistic modeling of alarm signaling in seed-harvester ants. Mathematical Biosciences and Engineering 2024; 21:5536-5555. PMID: 38872547; DOI: 10.3934/mbe.2024244.
Abstract
Ant colonies demonstrate a finely tuned alarm response to potential threats, offering a uniquely manageable empirical setting for exploring adaptive information diffusion within groups. To address potential dangers effectively, a social group must swiftly communicate the threat throughout the collective while conserving energy in case the threat is unfounded. Through a combination of modeling, simulation, and empirical observations of alarm spread and damping patterns, we identified the behavioral rules governing this adaptive response. Experimental trials in which alarmed ant workers (Pogonomyrmex californicus) were released into a tranquil group of nestmates revealed a consistent pattern of rapid alarm propagation followed by a comparatively extended decay period [1]. The experiments in [1] showed that individual ants exhibiting alarm behavior increased their movement speed, with variation in response to alarm stimuli, particularly during the peak of the reaction. We used the data in [1] to investigate whether these observed characteristics alone could account for the swift increase in mobility and the gradual decay of alarm excitement. Our self-propelled particle model incorporated a switch-like mechanism for ants' response to alarm signals and individual variation in the intensity of the speed increase after encountering these signals. This study aligns with the established hypothesis that individual ants possess cognitive abilities to process and disseminate information, contributing to collective cognition within the colony (see [2] and the references therein). The elements examined in this research support this hypothesis by reproducing statistical features of the empirical speed distribution across various parameter values.
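The two ingredients the abstract highlights, a switch-like alarm response and individual variation in the speed increase, can be sketched as a toy self-propelled-particle simulation. This is an assumed illustration, not the authors' calibrated model; every function name and parameter value below is invented for the example.

```python
import math
import random

def simulate_alarm(n=100, steps=300, arena=20.0, radius=0.5,
                   base_speed=0.1, alarm_gain=3.0, decay=0.02,
                   n_alarmed=5, seed=1):
    """Toy self-propelled-particle sketch of alarm spread: alarmed ants move
    faster; contact within `radius` switches a calm ant to alarmed
    (switch-like response); alarm is damped stochastically over time."""
    rng = random.Random(seed)
    pos = [(rng.uniform(0, arena), rng.uniform(0, arena)) for _ in range(n)]
    # individual variation in how strongly each ant speeds up when alarmed
    gain = [rng.uniform(0.5, 1.5) * alarm_gain for _ in range(n)]
    alarmed = [i < n_alarmed for i in range(n)]
    mean_speed = []
    for _ in range(steps):
        speeds = [base_speed * (1 + gain[i]) if alarmed[i] else base_speed
                  for i in range(n)]
        mean_speed.append(sum(speeds) / n)
        # move with random headings, wrapping at the arena edges
        for i in range(n):
            a = rng.uniform(0, 2 * math.pi)
            x, y = pos[i]
            pos[i] = ((x + speeds[i] * math.cos(a)) % arena,
                      (y + speeds[i] * math.sin(a)) % arena)
        # stochastic damping of alarm, then contact transmission
        for i in range(n):
            if alarmed[i] and rng.random() < decay:
                alarmed[i] = False
        for i in range(n):
            if alarmed[i]:
                continue
            for j in range(n):
                if alarmed[j] and math.dist(pos[i], pos[j]) < radius:
                    alarmed[i] = True
                    break
    return mean_speed

trace = simulate_alarm()
```

Tracking the mean speed over time reproduces the qualitative shape described above: a rapid rise as alarm propagates through contacts, followed by a slower relaxation set by the damping rate.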
Affiliation(s)
- Michael R Lin: Simon A. Levin Mathematical, Computational and Modeling Sciences Center, Arizona State University, Tempe, AZ 85281, USA
- Xiaohui Guo: Department of Physics of Complex Systems, Weizmann Institute of Science, Rehovot 7632706, Israel
- Asma Azizi: Department of Mathematics, Kennesaw State University, Marietta, GA 30062, USA
- Fabio Milner: Simon A. Levin Mathematical, Computational and Modeling Sciences Center, Arizona State University, Tempe, AZ 85281, USA; School of Mathematical and Statistical Sciences, Arizona State University, Tempe, AZ 85287, USA
6. Goebel JT, Susmann MW, Parthasarathy S, El Gamal H, Garrett RK, Wegener DT. Belief-consistent information is most shared despite being the least surprising. Scientific Reports 2024; 14:6109. PMID: 38480773; PMCID: PMC10937659; DOI: 10.1038/s41598-024-56086-2.
Abstract
In the classical information theoretic framework, information "value" is proportional to how novel/surprising the information is. Recent work building on such notions claimed that false news spreads faster than truth online because false news is more novel and therefore surprising. However, another determinant of surprise, semantic meaning (e.g., information's consistency or inconsistency with prior beliefs), should also influence value and sharing. Examining sharing behavior on Twitter, we observed separate relations of novelty and belief consistency with sharing. Though surprise could not be assessed in those studies, belief consistency should relate to less surprise, suggesting the relevance of semantic meaning beyond novelty. In two controlled experiments, belief-consistent (vs. belief-inconsistent) information was shared more despite consistent information being the least surprising. Manipulated novelty did not predict sharing or surprise. Thus, classical information theoretic predictions regarding perceived value and sharing would benefit from considering semantic meaning in contexts where people hold pre-existing beliefs.
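The classical information-theoretic notion of "value" invoked here is Shannon's surprisal, the negative log-probability of a message: the less expected the content, the more bits it carries. A minimal illustration, with assumed probabilities:

```python
import math

def surprisal_bits(p: float) -> float:
    """Shannon information content of an event with probability p, in bits."""
    return -math.log2(p)

# A belief-consistent claim a reader already finds likely carries little
# information; an unexpected claim carries much more:
print(surprisal_bits(0.9))   # expected claim: low surprisal
print(surprisal_bits(0.05))  # surprising claim: high surprisal
```

On this account, sharing should track surprisal; that is exactly the prediction the experiments above complicate, since the low-surprisal, belief-consistent items were shared most.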
Affiliation(s)
- Jacob T Goebel: Department of Psychology, Ohio State University, Columbus, OH, USA
- Mark W Susmann: Department of Psychology, Ohio State University, Columbus, OH, USA; Department of Psychology, Vanderbilt University, Nashville, TN, USA; Department of Computer Science and Engineering, Ohio State University, Columbus, OH, USA
- Hesham El Gamal: Faculty of Engineering, University of Sydney, Sydney, NSW, Australia
- R Kelly Garrett: School of Communication, Ohio State University, Columbus, OH, USA
- Duane T Wegener: Department of Psychology, Ohio State University, Columbus, OH, USA
7. Hahn U, Merdes C, von Sydow M. Knowledge through social networks: Accuracy, error, and polarisation. PLoS One 2024; 19:e0294815. PMID: 38170696; PMCID: PMC10763946; DOI: 10.1371/journal.pone.0294815.
Abstract
This paper examines the fundamental problem of testimony. Much of what we take ourselves to know, we know in good part, or even entirely, through the testimony of others. The problem with testimony is that we often have very little on which to base estimates of the accuracy of our sources. Simulations with otherwise optimal agents examine the impact of this on the accuracy of our beliefs about the world. It is demonstrated both where social networks of information dissemination help and where they hinder. Most importantly, it is shown that both social networks and a common strategy for gauging the accuracy of our sources give rise to polarisation even for entirely accuracy-motivated agents. Crucially, these two factors interact, amplifying one another's negative consequences, and this side effect of communication in a social network increases with network size. This suggests a new causal mechanism by which social media may have fostered the increase in polarisation currently observed in many parts of the world.
Affiliation(s)
- Ulrike Hahn: Department of Psychological Sciences, Birkbeck, University of London, London, United Kingdom; MCMP, Ludwig-Maximilians-Universität, Munich, Germany
- Christoph Merdes: MCMP, Ludwig-Maximilians-Universität, Munich, Germany; Interdisciplinary Centre for Ethics, Jagiellonian University, Cracow, Poland
8. Debnath R, Ebanks D, Mohaddes K, Roulet T, Alvarez RM. Do fossil fuel firms reframe online climate and sustainability communication? A data-driven analysis. npj Climate Action 2023; 2:47. PMID: 38694952; PMCID: PMC11062293; DOI: 10.1038/s44168-023-00086-x.
Abstract
Identifying drivers of climate misinformation on social media is crucial to climate action. Misinformation comes in various forms; however, subtler strategies, such as emphasizing favorable interpretations of events or data or reframing conversations to fit preferred narratives, have received little attention. This data-driven paper examines online climate and sustainability communication behavior over 7 years (2014-2021) across three influential stakeholder groups consisting of eight fossil fuel firms (industry), 14 non-governmental organizations (NGOs), and eight inter-governmental organizations (IGOs). We examine historical Twitter interaction data (n = 668,826) using machine learning-driven joint-sentiment topic modeling and vector autoregression to measure online interactions and influences amongst these groups. We report three key findings. First, we find that the stakeholders in our sample are responsive to one another online, especially over topics in their respective areas of domain expertise. Second, the industry is more likely to respond to IGOs' and NGOs' online messaging changes, especially regarding environmental justice and climate action topics. The fossil fuel industry is more likely to discuss public relations, advertising, and corporate sustainability topics. Third, we find that climate change-driven extreme weather events and stock market performance do not significantly affect the patterns of communication among these firms and organizations. In conclusion, we provide a data-driven foundation for understanding the influence of powerful stakeholder groups on shaping the online climate and sustainability information ecosystem around climate change.
Affiliation(s)
- Ramit Debnath: University of Cambridge, Cambridge CB2 1TN, UK; California Institute of Technology, Pasadena, CA 91125, USA
- Danny Ebanks: California Institute of Technology, Pasadena, CA 91125, USA; Harvard University, Cambridge, MA 02138, USA
9. Yantseva V, Vega D, Magnani M. Immigrant-critical alternative media in online conversations. PLoS One 2023; 18:e0294636. PMID: 38033035; PMCID: PMC10688883; DOI: 10.1371/journal.pone.0294636.
Abstract
In this work, we explore the role of immigrant-critical alternative media in shaping collective emotions and users' evaluations of the immigration issue, using a conversational approach and the empirical case of Flashback, a prominent Swedish online platform hosting many immigration-related discussions. Our text- and network-based analysis of more than 9,000 conversations during the last election period reveals that the platform's users consume and distribute diverging types of media content from across a wide ideological spectrum; this content, however, has a limited influence on the evolution of conversations and on users' stances in the immigration debate. Nevertheless, we find that conversation networks containing alternative media content tend to include more negative evaluations of the immigration issue, attracting fewer participants and lasting for a shorter time than other conversations. We contextualise our findings using Collins' Interaction Ritual Chains (IRC) theory and discuss the conditions under which such online conversations can produce high user involvement and, potentially, participants' radicalisation.
Affiliation(s)
- Victoria Yantseva, Davide Vega, and Matteo Magnani: InfoLab, Department of Information Technology, Uppsala University, Uppsala, Sweden
10. Adams Z, Osman M, Bechlivanidis C, Meder B. (Why) Is Misinformation a Problem? Perspectives on Psychological Science 2023; 18:1436-1463. PMID: 36795592; PMCID: PMC10623619; DOI: 10.1177/17456916221141344.
Abstract
In the last decade there has been a proliferation of research on misinformation. One important aspect of this work that receives less attention than it should is exactly why misinformation is a problem. To adequately address this question, we must first look to its speculated causes and effects. We examined the different disciplines (computer science, economics, history, information science, journalism, law, media, politics, philosophy, psychology, sociology) that investigate misinformation. The consensus view points to advancements in information technology (e.g., the Internet, social media) as a main cause of the proliferation and increasing impact of misinformation, with a variety of illustrations of the effects. We critically analyzed both issues. As to the effects, harmful behaviors have not yet been reliably demonstrated empirically to be an outcome of misinformation; mistaking correlation for causation may contribute to that perception. As to the cause, advancements in information technologies enable, as well as reveal, multitudes of interactions that represent significant deviations from ground truths through people's new way of knowing (intersubjectivity). This, we argue, is illusory when understood in light of historical epistemology. We use both of these doubts to consider the costs to established norms of liberal democracy that come from efforts to target the problem of misinformation.
Affiliation(s)
- Zoë Adams: Department of Linguistics, School of Languages, Linguistics and Film, Queen Mary University of London, UK
- Magda Osman: Centre for Science and Policy, University of Cambridge; Judge Business School, University of Cambridge; Leeds Business School, University of Leeds
- Björn Meder: Department of Psychology, Health and Medical University, Potsdam, Germany; Max Planck Research Group iSearch, Max Planck Institute for Human Development, Berlin, Germany
11. Lockmiller C. Decoding the Misinformation-Legislation Pipeline: an analysis of Florida Medicaid and the current state of transgender healthcare. Journal of the Medical Library Association 2023; 111:750-761. PMID: 37928129; PMCID: PMC10621716; DOI: 10.5195/jmla.2023.1724.
Abstract
Background The state of evidence-based transgender healthcare in the United States has been put at risk by the spread of misinformation harmful to transgender people. Health science librarians can help curb the spread of misinformation by identifying and analyzing its flow through systems that affect access to healthcare. Discussion The author developed the theory of the Misinformation-Legislation Pipeline by studying the flow of anti-transgender misinformation from online echo chambers through a peer-reviewed article and into policy enacted to ban medical treatments for transgender people in the state of Florida. The analysis is preceded by a literature review of currently accepted best practices in transgender healthcare, after which the author analyzes the key report leveraged by Florida's Department of Health in its ban. A critical analysis of the report is followed by a secondary analysis of the key peer-reviewed article on which the Florida Medicaid authors relied to make the decision. The paper concludes with a summation of the trajectory of anti-transgender misinformation. Conclusion Misinformation plays a key role in producing legislation harmful to transgender people. Health science librarians have a role to play in identifying misinformation as it flows through the Misinformation-Legislation Pipeline and in enacting key practices to identify, analyze, and oppose the spread of harmful misinformation.
Affiliation(s)
- Catherine Lockmiller: Health Science Librarian, Cline Library, Northern Arizona University, Flagstaff, AZ, USA
12. Bizzotto N, Schulz PJ, de Bruijn GJ. The "Loci" of Misinformation and Its Correction in Peer- and Expert-Led Online Communities for Mental Health: Content Analysis. Journal of Medical Internet Research 2023; 25:e44656. PMID: 37721800; PMCID: PMC10546261; DOI: 10.2196/44656.
Abstract
BACKGROUND Mental health problems are recognized as a pressing public health issue, and an increasing number of individuals are turning to online communities for mental health to search for information and support. Although these virtual platforms have the potential to provide emotional support and access to anecdotal experiences, they can also expose users to large amounts of potentially inaccurate information. Despite the importance of this issue, limited research has been conducted, especially on the differences that might emerge due to the type of content moderation of online communities: peer-led or expert-led. OBJECTIVE We aim to fill this gap by examining the prevalence, the communicative context, and the persistence of mental health misinformation in Facebook online communities for mental health, with a focus on understanding the mechanisms that enable effective correction of inaccurate information and the differences between expert-led and peer-led groups. METHODS We conducted a content analysis of 1534 statements (from 144 threads) in 2 Italian-speaking Facebook groups. RESULTS The study found that an alarming proportion of comments (26.1%) contained medically inaccurate information. Furthermore, nearly 60% of the threads presented at least one misinformation statement without any correction attempt. Moderators were more likely than members to correct misinformation; however, they were not immune to posting content containing misinformation, which was an unexpected finding. Discussions about aspects of treatment (including side effects or treatment interruption) significantly increased the probability of encountering misinformation. Additionally, misinformation produced in the comments of a thread, rather than in the first post, had a lower probability of being corrected, particularly in peer-led communities.
CONCLUSIONS The high prevalence of misinformation in online communities, particularly when left uncorrected, underscores the importance of further research to identify effective mechanisms for preventing its spread. This is especially important given the study's finding that misinformation tends to cluster around specific "loci" of discussion that, once identified, can serve as starting points for developing strategies to prevent and correct misinformation.
Affiliation(s)
- Nicole Bizzotto: Faculty of Communication, Culture and Society, Università della Svizzera italiana, Lugano, Switzerland
- Peter Johannes Schulz: Faculty of Communication, Culture and Society, Università della Svizzera italiana, Lugano, Switzerland; Department of Communication and Media, Ewha Womans University, Seoul, Republic of Korea; Wee Kim Wee School of Communication & Information and LKC School of Medicine, Nanyang Technological University, Singapore
- Gert-Jan de Bruijn: Department of Communication Studies, University of Antwerp, Antwerp, Belgium
13. Pal R, Kumar A, Santhanam MS. Depolarization of opinions on social networks through random nudges. Physical Review E 2023; 108:034307. PMID: 37849173; DOI: 10.1103/physreve.108.034307.
Abstract
Polarization of opinions has been observed empirically on many online social network platforms. Traditional models of opinion dynamics, based on statistical physics principles, do not account for the emergence of polarization and echo chambers on online platforms. A recently introduced opinion dynamics model that incorporates homophily, the tendency of agents to connect with those holding opinions similar to their own, captures polarization and echo chamber effects. In this work, we provide a nonintrusive framework for mildly nudging agents in an online community to form random connections. This is shown to lead to significant depolarization of opinions and to decrease echo chamber effects. Although a mild nudge effectively avoids polarization, overdoing it leads to another undesirable effect, namely radicalization. Further, we obtain the optimal nudge probability that avoids both extreme outcomes, polarization and radicalization.
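The qualitative effect of random-connection nudges can be sketched with a bounded-confidence (Deffuant-style) toy rather than the homophily-network model the paper actually studies: agents normally interact only with like-minded partners, and a nudge occasionally forces a random pair to interact regardless. The parameter values and the `nudge_prob` mechanism below are illustrative assumptions.

```python
import random

def deffuant_with_nudges(n=200, steps=40000, eps=0.2, mu=0.5,
                         nudge_prob=0.05, seed=7):
    """Bounded-confidence toy: a random pair averages opinions only when
    within `eps` of each other (homophily); with probability `nudge_prob`
    the pair interacts unconditionally, mimicking a random-connection nudge."""
    rng = random.Random(seed)
    x = [rng.uniform(-1, 1) for _ in range(n)]
    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(n)
        if i == j:
            continue
        if nudge_prob and rng.random() < nudge_prob:
            within = True  # nudged pair interacts regardless of opinion gap
        else:
            within = abs(x[i] - x[j]) < eps
        if within:
            d = mu * (x[j] - x[i])
            x[i], x[j] = x[i] + d, x[j] - d
    return x

def spread(x):
    """Opinion dispersion (standard deviation), a crude polarization proxy."""
    m = sum(x) / len(x)
    return (sum((v - m) ** 2 for v in x) / len(x)) ** 0.5

polarized = deffuant_with_nudges(nudge_prob=0.0)
nudged = deffuant_with_nudges(nudge_prob=0.05)
```

Without nudges the population freezes into separated opinion clusters; the occasional forced interactions let clusters mix and the dispersion shrinks. This toy does not reproduce the radicalization regime the paper reports for excessive nudging, which depends on the homophily dynamics of the original model.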
Affiliation(s)
- Ritam Pal, Aanjaneya Kumar, and M S Santhanam: Department of Physics, Indian Institute of Science Education and Research, Dr. Homi Bhabha Road, Pune 411008, India
14. Sundelson AE, Jamison AM, Huhn N, Pasquino SL, Sell TK. Fighting the infodemic: the 4 i Framework for Advancing Communication and Trust. BMC Public Health 2023; 23:1662. PMID: 37644563; PMCID: PMC10466697; DOI: 10.1186/s12889-023-16612-9.
Abstract
BACKGROUND The proliferation of false and misleading health claims poses a major threat to public health. This ongoing "infodemic" has prompted numerous organizations to develop tools and approaches for managing the spread of falsehoods and communicating more effectively in an environment of mistrust and misleading information. However, these tools and approaches have not been systematically characterized, limiting their utility. This analysis characterizes the current ecosystem of infodemic management strategies, allowing public health practitioners, communicators, researchers, and policy makers to understand the tools at their disposal. METHODS A multi-pronged search strategy was used to identify tools and approaches for combating health-related misinformation and disinformation. The strategy included a scoping review of academic literature; a review of gray literature from organizations involved in public health communications and misinformation/disinformation management; and a review of policies and infodemic management approaches from all U.S. state health departments and select local health departments. A team of annotators labeled the main feature(s) of each tool or approach using an iteratively developed list of tags. RESULTS We identified over 350 infodemic management tools and approaches. We introduce the 4 i Framework for Advancing Communication and Trust (4 i FACT), a modified social-ecological model, to characterize four levels of infodemic intervention: informational, individual, interpersonal, and institutional. Information-level strategies included those designed to amplify factual information, fill information voids, debunk false information, track circulating information, and verify, detect, or rate the credibility of information. Individual-level strategies included those designed to enhance information literacy and prebunking/inoculation tools. Strategies at the interpersonal/community level included resources for public health communicators and community engagement approaches. Institutional and structural approaches included resources for journalists and fact-checkers, tools for managing academic/scientific literature, resources for infodemic researchers and research, resources for infodemic managers, social media regulation, and policy/legislation. CONCLUSIONS The 4 i FACT provides a useful way to characterize the current ecosystem of infodemic management strategies. Given the complex and multifaceted nature of the ongoing infodemic, efforts should be made to utilize and integrate strategies across all four levels of the modified social-ecological model.
Affiliation(s)
- Anne E Sundelson: Johns Hopkins Center for Health Security, 700 E. Pratt Street, Suite 900, Baltimore, MD 21202, USA; Department of Environmental Health and Engineering, Johns Hopkins Bloomberg School of Public Health, 615 N. Wolfe Street, Room E7527, Baltimore, MD 21205, USA
- Amelia M Jamison: Department of Health, Behavior and Society, Johns Hopkins Bloomberg School of Public Health, 615 N. Wolfe Street, Baltimore, MD 21205, USA
- Noelle Huhn: Johns Hopkins Center for Health Security, 700 E. Pratt Street, Suite 900, Baltimore, MD 21202, USA; Department of Environmental Health and Engineering, Johns Hopkins Bloomberg School of Public Health, 615 N. Wolfe Street, Room E7527, Baltimore, MD 21205, USA
- Sarah-Louise Pasquino: Department of International Health, Johns Hopkins Bloomberg School of Public Health, 615 N. Wolfe Street, Baltimore, MD 21205, USA
- Tara Kirk Sell: Johns Hopkins Center for Health Security, 700 E. Pratt Street, Suite 900, Baltimore, MD 21202, USA; Department of Environmental Health and Engineering, Johns Hopkins Bloomberg School of Public Health, 615 N. Wolfe Street, Room E7527, Baltimore, MD 21205, USA
15
|
Nus BM, Wu K, Sledge T, Torres G, Kamma S, Janumpally S, Gilani S, Lick S. The Quality of Coronary Artery Bypass Grafting Videos on YouTube. Cureus 2023; 15:e44281. [PMID: 37645663 PMCID: PMC10462417 DOI: 10.7759/cureus.44281] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Accepted: 08/27/2023] [Indexed: 08/31/2023] Open
Abstract
Objective YouTube (YouTube LLC, San Bruno, California, United States), one of the most accessed sites on the internet, has become a widespread source of healthcare information for patients. Videos about coronary artery bypass grafts (CABG) have accrued tens of millions of views on the platform, yet their educational quality is unknown. This study investigates the educational landscape of videos regarding CABG procedures on YouTube. Methods YouTube was queried for "Coronary Artery Bypass Graft Surgery" and "Coronary Artery Bypass Graft Procedure". After applying exclusion criteria, 73 videos were assessed. Two independent reviewers rated the material with the Global Quality Scale (GQS) (5 = high quality, 0 = low quality) to judge educational value. A ratio of view count to days since upload was applied to assess video popularity. Source, modality, and date of upload were recorded for each video as well. Results An average GQS score of 2.94 was found, indicating poor educational quality of the 73 YouTube videos on CABG procedures. Videos uploaded by physicians (56/73; 76.7%) had a significantly higher average GQS score than those uploaded by non-physicians (p<0.001). When content was grouped by delivery method, physician-led presentations (24/73 or 32.9%) produced the highest average GQS score of 3.35; conversely, patient-friendly delivery methods (18/73 or 24.7%) yielded the lowest average GQS score of 2.36 (p<0.001). Neither the view ratio nor the days since upload significantly correlated with the educational quality of the video. Conclusion Although CABG videos are readily available on YouTube, they often contain considerable biases and misleading information. With online sources for healthcare education now commonplace, physicians must be aware of the vast quantities of low-quality videos patients often encounter when weighing different treatment options. 
Further analysis of CABG videos on YouTube may allow physicians to ameliorate this gap by producing videos that are not only high quality but highly viewed on the platform.
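The popularity measure described in the abstract (view count divided by days since upload) is simple arithmetic; as an illustrative sketch (function name and example figures are invented, not from the study), it could be computed as:

```python
from datetime import date

def view_ratio(view_count: int, upload_date: date, today: date) -> float:
    """Popularity proxy: average views per day since upload."""
    days_online = max((today - upload_date).days, 1)  # guard against same-day uploads
    return view_count / days_online

# Hypothetical video: 50,000 views, online for 10 days.
print(view_ratio(50_000, date(2023, 1, 1), date(2023, 1, 11)))  # 5000.0
```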
Affiliation(s)
- Bradley M Nus: Cardiology, University of Texas Medical Branch at Galveston, Galveston, USA
- Kylie Wu: Cardiology, Texas College of Osteopathic Medicine, Fort Worth, USA
- Trey Sledge: Cardiology, University of Texas Medical Branch at Galveston, Galveston, USA
- Grant Torres: Cardiology, University of Texas Medical Branch at Galveston, Galveston, USA
- Sai Kamma: Cardiology, University of Texas Medical Branch at Galveston, Galveston, USA
- Syed Gilani: Cardiology, University of Texas Medical Branch at Galveston, Galveston, USA
- Scott Lick: Cardiothoracic Surgery, University of Texas Medical Branch at Galveston, Galveston, USA
|
16
|
Gao Y, Liu F, Gao L. Echo chamber effects on short video platforms. Sci Rep 2023; 13:6282. [PMID: 37072484 PMCID: PMC10111082 DOI: 10.1038/s41598-023-33370-1] [Citation(s) in RCA: 5] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/14/2022] [Accepted: 04/12/2023] [Indexed: 05/03/2023] Open
Abstract
In recent years, short videos have become an increasingly vital source of information. To compete for users' attention, short video platforms have relied heavily on algorithmic recommendation, intensifying group polarization and likely pushing users into homogeneous "echo chambers". Echo chambers, in turn, can accelerate the spread of misleading information, false news, and rumors, with negative social consequences. It is therefore necessary to explore echo chamber effects on short video platforms, particularly because the communication paradigms between users and feed algorithms vary greatly across platforms. This paper investigated the echo chamber effects of three popular short video platforms (Douyin, TikTok, and Bilibili) using social network analysis and explored how user features influence the formation of echo chambers. We quantified echo chamber effects through two primary ingredients, selective exposure and homophily, along both platform and topic dimensions. Our analyses indicate that the gathering of users into homogeneous groups dominates online interactions on Douyin and Bilibili. Comparing echo chamber effects across platforms, we found that echo chamber members tend to display themselves to attract the attention of their peers and that cultural differences can prevent the development of echo chambers. Our findings are of great value in designing targeted management strategies to prevent the spread of misleading information, false news, and rumors.
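The abstract names homophily as one ingredient of an echo chamber. As a rough, illustrative sketch (not the authors' method; all names and data below are invented), one could measure how strongly users' opinion leanings correlate with the mean leaning of the accounts they interact with:

```python
def neighbour_mean_leaning(edges, leanings):
    """Map each user to the mean leaning of the users they interact with."""
    neighbours = {}
    for a, b in edges:
        neighbours.setdefault(a, []).append(b)
    return {u: sum(leanings[v] for v in vs) / len(vs)
            for u, vs in neighbours.items()}

def homophily_correlation(edges, leanings):
    """Pearson correlation between a user's own leaning and the mean
    leaning of their interaction partners; near 1 suggests homophily."""
    nm = neighbour_mean_leaning(edges, leanings)
    users = sorted(nm)
    xs = [leanings[u] for u in users]
    ys = [nm[u] for u in users]
    n = len(users)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

# Toy network: two like-minded clusters, each interacting only internally.
edges = [("a", "b"), ("b", "a"), ("a", "c"), ("c", "a"),
         ("d", "e"), ("e", "d"), ("d", "f"), ("f", "d")]
leanings = {"a": 0.9, "b": 0.8, "c": 0.7, "d": -0.8, "e": -0.9, "f": -0.7}
print(round(homophily_correlation(edges, leanings), 2))  # 0.99
```

A correlation close to 1, as here, indicates that users mostly hear leanings like their own, the structural signature of an echo chamber.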
Affiliation(s)
- Yichang Gao: Business School, Shandong Normal University, Ji'nan, 250014, China; Commonwealth Scientific and Industrial Research Organisation (CSIRO), Waite Campus, Urrbrae, SA, 5064, Australia
- Fengming Liu: Business School, Shandong Normal University, Ji'nan, 250014, China
- Lei Gao: Commonwealth Scientific and Industrial Research Organisation (CSIRO), Waite Campus, Urrbrae, SA, 5064, Australia
|
17
|
Fake news believability: The effects of political beliefs and espoused cultural values. INFORMATION & MANAGEMENT 2023; 60. [PMCID: PMC9771845 DOI: 10.1016/j.im.2022.103745] [Citation(s) in RCA: 1] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Indexed: 12/24/2022]
Abstract
Fake news has led to a polarized society, as evidenced by diametrically opposed perceptions of and reactions to global events such as the Coronavirus Disease 2019 (COVID-19) pandemic and presidential campaigns. The popular press has linked individuals’ political beliefs and cultural values to the extent to which they believe false content shared on social networking sites (SNS). However, sweeping generalizations run the risk of exacerbating divisiveness in already polarized societies. This study examines the effects of individuals’ political beliefs and espoused cultural values on fake news believability using a repeated-measures design that exposes individuals to a variety of fake news scenarios. Results from online questionnaire-based survey data collected from participants in the US and India confirm that more conservative individuals tend to exhibit greater fake news believability and show that more collectivist individuals tend to do the same. This study advances knowledge of the characteristics that make individuals more susceptible to lending credence to fake news. In addition, it explores the influence exerted by control variables (i.e., age, sex, and Internet usage). The findings are used to provide implications for theory as well as actionable insights.
|
18
|
Suzuki T, Yamamoto H, Ogawa Y, Umetani R. Effects of media on preventive behaviour during the COVID-19 pandemic. HUMANITIES & SOCIAL SCIENCES COMMUNICATIONS 2023; 10:58. [PMID: 36818040 PMCID: PMC9926457 DOI: 10.1057/s41599-023-01554-9] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 08/08/2022] [Accepted: 02/03/2023] [Indexed: 06/18/2023]
Abstract
The novel coronavirus disease 2019 (COVID-19) pandemic required the implementation of a variety of measures. In addition to pharmaceutical measures, such as vaccines, changing individuals' nonpharmaceutical preventive behaviour is essential to prevent the spread of infection. In uncertain situations, such as a pandemic, media sources are important for guiding individuals' decision-making behaviour. In this study, we examined the effects of media use on preventive behaviour during COVID-19. Earlier studies have shown that social networking service (SNS) browsing promotes preventive behaviour. However, those studies assessed only a single point during the early stages of the pandemic; the effects on ongoing preventive behaviour are therefore unclear. Thus, a two-wave panel survey was conducted in 2020 and 2021 for an exploratory analysis of changes in the effects of media on individuals' preventive behaviour over time. The results show that the effect of SNS browsing on refraining from going out was confirmed only during the early stage of the pandemic and was not observed one year later. Those who shifted from self-restraint to going out within the year were affected not by the type of media they used but by cognitive factors. As conditions change over the course of a pandemic, analyses that consider time-series changes are essential for gaining insights into the effects of media on the promotion and maintenance of continuous preventive behaviour.
|
19
|
Network segregation and the propagation of misinformation. Sci Rep 2023; 13:917. [PMID: 36650189 PMCID: PMC9845210 DOI: 10.1038/s41598-022-26913-5] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/30/2022] [Accepted: 12/21/2022] [Indexed: 01/18/2023] Open
Abstract
How does the ideological segregation of online networks impact the spread of misinformation? Past studies have found that homophily generally increases diffusion, suggesting that partisan news, whether true or false, will spread farther in ideologically segregated networks. We argue that network segregation disproportionately aids messages that are otherwise too implausible to diffuse, thus favoring false over true news. To test this argument, we seeded true and false informational messages in experimental networks in which subjects were either ideologically integrated or segregated, yielding 512 controlled propagation histories in 16 independent information systems. Experimental results reveal that the fraction of false information circulating was systematically greater in ideologically segregated networks. Agent-based models show robustness of this finding across different network topologies and sizes. We conclude that partisan sorting undermines the veracity of information circulating on the Internet by increasing exposure to content that would otherwise not manage to diffuse.
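The paper's core intuition can be sketched as a toy diffusion model (this is an illustrative assumption, not the authors' experimental code): a partisan message that only co-partisans find plausible enough to adopt and retransmit travels farther when co-partisans are clustered together than when they are interspersed with out-partisans.

```python
from collections import deque

def make_adj(edges, n):
    """Undirected adjacency list for n nodes."""
    adj = {i: [] for i in range(n)}
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)
    return adj

def spread(adj, party, seed, msg_party):
    """BFS diffusion where only co-partisans of the message adopt and retransmit."""
    adopted = {seed}
    queue = deque([seed])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in adopted and party[v] == msg_party:
                adopted.add(v)
                queue.append(v)
    return len(adopted)

# Eight agents on a line, four per party (+1 / -1); message aligned with party +1.
line = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (6, 7)]
segregated_party = {0: 1, 1: 1, 2: 1, 3: 1, 4: -1, 5: -1, 6: -1, 7: -1}
integrated_party = {0: 1, 1: -1, 2: 1, 3: -1, 4: 1, 5: -1, 6: 1, 7: -1}
adj = make_adj(line, 8)

print(spread(adj, segregated_party, 0, 1))  # 4: reaches every clustered co-partisan
print(spread(adj, integrated_party, 0, 1))  # 1: out-partisan neighbours block diffusion
```

Under identical topology, the implausible partisan message reaches four times as many agents when the network is ideologically segregated, mirroring the paper's experimental finding.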
|
20
|
Abdalla Mikhaeil C, Baskerville RL. Explaining online conspiracy theory radicalization: A second‐order affordance for identity‐driven escalation. INFORMATION SYSTEMS JOURNAL 2023. [DOI: 10.1111/isj.12427] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 01/17/2023]
|
21
|
Devakumar D, Selvarajah S, Abubakar I, Kim SS, McKee M, Sabharwal NS, Saini A, Shannon G, White AIR, Achiume ET. Racism, xenophobia, discrimination, and the determination of health. Lancet 2022; 400:2097-2108. [PMID: 36502848 DOI: 10.1016/s0140-6736(22)01972-9] [Citation(s) in RCA: 39] [Impact Index Per Article: 19.5] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 12/22/2020] [Revised: 09/29/2022] [Accepted: 10/03/2022] [Indexed: 12/13/2022]
Abstract
This Series shows how racism, xenophobia, discrimination, and the structures that support them are detrimental to health. In this first Series paper, we describe the conceptual model used throughout the Series and the underlying principles and definitions. We explore concepts of epistemic injustice, biological experimentation, and misconceptions about race using a historical lens. We focus on the core structural factors of separation and hierarchical power that permeate society and result in the negative health consequences we see. We are at a crucial moment in history, as populist leaders pushing the politics of hate have become more powerful in several countries. These leaders exploit racism, xenophobia, and other forms of discrimination to divide and control populations, with immediate and long-term consequences for both individual and population health. The COVID-19 pandemic and transnational racial justice movements have brought renewed attention to persisting structural racial injustice.
Affiliation(s)
- Delan Devakumar: Institute for Global Health, University College London, London, UK
- Ibrahim Abubakar: Institute for Global Health, University College London, London, UK
- Seung-Sup Kim: Department of Environmental Health Sciences, Seoul National University, Seoul, South Korea
- Martin McKee: London School of Hygiene & Tropical Medicine, London, UK
- Nidhi S Sabharwal: Centre for Policy Research in Higher Education, National Institute of Educational Planning and Administration, New Delhi, India
- Geordan Shannon: Institute for Global Health, University College London, London, UK
- Alexandre I R White: Johns Hopkins University and Johns Hopkins School of Medicine, Baltimore, MD, USA
|
22
|
Yeung A, Ng E, Abi-Jaoude E. TikTok and Attention-Deficit/Hyperactivity Disorder: A Cross-Sectional Study of Social Media Content Quality. CANADIAN JOURNAL OF PSYCHIATRY. REVUE CANADIENNE DE PSYCHIATRIE 2022; 67:899-906. [PMID: 35196157 PMCID: PMC9659797 DOI: 10.1177/07067437221082854] [Citation(s) in RCA: 54] [Impact Index Per Article: 27.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Indexed: 12/16/2022]
Abstract
OBJECTIVES Social media platforms are increasingly being used to disseminate mental health information online. User-generated content about attention-deficit/hyperactivity disorder (ADHD) is one of the most popular health topics on the video-sharing social media platform TikTok. We sought to investigate the quality of TikTok videos about ADHD. METHOD The top 100 most popular videos about ADHD uploaded by TikTok video creators were classified as misleading, useful, or personal experience. Descriptive and quantitative characteristics of the videos were obtained. The Patient Education Materials Assessment Tool for Audiovisual Materials (PEMAT-A/V) and Journal of American Medical Association (JAMA) benchmark criteria were used to assess the overall quality, understandability, and actionability of the videos. RESULTS Of the 100 videos meeting inclusion criteria, 52% (n = 52) were classified as misleading, 27% (n = 27) as personal experience, and 21% (n = 21) as useful. Classification agreement between clinician ratings was 86% (kappa statistic of 0.7766). Videos on the platform were highly understandable by viewers but had low actionability. Non-healthcare providers uploaded the majority of misleading videos. Healthcare providers uploaded higher quality and more useful videos, compared to non-healthcare providers. CONCLUSIONS Approximately half of the analyzed TikTok videos about ADHD were misleading. Clinicians should be aware of the widespread dissemination of health misinformation on social media platforms and its potential impact on clinical care.
Affiliation(s)
- Anthony Yeung: Department of Psychiatry, University of British Columbia, Vancouver, British Columbia, Canada; Centre for Addiction and Mental Health (CAMH), Toronto, Ontario, Canada
- Enoch Ng: Department of Psychiatry, University of Toronto, Toronto, Ontario, Canada
- Elia Abi-Jaoude: Department of Psychiatry, University of Toronto, Toronto, Ontario, Canada; Department of Psychiatry, The Hospital for Sick Children, University of Toronto, Toronto, Ontario, Canada
|
23
|
Sun M, Ma X, Huo Y. Does Social Media Users' Interaction Influence the Formation of Echo Chambers? Social Network Analysis Based on Vaccine Video Comments on YouTube. INTERNATIONAL JOURNAL OF ENVIRONMENTAL RESEARCH AND PUBLIC HEALTH 2022; 19:15869. [PMID: 36497977 PMCID: PMC9739846 DOI: 10.3390/ijerph192315869] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 10/24/2022] [Revised: 11/21/2022] [Accepted: 11/27/2022] [Indexed: 06/17/2023]
Abstract
The characteristics and influence of the echo chamber effect (TECE) in health misinformation diffusion on social media have been investigated by researchers, but the formation mechanism of TECE needs to be explored more specifically and deeply. This research focuses on the influence of users' imitation, intergroup interaction, and reciprocity behavior on TECE, based on the social contagion mechanism. A user comment-reply social network was constructed from the comments on a COVID-19 vaccine video on YouTube. Semantic similarity and an Exponential Random Graph Model (ERGM) were used to measure TECE and the effect of the three interaction mechanisms on the echo chamber. The results show a weak echo chamber effect in the spread of misinformation about the COVID-19 vaccine. Imitation and intergroup interaction are positively related to TECE, whereas reciprocity has no significant influence.
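The abstract mentions computing semantic similarity between comments. A minimal bag-of-words cosine similarity is one common way to operationalize this; the sketch below is an illustrative assumption (the published study may well use a different text model), with invented example comments:

```python
import math
from collections import Counter

def cosine_similarity(text_a: str, text_b: str) -> float:
    """Bag-of-words cosine similarity between two comments (0 = no overlap, 1 = identical)."""
    va, vb = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm_a = math.sqrt(sum(c * c for c in va.values()))
    norm_b = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Hypothetical vaccine-video comments (illustrative only).
print(cosine_similarity("the vaccine is safe", "the vaccine is effective"))  # 0.75
print(cosine_similarity("the vaccine is safe", "nice weather today"))        # 0.0
```

High pairwise similarity among interacting users is then one signal of the homogeneity that defines an echo chamber.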
Affiliation(s)
- Xiaoyue Ma: School of Journalism and New Media, Xi’an Jiaotong University, Xi’an 710049, China
|
24
|
Yoon HY, You KH, Kwon JH, Kim JS, Rha SY, Chang YJ, Lee SC. Understanding the Social Mechanism of Cancer Misinformation Spread on YouTube and Lessons Learned: Infodemiological Study. J Med Internet Res 2022; 24:e39571. [PMID: 36374534 PMCID: PMC9699593 DOI: 10.2196/39571] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/01/2022] [Revised: 08/30/2022] [Accepted: 10/20/2022] [Indexed: 11/16/2022] Open
Abstract
Background A knowledge gap exists between the list of required actions and the action plan for countering cancer misinformation on social media. Little attention has been paid to a social media strategy for disseminating factual information while also disrupting misinformation on social media networks. Objective The aim of this study was to, first, identify the spread structure of cancer misinformation on YouTube. We asked the question, “How do YouTube videos play an important role in spreading information about the self-administration of anthelmintics for dogs as a cancer medicine for humans?” Second, the study aimed to suggest an action strategy for disrupting misinformation diffusion on YouTube by exploiting the network logic of YouTube information flow and the recommendation system. We asked the question, “What would be a feasible and effective strategy to block cancer misinformation diffusion on YouTube?” Methods The study used the YouTube case of the self-administration of anthelmintics for dogs as an alternative cancer medicine in South Korea. We gathered Korean YouTube videos about the self-administration of fenbendazole. Using the YouTube application programming interface for the query “fenbendazole,” 702 videos from 227 channels were compiled. Then, videos with at least 50,000 views, uploaded between September 2019 and September 2020, were selected from the collection, resulting in 90 videos. Finally, 10 recommended videos for each of the 90 videos were compiled, totaling 573 videos. Social network visualization for the recommended videos was used to identify three intervention strategies for disrupting the YouTube misinformation network. Results The study found evidence of complex contagion by human and machine recommendation systems. 
By exposing stakeholders to multiple information sources on fenbendazole self-administration and by linking them through a recommendation algorithm, YouTube has become the perfect infrastructure for reinforcing the belief that fenbendazole can cure cancer, despite government warnings about the risks and dangers of self-administration. Conclusions Health authorities should upload pertinent information through multiple channels and should exploit the existing YouTube recommendation algorithm to disrupt the misinformation network. Considering the viewing habits of patients and caregivers, the direct use of YouTube hospital channels is more effective than the indirect use of YouTube news media channels or government channels that report public announcements and statements. Reinforcing through multiple channels is the key.
Affiliation(s)
- Ho Young Yoon: Division of Communication and Media, Ewha Womans University, Seoul, Republic of Korea
- Kyung Han You: Department of Media and Communication Studies, Jeonbuk National University, Jeonju, Republic of Korea
- Jung Hye Kwon: Division of Hemato-Oncology, Department of Internal Medicine, Chungnam National University Sejong Hospital, Sejong-Si, Republic of Korea; Division of Hematology and Oncology, Department of Internal Medicine, College of Medicine, Chungnam National University, Daejeon, Republic of Korea; Daejeon Regional Cancer Center, Daejeon, Republic of Korea
- Jung Sun Kim: Division of Hemato-Oncology, Department of Internal Medicine, Chungnam National University Sejong Hospital, Sejong-Si, Republic of Korea
- Sun Young Rha: Division of Medical Oncology, Yonsei University College of Medicine, Seoul, Republic of Korea; Yonsei Cancer Center, Yonsei University Health System, Seoul, Republic of Korea
- Yoon Jung Chang: Division of Cancer Control and Policy, National Cancer Center Korea, Department of Cancer Control and Population Health, National Cancer Center Graduate School of Cancer Science and Policy, Goyang, Republic of Korea
- Sang-Cheol Lee: Division of Hematology and Oncology, Department of Internal Medicine, Soonchunhyang University Hospital Cheonan, Cheonan, Republic of Korea
|
25
|
Kar AK, Tripathi SN, Malik N, Gupta S, Sivarajah U. How Does Misinformation and Capricious Opinions Impact the Supply Chain - A Study on the Impacts During the Pandemic. ANNALS OF OPERATIONS RESEARCH 2022; 327:1-22. [PMID: 36407940 PMCID: PMC9640789 DOI: 10.1007/s10479-022-04997-6] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 12/12/2021] [Revised: 07/18/2022] [Accepted: 09/15/2022] [Indexed: 06/16/2023]
Abstract
Misinformation, or fake news, has had multifaceted ramifications since the onset of the COVID-19 pandemic, creating widespread panic. This study investigates the impact of misinformation and fake news on internet platforms on consumer buying behavior, the impact of fear (created by fake news) on the hoarding of essential products and on consumer spending, and the impact of misinformation-induced panic buying on supply chain disruptions. It draws upon consumer decision theory and cognitive load theory to explain consumers' psychological and behavioral responses. The study follows an inductive approach to theory building using a multi-method design: a qualitative interview-based method, followed by text mining and topic modelling in Python using Latent Dirichlet Allocation (LDA). The findings revealed several prominent themes: a consumer shift to online buying; two contrasting spending intentions, namely financial security and compensatory consumption; irrational panic buying; uncertainty and ambiguity of government protocols and norms; fraudulent social media practices and misinformation dissemination; personalized buying experiences; reduced trust in news and marketers; logistics and transportation bottlenecks; labor shortages due to migration and plant closures; and the bullwhip effect in supply chains.
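LDA topic modelling, as used in the study, can be sketched with a compact collapsed Gibbs sampler in plain Python. This is a standalone illustration of the technique on an invented toy corpus, not the authors' pipeline (which would typically use a library such as gensim or scikit-learn):

```python
import random

def lda_gibbs(docs, n_topics, n_iter=100, alpha=0.1, beta=0.01, seed=0):
    """Collapsed Gibbs sampling for LDA on tokenized documents.
    Returns top-3 words per topic and the document-topic count table."""
    rng = random.Random(seed)
    vocab = sorted({w for d in docs for w in d})
    w2i = {w: i for i, w in enumerate(vocab)}
    V = len(vocab)
    # Random initial topic assignment per token, plus count tables.
    z = [[rng.randrange(n_topics) for _ in d] for d in docs]
    ndk = [[0] * n_topics for _ in docs]      # doc-topic counts
    nkw = [[0] * V for _ in range(n_topics)]  # topic-word counts
    nk = [0] * n_topics                       # tokens per topic
    for di, d in enumerate(docs):
        for ti, w in enumerate(d):
            k = z[di][ti]
            ndk[di][k] += 1
            nkw[k][w2i[w]] += 1
            nk[k] += 1
    for _ in range(n_iter):
        for di, d in enumerate(docs):
            for ti, w in enumerate(d):
                k, wi = z[di][ti], w2i[w]
                # Remove this token, resample its topic, add it back.
                ndk[di][k] -= 1
                nkw[k][wi] -= 1
                nk[k] -= 1
                weights = [(ndk[di][j] + alpha) * (nkw[j][wi] + beta) / (nk[j] + V * beta)
                           for j in range(n_topics)]
                k = rng.choices(range(n_topics), weights=weights)[0]
                z[di][ti] = k
                ndk[di][k] += 1
                nkw[k][wi] += 1
                nk[k] += 1
    top_words = [[vocab[i] for i in sorted(range(V), key=lambda i: -nkw[k][i])[:3]]
                 for k in range(n_topics)]
    return top_words, ndk

# Invented toy corpus echoing the study's themes (panic buying vs. supply chains).
docs = [
    "panic buying empty shelves panic".split(),
    "panic buying hoarding shelves".split(),
    "supply chain logistics delay".split(),
    "supply chain transport delay".split(),
]
topics, doc_topic = lda_gibbs(docs, n_topics=2)
print(topics)
```

With enough iterations the two topics tend to separate into a "panic buying" and a "supply chain" word cluster, which is the kind of theme extraction the study reports.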
Affiliation(s)
- Arpan Kumar Kar: Yardi School of Artificial Intelligence and Department of Management Studies, Indian Institute of Technology Delhi, Hauz Khas, New Delhi, 110016 India
- Shalini Nath Tripathi: Jaipuria Institute of Management, Hahnemann Road, Vineet Khand, Gomti Nagar, 226010 Lucknow, Uttar Pradesh India
- Nishtha Malik: Jaipuria Institute of Management, Hahnemann Road, Vineet Khand, Gomti Nagar, 226010 Lucknow, Uttar Pradesh India
- Shivam Gupta: Department of Information Systems, Supply Chain Management & Decision Support, NEOMA Business School, 59 Rue Pierre Taittinger, 51100 Reims, France
|
26
|
Xiao X. Not doomed: Examining the path from misinformation exposure to verification and correction in the context of COVID-19 pandemic. TELEMATICS AND INFORMATICS 2022; 74:101890. [PMID: 36213556 PMCID: PMC9527493 DOI: 10.1016/j.tele.2022.101890] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/22/2022] [Revised: 07/25/2022] [Accepted: 09/23/2022] [Indexed: 11/20/2022]
Abstract
Misinformation exposure has attracted growing scholarly attention. While much research highlights misinformation exposure's negative impacts, this study argues that its positive effects should also be noted. By using a more precise measurement of misinformation exposure and a path model, this study outlines a path from misinformation exposure to anti-misinformation behaviors, partially mediated by misperceptions in the context of COVID-19. Findings indicate that exposure to popular but widely-denounced COVID-19 misinformation via social media had positive effects on verification intention. Frequent exposure to misinformation on social media is associated with lower misperceptions, suggesting a healthy dose of skepticism toward the platform and low internalization of misinformation. Special attention, however, needs to be paid to online news websites and personal contacts as misinformation sources. More tailored interventions and communication strategies to reduce misperceptions and increase media-literate behaviors are needed for younger, conservative, and ethnic minority individuals. Theoretical and practical implications are further discussed.
Affiliation(s)
- Xizhu Xiao: School of Literature, Journalism and Communication, Qingdao University, Qingdao, Shandong 266071, China
|
27
|
How digital media drive affective polarization through partisan sorting. Proc Natl Acad Sci U S A 2022; 119:e2207159119. [PMID: 36215484 PMCID: PMC9586282 DOI: 10.1073/pnas.2207159119] [Citation(s) in RCA: 11] [Impact Index Per Article: 5.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022] Open
Abstract
Politics has in recent decades entered an era of intense polarization. Explanations have implicated digital media, with the so-called echo chamber remaining a dominant causal hypothesis despite growing challenge by empirical evidence. This paper suggests that this mounting evidence provides not only reason to reject the echo chamber hypothesis but also the foundation for an alternative causal mechanism. To propose such a mechanism, the paper draws on the literatures on affective polarization, digital media, and opinion dynamics. From the affective polarization literature, we follow the move from seeing polarization as diverging issue positions to seeing it as rooted in sorting: an alignment of differences that is effectively dividing the electorate into two increasingly homogeneous megaparties. To explain the rise in sorting, the paper draws on opinion dynamics and digital media research to present a model which essentially turns the echo chamber on its head: it is not isolation from opposing views that drives polarization but precisely the fact that digital media bring us to interact outside our local bubble. When individuals interact locally, the outcome is a stable plural patchwork of cross-cutting conflicts. By encouraging nonlocal interaction, digital media drive an alignment of conflicts along partisan lines, thus effacing the counterbalancing effects of local heterogeneity. The result is polarization, even if individual interaction leads to convergence. The model thus suggests that digital media polarize through partisan sorting, creating a maelstrom in which more and more identities, beliefs, and cultural preferences become drawn into an all-encompassing societal division.
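The notion of sorting as an "alignment of differences" can be made concrete as a simple index: the absolute correlation between two identity dimensions across the electorate. The function and toy populations below are illustrative assumptions, not the paper's model:

```python
def sorting_index(dim_a, dim_b):
    """Absolute Pearson correlation between two +1/-1 identity dimensions.
    0 = cross-cutting (unaligned) conflicts, 1 = fully sorted megaparties."""
    n = len(dim_a)
    mean_a, mean_b = sum(dim_a) / n, sum(dim_b) / n
    cov = sum((a - mean_a) * (b - mean_b) for a, b in zip(dim_a, dim_b))
    sd_a = sum((a - mean_a) ** 2 for a in dim_a) ** 0.5
    sd_b = sum((b - mean_b) ** 2 for b in dim_b) ** 0.5
    return abs(cov / (sd_a * sd_b)) if sd_a and sd_b else 0.0

# Toy electorate of four voters: party on one axis, a cultural identity on the other.
crosscutting = ([1, 1, -1, -1], [1, -1, 1, -1])  # identities vary independently
fully_sorted = ([1, 1, -1, -1], [1, 1, -1, -1])  # identities aligned along party

print(sorting_index(*crosscutting))  # 0.0
print(sorting_index(*fully_sorted))  # 1.0
```

On this index, the paper's claim is that nonlocal (digitally mediated) interaction pushes a population from the first configuration toward the second, even while individual opinions converge.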
|
28
|
Hodson J, O’Meara V, Thompson C, Houlden S, Gosse C, Veletsianos G. "My People Already Know That": The Imagined Audience and COVID-19 Health Information Sharing Practices on Social Media. SOCIAL MEDIA + SOCIETY 2022; 8:20563051221122463. [PMID: 36160699 PMCID: PMC9490384 DOI: 10.1177/20563051221122463] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Indexed: 06/16/2023]
Abstract
This article examines how imagined audiences and impression management strategies shape COVID-19 health information sharing practices on social media and considers the implications of this for combatting the spread of misinformation online. In an interview study with 27 Canadian adults, participants were shown two infographics about masks and vaccines produced by the World Health Organization (WHO) and asked whether or not they would share these on social media. We find that interviewees' willingness to share the WHO infographics is negotiated against their mental perception of the online audience, which is conceptualized in three distinct ways. First, interviewees who would not share the infographics frequently describe a self-similar audience of peers that are "in the know" about COVID-19; second, those who might share the infographics conjure a specific and contextual audience who "needs" the information; and finally, those who said they would share the infographics most frequently conjure an abstract audience of "the public" or "my community" to explain that decision. Implications of these sharing behaviors for combatting the spread of misinformation are discussed.
|
29
|
Diaz-Diaz F, San Miguel M, Meloni S. Echo chambers and information transmission biases in homophilic and heterophilic networks. Sci Rep 2022; 12:9350. [PMID: 35672432 PMCID: PMC9174247 DOI: 10.1038/s41598-022-13343-6] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/17/2022] [Accepted: 05/23/2022] [Indexed: 12/04/2022] Open
Abstract
We study how information transmission biases arise by the interplay between the structural properties of the network and the dynamics of the information in synthetic scale-free homophilic/heterophilic networks. We provide simple mathematical tools to quantify these biases. Both Simple and Complex Contagion models are insufficient to predict significant biases. In contrast, a Hybrid Contagion model-in which both Simple and Complex Contagion occur-gives rise to three different homophily-dependent biases: emissivity and receptivity biases, and echo chambers. Simulations in an empirical network with high homophily confirm our findings. Our results shed light on the mechanisms that cause inequalities in the visibility of information sources, reduced access to information, and lack of communication among distinct groups.
Collapse
Affiliation(s)
- Fernando Diaz-Diaz
- IFISC (UIB-CSIC), Institute for Cross-Disciplinary Physics and Complex Systems, Campus Universitat de les Illes Balears, 07122, Palma de Mallorca, Spain
| | - Maxi San Miguel
- IFISC (UIB-CSIC), Institute for Cross-Disciplinary Physics and Complex Systems, Campus Universitat de les Illes Balears, 07122, Palma de Mallorca, Spain
| | - Sandro Meloni
- IFISC (UIB-CSIC), Institute for Cross-Disciplinary Physics and Complex Systems, Campus Universitat de les Illes Balears, 07122, Palma de Mallorca, Spain.
| |
Collapse
|
30
|
Understanding the “infodemic”: social media news use, homogeneous online discussion, self-perceived media literacy and misperceptions about COVID-19. ONLINE INFORMATION REVIEW 2022. [DOI: 10.1108/oir-06-2021-0305] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
Abstract
Purpose
This study has three main purposes: (1) to investigate the association between social media news use and misperceptions about COVID-19; (2) to explore the mediating role of homogeneous online discussion; and (3) to understand whether the extent to which one perceives themselves as media-literate could moderate the relationship.
Design/methodology/approach
The authors conducted an online survey and collected data through Amazon Mechanical Turk. A total of 797 participants aged 18 and above completed the survey. The average age of the respondents is 38.40 years (SD = 12.31), and 41.2% were female. In terms of party identification, 30.8% reported leaning toward Republicans, 53.7% leaned toward Democrats, and 15.4% reported being neutral.
Findings
Results from a moderated mediation model show that social media news use is positively associated with misperceptions about COVID-19. Moreover, homogeneous online discussion was a significant mediator of the relationship between social media news use and misperceptions about COVID-19. Further, self-perceived media literacy (SPML) significantly moderated the main and indirect effects between social media news use and COVID-19 misperceptions, such that the associations became weaker among those with higher SPML.
Originality/value
Findings provide insights into the significance of online information sources, discussion network heterogeneity and media literacy education. Although there have been many studies on misinformation, prior research has not examined these relationships, which may help provide solutions to cope with misinformation.
Peer review
The peer review history for this article is available at: https://publons.com/publon/10.1108/OIR-06-2021-0305
Collapse
|
31
|
Horsevad N, Mateo D, Kooij RE, Barrat A, Bouffanais R. Transition from simple to complex contagion in collective decision-making. Nat Commun 2022; 13:1442. [PMID: 35301305 PMCID: PMC8931172 DOI: 10.1038/s41467-022-28958-6] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/15/2021] [Accepted: 02/16/2022] [Indexed: 11/20/2022] Open
Abstract
How does the spread of behavior affect consensus-based collective decision-making among animals, humans or swarming robots? In prior research, such propagation of behavior on social networks has been found to exhibit a transition from simple contagion—i.e., based on pairwise interactions—to a complex one—i.e., involving social influence and reinforcement. However, this rich phenomenology appears so far limited to threshold-based decision-making processes with binary options. Here, we show theoretically, and experimentally with a multi-robot system, that such a transition from simple to complex contagion can also be observed in an archetypal model of distributed decision-making devoid of any thresholds or nonlinearities. Specifically, we uncover two key results: the nature of the contagion—simple or complex—is tightly related to the intrinsic pace of the behavior that is spreading, and the network topology strongly influences the effectiveness of the behavioral transmission in ways that are reminiscent of threshold-based models. These results offer new directions for the empirical exploration of behavioral contagions in groups, and have significant ramifications for the design of cooperative and networked robot systems. In consensus-based collective dynamics, the occurrence of simple and complex contagions shapes system behavior. The authors analyze a transition from simple to complex contagions in collective decision-making processes based on consensus, and demonstrate it with a swarm robotic system.
Collapse
Affiliation(s)
| | | | - Robert E Kooij
- Delft University of Technology, Delft, The Netherlands
- The Netherlands Organization for Applied Scientific Research (TNO), The Hague, The Netherlands
| | - Alain Barrat
- Aix Marseille Univ, Université de Toulon, CNRS, CPT, Turing Center for Living Systems, Marseille, France
- Tokyo Tech World Research Hub Initiative (WRHI), Tokyo Institute of Technology, Tokyo, Japan
| | | |
Collapse
|
32
|
Misinformation: susceptibility, spread, and interventions to immunize the public. Nat Med 2022; 28:460-467. [PMID: 35273402 DOI: 10.1038/s41591-022-01713-6] [Citation(s) in RCA: 111] [Impact Index Per Article: 55.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/21/2021] [Accepted: 01/24/2022] [Indexed: 11/09/2022]
Abstract
The spread of misinformation poses a considerable threat to public health and the successful management of a global pandemic. For example, studies find that exposure to misinformation can undermine vaccination uptake and compliance with public-health guidelines. As research on the science of misinformation is rapidly emerging, this conceptual Review summarizes what we know along three key dimensions of the infodemic: susceptibility, spread, and immunization. Extant research is evaluated on the questions of why (some) people are (more) susceptible to misinformation, how misinformation spreads in online social networks, and which interventions can help to boost psychological immunity to misinformation. Implications for managing the infodemic are discussed.
Collapse
|
33
|
Spread of gambling abstinence through peers and comments in online self-help chat forums to quit gambling. Sci Rep 2022; 12:3675. [PMID: 35256679 PMCID: PMC8901770 DOI: 10.1038/s41598-022-07714-2] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/08/2021] [Accepted: 02/22/2022] [Indexed: 11/08/2022] Open
Abstract
Habit formation occurs in relation to peer habits and comments. This general principle was applied to gambling abstinence in the context of online self-help forums to quit gambling. Participants in this study, conducted between September 2008 and March 2020, were 161 abstinent and 928 non-abstinent gamblers who participated in online self-help chat forums to quit gambling. They received 269,317 comments during their first 3 years of forum participation. Gamblers had an increased likelihood of 3-year continuous gambling abstinence if they had many peers in the forums. However, they had a decreased likelihood of gambling abstinence if they received rejective comments from the forums. Based on these results, online social network-based interventions may be a new treatment option for gamblers.
Collapse
|
34
|
Trotta A, Marinaro M, Cavalli A, Cordisco M, Piperis A, Buonavoglia C, Corrente M. African Swine Fever-How to Unravel Fake News in Veterinary Medicine. Animals (Basel) 2022; 12:ani12050656. [PMID: 35268224 PMCID: PMC8909113 DOI: 10.3390/ani12050656] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/09/2022] [Revised: 02/28/2022] [Accepted: 03/03/2022] [Indexed: 12/24/2022] Open
Abstract
In recent years, fake scientific news has spread much faster through the Internet and social media within the so-called "infodemic". African Swine Fever (ASF) is a perfect case study to prove how fake news can undermine the public health response, even in the veterinary field. ASF is a highly contagious infective disease affecting exclusively domestic pigs and wild pigs such as wild boars. ASF can cause social damage and economic losses both directly (due to the high mortality rate) and indirectly (due to international sanctions). Although ASF is not a threat to human health, since 2018 newspapers have often reported false or misleading news, ranging from misinterpreted findings/data to fake or alarmist news. In some cases, fake news was spread, such as claims that snipers were deployed at national borders to kill wild boars, or reports of possible risks to human health. In order to provide real and fact-based news on epidemics, some organizations have created easy-to-read infographic and iconographic materials, available on their websites, to help readers identify fake news. Indeed, it is crucial that governments and scientific organizations work against fear and anxiety, using simple and clear communication.
Collapse
Affiliation(s)
- Adriana Trotta
- Department of Veterinary Medicine, University of Bari “Aldo Moro”, Str. Prov. per Casamassima Km 3, 70010 Valenzano, Italy; (A.C.); (M.C.); (A.P.); (C.B.); (M.C.)
| | - Mariarosaria Marinaro
- Department of Infectious Diseases, Istituto Superiore di Sanità, Viale Regina Elena 299, 00161 Rome, Italy;
| | - Alessandra Cavalli
- Department of Veterinary Medicine, University of Bari “Aldo Moro”, Str. Prov. per Casamassima Km 3, 70010 Valenzano, Italy; (A.C.); (M.C.); (A.P.); (C.B.); (M.C.)
| | - Marco Cordisco
- Department of Veterinary Medicine, University of Bari “Aldo Moro”, Str. Prov. per Casamassima Km 3, 70010 Valenzano, Italy; (A.C.); (M.C.); (A.P.); (C.B.); (M.C.)
| | - Angela Piperis
- Department of Veterinary Medicine, University of Bari “Aldo Moro”, Str. Prov. per Casamassima Km 3, 70010 Valenzano, Italy; (A.C.); (M.C.); (A.P.); (C.B.); (M.C.)
| | - Canio Buonavoglia
- Department of Veterinary Medicine, University of Bari “Aldo Moro”, Str. Prov. per Casamassima Km 3, 70010 Valenzano, Italy; (A.C.); (M.C.); (A.P.); (C.B.); (M.C.)
| | - Marialaura Corrente
- Department of Veterinary Medicine, University of Bari “Aldo Moro”, Str. Prov. per Casamassima Km 3, 70010 Valenzano, Italy; (A.C.); (M.C.); (A.P.); (C.B.); (M.C.)
| |
Collapse
|
35
|
Rivera YM, Moran MB, Thrul J, Joshu C, Smith KC. Contextualizing Engagement With Health Information on Facebook: Using the Social Media Content and Context Elicitation Method. J Med Internet Res 2022; 24:e25243. [PMID: 35254266 PMCID: PMC8933799 DOI: 10.2196/25243] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/25/2020] [Revised: 08/07/2021] [Accepted: 12/27/2021] [Indexed: 11/13/2022] Open
Abstract
Background
Most of what is known regarding health information engagement on social media stems from quantitative methodologies. Public health literature often quantifies engagement by measuring likes, comments, and/or shares of posts within health organizations’ Facebook pages. However, this content may not represent the health information (and misinformation) generally available to and consumed by platform users. Furthermore, some individuals may prefer to engage with information without leaving quantifiable digital traces. Mixed methods approaches may provide a way of surpassing the constraints of assessing engagement with health information by using only currently available social media metrics.
Objective
This study aims to discuss the limitations of current approaches in assessing health information engagement on Facebook and presents the social media content and context elicitation method, a qualitatively driven, mixed methods approach to understanding engagement with health information and how engagement may lead to subsequent actions.
Methods
Data collection, management, and analysis using the social media content and context elicitation method are presented. This method was developed for a broader study exploring how and why US Latinos and Latinas engage with cancer prevention and screening information on Facebook. The study included 20 participants aged between 40 and 75 years without cancer who participated in semistructured, in-depth interviews to discuss their Facebook use and engagement with cancer information on the platform. Participants accessed their Facebook account alongside the researcher, typed cancer in the search bar, and discussed cancer-related posts they engaged with during the previous 12 months. Engagement was defined as liking, commenting, and/or sharing a post; clicking on a post link; reading an article in a post; and/or watching a video within a post. Content engagement prompted questions regarding the reasons for engagement and whether engagement triggered further action. Data were managed using MAXQDA (VERBI GmbH) and analyzed using thematic and content analyses.
Results
Data emerging from the social media content and context elicitation method demonstrated that participants mainly engaged with cancer prevention and screening information by viewing and/or reading content (48/66, 73%) without liking, commenting, or sharing it. This method provided rich content regarding how US Latinos and Latinas engage with and act upon cancer prevention and screening information on Facebook. We present 2 emblematic cases from the main study to exemplify the additional information and context elicited from this methodology, which is currently lacking from quantitative approaches.
Conclusions
The social media content and context elicitation method allows a better representation and deeper contextualization of how people engage with and act upon health information and misinformation encountered on social media. This method may be applied to future studies regarding how to best communicate health information on social media, including how these affect assessments of message credibility and accuracy, which can influence health outcomes.
Collapse
Affiliation(s)
- Yonaira M Rivera
- Department of Communication, School of Communication & Information, Rutgers University, New Brunswick, NJ, United States
- Cancer Prevention and Control Program, Rutgers Cancer Institute of New Jersey, New Brunswick, NJ, United States
| | - Meghan B Moran
- Department of Health, Behavior & Society, Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, United States
| | - Johannes Thrul
- Department of Mental Health, Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, United States
| | - Corinne Joshu
- Department of Epidemiology, Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, United States
| | - Katherine C Smith
- Department of Health, Behavior & Society, Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, United States
| |
Collapse
|
36
|
Sánchez‐Rodríguez Á, Moreno‐Bella E. Are you interested in economic inequality? Depends on where you live. ASIAN JOURNAL OF SOCIAL PSYCHOLOGY 2022. [DOI: 10.1111/ajsp.12458] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/28/2022]
Affiliation(s)
| | - Eva Moreno‐Bella
- Department of Social Psychology University of Granada Granada Spain
| |
Collapse
|
37
|
Törnberg P, Olbrich E, Uitermark J. Editorial: The Computational Analysis of Cultural Conflicts. Front Big Data 2022; 5:840584. [PMID: 35299883 PMCID: PMC8923055 DOI: 10.3389/fdata.2022.840584] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/21/2021] [Accepted: 01/17/2022] [Indexed: 12/05/2022] Open
Affiliation(s)
- Petter Törnberg
- Amsterdam Institute for Social Science Research, University of Amsterdam, Amsterdam, Netherlands
- *Correspondence: Petter Törnberg
| | - Eckehard Olbrich
- Max Planck Institute for Mathematics in the Sciences, Leipzig, Germany
| | - Justus Uitermark
- Amsterdam Institute for Social Science Research, University of Amsterdam, Amsterdam, Netherlands
| |
Collapse
|
38
|
Gajewski ŁG, Sienkiewicz J, Hołyst JA. Transitions between polarization and radicalization in a temporal bilayer echo-chamber model. Phys Rev E 2022; 105:024125. [PMID: 35291103 DOI: 10.1103/physreve.105.024125] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/11/2021] [Accepted: 02/03/2022] [Indexed: 06/14/2023]
Abstract
Echo chambers and polarization dynamics are, as of late, a very prominent topic in scientific communities around the world. As these phenomena directly affect our lives, seemingly more and more as our societies and communication channels evolve, it becomes ever so important for us to understand the intricacies of opinion dynamics in the modern era. Here we extend an existing echo-chamber model with activity-driven agents to a bilayer topology and study the dynamics of the polarized state as a function of interlayer couplings. Different cases of such couplings are presented: unidirectional coupling that can be reduced to a monolayer facing an external bias and symmetric and nonsymmetric couplings. We have assumed that initial conditions impose system polarization and agent opinions are different for both layers. Such a preconditioned polarized state can persist without explicit homophilic interactions provided the coupling strength between agents belonging to different layers is weak enough. For a strong unidirectional or attractive coupling between two layers a discontinuous transition to a radicalized state takes place when mean opinions in both layers are the same. When coupling constants between the layers are of different signs, the system exhibits sustained or decaying oscillations. Transitions between these states are analyzed using a mean field approximation and classified in the framework of bifurcation theory.
Collapse
Affiliation(s)
- Łukasz G Gajewski
- Faculty of Physics, Warsaw University of Technology, Koszykowa 75, 00-662 Warszawa, Poland
| | - Julian Sienkiewicz
- Faculty of Physics, Warsaw University of Technology, Koszykowa 75, 00-662 Warszawa, Poland
| | - Janusz A Hołyst
- Faculty of Physics, Warsaw University of Technology, Koszykowa 75, 00-662 Warszawa, Poland and ITMO University, Kronverkskiy Prospekt 49, St. Petersburg, 197101 Russia
| |
Collapse
|
39
|
Horner CG, Galletta D, Crawford J, Shirsat A. Emotions: The Unexplored Fuel of Fake News on Social Media. J MANAGE INFORM SYST 2022. [DOI: 10.1080/07421222.2021.1990610] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/19/2022]
Affiliation(s)
- Christy Galletta Horner
- Bowling Green State University, School of Education and Human Development, Bowling Green, Ohio
| | - Dennis Galletta
- University of Pittsburgh, Katz Graduate School of Business, Pittsburgh, PA
| | - Jennifer Crawford
- Bowling Green State University, School of Education and Human Development, Bowling Green, Ohio
| | - Abhijeet Shirsat
- College of Continuing Education, California State University, Sacramento, California
| |
Collapse
|
40
|
Batailler C, Brannon SM, Teas PE, Gawronski B. A Signal Detection Approach to Understanding the Identification of Fake News. PERSPECTIVES ON PSYCHOLOGICAL SCIENCE 2022; 17:78-98. [PMID: 34264150 DOI: 10.1177/1745691620986135] [Citation(s) in RCA: 12] [Impact Index Per Article: 6.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/16/2022]
Abstract
Researchers across many disciplines seek to understand how misinformation spreads with a view toward limiting its impact. One important question in this research is how people determine whether a given piece of news is real or fake. In the current article, we discuss the value of signal detection theory (SDT) in disentangling two distinct aspects in the identification of fake news: (a) ability to accurately distinguish between real news and fake news and (b) response biases to judge news as real or fake regardless of news veracity. The value of SDT for understanding the determinants of fake-news beliefs is illustrated with reanalyses of existing data sets, providing more nuanced insights into how partisan bias, cognitive reflection, and prior exposure influence the identification of fake news. Implications of SDT for the use of source-related information in the identification of fake news, interventions to improve people's skills in detecting fake news, and the debunking of misinformation are discussed.
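The two SDT quantities this abstract distinguishes — discrimination ability and response bias — can be sketched from a confusion matrix of fake-news judgments. This is a generic textbook computation, not the authors' analysis code; the example counts and the log-linear correction are illustrative assumptions.

```python
from statistics import NormalDist

def sdt_measures(hits, misses, false_alarms, correct_rejections):
    """Compute d' (discrimination) and c (response bias) from raw counts.

    Here a 'hit' is fake news correctly judged fake, and a 'false alarm'
    is real news wrongly judged fake. The +0.5/+1 log-linear correction
    avoids infinite z-scores when a rate would be exactly 0 or 1.
    """
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    d_prime = z(hit_rate) - z(fa_rate)          # higher = better discrimination
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # negative = bias toward "fake"
    return d_prime, criterion

# Hypothetical judgments over 100 fake and 100 real headlines
d, c = sdt_measures(hits=80, misses=20, false_alarms=30, correct_rejections=70)
```

Separating the two measures matters for the reanalyses described above: a manipulation such as partisan congruence can shift `c` (calling more items "real") without changing `d_prime` at all.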
Collapse
Affiliation(s)
| | | | - Paul E Teas
- Department of Psychology, University of Illinois at Chicago
| | | |
Collapse
|
41
|
Siem B, Kretzmeyer B, Stürmer S. The role of self-evaluation in predicting attitudes toward supporters of COVID-19-related conspiracy theories: A direct and a conceptual replication of Cichocka et al. (2016). JOURNAL OF PACIFIC RIM PSYCHOLOGY 2021. [DOI: 10.1177/18344909211052587] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/17/2022] Open
Abstract
We examined the role of people’s self-evaluation in predicting their attitudes toward supporters of COVID-19-related conspiracy theories by replicating and extending the findings of a study by Cichocka et al. (2016, Study 3) in two preregistered studies (total N = 1179). Study 1, a direct replication, confirmed that narcissism and self-esteem—two different sources of people’s self-evaluation—differentially predicted their beliefs in a series of well-known conspiracy theories (not related to COVID-19), and served as mutual suppressor variables. Specifically, narcissism was positively related and self-esteem was negatively related to conspiracy beliefs, especially when the respective other predictor was controlled for. Study 2 extended Cichocka’s and our Study 1’s findings by testing the differential role of self-esteem and narcissism in predicting a COVID-19-specific criterion. Specifically, we focused on people’s rejection of supporters of COVID-19 conspiracy theories, a criterion we deem particularly important in curtailing the spread of these theories. Results were generally in line with previous findings, but effects were substantially weaker. As suggested by exploratory analyses, this might be due to the fact that the overall rejection of supporters measure comprises not only items capturing rejection of supporters but also items capturing low beliefs in conspiracy theories. These two distinct components differentially related to self-esteem and narcissism: the differential role of self-esteem and narcissism could only be replicated for the “low belief” subcomponent (thus replicating findings from the original study and from Study 1) but not for the “rejection of supporters” subcomponent. The present work thus contributes to recent research suggesting that low belief in conspiracy theories and the rejection of their supporters might be qualitatively different responses with unique antecedents.
Collapse
|
42
|
Törnberg P, Andersson C, Lindgren K, Banisch S. Modeling the emergence of affective polarization in the social media society. PLoS One 2021; 16:e0258259. [PMID: 34634056 PMCID: PMC8504759 DOI: 10.1371/journal.pone.0258259] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/28/2021] [Accepted: 09/22/2021] [Indexed: 12/03/2022] Open
Abstract
Rising political polarization in recent decades has hampered and gridlocked policymaking, as well as weakened trust in democratic institutions. These developments have been linked to the idea that new media technology fosters extreme views and political conflict by facilitating self-segregation into "echo chambers" where opinions are isolated and reinforced. This opinion-centered picture has recently been challenged by an emerging political science literature on "affective polarization", which suggests that current polarization is better understood as driven by partisanship emerging as a strong social identity. Through this lens, politics has become a question of competing social groups rather than differences in policy position. Contrary to the opinion-centered view, this identity-centered perspective has not been subject to dynamical formal modeling, which generally permits hypotheses about micro-level explanations for macro-level phenomena to be systematically tested and explored. We here propose a formal model that links new information technology to affective polarization via social psychological mechanisms of social identity. Our results suggest that new information technology catalyzes affective polarization by lowering search and interaction costs, which shifts the balance between centrifugal and centripetal forces of social identity. We find that the macro-dynamics of social identity is characterized by two stable regimes on the societal level: one fluid regime, in which identities are weak and social connections heterogeneous, and one solid regime in which identities are strong and groups homogeneous. We also find evidence of hysteresis, meaning that a transition into a fragmented state is not readily reversed by again increasing those costs. This suggests that, due to systemic feedback effects, if polarization passes certain tipping points, we may experience run-away political polarization that is highly difficult to reverse.
Collapse
Affiliation(s)
- Petter Törnberg
- Amsterdam Institute for Social Science Research, University of Amsterdam, Amsterdam, The Netherlands
| | - Claes Andersson
- Complex Systems Group, Physical Resource Theory, Chalmers University of Technology, Gothenburg, Sweden
- European Centre for Living Technology, University of Venice Ca’ Foscari, Venice, Italy
| | - Kristian Lindgren
- Complex Systems Group, Physical Resource Theory, Chalmers University of Technology, Gothenburg, Sweden
| | - Sven Banisch
- Max Planck Institute for Mathematics in the Sciences, Max Planck Gesellschaft, Leipzig, Germany
| |
Collapse
|
43
|
Roitero K, Soprano M, Portelli B, De Luise M, Spina D, Mea VD, Serra G, Mizzaro S, Demartini G. Can the crowd judge truthfulness? A longitudinal study on recent misinformation about COVID-19. PERSONAL AND UBIQUITOUS COMPUTING 2021; 27:59-89. [PMID: 34545278 PMCID: PMC8444165 DOI: 10.1007/s00779-021-01604-6] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 02/01/2021] [Accepted: 07/12/2021] [Indexed: 06/13/2023]
Abstract
Recently, the misinformation problem has been addressed with a crowdsourcing-based approach: to assess the truthfulness of a statement, instead of relying on a few experts, a crowd of non-experts is exploited. We study whether crowdsourcing is an effective and reliable method to assess truthfulness during a pandemic, targeting statements related to COVID-19, thus addressing (mis)information that is both related to a sensitive and personal issue and very recent as compared to when the judgment is done. In our experiments, crowd workers are asked to assess the truthfulness of statements and to provide evidence for their assessments. Besides showing that the crowd is able to accurately judge the truthfulness of the statements, we report results on workers' behavior, agreement among workers, and the effects of aggregation functions, scale transformations, and workers' background and bias. We perform a longitudinal study by re-launching the task multiple times with both novice and experienced workers, deriving important insights on how behavior and quality change over time. Our results show that workers are able to detect and objectively categorize online (mis)information related to COVID-19; both crowdsourced and expert judgments can be transformed and aggregated to improve quality; worker background and other signals (e.g., source of information, behavior) impact the quality of the data. The longitudinal study demonstrates that the time-span has a major effect on the quality of the judgments, for both novice and experienced workers. Finally, we provide an extensive failure analysis of the statements misjudged by the crowd workers.
Collapse
|
44
|
Kauk J, Kreysa H, Schweinberger SR. Understanding and countering the spread of conspiracy theories in social networks: Evidence from epidemiological models of Twitter data. PLoS One 2021; 16:e0256179. [PMID: 34383860 PMCID: PMC8360523 DOI: 10.1371/journal.pone.0256179] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/09/2021] [Accepted: 08/02/2021] [Indexed: 11/24/2022] Open
Abstract
Conspiracy theories in social networks are considered to have adverse effects on individuals' compliance with public health measures in the context of a pandemic situation. A deeper understanding of how conspiracy theories propagate through social networks is critical for the development of countermeasures. The present work focuses on a novel approach to characterize the propagation of conspiracy theories through social networks by applying epidemiological models to Twitter data. A Twitter dataset was searched for tweets containing hashtags indicating belief in the "5GCoronavirus" conspiracy theory, which states that the COVID-19 pandemic is a result of, or enhanced by, the rollout of the 5G mobile network. Despite the absence of any scientific evidence, the "5GCoronavirus" conspiracy theory propagated rapidly through Twitter, beginning at the end of January, peaking at the beginning of April, and all but disappearing by the end of June 2020. An epidemic SIR (Susceptible-Infected-Removed) model was fitted to this time series with acceptable model fit, indicating parallels between the propagation of conspiracy theories in social networks and infectious diseases. Extended SIR models were used to simulate the effects that two specific countermeasures, fact-checking and tweet-deletion, could have had on the propagation of the conspiracy theory. Our simulations indicate that fact-checking is an effective mechanism in an early stage of conspiracy theory diffusion, while tweet-deletion shows only moderate efficacy but is less time-sensitive. More generally, an early response is critical to gain control over the spread of conspiracy theories through social networks. We conclude that an early response combined with strong fact-checking and a moderate level of deletion of problematic posts is a promising strategy to fight conspiracy theories in social networks. Results are discussed with respect to their theoretical validity and generalizability.
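The SIR dynamics this abstract fits to the "5GCoronavirus" time series can be sketched with a discrete-time toy simulation, where "infection" stands for adopting or retweeting the conspiracy theory and "removal" for losing interest (or content deletion). The parameter values, step size, and population below are illustrative assumptions, not the fitted values from the study.

```python
def sir_step(S, I, R, beta, gamma, N):
    """One Euler step of the SIR model: dS = -beta*S*I/N, dI = beta*S*I/N - gamma*I."""
    new_inf = beta * S * I / N   # susceptibles exposed to the theory this step
    new_rem = gamma * I          # adopters who stop spreading it
    return S - new_inf, I + new_inf - new_rem, R + new_rem

def simulate(S0, I0, days, beta, gamma):
    """Return the daily prevalence curve I(t) for a closed population."""
    N = S0 + I0
    S, I, R = float(S0), float(I0), 0.0
    series = [I]
    for _ in range(days):
        S, I, R = sir_step(S, I, R, beta, gamma, N)
        series.append(I)
    return series

# Hypothetical parameters with basic reproduction number R0 = beta/gamma = 2
curve = simulate(S0=9990, I0=10, days=120, beta=0.4, gamma=0.2)
peak_day = max(range(len(curve)), key=curve.__getitem__)
```

The resulting curve rises, peaks, and decays — the same qualitative rise-and-fall shape the study observed between late January and June 2020. A countermeasure like fact-checking would enter such a sketch as a reduction in `beta`; tweet-deletion as an increase in `gamma`.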
Collapse
Affiliation(s)
- Julian Kauk
- Department of General Psychology and Cognitive Neuroscience, Friedrich Schiller University Jena, Jena, Germany
| | - Helene Kreysa
- Department of General Psychology and Cognitive Neuroscience, Friedrich Schiller University Jena, Jena, Germany
| | - Stefan R. Schweinberger
- Department of General Psychology and Cognitive Neuroscience, Friedrich Schiller University Jena, Jena, Germany
- DFG Research Unit Person Perception, Friedrich Schiller University Jena, Jena, Germany
| |
Collapse
|
45
|
Forati AM, Ghose R. Geospatial analysis of misinformation in COVID-19 related tweets. APPLIED GEOGRAPHY (SEVENOAKS, ENGLAND) 2021; 133:102473. [PMID: 34103772 PMCID: PMC8176902 DOI: 10.1016/j.apgeog.2021.102473] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/25/2020] [Revised: 04/26/2021] [Accepted: 05/03/2021] [Indexed: 05/10/2023]
Abstract
COVID-19 emerged as a global pandemic, driven by the virus's high transmissibility during its incubation period. In the absence of vaccination, containment is seen as the best strategy to stop the virus's diffusion. However, public awareness has been adversely affected by discourses on social media that have downplayed the severity of the virus and disseminated false information. This article investigates COVID-19-related Twitter activity in May and June 2020 to examine the origin and nature of misinformation and its relationship with the COVID-19 incidence rate at the state and county level. A geodatabase of all geotagged COVID-19-related tweets was compiled. Multiscale Geographically Weighted Regression (MGWR) was employed to examine the association between social media activity and the spatial variability of disease incidence. Findings suggest that MGWR could explain 80% of the variation in COVID-19 incidence rates, indicating a strong spatial relationship between social media activity and the spread of the COVID-19 virus. Discourse analysis was conducted on tweets to index those downplaying the pandemic or disseminating misinformation. Findings indicate that sites of Twitter misinformation, which showed more resistance to pandemic management measures in May and June 2020, later experienced a rise in the number of cases in July.
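The spatial regression step can be illustrated with a basic geographically weighted regression, the single-bandwidth precursor of MGWR (MGWR additionally estimates a separate bandwidth per covariate). This is a minimal sketch on synthetic data; the Gaussian kernel, bandwidth, and covariate are assumptions, not the study's specification.

```python
import numpy as np

def gwr_coefficients(coords, X, y, bandwidth):
    """Fit a weighted least-squares regression at each location, weighting
    observations by a Gaussian kernel of distance to that location."""
    X1 = np.column_stack([np.ones(len(X)), X])  # add intercept column
    betas = []
    for c in coords:
        d = np.linalg.norm(coords - c, axis=1)       # distances to this site
        w = np.exp(-0.5 * (d / bandwidth) ** 2)      # Gaussian kernel weights
        W = np.diag(w)
        beta = np.linalg.solve(X1.T @ W @ X1, X1.T @ W @ y)
        betas.append(beta)
    return np.array(betas)  # one (intercept, slope) row per location

# Synthetic example: incidence depends on a "misinformation" covariate
# whose effect varies smoothly from west to east
rng = np.random.default_rng(0)
coords = rng.uniform(0, 10, size=(200, 2))
misinfo = rng.normal(size=200)
local_slope = 0.5 + 0.2 * coords[:, 0]  # spatially varying true effect
incidence = 1.0 + local_slope * misinfo + rng.normal(scale=0.05, size=200)

betas = gwr_coefficients(coords, misinfo.reshape(-1, 1), incidence, bandwidth=2.0)
```

The estimated local slopes track the west-to-east gradient built into the synthetic data, which is the kind of spatially varying association MGWR is designed to surface.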
Collapse
Affiliation(s)
- Amir Masoud Forati
- Department of Geography, University of Wisconsin-Milwaukee, Milwaukee, WI 5321, USA
| | - Rina Ghose
- Department of Geography, University of Wisconsin-Milwaukee, Milwaukee, WI 5321, USA
| |
Collapse
|
46
|
It doesn't take a village to fall for misinformation: Social media use, discussion heterogeneity preference, worry of the virus, faith in scientists, and COVID-19-related misinformation beliefs. TELEMATICS AND INFORMATICS 2021; 58:101547. [PMID: 36570475 PMCID: PMC9758539 DOI: 10.1016/j.tele.2020.101547] [Citation(s) in RCA: 51] [Impact Index Per Article: 17.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/23/2020] [Revised: 11/30/2020] [Accepted: 11/30/2020] [Indexed: 12/27/2022]
Abstract
With the circulation of misinformation about the COVID-19 pandemic, the World Health Organization has raised concerns about an "infodemic," which exacerbates people's misperceptions and deters preventive measures. Against this backdrop, this study examined the conditional indirect effects of social media use and discussion heterogeneity preference on COVID-19-related misinformation beliefs in the United States, using a national survey. Findings suggested that social media use was positively associated with misinformation beliefs, while discussion heterogeneity preference was negatively associated with misinformation beliefs. Furthermore, worry about COVID-19 was found to be a significant mediator, as both associations became stronger when mediated through worry. In addition, faith in scientists served as a moderator of the indirect effect of discussion heterogeneity preference on misinformation beliefs: among those with stronger faith in scientists, this indirect effect became more negative. The findings reveal communication and psychological factors associated with COVID-19-related misinformation beliefs and provide insights into coping strategies during the pandemic.
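The mediation logic, an indirect effect estimated as the product of the X-to-M and M-to-Y paths, can be sketched as follows. This is a generic product-of-coefficients illustration on synthetic data, not the study's actual model, which used survey measures and conditional (moderated) indirect effects.

```python
import numpy as np

def indirect_effect(x, m, y):
    """Product-of-coefficients estimate of the indirect effect x -> m -> y:
    a = effect of x on m; b = effect of m on y, controlling for x."""
    Xa = np.column_stack([np.ones(len(x)), x])
    a = np.linalg.lstsq(Xa, m, rcond=None)[0][1]
    Xb = np.column_stack([np.ones(len(x)), x, m])
    b = np.linalg.lstsq(Xb, y, rcond=None)[0][2]
    return a * b

# Synthetic data with a known mediation structure (all names illustrative):
# social media use -> worry -> misinformation beliefs
rng = np.random.default_rng(1)
use = rng.normal(size=5000)
worry = 0.5 * use + rng.normal(size=5000)                   # a = 0.5
beliefs = 0.4 * worry + 0.1 * use + rng.normal(size=5000)   # b = 0.4

est = indirect_effect(use, worry, beliefs)  # expected near 0.5 * 0.4 = 0.2
```

In practice the indirect effect would be tested with bootstrapped confidence intervals, and the moderation by faith in scientists would enter as an interaction term on the a-path or b-path.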
Collapse
|
47
|
Murayama T, Wakamiya S, Aramaki E, Kobayashi R. Modeling the spread of fake news on Twitter. PLoS One 2021; 16:e0250419. [PMID: 33886665 PMCID: PMC8062041 DOI: 10.1371/journal.pone.0250419] [Citation(s) in RCA: 11] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/25/2020] [Accepted: 04/06/2021] [Indexed: 11/18/2022] Open
Abstract
Fake news can have a significant negative impact on society, given the growing use of mobile devices and the worldwide increase in Internet access. It is therefore essential to develop a simple mathematical model to understand the online dissemination of fake news. In this study, we propose a point process model of the spread of fake news on Twitter. The proposed model describes the spread of a fake news item as a two-stage process: initially, the fake news spreads as a piece of ordinary news; then, once most users begin to recognize the falsity of the news item, news of that falsity spreads as a second story. We validate this model using two datasets of fake news items spread on Twitter. We show that the proposed model is superior to current state-of-the-art methods in accurately predicting the evolution of the spread of a fake news item. Moreover, a text analysis suggests that our model appropriately infers the correction time, i.e., the moment when Twitter users start realizing the falsity of the news item. The proposed model contributes to understanding the dynamics of the spread of fake news on social media. Its ability to extract a compact representation of the spreading pattern could be useful in the detection and mitigation of fake news.
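The two-stage idea can be caricatured as an inhomogeneous Poisson process whose intensity is the sum of an initial cascade and a correction cascade that switches on at the correction time. This is a simplified stand-in for the authors' point process model; the exponential kernels, parameter values, and thinning-based simulation are assumptions for illustration.

```python
import numpy as np

def intensity(t, a1=50.0, tau1=5.0, a2=30.0, tau2=3.0, t_corr=10.0):
    """Two-stage tweet intensity: an initial cascade of the fake item plus a
    second cascade (spread of the correction) starting at time t_corr."""
    first = a1 * np.exp(-t / tau1)
    second = np.where(t >= t_corr, a2 * np.exp(-(t - t_corr) / tau2), 0.0)
    return first + second

def simulate(t_max, rng):
    """Draw event (tweet) times from the inhomogeneous Poisson process
    by thinning against the peak intensity."""
    lam_max = float(intensity(np.array([0.0, 10.0])).max())  # upper bound
    events, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / lam_max)  # candidate event time
        if t >= t_max:
            break
        # accept with probability lambda(t) / lam_max
        if rng.random() < float(intensity(np.array([t]))[0]) / lam_max:
            events.append(t)
    return np.array(events)

rng = np.random.default_rng(42)
times = simulate(40.0, rng)
```

Fitting such a model to real data would go the other way: maximize the point process likelihood over the kernel parameters and the correction time, which is how the model can infer when users started recognizing the falsity.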
Collapse
Affiliation(s)
- Taichi Murayama
- Nara Institute of Science and Technology (NAIST), Ikoma, Japan
| | - Shoko Wakamiya
- Nara Institute of Science and Technology (NAIST), Ikoma, Japan
| | - Eiji Aramaki
- Nara Institute of Science and Technology (NAIST), Ikoma, Japan
| | - Ryota Kobayashi
- The University of Tokyo, Tokyo, Japan
- JST PRESTO, Kawaguchi, Japan
| |
Collapse
|
48
|
De Coninck D, Frissen T, Matthijs K, d’Haenens L, Lits G, Champagne-Poirier O, Carignan ME, David MD, Pignard-Cheynel N, Salerno S, Généreux M. Beliefs in Conspiracy Theories and Misinformation About COVID-19: Comparative Perspectives on the Role of Anxiety, Depression and Exposure to and Trust in Information Sources. Front Psychol 2021; 12:646394. [PMID: 33935904 PMCID: PMC8085263 DOI: 10.3389/fpsyg.2021.646394] [Citation(s) in RCA: 113] [Impact Index Per Article: 37.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/26/2020] [Accepted: 03/26/2021] [Indexed: 11/13/2022] Open
Abstract
While COVID-19 has spread aggressively and rapidly across the globe, many societies have also witnessed the spread of other viral phenomena: misinformation, conspiracy theories, and general mass suspicion about what is really going on. This study investigates how exposure to and trust in information sources, as well as anxiety and depression, are associated with conspiracy and misinformation beliefs in eight countries/regions (Belgium, Canada, England, the Philippines, Hong Kong, New Zealand, the United States, and Switzerland) during the COVID-19 pandemic. Data were collected in an online survey fielded from May 29, 2020 to June 12, 2020, resulting in a multinational representative sample of 8,806 adult respondents. Results indicate that greater exposure to traditional media (television, radio, newspapers) is associated with lower conspiracy and misinformation beliefs, while exposure to politicians, digital media, and personal contacts is associated with greater conspiracy and misinformation beliefs. Exposure to health experts is associated with lower conspiracy beliefs only. Higher feelings of depression are also associated with greater conspiracy and misinformation beliefs. We also found relevant group and country differences. We discuss the implications of these results.
Collapse
Affiliation(s)
| | - Thomas Frissen
- Department of Technology and Society Studies, Faculty of Arts and Social Sciences, Maastricht University, Maastricht, Netherlands
| | - Koen Matthijs
- Centre for Sociological Research, KU Leuven, Leuven, Belgium
| | | | - Grégoire Lits
- Institut Langage et Communication, Université catholique de Louvain, Louvain-la-Neuve, Belgium
| | - Olivier Champagne-Poirier
- Département de Communication, Faculté des Lettres et Sciences Humaines, Université de Sherbrooke, Sherbrooke, QC, Canada
| | - Marie-Eve Carignan
- Département de Communication, Faculté des Lettres et Sciences Humaines, Université de Sherbrooke, Sherbrooke, QC, Canada
| | - Marc D. David
- Département de Communication, Faculté des Lettres et Sciences Humaines, Université de Sherbrooke, Sherbrooke, QC, Canada
| | | | | | - Melissa Généreux
- Department of Community Health Sciences, Faculty of Medicine and Health Sciences, Université de Sherbrooke, Sherbrooke, QC, Canada
| |
Collapse
|
49
|
Asatani K, Yamano H, Sakaki T, Sakata I. Dense and influential core promotion of daily viral information spread in political echo chambers. Sci Rep 2021; 11:7491. [PMID: 33820918 PMCID: PMC8021571 DOI: 10.1038/s41598-021-86750-w] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/09/2020] [Accepted: 03/05/2021] [Indexed: 02/01/2023] Open
Abstract
Despite the intensive study of the viral spread of fake news in political echo chambers (ECs) on social networking services (SNSs), little is known regarding the underlying structure of the daily information spread in these ECs. Moreover, the effect of SNSs on opinion polarisation is still unclear in terms of pluralistic information access versus selective exposure to opinions on an SNS. In this study, we confirmed the steady, highly independent nature of left- and right-leaning ECs, each composed of approximately 250,000 users, from a year-long reply/retweet network of 42 million Japanese Twitter users. We found that both communities have similarly efficient information spreading networks with densely connected, core-periphery structures. Core nodes resonate in the early stages of information cascades and unilaterally transmit information to peripheral nodes. Each EC has resonant core users who amplify and steadily spread information to a quarter of a million users. In addition, we confirmed the existence of extremely aggressive EC users who co-reply/retweet each other. The connection between these users and top influencers suggests that the extreme opinions of the former group affect the entire community through the top influencers.
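The densely connected core structure described above can be illustrated with a plain k-core decomposition, which repeatedly peels away low-degree nodes until only a dense core remains. This is a minimal sketch on a toy retweet network; the node names and the choice of k are invented and unrelated to the study's data.

```python
from collections import defaultdict

def k_core(edges, k):
    """Return the node set of the k-core: iteratively remove nodes with
    degree < k until every remaining node has at least k neighbours."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    changed = True
    while changed:
        changed = False
        for node in list(adj):
            if len(adj[node]) < k:
                # remove the node and detach it from its neighbours
                for nbr in adj.pop(node):
                    adj[nbr].discard(node)
                changed = True
    return set(adj)

# Toy retweet network: a dense 4-clique "core" (a-d) feeding
# peripheral accounts (p1-p3) that only receive information
core = [("a", "b"), ("a", "c"), ("a", "d"), ("b", "c"), ("b", "d"), ("c", "d")]
periphery = [("a", "p1"), ("b", "p2"), ("c", "p3")]
dense_core = k_core(core + periphery, k=3)  # the periphery is peeled away
```

On a real reply/retweet graph, comparing high-k cores with the timing of cascades is one way to check whether core nodes transmit information to the periphery rather than the reverse.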
Collapse
Affiliation(s)
- Kimitaka Asatani
- Graduate School of Engineering, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo, 113-0033, Japan
| | - Hiroko Yamano
- Institute for Future Initiatives, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo, 113-0033, Japan
| | - Takeshi Sakaki
- Graduate School of Engineering, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo, 113-0033, Japan
| | - Ichiro Sakata
- Graduate School of Engineering, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo, 113-0033, Japan
| |
Collapse
|
50
|
Alshaabi T, Dewhurst DR, Minot JR, Arnold MV, Adams JL, Danforth CM, Dodds PS. The growing amplification of social media: measuring temporal and social contagion dynamics for over 150 languages on Twitter for 2009-2020. EPJ DATA SCIENCE 2021; 10:15. [PMID: 33816048 PMCID: PMC8010293 DOI: 10.1140/epjds/s13688-021-00271-0] [Citation(s) in RCA: 10] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 04/10/2020] [Accepted: 03/17/2021] [Indexed: 06/12/2023]
Abstract
Working from a dataset of 118 billion messages running from the start of 2009 to the end of 2019, we identify and explore the relative daily use of over 150 languages on Twitter. We find that eight languages comprise 80% of all tweets, with English, Japanese, Spanish, Arabic, and Portuguese being the most dominant. To quantify social spreading in each language over time, we compute the 'contagion ratio': the balance of retweets to organic messages. We find that for the most common languages on Twitter there is a growing, though not universal, tendency to retweet rather than share new content. By the end of 2019, the contagion ratios for half of the top 30 languages, including English and Spanish, had risen above 1, the naive contagion threshold. In 2019, the top five languages with the highest average daily ratios were, in order, Thai (7.3), Hindi, Tamil, Urdu, and Catalan, while the bottom five were Russian, Swedish, Esperanto, Cebuano, and Finnish (0.26). Further, we show that over time, the contagion ratios for the most common languages are growing more strongly than those of rare languages.
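The contagion ratio itself is straightforward to compute from a labelled message stream. A minimal sketch; the (language, is_retweet) tuple representation is an assumption about how such a stream might be encoded, not the paper's pipeline.

```python
from collections import Counter

def contagion_ratios(messages):
    """Per-language ratio of retweets to organic messages.
    A ratio above 1 means retweets outnumber original content."""
    retweets, organic = Counter(), Counter()
    for lang, is_retweet in messages:
        (retweets if is_retweet else organic)[lang] += 1
    # only languages with at least one organic message have a defined ratio
    return {lang: retweets[lang] / organic[lang] for lang in organic}

# Toy stream of (language, is_retweet) pairs
stream = [("en", True)] * 6 + [("en", False)] * 4 + \
         [("fi", True)] * 2 + [("fi", False)] * 8
ratios = contagion_ratios(stream)  # en above the threshold, fi well below
```

Computed per day and per language over the full archive, this single number yields the time series whose growth the paper tracks.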
Collapse
Affiliation(s)
- Thayer Alshaabi
- Vermont Complex Systems Center, University of Vermont, Burlington, VT 05405 USA
- Computational Story Lab, University of Vermont, Burlington, VT 05405 USA
- Department of Computer Science, University of Vermont, Burlington, VT 05405 USA
| | - David Rushing Dewhurst
- Vermont Complex Systems Center, University of Vermont, Burlington, VT 05405 USA
- Computational Story Lab, University of Vermont, Burlington, VT 05405 USA
- Charles River Analytics, Cambridge, MA 02138 USA
| | - Joshua R. Minot
- Vermont Complex Systems Center, University of Vermont, Burlington, VT 05405 USA
- Computational Story Lab, University of Vermont, Burlington, VT 05405 USA
| | - Michael V. Arnold
- Vermont Complex Systems Center, University of Vermont, Burlington, VT 05405 USA
- Computational Story Lab, University of Vermont, Burlington, VT 05405 USA
| | - Jane L. Adams
- Vermont Complex Systems Center, University of Vermont, Burlington, VT 05405 USA
- Computational Story Lab, University of Vermont, Burlington, VT 05405 USA
| | - Christopher M. Danforth
- Vermont Complex Systems Center, University of Vermont, Burlington, VT 05405 USA
- Computational Story Lab, University of Vermont, Burlington, VT 05405 USA
- Department of Mathematics & Statistics, University of Vermont, Burlington, VT 05405 USA
| | - Peter Sheridan Dodds
- Vermont Complex Systems Center, University of Vermont, Burlington, VT 05405 USA
- Computational Story Lab, University of Vermont, Burlington, VT 05405 USA
- Department of Computer Science, University of Vermont, Burlington, VT 05405 USA
| |
Collapse
|