1. Michal AL, Shah P. A Practical Significance Bias in Laypeople's Evaluation of Scientific Findings. Psychol Sci 2024; 35:315-327. PMID: 38437295. DOI: 10.1177/09567976241231506.
Abstract
People often rely on scientific findings to help them make decisions; however, failing to report effect magnitudes may bias readers toward assuming findings are practically significant. Across two online studies (Prolific; N = 800), we measured U.S. adults' endorsements of expensive interventions described in media reports that led to effects that were small, large, or of unreported magnitude between groups. Participants who viewed interventions with unreported effect magnitudes were more likely to endorse interventions than those who viewed interventions with small effects, and were just as likely to endorse interventions as those who viewed interventions with large effects, suggesting a practical significance bias. When effect magnitudes were reported, participants on average adjusted their evaluations accordingly. However, some individuals, such as those with low numeracy skills, were more likely than others to act on small effects, even when explicitly prompted to first consider the meaningfulness of the effect.
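To make the contrast between small and large effect magnitudes concrete, here is a minimal Python sketch that computes a standardized effect size (Cohen's d) for a hypothetical two-group intervention; the means, standard deviations, and sample sizes are invented for illustration and are not taken from the study above.

```python
import math

def cohens_d(mean_treat, mean_ctrl, sd_treat, sd_ctrl, n_treat, n_ctrl):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_var = ((n_treat - 1) * sd_treat**2 + (n_ctrl - 1) * sd_ctrl**2) \
                 / (n_treat + n_ctrl - 2)
    return (mean_treat - mean_ctrl) / math.sqrt(pooled_var)

# Hypothetical test scores for an expensive intervention (illustrative only).
small = cohens_d(101.0, 100.0, 15.0, 15.0, 200, 200)
large = cohens_d(112.0, 100.0, 15.0, 15.0, 200, 200)
print(f"small effect: d = {small:.2f}")  # about 0.07: detectable, but practically negligible
print(f"large effect: d = {large:.2f}")  # about 0.80: conventionally a large effect
```

Whether an effect of d = 0.07 justifies an expensive intervention is precisely the practical-significance judgment the study asked readers to make.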
Affiliation(s)
- Priti Shah: Department of Psychology, University of Michigan
2. Mantri P, Subramonyam H, Michal AL, Xiong C. How Do Viewers Synthesize Conflicting Information from Data Visualizations? IEEE Trans Vis Comput Graph 2023; 29:1005-1015. PMID: 36166526. DOI: 10.1109/tvcg.2022.3209467.
Abstract
Scientific knowledge develops through cumulative discoveries that build on, contradict, contextualize, or correct prior findings. Scientists and journalists often communicate these incremental findings to lay people through visualizations and text (e.g., the positive and negative effects of caffeine intake). Consequently, readers need to integrate diverse and contrasting evidence from multiple sources to form opinions or make decisions. However, the underlying mechanism for synthesizing information from multiple visualizations remains under-explored. To address this knowledge gap, we conducted a series of four experiments (N = 1,166) in which participants synthesized empirical evidence from a pair of line charts presented sequentially. In Experiment 1, we administered a baseline condition with charts depicting no specific context where participants held no strong belief. To test for generalizability, we introduced real-world scenarios to our visualizations in Experiment 2 and added accompanying text descriptions similar to online news articles or blog posts in Experiment 3. In all three experiments, we varied the relative direction and magnitude of line slopes within the chart pairs. We found that participants tended to weigh the positive slope more when the two charts depicted relationships in opposite directions (e.g., one positive slope and one negative slope). Participants tended to weigh the less steep slope more when the two charts depicted relationships in the same direction (e.g., both positive). Through these experiments, we characterize participants' synthesis behaviors depending on the relationship between the information they viewed, contribute to theories describing underlying cognitive mechanisms in information synthesis, and describe design implications for data storytelling.
3. Hodson G. Reconsidering reconsent: Threats to internal and external validity when participants reconsent after debriefing. Br J Psychol 2022; 113:853-871. PMID: 35274307. DOI: 10.1111/bjop.12561.
Abstract
We overwhelmingly utilize (partially) informed consent for, and debriefing of, human research participants. Also common is the practice of reconsent, particularly where changes in study protocols (or in participants themselves) occur midstream: participants consent again to remaining in the project or to having their data included. Worryingly under-discussed is post-debriefing reconsent, wherein participants can withdraw their data after learning more fully of the study's goals and methods. Yet major ethics bodies in Canada, the United States, and the United Kingdom promote this practice, with vague and potentially problematic guidelines. Here, the author provides examples involving such reconsent practice, highlighting potentially serious problems that are scientific (e.g., threats to internal and external validity) and ethical (i.e., to the participant, their peers, the researcher, and society) in nature. Particularly problematic is the introduction, by design, of unknowable bias in our research findings. For example, highly prejudiced participants could withdraw data from a discrimination study after learning of the study's hypotheses and goals. The practice may arguably contradict an Open Science goal of increasing research transparency. This piece aims to engage the broader research community in a discussion about the direction of psychological science methods.
Affiliation(s)
- Gordon Hodson: Department of Psychology, Brock University, St. Catharines, Ontario, Canada
4.
Abstract
Data reasoning is an essential component of scientific reasoning, as a component of evidence evaluation. In this paper, we outline a model of scientific data reasoning that describes how data sensemaking underlies data reasoning. Data sensemaking, a relatively automatic process rooted in perceptual mechanisms that summarize large quantities of information in the environment, begins early in development, and is refined with experience, knowledge, and improved strategy use. Summarizing data highlights set properties such as central tendency and variability, and these properties are used to draw inferences from data. However, both data sensemaking and data reasoning are subject to cognitive biases or heuristics that can lead to flawed conclusions. The tools of scientific reasoning, including external representations, scientific hypothesis testing, and drawing probabilistic conclusions, can help reduce the likelihood of such flaws and help improve data reasoning. Although data sensemaking and data reasoning are not supplanted by scientific data reasoning, scientific reasoning skills can be leveraged to improve learning about science and reasoning with data.
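As a minimal illustration of the set properties the model emphasizes, the following Python sketch (with invented numbers, not data from any cited study) summarizes two small samples by their central tendency and variability and then uses those summaries for a crude comparison.

```python
import statistics

# Hypothetical reaction times (ms) for two invented conditions.
condition_a = [512, 498, 530, 505, 521, 543, 509]
condition_b = [488, 472, 495, 480, 491, 468, 502]

for name, data in [("A", condition_a), ("B", condition_b)]:
    print(f"condition {name}: mean = {statistics.mean(data):.1f} ms, "
          f"sd = {statistics.stdev(data):.1f} ms")

# A simple inference step: compare the mean difference to the spread of the data.
diff = statistics.mean(condition_a) - statistics.mean(condition_b)
spread = statistics.stdev(condition_a + condition_b)
print(f"mean difference = {diff:.1f} ms, i.e. {diff / spread:.1f} overall SDs")
```

Summaries like these are the raw material the model treats as the basis for drawing inferences from data, and also the point at which heuristics and biases can enter.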
5. Seifert CM, Harrington M, Michal AL, Shah P. Causal theory error in college students' understanding of science studies. Cogn Res Princ Implic 2022; 7:4. PMID: 35022946. PMCID: PMC8755867. DOI: 10.1186/s41235-021-00347-5.
Abstract
When reasoning about science studies, people often make causal theory errors by inferring or accepting a causal claim based on correlational evidence. While humans naturally think in terms of causal relationships, reasoning about science findings requires understanding how evidence supports, or fails to support, a causal claim. This study investigated college students' thinking about causal claims presented in brief media reports describing behavioral science findings. How do science students reason about causal claims from correlational evidence? And can their reasoning be improved through instruction clarifying the nature of causal theory error? We examined these questions through a series of written reasoning exercises given to advanced college students over three weeks within a psychology methods course. In a pretest session, students critiqued study quality and support for a causal claim from a brief media report suggesting an association between two variables. Then, they created diagrams depicting possible alternative causal theories. At the beginning of the second session, an instructional intervention introduced students to an extended example of a causal theory error through guided questions about possible alternative causes. Then, they completed the same two tasks with new science reports immediately and again 1 week later. The results show students' reasoning included fewer causal theory errors after the intervention, and this improvement was maintained a week later. Our findings suggest that interventions aimed at addressing reasoning about causal claims in correlational studies are needed even for advanced science students, and that training on considering alternative causal theories may be successful in reducing causal theory error.
Affiliation(s)
- Colleen M Seifert: Department of Psychology, University of Michigan, 530 Church St, Ann Arbor, MI 48109, USA
- Michael Harrington: Department of Psychology, University of Michigan, 530 Church St, Ann Arbor, MI 48109, USA
- Audrey L Michal: Department of Psychology, University of Michigan, 530 Church St, Ann Arbor, MI 48109, USA
- Priti Shah: Department of Psychology, University of Michigan, 530 Church St, Ann Arbor, MI 48109, USA
6. Reasoning on Controversial Science Issues in Science Education and Science Communication. Educ Sci 2021. DOI: 10.3390/educsci11090522.
Abstract
The ability to make evidence-based decisions, and hence to reason on questions concerning scientific and societal aspects, is a crucial goal in science education and science communication. However, science denial poses a constant challenge for society and education. Controversial science issues (CSI) encompass scientific knowledge rejected by the public as well as socioscientific issues, i.e., societal issues grounded in science that are frequently applied to science education. Generating evidence-based justifications for claims is central in scientific and informal reasoning. This study aims to describe attitudes and their justifications within the argumentations of a random online sample (N = 398) when reasoning informally on selected CSI. Following a deductive-inductive approach and qualitative content analysis of written open-ended answers, we identified five types of justifications based on a fine-grained category system. The results suggest a topic-specificity of justifications referring to specific scientific data, while justifications appealing to authorities tend to be common across topics. Subjective, and therefore normative, justifications were slightly related to conspiracy ideation and a general rejection of the scientific consensus. The category system could be applied to other CSI topics to help clarify the relation between scientific and informal reasoning in science education and communication.
7. Hendriks F, Mayweg-Paus E, Felton M, Iordanou K, Jucks R, Zimmermann M. Constraints and Affordances of Online Engagement With Scientific Information - A Literature Review. Front Psychol 2020; 11:572744. PMID: 33362638. PMCID: PMC7759725. DOI: 10.3389/fpsyg.2020.572744.
Abstract
Many urgent problems that societies currently face, from climate change to a global pandemic, require citizens to engage with scientific information as members of democratic societies as well as to solve problems in their personal lives. Most often, to pursue their epistemic aims (aims directed at achieving knowledge and understanding) regarding such socio-scientific issues, individuals search for information online, where there exists a multitude of possibly relevant and highly interconnected sources offering different perspectives and sometimes providing conflicting information. This paper reviews the literature to identify (a) constraints and affordances that scientific knowledge and the online information environment entail and (b) individuals' cognitive and motivational processes that have been found to hinder, or conversely, support practices of engagement (such as critical information evaluation or two-sided dialogue). In doing so, it introduces a conceptual framework for understanding and fostering what we call online engagement with scientific information, conceived as consisting of individual engagement (engaging on one's own in the search, selection, evaluation, and integration of information) and dialogic engagement (engaging in discourse with others to interpret, articulate, and critically examine scientific information). In turn, the paper identifies individual and contextual conditions for individuals' goal-directed and effortful online engagement with scientific information.
Affiliation(s)
- Friederike Hendriks: Institute for Psychology in Education and Instruction, Department of Psychology and Sport Studies, University of Münster, Münster, Germany
- Elisabeth Mayweg-Paus: Institute of Educational Studies, Faculty of Humanities and Social Sciences, Humboldt University of Berlin, Einstein Center Digital Future, Berlin, Germany
- Mark Felton: Department of Teacher Education, Lurie College of Education, San Jose State University, San Jose, CA, United States
- Kalypso Iordanou: School of Sciences, University of Central Lancashire, Larnaka, Cyprus
- Regina Jucks: Institute for Psychology in Education and Instruction, Department of Psychology and Sport Studies, University of Münster, Münster, Germany
- Maria Zimmermann: Institute of Educational Studies, Faculty of Humanities and Social Sciences, Humboldt University of Berlin, Einstein Center Digital Future, Berlin, Germany
8. Čavojová V, Šrol J, Ballová Mikušková E. How scientific reasoning correlates with health-related beliefs and behaviors during the COVID-19 pandemic? J Health Psychol 2020; 27:534-547. PMID: 33016131. DOI: 10.1177/1359105320962266.
Abstract
We examined whether scientific reasoning is associated with health-related beliefs and behaviors over and above general analytic thinking ability in the general public (N = 783, aged 18-84). Health-related beliefs included: anti-vaccination attitudes, COVID-19 conspiracy beliefs, and generic health-related epistemically suspect beliefs. Scientific reasoning correlated with generic pseudoscientific and health-related conspiracy beliefs and COVID-19 conspiracy beliefs. Crucially, scientific reasoning was a stronger independent predictor of unfounded beliefs (including anti-vaccination attitudes) than general analytic thinking was; however, it had a more modest role in health-related behaviors.
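The phrase "over and above" refers to a predictor's unique contribution when both measures are entered together in a regression model. The sketch below illustrates that logic on synthetic data with statsmodels; the variable names, coefficients, and sample are invented and do not reproduce the study's dataset or analysis.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500

# Synthetic standardized scores (illustrative only).
analytic_thinking = rng.normal(size=n)
scientific_reasoning = 0.5 * analytic_thinking + rng.normal(scale=0.87, size=n)
# Unfounded beliefs constructed to depend more strongly on (low) scientific reasoning.
unfounded_beliefs = (-0.45 * scientific_reasoning
                     - 0.10 * analytic_thinking
                     + rng.normal(size=n))

X = sm.add_constant(np.column_stack([analytic_thinking, scientific_reasoning]))
fit = sm.OLS(unfounded_beliefs, X).fit()
print(fit.summary(xname=["const", "analytic_thinking", "scientific_reasoning"]))
# Each coefficient is that predictor's independent contribution with the other held
# constant, which is the sense of "over and above" used in the abstract.
```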
Affiliation(s)
- Vladimíra Čavojová: Institute of Experimental Psychology, Centre of Social and Psychological Sciences, Slovak Academy of Sciences, Bratislava, Slovakia
- Jakub Šrol: Institute of Experimental Psychology, Centre of Social and Psychological Sciences, Slovak Academy of Sciences, Bratislava, Slovakia
- Eva Ballová Mikušková: Institute of Experimental Psychology, Centre of Social and Psychological Sciences, Slovak Academy of Sciences, Bratislava, Slovakia
9. Navarrete JA, Sandoval-Díaz JS. Does cognitive reflection mediate the math gender gap at university admission in Chile? Soc Psychol Educ 2020. DOI: 10.1007/s11218-020-09545-3.
10. Xiong C, Shapiro J, Hullman J, Franconeri S. Illusion of Causality in Visualized Data. IEEE Trans Vis Comput Graph 2020; 26:853-862. PMID: 31425111. DOI: 10.1109/tvcg.2019.2934399.
Abstract
Students who eat breakfast more frequently tend to have a higher grade point average. From this data, many people might confidently state that a before-school breakfast program would lead to higher grades. This is a reasoning error, because correlation does not necessarily indicate causation: X and Y can be correlated without one directly causing the other. While this error is pervasive, its prevalence might be amplified or mitigated by the way that the data is presented to a viewer. Across three crowdsourced experiments, we examined whether the way simple data relations are presented would mitigate this reasoning error. The first experiment tested examples similar to the breakfast-GPA relation, varying in the plausibility of the causal link. We asked participants to rate their level of agreement that the relation was correlated, which they rated appropriately as high. However, participants also expressed high agreement with a causal interpretation of the data. Levels of support for the causal interpretation were not equally strong across visualization types: causality ratings were highest for text descriptions and bar graphs, but weaker for scatter plots. But is this effect driven by bar graphs aggregating data into two groups or by the visual encoding type? We isolated data aggregation versus visual encoding type and examined their individual effects on perceived causality. Overall, different visualization designs provide different reasoning affordances for the same data. Higher levels of data aggregation by graphs tend to be associated with higher perceived causality in data. Participants perceived line and dot visual encodings as more causal than bar encodings. Our results demonstrate how some visualization designs trigger stronger causal links, while choosing others can help mitigate unwarranted perceptions of causality.
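A small simulation makes the underlying statistical point concrete: when a third variable drives both measures, they correlate even though neither causes the other. The scenario and parameter values below are invented for illustration and are not the stimuli used in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 2000

# Hypothetical confounder, e.g. household routine/structure.
routine = rng.normal(size=n)

# Breakfast frequency and GPA both depend on the confounder,
# but neither directly causes the other in this simulation.
breakfast_freq = 0.6 * routine + rng.normal(scale=0.8, size=n)
gpa = 0.6 * routine + rng.normal(scale=0.8, size=n)

print(f"corr(breakfast, GPA) = {np.corrcoef(breakfast_freq, gpa)[0, 1]:.2f}")  # clearly positive

# Removing the confounder's contribution eliminates most of the association.
resid_breakfast = breakfast_freq - 0.6 * routine
resid_gpa = gpa - 0.6 * routine
print(f"corr after removing the confounder = {np.corrcoef(resid_breakfast, resid_gpa)[0, 1]:.2f}")
```

A before-school breakfast program would act only on breakfast frequency, so in this simulated world it would leave GPA unchanged despite the observed correlation.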
11. Improving the Welfare of Companion Dogs - Is Owner Education the Solution? Animals (Basel) 2019; 9:662. PMID: 31500203. PMCID: PMC6770859. DOI: 10.3390/ani9090662.
Simple Summary
The welfare of most dogs living in homes is largely unknown. However, national surveys carried out by animal welfare charities and findings by animal welfare researchers have shown significant deterioration in some key aspects of dog welfare: for example, more dogs presenting to vets with behavioural problems, obesity, and ill-health due to poor breeding practices. This means that some dogs are suffering because of their owners' behaviours or ownership practices. Educating dog owners in how best to look after their dogs is, and has long been, seen by many as key to improving the welfare of dogs living in homes. However, the concept of education, the context in which it occurs, and the lack of systematic evaluation of the effectiveness of education interventions mean that nobody really knows whether this approach works. This paper explores these concepts and draws together a wide range of sources of information to highlight some of the complexities of improving dog welfare by educating owners.
Abstract
Vets, animal welfare charities, and researchers have frequently cited educating owners as a necessity for improving the welfare of companion dogs. The assumption that improving an owner's knowledge through an education intervention subsequently results in improvements in the welfare of the dog appears reasonable. However, dog welfare and dog ownership are complex, and the context in which these relationships occur is rapidly changing. Psychology has demonstrated that humans are complex, with values, attitudes, and beliefs influencing our behaviours as much as knowledge and understanding. Equally, the context in which we and our dogs live is rapidly changing and responding to evolving societal and cultural norms. Therefore, we seek to understand education's effectiveness as an approach to improving welfare by exploring and understanding these complexities, in conjunction with relevant research from the disciplines of science education and communication. We argue that well designed and rigorously evaluated education interventions can play a part in the challenge of improving welfare, but that these may have limited scope, and welfare scientists could further consider extending cross-disciplinary, cross-boundary working and research in order to improve the welfare of companion dogs.
12. Nenciovici L, Allaire-Duquette G, Masson S. Brain activations associated with scientific reasoning: a literature review. Cogn Process 2018; 20:139-161. DOI: 10.1007/s10339-018-0896-z.
13. Beaujean AA, Benson NF, McGill RJ, Dombrowski SC. A Misuse of IQ Scores: Using the Dual Discrepancy/Consistency Model for Identifying Specific Learning Disabilities. J Intell 2018; 6:E36. PMID: 31162463. PMCID: PMC6480769. DOI: 10.3390/jintelligence6030036.
Abstract
The purpose of this article is to describe the origins of patterns of strengths and weaknesses (PSW) methods for identifying specific learning disabilities (SLD) and to provide a comprehensive review of the assumptions and evidence supporting the most commonly used PSW method in the United States: Dual Discrepancy/Consistency (DD/C). Given their use in determining whether students have access to special education and related services, it is important that any method used to identify SLD have supporting evidence. A review of the DD/C evidence indicates it cannot currently be classified as an evidence-based method for identifying individuals with SLD. We show that the DD/C method is unsound for three major reasons: (a) it requires test scores to have properties that they fundamentally lack, (b) there is no experimental evidence of utility to support its use, and (c) the available evidence indicates that the method cannot identify SLD accurately.
Affiliation(s)
- Nicholas F Benson: Educational Psychology Department, Baylor University, Waco, TX 76798, USA
- Ryan J McGill: School of Education, College of William & Mary, Williamsburg, VA 23187, USA
- Stefan C Dombrowski: Department of Graduate Education, Leadership and Counseling, Rider University, Lawrenceville, NJ 08648, USA
14. Hullman J, Kay M, Kim YS, Shrestha S. Imagining Replications: Graphical Prediction & Discrete Visualizations Improve Recall & Estimation of Effect Uncertainty. IEEE Trans Vis Comput Graph 2018; 24:446-456. PMID: 28866501. DOI: 10.1109/tvcg.2017.2743898.
Abstract
People often have erroneous intuitions about the results of uncertain processes, such as scientific experiments. Many uncertainty visualizations assume considerable statistical knowledge, but have been shown to prompt erroneous conclusions even when users possess this knowledge. Active learning approaches have been shown to improve statistical reasoning, but are rarely applied to visualizing uncertainty in scientific reports. We present a controlled study to evaluate the impact of an interactive, graphical uncertainty prediction technique for communicating uncertainty in experiment results. Using our technique, users sketch their prediction of the uncertainty in experimental effects prior to viewing the true sampling distribution from an experiment. We find that having a user graphically predict the possible effects from experiment replications is an effective way to improve one's ability to make predictions about replications of new experiments. Additionally, visualizing uncertainty as a set of discrete outcomes, as opposed to a continuous probability distribution, can improve recall of a sampling distribution from a single experiment. Our work has implications for various applications where it is important to elicit people's estimates of probability distributions and to communicate uncertainty effectively.
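To make "sampling distribution" and "a set of discrete outcomes" concrete, the sketch below simulates many replications of a hypothetical two-group experiment and summarizes the resulting distribution of mean differences both continuously and as a small set of roughly equally likely quantiles (the idea behind discrete-outcome displays such as quantile dotplots). All parameter values are invented for illustration and are unrelated to the experiments in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)
n_per_group, n_replications = 50, 10_000
true_effect, noise_sd = 2.0, 10.0  # hypothetical true mean difference and within-group SD

# Observed mean difference in each simulated replication.
treat = rng.normal(true_effect, noise_sd, size=(n_replications, n_per_group))
ctrl = rng.normal(0.0, noise_sd, size=(n_replications, n_per_group))
effects = treat.mean(axis=1) - ctrl.mean(axis=1)

# Continuous summary of the sampling distribution of the effect.
print(f"mean effect = {effects.mean():.2f}, SD across replications = {effects.std():.2f}")

# Discrete summary: 20 representative, roughly equally likely outcomes.
quantiles = np.quantile(effects, np.linspace(0.025, 0.975, 20))
print("20 representative replication outcomes:", np.round(quantiles, 1))
print(f"share of replications with effect <= 0: {(effects <= 0).mean():.1%}")
```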