1
Tiokhin L, Panchanathan K, Smaldino PE, Lakens D. Shifting the Level of Selection in Science. Perspect Psychol Sci 2024;19:908-920. [PMID: 37526118] [PMCID: PMC11539478] [DOI: 10.1177/17456916231182568]
Abstract
Criteria for recognizing and rewarding scientists primarily focus on individual contributions. This creates a conflict between what is best for scientists' careers and what is best for science. In this article, we show how the theory of multilevel selection provides conceptual tools for modifying incentives to better align individual and collective interests. A core principle is the need to account for indirect effects by shifting the level at which selection operates from individuals to the groups in which individuals are embedded. This principle is used in several fields to improve collective outcomes, including animal husbandry, team sports, and professional organizations. Shifting the level of selection has the potential to ameliorate several problems in contemporary science, including accounting for scientists' diverse contributions to knowledge generation, reducing individual-level competition, and promoting specialization and team science. We discuss the difficulties associated with shifting the level of selection and outline directions for future development in this domain.
Affiliation(s)
- Leo Tiokhin
- Human Technology Interaction Group, Eindhoven University of Technology, The Netherlands
- Data & Analytics Group, IG&H, The Netherlands
- Paul E. Smaldino
- Department of Cognitive & Information Sciences, University of California, Merced, USA
- Santa Fe Institute, New Mexico, USA
- Daniël Lakens
- Human Technology Interaction Group, Eindhoven University of Technology, The Netherlands
2
Fitzpatrick BG, Gorman DM, Trombatore C. Impact of redefining statistical significance on P-hacking and false positive rates: An agent-based model. PLoS One 2024;19:e0303262. [PMID: 38753677] [PMCID: PMC11098386] [DOI: 10.1371/journal.pone.0303262]
Abstract
In recent years, concern has grown about the inappropriate application and interpretation of P values, especially the use of P<0.05 to denote "statistical significance" and the practice of P-hacking to produce results below this threshold and selectively report them in publications. Such behavior is said to be a major contributor to the large number of false and non-reproducible discoveries found in academic journals. In response, it has been proposed that the threshold for statistical significance be changed from 0.05 to 0.005. The aim of the current study was to use an evolutionary agent-based model, composed of researchers who test hypotheses and strive to increase their publication rates, to explore the impact of a 0.005 P value threshold on P-hacking and published false positive rates. Three scenarios were examined: one in which researchers tested a single hypothesis, one in which they tested multiple hypotheses using a P<0.05 threshold, and one in which they tested multiple hypotheses using a P<0.005 threshold. Effect sizes were varied across models, and output was assessed in terms of researcher effort, number of hypotheses tested, number of publications, and the published false positive rate. The results supported the view that a more stringent P value threshold can serve to reduce the rate of published false positive results. Researchers still engaged in P-hacking with the new threshold, but the effort they expended increased substantially and their overall productivity was reduced, resulting in a decline in the published false positive rate. Compared to other proposed interventions to improve the academic publishing system, changing the P value threshold has the advantage of being relatively easy to implement, and it could be monitored and enforced with minimal effort by journal editors and peer reviewers.
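The threshold comparison described in this abstract can be illustrated with a minimal Monte Carlo sketch (a simplified illustration of the general mechanism, not the authors' actual model): a researcher repeatedly tests true-null hypotheses and publishes the first result that clears the significance threshold. Under a true null, each p-value is uniform on (0, 1), so lowering the threshold makes such P-hacking both more laborious and less often successful.

```python
import random

def phack_project(threshold, max_tests, rng):
    """Test true-null hypotheses one at a time, stopping at the first
    p-value below the threshold (a simple form of P-hacking).
    Returns (false positive published?, number of hypotheses tested)."""
    for tests_used in range(1, max_tests + 1):
        if rng.random() < threshold:  # p ~ Uniform(0, 1) under the null
            return True, tests_used
    return False, max_tests           # gave up, nothing to publish

def summarize(threshold, max_tests=20, n_projects=20_000, seed=1):
    """Average publication rate and effort over many null-only projects."""
    rng = random.Random(seed)
    published = effort = 0
    for _ in range(n_projects):
        success, tests_used = phack_project(threshold, max_tests, rng)
        published += success
        effort += tests_used
    return published / n_projects, effort / n_projects

for thr in (0.05, 0.005):
    rate, mean_tests = summarize(thr)
    print(f"threshold {thr}: false-positive publication rate {rate:.3f}, "
          f"mean hypotheses tested {mean_tests:.1f}")
```

With up to 20 tries per project, roughly 64% of null projects yield a publishable false positive at P<0.05, but fewer than 10% do at P<0.005, while the expected effort per project rises, mirroring the abstract's finding that the stricter threshold cuts published false positives and raises the cost of P-hacking.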
Affiliation(s)
- Ben G. Fitzpatrick
- Department of Mathematics, Loyola Marymount University, Los Angeles, California, United States of America
- Tempest Technologies, Los Angeles, California, United States of America
- Dennis M. Gorman
- Department of Epidemiology & Biostatistics, School of Public Health, Texas A&M University, College Station, Texas, United States of America
- Caitlin Trombatore
- Department of Mathematics, Loyola Marymount University, Los Angeles, California, United States of America
3
Silverstein P, Elman C, Montoya A, McGillivray B, Pennington CR, Harrison CH, Steltenpohl CN, Röer JP, Corker KS, Charron LM, Elsherif M, Malicki M, Hayes-Harb R, Grinschgl S, Neal T, Evans TR, Karhulahti VM, Krenzer WLD, Belaus A, Moreau D, Burin DI, Chin E, Plomp E, Mayo-Wilson E, Lyle J, Adler JM, Bottesini JG, Lawson KM, Schmidt K, Reneau K, Vilhuber L, Waltman L, Gernsbacher MA, Plonski PE, Ghai S, Grant S, Christian TM, Ngiam W, Syed M. A guide for social science journal editors on easing into open science. Res Integr Peer Rev 2024;9:2. [PMID: 38360805] [PMCID: PMC10870631] [DOI: 10.1186/s41073-023-00141-5]
Abstract
Journal editors hold considerable power to advance open science in their respective fields by incentivising and mandating open policies and practices at their journals. The Data PASS Journal Editors Discussion Interface (JEDI, an online community for social science journal editors: www.dpjedi.org) has collated several resources on embedding open science in journal editing (www.dpjedi.org/resources). However, an editor new to open science practices can find it overwhelming to know where to start. For this reason, we created a guide for journal editors on how to get started with open science. The guide outlines steps that editors can take to implement open policies and practices within their journal, going through the what, why, how, and worries of each policy and practice. This manuscript introduces and summarizes the guide (full guide: https://doi.org/10.31219/osf.io/hstcx).
Affiliation(s)
- Priya Silverstein
- Department of Psychology, Ashland University, Ashland, USA
- Institute for Globally Distributed Open Research and Education, Preston, UK
- Colin Elman
- Maxwell School of Citizenship and Public Affairs, Syracuse University, Syracuse, USA
- Amanda Montoya
- Department of Psychology, University of California, Los Angeles, USA
- Charlotte R Pennington
- School of Psychology, College of Health & Life Sciences, Aston University, Birmingham, UK
- Jan Philipp Röer
- Department of Psychology and Psychotherapy, Witten/Herdecke University, Witten, Germany
- Lisa M Charron
- American Family Insurance Data Science Institute, University of Wisconsin-Madison, Madison, USA
- Nelson Institute for Environmental Studies, University of Wisconsin-Madison, Madison, USA
- Mahmoud Elsherif
- Department of Psychology, University of Birmingham, Birmingham, UK
- Mario Malicki
- Meta-Research Innovation Center at Stanford, Stanford University, Stanford, USA
- Stanford Program On Research Rigor and Reproducibility, Stanford University, Stanford, USA
- Department of Epidemiology and Population Health, Stanford University School of Medicine, Stanford, USA
- Tess Neal
- Department of Psychology, Iowa State University, Ames, USA
- School of Social & Behavioral Sciences, Arizona State University, Tempe, USA
- Thomas Rhys Evans
- School of Human Sciences and Institute for Lifecourse Development, University of Greenwich, London, UK
- Veli-Matti Karhulahti
- Department of Music, Art and Culture Studies, University of Jyväskylä, Jyväskylä, Finland
- Anabel Belaus
- National Agency for Scientific and Technological Promotion, Córdoba, Argentina
- David Moreau
- School of Psychology and Centre for Brain Research, University of Auckland, Auckland, New Zealand
- Debora I Burin
- Facultad de Psicología, Universidad de Buenos Aires, Buenos Aires, Argentina
- CONICET, Buenos Aires, Argentina
- Esther Plomp
- Faculty of Applied Sciences, Delft University of Technology, Delft, Netherlands
- The Turing Way, The Alan Turing Institute, London, UK
- Evan Mayo-Wilson
- Department of Epidemiology, UNC Gillings School of Global Public Health, Chapel Hill, USA
- Jared Lyle
- Inter-University Consortium for Political and Social Research (ICPSR), University of Michigan, Ann Arbor, USA
- Julia G Bottesini
- Maxwell School of Citizenship and Public Affairs, Syracuse University, Syracuse, USA
- Kyrani Reneau
- Inter-University Consortium for Political and Social Research (ICPSR), University of Michigan, Ann Arbor, USA
- Lars Vilhuber
- Economics Department, Cornell University, Ithaca, USA
- Ludo Waltman
- Centre for Science and Technology Studies, Leiden University, Leiden, Netherlands
- Paul E Plonski
- Department of Psychology, Tufts University, Medford, USA
- Sakshi Ghai
- Department of Psychology, University of Cambridge, Cambridge, UK
- Sean Grant
- HEDCO Institute for Evidence-Based Practice, College of Education, University of Oregon, Eugene, USA
- Thu-Mai Christian
- Odum Institute for Research in Social Science, University of North Carolina at Chapel Hill, Chapel Hill, USA
- William Ngiam
- Institute of Mind and Biology, University of Chicago, Chicago, USA
- Department of Psychology, University of Chicago, Chicago, USA
- Moin Syed
- Department of Psychology, University of Minnesota, Minneapolis, USA
4
Spivey MJ. Cognitive Science Progresses Toward Interactive Frameworks. Top Cogn Sci 2023;15:219-254. [PMID: 36949655] [PMCID: PMC10123086] [DOI: 10.1111/tops.12645]
Abstract
Despite its many twists and turns, the arc of cognitive science generally bends toward progress, thanks to its interdisciplinary nature. By glancing at the last few decades of experimental and computational advances, it can be argued that, far from failing to converge on a shared set of conceptual assumptions, the field is indeed making steady consensual progress toward what can broadly be referred to as interactive frameworks. This inclination is apparent in the subfields of psycholinguistics, visual perception, embodied cognition, extended cognition, neural networks, dynamical systems theory, and more. This pictorial essay briefly documents this steady progress, both from a bird's-eye view and from the trenches. The conclusion is one of optimism that cognitive science is getting there, albeit slowly and arduously, like any good science should.
Affiliation(s)
- Michael J Spivey
- Department of Cognitive and Information Sciences, University of California, Merced
5
Kohrt F, Smaldino PE, McElreath R, Schönbrodt F. Replication of the natural selection of bad science. R Soc Open Sci 2023;10:221306. [PMID: 36844805] [PMCID: PMC9943874] [DOI: 10.1098/rsos.221306]
Abstract
This study reports an independent replication of the findings presented by Smaldino and McElreath (2016, R. Soc. Open Sci. 3, 160384; doi:10.1098/rsos.160384). The replication was successful, with one exception: we find that selection acting on scientists' propensity to replicate caused a brief period of exuberant replication that, owing to a coding error, was not observed in the original paper. This difference does not, however, change the authors' original conclusions. We call for more replication studies of simulations, as these make a unique contribution to scientific quality assurance.
Affiliation(s)
- Florian Kohrt
- Department of Psychology, Ludwig-Maximilians-Universität München, Munich, Germany
- Paul E. Smaldino
- Department of Cognitive and Information Sciences, University of California, Merced, CA 95343, USA
- Santa Fe Institute, Santa Fe, NM 87501, USA
- Richard McElreath
- Department of Human Behavior, Ecology, and Culture, Max Planck Institute for Evolutionary Anthropology, Leipzig, Germany
- Felix Schönbrodt
- Department of Psychology, Ludwig-Maximilians-Universität München, Munich, Germany
6
Abstract
Concerns about a crisis of mass irreplicability across scientific fields ("the replication crisis") have stimulated a movement for open science, encouraging or even requiring researchers to publish their raw data and analysis code. Recently, a proposed rule at the US Environmental Protection Agency (US EPA) would have imposed a strong open data requirement. The rule prompted significant public discussion about whether open science practices are appropriate for fields of environmental public health. The aims of this paper are to assess (1) whether the replication crisis extends to fields of environmental public health; and (2) whether open science requirements in general can address the replication crisis. There is little empirical evidence for or against mass irreplicability in environmental public health specifically. Without such evidence, strong claims about whether or not the replication crisis extends to environmental public health seem premature. By distinguishing three concepts (reproducibility, replicability, and robustness), it becomes clear that open data initiatives can promote reproducibility and robustness but do little to promote replicability. I conclude by reviewing some of the other benefits of open science and offer some suggestions for funding streams to mitigate the costs of adopting open science practices in environmental public health.
7
Quis judicabit ipsos judices? A case study on the dynamics of competitive funding panel evaluations. Res Eval 2022. [DOI: 10.1093/reseval/rvac021]
Abstract
Securing research funding is essential for all researchers. The standard evaluation method for competitive grants is through evaluation by a panel of experts. However, the literature notes that peer review has inherent flaws and is subject to biases, which can arise from differing interpretations of the criteria, the impossibility for a group of reviewers to be experts in all possible topics within their field, and the role of affect. As such, understanding the dynamics at play during panel evaluations is crucial to allow researchers a better chance at securing funding, and also for the reviewers themselves to be aware of the cognitive mechanisms underlying their decision-making. In this study, we conduct a case study based on application and evaluation data for two social sciences panels in a competitive state-funded call in Portugal. Using a mixed-methods approach, we find that qualitative evaluations largely resonate with the evaluation criteria, and the candidate’s scientific output is partially aligned with the qualitative evaluations, but scientometric indicators alone do not significantly influence the candidate’s evaluation. However, the polarity of the qualitative evaluation has a positive influence on the candidate’s evaluation. This paradox is discussed as possibly resulting from the occurrence of a halo effect in the panel’s judgment of the candidates. By providing a multi-methods approach, this study aims to provide insights that can be useful for all stakeholders involved in competitive funding evaluations.
8
Braganza O. Proxyeconomics, a theory and model of proxy-based competition and cultural evolution. R Soc Open Sci 2022;9:211030. [PMID: 35223051] [PMCID: PMC8864350] [DOI: 10.1098/rsos.211030]
Abstract
Competitive societal systems by necessity rely on imperfect proxy measures. For instance, profit is used to measure value to consumers, patient volumes to measure hospital performance, or the journal impact factor to measure scientific value. While there are numerous reasons why proxies will deviate from the underlying societal goals, they will nevertheless determine the selection of cultural practices and guide individual decisions. These considerations suggest that the study of proxy-based competition requires the integration of cultural evolution theory and economics or decision theory. Here, we attempt such an integration in two ways. First, we describe an agent-based simulation model, combining methods and insights from these disciplines. The model suggests that an individual intrinsic incentive can constrain a cultural evolutionary pressure, which would otherwise enforce fully proxy-oriented practices. The emergent outcome is distinct from that with either the isolated economic or evolutionary mechanism. It reflects what we term lock-in, where competitive pressure can undermine the ability of agents to pursue the shared social goal. Second, we elaborate the broader context, outlining the system-theoretic foundations as well as some philosophical and practical implications, towards a broader theory. Overall, we suggest such a theory may offer an explanatory and predictive framework for diverse subjects, ranging from scientific replicability to climate inaction, and outlining strategies for diagnosis and mitigation.
Affiliation(s)
- Oliver Braganza
- Institute for Experimental Epileptology and Cognition Research, University of Bonn, Bonn, Germany
- Center for Science and Thought, University of Bonn, Bonn, Germany
9
Cashin AG, Bagg MK, Richards GC, Toomey E, McAuley JH, Lee H. Limited engagement with transparent and open science standards in the policies of pain journals: a cross-sectional evaluation. BMJ Evid Based Med 2021;26:313-319. [PMID: 31980469] [DOI: 10.1136/bmjebm-2019-111296]
Abstract
Scientific progress requires transparency and openness. The ability to critique, replicate and implement scientific findings depends on the transparency of the study design and methods, and the open availability of study materials, data and code. Journals are key stakeholders in supporting transparency and openness. This study aimed to evaluate the authorship policies of the 10 highest-ranked pain journals with respect to their support for transparent and open research practices. Two independent authors evaluated the journal policies (as at 27 May 2019) using three tools: the self-developed Transparency and Openness Evaluation Tool, the Centre for Open Science (COS) Transparency Factor and the International Committee of Medical Journal Editors (ICMJE) requirements for disclosure of conflicts of interest. We found that the journal policies had an overall low level of engagement with research transparency and openness standards. The median COS Transparency Factor score was 3.5 (IQR 2.8) of 29 possible points, and only 7 of 10 journals' stated requirements for disclosure of conflicts of interest aligned fully with the ICMJE recommendations. Improved transparency and openness of pain research has the potential to benefit all who are involved in generating and using research findings. Journal policies that endorse and facilitate transparent and open research practices will ultimately improve the evidence base that informs the care provided for people with pain.
Affiliation(s)
- Aidan G Cashin
- Prince of Wales Clinical School, University of New South Wales Faculty of Medicine, Randwick, New South Wales, Australia
- Centre for Pain IMPACT, Neuroscience Research Australia, Randwick, New South Wales, Australia
- Matthew K Bagg
- Prince of Wales Clinical School, University of New South Wales Faculty of Medicine, Randwick, New South Wales, Australia
- Centre for Pain IMPACT, Neuroscience Research Australia, Randwick, New South Wales, Australia
- Georgia C Richards
- Centre for Evidence-Based Medicine, Nuffield Department of Primary Care Health Sciences, University of Oxford, Oxford, Oxfordshire, UK
- Elaine Toomey
- Health Behaviour Change Research Group, School of Psychology, National University of Ireland Galway, Galway, Ireland
- James H McAuley
- Centre for Pain IMPACT, Neuroscience Research Australia, Randwick, New South Wales, Australia
- School of Medical Sciences, Faculty of Medicine, University of New South Wales, Randwick, New South Wales, Australia
- Hopin Lee
- Centre for Statistics in Medicine & Rehabilitation Research in Oxford, Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences (NDORMS), University of Oxford, Oxford, Oxfordshire, UK
- School of Medicine and Public Health, The University of Newcastle, Callaghan, New South Wales, Australia
10
Abstract
Scientists in some fields are concerned that many published results are false. Recent models predict selection for false positives as the inevitable result of pressure to publish, even when scientists are penalized for publications that fail to replicate. We model the cultural evolution of research practices when laboratories are allowed to expend effort on theory, enabling them, at a cost, to identify hypotheses that are more likely to be true, before empirical testing. Theory can restore high effort in research practice and suppress false positives to a technical minimum, even without replication. The mere ability to choose between two sets of hypotheses, one with greater prior chance of being correct, promotes better science than can be achieved with effortless access to the set of stronger hypotheses. Combining theory and replication can have synergistic effects. On the basis of our analysis, we propose four simple recommendations to promote good science.
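The synergy between theory (a higher prior probability that tested hypotheses are true) and replication described in this abstract can be made concrete with standard positive-predictive-value arithmetic (a textbook calculation with assumed numbers, not this paper's evolutionary model): the share of positive results that are true grows with the prior of the tested hypotheses and with each successful replication.

```python
def ppv(prior, power=0.8, alpha=0.05, replications=0):
    """Positive predictive value: the share of positive results that are
    true, after the original test plus `replications` successful
    independent replications (each with the same power and alpha)."""
    true_pos = prior * power ** (replications + 1)
    false_pos = (1 - prior) * alpha ** (replications + 1)
    return true_pos / (true_pos + false_pos)

# Weak theory (1% of tested hypotheses true) vs. stronger theory (10% true),
# with and without one successful replication.
for prior in (0.01, 0.10):
    for reps in (0, 1):
        print(f"prior={prior:.2f}, replications={reps}: "
              f"PPV={ppv(prior, replications=reps):.2f}")
```

Under these assumed numbers, raising the prior from 1% to 10% and adding one successful replication each improve PPV on their own, and together they push it above 0.95, which is the kind of synergistic effect the abstract describes.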
11
Tiokhin L, Yan M, Morgan TJH. Competition for priority harms the reliability of science, but reforms can help. Nat Hum Behav 2021;5:857-867. [PMID: 33510392] [DOI: 10.1038/s41562-020-01040-1]
Abstract
Incentives for priority of discovery are hypothesized to harm scientific reliability. Here, we evaluate this hypothesis by developing an evolutionary agent-based model of a competitive scientific process. We find that rewarding priority of discovery causes populations to culturally evolve towards conducting research with smaller samples. This reduces research reliability and the information value of the average study. Increased start-up costs for setting up single studies and increased payoffs for secondary results (also known as scoop protection) attenuate the negative effects of competition. Furthermore, large rewards for negative results promote the evolution of smaller sample sizes. Our results confirm the logical coherence of scoop protection reforms at several journals. Our results also imply that reforms to increase scientific efficiency, such as rapid journal turnaround times, may produce collateral damage by incentivizing lower-quality research; in contrast, reforms that increase start-up costs, such as pre-registration and registered reports, may generate incentives for higher-quality research.
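The link between smaller samples and a lower information value per study can be sketched with a normal-approximation power calculation (an illustration with made-up numbers, separate from the paper's agent-based model): cutting the sample size cuts statistical power, which in turn drags down the share of significant findings that reflect true effects.

```python
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def power_two_group(n_per_group, effect_size):
    """Approximate power of a two-sided two-group z-test at alpha = 0.05
    (critical value 1.96), ignoring the negligible opposite-tail term."""
    noncentrality = effect_size * sqrt(n_per_group / 2.0)
    return 1.0 - norm_cdf(1.96 - noncentrality)

def ppv(prior, power, alpha=0.05):
    """Share of significant results that reflect a true effect."""
    return prior * power / (prior * power + (1 - prior) * alpha)

for n in (64, 16):
    pw = power_two_group(n, effect_size=0.5)
    print(f"n={n} per group: power={pw:.2f}, PPV={ppv(0.1, pw):.2f}")
```

With a medium effect (d = 0.5) and a 10% prior, dropping from 64 to 16 participants per group cuts power from about 0.81 to about 0.29, and PPV falls accordingly, which is why competition that favors small, fast studies lowers the reliability of the literature.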
Affiliation(s)
- Leonid Tiokhin
- Human-Technology Interaction Group, Eindhoven University of Technology, Eindhoven, the Netherlands
- Minhua Yan
- School of Human Evolution and Social Change, Arizona State University, Tempe, AZ, USA
- Institute of Human Origins, Arizona State University, Tempe, AZ, USA
- Thomas J H Morgan
- School of Human Evolution and Social Change, Arizona State University, Tempe, AZ, USA
- Institute of Human Origins, Arizona State University, Tempe, AZ, USA
12
Abstract
In the face of unreplicable results, statistical anomalies, and outright fraud, introspection and changes in the psychological sciences have taken root. Vibrant reform and metascience movements have emerged. These are exciting developments and may point toward practical improvements in the future. Yet there is nothing so practical as good theory. This article outlines aspects of reform and metascience in psychology that are ripe for an injection of theory, including a lot of excellent and overlooked theoretical work from different disciplines. I review established frameworks that model the process of scientific discovery, the types of scientific networks that we ought to aspire to, and the processes by which problematic norms and institutions might evolve, focusing especially on modeling from the philosophy of science and cultural evolution. We have unwittingly evolved a toxic scientific ecosystem; existing interdisciplinary theory may help us intelligently design a better one.
Affiliation(s)
- Will M. Gervais
- Centre for Culture and Evolution, Department of Psychology, Brunel University London
13
Besançon L, Peiffer-Smadja N, Segalas C, Jiang H, Masuzzo P, Smout C, Billy E, Deforet M, Leyrat C. Open science saves lives: lessons from the COVID-19 pandemic. BMC Med Res Methodol 2021;21:117. [PMID: 34090351] [PMCID: PMC8179078] [DOI: 10.1186/s12874-021-01304-y]
Abstract
In the last decade, Open Science principles have been successfully advocated for and are slowly being adopted in different research communities. In response to the COVID-19 pandemic, many publishers and researchers have sped up their adoption of Open Science practices, sometimes embracing them fully and sometimes partially or in a sub-optimal manner. In this article, we express concerns about the violation of some of the Open Science principles and its potential impact on the quality of research output. We provide evidence of the misuses of these principles at different stages of the scientific process. We call for a wider adoption of Open Science practices, in the hope that this work will encourage a broader endorsement of Open Science principles and serve as a reminder that science should always be a rigorous process, reliable and transparent, especially in the context of a pandemic, where research findings are being translated into practice even more rapidly. We provide all data and scripts at https://osf.io/renxy/.
Affiliation(s)
- Lonni Besançon
- Faculty of Information Technology, Monash University, Melbourne, Australia
- Media and Information Technology, Linköping University, Norrköping, Sweden
- Nathan Peiffer-Smadja
- Université de Paris, IAME, INSERM, F-75018 Paris, France
- National Institute for Health Research Health Protection Research Unit in Healthcare Associated Infections and Antimicrobial Resistance, Imperial College London, London, United Kingdom
- Corentin Segalas
- Department of Medical Statistics, London School of Hygiene and Tropical Medicine, London, United Kingdom
- Haiting Jiang
- School of Health Policy and Management, Nanjing Medical University, Nanjing, China
- Paola Masuzzo
- IGDORE, Institute for Globally Distributed Open Research and Education, Box 1074, Kristinehöjdsgatan 9A, 412 82 Gothenburg, Sweden
- Cooper Smout
- IGDORE, Institute for Globally Distributed Open Research and Education, Box 1074, Kristinehöjdsgatan 9A, 412 82 Gothenburg, Sweden
- Maxime Deforet
- Sorbonne Université, CNRS, Institut de Biologie Paris-Seine (IBPS), Laboratoire Jean Perrin (LJP), Paris, France
- Clémence Leyrat
- Department of Medical Statistics, London School of Hygiene and Tropical Medicine, London, United Kingdom
- Inequalities in Cancer Outcomes Network, Department of Non-Communicable Disease Epidemiology, London School of Hygiene and Tropical Medicine, London, United Kingdom
14
Krammer G, Svecnik E. [The contribution of Open Science to the quality of educational research]. Zeitschrift für Bildungsforschung 2021;10:263-278. [PMID: 38624603] [PMCID: PMC7793618] [DOI: 10.1007/s35834-020-00286-z]
Abstract
The starting point of this paper is the discussion of the robustness of empirical findings in related disciplines, notably social psychology, which culminated in the so-called "replication crisis". These discussions about replication and "questionable research practices" have only started to reach the educational sciences. At the same time, parts of the educational sciences are prone to the same problems as related disciplines, so it may only be a matter of time before these controversies also arise there. Against this backdrop, we argue that Open Science can contribute to increasing the robustness of findings in the educational sciences. In particular, we suggest three Open Science practices: pre-registration, open materials and open data. We present these practices and examine how researchers can implement them in the educational sciences. We discuss the specific conditions of the educational sciences in comparison to related disciplines and address the limitations and particularities of the educational sciences. We conclude with a plea for transparency.
Affiliation(s)
- Georg Krammer
- Pädagogische Hochschule Steiermark, Hasnerplatz 12, 8010 Graz, Austria
- Erich Svecnik
- IQS – Institut des Bundes für Qualitätssicherung im österreichischen Schulwesen, Hans-Sachs-Gasse 3, 8010 Graz, Austria