1
Kyonka EGE, Subramaniam S. Tactics of just, equitable, diverse, and inclusive scientific research. J Exp Anal Behav. 2024. PMID: 39155678. DOI: 10.1002/jeab.4201.
Abstract
The principles of social justice, equity, diversity, and inclusion (JEDI) have received increasing attention in behavior analysis circles, but the conversation has largely centered on implications for applied behavior analysis practice and research. It may be less clear to researchers who conduct basic and translational research how JEDI principles can inform and inspire their work. This article synthesizes publications from behavior analysis and other scientific fields about tactics of JEDI-informed research. We organized this scholarship across five stages of research, from developing the research question to sharing findings, and curated sources for an audience of behavioral science researchers. We discuss reflexive practice, representation, belongingness, participatory research, quantitative critical theory, and open science, among other topics. Some researchers may have already adopted some of the practices outlined, some may begin new practices, and some may choose to conduct experimental analyses of JEDI problems. Our hope is that those actions will be reinforced by the behavior analysis scientific community. We conclude by encouraging the leadership of this journal to continue to work toward the structural changes necessary to make the experimental analysis of behavior just, equitable, diverse, and inclusive.
Affiliation(s)
- Elizabeth G E Kyonka
- Department of Psychology, California State University, East Bay, Hayward, CA, USA
- Shrinidhi Subramaniam
- Department of Psychology and Child Development, California State University, Stanislaus, Turlock, CA, USA
2
Hardwicke TE, Vazire S. Transparency Is Now the Default at Psychological Science. Psychol Sci. 2024;35:708-711. PMID: 38150599. DOI: 10.1177/09567976231221573. Open access.
3
Silverstein P, Elman C, Montoya A, McGillivray B, Pennington CR, Harrison CH, Steltenpohl CN, Röer JP, Corker KS, Charron LM, Elsherif M, Malicki M, Hayes-Harb R, Grinschgl S, Neal T, Evans TR, Karhulahti VM, Krenzer WLD, Belaus A, Moreau D, Burin DI, Chin E, Plomp E, Mayo-Wilson E, Lyle J, Adler JM, Bottesini JG, Lawson KM, Schmidt K, Reneau K, Vilhuber L, Waltman L, Gernsbacher MA, Plonski PE, Ghai S, Grant S, Christian TM, Ngiam W, Syed M. A guide for social science journal editors on easing into open science. Res Integr Peer Rev. 2024;9:2. PMID: 38360805. PMCID: PMC10870631. DOI: 10.1186/s41073-023-00141-5. Open access.
Abstract
Journal editors have a large amount of power to advance open science in their respective fields by incentivising and mandating open policies and practices at their journals. The Data PASS Journal Editors Discussion Interface (JEDI, an online community for social science journal editors: www.dpjedi.org) has collated several resources on embedding open science in journal editing (www.dpjedi.org/resources). However, for an editor new to open science practices, it can be overwhelming to know where to start. For this reason, we created a guide for journal editors on how to get started with open science. The guide outlines steps that editors can take to implement open policies and practices within their journal, and goes through the what, why, how, and worries of each policy and practice. This manuscript introduces and summarizes the guide (full guide: https://doi.org/10.31219/osf.io/hstcx).
Affiliation(s)
- Priya Silverstein
- Department of Psychology, Ashland University, Ashland, USA.
- Institute for Globally Distributed Open Research and Education, Preston, UK.
- Colin Elman
- Maxwell School of Citizenship and Public Affairs, Syracuse University, Syracuse, USA
- Amanda Montoya
- Department of Psychology, University of California, Los Angeles, USA
- Charlotte R Pennington
- School of Psychology, College of Health & Life Sciences, Aston University, Birmingham, UK
- Jan Philipp Röer
- Department of Psychology and Psychotherapy, Witten/Herdecke University, Witten, Germany
- Lisa M Charron
- American Family Insurance Data Science Institute, University of Wisconsin-Madison, Madison, USA
- Nelson Institute for Environmental Studies, University of Wisconsin-Madison, Madison, USA
- Mahmoud Elsherif
- Department of Psychology, University of Birmingham, Birmingham, UK
- Mario Malicki
- Meta-Research Innovation Center at Stanford, Stanford University, Stanford, USA
- Stanford Program On Research Rigor and Reproducibility, Stanford University, Stanford, USA
- Department of Epidemiology and Population Health, Stanford University School of Medicine, Stanford, USA
- Tess Neal
- Department of Psychology, Iowa State University, Ames, USA
- School of Social & Behavioral Sciences, Arizona State University, Tempe, USA
- Thomas Rhys Evans
- School of Human Sciences and Institute for Lifecourse Development, University of Greenwich, London, UK
- Veli-Matti Karhulahti
- Department of Music, Art and Culture Studies, University of Jyväskylä, Jyväskylä, Finland
- Anabel Belaus
- National Agency for Scientific and Technological Promotion, Córdoba, Argentina
- David Moreau
- School of Psychology and Centre for Brain Research, University of Auckland, Auckland, New Zealand
- Debora I Burin
- Facultad de Psicología, Universidad de Buenos Aires, Buenos Aires, Argentina
- CONICET, Buenos Aires, Argentina
- Esther Plomp
- Faculty of Applied Sciences, Delft University of Technology, Delft, Netherlands
- The Turing Way, The Alan Turing Institute, London, UK
- Evan Mayo-Wilson
- Department of Epidemiology, UNC Gillings School of Global Public Health, Chapel Hill, USA
- Jared Lyle
- Inter-University Consortium for Political and Social Research (ICPSR), University of Michigan, Ann Arbor, USA
- Julia G Bottesini
- Maxwell School of Citizenship and Public Affairs, Syracuse University, Syracuse, USA
- Kyrani Reneau
- Inter-University Consortium for Political and Social Research (ICPSR), University of Michigan, Ann Arbor, USA
- Lars Vilhuber
- Economics Department, Cornell University, Ithaca, USA
- Ludo Waltman
- Centre for Science and Technology Studies, Leiden University, Leiden, Netherlands
- Paul E Plonski
- Department of Psychology, Tufts University, Medford, USA
- Sakshi Ghai
- Department of Psychology, University of Cambridge, Cambridge, UK
- Sean Grant
- HEDCO Institute for Evidence-Based Practice, College of Education, University of Oregon, Eugene, USA
- Thu-Mai Christian
- Odum Institute for Research in Social Science, University of North Carolina at Chapel Hill, Chapel Hill, USA
- William Ngiam
- Institute of Mind and Biology, University of Chicago, Chicago, USA
- Department of Psychology, University of Chicago, Chicago, USA
- Moin Syed
- Department of Psychology, University of Minnesota, Minneapolis, USA
4
Rethlefsen ML, Brigham TJ, Price C, Moher D, Bouter LM, Kirkham JJ, Schroter S, Zeegers MP. Systematic review search strategies are poorly reported and not reproducible: a cross-sectional metaresearch study. J Clin Epidemiol. 2024;166:111229. PMID: 38052277. DOI: 10.1016/j.jclinepi.2023.111229.
Abstract
OBJECTIVES: To determine the reproducibility of biomedical systematic review search strategies. STUDY DESIGN AND SETTING: A cross-sectional reproducibility study was conducted on a random sample of 100 systematic reviews indexed in MEDLINE in November 2021. The primary outcome measure was the percentage of systematic reviews for which all database searches could be reproduced, operationalized as fulfilling six key Preferred Reporting Items for Systematic reviews and Meta-Analyses literature search extension (PRISMA-S) reporting guideline items and having all database searches reproduced within 10% of the number of original results. Key reporting guideline items included database name, multi-database searching, full search strategies, limits and restrictions, date(s) of searches, and total records. RESULTS: The 100 systematic review articles contained 453 database searches. Only 22 (4.9%) database searches reported all six PRISMA-S items. Forty-seven (10.4%) database searches could be reproduced within 10% of the number of results from the original search; six searches differed by more than 1,000% between the originally reported number of results and the reproduction. Only one systematic review article provided the necessary search details to be fully reproducible. CONCLUSION: Systematic review search reporting is poor. Correcting this will require a multifaceted response from authors, peer reviewers, journal editors, and database providers.
Affiliation(s)
- Melissa L Rethlefsen
- Health Sciences Library & Informatics Center, University of New Mexico, MSC 09 5100, 1 University of New Mexico, Albuquerque, NM 87131-0001, USA; Department of Epidemiology, Maastricht University, Maastricht, The Netherlands.
- Tara J Brigham
- Library Services-Florida, Mayo Clinic Libraries, Mayo Clinic, 4500 San Pablo Road, Jacksonville, FL 32224, USA
- Carrie Price
- Albert S. Cook Library, Towson University, 8000 York Road, Towson, MD 21252, USA
- David Moher
- Centre for Journalology, Clinical Epidemiology Program, Ottawa Hospital Research Institute, The Ottawa Hospital, General Campus, Centre for Practice Changing Research Building, 501 Smyth Road, PO BOX 201B, Ottawa, Ontario K1H 8L6, Canada
- Lex M Bouter
- Department of Epidemiology and Data Science, Amsterdam UMC, Vrije Universiteit Amsterdam, De Boelelaan 1089a, 1081 HV Amsterdam, The Netherlands; Department of Philosophy, Faculty of Humanities, Vrije Universiteit Amsterdam, De Boelelaan 1105, 1081 HV Amsterdam, The Netherlands
- Jamie J Kirkham
- Centre for Biostatistics, The University of Manchester, Manchester Academic Health Science Centre, Manchester, UK
- Sara Schroter
- BMJ, BMA House, Tavistock Square, London WC1H 9JR, UK; Faculty of Public Health & Policy, London School of Hygiene & Tropical Medicine, Keppel Street, London WC1E 7HT, UK
- Maurice P Zeegers
- Department of Epidemiology, Maastricht University, Maastricht, The Netherlands; MBP Holding, Heerlen, The Netherlands
5
Samuel S, Mietchen D. Computational reproducibility of Jupyter notebooks from biomedical publications. Gigascience. 2024;13:giad113. PMID: 38206590. PMCID: PMC10783158. DOI: 10.1093/gigascience/giad113. Open access.
Abstract
BACKGROUND: Jupyter notebooks bundle executable code with its documentation and output in one interactive environment, and they represent a popular mechanism for documenting and sharing computational workflows, including for research publications. The reproducibility of computational aspects of research is a key component of scientific reproducibility but has not yet been assessed at scale for Jupyter notebooks associated with biomedical publications. APPROACH: We address computational reproducibility at two levels: (i) using fully automated workflows, we analyzed the computational reproducibility of Jupyter notebooks associated with publications indexed in the biomedical literature repository PubMed Central. We identified such notebooks by mining the articles' full text, trying to locate them on GitHub, and attempting to rerun them in an environment as close to the original as possible. We documented reproduction success and exceptions and explored relationships between notebook reproducibility and variables related to the notebooks or publications. (ii) This study represents a reproducibility attempt in and of itself, using essentially the same methodology twice on PubMed Central over the course of two years, during which the corpus of Jupyter notebooks from articles indexed in PubMed Central grew in a highly dynamic fashion. RESULTS: Out of 27,271 Jupyter notebooks from 2,660 GitHub repositories associated with 3,467 publications, 22,578 notebooks were written in Python, including 15,817 that had their dependencies declared in standard requirement files and that we attempted to rerun automatically. For 10,388 of these, all declared dependencies could be installed successfully, and we reran them to assess reproducibility. Of these, 1,203 notebooks ran through without any errors, including 879 that produced results identical to those reported in the original notebook and 324 for which our results differed from the originally reported ones. Running the other notebooks resulted in exceptions. CONCLUSIONS: We zoom in on common problems and practices, highlight trends, and discuss potential improvements to Jupyter-related workflows associated with biomedical publications.
Affiliation(s)
- Sheeba Samuel
- Heinz-Nixdorf Chair for Distributed Information Systems, Friedrich Schiller University Jena, Jena 07743, Germany
- Michael Stifel Center Jena, Jena 07743, Germany
- Daniel Mietchen
- Ronin Institute, Montclair 07043-2314, NJ, United States
- Institute for Globally Distributed Open Research and Education (IGDORE)
- FIZ Karlsruhe—Leibniz Institute for Information Infrastructure, Berlin 76344, Germany
6
Thibault RT, Amaral OB, Argolo F, Bandrowski AE, Davidson AR, Drude NI. Open Science 2.0: Towards a truly collaborative research ecosystem. PLoS Biol. 2023;21:e3002362. PMID: 37856538. PMCID: PMC10617723. DOI: 10.1371/journal.pbio.3002362. Open access.
Abstract
Conversations about open science have reached the mainstream, yet many open science practices, such as data sharing, remain uncommon. Our efforts towards openness therefore need to increase in scale and aim for a more ambitious target. We need an ecosystem in which research outputs are not only openly shared but in which transparency permeates the research process from the start, lending itself to more rigorous and collaborative research. To support this vision, this Essay provides an overview of a selection of open science initiatives from the past two decades, focusing on methods transparency, scholarly communication, team science, and research culture, and speculates about what the future of open science could look like. It then draws on these examples to provide recommendations for how funders, institutions, journals, regulators, and other stakeholders can create an environment that is ripe for improvement.
Affiliation(s)
- Robert T. Thibault
- Meta-Research Innovation Center at Stanford (METRICS), Stanford University, Stanford, California, United States of America
- Olavo B. Amaral
- Institute of Medical Biochemistry Leopoldo de Meis, Universidade Federal do Rio de Janeiro, Rio de Janeiro, Brazil
- Anita E. Bandrowski
- FAIR Data Informatics Lab, Department of Neuroscience, UCSD, San Diego, California, United States of America
- SciCrunch Inc., San Diego, California, United States of America
- Alexandra R. Davidson
- Institute for Evidence-Based Health Care, Bond University, Robina, Australia
- Faculty of Health Science and Medicine, Bond University, Robina, Australia
- Natascha I. Drude
- Berlin Institute of Health (BIH) at Charité, BIH QUEST Center for Responsible Research, Berlin, Germany
7
van Ravenzwaaij D, Bakker M, Heesen R, Romero F, van Dongen N, Crüwell S, Field SM, Held L, Munafò MR, Pittelkow MM, Tiokhin L, Traag VA, van den Akker OR, van 't Veer AE, Wagenmakers EJ. Perspectives on scientific error. R Soc Open Sci. 2023;10:230448. PMID: 37476516. PMCID: PMC10354464. DOI: 10.1098/rsos.230448. Open access.
Abstract
Theoretical arguments and empirical investigations indicate that a high proportion of published findings do not replicate and are likely false. The current position paper provides a broad perspective on scientific error, which may lead to replication failures, focusing on reform history and on opportunities for future reform. We organize our perspective along four main themes: institutional reform, methodological reform, statistical reform, and publishing reform. For each theme, we illustrate potential errors by narrating the story of a fictional researcher during the research cycle, and we discuss future opportunities for reform. The resulting agenda provides a resource to usher in an era that is marked by a research culture that is less error-prone and a scientific publication landscape with fewer spurious findings.
Affiliation(s)
- D. van Ravenzwaaij
- Department of Psychology, University of Groningen, Grote Kruisstraat 2/1, Heymans Building, room 239, 9712 TS Groningen, The Netherlands
- M. Bakker
- Tilburg University, 5037 AB Tilburg, The Netherlands
- R. Heesen
- University of Western Australia, Perth, Western Australia 6009, Australia
- London School of Economics and Political Science, London WC2A 2AE, UK
- F. Romero
- Department of Psychology, University of Groningen, Grote Kruisstraat 2/1, Heymans Building, room 239, 9712 TS Groningen, The Netherlands
- N. van Dongen
- University of Amsterdam, 1012 WP Amsterdam, The Netherlands
- S. Crüwell
- Department of History and Philosophy of Science, University of Cambridge, Cambridge CB2 1TN, UK
- S. M. Field
- Centre for Science and Technology Studies (CWTS), Leiden University, 2311 EZ Leiden, The Netherlands
- L. Held
- University of Zurich, 8006 Zürich, Switzerland
- M. R. Munafò
- School of Psychological Science, University of Bristol, Bristol BS8 1QU, UK
- M. M. Pittelkow
- Department of Psychology, University of Groningen, Grote Kruisstraat 2/1, Heymans Building, room 239, 9712 TS Groningen, The Netherlands
- QUEST Center for Transforming Biomedical Research, Berlin Institute of Health, Charité—Universitätsmedizin, 10178 Berlin, Germany
- L. Tiokhin
- IG&H Consulting, 3528 AC Utrecht, The Netherlands
- V. A. Traag
- Centre for Science and Technology Studies (CWTS), Leiden University, 2311 EZ Leiden, The Netherlands
- O. R. van den Akker
- Tilburg University, 5037 AB Tilburg, The Netherlands
- QUEST Center for Transforming Biomedical Research, Berlin Institute of Health, Charité—Universitätsmedizin, 10178 Berlin, Germany
- A. E. van 't Veer
- Methodology and Statistics Unit, Institute of Psychology, Leiden University, 2333 AK Leiden, The Netherlands