1
Wendelborn C, Anger M, Schickhardt C. Promoting Data Sharing: The Moral Obligations of Public Funding Agencies. Sci Eng Ethics 2024; 30:35. PMID: 39105890; PMCID: PMC11303567; DOI: 10.1007/s11948-024-00491-3.
Abstract
Sharing research data has great potential to benefit science and society. However, data sharing is still not common practice. Since public research funding agencies have a particular impact on research and researchers, the question arises: Are public funding agencies morally obligated to promote data sharing? We argue from a research ethics perspective that public funding agencies have several pro tanto obligations requiring them to promote data sharing. However, there are also pro tanto obligations that speak against promoting data sharing in general, as well as against particular instruments of such promotion. We examine and weigh these obligations and conclude that, all things considered, funders ought to promote the sharing of data. Even the instrument of mandatory data sharing policies can be justified under certain conditions.
Affiliation(s)
- Christian Wendelborn
- Section for Translational Medical Ethics, German Cancer Research Center (DKFZ), National Center for Tumor Diseases (NCT) Heidelberg, Heidelberg, Germany
- University of Konstanz, Konstanz, Germany
- Michael Anger
- Section for Translational Medical Ethics, German Cancer Research Center (DKFZ), National Center for Tumor Diseases (NCT) Heidelberg, Heidelberg, Germany
- Christoph Schickhardt
- Section for Translational Medical Ethics, German Cancer Research Center (DKFZ), National Center for Tumor Diseases (NCT) Heidelberg, Heidelberg, Germany
2
Hardwicke TE, Vazire S. Transparency Is Now the Default at Psychological Science. Psychol Sci 2024; 35:708-711. PMID: 38150599; DOI: 10.1177/09567976231221573.
3
Dulitzki C, Crane SM, Hardwicke TE, Ioannidis JPA. Expanding the Data Ark: an attempt to make the data from highly cited social science papers publicly available. R Soc Open Sci 2024; 11:240016. PMID: 39076822; PMCID: PMC11285638; DOI: 10.1098/rsos.240016.
Abstract
Access to scientific data can enable independent reuse and verification; however, most data are not available and become increasingly irrecoverable over time. This study aimed to retrieve and preserve important datasets from 160 of the most highly cited social science articles published between 2008-2013 and 2015-2018. We asked authors if they would share data in a public repository, the Data Ark, or provide reasons if data could not be shared. Of the 160 articles, data for 117 (73%, 95% CI [67%-80%]) were not available and data for 7 (4%, 95% CI [0%-12%]) were available with restrictions. Data for 36 (22%, 95% CI [16%-30%]) articles were available in unrestricted form: 29 of these datasets were already available and 7 were made available in the Data Ark. Most authors did not respond to our data requests, and a minority shared reasons for not sharing, such as legal or ethical constraints. These findings highlight an unresolved need to preserve important scientific datasets and increase their accessibility to the scientific community.
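The interval estimates quoted in this abstract are standard binomial proportion confidence intervals. As a rough illustration, a Wilson score interval can be computed in a few lines of Python; the paper does not state which interval method it used, so the bounds produced here are approximate and may differ slightly from the published ones:

```python
import math

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score confidence interval for a binomial proportion."""
    p_hat = successes / n
    denom = 1 + z**2 / n
    center = (p_hat + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# 117 of 160 articles had unavailable data; the abstract reports 73%, 95% CI [67%-80%]
lo, hi = wilson_interval(117, 160)
print(f"{117 / 160:.0%}, CI [{lo:.0%}-{hi:.0%}]")
```

Unlike the naive normal approximation, the Wilson interval stays inside [0, 1] even for proportions near zero, which matters for small counts such as the 7/160 restricted-availability estimate.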
Affiliation(s)
- Coby Dulitzki
- Department of Biology, Stanford University, Stanford, CA, USA
- Steven Michael Crane
- Stanford Prevention Research Center, Stanford School of Medicine, Stanford, CA, USA
- Tom E. Hardwicke
- School of Psychological Sciences, University of Melbourne, Melbourne, Australia
- John P. A. Ioannidis
- Meta-Research Innovation Center at Stanford (METRICS), Stanford University, Stanford, CA, USA
- Departments of Medicine, of Epidemiology and Population Health, of Biomedical Data Science, and of Statistics, Stanford University, Stanford, CA, USA
- Stanford Prevention Research Center, Stanford School of Medicine, Stanford, CA, USA
4
Gong T, Gao X, Jiang T. FAB: A "Dummy's" program for self-paced forward and backward reading. Behav Res Methods 2023; 55:4419-4436. PMID: 36947356; DOI: 10.3758/s13428-022-02025-w.
Abstract
The self-paced reading paradigm has been popular and widely used in psycholinguistic research for several decades. FAB (Forward and Backward reading), the tool described in this paper, was created to reduce the coding demands and operational costs that experimental and clinical researchers face when designing, coding, implementing, and analyzing self-paced reading tasks. Its basis in web languages (HTML, JavaScript) also promotes experimental implementation and material sharing in our era of open science. In addition, FAB has a unique forward-and-backward mode that can track regressive-like behaviors that are usually only recordable with eye-tracking or mouse-tracking equipment. In this paper, the application and usage of FAB are demonstrated in one laboratory and two online validation experiments. We hope this free and open-source tool can benefit research in the diverse range of contexts where self-paced reading is desirable.
Affiliation(s)
- Tianwei Gong
- Faculty of Psychology, Beijing Normal University, Beijing, 100875, China
- Xuefei Gao
- School of Foreign Languages, Fuzhou University of International Studies and Trade, Fuzhou, 350202, China
- CAS Key Laboratory of Mental Health, Institute of Psychology, Chinese Academy of Sciences, Beijing, 100101, China
- Department of Psychology, University of Chinese Academy of Sciences, Beijing, 100049, China
- Ting Jiang
- Faculty of Psychology, Beijing Normal University, Beijing, 100875, China
5
Dumanis SB, Ratan K, McIntosh S, Shah HV, Lewis M, Vines TH, Schekman R, Riley EA. From policy to practice: Lessons learned from an open science funding initiative. PLoS Comput Biol 2023; 19:e1011626. PMID: 38060981; PMCID: PMC10703508; DOI: 10.1371/journal.pcbi.1011626.
Affiliation(s)
- Sonya B. Dumanis
- Coalition for Aligning Science, Chevy Chase, Maryland, United States of America
- Aligning Science Across Parkinson’s (ASAP), Chevy Chase, Maryland, United States of America
- Kristen Ratan
- Strategies for Open Science (Stratos) and Incentivizing Collaborative Open Research (ICOR), Santa Cruz, California, United States of America
- Souad McIntosh
- DataSeer Research Data Services, Vancouver, British Columbia, Canada
- Hetal V. Shah
- Coalition for Aligning Science, Chevy Chase, Maryland, United States of America
- Aligning Science Across Parkinson’s (ASAP), Chevy Chase, Maryland, United States of America
- Matt Lewis
- Coalition for Aligning Science, Chevy Chase, Maryland, United States of America
- Aligning Science Across Parkinson’s (ASAP), Chevy Chase, Maryland, United States of America
- Timothy H. Vines
- DataSeer Research Data Services, Vancouver, British Columbia, Canada
- Randy Schekman
- Aligning Science Across Parkinson’s (ASAP), Chevy Chase, Maryland, United States of America
- Howard Hughes Medical Institute, University of California, Berkeley, Berkeley, United States of America
- Ekemini A. Riley
- Coalition for Aligning Science, Chevy Chase, Maryland, United States of America
- Aligning Science Across Parkinson’s (ASAP), Chevy Chase, Maryland, United States of America
6
Hamilton DG, Hong K, Fraser H, Rowhani-Farid A, Fidler F, Page MJ. Prevalence and predictors of data and code sharing in the medical and health sciences: systematic review with meta-analysis of individual participant data. BMJ 2023; 382:e075767. PMID: 37433624; DOI: 10.1136/bmj-2023-075767.
Abstract
OBJECTIVES To synthesise research investigating data and code sharing in medicine and health to establish an accurate representation of the prevalence of sharing, how this frequency has changed over time, and what factors influence availability. DESIGN Systematic review with meta-analysis of individual participant data. DATA SOURCES Ovid Medline, Ovid Embase, and the preprint servers medRxiv, bioRxiv, and MetaArXiv were searched from inception to 1 July 2021. Forward citation searches were also performed on 30 August 2022. REVIEW METHODS Meta-research studies that investigated data or code sharing across a sample of scientific articles presenting original medical and health research were identified. Two authors screened records, assessed the risk of bias, and extracted summary data from study reports when individual participant data could not be retrieved. Key outcomes of interest were the prevalence of statements that declared that data or code were publicly or privately available (declared availability) and the success rates of retrieving these products (actual availability). The associations between data and code availability and several factors (eg, journal policy, type of data, trial design, and human participants) were also examined. A two-stage approach to meta-analysis of individual participant data was performed, with proportions and risk ratios pooled with the Hartung-Knapp-Sidik-Jonkman method for random effects meta-analysis. RESULTS The review included 105 meta-research studies examining 2 121 580 articles across 31 specialties. Eligible studies examined a median of 195 primary articles (interquartile range 113-475), with a median publication year of 2015 (interquartile range 2012-2018). Only eight studies (8%) were classified as having a low risk of bias. Meta-analyses showed a prevalence of declared and actual public data availability of 8% (95% confidence interval 5% to 11%) and 2% (1% to 3%), respectively, between 2016 and 2021. For public code sharing, both the prevalence of declared and actual availability were estimated to be <0.5% since 2016. Meta-regressions indicated that only declared public data sharing prevalence estimates have increased over time. Compliance with mandatory data sharing policies ranged from 0% to 100% across journals and varied by type of data. In contrast, success in privately obtaining data and code from authors historically ranged between 0% and 37% and 0% and 23%, respectively. CONCLUSIONS The review found that public code sharing was persistently low across medical research. Declarations of data sharing were also low, increasing over time, but did not always correspond to actual sharing of data. The effectiveness of mandatory data sharing policies varied substantially by journal and type of data, a finding that might be informative for policy makers when designing policies and allocating resources to audit compliance. SYSTEMATIC REVIEW REGISTRATION Open Science Framework doi:10.17605/OSF.IO/7SX8U.
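The pooling method named in the review methods (random-effects meta-analysis with the Hartung-Knapp-Sidik-Jonkman adjustment) can be sketched as follows. This is an illustrative Python implementation on invented study proportions, not the authors' analysis code; it pairs a DerSimonian-Laird estimate of between-study variance with the HKSJ variance estimator and a t-based interval:

```python
import numpy as np
from scipy import stats

def hksj_pool(y, v):
    """Random-effects pooling with the Hartung-Knapp-Sidik-Jonkman adjustment.

    y: per-study effect estimates (e.g. logit proportions)
    v: their within-study variances
    """
    y, v = np.asarray(y, float), np.asarray(v, float)
    k = len(y)
    # DerSimonian-Laird estimate of between-study variance tau^2
    w = 1 / v
    mu_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - mu_fixed) ** 2)
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)
    # Random-effects weights and pooled estimate
    w_re = 1 / (v + tau2)
    mu = np.sum(w_re * y) / np.sum(w_re)
    # HKSJ variance: weighted residual variance scaled by k - 1, with a t interval
    var_hk = np.sum(w_re * (y - mu) ** 2) / ((k - 1) * np.sum(w_re))
    half = stats.t.ppf(0.975, k - 1) * np.sqrt(var_hk)
    return mu, (mu - half, mu + half)

# Hypothetical per-study proportions of articles with available data
p = np.array([0.05, 0.08, 0.02, 0.11, 0.07])
n = np.array([200, 150, 400, 120, 300])
y = np.log(p / (1 - p))                      # logit transform
v = 1 / (n * p) + 1 / (n * (1 - p))          # approximate logit-scale variance
mu, (lo, hi) = hksj_pool(y, v)
pooled = 1 / (1 + np.exp(-mu))               # back-transform to a proportion
```

Because the pooled logit is a weighted average of the study logits, the back-transformed estimate always falls within the range of the input proportions; the HKSJ step typically widens the interval relative to the usual normal-based random-effects interval when studies are few or heterogeneous.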
Affiliation(s)
- Daniel G Hamilton
- MetaMelb Research Group, School of BioSciences, University of Melbourne, Melbourne, VIC, Australia
- Melbourne Medical School, Faculty of Medicine, Dentistry, and Health Sciences, University of Melbourne, Melbourne, VIC, Australia
- Kyungwan Hong
- Department of Practice, Sciences, and Health Outcomes Research, University of Maryland School of Pharmacy, Baltimore, MD, USA
- Hannah Fraser
- MetaMelb Research Group, School of BioSciences, University of Melbourne, Melbourne, VIC, Australia
- Anisa Rowhani-Farid
- Department of Practice, Sciences, and Health Outcomes Research, University of Maryland School of Pharmacy, Baltimore, MD, USA
- Fiona Fidler
- MetaMelb Research Group, School of BioSciences, University of Melbourne, Melbourne, VIC, Australia
- School of Historical and Philosophical Studies, University of Melbourne, Melbourne, VIC, Australia
- Matthew J Page
- Methods in Evidence Synthesis Unit, School of Public Health and Preventive Medicine, Monash University, Melbourne, VIC, Australia
7
Kennedy E, Dennis EL, Lindsey HM, deRoon-Cassini T, Du Plessis S, Fani N, Kaufman ML, Koen N, Larson CL, Laskowitz S, Lebois LAM, Morey RA, Newsome MR, Palermo C, Pastorek NJ, Powers A, Scheibel R, Seedat S, Seligowski A, Stein DJ, Stevens J, Sun D, Thompson P, Troyanskaya M, van Rooij SJH, Watts AA, Tomas CW, Williams W, Hillary FG, Pugh MJ, Wilde EA, Tate DF. Harmonizing PTSD severity scales across instruments and sites. Neuropsychology 2023; 37:398-408. PMID: 35797175; PMCID: PMC9948684; DOI: 10.1037/neu0000823.
Abstract
OBJECTIVE The variety of instruments used to assess posttraumatic stress disorder (PTSD) allows for flexibility, but also creates challenges for data synthesis. The objective of this work was to use a multisite mega-analysis to derive quantitative recommendations for equating scores across measures of PTSD severity. METHOD Empirical Bayes harmonization and linear models were used to describe and mitigate site and covariate effects. Quadratic models for converting scores across PTSD assessments were constructed using bootstrapping and tested on held-out data. RESULTS We aggregated 17 data sources and compiled a sample of n = 5,634 individuals who were assessed for PTSD symptoms. We confirmed our hypothesis that harmonization and covariate adjustments would significantly improve inference of scores across instruments. Harmonization significantly reduced cross-dataset variance (28%, p < .001), and models for converting scores across instruments were well fit (median R² = 0.985) with an average root mean squared error of 1.46 on sum scores. CONCLUSIONS These methods allow PTSD symptom severity to be placed on multiple scales and offer interesting empirical perspectives on the role of harmonization in the behavioral sciences.
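The score-conversion step described in the abstract (quadratic models built with bootstrap resampling) can be sketched generically. The two scales, the sample data, and the helper name below are invented for illustration and do not reproduce the authors' actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical paired sum scores from two PTSD instruments (invented data):
# scale B relates to scale A through a mildly quadratic trend plus noise
scale_a = rng.uniform(0, 80, 300)
scale_b = 0.6 * scale_a + 0.002 * scale_a**2 + rng.normal(0, 1.5, 300)

def bootstrap_quadratic(x, y, n_boot=500):
    """Fit y ~ quadratic(x) on bootstrap resamples; return the mean coefficients."""
    coefs = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(x), len(x))  # resample pairs with replacement
        coefs.append(np.polyfit(x[idx], y[idx], deg=2))
    return np.mean(coefs, axis=0)  # [a, b, c] for a*x^2 + b*x + c

coef = bootstrap_quadratic(scale_a, scale_b)
convert = np.poly1d(coef)  # crosswalk: maps scale A scores onto scale B
rmse = np.sqrt(np.mean((convert(scale_a) - scale_b) ** 2))
```

Averaging coefficients over bootstrap resamples also yields their spread for free (replace `np.mean` with percentiles of `coefs` to get interval estimates on the crosswalk), which is one reason bootstrapping is attractive for this kind of score-equating work.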
Affiliation(s)
- Eamonn Kennedy
- Department of Neurology, University of Utah School of Medicine
- Emily L Dennis
- Department of Neurology, University of Utah School of Medicine
- Terri deRoon-Cassini
- Department of Surgery, Division of Trauma and Acute Care Surgery, Medical College of Wisconsin
- Negar Fani
- Department of Psychiatry and Behavioral Sciences, Emory University School of Medicine
- Nastassja Koen
- Department of Psychiatry and Mental Health, University of Cape Town
- Mary R Newsome
- H. Ben Taub Department of Physical Medicine and Rehabilitation, Baylor College of Medicine
- Cori Palermo
- Department of Psychiatry, Harvard Medical School
- Nicholas J Pastorek
- H. Ben Taub Department of Physical Medicine and Rehabilitation, Baylor College of Medicine
- Abigail Powers
- Department of Psychiatry and Behavioral Sciences, Emory University School of Medicine
- Randall Scheibel
- H. Ben Taub Department of Physical Medicine and Rehabilitation, Baylor College of Medicine
- Soraya Seedat
- SU/UCT MRC Unit on Risk and Resilience in Mental Disorders, Department of Psychiatry, Stellenbosch University
- Dan J Stein
- Department of Psychiatry and Mental Health, University of Cape Town
- Jennifer Stevens
- Department of Psychiatry and Behavioral Sciences, Emory University School of Medicine
- Delin Sun
- Brain Imaging and Analysis Center, Duke University
- Paul Thompson
- Imaging Genetics Center, Stevens Neuroimaging and Informatics Institute, Keck School of Medicine of USC
- Maya Troyanskaya
- H. Ben Taub Department of Physical Medicine and Rehabilitation, Baylor College of Medicine
- Sanne J H van Rooij
- Department of Psychiatry and Behavioral Sciences, Emory University School of Medicine
- Mary Jo Pugh
- Department of Neurology, University of Utah School of Medicine
- David F Tate
- Department of Neurology, University of Utah School of Medicine
8
Sadeh Y, Denejkina A, Karyotaki E, Lenferink LIM, Kassam-Adams N. Opportunities for improving data sharing and FAIR data practices to advance global mental health. Glob Ment Health (Camb) 2023; 10:e14. PMID: 37860102; PMCID: PMC10581864; DOI: 10.1017/gmh.2023.7.
Abstract
It is crucial to optimize global mental health research to address the high burden of mental health challenges and mental illness for individuals and societies. Data sharing and reuse have demonstrated value for advancing science and accelerating knowledge development. The FAIR (Findable, Accessible, Interoperable, and Reusable) Guiding Principles for scientific data provide a framework to improve the transparency, efficiency, and impact of research. In this review, we describe ethical and equity considerations in data sharing and reuse, delineate the FAIR principles as they apply to mental health research, and consider the current state of FAIR data practices in global mental health research, identifying challenges and opportunities. We describe noteworthy examples of collaborative efforts, often across disciplinary and national boundaries, to improve Findability and Accessibility of global mental health data, as well as efforts to create integrated data resources and tools that improve Interoperability and Reusability. Based on this review, we propose a vision for the future of FAIR global mental health research and suggest practical steps for researchers with regard to study planning, data preservation and indexing, machine-actionable metadata, data reuse to advance science and improve equity, and metrics and recognition.
Affiliation(s)
- Yaara Sadeh
- Center for Injury Research and Prevention, Children’s Hospital of Philadelphia, Philadelphia, PA, USA
- Trauma Data Institute, Lovingston, VA, USA
- Anna Denejkina
- Graduate Research School, Western Sydney University, Penrith, NSW, Australia
- Translational Health Research Institute, Sydney, Australia
- Young and Resilient Research Centre, Sydney, Australia
- Eirini Karyotaki
- Department of Clinical, Neuro- and Developmental Psychology, Vrije Universiteit Amsterdam, Amsterdam, Netherlands
- Amsterdam Public Health Institute, Amsterdam, Netherlands
- Lonneke I. M. Lenferink
- Department of Psychology, Health & Technology, University of Twente, Enschede, Netherlands
- Department of Clinical Psychology, Utrecht University, Utrecht, Netherlands
- Department of Clinical Psychology and Experimental Psychopathology, University of Groningen, Groningen, Netherlands
- Nancy Kassam-Adams
- Center for Injury Research and Prevention, Children’s Hospital of Philadelphia, Philadelphia, PA, USA
- Trauma Data Institute, Lovingston, VA, USA
- Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA
9
de Ridder J. How to trust a scientist. Stud Hist Philos Sci 2022; 93:11-20. PMID: 35247820; DOI: 10.1016/j.shpsa.2022.02.003.
Abstract
Epistemic trust among scientists is inevitable. There are two questions about this: (1) What is the content of this trust, what do scientists trust each other for? (2) Is such trust epistemically justified? I argue that if we assume a traditional answer to (1), namely that scientists trust each other to be reliable informants, then the answer to question (2) is negative, certainly for the biomedical and social sciences. This motivates a different construal of trust among scientists and therefore a different answer to (1): scientists trust each other to only testify to claims that are backed by evidence gathered in accordance with prevailing methodological standards. On this answer, trust among scientists is epistemically justified.
Affiliation(s)
- Jeroen de Ridder
- Department of Philosophy, Vrije Universiteit Amsterdam, the Netherlands
10
Open data and data sharing in articles about COVID-19 published in preprint servers medRxiv and bioRxiv. Scientometrics 2022; 127:2791-2802. PMID: 35370324; PMCID: PMC8956135; DOI: 10.1007/s11192-022-04346-1.
Abstract
This study aimed to analyze the content of data availability statements (DAS) and the actual sharing of raw data in preprint articles about COVID-19. The study combined a bibliometric analysis and a cross-sectional survey. We analyzed preprint articles on COVID-19 published on medRxiv and bioRxiv from January 1, 2020 to March 30, 2020. We extracted data sharing statements and tried to locate raw data when authors indicated they were available. In 2020–2021, we surveyed authors whose articles did not include a DAS, who indicated that data were available on request, or whose manuscript reported that raw data were available in the manuscript but the raw data could not be found. Raw data collected in this study are published on the Open Science Framework (https://osf.io/6ztec/). We analyzed 897 preprint articles. A Data/Code field was present on the preprint server website for 699 (78%) articles, and a data/code sharing statement was reported within the manuscript for 234 (26%) preprints. Of 283 preprints that reported that data were accessible, we found raw data/code for 133 (47%; 15% of all analyzed preprint articles). Most commonly, authors indicated that data were available on GitHub or another clearly specified web location, available on (reasonable) request, or available in the manuscript or its supplementary files. In conclusion, preprint servers should require authors to provide data sharing statements that are included both on the website and in the manuscript, and researchers need education about the meaning of data sharing.
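The availability categories this study reports (a clearly specified web location, available on request, in the manuscript or supplements) lend themselves to simple rule-based screening of data availability statements. A minimal sketch with invented example statements; the patterns are deliberately simplified and are not the study's actual coding scheme:

```python
import re

# Ordered rules: the first matching pattern wins (patterns are illustrative)
RULES = [
    ("repository/URL", re.compile(r"github\.com|osf\.io|zenodo|figshare|https?://", re.I)),
    ("on request", re.compile(r"\b(up)?on\s+(reasonable\s+)?request", re.I)),
    ("in manuscript/supplement", re.compile(r"supplementary|within\s+the\s+(article|manuscript|paper)", re.I)),
]

def classify_das(statement: str) -> str:
    """Assign a data availability statement to the first matching category."""
    for label, pattern in RULES:
        if pattern.search(statement):
            return label
    return "unclear/none"

examples = [
    "All code is available at https://github.com/example/covid-model.",
    "Data are available from the corresponding author upon reasonable request.",
    "All data generated during this study are included in the supplementary files.",
    "Not applicable.",
]
labels = [classify_das(s) for s in examples]
```

Rule order matters: a statement naming both a repository and "on request" is credited with the stronger form of availability because the URL rule is checked first.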
11
12
Measurement practices exacerbate the generalizability crisis: Novel digital measures can help. Behav Brain Sci 2022; 45:e10. PMID: 35139971; DOI: 10.1017/s0140525x21000534.
Abstract
Psychology's tendency to focus on confirmatory analyses before ensuring constructs are clearly defined and accurately measured is exacerbating the generalizability crisis. Our growing use of digital behaviors as predictors has revealed the fragility of subjective measures and the latent constructs they scaffold. However, new technologies can provide opportunities to improve conceptualizations, theories, and measurement practices.
13
Towse AS, Ellis DA, Towse JN. Making data meaningful: guidelines for good quality open data. J Soc Psychol 2021; 161:395-402. PMID: 34292132; DOI: 10.1080/00224545.2021.1938811.