1. Khatri S, Shah A, Yumeen S, Saliba E. Analysis of Dermatology Journal Policy Toward Artificial Intelligence. J Cutan Med Surg 2024. [PMID: 38468122] [DOI: 10.1177/12034754241238709]
Affiliations:
- Surya Khatri: Warren Alpert Medical School of Brown University, Providence, RI, USA
- Asghar Shah: Warren Alpert Medical School of Brown University, Providence, RI, USA
- Sara Yumeen: Department of Dermatology, Warren Alpert Medical School of Brown University, Providence, RI, USA
- Elie Saliba: Department of Dermatology, Warren Alpert Medical School of Brown University, Providence, RI, USA

2. Humphreys K, Calder R, Marsden J, Day E. How Addiction handles disagreements over potentially harmful terminology. Addiction 2023; 118:1833-1834. [PMID: 37489005] [DOI: 10.1111/add.16302]
Affiliations:
- Keith Humphreys: VA Palo Alto Health Care System, Palo Alto, CA, USA; Stanford University, Stanford, CA, USA; National Addiction Centre, Institute of Psychiatry, Psychology and Neuroscience, King's College London, London, UK
- Rob Calder: The Society for the Study of Addiction, Northampton, UK
- John Marsden: National Addiction Centre, Institute of Psychiatry, Psychology and Neuroscience, King's College London, London, UK
- Ed Day: Addiction Psychiatry, Institute for Mental Health, School of Psychology, University of Birmingham, Birmingham, UK

3. Crüwell S, Apthorp D, Baker BJ, Colling L, Elson M, Geiger SJ, Lobentanzer S, Monéger J, Patterson A, Schwarzkopf DS, Zaneva M, Brown NJL. What's in a Badge? A Computational Reproducibility Investigation of the Open Data Badge Policy in One Issue of Psychological Science. Psychol Sci 2023; 34:512-522. [PMID: 36730433] [DOI: 10.1177/09567976221140828]
Abstract
In April 2019, Psychological Science published its first issue in which all Research Articles received the Open Data badge. We used that issue to investigate the effectiveness of this badge, focusing on the adherence to its aim at Psychological Science: sharing both data and code to ensure reproducibility of results. Twelve researchers of varying experience levels attempted to reproduce the results of the empirical articles in the target issue (at least three researchers per article). We found that all 14 articles provided at least some data and six provided analysis code, but only one article was rated to be exactly reproducible, and three were rated as essentially reproducible with minor deviations. We suggest that researchers should be encouraged to adhere to the higher standard in force at Psychological Science. Moreover, a check of reproducibility during peer review may be preferable to the disclosure method of awarding badges.
Affiliations:
- Sophia Crüwell: Meta-Research Innovation Center Berlin (METRIC-B), QUEST Center for Transforming Biomedical Research, Berlin Institute of Health, Charité - Universitätsmedizin Berlin; Department of History and Philosophy of Science, University of Cambridge
- Deborah Apthorp: School of Psychology, University of New England; School of Computing, Australian National University
- Bradley J Baker: Department of Sport and Recreation Management, Temple University
- Malte Elson: Faculty of Psychology, Ruhr University Bochum; Horst Görtz Institute for IT Security, Ruhr University Bochum
- Sandra J Geiger: Environmental Psychology, Department of Cognition, Emotion, and Methods, Faculty of Psychology, University of Vienna
- Jean Monéger: Department of Psychology, University of Poitiers; Research Center on Cognition and Learning, Centre National de la Recherche Scientifique (CNRS) 7295
- Alex Patterson: Sheffield Methods Institute, The University of Sheffield
- D Samuel Schwarzkopf: School of Optometry and Vision Science, University of Auckland; Experimental Psychology, University College London
- Mirela Zaneva: Department of Experimental Psychology, University of Oxford

4. Hardwicke TE, Thibault RT, Kosie JE, Tzavella L, Bendixen T, Handcock SA, Köneke VE, Ioannidis JPA. Post-publication critique at top-ranked journals across scientific disciplines: a cross-sectional assessment of policies and practice. R Soc Open Sci 2022; 9:220139. [PMID: 36039285] [PMCID: PMC9399707] [DOI: 10.1098/rsos.220139]
Abstract
Journals exert considerable control over letters, commentaries and online comments that criticize prior research (post-publication critique). We assessed policies (Study One) and practice (Study Two) related to post-publication critique at 15 top-ranked journals in each of 22 scientific disciplines (N = 330 journals). Two-hundred and seven (63%) journals accepted post-publication critique and often imposed limits on length (median 1000, interquartile range (IQR) 500-1200 words) and time-to-submit (median 12, IQR 4-26 weeks). The most restrictive limits were 175 words and two weeks; some policies imposed no limits. Of 2066 randomly sampled research articles published in 2018 by journals accepting post-publication critique, 39 (1.9%, 95% confidence interval [1.4, 2.6]) were linked to at least one post-publication critique (there were 58 post-publication critiques in total). Of the 58 post-publication critiques, 44 received an author reply, of which 41 asserted that original conclusions were unchanged. Clinical Medicine had the most active culture of post-publication critique: all journals accepted post-publication critique and published the most post-publication critique overall, but also imposed the strictest limits on length (median 400, IQR 400-550 words) and time-to-submit (median 4, IQR 4-6 weeks). Our findings suggest that top-ranked academic journals often pose serious barriers to the cultivation, documentation and dissemination of post-publication critique.
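
As a rough check on the headline estimate above (39 of 2066 sampled articles, 1.9%), a Wilson score interval for a binomial proportion reproduces the reported 95% CI of [1.4, 2.6] after rounding. The abstract does not state which interval method the authors used, so the snippet below is an illustrative sketch rather than their analysis code.

```python
from math import sqrt

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Approximate 95% Wilson score interval for a binomial proportion."""
    p_hat = successes / n
    centre = p_hat + z**2 / (2 * n)
    margin = z * sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2))
    denom = 1 + z**2 / n
    return (centre - margin) / denom, (centre + margin) / denom

# 39 of 2066 sampled articles were linked to at least one post-publication critique.
low, high = wilson_ci(39, 2066)
print(f"{39 / 2066:.1%} (95% CI {low:.1%} to {high:.1%})")  # ~1.9% (1.4% to 2.6%)
```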
Affiliations:
- Tom E. Hardwicke: Department of Psychology, University of Amsterdam, Nieuwe Achtergracht 129-B, 1018 WT Amsterdam, The Netherlands
- Robert T. Thibault: Meta-Research Innovation Center at Stanford (METRICS), Stanford University, Stanford, CA, USA; School of Psychological Science, University of Bristol, Bristol, UK
- Jessica E. Kosie: Department of Psychology, Princeton University, Princeton, NJ, USA
- Theiss Bendixen: Department of the Study of Religion, Aarhus University, Aarhus, Denmark
- Sarah A. Handcock: Florey Department of Neuroscience and Mental Health, University of Melbourne, Melbourne, Australia
- John P. A. Ioannidis: Meta-Research Innovation Center at Stanford (METRICS), Stanford University, Stanford, CA, USA; Departments of Medicine, Epidemiology and Population Health, Biomedical Data Science, and Statistics, Stanford University, Stanford, CA, USA; Meta-Research Innovation Center Berlin (METRIC-B), QUEST Center for Transforming Biomedical Research, Berlin Institute of Health, Charité – Universitätsmedizin Berlin, Berlin, Germany

5. Hardwicke TE, Bohn M, MacDonald K, Hembacher E, Nuijten MB, Peloquin BN, deMayo BE, Long B, Yoon EJ, Frank MC. Analytic reproducibility in articles receiving open data badges at the journal Psychological Science: an observational study. R Soc Open Sci 2021; 8:201494. [PMID: 33614084] [PMCID: PMC7890505] [DOI: 10.1098/rsos.201494]
Abstract
For any scientific report, repeating the original analyses upon the original data should yield the original outcomes. We evaluated analytic reproducibility in 25 Psychological Science articles awarded open data badges between 2014 and 2015. Initially, 16 (64%, 95% confidence interval [43,81]) articles contained at least one 'major numerical discrepancy' (>10% difference) prompting us to request input from original authors. Ultimately, target values were reproducible without author involvement for 9 (36% [20,59]) articles; reproducible with author involvement for 6 (24% [8,47]) articles; not fully reproducible with no substantive author response for 3 (12% [0,35]) articles; and not fully reproducible despite author involvement for 7 (28% [12,51]) articles. Overall, 37 major numerical discrepancies remained out of 789 checked values (5% [3,6]), but original conclusions did not appear affected. Non-reproducibility was primarily caused by unclear reporting of analytic procedures. These results highlight that open data alone is not sufficient to ensure analytic reproducibility.
Affiliations:
- Tom E. Hardwicke: Department of Psychology, University of Amsterdam, Amsterdam, The Netherlands; Meta-Research Innovation Center Berlin (METRIC-B), QUEST Center for Transforming Biomedical Research, Charité – Universitätsmedizin, Berlin, Germany
- Manuel Bohn: Department of Comparative Cultural Psychology, Max Planck Institute for Evolutionary Anthropology, Leipzig, Germany
- Kyle MacDonald: Department of Communication, University of California, Los Angeles, CA, USA
- Emily Hembacher: Department of Psychology, Stanford University, Stanford, CA, USA
- Michèle B. Nuijten: Department of Methodology and Statistics, Tilburg School of Social and Behavioral Sciences, Tilburg University, Tilburg, The Netherlands
- Bria Long: Department of Psychology, Stanford University, Stanford, CA, USA
- Erica J. Yoon: Department of Psychology, Stanford University, Stanford, CA, USA
- Michael C. Frank: Department of Psychology, Stanford University, Stanford, CA, USA

6. Hall W, Darke S, Humphreys K, Marsden J, Neale J, West R. Addiction's policy on publishing effectiveness studies of involuntary treatment of addiction and its variants. Addiction 2020; 115:1795-1796. [PMID: 31840309] [DOI: 10.1111/add.14933]
Affiliations:
- Wayne Hall: University of Queensland, Australia, and King's College London, London, UK
- Shane Darke: University of New South Wales, Sydney, Australia

8. Hardwicke TE, Mathur MB, MacDonald K, Nilsonne G, Banks GC, Kidwell MC, Hofelich Mohr A, Clayton E, Yoon EJ, Henry Tessler M, Lenne RL, Altman S, Long B, Frank MC. Data availability, reusability, and analytic reproducibility: evaluating the impact of a mandatory open data policy at the journal Cognition. R Soc Open Sci 2018; 5:180448. [PMID: 30225032] [PMCID: PMC6124055] [DOI: 10.1098/rsos.180448]
Abstract
Access to data is a critical feature of an efficient, progressive and ultimately self-correcting scientific ecosystem. But the extent to which in-principle benefits of data sharing are realized in practice is unclear. Crucially, it is largely unknown whether published findings can be reproduced by repeating reported analyses upon shared data ('analytic reproducibility'). To investigate this, we conducted an observational evaluation of a mandatory open data policy introduced at the journal Cognition. Interrupted time-series analyses indicated a substantial post-policy increase in data available statements (104/417, 25% pre-policy to 136/174, 78% post-policy), although not all data appeared reusable (23/104, 22% pre-policy to 85/136, 62%, post-policy). For 35 of the articles determined to have reusable data, we attempted to reproduce 1324 target values. Ultimately, 64 values could not be reproduced within a 10% margin of error. For 22 articles all target values were reproduced, but 11 of these required author assistance. For 13 articles at least one value could not be reproduced despite author assistance. Importantly, there were no clear indications that original conclusions were seriously impacted. Mandatory open data policies can increase the frequency and quality of data sharing. However, suboptimal data curation, unclear analysis specification and reporting errors can impede analytic reproducibility, undermining the utility of data sharing and the credibility of scientific findings.
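
The abstract above reports an interrupted time-series analysis of the change in data sharing around the Cognition policy. The authors' exact model specification is not given here; the sketch below shows a generic segmented logistic regression of that kind on simulated article-level data (the variable names, sample size, and the assumed jump from roughly 25% to 75% are illustrative assumptions, not the study's data).

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated article-level data: publication month relative to the policy start,
# and whether the article carried a statement that its data were available.
rng = np.random.default_rng(0)
month = rng.integers(-24, 24, size=600)   # months before/after policy introduction
post = (month >= 0).astype(int)           # 1 if published under the mandatory policy
p = 0.25 + 0.50 * post                    # assumed jump in the sharing rate
data_shared = rng.binomial(1, p)

df = pd.DataFrame({
    "month": month,
    "post": post,
    "months_since_policy": np.where(month >= 0, month, 0),
    "data_shared": data_shared,
})

# Segmented logistic regression: pre-policy trend, level change at the policy,
# and change in trend after the policy.
model = smf.logit("data_shared ~ month + post + months_since_policy", data=df).fit()
print(model.summary())
```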
Affiliations:
- Tom E. Hardwicke: Meta-Research Innovation Center at Stanford (METRICS), Stanford University, Palo Alto, CA, USA
- Maya B. Mathur: Quantitative Sciences Unit, Stanford University, Palo Alto, CA, USA; Harvard Biostatistics, Harvard University, Cambridge, MA, USA
- Kyle MacDonald: Department of Psychology, Stanford University, Palo Alto, CA, USA
- Gustav Nilsonne: Department of Psychology, Stanford University, Palo Alto, CA, USA; Stress Research Institute, Stockholm University, Stockholm, Sweden; Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden
- George C. Banks: Belk College of Business, University of North Carolina at Charlotte, Charlotte, NC, USA
- Alicia Hofelich Mohr: Liberal Arts Technologies and Innovation Services (LATIS), University of Minnesota, Minneapolis, MN, USA
- Elizabeth Clayton: The Organizational Science Program, University of North Carolina at Charlotte, Charlotte, NC, USA
- Erica J. Yoon: Department of Psychology, Stanford University, Palo Alto, CA, USA
- Richie L. Lenne: Department of Psychology, University of Minnesota, Minneapolis, MN, USA
- Sara Altman: Department of Psychology, Stanford University, Palo Alto, CA, USA
- Bria Long: Department of Psychology, Stanford University, Palo Alto, CA, USA
- Michael C. Frank: Department of Psychology, Stanford University, Palo Alto, CA, USA

9.
Abstract
BACKGROUND: In biomedical research, there have been numerous scandals highlighting conflicts of interest (COIs) leading to significant bias in judgment and questionable practices. Academic institutions, journals, and funding agencies have developed and enforced policies to mitigate issues related to COI, especially surrounding financial interests. After a case of editorial COI in a prominent bioethics journal, there is concern that the same level of oversight regarding COIs in the biomedical sciences may not apply to the field of bioethics. In this study, we examined the availability and comprehensiveness of COI policies for authors, peer reviewers, and editors of bioethics journals.
METHODS: After developing a codebook, we analyzed the content of online COI policies of 63 bioethics journals, along with policy information provided by journal editors that was not publicly available.
RESULTS: Just over half of the bioethics journals had COI policies for authors (57%), and only 25% for peer reviewers and 19% for editors. There was significant variation among policies regarding definitions, the types of COIs described, the management mechanisms, and the consequences for noncompliance. Definitions and descriptions centered on financial COIs, followed by personal and professional relationships. Almost all COI policies required disclosure of interests for authors as the primary management mechanism. Very few journals outlined consequences for noncompliance with COI policies or provided additional resources.
CONCLUSION: Compared to other studies of biomedical journals, a much lower percentage of bioethics journals have COI policies, and these vary substantially in content. The bioethics publishing community needs to develop robust policies for authors, peer reviewers, and editors, and these should be made publicly available to enhance academic and public trust in bioethics scholarship.
Affiliations:
- Zubin Master: Biomedical Ethics Research Program, Mayo Clinic, 200 First Street SW, Rochester, MN 55905, USA
- Kelly Werner: Cohen Children's Medical Center of New York, Northwell Health, 276-01 76th Ave., New Hyde Park, NY 11040, USA
- Elise Smith: National Institute of Environmental Health Sciences, National Institutes of Health, Box 12233, Mail Drop E1-06, Research Triangle Park, NC 27709, USA
- David B. Resnik: National Institute of Environmental Health Sciences, National Institutes of Health, Box 12233, Mail Drop E1-06, Research Triangle Park, NC 27709, USA
- Bryn Williams-Jones: Department of Social and Preventive Medicine, School of Public Health, University of Montreal, Montreal, Canada

10. Johnson JN, Hanson KA, Jones CA, Grandhi R, Guerrero J, Rodriguez JS. Data Sharing in Neurosurgery and Neurology Journals. Cureus 2018; 10:e2680. [PMID: 30050735] [PMCID: PMC6059521] [DOI: 10.7759/cureus.2680]
Abstract
In this era of high health care costs and limited research resources, open access to de-identified clinical research study data may promote increased scientific transparency and rigor, allow for the combination and re-analysis of similar data sets, and decrease unnecessary replication of unpublished negative studies. Driven by expanded computing capabilities, advocacy for data sharing to maximize research value is growing in both translational and clinical research communities. The focus of this study is to report on the current status of publicly available research data from studies published in the top 40 neurology and neurosurgery clinical research journals by impact factor. The top journals were carefully reviewed for data sharing policies. Of the journals with data sharing policies, the 10 most current original research papers from December 2015 to February 2016 were reviewed for data sharing statements and data availability. A data sharing policy existed for 48% (19/40) of the 40 journals investigated. Of the 19 journals with an existing data sharing policy, 58% (11/19) of the policies stated that data should be made available to interested parties upon request, and 21% (4/19) of these journals encouraged authors to provide a data sharing statement in the article describing what data would be available upon request. Of the 190 articles reviewed for data availability, 21% (40/190) included some source data in the results, figures, or supplementary sections. This evaluation highlights opportunities for neurology and neurosurgery investigators and journals to improve access to study data and even publish the data prospectively for the betterment of clinical outcome analysis and patient care.
Affiliations:
- Keith A Hanson: School of Medicine, University of Texas Health Science Center San Antonio, San Antonio, USA
- Caleb A Jones: School of Medicine, University of Texas Health Science Center San Antonio, San Antonio, USA
- Ramesh Grandhi: Department of Neurological Surgery, University of Texas Health Science Center San Antonio, San Antonio, USA
- Jaime Guerrero: School of Medicine, University of Texas Health Science Center San Antonio, San Antonio, USA
- Jesse S Rodriguez: Department of Neurological Surgery, University of Texas Health Science Center San Antonio, San Antonio, USA

11.
Affiliations:
- David B Resnik: National Institute of Environmental Health Sciences, National Institutes of Health, Durham, North Carolina, USA
- Susan A Elmore: National Institute of Environmental Health Sciences, National Toxicology Program, National Institutes of Health, Research Triangle Park, North Carolina, USA

12. Li G, Kamel M, Jin Y, Xu MK, Mbuagbaw L, Samaan Z, Levine MA, Thabane L. Exploring the characteristics, global distribution and reasons for retraction of published articles involving human research participants: a literature survey. J Multidiscip Healthc 2018; 11:39-47. [PMID: 29403283] [PMCID: PMC5779311] [DOI: 10.2147/jmdh.s151745]
Abstract
Aim: Article retraction is a measure taken by journals or authors where there is evidence of research misconduct or error, redundancy, plagiarism or unethical research. Recently, the retraction of scientific publications has been on the rise. In this survey, we aimed to describe the characteristics and distribution of retracted articles and the reasons for retractions.
Methods: We searched retracted articles on the PubMed database and Retraction Watch website from 1980 to February 2016. The primary outcomes were the characteristics and distribution of retracted articles and the reasons for retractions. The secondary outcomes included how article retractions were handled by journals and how to improve journal practices toward article retractions.
Results: We included 1,339 retracted articles. Most retracted articles had six authors or fewer. Article retraction was most common in the USA (26%), Japan (11%) and Germany (10%). The main reasons for article retraction were misconduct (51%, n = 685) and error (14%, n = 193). Male senior or corresponding authors accounted for 66% (n = 889) of retracted articles. Of the articles retracted after August 2010, 63% (n = 567) were reported on Retraction Watch. Large discrepancies were observed in the ways that different journals handled article retractions. For instance, articles were completely withdrawn from some journals, while in others, articles were still available with no indication of retraction. Likewise, some retraction notices included a detailed account of the events that led to article retraction, while others only consisted of a statement indicating the article retraction.
Conclusion: The characteristics, geographic distribution and reasons for retraction of published articles involving human research participants were examined in this survey. More efforts are needed to improve the consistency and transparency of journal practices toward article retractions.
Affiliations:
- Guowei Li: Department of Health Research Methods, Evidence, and Impact; St. Joseph's Healthcare Hamilton; Centre for Evaluation of Medicines, Programs for Assessment of Technology in Health Research Institute
- Mariam Kamel: Department of Health Research Methods, Evidence, and Impact
- Yanling Jin: Department of Health Research Methods, Evidence, and Impact
- Lawrence Mbuagbaw: Department of Health Research Methods, Evidence, and Impact; St. Joseph's Healthcare Hamilton
- Zainab Samaan: Department of Health Research Methods, Evidence, and Impact; Department of Medicine, McMaster University, Hamilton, ON, Canada
- Mitchell AH Levine: Department of Health Research Methods, Evidence, and Impact; St. Joseph's Healthcare Hamilton; Centre for Evaluation of Medicines, Programs for Assessment of Technology in Health Research Institute; Department of Medicine, McMaster University, Hamilton, ON, Canada
- Lehana Thabane: Department of Health Research Methods, Evidence, and Impact; St. Joseph's Healthcare Hamilton