51
Shamsoddin E, Janani L, Ghamari K, Kabiri P, Shamsi Gooshki E, Mesgarpour B. Psychometric properties of Persian version of the research misconduct questionnaire (PRMQ). J Med Ethics Hist Med 2020; 13:18. [PMID: 33552451] [PMCID: PMC7838887] [DOI: 10.18502/jmehm.v13i18.4826]
Abstract
Assessment of scientific misconduct is an increasingly important topic in the medical sciences. A definition of scientific research misconduct and practical methods for evaluating and measuring it across the various fields of medicine are needed. This study aimed to assess the psychometric properties of the Scientific Misconduct Questionnaire-Revised (SMQ-R) and the Publication Pressure Questionnaire (PPQ). After translation and merging of these two questionnaires, the validity of the translated draft was evaluated by an 11-member expert panel using the Content Validity Index (CVI) and Content Validity Ratio (CVR). The reliability of the final questionnaire, completed by 100 participants randomly chosen from among medical academic staff, was assessed by calculating Cronbach's alpha coefficients. The final version, named the Persian Research Misconduct Questionnaire (PRMQ), consisted of 63 items. The item-level content validity indices of 61 questions were above 0.79, and the reliability assessment showed that 6 of the 7 subscales had alpha values higher than 0.6. Hence, the PRMQ can be considered an acceptable, valid and reliable tool for measuring research misconduct in biomedical research in Iran.
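The CVR, CVI and Cronbach's alpha statistics quoted in this abstract follow standard psychometric formulas. As a sketch (these are the textbook definitions of Lawshe's CVR and Cronbach's alpha, not necessarily the authors' exact computation; the function names are illustrative):

```python
import numpy as np

def content_validity_ratio(n_essential: int, n_panelists: int) -> float:
    """Lawshe's CVR = (ne - N/2) / (N/2), where ne is the number of
    panelists rating an item 'essential' out of N panelists."""
    half = n_panelists / 2
    return (n_essential - half) / half

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# With an 11-member panel (as in the study), 10 'essential' ratings give
# CVR = (10 - 5.5) / 5.5 ≈ 0.82
print(round(content_validity_ratio(10, 11), 2))
```

An item-level CVI is simply the proportion of panelists rating an item relevant; with an 11-member panel, the 0.79 threshold quoted above corresponds to at least 9 of 11 positive ratings (9/11 ≈ 0.82, while 8/11 ≈ 0.73 falls below it).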
Affiliation(s)
- Erfan Shamsoddin
- Research Assistant, National Institute for Medical Research Development (NIMAD), Tehran, Iran
- Leila Janani
- Associate Professor, Department of Biostatistics, School of Public Health, Iran University of Medical Sciences, Tehran, Iran
- Kiandokht Ghamari
- Research Assistant, National Institute for Medical Research Development (NIMAD), Tehran, Iran
- Payam Kabiri
- Senior Research Fellow, Department of Biostatistics and Epidemiology, School of Public Health, Tehran University of Medical Sciences, Tehran, Iran
- Ehsan Shamsi Gooshki
- Assistant Professor, Medical Ethics and History of Medicine Research Center, Tehran University of Medical Sciences, Tehran, Iran; Department of Medical Ethics, School of Medicine, Tehran University of Medical Sciences, Tehran, Iran
- Bita Mesgarpour
- Assistant Professor, National Institute for Medical Research Development (NIMAD), Tehran, Iran
52
Affiliation(s)
- Rachel Hackett
- The Company of Biologists, Bidder Building, Station Road, Cambridge CB24 9LF, UK
- Steven Kelly
- Department of Plant Sciences, University of Oxford, South Parks Road, Oxford OX1 3RB, UK
53
Davies C, Hamilton OKL, Hooley M, Ritakari TE, Stevenson AJ, Wheater ENW. Translational neuroscience: the state of the nation (a PhD student perspective). Brain Commun 2020; 2:fcaa038. [PMID: 32671338] [PMCID: PMC7331125] [DOI: 10.1093/braincomms/fcaa038]
Abstract
Many brain disorders are currently untreatable. It has been suggested that taking a 'translational' approach to neuroscientific research might change this. We discuss what 'translational neuroscience' is and argue for the need to expand the traditional translational model if we are to make further advances in treating brain disorders.
Affiliation(s)
- Caitlin Davies
- Wellcome Trust 4-Year PhD in Translational Neuroscience, University of Edinburgh, Edinburgh, UK; Centre for Discovery Brain Sciences, University of Edinburgh, Hugh Robson Building, 15 George Square, Edinburgh, EH8 9XD, UK; Dementia Research Institute, University of Edinburgh, Chancellor's Building, 49 Little France Crescent, Edinburgh, EH16 4SB, UK
- Olivia K L Hamilton
- Wellcome Trust 4-Year PhD in Translational Neuroscience, University of Edinburgh, Edinburgh, UK; Dementia Research Institute, University of Edinburgh, Chancellor's Building, 49 Little France Crescent, Edinburgh, EH16 4SB, UK; Centre for Clinical Brain Sciences, University of Edinburgh, Chancellor's Building, 49 Little France Crescent, Edinburgh, EH16 4SB, UK
- Monique Hooley
- Wellcome Trust 4-Year PhD in Translational Neuroscience, University of Edinburgh, Edinburgh, UK; Centre for Discovery Brain Sciences, University of Edinburgh, Hugh Robson Building, 15 George Square, Edinburgh, EH8 9XD, UK; Dementia Research Institute, University of Edinburgh, Chancellor's Building, 49 Little France Crescent, Edinburgh, EH16 4SB, UK
- Tuula E Ritakari
- Wellcome Trust 4-Year PhD in Translational Neuroscience, University of Edinburgh, Edinburgh, UK; Dementia Research Institute, University of Edinburgh, Chancellor's Building, 49 Little France Crescent, Edinburgh, EH16 4SB, UK; Centre for Clinical Brain Sciences, University of Edinburgh, Chancellor's Building, 49 Little France Crescent, Edinburgh, EH16 4SB, UK
- Anna J Stevenson
- Wellcome Trust 4-Year PhD in Translational Neuroscience, University of Edinburgh, Edinburgh, UK; Centre for Discovery Brain Sciences, University of Edinburgh, Hugh Robson Building, 15 George Square, Edinburgh, EH8 9XD, UK; Centre for Genomic and Experimental Medicine, Institute of Genetics and Molecular Medicine, Crewe Road, Edinburgh, EH4 2XU, UK
- Emily N W Wheater
- Wellcome Trust 4-Year PhD in Translational Neuroscience, University of Edinburgh, Edinburgh, UK; Centre for Clinical Brain Sciences, University of Edinburgh, Chancellor's Building, 49 Little France Crescent, Edinburgh, EH16 4SB, UK; Centre for Reproductive Health, Queen's Medical Research Institute, Little France Drive, Edinburgh, EH16 4TJ, UK
54
Biagioli M. Times of Fraud. Trends in Chemistry 2020. [DOI: 10.1016/j.trechm.2020.02.008]
55
Discussion: Photographic and Video Deepfakes Have Arrived: How Machine Learning May Influence Plastic Surgery. Plast Reconstr Surg 2020; 145:1087-1088. [PMID: 32221239] [DOI: 10.1097/prs.0000000000006698]
56
Dal-Ré R. Analysis of biomedical Spanish articles retracted between 1970 and 2018. Med Clin (Barc) 2020; 154:125-130. [PMID: 31239080] [DOI: 10.1016/j.medcli.2019.04.018]
Abstract
BACKGROUND AND OBJECTIVE Analysing articles retracted due to irregularities by their authors helps to determine the state of scientific integrity of a discipline or country. The Retraction Watch (RW) database is the largest worldwide database of retracted articles. The objective was to determine the reasons for and features of retracted biomedical articles by Spanish authors. MATERIAL AND METHODS A search was conducted in the RW database for 7 types of scientific articles from 9 biomedical disciplines (biology, genetics, medicine, microbiology, neurosciences, nutrition, dentistry, public health and toxicology), with at least one author working in a Spanish centre, published between 1970 and 2018. The features of the articles and the reasons for their retraction were recorded. RESULTS Of the 18,621 retracted articles, 217 (1%) were by Spanish authors; 155 (74%) were on biomedicine and of the article types of interest. In most cases, there were several reasons for retracting an article. Research misconduct (fabrication, falsification, plagiarism) and duplication were involved in 25% and 35% of the cases, respectively. Twenty-two percent of the articles were retracted due to errors by the authors or the journals. A single dentist retracted 18 articles (all from the same journal and in the same year, 2018), which accounts for 12% of all retracted biomedical articles. CONCLUSION The number of retracted biomedical articles by Spanish authors is low. Research misconduct was a frequent reason, with a similar percentage of articles retracted due to honest errors.
Affiliation(s)
- Rafael Dal-Ré
- Unidad de Epidemiología, Instituto de Investigación Sanitaria-Fundación Jiménez Díaz, Hospital Universitario, Universidad Autónoma de Madrid, Madrid, Spain
57
Siler K. Demarcating spectrums of predatory publishing: Economic and institutional sources of academic legitimacy. J Assoc Inf Sci Technol 2020. [DOI: 10.1002/asi.24339]
Affiliation(s)
- Kyle Siler
- Science Policy Research Unit, University of Sussex, Brighton, UK
58
Miyakawa T. No raw data, no science: another possible source of the reproducibility crisis. Mol Brain 2020; 13:24. [PMID: 32079532] [PMCID: PMC7033918] [DOI: 10.1186/s13041-020-0552-2]
Abstract
A reproducibility crisis is a situation in which many scientific studies cannot be reproduced. Inappropriate practices of science, such as HARKing, p-hacking, and selective reporting of positive results, have been suggested as causes of irreproducibility. In this editorial, I propose that a lack of raw data or data fabrication is another possible cause of irreproducibility. As Editor-in-Chief of Molecular Brain, I have handled 180 manuscripts since early 2017 and have made 41 editorial decisions categorized as "Revise before review," requesting that the authors provide raw data. Surprisingly, among those 41 manuscripts, 21 were withdrawn without the raw data being provided, indicating that requiring raw data drove away more than half of the manuscripts. I rejected 19 of the remaining 20 manuscripts because of insufficient raw data. Thus, more than 97% of the 41 manuscripts did not present the raw data supporting their results when requested by an editor, suggesting the possibility that the raw data did not exist from the beginning, at least in some portion of these cases. Considering that any scientific study should be based on raw data, and that data storage space should no longer be a challenge, journals should, in principle, have their authors make raw data publicly available in a public database or on the journal's site upon publication of the paper, to increase the reproducibility of published results and public trust in science.
Affiliation(s)
- Tsuyoshi Miyakawa
- Division of Systems Medical Science, Institute for Comprehensive Medical Science, Fujita Health University, Toyoake, Aichi, 470-1192, Japan.
59
Moritz CP. 40 years Western blotting: A scientific birthday toast. J Proteomics 2020; 212:103575. [DOI: 10.1016/j.jprot.2019.103575]
60
Dal-Ré R, Bouter LM, Cuijpers P, Gluud C, Holm S. Should research misconduct be criminalized? Research Ethics 2020. [DOI: 10.1177/1747016119898400]
Abstract
For more than 25 years, research misconduct (research fraud) has been defined as fabrication, falsification, or plagiarism (FFP), although other research misbehaviors have also been added to codes of conduct and legislation. A critical issue in deciding whether research misconduct should be subject to criminal law is its definition, because not all behaviors labeled as research misconduct qualify as serious crime. But the assumption that all FFP is fraud and that all non-FFP is not is far from obvious. In addition, new research misbehaviors have recently been described, such as prolific authorship and fake peer review, or have been boosted, such as duplication of images. The scientific community has been largely successful in keeping criminal law away from cases of research misconduct. Alleged cases of research misconduct are usually looked into by committees of scientists, usually from the same institution or university as the suspected offender, in a process that often lacks transparency. Few countries have, or plan to introduce, independent bodies to address research misconduct; so, for the coming years, most universities and research institutions will continue handling alleged research misconduct cases with their own procedures. A global operationalization of research misconduct with clear boundaries and clear criteria would be helpful. There is room for improvement in reaching global clarity on what research misconduct is, how allegations should be handled, and which sanctions are appropriate.
Affiliation(s)
- Rafael Dal-Ré
- Epidemiology Unit, Health Research Institute-Fundación Jiménez Díaz University Hospital, Universidad Autónoma de Madrid, Madrid, Spain
- Lex M Bouter
- Department of Epidemiology and Biostatistics, Amsterdam University Medical Centers, location VUmc, and Department of Philosophy, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Pim Cuijpers
- Department of Clinical, Neuro and Developmental Psychology, Amsterdam Public Health Research Institute, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Christian Gluud
- The Copenhagen Trial Unit, Centre for Clinical Intervention Research, Rigshospitalet, Copenhagen University Hospital, Copenhagen, Denmark
- Søren Holm
- Centre for Social Ethics and Policy, School of Law, Williamson Building, University of Manchester, Manchester, UK
61
Biagioli M. Before and After Photoshop: Recursive Fraud in the Age of Digital Reproducibility. Angew Chem Int Ed Engl 2019; 58:16334-16335. [PMID: 31568606] [DOI: 10.1002/anie.201908646]
Abstract
"Word-processing software, SCIgen, or Photoshop have increased fraudulent output, but … they have also changed the very form of fraud. Photoshop and similar tools are thus enabling something that is categorically new: recursive fraud, that is, the serial digital reproduction and dissemination of more fraud." Read more in the Guest Editorial by M. Biagioli.
Affiliation(s)
- Mario Biagioli
- University of California Los Angeles, Dept of Communication and School of Law, UCLA School of Law, 385 Charles E. Young Drive East, 1242 Law Building, Los Angeles, CA, 90095, USA
62
Biagioli M. Vor und nach Photoshop: rekursiver Betrug im Zeitalter digitaler Reproduzierbarkeit [Before and After Photoshop: Recursive Fraud in the Age of Digital Reproducibility]. Angew Chem 2019. [DOI: 10.1002/ange.201908646]
Affiliation(s)
- Mario Biagioli
- University of California Los Angeles, Dept of Communication and School of Law, UCLA School of Law, 385 Charles E. Young Drive East, 1242 Law Building, Los Angeles, CA 90095, USA
63
64
Fanelli D, Costas R, Fang FC, Casadevall A, Bik EM. Testing Hypotheses on Risk Factors for Scientific Misconduct via Matched-Control Analysis of Papers Containing Problematic Image Duplications. Sci Eng Ethics 2019; 25:771-789. [PMID: 29460082] [PMCID: PMC6591179] [DOI: 10.1007/s11948-018-0023-7]
Abstract
It is commonly hypothesized that scientists are more likely to engage in data falsification and fabrication when they are subject to pressures to publish, when they are not restrained by forms of social control, when they work in countries lacking policies to tackle scientific misconduct, and when they are male. Evidence to test these hypotheses, however, is inconclusive due to the difficulties of obtaining unbiased data. Here we report a pre-registered test of these four hypotheses, conducted on papers that were identified in a previous study as containing problematic image duplications through a systematic screening of the journal PLoS ONE. Image duplications were classified into three categories based on their complexity, with category 1 being most likely to reflect unintentional error and category 3 being most likely to reflect intentional fabrication. We tested multiple parameters connected to the hypotheses above with a matched-control paradigm, by collecting two controls for each paper containing duplications. Category 1 duplications were mostly not associated with any of the parameters tested, as was predicted based on the assumption that these duplications were mostly not due to misconduct. Categories 2 and 3, however, exhibited numerous statistically significant associations. Results of univariable and multivariable analyses support the hypotheses that academic culture, peer control, cash-based publication incentives and national misconduct policies might affect scientific integrity. No clear support was found for the "pressures to publish" hypothesis. Female authors were found to be equally likely to publish duplicated images compared to males. Country-level parameters generally exhibited stronger effects than individual-level parameters, because developing countries were significantly more likely to produce problematic image duplications. This suggests that promoting good research practices in all countries should be a priority for the international research integrity agenda.
Affiliation(s)
- Daniele Fanelli
- Department of Methodology, London School of Economics and Political Science, Columbia House, London, WC2A 2AE, UK
- Rodrigo Costas
- Centre for Science and Technology Studies (CWTS), Leiden University, P.O. Box 905, 2300 AX, Leiden, The Netherlands
- Ferric C Fang
- Departments of Laboratory Medicine and Microbiology, University of Washington School of Medicine, Seattle, WA, 98195, USA
- Arturo Casadevall
- Department of Molecular Microbiology and Immunology, Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, 21205, USA
65
Hamilton DG. Continued Citation of Retracted Radiation Oncology Literature—Do We Have a Problem? Int J Radiat Oncol Biol Phys 2019; 103:1036-1042. [DOI: 10.1016/j.ijrobp.2018.11.014]
66
Williams CL, Casadevall A, Jackson S. Figure errors, sloppy science, and fraud: keeping eyes on your data. J Clin Invest 2019; 129:1805-1807. [PMID: 30907748] [DOI: 10.1172/jci128380]
Abstract
Recent reports suggest that there has been an increase in the number of retractions and corrections of published articles due to post-publication detection of problematic data. Moreover, fraudulent data and sloppy science have long-term effects on the scientific literature and subsequent projects based on false and unreproducible claims. At the JCI, we have introduced several data screening checks for manuscripts prior to acceptance in an attempt to reduce the number of post-publication corrections and retractions, with the ultimate goal of increasing confidence in the papers we publish.
67
Abstract
Although a case can be made for rewarding scientists for risky, novel science rather than for incremental, reliable science, novelty without reliability ceases to be science. The currently available evidence suggests that the most prestigious journals are no better at detecting unreliable science than other journals. In fact, some of the most convincing studies show a negative correlation, with the most prestigious journals publishing the least reliable science. With the credibility of science increasingly under siege, how much longer can we afford to reward novelty at the expense of reliability? Here, I argue for replacing the legacy journals with a modern information infrastructure that is governed by scholars. This infrastructure would allow renewed focus on scientific reliability, with improved sort, filter, and discovery functionalities, at massive cost savings. If these savings were invested in additional infrastructure for research data and scientific code and/or software, scientific reliability would receive additional support, and funding woes (for, e.g., biological databases) would be a concern of the past.
Affiliation(s)
- Björn Brembs
- Universität Regensburg, Institut für Zoologie, Neurogenetik, Regensburg, Germany
68
Bik EM, Fang FC, Kullas AL, Davis RJ, Casadevall A. Analysis and Correction of Inappropriate Image Duplication: the Molecular and Cellular Biology Experience. Mol Cell Biol 2018; 38:e00309-18. [PMID: 30037982] [PMCID: PMC6168979] [DOI: 10.1128/mcb.00309-18]
Abstract
We analyzed 960 papers published in Molecular and Cellular Biology (MCB) from 2009 to 2016 and found 59 (6.1%) to contain inappropriately duplicated images. The 59 instances of inappropriate image duplication led to 41 corrections, 5 retractions, and 13 instances in which no action was taken. Our experience suggests that the majority of inappropriate image duplications result from errors during figure preparation that can be remedied by correction. Nevertheless, ∼10% of papers with inappropriate image duplications in MCB were retracted (∼0.5% of total). If this proportion is representative, then as many as 35,000 papers in the literature are candidates for retraction due to inappropriate image duplication. The resolution of inappropriate image duplication concerns after publication required an average of 6 h of journal staff time per published paper. MCB instituted a pilot program to screen images of accepted papers prior to publication that identified 12 manuscripts (14.5% out of 83) with image concerns in 2 months. The screening and correction of papers before publication required an average of 30 min of staff time per problematic paper. Image screening can identify papers with problematic images prior to publication, reduces postpublication problems, and requires less staff time than the correction of problems after publication.
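The abstract's extrapolation can be checked with simple arithmetic. A small sketch (the size of the total literature is not stated in the abstract; the final line merely back-calculates the corpus size implied by the 35,000 figure):

```python
papers_screened = 960   # MCB papers published 2009-2016
duplications = 59       # papers with inappropriate image duplication
retractions = 5         # duplication cases that ended in retraction

dup_rate = duplications / papers_screened        # ≈0.0615 -> the 6.1% figure
retract_given_dup = retractions / duplications   # ≈0.085  -> "~10%" of duplications
retract_rate = retractions / papers_screened     # ≈0.0052 -> "~0.5% of total"

# Corpus size implied by "as many as 35,000 papers ... candidates for retraction"
implied_corpus = 35_000 / retract_rate           # 6,720,000 papers

print(f"{dup_rate:.1%} {retract_given_dup:.1%} {retract_rate:.2%} {implied_corpus:,.0f}")
```

The back-calculated corpus of roughly 6.7 million papers is an inference from the abstract's own numbers, not a figure the authors state.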
Affiliation(s)
- Ferric C Fang
- Department of Laboratory Medicine, University of Washington, Seattle, Washington, USA
- Department of Microbiology, University of Washington, Seattle, Washington, USA
- Amy L Kullas
- Journals Department, American Society for Microbiology (ASM), Washington, DC, USA
- Roger J Davis
- Howard Hughes Medical Institute and Program in Molecular Medicine, University of Massachusetts Medical School, Worcester, Massachusetts, USA
- Arturo Casadevall
- Department of Molecular Microbiology and Immunology, Johns Hopkins School of Public Health, Baltimore, Maryland, USA
69
Casadevall A, Fang FC. Making the scientific literature fail-safe. J Clin Invest 2018; 128:4243-4244. [PMID: 30179223] [PMCID: PMC6159988] [DOI: 10.1172/jci123884]
Affiliation(s)
- Arturo Casadevall
- Department of Molecular Microbiology and Immunology, Johns Hopkins School of Public Health, Baltimore, Maryland, USA
| | - Ferric C. Fang
- Departments of Laboratory Medicine and Microbiology, University of Washington, Seattle, Washington, USA
70
Dal-Ré R. How to improve the integrity of clinical trial articles. Rev Psiquiatr Salud Ment 2018; 11:189-191. [PMID: 29625891] [DOI: 10.1016/j.rpsm.2018.02.002]
Affiliation(s)
- Rafael Dal-Ré
- Unidad de Epidemiología, Instituto de Investigación Sanitaria-Hospital Universitario Fundación Jiménez Díaz, Universidad Autónoma de Madrid, Madrid, Spain
71
Publication pressure and scientific misconduct: why we need more open governance. Spinal Cord 2018; 56:821-822. [PMID: 30194444] [DOI: 10.1038/s41393-018-0193-9]
72
Christopher J. Systematic fabrication of scientific images revealed. FEBS Lett 2018; 592:3027-3029. [DOI: 10.1002/1873-3468.13201]
Affiliation(s)
- Jana Christopher
- FEBS Letters Editorial Office, Heidelberg University Biochemistry Center, Germany
73
Barnett AG, Zardo P, Graves N. Randomly auditing research labs could be an affordable way to improve research quality: A simulation study. PLoS One 2018; 13:e0195613. [PMID: 29649314] [PMCID: PMC5896971] [DOI: 10.1371/journal.pone.0195613]
Abstract
The "publish or perish" incentive drives many researchers to increase the quantity of their papers at the cost of quality. Lowering quality increases the number of false positive errors, a key cause of the reproducibility crisis. We adapted a previously published simulation of the research world in which labs that produce many papers are more likely to have "child" labs that inherit their characteristics. This selection creates a competitive spiral that favours quantity over quality. To try to halt the competitive spiral we added random audits that could detect and remove labs with a high proportion of false positives, and also improved the behaviour of "child" and "parent" labs, who increased their effort and so lowered their probability of making a false positive error. Without auditing, only 0.2% of simulations avoided the competitive spiral, defined by a convergence to the highest possible false positive probability. Auditing 1.35% of papers avoided the competitive spiral in 71% of simulations, and auditing 1.94% of papers in 95% of simulations. Audits worked best when they were applied only to established labs with 50 or more papers rather than to labs with 25 or more papers. Adding a ±20% random error to the number of false positives, to simulate peer reviewer error, did not reduce the audits' efficacy. The main benefit of the audits came via the increase in effort in "child" and "parent" labs. Audits improved the literature by reducing the number of false positives from 30.2 per 100 papers to 12.3 per 100 papers. Auditing 1.94% of papers would cost an estimated $15.9 million per year if applied to papers produced by National Institutes of Health funding. Our simulation greatly simplifies the research world, and there are many unanswered questions about whether and how audits would work that can only be addressed by a trial of an audit.
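The mechanism described, selection for quantity producing a drift towards high false-positive rates that audits can interrupt, can be illustrated with a toy model. This is a drastically simplified sketch, not the authors' published simulation: the lab behaviour, inheritance rule, audit rule, and every parameter below are invented for illustration.

```python
import random

def mean_false_positive_rate(audit_prob: float, seed: int = 42,
                             n_labs: int = 50, steps: int = 2000) -> float:
    """Toy evolutionary model: labs with higher false-positive (FP) rates
    'publish more' and are preferentially copied into child labs; audits
    occasionally inspect a random lab and reset it if its FP rate is high."""
    rng = random.Random(seed)
    labs = [rng.uniform(0.05, 0.30) for _ in range(n_labs)]  # FP probabilities
    for _ in range(steps):
        # selection: the most 'productive' (highest-FP) of 3 random labs
        # seeds a child lab that replaces a randomly chosen existing lab
        parent = max(rng.sample(labs, 3))
        child = min(0.30, max(0.0, parent + rng.gauss(0.0, 0.01)))
        labs[rng.randrange(n_labs)] = child
        # audit: with some probability, inspect one lab; if its FP rate
        # exceeds a threshold, replace it with a careful low-FP lab
        if rng.random() < audit_prob:
            i = rng.randrange(n_labs)
            if labs[i] > 0.15:
                labs[i] = 0.05
    return sum(labs) / n_labs

# Without audits the population drifts towards the FP ceiling;
# audits hold the mean false-positive rate down.
no_audit = mean_false_positive_rate(0.0)
audited = mean_false_positive_rate(0.5)
print(no_audit, audited)
```

In this toy setup the unaudited population converges towards the 0.30 ceiling, mirroring the "competitive spiral" above, while even a crude random audit keeps the population mean below it.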
Affiliation(s)
- Adrian G. Barnett
- School of Public Health and Social Work, Queensland University of Technology, Brisbane, Australia
- Pauline Zardo
- Data & Policy, Faculty of Law & Digital Media Research Centre, Queensland University of Technology, Gardens Point, QLD, Australia
- Nicholas Graves
- School of Public Health and Social Work, Queensland University of Technology, Brisbane, Australia
74
Abstract
Journal clubs are important mechanisms for teaching how to approach the scientific literature critically and for disseminating findings. Papers from high-impact journals often dominate journal club selections, a practice that reinforces the unscientific emphasis of placing high value on publishing venue rather than scientific content and critical analysis of the publications. We suggest improving journal clubs by including preprints rather than focusing completely on published papers. This change in practice might benefit the scientific enterprise in numerous ways, including by providing direct criticisms to preprint authors before publication, deemphasizing publishing venue, teaching students the art of reviewing papers, and making journal clubs more current by discussing unpublished data.
Affiliation(s)
- Arturo Casadevall
- Johns Hopkins Bloomberg School of Public Health, Baltimore, Maryland, USA
- Neil Gow
- University of Aberdeen, Aberdeen, United Kingdom
75
Tanaka S. [Recent Progress in Promoting Research Integrity]. Yakugaku Zasshi 2018; 138:477-486. [PMID: 29607992] [DOI: 10.1248/yakushi.17-00181-4]
Abstract
An increasing number of cases of research misconduct and whistle-blowing in the fields of medicine and life sciences has created public concern about research integrity. In Europe and the United States, there has been a large focus on poor reproducibility in life science research, and poor reproducibility is largely associated with research misconduct. Research integrity is equally crucial in the pharmaceutical sciences, which play an important role in medical and life sciences. Individual cases of research misconduct have not been investigated in detail in Japan, because it was generally believed that only researchers with strong or strange personalities would participate in misconduct. However, a better understanding of research misconduct will enable more in-depth discussions about research integrity, which is now known to be closely associated with normal research activities. Here I will introduce information on various contemporary activities being performed to create a sound research environment, drawn from practices in universities, pharmaceutical companies, and government agencies. I will also discuss ways in which individual researchers can promote research integrity.
Affiliation(s)
- Satoshi Tanaka
- Graduate School of Medicine, Dentistry, and Pharmaceutical Sciences, Okayama University
| |
|
76
|
Abstract
Images in scientific papers have been used to support the experimental description and the discussion of the findings for several centuries. In the biomedical sciences in particular, the use of images to depict laboratory results is so widespread that barely any experimental paper is devoid of images documenting the attained results. With the advent of software for digital image manipulation, however, even photographic reproductions of experimental results can be easily altered by researchers, leading to an increasingly high rate of scientific papers containing unreliable images. In this paper I introduce a software pipeline to detect some of the most widespread misbehaviours, running two independent tests on a random set of papers and on the full publishing record of a single journal. The results of these two tests support the feasibility of the software approach and imply an alarming level of image manipulation in the published record.
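The most basic check such a pipeline can run is a pixel-wise comparison of two image regions to flag likely copy-paste duplication. The sketch below is illustrative only: the function names and the threshold are assumptions for this example, not taken from the author's tool.

```python
# Toy pixel-wise duplicate check on small grayscale blocks
# (lists of rows of integer pixel values). Illustrative names/thresholds.

def identical_fraction(block_a, block_b):
    """Fraction of pixel positions where two equally sized blocks agree."""
    total = matches = 0
    for row_a, row_b in zip(block_a, block_b):
        for pa, pb in zip(row_a, row_b):
            total += 1
            if pa == pb:
                matches += 1
    return matches / total if total else 0.0

def looks_duplicated(block_a, block_b, threshold=0.95):
    """Flag two regions as a likely copy-paste if nearly all pixels agree."""
    return identical_fraction(block_a, block_b) >= threshold

# A copied region is flagged; an unrelated region is not.
lane = [[10, 12, 250], [11, 13, 248]]
copy_of_lane = [row[:] for row in lane]
other = [[90, 91, 92], [93, 94, 95]]
print(looks_duplicated(lane, copy_of_lane))  # True
print(looks_duplicated(lane, other))         # False
```

In a real screening tool this comparison would be run over sliding windows across and between figures, which is where the engineering effort (and the false-positive control) lies.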
Affiliation(s)
- Enrico M Bucci
- Temple University, Philadelphia, PA, USA. .,Sbarro Health Research Organization, Philadelphia, PA, USA.
| |
|
77
|
Fanelli D. Opinion: Is science really facing a reproducibility crisis, and do we need it to? Proc Natl Acad Sci U S A 2018; 115:2628-2631. [PMID: 29531051 PMCID: PMC5856498 DOI: 10.1073/pnas.1708272114] [Citation(s) in RCA: 149] [Impact Index Per Article: 24.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022] Open
Abstract
Efforts to improve the reproducibility and integrity of science are typically justified by a narrative of crisis, according to which most published results are unreliable due to growing problems with research and publication practices. This article provides an overview of recent evidence suggesting that this narrative is mistaken, and argues that a narrative of epochal changes and empowerment of scientists would be more accurate, inspiring, and compelling.
Affiliation(s)
- Daniele Fanelli
- Department of Methodology, London School of Economics and Political Science, London WC2A 2AE, United Kingdom
| |
|
78
|
Boutron I, Ravaud P. Misrepresentation and distortion of research in biomedical literature. Proc Natl Acad Sci U S A 2018; 115:2613-2619. [PMID: 29531025 PMCID: PMC5856510 DOI: 10.1073/pnas.1710755115] [Citation(s) in RCA: 130] [Impact Index Per Article: 21.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/12/2022] Open
Abstract
Publication in peer-reviewed journals is an essential step in the scientific process. However, publication is not simply the reporting of facts arising from a straightforward analysis of the data. Authors have broad latitude when writing their reports and may be tempted to consciously or unconsciously "spin" their study findings. Spin has been defined as specific intentional or unintentional reporting that fails to faithfully reflect the nature and range of findings and that could affect the impression the results produce in readers. This article, based on a literature review, reports the various practices of spin, from misreporting by "beautification" of methods to misreporting by misinterpreting the results. It provides data on the prevalence of some forms of spin in specific fields and the possible effects of some types of spin on readers' interpretation and research dissemination. We also discuss why researchers spin their reports and possible ways to avoid it.
Affiliation(s)
- Isabelle Boutron
- Methods of Therapeutic Evaluation Of Chronic Diseases (METHODS) team, INSERM, UMR 1153, Epidemiology and Biostatistics Sorbonne Paris Cité Research Center (CRESS), F-75014 Paris, France;
- Faculté de Médicine, Paris Descartes University, 75006 Paris, France
- Centre d'Épidémiologie Clinique, Hôpital Hôtel Dieu, Assistance Publique des Hôpitaux de Paris, 75004 Paris, France
| | - Philippe Ravaud
- Methods of Therapeutic Evaluation Of Chronic Diseases (METHODS) team, INSERM, UMR 1153, Epidemiology and Biostatistics Sorbonne Paris Cité Research Center (CRESS), F-75014 Paris, France
- Faculté de Médicine, Paris Descartes University, 75006 Paris, France
- Centre d'Épidémiologie Clinique, Hôpital Hôtel Dieu, Assistance Publique des Hôpitaux de Paris, 75004 Paris, France
- Department of Epidemiology, Columbia University Mailman School of Public Health, New York, NY 10032
| |
|
79
|
Kroeger CM, Garza C, Lynch CJ, Myers E, Rowe S, Schneeman BO, Sharma AM, Allison DB. Scientific rigor and credibility in the nutrition research landscape. Am J Clin Nutr 2018; 107:484-494. [PMID: 29566196 PMCID: PMC6248649 DOI: 10.1093/ajcn/nqx067] [Citation(s) in RCA: 21] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/30/2017] [Revised: 11/30/2017] [Accepted: 12/04/2017] [Indexed: 12/27/2022] Open
Abstract
Scientific progress depends on the quality and credibility of research methods. As discourse on rigor, transparency, and reproducibility joins the cacophony of nutrition information and misinformation in mass media, buttressing the real and perceived reliability of nutrition science is more important than ever. This broad topic was the focus of a 2016 plenary session, "Scientific Rigor and Competing Interests in the Nutrition Research Landscape." This article summarizes and expands on this session in an effort to increase understanding and dialogue with regard to factors that limit the real and perceived reliability of nutrition science and steps that can be taken to mitigate those factors. The end goal is to both earn and merit greater trust in nutrition science by both the scientific community and the general public. The authors offer suggestions in each of the domains of education and training, communications, research conduct, and procedures and policies to help achieve this goal. The authors emphasize the need for adequate funding to support these efforts toward greater rigor and transparency, which will be resource demanding and may require either increased research funding or the recognition that a greater proportion of research funding may need to be allocated to these tasks.
Affiliation(s)
- Cynthia M Kroeger
- Department of Epidemiology and Biostatistics, Indiana University School of Public Health-Bloomington, Bloomington, IN
| | | | - Christopher J Lynch
- National Institute of Diabetes and Digestive and Kidney Diseases, NIH, Bethesda, MD
| | | | | | | | | | - David B Allison
- Department of Epidemiology and Biostatistics, Indiana University School of Public Health-Bloomington, Bloomington, IN
| |
|
80
|
Abstract
In which journal a scientist publishes is considered one of the most crucial factors determining their career. The underlying common assumption is that only the best scientists manage to publish in a highly selective tier of the most prestigious journals. However, data from several lines of evidence suggest that the methodological quality of scientific experiments does not increase with increasing rank of the journal. On the contrary, an accumulating body of evidence suggests the inverse: methodological quality and, consequently, reliability of published research works in several fields may be decreasing with increasing journal rank. The data supporting these conclusions circumvent confounding factors such as increased readership and scrutiny for these journals, focusing instead on quantifiable indicators of methodological soundness in the published literature and relying, in part, on semi-automated data extraction from often thousands of publications at a time. Over the last decade, the accumulating evidence has led to the realization that the very existence of scholarly journals, due to their inherent hierarchy, constitutes one of the major threats to publicly funded science: hiring, promoting, and funding scientists who publish unreliable science eventually erodes public trust in science.
Affiliation(s)
- Björn Brembs
- Institute of Zoology-Neurogenetics, Universität Regensburg, Regensburg, Germany
| |
|
81
|
Abstract
There is a growing realization that graduate education in the biomedical sciences is successful at teaching students how to conduct research but falls short in preparing them for a diverse job market, communicating with the public, and remaining versatile scientists throughout their careers. Major problems with graduate level education today include overspecialization in a narrow area of science without a proper grounding in essential critical thinking skills. Shortcomings in education may also contribute to some of the problems of the biomedical sciences, such as poor reproducibility, shoddy literature, and the rise in retracted publications. The challenge is to modify graduate programs such that they continue to generate individuals capable of conducting deep research while at the same time producing more broadly trained scientists without lengthening the time to a degree. Here we describe our first experiences at Johns Hopkins and propose a manifesto for reforming graduate science education.
Affiliation(s)
- Gundula Bosch
- Department of Molecular Microbiology and Immunology, Johns Hopkins Bloomberg School of Public Health, Baltimore, Maryland, USA
| | - Arturo Casadevall
- Department of Molecular Microbiology and Immunology, Johns Hopkins Bloomberg School of Public Health, Baltimore, Maryland, USA
| |
|
82
|
Koppers L, Wormer H, Ickstadt K. Towards a Systematic Screening Tool for Quality Assurance and Semiautomatic Fraud Detection for Images in the Life Sciences. SCIENCE AND ENGINEERING ETHICS 2017; 23:1113-1128. [PMID: 27848190 PMCID: PMC5539263 DOI: 10.1007/s11948-016-9841-7] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 05/06/2016] [Accepted: 10/27/2016] [Indexed: 06/06/2023]
Abstract
The quality and authenticity of images is essential for data presentation, especially in the life sciences. Questionable images may often be a first indicator of questionable results, too. Therefore, a tool that uses mathematical methods to detect suspicious images in large image archives can be a helpful instrument to improve quality assurance in publications. As a first step towards a systematic screening tool, especially for journal editors and other staff members who are responsible for quality assurance, such as laboratory supervisors, we propose a basic classification of image manipulation. Based on this classification, we developed and explored some simple algorithms to detect copied areas in images. Using an artificial image and two examples of previously published modified images, we apply quantitative methods such as pixel-wise comparison, a nearest neighbor and a variance algorithm to detect copied-and-pasted areas or duplicated images. We show that our algorithms are able to detect some simple types of image alteration, such as copying and pasting background areas. The variance algorithm detects not only identical, but also very similar areas that differ only by brightness. Further types could, in principle, be implemented in a standardized scanning routine. We detected the copied areas in a proven case of image manipulation in Germany and showed the similarity of two images in a retracted paper from the Kato labs, which has been widely discussed on sites such as PubPeer and Retraction Watch.
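The brightness-invariant idea behind a variance check can be sketched in a few lines: subtract each block's mean brightness, then test whether the residual difference has (near) zero variance. This is a minimal illustration of the principle; the function names and tolerance are mine, not from the paper's tool.

```python
# Two blocks that differ only by a constant brightness offset have a
# mean-corrected difference of (near) zero variance. Illustrative sketch.

def mean(values):
    return sum(values) / len(values)

def variance_of_difference(block_a, block_b):
    """Variance of (a - b) after removing each block's mean brightness.
    Near zero means the blocks differ only by a constant offset."""
    flat_a = [p for row in block_a for p in row]
    flat_b = [p for row in block_b for p in row]
    ma, mb = mean(flat_a), mean(flat_b)
    diffs = [(a - ma) - (b - mb) for a, b in zip(flat_a, flat_b)]
    md = mean(diffs)
    return mean([(d - md) ** 2 for d in diffs])

def same_up_to_brightness(block_a, block_b, tol=1e-9):
    """Flag blocks that are copies, possibly with adjusted brightness."""
    return variance_of_difference(block_a, block_b) <= tol

band = [[10, 40, 10], [40, 200, 40]]
brighter = [[p + 25 for p in row] for row in band]  # same band, brightened
unrelated = [[5, 5, 5], [5, 5, 200]]
print(same_up_to_brightness(band, brighter))   # True
print(same_up_to_brightness(band, unrelated))  # False
```

Unlike an exact pixel-wise comparison, this check still fires when a copied region has been uniformly lightened or darkened, which matches the behaviour the abstract attributes to the variance algorithm.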
Affiliation(s)
- Lars Koppers
- Department of Statistics, TU Dortmund University, Vogelpothsweg 87, 44227 Dortmund, Germany
| | - Holger Wormer
- Institute for Journalism, TU Dortmund University, Emil-Figge-Straße 50, 44227 Dortmund, Germany
| | - Katja Ickstadt
- Department of Statistics, TU Dortmund University, Vogelpothsweg 87, 44227 Dortmund, Germany
| |
|
83
|
Abstract
The field of microbiology has experienced significant growth due to transformative advances in technology and the influx of scientists driven by a curiosity to understand how microbes sustain myriad biochemical processes that maintain Earth. With this explosion in scientific output, a significant bottleneck has been the ability to rapidly disseminate new knowledge to peers and the public. Preprints have emerged as a tool that a growing number of microbiologists are using to overcome this bottleneck. Posting preprints can help to transparently recruit a more diverse pool of reviewers prior to submitting to a journal for formal peer review. Although the use of preprints is still limited in the biological sciences, early indications are that preprints are a robust tool that can complement and enhance peer-reviewed publications. As publishing moves to embrace advances in Internet technology, there are many opportunities for preprints and peer-reviewed journals to coexist in the same ecosystem.
Affiliation(s)
- Patrick D Schloss
- Department of Microbiology and Immunology, University of Michigan, Ann Arbor, Michigan, USA
| |
|
84
|
47 Years of IAI: a Glance in the Mirror and the Road Ahead. Infect Immun 2017; 85:IAI.00256-17. [PMID: 28438977 DOI: 10.1128/iai.00256-17] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/20/2022] Open
|
85
|
Mullane K, Williams M. Enhancing reproducibility: Failures from Reproducibility Initiatives underline core challenges. Biochem Pharmacol 2017; 138:7-18. [PMID: 28396196 DOI: 10.1016/j.bcp.2017.04.008] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/12/2017] [Accepted: 04/05/2017] [Indexed: 12/20/2022]
Abstract
Efforts to address reproducibility concerns in biomedical research include: initiatives to improve journal publication standards and peer review; increased attention to publishing methodological details that enable experiments to be reconstructed; guidelines on standards for study design, implementation, analysis and execution; meta-analyses of multiple studies within a field to synthesize a common conclusion; and the formation of consortia to adopt uniform protocols and internally reproduce data. Another approach to addressing reproducibility is the Reproducibility Initiative (RI): a well-intended, high-profile, systematically peer-vetted undertaking intended to replace the traditional process of scientific self-correction. Outcomes from the RIs reported to date have called the usefulness of this approach into question, particularly when the RI outcome differs from other independent self-correction studies that have reproduced the original finding. Because a failed RI attempt is a single outcome distinct from the original study, it cannot provide any definitive conclusions; it instead necessitates additional studies that the RI approach has neither the ability nor the intent to conduct, making it a questionable replacement for self-correction. A failed RI attempt also has the potential to damage the reputation of the author of the original finding. Reproduction is frequently confused with replication, an issue that is more than semantic: the former denotes "similarity" and the latter an "exact copy" - an impossible outcome in research because of known and unknown technical, environmental and motivational differences between the original and reproduction studies. To date, the RI framework has negatively impacted efforts to improve reproducibility, confounding attempts to determine whether a research finding is real.
Affiliation(s)
- Kevin Mullane
- Gladstone Institutes, San Francisco, CA, United States
| | - Michael Williams
- Department of Pharmacology, Feinberg School of Medicine, Northwestern University, Chicago, IL, United States.
| |
|
86
|
Striking similarities between publications from China describing single gene knockdown experiments in human cancer cell lines. Scientometrics 2016. [DOI: 10.1007/s11192-016-2209-6] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/17/2022]
|
87
|
Abstract
Proposals to improve the reproducibility of biomedical research have emphasized scientific rigor. Although the word "rigor" is widely used, there has been little specific discussion as to what it means and how it can be achieved. We suggest that scientific rigor combines elements of mathematics, logic, philosophy, and ethics. We propose a framework for rigor that includes redundant experimental design, sound statistical analysis, recognition of error, avoidance of logical fallacies, and intellectual honesty. These elements lead to five actionable recommendations for research education.
|
88
|
Abstract
The American Academy of Microbiology convened a colloquium to discuss problems in the biological sciences, with emphasis on identifying mechanisms to improve the quality of research. Participants from various disciplines made six recommendations: (i) design rigorous and comprehensive evaluation criteria to recognize and reward high-quality scientific research; (ii) require universal training in good scientific practices, appropriate statistical usage, and responsible research practices for scientists at all levels, with training content regularly updated and presented by qualified scientists; (iii) establish open data at the time of publication as the standard operating procedure throughout the scientific enterprise; (iv) encourage scientific journals to publish negative data that meet methodologic standards of quality; (v) agree upon common criteria among scientific journals for retraction of published papers, to provide consistency and transparency; and (vi) strengthen research integrity oversight and training. These recommendations constitute an actionable framework that, in combination, could improve the quality of biological research.
|
89
|
Abstract
This article discusses the responsible conduct of research, questionable research practices, and research misconduct. Responsible conduct of research is often defined in terms of a set of abstract, normative principles, professional standards, and ethics in doing research. In order to accommodate the normative principles of scientific research, the professional standards, and a researcher’s moral principles, transparent research practices can serve as a framework for responsible conduct of research. We suggest a “prune-and-add” project structure to enhance transparency and, by extension, responsible conduct of research. Questionable research practices are defined as practices that are detrimental to the research process. The prevalence of questionable research practices remains largely unknown, and reproducibility of findings has been shown to be problematic. Questionable practices are discouraged by transparent practices because practices that arise from them will become more apparent to scientific peers. Most effective might be preregistrations of research design, hypotheses, and analyses, which reduce particularism of results by providing an a priori research scheme. Research misconduct has been defined as fabrication, falsification, and plagiarism (FFP), which is clearly the worst type of research practice. Despite it being clearly wrong, it can be approached from a scientific and legal perspective. The legal perspective sees research misconduct as a form of white-collar crime. The scientific perspective seeks to answer the following question: “Were results invalidated because of the misconduct?” We review how misconduct is typically detected, how its detection can be improved, and how prevalent it might be. Institutions could facilitate detection of data fabrication and falsification by implementing data auditing. Nonetheless, the effect of misconduct is pervasive: many retracted articles are still cited after the retraction has been issued.
Main points
Researchers systematically evaluate their own conduct as more responsible than that of their colleagues, but not as responsible as they would like.
Transparent practices, facilitated by the Open Science Framework, help embody scientific norms that promote responsible conduct.
Questionable research practices harm the research process and work counter to the generally accepted scientific norms, but are hard to detect.
Research misconduct requires active scrutiny by the research community, because editors and peer reviewers do not pay adequate attention to detecting it. Tips are given on how to improve your detection of potential problems.
|
90
|
Banerjee A. When perfect is the enemy of the good. MEDICAL JOURNAL OF DR. D.Y. PATIL UNIVERSITY 2016. [DOI: 10.4103/0975-2870.194179] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/04/2022] Open
|