1
Carneiro CFD, Queiroz VGS, Moulin TC, Carvalho CAM, Haas CB, Rayêe D, Henshall DE, De-Souza EA, Amorim FE, Boos FZ, Guercio GD, Costa IR, Hajdu KL, van Egmond L, Modrák M, Tan PB, Abdill RJ, Burgess SJ, Guerra SFS, Bortoluzzi VT, Amaral OB. Comparing quality of reporting between preprints and peer-reviewed articles in the biomedical literature. Res Integr Peer Rev 2020; 5:16. [PMID: 33292815; PMCID: PMC7706207; DOI: 10.1186/s41073-020-00101-3]
Abstract
BACKGROUND Preprint usage is growing rapidly in the life sciences; however, questions remain on the relative quality of preprints when compared to published articles. An objective dimension of quality that is readily measurable is completeness of reporting, as transparency can improve the reader's ability to independently interpret data and reproduce findings. METHODS In this observational study, we initially compared independent samples of articles published in bioRxiv and in PubMed-indexed journals in 2016 using a quality of reporting questionnaire. After that, we performed paired comparisons between preprints from bioRxiv to their own peer-reviewed versions in journals. RESULTS Peer-reviewed articles had, on average, higher quality of reporting than preprints, although the difference was small, with absolute differences of 5.0% [95% CI 1.4, 8.6] and 4.7% [95% CI 2.4, 7.0] of reported items in the independent samples and paired sample comparison, respectively. There were larger differences favoring peer-reviewed articles in subjective ratings of how clearly titles and abstracts presented the main findings and how easy it was to locate relevant reporting information. Changes in reporting from preprints to peer-reviewed versions did not correlate with the impact factor of the publication venue or with the time lag from bioRxiv to journal publication. CONCLUSIONS Our results suggest that, on average, publication in a peer-reviewed journal is associated with improvement in quality of reporting. They also show that quality of reporting in preprints in the life sciences is within a similar range as that of peer-reviewed articles, albeit slightly lower on average, supporting the idea that preprints should be considered valid scientific contributions.
Affiliation(s)
- Clarissa F D Carneiro
- Institute of Medical Biochemistry Leopoldo de Meis, Federal University of Rio de Janeiro, Rio de Janeiro, RJ, 21941-902, Brazil
- Victor G S Queiroz
- Institute of Medical Biochemistry Leopoldo de Meis, Federal University of Rio de Janeiro, Rio de Janeiro, RJ, 21941-902, Brazil
- Thiago C Moulin
- Institute of Medical Biochemistry Leopoldo de Meis, Federal University of Rio de Janeiro, Rio de Janeiro, RJ, 21941-902, Brazil
- Carlos A M Carvalho
- Seção de Arbovirologia e Febres Hemorrágicas, Instituto Evandro Chagas, Ananindeua, Pará, Brazil
- Departamento de Patologia, Universidade do Estado do Pará, Belém, Pará, Brazil
- Centro Universitário Metropolitano da Amazônia, Instituto Euro-Americano de Educação, Ciência e Tecnologia, Belém, Pará, Brazil
- Clarissa B Haas
- Departamento de Bioquímica, Instituto de Ciências Básicas da Saúde, Universidade Federal do Rio Grande do Sul, Porto Alegre, Rio Grande do Sul, Brazil
- Danielle Rayêe
- Biomedical Sciences Institute, Federal University of Rio de Janeiro, Rio de Janeiro, Brazil
- Evandro A De-Souza
- Institute of Medical Biochemistry Leopoldo de Meis, Federal University of Rio de Janeiro, Rio de Janeiro, RJ, 21941-902, Brazil
- Felippe E Amorim
- Institute of Medical Biochemistry Leopoldo de Meis, Federal University of Rio de Janeiro, Rio de Janeiro, RJ, 21941-902, Brazil
- Flávia Z Boos
- Programa de Pós-Graduação em Psicobiologia, Universidade Federal de São Paulo, São Paulo, Brazil
- Gerson D Guercio
- Department of Psychiatry, University of Minnesota, Minneapolis, MN, USA
- Igor R Costa
- Institute of Medical Biochemistry Leopoldo de Meis, Federal University of Rio de Janeiro, Rio de Janeiro, RJ, 21941-902, Brazil
- Karina L Hajdu
- Biomedical Sciences Institute, Federal University of Rio de Janeiro, Rio de Janeiro, Brazil
- Martin Modrák
- Institute of Microbiology of the Czech Academy of Sciences, Prague, Czech Republic
- Pedro B Tan
- Biomedical Sciences Institute, Federal University of Rio de Janeiro, Rio de Janeiro, Brazil
- Richard J Abdill
- Department of Genetics, Cell Biology, and Development, University of Minnesota, Minneapolis, MN, USA
- Steven J Burgess
- Carl R Woese Institute for Genomic Biology, University of Illinois at Urbana-Champaign, Champaign, IL, USA
- Sylvia F S Guerra
- Centro Universitário Metropolitano da Amazônia, Instituto Euro-Americano de Educação, Ciência e Tecnologia, Belém, Pará, Brazil
- Seção de Virologia, Instituto Evandro Chagas, Ananindeua, Pará, Brazil
- Departamento de Morfologia e Ciências Fisiológicas, Universidade do Estado do Pará, Belém, Pará, Brazil
- Vanessa T Bortoluzzi
- Departamento de Bioquímica, Instituto de Ciências Básicas da Saúde, Universidade Federal do Rio Grande do Sul, Porto Alegre, Rio Grande do Sul, Brazil
- Olavo B Amaral
- Institute of Medical Biochemistry Leopoldo de Meis, Federal University of Rio de Janeiro, Rio de Janeiro, RJ, 21941-902, Brazil
2
Desai B, Mattingly TJ, van den Broek RWM, Pham N, Frailer M, Yang J, Perfetto EM. Peer Review and Transparency in Evidence-Source Selection in Value and Health Technology Assessment. Value Health 2020; 23:689-696. [PMID: 32540225; DOI: 10.1016/j.jval.2020.01.014]
Abstract
OBJECTIVES Value and health technology assessment (V/HTA) is often used in clinical, access, and reimbursement decisions. V/HTA data-source selection may not be transparent, yet transparency is a necessary element for stakeholder understanding and trust and for fostering accountability among decision makers. Peer review is considered one mechanism for judging data trustworthiness. Our objectives were (1) to use publicly available documentation of V/HTA methods to identify requirements for inclusion of peer-reviewed evidence sources, (2) to compare and contrast US and non-US approaches, and (3) to assess evidence sources used in published V/HTA reports. METHODS Publicly available methods documentation from 11 V/HTA organizations in North America and Europe was manually searched and abstracted for descriptions of requirements and recommendations regarding search strategy and evidence-source selection. The bibliographies of a subset of V/HTA reports published in 2018 were manually abstracted for the evidence-source types used in each. RESULTS Heterogeneity in evidence-source retrieval and selection was observed across all V/HTA organizations, with more pronounced differences between US and non-US organizations. Not all organizations' methods documentation addressed the evidence-source selection process (7 of 11), and few explicitly referenced peer-reviewed sources (3 of 11). Documentation of the evidence-source selection strategy was inconsistent across reports (6 of 13), and the level of detail provided varied across organizations. Some information on evidence-source selection was included only in confidential documentation and was not publicly available. CONCLUSIONS Disparities exist among V/HTA organizations in requirements and guidance regarding evidence-source selection. Standardization of evidence-source selection strategies and documentation could improve V/HTA transparency and has implications for decision making based on report findings.
Affiliation(s)
- Bansri Desai
- University of Maryland, School of Pharmacy, Baltimore, MD, USA
- Ngan Pham
- University of Maryland, School of Pharmacy, Baltimore, MD, USA
- Megan Frailer
- University of Maryland, School of Pharmacy, Baltimore, MD, USA
- Joseph Yang
- University of Maryland, School of Pharmacy, Baltimore, MD, USA
- Eleanor M Perfetto
- University of Maryland, School of Pharmacy, Baltimore, MD, USA; National Health Council, Washington, DC, USA
3
Bibliographic databases: Is The Journal of Wildlife Management being found? J Wildl Manage 2015. [DOI: 10.1002/jwmg.898]
4
Grimes DA. Epidemiologic research with administrative databases: red herrings, false alarms and pseudo-epidemics. Hum Reprod 2015; 30:1749-52. [PMID: 26113658; DOI: 10.1093/humrep/dev151]
Affiliation(s)
- David A Grimes
- Department of Obstetrics and Gynecology, University of North Carolina School of Medicine, Chapel Hill, NC, USA
5
Abstract
The phenomenon of self-citation can present in many different forms, including direct, co-author, collaborative, and coercive induced self-citation. It can also pertain to the citation of single scientists, groups of scientists, journals, and institutions. This article presents some case studies of extreme self-citation practices and discusses the implications of different types of self-citation. Self-citation is not necessarily inappropriate by default. In fact, it is usually fully appropriate, and often it is even necessary. Conversely, inappropriate self-citation practices may be highly misleading and may distort the scientific literature. Coercive induced self-citation is the most difficult to discover. It may happen directly from reviewers of articles, but also indirectly from reviewers of grants, scientific advisors who steer a research agenda, and leaders of funding agencies who may espouse spending disproportionately large funds in research domains that perpetuate their own self-legacy. Inappropriate self-citation may be only a surrogate marker of much greater distortions of the scientific corpus towards conformity to specific opinions and biases. Inappropriate self-citations eventually also affect impact metrics. Different impact metrics vary in the extent to which they can be gamed through self-citation practices. Citation indices that are more gaming-proof are available and should be more widely used. We need more empirical studies to dissect the impact of different types of inappropriate self-citation and to examine the effectiveness of interventions to limit them.
6
Thombs BD, Levis AW, Razykov I, Syamchandra A, Leentjens AFG, Levenson JL, Lumley MA. Potentially coercive self-citation by peer reviewers: a cross-sectional study. J Psychosom Res 2015; 78:1-6. [PMID: 25300537; DOI: 10.1016/j.jpsychores.2014.09.015]
Abstract
OBJECTIVE Peer reviewers sometimes request that authors cite their work, either appropriately or via coercive self-citation to highlight the reviewers' work. The objective of this study was to determine in peer reviews submitted to one biomedical journal (1) the extent of peer reviewer self-citation; (2) the proportion of reviews recommending revision or acceptance versus rejection that included reviewer self-citations; and (3) the proportion of reviewer self-citations versus citations to others that included a rationale. METHODS Peer reviews for manuscripts submitted in 2012 to the Journal of Psychosomatic Research were evaluated. Data extraction was performed independently by two investigators. RESULTS There were 616 peer reviews (526 reviewers; 276 manuscripts), of which 444 recommended revision or acceptance and 172 recommended rejection. Of 428 total citations, there were 122 peer reviewer self-citations (29%) and 306 citations to others' work (71%). Self-citations were more common in reviews recommending revision or acceptance (105 of 316 citations; 33%) versus rejection (17 of 112; 15%; p<0.001). The percentage of self-citations with no rationale (26 of 122; 21%) was higher than for citations to others' work (15 of 306; 5%; p<0.001). CONCLUSIONS Self-citation in peer reviews is common and may reflect a combination of appropriate citation to research that should be cited in published articles and inappropriate citation intended to highlight the work of the peer reviewer. Providing instructions to peer reviewers about self-citation and asking them to indicate when and why they have self-cited may help to limit self-citation to appropriate, constructive recommendations.
Affiliation(s)
- Brett D Thombs
- Lady Davis Institute for Medical Research, Jewish General Hospital, Montréal, Québec, Canada; Department of Psychiatry, McGill University, Montreal, Quebec, Canada; Department of Epidemiology, Biostatistics, and Occupational Health, McGill University, Montreal, Quebec, Canada; Department of Medicine, McGill University, Montreal, Quebec, Canada; Department of Educational and Counselling Psychology, McGill University, Montreal, Quebec, Canada; Department of Psychology, McGill University, Montreal, Quebec, Canada; School of Nursing, McGill University, Montreal, Quebec, Canada
- Alexander W Levis
- Lady Davis Institute for Medical Research, Jewish General Hospital, Montréal, Québec, Canada
- Ilya Razykov
- Lady Davis Institute for Medical Research, Jewish General Hospital, Montréal, Québec, Canada; Department of Educational and Counselling Psychology, McGill University, Montreal, Quebec, Canada
- Achyuth Syamchandra
- Lady Davis Institute for Medical Research, Jewish General Hospital, Montréal, Québec, Canada
- Albert F G Leentjens
- Department of Psychiatry, Maastricht University Medical Center, Maastricht, The Netherlands
- James L Levenson
- Department of Psychiatry, Virginia Commonwealth University School of Medicine, Richmond, VA, USA
- Mark A Lumley
- Department of Psychology, Wayne State University, Detroit, MI, USA
7
Abstract
Scientific observations must survive the scrutiny of experts before they are disseminated to the broader community, because their publication in a scientific journal provides a stamp of validity. Although critical review of a manuscript by peers prior to publication in a scientific journal is a central element in this process, virtually no formal guidance is provided to reviewers about the nature of the task. In this article, the essence of peer review is described and critical steps in the process are summarized. The role of the peer reviewer as an intermediary and arbiter in the process of scientific communication between the authors and the readers, via the vehicle of the particular journal, is discussed, and the responsibilities of the reviewer to each of the three parties (the authors, the readers, and the journal editor) are defined. The two formal products of this activity, separate sets of reviewer comments to the editor and to the authors, are described. Ethical aspects of the process are considered, and the rewards accruing to the reviewer are summarized.
8
Huan LN, Tejani AM, Egan G. Biomedical journals lack a consistent method to detect outcome reporting bias: a cross-sectional analysis. J Clin Pharm Ther 2014; 39:501-6. [PMID: 24828874 DOI: 10.1111/jcpt.12172]
Abstract
WHAT IS KNOWN AND OBJECTIVE An increasing amount of recently published literature has implicated outcome reporting bias (ORB) as a major contributor to skewed data in both randomized controlled trials and systematic reviews; however, little is known about the current methods in place to detect ORB. This study aims to gain insight into the detection and management of ORB by biomedical journals. METHODS This was a cross-sectional analysis involving standardized questions, via email or telephone, with the top 30 biomedical journals (2012) ranked by impact factor. The Cochrane Database of Systematic Reviews was excluded, leaving 29 journals in the sample. RESULTS Of 29 journals, 24 (83%) responded to our initial inquiry, of which 14 (58%) answered our questions and 10 (42%) declined participation. Five (36%) of the responding journals indicated they had a specific method to detect ORB, whereas 9 (64%) did not. The perceived prevalence of ORB in the review process differed: 4 (29%) journals indicated ORB was found commonly, whereas 7 (50%) indicated ORB was uncommon or had never been detected by their journal. The majority (10 of 14; 71%) of journals were unwilling to report or make discrepancies found in manuscripts available to the public. Although in the minority, some journals (4 of 14; 29%) described thorough methods to detect ORB. WHAT IS NEW AND CONCLUSION Many journals seemed to lack a method with which to detect ORB, and its estimated prevalence was much lower than that reported in the literature, suggesting inadequate detection. This creates a potential for overestimation of the treatment effects of interventions and for unclear risks. Fortunately, some journals in this sample appear to use comprehensive methods for detecting ORB, but overall the data suggest that improvements at the biomedical journal level for detecting and minimizing the effect of this bias are needed.
Affiliation(s)
- L N Huan
- Lower Mainland Pharmacy Services, Richmond General Hospital, Pharmacy, Richmond, BC, Canada
9
Ciezki JP. High-Risk Prostate Cancer in the Modern Era: Does a Single Standard of Care Exist? Int J Radiat Oncol Biol Phys 2013; 87:440-2. [DOI: 10.1016/j.ijrobp.2013.06.006]
10
11
Affiliation(s)
- Armen Yuri Gasparyan
- Departments of Rheumatology and Research and Development, Dudley Group NHS Foundation Trust (A Teaching Trust of the University of Birmingham, UK), Russells Hall Hospital, West Midlands, UK
12
Onitilo AA, Engel JM, Salzman-Scott SA, Stankowski RV, Doi SAR. Reliability of reviewer ratings in the manuscript peer review process: an opportunity for improvement. Account Res 2013; 20:270-84. [PMID: 23805832; DOI: 10.1080/08989621.2013.804345]
Abstract
Accountability to authors and readers cannot exist without proper peer review practices. Thus, the information a journal seeks from its peer reviewers and how it makes use of this information is paramount. Disagreement amongst peer reviewers can be considerable, resulting in very diverse comments to authors. Incorporating a clear scoring system for key concrete items and requiring referees to provide justification for scores may ensure that reviewers contribute in a consistently fair and effective manner. This article evaluates information collected from reviewers and proposes an example of a system that aims to improve accountability, while having the potential to make it easier for reviewers to perform a more objective review.
Affiliation(s)
- Adedayo A Onitilo
- Department of Hematology/Oncology, Marshfield Clinic Weston Center, Weston, WI, USA
13
Gasparyan AY, Kitas GD. Best peer reviewers and the quality of peer review in biomedical journals. Croat Med J 2012; 53:386-9. [PMID: 22911533; PMCID: PMC3428827; DOI: 10.3325/cmj.2012.53.386]
Abstract
Current scholarly publications heavily rely on high quality peer review. Peer review, albeit imperfect, is aimed at improving science writing and editing. Evidence supporting peer review as a guarantor of the quality of biomedical publications is currently lacking. Its outcomes are largely dependent on the credentials of the reviewers. Several lines of evidence suggest that predictors of the best contributors to the process include affiliation to a good University and proper research training. Though the options to further improve peer review are currently limited, experts are in favor of formal education and courses on peer review for all contributors to this process. Long-term studies are warranted to assess the strengths and weaknesses of this approach.
Affiliation(s)
- Armen Yuri Gasparyan
- Department of Rheumatology, Dudley Group NHS Foundation Trust, Clinical Research Unit, Russell's Hall Hospital, Dudley, United Kingdom.
14
Yarkoni T. Designing next-generation platforms for evaluating scientific output: what scientists can learn from the social web. Front Comput Neurosci 2012; 6:72. [PMID: 23060783; PMCID: PMC3461500; DOI: 10.3389/fncom.2012.00072]
Abstract
Traditional pre-publication peer review of scientific output is a slow, inefficient, and unreliable process. Efforts to replace or supplement traditional evaluation models with open evaluation platforms that leverage advances in information technology are slowly gaining traction, but remain in the early stages of design and implementation. Here I discuss a number of considerations relevant to the development of such platforms. I focus particular attention on three core elements that next-generation evaluation platforms should strive to emphasize, including (1) open and transparent access to accumulated evaluation data, (2) personalized and highly customizable performance metrics, and (3) appropriate short-term incentivization of the userbase. Because all of these elements have already been successfully implemented on a large scale in hundreds of existing social web applications, I argue that development of new scientific evaluation platforms should proceed largely by adapting existing techniques rather than engineering entirely new evaluation mechanisms. Successful implementation of open evaluation platforms has the potential to substantially advance both the pace and the quality of scientific publication and evaluation, and the scientific community has a vested interest in shifting toward such models as soon as possible.
Affiliation(s)
- Tal Yarkoni
- Institute of Cognitive Science, University of Colorado Boulder, Boulder, CO, USA
15
Moher D, Stewart L, Shekelle P. Establishing a new journal for systematic review products. Syst Rev 2012; 1:1. [PMID: 22587946; PMCID: PMC3348672; DOI: 10.1186/2046-4053-1-1]
Abstract
Welcome to a new age in publishing systematic reviews. We hope the launch of Systematic Reviews will resonate with a broad spectrum of readers interested in using systematic reviews in a variety of ways, such as providing comprehensive and up-to-date evidence for patient management, informing health policy, and developing rigorous practice guidelines. Systematic reviews are increasingly popular. Our journal is committed to publishing a wide variety of well-conducted and transparently reported systematic reviews and associated research. We are open access and electronic, not confined by space, and so offer scope for publishing reviews in detail and providing a modern and innovative approach to publishing. We look forward to participating in the voyage with all of our readers.
Affiliation(s)
- David Moher
- Clinical Epidemiology Program, Ottawa Hospital Research Institute, The Ottawa Hospital - General Campus, 501 Smyth Road, Box 201B, Ottawa, ON K1H 8L6, Canada
- Department of Epidemiology & Community Medicine, Faculty of Medicine, University of Ottawa
- Lesley Stewart
- Centre for Reviews and Dissemination (CRD), University of York, UK
- Paul Shekelle
- West Los Angeles VA Medical Center, Los Angeles, CA 90066, USA
16
Editorial: Methodology in judgment and decision making research. Judgment and Decision Making 2011. [DOI: 10.1017/s1930297500004137]
Abstract
In this introduction to the special issue on methodology, we provide background on its original motivation and a systematic overview of the contributions. The latter are discussed according to the phase of the scientific process they (most strongly) refer to: theory construction, design, data analysis, and cumulative development of scientific knowledge. Several contributions propose novel measurement techniques and paradigms that will allow for new insights and can thus benefit researchers in JDM and beyond. Another set of contributions centers on how models can best be tested and/or compared. Especially when viewed in combination, the papers on this topic spell out vital requirements for model comparisons and provide approaches that solve noteworthy problems faced by prior work.
17
Hughes MA, Brennan PM. The Internet for neurosurgeons: current resources and future challenges. Br J Neurosurg 2011; 25:347-51. [DOI: 10.3109/02688697.2011.554582]
18
Simera I, Moher D, Hirst A, Hoey J, Schulz KF, Altman DG. Transparent and accurate reporting increases reliability, utility, and impact of your research: reporting guidelines and the EQUATOR Network. BMC Med 2010; 8:24. [PMID: 20420659; PMCID: PMC2874506; DOI: 10.1186/1741-7015-8-24]
Abstract
Although current electronic methods of scientific publishing offer increased opportunities for publishing all research studies and describing them in sufficient detail, the health research literature still suffers from many shortcomings. These shortcomings seriously undermine the value and utility of the literature and waste scarce resources invested in the research. In recent years there have been several positive steps aimed at improving this situation, such as a strengthening of journals' policies on research publication and the wide requirement to register clinical trials. The EQUATOR (Enhancing the QUAlity and Transparency Of health Research) Network is an international initiative set up to advance high-quality reporting of health research studies; it promotes good reporting practices, including the wider implementation of reporting guidelines. EQUATOR provides free online resources (http://www.equator-network.org) supported by education and training activities, and assists in the development of robust reporting guidelines. This paper outlines EQUATOR's goals and activities and offers suggestions for organizations and individuals involved in health research on how to strengthen research reporting.
Affiliation(s)
- Iveta Simera
- Centre for Statistics in Medicine, University of Oxford, Oxford, UK.