1
Paul E, Elfar A, Peters C, Smith C, Nees D, Hughes G, Vassar M. Assessment of Rehabilitation Journal Requirements for the Use of Reporting Guidelines and Clinical Trial Registration. Arch Phys Med Rehabil 2024;105:1330-1337. PMID: 38561144. DOI: 10.1016/j.apmr.2024.03.011.
Abstract
OBJECTIVE To assess reporting guideline and clinical trial registration requirements in rehabilitation journals. DESIGN We examined rehabilitation journals with 5-year impact factors exceeding 1.00 from the 2021 Scopus CiteScore tool, alongside the 28 journals included in the 2014 rehabilitation and disability quality improvement initiative. Journals outside the traditional rehabilitation scope were excluded. SETTING A publicly funded academic health center in the United States. PARTICIPANTS AND INTERVENTIONS N/A. MAIN OUTCOME MEASURE(S) The proportion of journals requiring/recommending reporting guideline use and clinical trial registration. RESULTS Over 90% (57/63) of journals required/recommended clinical trial reporting guidelines, while 68% (39/57) specified guideline requirements for systematic review/meta-analysis protocols. The 2014 collaborative initiative journals demonstrated higher rates of requiring/recommending reporting guidelines for clinical trials (24/26; 92.3%), systematic reviews/meta-analyses (23/26; 88.5%), observational studies in epidemiology (22/25; 88%), and diagnostic accuracy studies (20/24; 83.3%). Conversely, the 2021 Scopus CiteScore journals displayed higher rates for the remaining study designs. Overall, 52/63 (82.5%) journals required/recommended trial registration. Trial registration policies were comparable, with a slight advantage favoring the 2021 Scopus CiteScore journals. CONCLUSION Rehabilitation journals variably promoted reporting guideline use and clinical trial registration. Common study designs such as clinical trials, observational studies in epidemiology, and diagnostic accuracy studies demonstrated robust requirement/recommendation rates, while less common designs such as economic evaluations and animal research had suboptimal rates.
Journals can enhance reporting guideline use and trial registration by directing authors to the EQUATOR Network, requiring adherence to registration and reporting standards, and clarifying language in author instructions.
Affiliation(s)
- Eli Paul
- Office of Medical Student Research, Oklahoma State University Center for Health Sciences, Tulsa, Oklahoma, USA
- Annes Elfar
- Office of Medical Student Research, Oklahoma State University Center for Health Sciences, Tulsa, Oklahoma, USA
- Caleb Peters
- Office of Medical Student Research, Oklahoma State University Center for Health Sciences, Tulsa, Oklahoma, USA
- Caleb Smith
- Office of Medical Student Research, Oklahoma State University Center for Health Sciences, Tulsa, Oklahoma, USA
- Danya Nees
- Office of Medical Student Research, Oklahoma State University Center for Health Sciences, Tulsa, Oklahoma, USA
- Griffin Hughes
- Office of Medical Student Research, Oklahoma State University Center for Health Sciences, Tulsa, Oklahoma, USA
- Matt Vassar
- Office of Medical Student Research, Oklahoma State University Center for Health Sciences, Tulsa, Oklahoma, USA; Department of Psychiatry and Behavioral Sciences, Oklahoma State University Center for Health Sciences, Tulsa, Oklahoma, USA
2
Jelinek T, Shumard A, Modi J, Smith C, Nees D, Hughes G, Vassar M. Endorsement of reporting guidelines and clinical trial registration across Scopus-indexed rheumatology journals: a cross-sectional analysis. Rheumatol Int 2024;44:909-917. PMID: 37861727. DOI: 10.1007/s00296-023-05474-4.
Abstract
The purpose of this study was to investigate the instructions for authors of rheumatology journals and analyze their endorsement of reporting guidelines and clinical trial registration. Sixty rheumatology journals were selected by a research librarian and an investigator through the 2021 Scopus CiteScore tool. The instructions-for-authors section of each journal was assessed to determine endorsement of study design-specific reporting guidelines or clinical trial registration. Descriptive statistics were calculated using R (version 4.2.1) and RStudio. Of the 58 journals analyzed, 34 (34/58; 59%) mentioned the EQUATOR Network, an online compendium of best-practice reporting guidelines. The most commonly mentioned reporting guidelines were CONSORT, with 44 journals (44/58; 75%), and PRISMA, with 35 journals (35/58; 60%). The least mentioned guidelines were QUOROM, which 56 journals did not mention (56/58; 97%), and SRQR, which 53 journals did not mention (53/57; 93%). Clinical trial registration was required by 38 journals (38/58; 66%) and recommended by 8 journals (8/58; 14%). Our study found that endorsement of reporting guidelines and clinical trial registration within rheumatology journals was suboptimal, with substantial room for improvement. Endorsement of reporting guidelines has been shown not only to mitigate bias but also to improve research methodology. Therefore, we recommend that rheumatology journals broadly expand their endorsement of reporting guidelines and clinical trial registration to improve the quality of evidence they publish.
Affiliation(s)
- Trevon Jelinek
- Office of Medical Student Research, Oklahoma State University Center for Health Sciences, 1111 W 17th St., Tulsa, OK, 74107, USA
- Alexandra Shumard
- Office of Medical Student Research, Oklahoma State University Center for Health Sciences, 1111 W 17th St., Tulsa, OK, 74107, USA
- Jay Modi
- Office of Medical Student Research, Oklahoma State University Center for Health Sciences, 1111 W 17th St., Tulsa, OK, 74107, USA
- Caleb Smith
- Office of Medical Student Research, Oklahoma State University Center for Health Sciences, 1111 W 17th St., Tulsa, OK, 74107, USA
- Danya Nees
- Office of Medical Student Research, Oklahoma State University Center for Health Sciences, 1111 W 17th St., Tulsa, OK, 74107, USA
- Matt Vassar
- Office of Medical Student Research, Oklahoma State University Center for Health Sciences, 1111 W 17th St., Tulsa, OK, 74107, USA
- Department of Psychiatry and Behavioral Sciences, Oklahoma State University Center for Health Sciences, Tulsa, OK, USA
3
Baasan O, Freihat O, Nagy DU, Lohner S. Change over Five Years in Important Measures of Methodological Quality and Reporting in Randomized Cardiovascular Clinical Trials. J Cardiovasc Dev Dis 2023;11:2. PMID: 38276655. PMCID: PMC10816801. DOI: 10.3390/jcdd11010002.
Abstract
OBJECTIVES The aim of our current study was to analyze whether the use of important measures of methodological quality and reporting of randomized clinical trials published in the field of cardiovascular disease research has changed over time. A further aim was to investigate whether there was an improvement over time in the ability of these trials to provide a good estimate of the true intervention effect. METHODS We conducted two searches in the Cochrane Central Register of Controlled Trials (CENTRAL) database to identify randomized cardiovascular clinical trials published in either 2012 or 2017. Randomized clinical trials (RCTs) in cardiovascular disease research with adult participants were eligible for inclusion. We randomly selected 250 RCTs for each of the publication years 2012 and 2017. Trial characteristics, data on measures of methodological quality, and reporting were extracted, and the risk of bias for each trial was assessed. RESULTS As compared to 2012, in 2017 there were significant improvements in the reporting of the presence of a data monitoring committee (42.0% in 2017 compared to 34.4% in 2012; p < 0.001), and a positive change in registering randomized cardiovascular disease research in clinical trial registries (78.4% in 2017 compared to 68.9% in 2012; p = 0.03). We also observed that significantly more RCTs reported a sample size calculation in 2017 than in 2012 (60.4% compared to 49.6%; p < 0.01). RCTs in 2017 were more likely to have a low overall risk of bias (RoB) than in 2012 (29.2% compared to 21.2%; p < 0.01). However, fewer 2017 RCTs were rated low risk for blinding of participants and personnel (50.8% compared to 65.6%; p < 0.001), for blinding of outcome assessors (82.4% compared to 90.8%; p < 0.001), and for selective outcome reporting (62.8% compared to 80.0%; p < 0.001).
CONCLUSIONS As compared to 2012, in 2017 there were significant improvements in some, but not all, of the important measures of methodological quality. Although more trials in the field of cardiovascular disease research had a lower overall RoB in 2017, the improvement over time was not consistent across all RoB domains.
Affiliation(s)
- Odgerel Baasan
- Doctoral School of Health Sciences, University of Pécs, 7621 Pécs, Hungary
- Cochrane Hungary, Clinical Centre of the University of Pécs, Medical School, University of Pécs, 7624 Pécs, Hungary
- Omar Freihat
- Doctoral School of Health Sciences, University of Pécs, 7621 Pécs, Hungary
- Dávid U. Nagy
- Cochrane Hungary, Clinical Centre of the University of Pécs, Medical School, University of Pécs, 7624 Pécs, Hungary
- Institute of Geobotany/Plant Ecology, Martin-Luther-University, 06108 Halle, Germany
- Szimonetta Lohner
- Doctoral School of Health Sciences, University of Pécs, 7621 Pécs, Hungary
- Cochrane Hungary, Clinical Centre of the University of Pécs, Medical School, University of Pécs, 7624 Pécs, Hungary
- Department of Public Health Medicine, Medical School, University of Pécs, 7624 Pécs, Hungary
4
Zhong J, Xing Y, Lu J, Zhang G, Mao S, Chen H, Yin Q, Cen Q, Jiang R, Hu Y, Ding D, Ge X, Zhang H, Yao W. The endorsement of general and artificial intelligence reporting guidelines in radiological journals: a meta-research study. BMC Med Res Methodol 2023;23:292. PMID: 38093215. PMCID: PMC10717715. DOI: 10.1186/s12874-023-02117-x.
Abstract
BACKGROUND Complete reporting is essential for clinical research. However, the extent to which radiological journals endorse reporting guidelines is still unclear. Further, as a field that extensively utilizes artificial intelligence (AI), radiology would benefit from the adoption of both general and AI-specific reporting guidelines to enhance the quality and transparency of its research. This study aims to investigate the endorsement of general reporting guidelines and of those for AI applications in medical imaging in radiological journals, and to explore associated journal characteristics. METHODS This meta-research study screened journals from the Radiology, Nuclear Medicine & Medical Imaging category of the Science Citation Index Expanded in the 2022 Journal Citation Reports, and excluded journals that did not publish original research, were not published in English, or did not make instructions for authors available. The endorsement of fifteen general reporting guidelines and ten AI reporting guidelines was rated using a five-level tool: "active strong", "active weak", "passive moderate", "passive weak", and "none". The association between endorsement and journal characteristics was evaluated by logistic regression analysis. RESULTS We included 117 journals. The five most endorsed reporting guidelines were CONSORT (Consolidated Standards of Reporting Trials; 58.1%, 68/117), PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses; 54.7%, 64/117), STROBE (STrengthening the Reporting of Observational Studies in Epidemiology; 51.3%, 60/117), STARD (Standards for Reporting of Diagnostic Accuracy; 50.4%, 59/117), and ARRIVE (Animal Research: Reporting of In Vivo Experiments; 35.9%, 42/117). The most implemented AI reporting guideline was CLAIM (Checklist for Artificial Intelligence in Medical Imaging; 1.7%, 2/117), while the other nine AI reporting guidelines were not mentioned at all.
Journal Impact Factor quartile and publisher were associated with endorsement of reporting guidelines in radiological journals. CONCLUSIONS Endorsement of general reporting guidelines was suboptimal in radiological journals, and implementation of reporting guidelines for AI applications in medical imaging was extremely low. Adoption of both should be strengthened to facilitate quality and transparency in the reporting of radiological studies.
Affiliation(s)
- Jingyu Zhong
- Department of Imaging, Tongren Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, 200336, China
- Yue Xing
- Department of Imaging, Tongren Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, 200336, China
- Junjie Lu
- Department of Epidemiology and Population Health, Stanford University School of Medicine, Stanford, CA, 94305, USA
- Guangcheng Zhang
- Department of Orthopedics, Shanghai Sixth People's Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, 200233, China
- Shiqi Mao
- Department of Medical Oncology, Shanghai Pulmonary Hospital, Tongji University School of Medicine, Shanghai, 200433, China
- Haoda Chen
- Department of General Surgery, Pancreatic Disease Center, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, 200025, China
- Qian Yin
- Department of Pathology, Shanghai Sixth People's Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, 200233, China
- Qingqing Cen
- Department of Dermatology, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, 200011, China
- Run Jiang
- Department of Pharmacovigilance, Shanghai Hansoh BioMedical Co., Ltd., Shanghai, 201203, China
- Yangfan Hu
- Department of Imaging, Tongren Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, 200336, China
- Defang Ding
- Department of Imaging, Tongren Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, 200336, China
- Xiang Ge
- Department of Imaging, Tongren Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, 200336, China
- Huan Zhang
- Department of Radiology, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, 200025, China
- Weiwu Yao
- Department of Imaging, Tongren Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai, 200336, China
5
Barajas-Ochoa A, Ramirez-Trejo M, Dash A, Raybould JE, Bearman G. Are reporting guidelines used in infectious diseases publications? An analysis of more than 1,000 articles. Antimicrob Steward Healthc Epidemiol 2023;3:e213. PMID: 38156238. PMCID: PMC10753476. DOI: 10.1017/ash.2023.492.
Abstract
Objective To assess whether 16 reporting guidelines from the Enhancing the QUAlity and Transparency Of health Research (EQUATOR) Network were used in infectious diseases research publications. Design This cross-sectional, audit-type study assessed articles published in five infectious diseases journals in 2019. Methods All articles were manually reviewed to assess whether a reporting guideline was advisable, and were searched for the names and acronyms of 16 reporting guidelines. An "advisable use rate" was calculated. Results We reviewed 1,251 manuscripts across five infectious diseases journals. Guideline use was advisable for 973 (75%) articles. Reporting guidelines were used in 85 articles: 6.1% of all articles, and 8% (95% CI 6%-9%) of articles for which guidelines were advised. The advisable use rate ranged from 0.06 to 0.17 for any guideline, 0-0.08 for CONSORT, 0.53-1 for Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA), and 0-0.66 for Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis (TRIPOD). No trends were observed across the five journals. Conclusions The use of EQUATOR reporting guidelines is infrequent, despite journals and publishers promoting their usage. Whether this finding is attributable to knowledge, acceptance, or perceived usefulness of the guidelines remains to be clarified.
Affiliation(s)
- Aldo Barajas-Ochoa
- Division of Infectious Diseases, Department of Medicine, Virginia Commonwealth University, Richmond, VA, USA
- Manuel Ramirez-Trejo
- Centro Universitario de Ciencias de la Salud, Universidad de Guadalajara, Guadalajara, Mexico
- Aditee Dash
- Division of Infectious Diseases, Department of Medicine, Virginia Commonwealth University, Richmond, VA, USA
- Jillian E. Raybould
- Division of Infectious Diseases, Department of Medicine, Virginia Commonwealth University, Richmond, VA, USA
- Gonzalo Bearman
- Division of Infectious Diseases, Department of Medicine, Virginia Commonwealth University, Richmond, VA, USA
6
Rauh S, Johnson BS, Bowers A, Tritz D, Vassar BM. A review of reproducible and transparent research practices in urology publications from 2014 to 2018. BMC Urol 2022;22:102. PMID: 35820886. PMCID: PMC9277815. DOI: 10.1186/s12894-022-01059-8.
Abstract
Background Reproducibility is essential for the integrity of scientific research. Reproducibility is measured by the ability of different investigators to replicate the outcomes of an original publication using the same materials and procedures. Unfortunately, reproducibility is not currently a standard being met by most scientific research. Methods For this review, we sampled 300 publications in the field of urology to assess 14 indicators of reproducibility, including material availability, raw data availability, analysis script availability, pre-registration information, links to protocols, and whether the publication was available free to the public. Publications were also assessed for statements about conflicts of interest and funding sources. Results Of the 300 sampled publications, 171 contained empirical data available for analysis of reproducibility. Of these 171 articles, 0.58% provided links to protocols, 4.09% provided access to raw data, 3.09% provided access to materials, and 4.68% were pre-registered. None of the studies provided analysis scripts. Our review is cross-sectional in nature, including only PubMed-indexed journals published in English within a finite time period; our results should be interpreted in light of these considerations. Conclusion Current urology research does not consistently provide the components needed to reproduce original studies. Collaborative efforts from investigators and journal editors are needed to improve research quality while minimizing waste and patient risk.
Affiliation(s)
- Shelby Rauh
- Oklahoma State University Center for Health Sciences, 1111 W 17th St., Tulsa, OK, 74107, USA
- Bradley S Johnson
- Oklahoma State University Center for Health Sciences, 1111 W 17th St., Tulsa, OK, 74107, USA
- Aaron Bowers
- Oklahoma State University Center for Health Sciences, 1111 W 17th St., Tulsa, OK, 74107, USA
- Daniel Tritz
- Oklahoma State University Center for Health Sciences, 1111 W 17th St., Tulsa, OK, 74107, USA
- Benjamin Matthew Vassar
- Oklahoma State University Center for Health Sciences, 1111 W 17th St., Tulsa, OK, 74107, USA
7
Nicholls SG, McDonald S, McKenzie JE, Carroll K, Taljaard M. A review identified challenges distinguishing primary reports of randomised trials for meta-research: a proposal for improved reporting. J Clin Epidemiol 2022;145:121-125. DOI: 10.1016/j.jclinepi.2022.01.013.
8
Tonsillar-related pathologies: An analysis of the evidence underpinning management recommendations. Int J Pediatr Otorhinolaryngol 2022;152:110992. PMID: 34883327. DOI: 10.1016/j.ijporl.2021.110992.
Abstract
OBJECTIVE Evidence-based decision making is crucial in reducing the health and economic burdens imposed by tonsillar-related pathologies. Clinical practice guidelines are used to guide these decisions; however, uptake of recommendations in these guidelines is low. Systematic reviews are the highest level of evidence used to influence guideline recommendations; therefore, improving the reporting and methodological quality of systematic reviews related to tonsillar-related pathologies may improve guideline uptake and patient care. METHODS We used PubMed to search for all clinical practice guidelines related to tonsillar-related pathologies from 2010 to 2020. Included guidelines were then searched for all systematic reviews and meta-analyses (SRs/MAs). Study characteristics were extracted from each cited SR/MA before it was evaluated using the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) and AMSTAR-2 (A Measurement Tool to Assess Systematic Reviews 2) instruments. We then compared systematic reviews conducted by a Cochrane systematic review group with non-Cochrane systematic reviews. RESULTS Seven clinical practice guidelines were included in our study; within these guidelines, 98 SRs/MAs were cited, 80 of which were unique and included. Systematic reviews composed 9.1% (98/1082) of all guideline citations. PRISMA scores ranged from 0.47 to 0.83, with a mean score of 0.71 (n = 80); AMSTAR-2 scores ranged from 0.52 to 0.83, with mean scores of 0.56 (7.29/13) and 0.75 (11.94/16) (n = 80). Cochrane systematic reviews displayed greater PRISMA (0.88 vs. 0.64; p < 0.001) and AMSTAR-2 (0.90 vs. 0.57; p < 0.001) scores compared to the non-Cochrane studies. We found PRISMA and AMSTAR-2 scores were positively correlated across guidelines (r = 0.93).
CONCLUSION Wide variation exists in adherence to PRISMA and AMSTAR-2 guidelines among systematic reviews cited in clinical practice guidelines for tonsillar-related pathologies. Prior registration and adequate risk of bias assessment are two areas where improvement may be needed. Given the importance of guideline uptake, careful consideration of ways to improve the methodological and reporting quality of the evidence supporting tonsillar-related pathology recommendations is necessary.
9
Rauh S, Bowers A, Rorah D, Tritz D, Pate H, Frye L, Vassar M. Evaluating the reproducibility of research in obstetrics and gynecology. Eur J Obstet Gynecol Reprod Biol 2021;269:24-29. PMID: 34954422. DOI: 10.1016/j.ejogrb.2021.12.021.
Abstract
OBJECTIVE Reproducibility is a core tenet of scientific research. A reproducible study is one whose results can be recreated using the same methodology and materials as the original researchers. Unfortunately, reproducibility is not a standard to which the majority of current research adheres. METHODS Our cross-sectional survey evaluated 300 trials in the field of obstetrics and gynecology. Our primary objective was to assess nine indicators of reproducibility and transparency. These indicators include availability of data, analysis scripts, pre-registration information, and study protocols; funding source; conflict of interest statements; and whether the study was available via open access. RESULTS Of the 300 trials in our sample, 208 contained empirical data that could be assessed for reproducibility. None of the trials in our sample provided a link to their protocols or a statement on availability of materials, and none were replication studies. Just 10.58% provided a statement regarding data availability, and only 5.82% provided a statement on preregistration; 25.85% failed to report the presence or absence of conflicts of interest, and 54.08% did not state the origin of their funding. CONCLUSION In the studies we examined, research in the field of obstetrics and gynecology is not consistently reproducible and frequently lacks conflict of interest disclosure. Consequences could be far-reaching and include increased research waste, widespread acceptance of misleading results, and erroneous conclusions guiding clinical decision-making.
Affiliation(s)
- Shelby Rauh
- Oklahoma State University Center for Health Sciences, Tulsa, OK, United States
- Aaron Bowers
- Oklahoma State University Center for Health Sciences, Tulsa, OK, United States
- Drayton Rorah
- Kansas City University of Medicine and Biosciences, Joplin, MO, United States
- Daniel Tritz
- Oklahoma State University Center for Health Sciences, Tulsa, OK, United States
- Heather Pate
- Department of Obstetrics and Gynecology, Oklahoma State University Medical Center, Tulsa, OK, United States
- Lance Frye
- Department of Obstetrics and Gynecology, Oklahoma State University Medical Center, Tulsa, OK, United States
- Matt Vassar
- Oklahoma State University Center for Health Sciences, Tulsa, OK, United States
10
Abstract
Previous surveys of the literature have shown that reports of statistical analyses often lack important information, causing lack of transparency and failure of reproducibility. Editors and authors agree that guidelines for reporting should be encouraged. This Review presents a set of Bayesian analysis reporting guidelines (BARG). The BARG encompass the features of previous guidelines, while including many additional details for contemporary Bayesian analyses, with explanations. An extensive example of applying the BARG is presented. The BARG should be useful to researchers, authors, reviewers, editors, educators and students. Utilization, endorsement and promotion of the BARG may improve the quality, transparency and reproducibility of Bayesian analyses.
Affiliation(s)
- John K Kruschke
- Department of Psychological and Brain Sciences, Indiana University, Bloomington, IN, USA
11
Abstract
Micro- or minimally invasive glaucoma surgeries (MIGS) have been the latest addition to the glaucoma surgical treatment paradigm. This term refers not to a single surgery, but rather to a group of distinct procedures and devices that aim to decrease intraocular pressure. Broadly, MIGS can be categorized into surgeries that increase the trabecular outflow [Trabectome, iStent (first and second generations), Hydrus microstent, Kahook Dual Blade and gonioscopy-assisted transluminal trabeculotomy], surgeries that increase suprachoroidal outflow (Cypass microstent and iStent Supra), and conjunctival bleb-forming procedures (Xen gel stent and InnFocus microshunt). Compared to traditional glaucoma surgeries, such as trabeculectomy and glaucoma drainage device implantation (Ahmed, Baerveldt, and Molteno valves), MIGS are touted to have less severe complications and shorter surgical time. MIGS represent an evolving field, and the efficacy and complications of each procedure should be considered independently, giving more importance to high-quality and longer-term studies.
Affiliation(s)
- David J Mathew
- Department of Ophthalmology and Vision Sciences, University of Toronto, Toronto, Ontario M5T 2S8, Canada
- Yvonne M Buys
- Department of Ophthalmology and Vision Sciences, University of Toronto, Toronto, Ontario M5T 2S8, Canada
12
Palmer W, Okonya O, Jellison S, Horn J, Harter Z, Wilkett M, Vassar M. Intervention reporting of clinical trials published in high-impact cardiology journals: effect of the TIDieR checklist and guide. BMJ Evid Based Med 2021;26:91-97. PMID: 32139513. DOI: 10.1136/bmjebm-2019-111309.
Abstract
BACKGROUND Randomised controlled trials (RCTs) provide the highest level of evidence among primary research in cardiovascular medicine. Yet even the best trial may be less useful if it fails to provide an accurate means of reproducibility. Unfortunately, discrepancies in the standards of trial reporting have persisted across previous trials. The Template for Intervention Description and Replication (TIDieR) checklist aims to improve research efficacy by setting standards for quality intervention reporting and reproducibility. The goal of this study was to assess adherence to the TIDieR checklist among RCTs published in cardiovascular health journals. We also compared the quality of intervention reporting before and after the publication of TIDieR. METHODS This cross-sectional, methodological study analysed 101 trials published in high-impact cardiology journals. Our primary objective was to assess overall adherence to the TIDieR checklist. Our secondary objective was to use an interrupted time-series analysis to determine whether intervention reporting increased following the publication of TIDieR in March 2014. Additionally, we used generalised estimating equations to identify trial characteristics associated with intervention reporting. RESULTS Trials in our sample reported 8.6 of 12 TIDieR checklist items, on average. The most under-reported items were those describing the expertise of the interventionists and the location of the intervention. CONCLUSION Improved outcome reporting and intervention reproducibility among RCTs are greatly needed in cardiovascular medicine. Clinicians and researchers should advocate for the ethical publication of complete, translatable and replicable clinical research results.
Affiliation(s)
- William Palmer
- Department of Psychiatry and Behavioral Science, Oklahoma State University Center for Health Sciences, Tulsa, Oklahoma, USA
- Ochije Okonya
- Department of Psychiatry and Behavioral Science, Oklahoma State University Center for Health Sciences, Tulsa, Oklahoma, USA
- Samuel Jellison
- Department of Psychiatry and Behavioral Science, Oklahoma State University Center for Health Sciences, Tulsa, Oklahoma, USA
- Jarryd Horn
- Department of Psychiatry and Behavioral Science, Oklahoma State University Center for Health Sciences, Tulsa, Oklahoma, USA
- Zachery Harter
- Department of Internal Medicine, Kansas City University of Medicine and Biosciences, Kansas City, Missouri, USA
- Matt Wilkett
- Department of Internal Medicine, Oklahoma State University Medical Center, Tulsa, Oklahoma, USA
- Matt Vassar
- Department of Psychiatry and Behavioral Science, Oklahoma State University Center for Health Sciences, Tulsa, Oklahoma, USA
|
13
|
Lindsley K, Fusco N, Teeuw H, Mooij E, Scholten R, Hooft L. Poor compliance of clinical trial registration among trials included in systematic reviews: a cohort study. J Clin Epidemiol 2020; 132:79-87. [PMID: 33333165 DOI: 10.1016/j.jclinepi.2020.12.016] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/22/2020] [Revised: 11/30/2020] [Accepted: 12/09/2020] [Indexed: 01/09/2023]
Abstract
OBJECTIVES The objective of the study was to examine whether clinical trials that have been included in systematic reviews have been registered in clinical trial registers and, when they have, whether results of the trials were included in the register. STUDY DESIGN AND SETTING This study used a sample of 100 systematic reviews published by the Cochrane Musculoskeletal, Oral, Skin and Sensory Network between 2014 and 2019. RESULTS We identified 2,000 trials (369,778 participants) from a sample of 100 systematic reviews. The median year of trial publication was 2007. Of 1,177 trials published in 2005 or later, a clinical trial registration record was identified for 368 (31%). Of these registered trials, 135 (37%) were registered prospectively and results were posted for 114 (31%); most registered trials evaluated pharmaceutical interventions (62%). Among trials published in the last 10 years, the proportion of registered trials increased to 38% (261 of 682). CONCLUSION Although some improvement in clinical trial registration has been observed in recent years, the proportion of registered clinical trials included in recently published systematic reviews remains less than desirable. Prospective clinical trial registration plays an essential role in assessing the risk of bias and judging the quality of evidence in systematic reviews of intervention safety and effectiveness.
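Proportions such as the 31% registration rate above are point estimates; attaching an uncertainty estimate makes them easier to compare across studies. As an illustrative aside only (not part of this study's analysis), a Wilson score interval for that proportion can be sketched as follows; the counts come from the abstract, everything else is an assumption:

```python
from math import sqrt

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple:
    """95% Wilson score confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Registered trials among those published in 2005 or later (per the abstract)
lo, hi = wilson_ci(368, 1177)
print(f"368/1177 = {368/1177:.1%}, 95% CI {lo:.1%} to {hi:.1%}")
```

With a denominator this large the interval is narrow (roughly 29% to 34%), which is why the registration shortfall the authors describe is unlikely to be a sampling artefact.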
Affiliation(s)
- Kristina Lindsley
- Julius Center for Health Sciences and Primary Care, University Medical Center Utrecht, Utrecht University, Utrecht, the Netherlands; Cochrane Netherlands, University Medical Center Utrecht, Utrecht University, Utrecht, the Netherlands; Life Sciences, IBM Watson Health, Baltimore, MD, USA.
- Nicole Fusco
- Life Sciences, IBM Watson Health, Baltimore, MD, USA
- Hannah Teeuw
- Julius Center for Health Sciences and Primary Care, University Medical Center Utrecht, Utrecht University, Utrecht, the Netherlands
- Eva Mooij
- Julius Center for Health Sciences and Primary Care, University Medical Center Utrecht, Utrecht University, Utrecht, the Netherlands
- Rob Scholten
- Julius Center for Health Sciences and Primary Care, University Medical Center Utrecht, Utrecht University, Utrecht, the Netherlands; Cochrane Netherlands, University Medical Center Utrecht, Utrecht University, Utrecht, the Netherlands
- Lotty Hooft
- Julius Center for Health Sciences and Primary Care, University Medical Center Utrecht, Utrecht University, Utrecht, the Netherlands; Cochrane Netherlands, University Medical Center Utrecht, Utrecht University, Utrecht, the Netherlands
14
Wang F, Schilsky RL, Page D, Califf RM, Cheung K, Wang X, Pang H. Development and Validation of a Natural Language Processing Tool to Generate the CONSORT Reporting Checklist for Randomized Clinical Trials. JAMA Netw Open 2020; 3:e2014661. [PMID: 33030549 PMCID: PMC7545295 DOI: 10.1001/jamanetworkopen.2020.14661] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 01/25/2023] Open
Abstract
IMPORTANCE Adherence to the Consolidated Standards of Reporting Trials (CONSORT) statement for randomized clinical trials is associated with improved reporting quality, because inadequate reporting in randomized clinical trials may complicate the interpretation and application of findings to clinical care. OBJECTIVE To evaluate an automated reporting checklist generation tool that uses natural language processing (NLP), called CONSORT-NLP. DESIGN, SETTING, AND PARTICIPANTS This study used published journal articles as training, testing, and validation sets to develop, refine, and evaluate the CONSORT-NLP tool. Articles reporting randomized clinical trials were selected from 25 high-impact-factor journals in the following categories: (1) general and internal medicine, (2) oncology, and (3) cardiac and cardiovascular systems. Two to 5 articles were selected from each journal, for a total of 158 articles: a training set of 111 articles to train CONSORT-NLP on the CONSORT reporting items, a testing set of 25 articles to refine CONSORT-NLP, and a validation set of 22 articles to assess its performance. MAIN OUTCOMES AND MEASURES To evaluate the performance of the tool, an accuracy metric, defined as the number of correct assessments divided by all assessments, was calculated. RESULTS The CONSORT-NLP tool uses the widely used Portable Document Format (PDF) as its input file; a graphical user interface was built using Java in 2019. Of the 37 CONSORT reporting items, 34 (92%) were included in the tool, and 30 of these were fully implemented. Of the fully implemented items, 28 (93%) had an accuracy of more than 90% on the validation set; the remaining 2 (7%) had an accuracy between 80% and 90%.
The time required to complete the CONSORT checklist manually vs with the CONSORT-NLP tool was compared for 30 articles, and two case studies of randomized clinical trials are provided to illustrate the tool. For the 30 articles investigated, CONSORT-NLP required a mean (SD) of 23.0 (4.1) seconds, whereas the 3 manual reviewers required a mean (SD) of 11.9 (2.2), 22.6 (4.6), and 57.6 (7.1) minutes, respectively. CONCLUSIONS AND RELEVANCE The CONSORT-NLP tool is designed to assist in the reporting of randomized clinical trials. Potential users of CONSORT-NLP include clinicians, researchers, and scientists who plan to publish a randomized trial study in a peer-reviewed journal. The use of CONSORT-NLP may help them save substantial time when generating the CONSORT checklist. This tool may also be useful for manuscript reviewers and journal editors who review these articles.
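The abstract above describes automated detection of checklist items in manuscript text. As a loose illustration of that general idea only (the actual CONSORT-NLP tool is a Java application with PDF parsing and trained rules, none of which is reproduced here), a naive keyword screen over a few hypothetical CONSORT-style items might look like this:

```python
import re

# Hypothetical keyword patterns for a few CONSORT-style items; the real
# CONSORT-NLP tool is far more sophisticated than a keyword screen.
CHECKLIST_PATTERNS = {
    "randomisation": r"\brandom(is|iz)(ed|ation)\b",
    "blinding": r"\b(blind(ed|ing)|mask(ed|ing))\b",
    "trial registration": r"\b(NCT\d{8}|ISRCTN\d+|registered)\b",
    "sample size": r"\bsample size\b|\bpower calculation\b",
}

def screen_manuscript(text: str) -> dict:
    """Flag which checklist items appear to be addressed in the text."""
    return {item: bool(re.search(pat, text, re.IGNORECASE))
            for item, pat in CHECKLIST_PATTERNS.items()}

report = screen_manuscript(
    "Participants were randomized 1:1; outcome assessors were blinded. "
    "The trial was registered (NCT01234567)."
)
```

A screen like this only flags the presence of vocabulary, which is precisely why the study's accuracy evaluation against human assessors matters.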
Affiliation(s)
- Fan Wang
- School of Public Health, Li Ka Shing Faculty of Medicine, The University of Hong Kong, Hong Kong, China
- David Page
- Department of Biostatistics and Bioinformatics, Duke University School of Medicine, Durham, North Carolina
- Robert M. Califf
- Duke Forge, Duke University School of Medicine, Durham, North Carolina
- Stanford University School of Medicine, Stanford, California
- Verily Life Sciences, South San Francisco, California
- Kei Cheung
- Department of Emergency Medicine, Yale Center for Medical Informatics, Yale University School of Medicine, New Haven, Connecticut
- Xiaofei Wang
- Department of Biostatistics and Bioinformatics, Duke University School of Medicine, Durham, North Carolina
- Herbert Pang
- School of Public Health, Li Ka Shing Faculty of Medicine, The University of Hong Kong, Hong Kong, China
- Department of Biostatistics and Bioinformatics, Duke University School of Medicine, Durham, North Carolina
15
Affiliation(s)
- Harold C Sox
- The Patient-Centered Outcomes Research Institute, Washington, DC, USA
- Geisel School of Medicine at Dartmouth, Hanover, NH, USA
- Both authors contributed equally
- Mark A Schuster
- Kaiser Permanente Bernard J Tyson School of Medicine, Pasadena, CA, USA
- Both authors contributed equally
16
Tuazon JR, Jutai JW. Toward guidelines for reporting assistive technology device outcomes. Disabil Rehabil Assist Technol 2019; 16:702-711. [PMID: 31795783 DOI: 10.1080/17483107.2019.1697384] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/25/2022]
Abstract
PURPOSE The aim of this study was to develop and pilot-test reporting guidelines for manuscripts describing studies of assistive technology device outcomes, with the hope of improving the overall quality of research in this field. METHODS The research is presented in two stages. In Stage 1, a literature review was completed to identify the essential components of a conceptual framework for reporting guidelines and to create a checklist. In Stage 2, two independent reviewers evaluated twenty articles using the checklist to identify any shortcomings of the tool and produce an estimate of interrater reliability. Two items of the original checklist were revised after reconciling disagreements between the two raters. RESULTS The Cohen's kappa value of the checklist was 0.887 (p < .001), reflecting excellent interrater agreement. The overall percent agreement was 94.6%. CONCLUSIONS Reporting guidelines for studies of assistive technology device outcomes appear to be reliable. Although the checklist may require periodic updating, it has potential for advancing outcomes research. Researchers are invited to share comments and criticisms to aid efforts to enhance the quality of reporting in this field. Implications for rehabilitation: Reporting checklists and guidelines are effective tools for achieving a minimum standard of reporting quality in all areas of rehabilitation research. This study presents a preliminary reporting checklist for the field of assistive technology device outcomes that has potential for advancing outcomes research. Authors and journal editors are encouraged to adopt and adhere to reporting guidelines in order to enhance the clarity and completeness of prospective studies.
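For readers unfamiliar with the agreement statistics reported here, Cohen's kappa corrects raw percent agreement for agreement expected by chance. A minimal sketch with hypothetical two-rater data (the study's raw ratings are not given in the abstract, so these numbers are illustrative only):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two raters judging the same items (categorical)."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    # Observed agreement: share of items where the raters match
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement: from each rater's marginal label frequencies
    c1, c2 = Counter(rater1), Counter(rater2)
    labels = set(rater1) | set(rater2)
    expected = sum(c1[k] * c2[k] for k in labels) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical yes/no ratings for 20 articles (not the study's data)
r1 = ["y"] * 15 + ["n"] * 5
r2 = ["y"] * 14 + ["n"] * 6
kappa = cohens_kappa(r1, r2)
```

Here the raters agree on 19 of 20 items (95%), but kappa comes out lower (0.875) because much of that agreement could occur by chance given the skewed marginals; the same gap between 94.6% agreement and kappa 0.887 appears in the abstract.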
Affiliation(s)
- Joshua R Tuazon
- Faculty of Health Sciences, University of Ottawa, Ottawa, Canada
- Jeffrey W Jutai
- Interdisciplinary School of Health Sciences and LIFE Research Institute, University of Ottawa, Ottawa, Canada
17
Zuñiga-Hernandez JA, Dorsey-Treviño EG, González-González JG, Brito JP, Montori VM, Rodriguez-Gutierrez R. Endorsement of reporting guidelines and study registration by endocrine and internal medicine journals: meta-epidemiological study. BMJ Open 2019; 9:e031259. [PMID: 31558457 PMCID: PMC6773296 DOI: 10.1136/bmjopen-2019-031259] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 04/29/2019] [Revised: 09/06/2019] [Accepted: 09/09/2019] [Indexed: 02/07/2023] Open
Abstract
OBJECTIVES To improve the trustworthiness of evidence, studies should be prospectively registered and research reports should adhere to existing standards. We aimed to systematically assess the degree to which endocrinology and internal medicine journals endorse study registration and reporting standards for randomised controlled trials (RCTs), systematic reviews (SRs) and observational studies (ObS). Additionally, we evaluated characteristics that predict endorsement of reporting or registration mechanisms by these journals. DESIGN Meta-epidemiological study. SETTING Journals included in the 'Endocrinology and Metabolism' and 'General and Internal Medicine' 2017 Journal Citation Reports. PARTICIPANTS Journals with an impact factor of ≥1.0 that focus on clinical medicine and publish RCTs, SRs and ObS were included. PRIMARY OUTCOMES Required adherence to reporting guidelines and study registration, as determined from the journals' instructions for authors. RESULTS Of the 170 eligible journals (82 endocrinology and 88 internal medicine), endorsement of reporting standards was highest for RCTs, in 35 (43%) endocrine journals and 55 (63%) internal medicine journals, followed by SRs, with 21 (26%) and 48 (55%), respectively, and lastly ObS, with 41 (50%) of endocrine journals and 21 (24%) of internal medicine journals. In 78 (46%) journals, RCTs were required to be registered and published in adherence to the Consolidated Standards of Reporting Trials statement. Only 11 (6%) journals required registration of SRs. Internal medicine journals were more likely than endocrine journals to endorse reporting guidelines, except for the Strengthening the Reporting of Observational Studies in Epidemiology statement. Besides trial registration, no other journal characteristic proved to be an independent predictor of reporting standard endorsement for RCTs.
CONCLUSION Our results highlight that study registration requirements and reporting guideline endorsement are suboptimal in internal medicine and endocrine journals. The problem may be compounded because endorsement does not imply enforcement, impairing the practice of evidence-based medicine.
Affiliation(s)
- Jorge Alberto Zuñiga-Hernandez
- Endocrinology Division, Department of Internal Medicine, University Hospital 'Dr. José E. González', Universidad Autonoma de Nuevo Leon, Monterrey, Mexico
- Plataforma INVEST Medicina UANL-KER Unit Mayo Clinic (KER Unit México), Universidad Autonoma de Nuevo Leon, Monterrey, Mexico
- Edgar Gerardo Dorsey-Treviño
- Endocrinology Division, Department of Internal Medicine, University Hospital 'Dr. José E. González', Universidad Autonoma de Nuevo Leon, Monterrey, Mexico
- Plataforma INVEST Medicina UANL-KER Unit Mayo Clinic (KER Unit México), Universidad Autonoma de Nuevo Leon, Monterrey, Mexico
- Jose Gerardo González-González
- Endocrinology Division, Department of Internal Medicine, University Hospital 'Dr. José E. González', Universidad Autonoma de Nuevo Leon, Monterrey, Mexico
- Plataforma INVEST Medicina UANL-KER Unit Mayo Clinic (KER Unit México), Universidad Autonoma de Nuevo Leon, Monterrey, Mexico
- Research Unit, University Hospital 'Dr. José E. González', Universidad Autonoma de Nuevo Leon, Monterrey, Mexico
- Juan P Brito
- Knowledge and Evaluation Research Unit, Mayo Clinic, Rochester, Minnesota, USA
- Division of Endocrinology, Diabetes, Metabolism, and Nutrition, Department of Medicine, Mayo Clinic, Rochester, Minnesota, USA
- Victor M Montori
- Knowledge and Evaluation Research Unit, Mayo Clinic, Rochester, Minnesota, USA
- Division of Endocrinology, Diabetes, Metabolism, and Nutrition, Department of Medicine, Mayo Clinic, Rochester, Minnesota, USA
- Rene Rodriguez-Gutierrez
- Endocrinology Division, Department of Internal Medicine, University Hospital 'Dr. José E. González', Universidad Autonoma de Nuevo Leon, Monterrey, Mexico
- Plataforma INVEST Medicina UANL-KER Unit Mayo Clinic (KER Unit México), Universidad Autonoma de Nuevo Leon, Monterrey, Mexico
- Knowledge and Evaluation Research Unit, Mayo Clinic, Rochester, Minnesota, USA
- Division of Endocrinology, Diabetes, Metabolism, and Nutrition, Department of Medicine, Mayo Clinic, Rochester, Minnesota, USA
18
Jellison S, Roberts W, Bowers A, Combs T, Beaman J, Wayant C, Vassar M. Evaluation of spin in abstracts of papers in psychiatry and psychology journals. BMJ Evid Based Med 2019; 25:178-181. [PMID: 31383725 DOI: 10.1136/bmjebm-2019-111176] [Citation(s) in RCA: 35] [Impact Index Per Article: 7.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 01/10/2023]
Abstract
We identified 'spin' in abstracts of randomised controlled trials (RCTs) with nonsignificant primary endpoints in psychiatry and psychology journals. This is a cross-sectional review of clinical trials with nonsignificant primary endpoints published in psychiatry and psychology journals from January 2012 to December 2017. The main outcome was the frequency and manifestation of spin in the abstracts. We define spin as the 'use of specific reporting strategies, from whatever motive, to highlight that the experimental treatment is beneficial, despite a statistically nonsignificant difference for the primary outcome, or to distract the reader from statistically nonsignificant results'. We also assessed the relationship between industry funding and spin. Of the 486 RCTs examined, 116 were included in our analysis of spin. Spin was identified in 56% (n=65) of those included: in 2 (2%) titles, 24 (21%) abstract results sections and 57 (49.1%) abstract conclusion sections. Evidence of spin was identified simultaneously in both the results and conclusions sections in 15% of RCTs (n=17). Twelve articles (10%) reported industry funding, which was not associated with increased odds of spin in the abstract (unadjusted OR: 1.0; 95% CI: 0.3 to 3.2); we found no relationship between industry funding and spin. These findings raise concerns about the effects spin may have on clinicians. Further steps could be taken to address spin, including inviting reviewers to comment on the presence of spin and updating the Consolidated Standards of Reporting Trials guidelines to contain language discouraging spin.
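The unadjusted odds ratio and its 95% CI quoted here come from a 2x2 table of industry funding against spin. A sketch of the standard log-scale (Woolf) interval, using hypothetical cell counts chosen only to be consistent with the abstract's marginal totals (116 trials, 65 with spin, 12 industry funded) rather than the study's actual raw data:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Unadjusted odds ratio with a Woolf (log-scale) 95% CI.

    2x2 table:            spin   no spin
      industry-funded       a       b
      not industry-funded   c       d
    """
    or_ = (a * d) / (b * c)
    se = sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# Hypothetical counts matching the abstract's marginals (7+5=12 funded,
# 7+58=65 with spin, total 116); not the study's raw table.
or_, lo, hi = odds_ratio_ci(7, 5, 58, 46)
```

With only 12 industry-funded trials the small cells dominate the standard error, which is why the reported interval (0.3 to 3.2) is so wide: the study simply had little power to detect a funding-spin association.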
Affiliation(s)
- Samuel Jellison
- College of Osteopathic Medicine, Oklahoma State University Center for Health Sciences, Tulsa, Oklahoma, USA
- Will Roberts
- College of Osteopathic Medicine, Oklahoma State University Center for Health Sciences, Tulsa, Oklahoma, USA
- Aaron Bowers
- College of Osteopathic Medicine, Oklahoma State University Center for Health Sciences, Tulsa, Oklahoma, USA
- Tyler Combs
- College of Osteopathic Medicine, Oklahoma State University Center for Health Sciences, Tulsa, Oklahoma, USA
- Jason Beaman
- Department of Psychiatry and Behavioral Sciences, Oklahoma State University Center for Health Sciences, Tulsa, Oklahoma, USA
- Department of Psychiatry, Oklahoma State University Medical Center, Tulsa, Oklahoma, USA
- Cole Wayant
- College of Osteopathic Medicine, Oklahoma State University Center for Health Sciences, Tulsa, Oklahoma, USA
- Matt Vassar
- College of Osteopathic Medicine, Oklahoma State University Center for Health Sciences, Tulsa, Oklahoma, USA
19
Noble AJ, Marson AG, Blower SL. Psychological interventions for epilepsy: How good are trialists at assessing their implementation fidelity, what are the barriers, and what are journals doing to encourage it? A mixed methods study. Epilepsy Behav 2019; 97:174-181. [PMID: 31252275 DOI: 10.1016/j.yebeh.2019.05.041] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 04/02/2019] [Revised: 05/28/2019] [Accepted: 05/28/2019] [Indexed: 01/12/2023]
Abstract
INTRODUCTION Psychological interventions hold promise for the epilepsy population and continue to be trialed to determine their efficacy. Such interventions present opportunities for variance in delivery. Therefore, to accurately interpret a trial's estimate of effect, information on implementation fidelity (IF) is required. We present a novel 3-part study. Part 1 systematically rated trials for the extent to which they reported assessing whether the intervention was delivered as intended (adherence) and with what skill (competence). Part 2 identified barriers to reporting and assessing fidelity perceived by trialists. Part 3 determined what journals publishing epilepsy trials are doing to support IF reporting. METHODS Articles for 50 randomized controlled trials (RCTs)/quasi-RCTs of psychological interventions identified by Cochrane searches were rated using the Psychotherapy Outcome Study Methodology Rating Form's fidelity items. The 45 corresponding authors for the 50 trials were invited to complete the 'Barriers to Treatment Integrity Implementation Survey'. The 'Instructions to Authors' of the 17 journals publishing the trials were reviewed for endorsement of popular reporting guidelines that refer to fidelity (the Consolidated Standards of Reporting Trials (CONSORT) statement or the Journal Article Reporting Standards (JARS)), and the journals were asked how they enforced compliance. RESULTS Part 1: 15 (30%) trials reported assessing for adherence, but only 2 (4.3%) gave the result. Four (8.5%) reported assessing for competence; 1 (2.1%) gave the result. Part 2: 22 trialists - mostly chief investigators - responded. They identified 'lack of theory and specific guidelines on treatment integrity procedures', 'time, cost, and labor demands', and 'lack of editorial requirement' as "strong barriers". Part 3: Most journals (15; 88.2%) endorsed CONSORT or JARS, but only 5 enforced compliance.
CONCLUSIONS Most trials of psychological interventions for epilepsy are not reported transparently when it comes to IF. The barriers trialists identify do not appear insurmountable. Addressing them could ultimately help the field better understand how best to support the population with epilepsy.
Affiliation(s)
- Adam J Noble
- Health Services Research, University of Liverpool, Liverpool, UK.
- Anthony G Marson
- Department of Molecular and Clinical Pharmacology, University of Liverpool, Liverpool, UK
- Sarah L Blower
- Department of Health Sciences, University of York, York, UK
20
Madden K, Phillips M, Solow M, McKinnon V, Bhandari M. A systematic review of quality of reporting in registered intimate partner violence studies: where can we improve? J Inj Violence Res 2019; 11:123-136. [PMID: 31129675 PMCID: PMC6646831 DOI: 10.5249/jivr.v11i2.1140] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/17/2018] [Accepted: 05/06/2019] [Indexed: 12/12/2022] Open
Abstract
BACKGROUND Reporting quality is paramount when presenting clinical findings in published research to ensure that we have the highest quality of evidence. Poorly reported clinical findings can result in a number of potential pitfalls, including confusion of the methodology used or selective reporting of study results. There are guidelines and checklists that aim to standardize the way in which studies are reported in the literature to ensure transparency. The use of these reporting guidelines may aid in the appropriate reporting of research, which is of increased importance in highly complex fields like intimate partner violence (IPV). The primary objective of this systematic review is to assess the reporting quality of published IPV studies using the CONSORT and STROBE checklists. METHODS We performed a systematic review of three large study registries for IPV studies. Of the completed studies, we sought full text publications and used reporting checklists to assess the quality of reporting. RESULTS Of the 42 randomized controlled trials, the mean score on the CONSORT checklist was 63.5% (23.5/37 items, SD 4.7 items). There were also 12 pilot trials in this systematic review, which scored a mean of 49.3% (19.7/40 items; SD 3.3 items) on the CONSORT extension for pilot trials. We included 12 observational studies which scored a mean of 56.1% (18.5/33 items; SD: 4.1 items). CONCLUSIONS We identified an opportunity to improve reporting quality by encouraging adherence to reporting guidelines. There should be a particular focus on ensuring that pilot studies report pilot-specific items. All researchers have a responsibility to ensure commitment to high quality reporting to ensure transparency in IPV studies.
Affiliation(s)
- Kim Madden
- Department of Health Research Methods, Evidence, and Impact, McMaster University, Hamilton, Ontario, Canada.
21
Sharp MK, Tokalić R, Gómez G, Wager E, Altman DG, Hren D. A cross-sectional bibliometric study showed suboptimal journal endorsement rates of STROBE and its extensions. J Clin Epidemiol 2019; 107:42-50. [PMID: 30423373 DOI: 10.1016/j.jclinepi.2018.11.006] [Citation(s) in RCA: 28] [Impact Index Per Article: 5.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/07/2018] [Revised: 10/10/2018] [Accepted: 11/06/2018] [Indexed: 02/06/2023]
Abstract
OBJECTIVES The STrengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement provides guidance on reporting observational studies, and many extensions have been created for specialized methods or fields. We determined endorsement prevalence and typology among journals in extension-related fields. STUDY DESIGN AND SETTING A published protocol defined search strategies to identify journals publishing observational studies (2007-2017) across seven fields relating to STROBE extensions. We extracted text regarding STROBE, seven STROBE extensions, the reporting guidelines Consolidated Standards of Reporting Trials (CONSORT) and Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA), and transparent reporting documents/groups: the International Committee of Medical Journal Editors (ICMJE), the Committee on Publication Ethics (COPE), and the Enhancing the QUAlity and Transparency Of health Research (EQUATOR) Network. Relationships between endorsing STROBE, endorsing other guidelines, and journal impact factor were tested using chi-square and Mann-Whitney tests. RESULTS Of 257 unique journals, 12 (5%) required STROBE on submission, 22 (9%) suggested its use, 12 (5%) recommended a "relevant guideline", 72 (28%) mentioned it indirectly (via editorial policies or ICMJE recommendations), and 139 (54%) did not mention STROBE. The relevant extension was required by 2 (<1%) journals; 4 (1%) suggested its use. STROBE endorsement was not associated with journal impact indices but was associated with CONSORT and PRISMA endorsement. CONCLUSION Reporting guideline endorsement rates are low; information is vague and scattered. Unambiguous language is needed to improve adherence to reporting guidelines and increase the quality of reporting.
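The chi-square tests mentioned here compare endorsement rates across 2x2 tables (e.g., STROBE endorsement vs CONSORT endorsement). A minimal sketch of the Pearson statistic (df = 1, no continuity correction) with illustrative counts only, since the study's raw tables are not given in the abstract:

```python
def chi_square_2x2(a: int, b: int, c: int, d: int) -> float:
    """Pearson chi-square statistic (df=1) for a 2x2 contingency table
    [[a, b], [c, d]], without continuity correction."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Hypothetical table: STROBE endorsement (rows) vs CONSORT endorsement
# (columns); illustrative counts only, not the study's data.
stat = chi_square_2x2(40, 20, 30, 90)
significant = stat > 3.841  # critical value for df=1 at alpha=0.05
```

Under the null of no association the statistic follows a chi-square distribution with one degree of freedom, so values above 3.841 are significant at the 5% level; the equivalent shortcut formula is what `scipy.stats.chi2_contingency` computes for a 2x2 table with `correction=False`.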
Affiliation(s)
- Melissa K Sharp
- Department of Psychology, University of Split, Faculty of Humanities and Social Sciences, Split, Croatia; INSERM, U1153 Epidemiology and Biostatistics Sorbonne Paris Cité Research Center (CRESS), Methods of Therapeutic Evaluation of Chronic Diseases Team (METHODS), Paris, F-75014 France; Paris Descartes University, Sorbonne Paris Cité, France.
- Guadalupe Gómez
- Universitat Politècnica de Catalunya-BarcelonaTech, Departament d'Estadística i Investigació Operativa, Barcelona, Spain
- Elizabeth Wager
- Sideview, Buckinghamshire, UK; University of Split, School of Medicine, Split, Croatia
- Douglas G Altman
- Centre for Statistics in Medicine, University of Oxford, Oxford, UK
- Darko Hren
- Department of Psychology, University of Split, Faculty of Humanities and Social Sciences, Split, Croatia
22
Otto CM. Heartbeat: Reporting guidelines for high quality clinical cardiology research. Heart 2018; 104:707-709. [PMID: 29650796 DOI: 10.1136/heartjnl-2018-313339] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 11/04/2022] Open