1
Puljak L, Bala MM, Zając J, Meštrović T, Buttigieg S, Yanakoulia M, Briel M, Lunny C, Lesniak W, Poklepović Peričić T, Alonso-Coello P, Clarke M, Djulbegovic B, Gartlehner G, Giannakou K, Glenny AM, Glenton C, Guyatt G, Hemkens LG, Ioannidis JPA, Jaeschke R, Juhl Jørgensen K, Martins-Pfeifer CC, Marušić A, Mbuagbaw L, Meneses Echavez JF, Moher D, Nussbaumer-Streit B, Page MJ, Pérez-Gaxiola G, Robinson KA, Salanti G, Saldanha IJ, Savović J, Thomas J, Tricco AC, Tugwell P, van Hoof J, Pieper D. Methods proposed for monitoring the implementation of evidence-based research: a cross-sectional study. J Clin Epidemiol 2024; 168:111247. PMID: 38185190. DOI: 10.1016/j.jclinepi.2024.111247.
Abstract
OBJECTIVES Evidence-based research (EBR) is the systematic and transparent use of prior research to inform a new study so that it answers questions that matter in a valid, efficient, and accessible manner. This study surveyed experts about existing (e.g., citation analysis) and new methods for monitoring EBR and collected ideas about implementing these methods. STUDY DESIGN AND SETTING We conducted a cross-sectional study via an online survey between November 2022 and March 2023. Participants were experts from the fields of evidence synthesis and research methodology in health research. Open-ended questions were coded by recurring themes; descriptive statistics were used for quantitative questions. RESULTS Twenty-eight expert participants suggested that citation analysis should be supplemented with content evaluation (not just what is cited but also in which context), content expert involvement, and assessment of the quality of cited systematic reviews. They also suggested that citation analysis could be facilitated with automation tools. They emphasized that EBR monitoring should be conducted by ethics committees and funding bodies before the research starts. The challenges identified for monitoring EBR implementation were resource constraints and a lack of clarity about who is responsible for EBR monitoring. CONCLUSION Ideas proposed in this study for monitoring the implementation of EBR can be used to refine methods and define responsibility but should be further explored in terms of feasibility and acceptability. Different methods may be needed to determine whether the use of EBR is improving over time.
Affiliation(s)
- Livia Puljak
- Center for Evidence-Based Medicine and Healthcare, Catholic University of Croatia, Zagreb, Croatia.
- Małgorzata M Bala
- Systematic Reviews Unit, Department of Hygiene and Dietetics, Chair of Epidemiology and Preventive Medicine, Jagiellonian University Medical College, Kraków, Poland
- Joanna Zając
- Department of Hygiene and Dietetics, Chair of Epidemiology and Preventive Medicine, Jagiellonian University Medical College, Kraków, Poland
- Sandra Buttigieg
- Department of Health Systems Management and Leadership, Faculty of Health Sciences, University of Malta, Msida, Malta
- Mary Yanakoulia
- Department of Nutrition and Dietetics, Harokopio University, Athens, Greece
- Matthias Briel
- Division of Clinical Epidemiology, Department of Clinical Research, University Hospital Basel, Basel, Switzerland; University of Basel, Basel, Switzerland
- Carole Lunny
- Knowledge Translation Program, Li Ka Shing Knowledge Institute, St. Michael's Hospital, Unity Health Toronto, Toronto, Ontario, Canada; Cochrane Hypertension Review Group, The Therapeutics Initiative, University of British Columbia, Vancouver, British Columbia, Canada
- Tina Poklepović Peričić
- Department of Prosthodontics, Study of Dental Medicine, University of Split School of Medicine, Split, Croatia
- Pablo Alonso-Coello
- Institut de Recerca Sant Pau (IR SANT PAU), Iberoamerican Cochrane Center, Barcelona, Spain; Centro de Investigación Biomédica en Red de Epidemiología y Salud Pública (CIBERESP), Madrid, Spain
- Mike Clarke
- Northern Ireland Clinical Trials Unit (NICTU), Belfast, UK; Centre for Public Health, Queen's University Belfast, Belfast, UK
- Benjamin Djulbegovic
- Division of Hematology/Oncology, Department of Medicine, Medical University of South Carolina, Charleston, South Carolina, USA
- Gerald Gartlehner
- RTI International, The RTI International-University of North Carolina Evidence-Based Practice Center, Durham, NC, USA; Department for Evidence-Based Medicine and Evaluation, University for Continuing Education Krems, Krems, Austria
- Konstantinos Giannakou
- Department of Health Sciences, School of Sciences, European University Cyprus, Nicosia, Cyprus
- Anne-Marie Glenny
- Division of Dentistry, Cochrane Oral Health, School of Medical Sciences, Faculty of Biology, Medicine and Health, University of Manchester, Manchester, UK
- Claire Glenton
- Western Norway University of Applied Sciences, Bergen, Norway
- Gordon Guyatt
- Department of Health Research Methods, Evidence and Impact, McMaster University, Hamilton, Ontario, Canada
- Lars G Hemkens
- Research Center for Clinical Neuroimmunology and Neuroscience Basel (RC2NB), University Hospital Basel, University of Basel, Basel, Switzerland; Department of Clinical Research, University Hospital Basel, University of Basel, Basel, Switzerland; Meta-Research Innovation Center at Stanford (METRICS), Stanford University, Stanford, CA, USA; Meta-Research Innovation Center Berlin (METRIC-B), Berlin Institute of Health, Berlin, Germany
- John P A Ioannidis
- Meta-Research Innovation Center at Stanford (METRICS), Stanford University, Stanford, CA, USA; Department of Medicine, Stanford University, Stanford, CA, USA; Department of Epidemiology and Population Health, Stanford University, Stanford, CA, USA; Department of Biomedical Data Science, Stanford University, Stanford, CA, USA; Department of Statistics, Stanford University, Stanford, CA, USA
- Roman Jaeschke
- Department of Health Research Methods, Evidence, and Impact, McMaster University, Hamilton, Ontario, Canada; Department of Medicine, McMaster University, Hamilton, Ontario, Canada
- Karsten Juhl Jørgensen
- Department of Clinical Research, Cochrane Denmark and Centre for Evidence-Based Medicine Odense, University of Southern Denmark, Odense, Denmark
- Ana Marušić
- Department of Research in Biomedicine and Health, Center for Evidence-based Medicine, University of Split School of Medicine, Split, Croatia
- Lawrence Mbuagbaw
- Department of Health Research Methods, Evidence and Impact, McMaster University, Hamilton, Ontario, Canada; Biostatistics Unit, Father Sean O'Sullivan Research Centre, St Joseph's Healthcare Hamilton, Hamilton, Ontario, Canada
- Jose Francisco Meneses Echavez
- Facultad de Cultura Física, Deporte y Recreación, Universidad Santo Tomás, Bogotá, Colombia; Division of Health Services at Norwegian Institute of Public Health, Bergen, Norway
- David Moher
- Ottawa Hospital Research Institute and University of Ottawa, Ottawa, Canada
- Barbara Nussbaumer-Streit
- Department for Evidence-Based Medicine and Evaluation, University for Continuing Education Krems, Krems, Austria
- Matthew J Page
- Methods in Evidence Synthesis Unit, School of Public Health and Preventive Medicine, Monash University, Melbourne, Australia
- Karen A Robinson
- Western Norway University of Applied Sciences, Bergen, Norway; Division of General Internal Medicine, Department of Medicine, School of Medicine, Johns Hopkins University, Baltimore, MD, USA
- Georgia Salanti
- Institute of Social and Preventive Medicine, University of Bern, Bern, Switzerland
- Ian J Saldanha
- Department of Epidemiology, Center for Clinical Trials and Evidence Synthesis, Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, USA
- Jelena Savović
- Population Health Sciences, Bristol Medical School, University of Bristol, Bristol, UK; NIHR Applied Research Collaboration West (ARC West) at University Hospitals Bristol and Weston NHS Foundation Trust, Bristol, UK
- James Thomas
- EPPI-Centre, UCL Social Research Institute, University College London, London, UK
- Andrea C Tricco
- Li Ka Shing Knowledge Institute of St. Michael's Hospital, Unity Health Toronto, Toronto, Canada; Epidemiology Division and the Institute of Health Management, Policy, and Evaluation, Dalla Lana School of Public Health, University of Toronto, Toronto, Canada; Queen's Collaboration for Health Care Quality Joanna Briggs Institute Centre of Excellence, Queen's University, Kingston, Canada
- Peter Tugwell
- Department of Medicine, Faculty of Medicine, University of Ottawa, Ottawa, Ontario, Canada; Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, Ontario, Canada
- Joost van Hoof
- The Hague University of Applied Sciences, Faculty Social Work & Education, The Hague, The Netherlands; Institute of Spatial Management, Faculty of Spatial Management and Landscape Architecture, Wrocław University of Environmental and Life Sciences, Wrocław, Poland
- Dawid Pieper
- Faculty of Health Sciences Brandenburg, Brandenburg Medical School Theodor Fontane, Institute for Health Services and Health System Research, Rüdersdorf, Germany; Center for Health Services Research, Brandenburg Medical School Theodor Fontane, Rüdersdorf, Germany
2
Nørgaard B, Briel M, Chrysostomou S, Ristic Medic D, Buttigieg SC, Kiisk E, Puljak L, Bala M, Pericic TP, Lesniak W, Zając J, Lund H, Pieper D. A systematic review of meta-research studies finds substantial methodological heterogeneity in citation analyses to monitor evidence-based research. J Clin Epidemiol 2022; 150:126-141. DOI: 10.1016/j.jclinepi.2022.06.021.
3
Nørgaard B, Draborg E, Andreasen J, Juhl CB, Yost J, Brunnhuber K, Robinson KA, Lund H. Systematic Reviews are Rarely Used to Inform Study Design - a Systematic Review and Meta-analysis. J Clin Epidemiol 2022; 145:1-13. PMID: 35045317. DOI: 10.1016/j.jclinepi.2022.01.007.
Abstract
OBJECTIVE Our aim was to identify and synthesize the results from meta-research studies to determine whether and how authors of original studies in clinical health research use systematic reviews when designing new studies. STUDY DESIGN AND SETTING For this systematic review, we searched MEDLINE (OVID), Embase (OVID) and the Cochrane Methodology Register. We included meta-research studies; the primary outcome was the percentage of original studies using systematic reviews to design their study. Risk of bias was assessed using a ten-item list created ad hoc. The results are presented both as a narrative synthesis and a meta-analysis. RESULTS Sixteen studies were included. The use of a systematic review to inform the design of new clinical studies varied between 0% and 73%, with a mean percentage of 17%. The number of components of the design in which information from previous systematic reviews was used varied from three to eleven. CONCLUSION Clinical health research is characterized by variability in the extent to which systematic reviews are used to guide study design. An evidence-based research (EBR) approach to the design of new clinical health studies is necessary to decrease potential research redundancy and increase end-user value.
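The pooled percentage reported above can be illustrated with a fixed-effect pooling of proportions on the logit scale. This is a generic sketch, not the review's actual method; the study counts below are hypothetical, and the 0.5 continuity correction is one common way to handle a zero cell:

```python
import math

# Hypothetical (study-level) counts: (studies that used a systematic review, total studies).
studies = [(5, 100), (30, 120), (0, 40), (12, 60)]

logits, weights = [], []
for x, n in studies:
    x_adj, n_adj = x + 0.5, n + 1.0          # continuity correction for the zero cell
    p = x_adj / n_adj
    logits.append(math.log(p / (1 - p)))     # logit of the study proportion
    weights.append(1 / (1 / x_adj + 1 / (n_adj - x_adj)))  # inverse-variance weight

# Inverse-variance pooled logit, back-transformed to a proportion.
pooled_logit = sum(w * l for w, l in zip(weights, logits)) / sum(weights)
pooled_p = 1 / (1 + math.exp(-pooled_logit))
print(f"pooled proportion ≈ {pooled_p:.1%}")
```

A random-effects model would typically be preferred when, as in the review, proportions range as widely as 0% to 73%.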
Affiliation(s)
- Birgitte Nørgaard
- Department of Public Health, University of Southern Denmark, Odense, Denmark.
- Eva Draborg
- Department of Public Health, University of Southern Denmark, Odense, Denmark
- Jane Andreasen
- Department of Physiotherapy and Occupational Therapy, Aalborg University Hospital, Denmark and Public Health and Epidemiology Group, Department of Health, Science and Technology, Aalborg University, Aalborg, Denmark
- Carsten Bogh Juhl
- Department of Sports Science and Clinical Biomechanics, University of Southern Denmark and Department of Physiotherapy and Occupational Therapy, University of Copenhagen Herlev and Gentofte, Denmark
- Jennifer Yost
- M. Louise Fitzpatrick College of Nursing, Villanova University, Philadelphia, Pennsylvania, USA
- Hans Lund
- Department of Evidence-Based Practice, Western Norway University of Applied Sciences, Bergen, Norway
4
Clayton GL, Elliott D, Higgins JPT, Jones HE. Use of external evidence for design and Bayesian analysis of clinical trials: a qualitative study of trialists' views. Trials 2021; 22:789. PMID: 34749778. PMCID: PMC8577005. DOI: 10.1186/s13063-021-05759-8.
Abstract
BACKGROUND Evidence from previous studies is often used relatively informally in the design of clinical trials: for example, a systematic review to indicate whether a gap in the current evidence base justifies a new trial. External evidence can be used more formally in both trial design and analysis, by explicitly incorporating a synthesis of it in a Bayesian framework. However, it is unclear how common this is in practice or the extent to which it is considered controversial. In this qualitative study, we explored attitudes towards, and experiences of, trialists in incorporating synthesised external evidence through the Bayesian design or analysis of a trial. METHODS Semi-structured interviews were conducted with 16 trialists: 13 statisticians and three clinicians. Participants were recruited across several universities and trials units in the United Kingdom using snowball and purposeful sampling. Data were analysed using thematic analysis and techniques of constant comparison. RESULTS Trialists used existing evidence in many ways in trial design, for example, to justify a gap in the evidence base and inform parameters in sample size calculations. However, no one in our sample reported using such evidence in a Bayesian framework. Participants tended to equate Bayesian analysis with the incorporation of prior information on the intervention effect and were less aware of the potential to incorporate data on other parameters. When introduced to the concepts, many trialists felt they could be making more use of existing data to inform the design and analysis of a trial in particular scenarios. For example, some felt existing data could be used more formally to inform background adverse event rates, rather than relying on clinical opinion as to whether there are potential safety concerns. 
However, several barriers to implementing these methods in practice were identified, including concerns about the relevance of external data, acceptability of Bayesian methods, lack of confidence in Bayesian methods and software, and practical issues, such as difficulties accessing relevant data. CONCLUSIONS Despite trialists recognising that more formal use of external evidence could be advantageous over current approaches in some areas and useful as sensitivity analyses, there are still barriers to such use in practice.
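The more formal use of external evidence that the interviewees discussed can be sketched in its simplest form: a normal prior on the treatment effect, derived from an evidence synthesis, combined with the new trial's estimate by normal-normal conjugate updating. All numbers below are hypothetical:

```python
import math

# Hypothetical prior from an external evidence synthesis (normal on a
# treatment-effect scale such as a log odds ratio).
prior_mean, prior_sd = -0.2, 0.3
# Hypothetical effect estimate and standard error from the new trial.
trial_est, trial_se = -0.5, 0.25

# Precision-weighted combination (normal-normal conjugacy).
prec_prior = 1 / prior_sd**2
prec_trial = 1 / trial_se**2
post_var = 1 / (prec_prior + prec_trial)
post_mean = post_var * (prec_prior * prior_mean + prec_trial * trial_est)
post_sd = math.sqrt(post_var)

print(f"posterior: mean {post_mean:.3f}, SD {post_sd:.3f}")
```

The posterior mean lies between the prior mean and the trial estimate, and the posterior SD is smaller than either input SD, which is the sense in which external evidence sharpens the analysis.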
Affiliation(s)
- Gemma L Clayton
- Department of Population Health Sciences, Bristol Medical School, University of Bristol, Bristol, UK.
- Daisy Elliott
- Department of Population Health Sciences, Bristol Medical School, University of Bristol, Bristol, UK
- Bristol Centre for Surgical Research, Population Health Sciences, Bristol Medical School, University of Bristol, Bristol, UK
- Julian P T Higgins
- Department of Population Health Sciences, Bristol Medical School, University of Bristol, Bristol, UK
- NIHR Applied Research Collaboration West (ARC West) at University Hospitals Bristol and Weston NHS Foundation Trust, Bristol, UK
- Hayley E Jones
- Department of Population Health Sciences, Bristol Medical School, University of Bristol, Bristol, UK
5
McLennan S, Nussbaumer-Streit B, Hemkens LG, Briel M. Barriers and Facilitating Factors for Conducting Systematic Evidence Assessments in Academic Clinical Trials. JAMA Netw Open 2021; 4:e2136577. PMID: 34846522. PMCID: PMC8634056. DOI: 10.1001/jamanetworkopen.2021.36577.
Abstract
IMPORTANCE A systematic assessment of existing research should justify the conduct and inform the design of new clinical research but is often lacking. There is little research on the barriers to and factors facilitating systematic evidence assessments. OBJECTIVE To examine the practices and attitudes of Swiss stakeholders and international funders regarding conducting systematic evidence assessments in academic clinical trials. DESIGN, SETTING, AND PARTICIPANTS In this qualitative study, individual semistructured qualitative interviews were conducted between February and August 2020 with 48 Swiss stakeholder groups (27 primary investigators, 9 funders and sponsors, 6 clinical trial support organizations, and 6 ethics committee members) and between January and March 2021 with 9 international funders of clinical trials from North America and Europe with a reputation for requiring systematic evidence synthesis in applications for academic clinical trials. MAIN OUTCOMES AND MEASURES The main outcomes were practices and attitudes of Swiss stakeholders and international funders regarding conducting systematic evidence assessments in academic clinical trials. Interviews were analyzed using conventional content analysis. RESULTS Of the 57 participants, 40 (70.2%) were male. Participants universally acknowledged that a comprehensive understanding of the previous evidence is important but reported wide variation regarding how this should be achieved. Participants reported that the conduct of formal systematic reviews was currently not expected before most clinical trials, but most international funders reported expecting a systematic search for the existing evidence. Whereas time and resources were reported by all participants as barriers to conducting systematic reviews, the Swiss research ecosystem was reported not to be as supportive of a systematic approach compared with international settings. 
CONCLUSIONS AND RELEVANCE In this qualitative study, Swiss stakeholders and international funders generally agreed that new clinical trials should be justified by a systematic evidence assessment but that barriers on individual, organizational, and political levels kept them from implementing it. More explicit requirements from funders appear to be needed to clarify the required level of comprehensiveness in summarizing existing evidence for different types of clinical trials.
Affiliation(s)
- Stuart McLennan
- Department of Clinical Research, Basel Institute for Clinical Epidemiology and Biostatistics, University of Basel and University Hospital Basel, Basel, Switzerland
- Institute of History and Ethics in Medicine, TUM School of Medicine, Technical University of Munich, Munich, Germany
- Barbara Nussbaumer-Streit
- Cochrane Austria, Department for Evidence-based Medicine and Evaluation, Danube University Krems, Krems, Austria
- Lars G. Hemkens
- Department of Clinical Research, Basel Institute for Clinical Epidemiology and Biostatistics, University of Basel and University Hospital Basel, Basel, Switzerland
- Meta-Research Innovation Center at Stanford, Stanford University, Stanford, California
- Meta-Research Innovation Center Berlin, Berlin Institute of Health, Berlin, Germany
- Matthias Briel
- Department of Clinical Research, Basel Institute for Clinical Epidemiology and Biostatistics, University of Basel and University Hospital Basel, Basel, Switzerland
- Department of Health Research Methods, Evidence, and Impact, McMaster University, Hamilton, Ontario, Canada
6
Low dissemination rates, non-transparency of trial premature cessation and late registration in child mental health: observational study of registered interventional trials. Eur Child Adolesc Psychiatry 2020; 29:813-825. PMID: 31486894. DOI: 10.1007/s00787-019-01392-8.
Abstract
The aim of this observational study was to explore trial premature cessation, non-publication and trial registration time in child mental health. Data were extracted for "closed" trials in the ClinicalTrials.gov registry and the European Union Clinical Trials Register (EUCTR), together with corresponding publications of completed trials indexed in three databases (PubMed, Scopus and Google Scholar). We restricted the extraction to the 'Behaviours and Mental Disorders' category and participants' age of 0-17 years. Outcome measures were trial completion, results reporting within a year after trial completion, publishing an article in a peer-reviewed journal within the average time to publish (729 days), and registration time. The number of EUCTR trials was relatively small (n = 35) and contained many inconsistencies. Of the 827 "closed" trials extracted from ClinicalTrials.gov, 69% were completed, 24.2% of prematurely ceased trials did not report reasons for early termination, 12.2% of the completed trials had results reported within a year, and 29.3% had an article published within 24 months after completion. Medium-sized (100-499 participants) and behavioural trials had higher chances of being successfully completed. Medium-sized and industry-funded trials were associated with results reporting. Chances of publishing an article were lower for industry-funded trials. Industry funding and drug interventions were related to timely registration. Large sample size and non-industry funding were related to retrospective registration, which was recorded more often in recent years than before (we observed trials registered from 2002 until 2017). This study found low dissemination rates in the field of child mental health, with worrying under-reporting of the causes of premature termination. These findings indicate that more children are being subjected to the unnecessary risk that comes with trial participation.
7
Nikolakopoulou A, Trelle S, Sutton AJ, Egger M, Salanti G. Synthesizing existing evidence to design future trials: survey of methodologists from European institutions. Trials 2019; 20:334. PMID: 31174597. PMCID: PMC6555919. DOI: 10.1186/s13063-019-3449-6.
Abstract
Background ‘Conditional trial design’ is a framework for efficiently planning new clinical trials based on a network of relevant existing trials. The framework considers whether new trials are required and how the existing evidence can be used to answer the research question and plan future research. The potential of this approach has not been fully realized. Methods We conducted an online survey among trial statisticians, methodologists, and users of evidence synthesis research using referral sampling to capture opinions about the conditional trial design framework and current practices among clinical researchers. The questions included in the survey were related to the decision of whether a meta-analysis answers the research question, the optimal way to synthesize available evidence, which relates to the acceptability of network meta-analysis, and the use of evidence synthesis in the planning of new studies. Results In total, 76 researchers completed the survey. Two out of three survey participants (65%) were willing to possibly or definitely consider using evidence synthesis to design a future clinical trial and around half of the participants would give priority to such a trial design. The median rating of the frequency of using such a trial design was 0.41 on a scale from 0 (never) to 1 (always). Major barriers to adopting conditional trial design include the current regulatory paradigm and the policies of funding agencies and sponsors. Conclusions Participants reported moderate interest in using evidence synthesis methods in the design of future trials. They indicated that a major paradigm shift is required before the use of network meta-analysis is regularly employed in the design of trials.
Affiliation(s)
- Adriani Nikolakopoulou
- Institute of Social and Preventive Medicine (ISPM), University of Bern, Bern, Switzerland.
- Sven Trelle
- CTU Bern, University of Bern, Bern, Switzerland
- Alex J Sutton
- Department of Health Sciences, College of Medicine, Biological Sciences and Psychology, University of Leicester, Leicester, UK
- Matthias Egger
- Institute of Social and Preventive Medicine (ISPM), University of Bern, Bern, Switzerland
- Georgia Salanti
- Institute of Social and Preventive Medicine (ISPM), University of Bern, Bern, Switzerland
8
Jones HE, Ades AE, Sutton AJ, Welton NJ. Use of a random effects meta-analysis in the design and analysis of a new clinical trial. Stat Med 2018; 37:4665-4679. PMID: 30187505. PMCID: PMC6484819. DOI: 10.1002/sim.7948.
Abstract
In designing a randomized controlled trial, it has been argued that trialists should consider existing evidence about the likely intervention effect. One approach is to form a prior distribution for the intervention effect based on a meta-analysis of previous studies and then power the trial on its ability to affect the posterior distribution in a Bayesian analysis. Alternatively, methods have been proposed to calculate the power of the trial to influence the "pooled" estimate in an updated meta-analysis. These two approaches can give very different results if the existing evidence is heterogeneous and summarised using a random effects meta-analysis. We argue that the random effects mean will rarely represent the trialist's target parameter, so it will rarely be appropriate to power a trial based on its impact upon the random effects mean. Furthermore, the random effects mean will not generally provide an appropriate prior distribution. More appropriate alternatives include the predictive distribution and the shrinkage estimate for the most similar study. Consideration of the impact of the trial on the entire random effects distribution might sometimes be appropriate. We describe how beliefs about likely sources of heterogeneity have implications for how the previous evidence should be used and can have a profound impact on the expected power of the new trial. We conclude that the likely causes of heterogeneity among existing studies need careful consideration. In the absence of explanations for heterogeneity, we suggest using the predictive distribution from the meta-analysis as the basis for a prior distribution for the intervention effect.
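The distinction drawn above between the random effects mean and the predictive distribution can be sketched numerically. The effect estimates below are hypothetical, and the DerSimonian-Laird estimator stands in for whichever heterogeneity estimator a trialist prefers:

```python
import math

# Hypothetical effect estimates (e.g. log odds ratios) and SEs from k previous trials.
y  = [-0.40, -0.10, -0.65, 0.05, -0.30]
se = [0.20, 0.25, 0.15, 0.30, 0.22]
k = len(y)

# DerSimonian-Laird estimate of the between-trial variance tau^2.
w_fixed = [1 / s**2 for s in se]
y_fixed = sum(w * yi for w, yi in zip(w_fixed, y)) / sum(w_fixed)
Q = sum(w * (yi - y_fixed)**2 for w, yi in zip(w_fixed, y))
C = sum(w_fixed) - sum(w**2 for w in w_fixed) / sum(w_fixed)
tau2 = max(0.0, (Q - (k - 1)) / C)

# Random effects pooled mean and its standard error.
w_re = [1 / (s**2 + tau2) for s in se]
mu = sum(w * yi for w, yi in zip(w_re, y)) / sum(w_re)
se_mu = math.sqrt(1 / sum(w_re))

# Predictive distribution for the effect in a NEW trial: same mean, but with
# variance inflated by tau^2 -- this, rather than N(mu, se_mu^2), is the
# suggested basis for a prior on the new trial's intervention effect.
sd_pred = math.sqrt(se_mu**2 + tau2)

print(f"tau^2 = {tau2:.3f}; pooled mean = {mu:.3f} (SE {se_mu:.3f})")
print(f"predictive SD for a new trial = {sd_pred:.3f}")
```

Because the predictive SD always exceeds the SE of the pooled mean when tau^2 > 0, a prior built on the predictive distribution is appropriately more cautious under heterogeneity. (A more exact predictive interval would also use a t distribution with k - 2 degrees of freedom to account for uncertainty in tau^2.)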
Affiliation(s)
- Hayley E Jones
- Population Health Sciences, Bristol Medical School, University of Bristol, Bristol, UK
- A E Ades
- Population Health Sciences, Bristol Medical School, University of Bristol, Bristol, UK
- Alex J Sutton
- Department of Health Sciences, University of Leicester, Leicester, UK
- Nicky J Welton
- Population Health Sciences, Bristol Medical School, University of Bristol, Bristol, UK
9
Sharma T, Choudhury M, Rejón-Parrilla JC, Jonsson P, Garner S. Using HTA and guideline development as a tool for research priority setting the NICE way: reducing research waste by identifying the right research to fund. BMJ Open 2018; 8:e019777. PMID: 29523564. PMCID: PMC5855177. DOI: 10.1136/bmjopen-2017-019777.
Abstract
BACKGROUND The National Institute for Health and Care Excellence (NICE) was established in 1999 and provides national guidance and advice to improve health and social care. Several steps in the research cycle have been identified that can support the reduction of waste that occurs in biomedical research. The first step in the process is ensuring appropriate research priority setting occurs so only the questions that are needed to fill existing gaps in the evidence are funded. This paper summarises the research priority setting processes at NICE. METHODS NICE uses its guidance production processes to identify and prioritise research questions through systematic reviews, economic analyses and stakeholder consultations and then highlights those priorities by engagement with the research community. NICE also highlights its methodological areas for research to ensure the appropriate development and growth of the evidence landscape. RESULTS NICE has prioritised research questions through its guidance production and methodological work and has successfully had several research products funded through the National Institute for Health Research and Medical Research Council. This paper summarises those activities and results. CONCLUSIONS This activity of NICE therefore reduces research waste by ensuring that the research it recommends has been systematically prioritised through evidence reviews and stakeholder input.
Affiliation(s)
- Tarang Sharma
- Faculty of Health and Medical Sciences, University of Copenhagen, Copenhagen, Denmark
- Moni Choudhury
- Science Policy and Research, National Institute for Health and Care Excellence, London, UK
- Pall Jonsson
- Science Policy and Research, National Institute for Health and Care Excellence, Manchester, UK
- Sarah Garner
- Science Policy and Research, National Institute for Health and Care Excellence, London, UK
10
Cook A, Streit E, Davage G. Involving clinical experts in prioritising topics for health technology assessment: a randomised controlled trial. BMJ Open 2017; 7:e016104. PMID: 28827250. PMCID: PMC5629658. DOI: 10.1136/bmjopen-2017-016104.
Abstract
OBJECTIVES The objective of this study was to explore whether reducing the material supplied to external experts during peer review, and decreasing the burden of response, would maintain review quality as an input into prioritising research questions for a major research funder. METHODS AND ANALYSIS Clinical experts who agreed to review documents outlining research for potential commissioning were screened for eligibility and randomised in a factorial design to two types of review materials (long document versus short document) and response modes (structured review form versus free text email response). Previous and current members of the funder's programme groups were excluded. Response quality was assessed by use of a four-point scoring tool and analysed by intention to treat. RESULTS 554 consecutive experts were screened for eligibility and 460 were randomised (232 and 228 to long document or short document, respectively; 230 each to structured response or free text). 356 participants provided reviews, 90 did not respond and 14 were excluded after randomisation as not eligible. The pooled mean quality score was 2.4 (SD=0.95). The short document scored 0.037 (Cohen's d=0.039) extra quality points over the long document arm, and the structured response scored 0.335 (Cohen's d=0.353) over free text. The allocation did not appear to have any effect on the experts' willingness to engage with the task. CONCLUSIONS Neither the short nor the long document outlining suggested research was shown to be superior. However, providing a structured form to guide the expert response provided more useful information than allowing free text. The funder should continue to use a structured form to gather responses. It would be acceptable to provide shorter documents to reviewers, if there were reasons to do so. TRIAL REGISTRATION NUMBER ANZCTR12614000167662.
Affiliation(s)
- Andrew Cook
- Wessex Institute, University of Southampton, Southampton, UK
- University Hospital Southampton NHS Foundation Trust, Southampton, UK
- Elke Streit
- NIHR Evaluation, Trials and Studies Coordinating Centre, University of Southampton, Southampton, UK
- Gill Davage
- NIHR Evaluation, Trials and Studies Coordinating Centre, University of Southampton, Southampton, UK
11
Williamson PR, Altman DG, Bagley H, Barnes KL, Blazeby JM, Brookes ST, Clarke M, Gargon E, Gorst S, Harman N, Kirkham JJ, McNair A, Prinsen CAC, Schmitt J, Terwee CB, Young B. The COMET Handbook: version 1.0. Trials 2017; 18:280. [PMID: 28681707 PMCID: PMC5499094 DOI: 10.1186/s13063-017-1978-4] [Citation(s) in RCA: 1057] [Impact Index Per Article: 151.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/08/2023] Open
Abstract
The selection of appropriate outcomes is crucial when designing clinical trials in order to compare the effects of different interventions directly. For the findings to influence policy and practice, the outcomes need to be relevant and important to key stakeholders including patients and the public, health care professionals and others making decisions about health care. It is now widely acknowledged that insufficient attention has been paid to the choice of outcomes measured in clinical trials. Researchers are increasingly addressing this issue through the development and use of a core outcome set, an agreed standardised collection of outcomes which should be measured and reported, as a minimum, in all trials for a specific clinical area. Accumulating work in this area has identified the need for guidance on the development, implementation, evaluation and updating of core outcome sets. This Handbook, developed by the COMET Initiative, brings together current thinking and methodological research regarding those issues. We recommend a four-step process to develop a core outcome set. The aim is to update the contents of the Handbook as further research is identified.
Affiliation(s)
- Paula R. Williamson
- MRC North West Hub for Trials Methodology Research, Department of Biostatistics, University of Liverpool, Block F Waterhouse Building, 1-5 Brownlow Street, Liverpool, L69 3GL UK
- Douglas G. Altman
- Centre for Statistics in Medicine, Nuffield Department of Orthopaedics, Rheumatology and Musculoskeletal Sciences, University of Oxford, Oxford, UK
- Heather Bagley
- MRC North West Hub for Trials Methodology Research, Department of Biostatistics, University of Liverpool, Block F Waterhouse Building, 1-5 Brownlow Street, Liverpool, L69 3GL UK
- Karen L. Barnes
- MRC North West Hub for Trials Methodology Research, Department of Biostatistics, University of Liverpool, Block F Waterhouse Building, 1-5 Brownlow Street, Liverpool, L69 3GL UK
- Jane M. Blazeby
- MRC ConDuCT II Hub for Trials Methodology Research, School of Social and Community Medicine, University of Bristol, Bristol, UK
- Sara T. Brookes
- MRC ConDuCT II Hub for Trials Methodology Research, School of Social and Community Medicine, University of Bristol, Bristol, UK
- Mike Clarke
- Centre for Public Health, Queen’s University Belfast, Belfast, UK
- National University of Ireland Galway and HRB Trials Methodology Research Network, Galway, Ireland
- Elizabeth Gargon
- MRC North West Hub for Trials Methodology Research, Department of Biostatistics, University of Liverpool, Block F Waterhouse Building, 1-5 Brownlow Street, Liverpool, L69 3GL UK
- Sarah Gorst
- MRC North West Hub for Trials Methodology Research, Department of Biostatistics, University of Liverpool, Block F Waterhouse Building, 1-5 Brownlow Street, Liverpool, L69 3GL UK
- Nicola Harman
- MRC North West Hub for Trials Methodology Research, Department of Biostatistics, University of Liverpool, Block F Waterhouse Building, 1-5 Brownlow Street, Liverpool, L69 3GL UK
- Jamie J. Kirkham
- MRC North West Hub for Trials Methodology Research, Department of Biostatistics, University of Liverpool, Block F Waterhouse Building, 1-5 Brownlow Street, Liverpool, L69 3GL UK
- Angus McNair
- MRC ConDuCT II Hub for Trials Methodology Research, School of Social and Community Medicine, University of Bristol, Bristol, UK
- Cecilia A. C. Prinsen
- Department of Epidemiology and Biostatistics, EMGO+ Institute for Health and Care Research, VU University Medical Center, Amsterdam, The Netherlands
- Jochen Schmitt
- Center for Evidence-based Healthcare, Medizinische Fakultät, Technische Universität Dresden, Dresden, Germany
- Caroline B. Terwee
- Department of Epidemiology and Biostatistics, EMGO+ Institute for Health and Care Research, VU University Medical Center, Amsterdam, The Netherlands
- Bridget Young
- MRC North West Hub for Trials Methodology Research, Department of Biostatistics, University of Liverpool, Block F Waterhouse Building, 1-5 Brownlow Street, Liverpool, L69 3GL UK
12
Clayton GL, Smith IL, Higgins JPT, Mihaylova B, Thorpe B, Cicero R, Lokuge K, Forman JR, Tierney JF, White IR, Sharples LD, Jones HE. The INVEST project: investigating the use of evidence synthesis in the design and analysis of clinical trials. Trials 2017; 18:219. [PMID: 28506284 PMCID: PMC5433067 DOI: 10.1186/s13063-017-1955-y] [Citation(s) in RCA: 12] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/11/2017] [Accepted: 04/26/2017] [Indexed: 11/23/2022] Open
Abstract
BACKGROUND When designing and analysing clinical trials, using previous relevant information, perhaps in the form of evidence syntheses, can reduce research waste. We conducted the INVEST (INVestigating the use of Evidence Synthesis in the design and analysis of clinical Trials) survey to summarise the current use of evidence synthesis in trial design and analysis, to capture opinions of trialists and methodologists on such use, and to understand any barriers. METHODS Our sampling frame was all delegates attending the International Clinical Trials Methodology Conference in November 2015. Respondents were asked to indicate (1) their views on the use of evidence synthesis in trial design and analysis, (2) their own use during the past 10 years and (3) the three greatest barriers to use in practice. RESULTS Of approximately 638 attendees of the conference, 106 (17%) completed the survey, half of whom were statisticians. Support was generally high for using a description of previous evidence, a systematic review or a meta-analysis in trial design. Generally, respondents did not seem to be using evidence syntheses as often as they felt they should. For example, only 50% (42/84 relevant respondents) had used a meta-analysis to inform whether a trial is needed compared with 74% (62/84) indicating that this is desirable. Only 6% (5/81 relevant respondents) had used a value of information analysis to inform sample size calculations versus 22% (18/81) indicating support for this. Surprisingly large numbers of participants indicated support for, and previous use of, evidence syntheses in trial analysis. For example, 79% (79/100) of respondents indicated that external information about the treatment effect should be used to inform aspects of the analysis. The greatest perceived barrier to using evidence synthesis methods in trial design or analysis was time constraints, followed by a belief that the new trial was the first in the area. 
CONCLUSIONS Evidence syntheses can be resource-intensive, but their use in informing the design, conduct and analysis of clinical trials is widely considered desirable. We advocate additional research, training and investment in resources dedicated to ways in which evidence syntheses can be undertaken more efficiently, offering the potential for cost savings in the long term.
Affiliation(s)
- Gemma L. Clayton
- School of Social and Community Medicine, Faculty of Health Sciences, University of Bristol, Canynge Hall, 39 Whatley Road, Bristol, BS8 2PS UK
- Isabelle L. Smith
- Leeds Institute of Clinical Trials Research, University of Leeds, Leeds, UK
- Julian P. T. Higgins
- School of Social and Community Medicine, Faculty of Health Sciences, University of Bristol, Canynge Hall, 39 Whatley Road, Bristol, BS8 2PS UK
- Borislava Mihaylova
- Health Economics Research Centre, Nuffield Department of Population Health, University of Oxford, Oxford, UK
- Benjamin Thorpe
- Leeds Institute of Clinical Trials Research, University of Leeds, Leeds, UK
- Robert Cicero
- Leeds Institute of Clinical Trials Research, University of Leeds, Leeds, UK
- Kusal Lokuge
- Health Economics Research Centre, Nuffield Department of Population Health, University of Oxford, Oxford, UK
- Julia R. Forman
- Cambridge Clinical Trials Unit, University of Cambridge, Cambridge, UK
- Ian R. White
- MRC Biostatistics Unit, Cambridge Institute of Public Health, Cambridge, UK
- Hayley E. Jones
- School of Social and Community Medicine, Faculty of Health Sciences, University of Bristol, Canynge Hall, 39 Whatley Road, Bristol, BS8 2PS UK
13
Mahtani KR. All health researchers should begin their training by preparing at least one systematic review. J R Soc Med 2016; 109:264-8. [PMID: 27118697 PMCID: PMC4940997 DOI: 10.1177/0141076816643954] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/17/2022] Open
Affiliation(s)
- Kamal R Mahtani
- Centre for Evidence Based Medicine, Nuffield Department of Primary Care Health Sciences, University of Oxford, Oxford OX2 6GG, UK