1
Glynn D, Gc VS, Claxton K, Littlewood C, Rothery C. Rapid Assessment of the Need for Evidence: Applying the Principles of Value of Information to Research Prioritisation. Pharmacoeconomics 2024; 42:919-928. [PMID: 38900241] [DOI: 10.1007/s40273-024-01403-w]
Abstract
We propose a short-cut heuristic approach to rapidly estimate value of information (VOI) using information commonly reported in a research funding application to make a case for the need for further evaluative research. We develop a "Rapid VOI" approach, which focuses on uncertainty in the primary outcome of clinical effectiveness and uses this to explore the health consequences of decision uncertainty. We develop a freely accessible online tool, Rapid Assessment of the Need for Evidence (RANE), to allow for the efficient computation of the value of research. As a case study, the method was applied to a proposal for research on shoulder pain rehabilitation. The analysis was included as part of a successful application for research funding to the UK National Institute for Health and Care Research. Our approach enables research funders and applicants to rapidly estimate the value of proposed research. Rapid VOI relies on information that is readily available and reported in research funding applications. Rapid VOI supports research prioritisation and commissioning decisions where there is insufficient time and resources available to develop and validate complex decision-analytic models. The method provides a practical means for implementing VOI in practice, thus providing a starting point for deliberation and contributing to the transparency and accountability of research prioritisation decisions.
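The decision-uncertainty logic that underpins the VOI approach described in this abstract can be illustrated with a minimal Monte Carlo sketch of per-decision expected value of perfect information (EVPI). This is not the authors' Rapid VOI method or the RANE tool; the distribution and all parameter values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration: incremental net health benefit (INHB, in QALYs)
# of a new treatment vs. standard care, with uncertainty in the primary
# clinical-effectiveness outcome represented by a normal distribution.
inhb_draws = rng.normal(loc=0.05, scale=0.20, size=100_000)

# Value of the best decision under current information:
# adopt the treatment if its *mean* INHB is positive, else standard care.
ev_current = max(np.mean(inhb_draws), 0.0)

# Value under perfect information: on each draw, pick the better option.
ev_perfect = np.mean(np.maximum(inhb_draws, 0.0))

# EVPI: the expected health lost by deciding under current uncertainty,
# i.e. an upper bound on the per-decision value of further research.
evpi_per_patient = ev_perfect - ev_current
print(f"Per-decision EVPI: {evpi_per_patient:.4f} QALYs")
```

Scaling this per-decision quantity by the number of patients whose treatment the decision affects gives a population-level ceiling on the value of the proposed research.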
Affiliation(s)
- David Glynn
- Centre for Health Economics, University of York, York, UK.
- Vijay S Gc
- School of Human and Health Sciences, University of Huddersfield, Huddersfield, UK
- Karl Claxton
- Centre for Health Economics, University of York, York, UK
- Chris Littlewood
- Allied Health, Social Work & Wellbeing, Edge Hill University, Ormskirk, UK
- Claire Rothery
- Centre for Health Economics, University of York, York, UK
2
Glynn D, Nikolaidis G, Jankovic D, Welton NJ. Constructing Relative Effect Priors for Research Prioritization and Trial Design: A Meta-epidemiological Analysis. Med Decis Making 2023; 43:553-563. [PMID: 37057388] [PMCID: PMC10336712] [DOI: 10.1177/0272989X231165985]
Abstract
BACKGROUND Bayesian methods have potential for efficient design of randomized clinical trials (RCTs) by incorporating existing evidence. Furthermore, value of information (VOI) methods estimate the value of reducing decision uncertainty, aiding transparent research prioritization. These methods require a prior distribution describing current uncertainty in key parameters, such as relative treatment effect (RTE). However, at the time of designing and commissioning research, there may be no data on which to base the prior. The aim of this article is to present methods to construct priors for RTEs based on a collection of previous RCTs. METHODS We developed 2 Bayesian hierarchical models that captured variability in RTE between studies within disease area, accounting for study characteristics. We illustrate the methods using a data set of 743 published RCTs across 9 disease areas to obtain predictive distributions for RTEs for a range of disease areas. We illustrate how the priors from such an analysis can be used in a VOI analysis for an RCT in bladder cancer and compare the results with those using an uninformative prior. RESULTS For most disease areas, the predicted RTE favored new interventions over comparators. The predicted effects and uncertainty differed across the 9 disease areas. VOI analysis showed that the expected value of research is much lower with our empirically derived prior than with an uninformative prior. CONCLUSIONS This study demonstrates a novel approach to generating informative priors that can be used to aid research prioritization and trial design. The methods can also be used to combine RCT evidence with expert opinion. Further work is needed to create a rich database of RCT evidence that can be used to form off-the-shelf priors.
HIGHLIGHTS
- Bayesian methods have potential to aid the efficient design of randomized clinical trials (RCTs) by incorporating existing evidence.
- Value-of-information (VOI) methods can be used to aid research prioritization by calculating the value of reducing current decision uncertainty.
- These methods require a distribution describing current uncertainty in key parameters, that is, "prior distributions."
- This article demonstrates a methodology to estimate prior distributions for relative treatment effects (odds and hazard ratios) from a collection of previous RCTs.
- These results may be combined with expert elicitation to facilitate 1) value-of-information methods to prioritize research or 2) Bayesian methods for research design.
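The paper's central result — that an empirically derived prior yields a much lower expected value of research than an uninformative one — can be reproduced qualitatively with a small simulation. The two priors on the log odds ratio and the mapping from log odds ratio to net benefit below are illustrative assumptions, not the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

def evpi(log_or_draws, scale=0.1):
    # Map the log odds ratio to a hypothetical incremental net benefit in
    # QALYs; a negative log-OR (fewer events on the new treatment) is
    # assumed beneficial. EVPI = E[max(INB, 0)] - max(E[INB], 0).
    inb = -scale * log_or_draws
    return np.mean(np.maximum(inb, 0.0)) - max(np.mean(inb), 0.0)

# Vague (uninformative) prior on the relative treatment effect.
vague = rng.normal(0.0, 2.0, n)

# Empirically derived predictive prior, e.g. summarising previous RCTs in a
# disease area (illustrative values only).
informative = rng.normal(-0.1, 0.4, n)

print(f"EVPI, vague prior:       {evpi(vague):.4f} QALYs")
print(f"EVPI, informative prior: {evpi(informative):.4f} QALYs")
```

Because the informative prior concentrates probability mass, fewer simulated "states of the world" reverse the adoption decision, so the expected value of eliminating the uncertainty shrinks — the mechanism behind the paper's finding.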
Affiliation(s)
- David Glynn
- Centre for Health Economics, University of York, UK
3
Morton RL, Tuffaha H, Blaya-Novakova V, Spencer J, Hawley CM, Peyton P, Higgins A, Marsh J, Taylor WJ, Huckson S, Sillett A, Schneemann K, Balagurunanthan A, Cumpston M, Scuffham PA, Glasziou P, Simes RJ. Approaches to prioritising research for clinical trial networks: a scoping review. Trials 2022; 23:1000. [PMID: 36510214] [PMCID: PMC9743749] [DOI: 10.1186/s13063-022-06928-z]
Abstract
BACKGROUND Prioritisation of clinical trials ensures that the research conducted meets the needs of stakeholders, makes the best use of resources and avoids duplication. The aim of this review was to identify and critically appraise approaches to research prioritisation applicable to clinical trials, to inform best practice guidelines for clinical trial networks and funders. METHODS A scoping review of English-language published literature and research organisation websites (January 2000 to January 2020) was undertaken to identify primary studies, approaches and criteria for research prioritisation. Data were extracted and tabulated, and a narrative synthesis was employed. RESULTS Seventy-eight primary studies and 18 websites were included. The majority of research prioritisation occurred in the oncology and neurology disciplines. The main reasons for prioritisation were to address a knowledge gap (51 of 78 studies [65%]) and to define patient-important topics (28 studies [35%]). In addition, research organisations prioritised to support their institution's mission, invest strategically, and identify the best return on investment. Fifty-seven of 78 studies (73%) used interpretative prioritisation approaches (including Delphi surveys, James Lind Alliance and consensus workshops); six studies (8%) used quantitative approaches such as prospective payback or value of information (VOI) analyses; and 14 studies (18%) used blended approaches such as the nominal group technique and the Child Health and Nutrition Research Initiative method. Main criteria for prioritisation included relevance, appropriateness, significance, feasibility and cost-effectiveness. CONCLUSION Current research prioritisation approaches for groups conducting and funding clinical trials are largely interpretative. There is an opportunity to improve the transparency of prioritisation through the inclusion of quantitative approaches.
Affiliation(s)
- Rachael L. Morton
- National Health and Medical Research Council Clinical Trials Centre (NHMRC CTC), University of Sydney, Sydney, Australia
- Haitham Tuffaha
- Centre for the Business and Economics of Health, University of Queensland, Brisbane, Australia
- Vendula Blaya-Novakova
- National Health and Medical Research Council Clinical Trials Centre (NHMRC CTC), University of Sydney, Sydney, Australia
- Jenean Spencer
- Australian Clinical Trials Alliance (ACTA), Melbourne, Victoria, Australia
- Carmel M. Hawley
- Australasian Kidney Trials Network (AKTN), Faculty of Medicine, University of Queensland, Brisbane, Australia
- Phil Peyton
- Australian and New Zealand College of Anaesthetists (ANZCA), Melbourne, Australia
- Alisa Higgins
- Australian and New Zealand Intensive Care Research Centre (ANZIC-RC), Monash University, Melbourne, Victoria, Australia
- Julie Marsh
- Telethon Kids Institute, West Perth, Australia
- William J. Taylor
- Rehabilitation Teaching and Research Unit, University of Otago, Dunedin, New Zealand
- Sue Huckson
- Australian and New Zealand Intensive Care Society (ANZICS), Camberwell, Victoria, Australia
- Amy Sillett
- AstraZeneca Australia, Macquarie Park, New South Wales, Australia
- Kieran Schneemann
- Australian Clinical Trials Alliance (ACTA), Melbourne, Victoria, Australia; AstraZeneca Australia, Macquarie Park, New South Wales, Australia
- Miranda Cumpston
- Australian Clinical Trials Alliance (ACTA), Melbourne, Victoria, Australia; School of Medicine and Public Health, The University of Newcastle, Newcastle, Australia
- Paul A. Scuffham
- Centre for the Business and Economics of Health, University of Queensland, Brisbane, Australia
- Paul Glasziou
- Faculty of Health Sciences & Medicine, Bond University, Gold Coast, Australia
- Robert J. Simes
- National Health and Medical Research Council Clinical Trials Centre (NHMRC CTC), University of Sydney, Sydney, Australia
4
Grimm SE, Pouwels X, Ramaekers BLT, van Ravesteyn NT, Sankatsing VDV, Grutters J, Joore MA. Implementation Barriers to Value of Information Analysis in Health Technology Decision Making: Results From a Process Evaluation. Value Health 2021; 24:1126-1136. [PMID: 34372978] [DOI: 10.1016/j.jval.2021.03.013]
Abstract
OBJECTIVES Value of information (VOI) analysis can support health technology assessment decision making, but it is a long way from being standard practice. The objective of this study was to understand barriers to the implementation of VOI analysis and propose actions to overcome them. METHODS We performed a process evaluation of the use of VOI analysis within decision making on tomosynthesis versus digital mammography for use in the Dutch breast cancer population screening. Based on steering committee meeting attendance and regular meetings with analysts, we developed a list of barriers to VOI use, which were analyzed using an established diffusion model. We proposed actions to address these barriers. Barriers and actions were discussed and validated in a workshop with stakeholders representing patients, clinicians, regulators, policy advisors, researchers, and industry. RESULTS Consensus was reached on groups of barriers, which included characteristics of VOI analysis itself, stakeholders' attitudes, analysts' and policy makers' skills and knowledge, system readiness, and implementation in the organization. The observed barriers pertained not only to VOI analysis itself but also to formulating the objective of the assessment, economic modeling, and broader aspects of uncertainty assessment. Actions to overcome these barriers related to organizational changes, knowledge transfer, cultural change, and tools. CONCLUSIONS This in-depth analysis of barriers to the implementation of VOI analysis, and the resulting actions and tools, may be useful to health technology assessment organizations that wish to implement VOI analysis in technology assessment and research prioritization. Further research should focus on the application and evaluation of the proposed actions in real-world assessment processes.
Affiliation(s)
- Sabine E Grimm
- Department of Clinical Epidemiology and Medical Technology Assessment, School for Public Health and Primary Care, Maastricht University Medical Centre, Maastricht, The Netherlands.
- Xavier Pouwels
- Department of Health Technology and Services Research, Faculty of Behavioural, Management and Social Sciences, University of Twente, Enschede, The Netherlands
- Bram L T Ramaekers
- Department of Clinical Epidemiology and Medical Technology Assessment, School for Public Health and Primary Care, Maastricht University Medical Centre, Maastricht, The Netherlands
- Valérie D V Sankatsing
- Department of Public Health, Erasmus University Medical Center, Rotterdam, The Netherlands
- Janneke Grutters
- Department for Health Evidence, Radboud University Medical Center, Nijmegen, The Netherlands
- Manuela A Joore
- Department of Clinical Epidemiology and Medical Technology Assessment, School for Public Health and Primary Care, Maastricht University Medical Centre, Maastricht, The Netherlands
5
Kim DD, Guzauskas GF, Bennette CS, Basu A, Veenstra DL, Ramsey SD, Carlson JJ. Influence of Modeling Choices on Value of Information Analysis: An Empirical Analysis from a Real-World Experiment. Pharmacoeconomics 2020; 38:171-179. [PMID: 31631254] [DOI: 10.1007/s40273-019-00848-8]
Abstract
BACKGROUND Value of information (VOI) analysis often requires modeling to characterize and propagate uncertainty. In collaboration with a cancer clinical trial group, we integrated a VOI approach into the assessment of trial proposals. OBJECTIVE This paper aims to explore the impact of modeling choices on VOI results and to share lessons learned from the experience. METHODS After selecting two proposals (A: phase III, breast cancer; B: phase II, pancreatic cancer) for in-depth evaluation, we categorized key modeling choices relevant to trial decision makers (characterizing uncertainty of efficacy, evidence thresholds to change clinical practice, and sample size) and to modelers (cycle length, survival distribution, simulation runs, and other choices). Using a $150,000 per quality-adjusted life-year (QALY) threshold, we calculated the patient-level expected value of sample information (EVSI) for each proposal and examined whether each modeling choice led to a relative change of more than 10% from the averaged base-case estimate. We separately analyzed the impact of the effective time horizon. RESULTS The base-case EVSI was $118,300 for Proposal A and $22,200 for Proposal B per patient. Characterizing uncertainty of efficacy was the most important choice in both proposals (e.g. Proposal A: $118,300 using historical data vs. $348,300 using an expert survey), followed by the sample size and the choice of survival distribution. The assumed effective time horizon also had a substantial impact on the population-level EVSI. CONCLUSIONS Modeling choices can have a substantial impact on VOI results. It is therefore important for groups working to incorporate VOI into research prioritization to adhere to best practices, report and justify their modeling choices clearly, and work closely with the relevant decision makers on those choices.
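For intuition on the patient-level EVSI quantity reported in this abstract, a conjugate normal-normal sketch shows how a proposed sample size maps to the expected value of deciding after, rather than before, a trial. This is not the authors' minimal modeling approach; all parameter values are hypothetical and unrelated to the proposals analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical normal-normal setup: prior on incremental net benefit (QALYs).
mu0, sd0 = 0.05, 0.20   # prior mean and sd
sigma = 1.0             # per-patient outcome sd (assumed known)
n = 200                 # proposed trial sample size (illustrative)

# Preposterior sd of the posterior mean under a conjugate update:
# Var(posterior mean) = sd0^4 / (sd0^2 + sigma^2 / n), centred at mu0.
sd_post_mean = sd0**2 / np.sqrt(sd0**2 + sigma**2 / n)
post_means = rng.normal(mu0, sd_post_mean, 500_000)

# EVSI: expected value of the decision made after the trial (pick the better
# option per simulated posterior mean) minus the value of deciding now.
evsi = np.mean(np.maximum(post_means, 0.0)) - max(mu0, 0.0)
print(f"Per-patient EVSI: {evsi:.4f} QALYs")
```

As n grows, `sd_post_mean` approaches the prior sd and EVSI approaches EVPI from below, which is why sample size is one of the influential choices the paper identifies.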
Affiliation(s)
- David D Kim
- Center for the Evaluation of Value and Risk in Health, Institute for Clinical Research and Health Policy Studies, Tufts Medical Center, 800 Washington St., Box 63, Boston, MA, 02111, USA.
- Anirban Basu
- Department of Pharmacy, University of Washington, Seattle, WA, USA
- David L Veenstra
- Department of Pharmacy, University of Washington, Seattle, WA, USA
- Scott D Ramsey
- Hutchinson Institute for Cancer Outcomes Research, Fred Hutchinson Cancer Research Center, Seattle, WA, USA
- Josh J Carlson
- Department of Pharmacy, University of Washington, Seattle, WA, USA
6
Basu A, Veenstra DL, Carlson JJ, Wang WJ, Branch K, Probstfield J. How can clinical researchers quantify the value of their proposed comparative research? Am Heart J 2019; 209:116-125. [PMID: 30638543] [DOI: 10.1016/j.ahj.2018.12.003]
Affiliation(s)
- Anirban Basu
- The Comparative Health Outcomes, Policy, and Economics (CHOICE) Institute, School of Pharmacy, University of Washington, Seattle, WA; The Departments of Health Services and Economics, University of Washington, Seattle, WA.
- David L Veenstra
- The Comparative Health Outcomes, Policy, and Economics (CHOICE) Institute, School of Pharmacy, University of Washington, Seattle, WA
- Josh J Carlson
- The Comparative Health Outcomes, Policy, and Economics (CHOICE) Institute, School of Pharmacy, University of Washington, Seattle, WA
- Wei-Jhih Wang
- The Comparative Health Outcomes, Policy, and Economics (CHOICE) Institute, School of Pharmacy, University of Washington, Seattle, WA
- Kelley Branch
- Division of Cardiology, Department of Medicine, University of Washington, Seattle, WA
- Jeffrey Probstfield
- Division of Cardiology, Department of Medicine, University of Washington, Seattle, WA
7
Carlson JJ, Kim DD, Guzauskas GF, Bennette CS, Veenstra DL, Basu A, Hendrix N, Hershman DL, Baker L, Ramsey SD. Integrating value of research into NCI Clinical Trials Cooperative Group research review and prioritization: A pilot study. Cancer Med 2018; 7:4251-4260. [PMID: 30030904] [PMCID: PMC6144145] [DOI: 10.1002/cam4.1657]
Abstract
Background The Institute of Medicine has called for approaches to help maximize the return on investment (ROI) in cancer clinical trials. Value of Research (VOR) is a health economics technique that estimates ROI and can inform research prioritization. Our objective was to evaluate the impact of using VOR analyses on the clinical trial proposal review process within the SWOG cancer clinical trials consortium. Methods We used a previously developed minimal modeling approach to calculate VOR estimates for 9 phase II/III SWOG proposals between February 2015 and December 2016. Estimates were presented to executive committee (EC) members (N = 12), who determine which studies are sent to the National Cancer Institute for funding consideration. EC members scored proposals from 1 (best) to 5 based on scientific merit and potential impact before and after receiving VOR estimates. EC members were surveyed to assess research priorities, satisfaction with the proposal evaluation process, and the VOR process. Results VOR estimates ranged from −$2.1B to $16.46B per proposal. Following review of the VOR results, the EC changed its score for eight of nine proposals. Proposal rankings differed between pre- and post-scores (P value: 0.03). Respondents had mixed views of the ultimate utility of VOR for their decisions, with most supporting (42%) or neutral toward (41%) the idea of adding VOR to the evaluation process. Conclusions The findings from this pilot study indicate that the use of VOR analyses may be a useful adjunct to inform proposal reviews within NCI Cooperative Clinical Trials groups.