1
Harrison MI, Borsky AE. Funding Learning Health System Research: Challenges and Strategies. Acad Med 2024;99:673-682. [PMID: 38363814] [DOI: 10.1097/acm.0000000000005661]
Abstract
PURPOSE: A growing number of health systems are establishing learning health system (LHS) programs, in which research focuses on rapidly improving the health system's internal operations and performance. The authors examine funding challenges facing such initiatives and identify strategies for managing tensions between reliance on external research funding and directly contributing to improvement and learning within the researchers' own system.
METHOD: Qualitative case studies of LHS research programs in 5 health systems were performed via 38 semistructured interviews (October 2019-April 2021) with 35 diverse respondents. Inductive and deductive rapid qualitative analysis supported interview, system-level, and cross-system summaries and analyses.
RESULTS: External funding awards to LHS researchers facilitated some internal improvement and learning, scientific advancements, and the reputation of researchers and their systems, but reliance on external funding also limited researchers' responsiveness to the concerns of system leaders, managers, and practitioners and to system needs. Gaps between external funding requirements and internally focused projects arose in objectives, practical applicability, audiences, timetables, routines, skill sets, and researchers' careers. To contribute more directly to system improvement, LHS researchers needed to collaborate with clinicians and other nonresearchers and to pivot between long research studies and shorter, dynamic improvement, evaluation, and data analysis projects. With support from system executives, LHS program leaders employed several strategies to enhance researchers' internal contributions: they aligned funded-research topics with long-term system needs, obtained internal funding for implementing and sustaining practice change, and diversified funding sources.
CONCLUSIONS: To foster LHS research contributions to internal system learning and improvement, LHS program leaders need to manage tensions between concentrating on externally funded research and fulfilling their mission of providing research-based services to their own system. Health system executives can support LHS programs by setting clear goals for them; appropriately staffing, budgeting, and incentivizing LHS researchers; and developing supportive, system-wide teamwork, skill development programs, and data infrastructures.
2
Wood B, Fitzgerald M, Kendall C, Cameron E. Integrating socially accountable health professional education and learning health systems to transform community health and health systems. Learn Health Syst 2021;5:e10277. [PMID: 34277943] [PMCID: PMC8278438] [DOI: 10.1002/lrh2.10277]
Abstract
A learning health system aims to create value in health systems using data-driven innovations, quality improvement techniques, and collaborations between health system partners. Although the concept is mobilized through cycles of learning, most instantiations of the learning health system overlook the importance of formalized learning in educational settings. Social accountability in health professional education focuses on measurably improving people's health and health care, specifically through education and training activities. In this commentary, we argue that the idea of social accountability clearly articulates a rationale and a broad range of aspirations, whereas the learning health system offers an approach to achieve these goals. With a similar aim to a learning health system, social accountability promotes partnerships between health professional education, the health system, and communities in a way that allows for co-designed and contextualized interventions. On the other hand, learning health systems prioritize data, research, and analytic capacities to facilitate quality improvement. An integrative framework could enhance learning cycles by collectively designing interventions and innovations with people and communities from health, research, and education systems. As well as aspiring to improve population health and health equity, such a framework will consider broader impacts, including the degree of participation amongst a range of partners and the level of responsiveness to partners' priorities.
Affiliation(s)
- Brianne Wood
- Medical Education Research Lab in the North, Northern Ontario School of Medicine, Thunder Bay, Ontario, Canada
- Michael Fitzgerald
- C.T. Lamont Primary Health Care Research Centre, Bruyère Research Institute, Ottawa, Ontario, Canada
- Claire Kendall
- C.T. Lamont Primary Health Care Research Centre, Bruyère Research Institute, Ottawa, Ontario, Canada
- Office of Social Accountability, Faculty of Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Institut du Savoir Montfort, l'Hôpital Montfort, Ottawa, Ontario, Canada
- Clinical Epidemiology Research Program, Ottawa Hospital Research Institute, Ottawa, Ontario, Canada
- Li Ka Shing Knowledge Institute, St. Michael's Hospital, Toronto, Ontario, Canada
- Erin Cameron
- Medical Education Research Lab in the North, Northern Ontario School of Medicine, Thunder Bay, Ontario, Canada
3
Damschroder LJ, Knighton AJ, Griese E, Greene SM, Lozano P, Kilbourne AM, Buist DSM, Crotty K, Elwy AR, Fleisher LA, Gonzales R, Huebschmann AG, Limper HM, Ramalingam NS, Wilemon K, Ho PM, Helfrich CD. Recommendations for strengthening the role of embedded researchers to accelerate implementation in health systems: Findings from a state-of-the-art (SOTA) conference workgroup. Healthc (Amst) 2021;8 Suppl 1:100455. [PMID: 34175093] [DOI: 10.1016/j.hjdsi.2020.100455]
Abstract
BACKGROUND: Traditional research approaches do not promote timely implementation of evidence-based innovations (EBIs) to benefit patients. Embedding research within health systems can accelerate EBI implementation by blending rigorous methods with practical considerations in real-world settings. A state-of-the-art (SOTA) conference was convened in February 2019 with five workgroups that addressed five facets of embedded research and its potential to impact healthcare. This article reports results from the workgroup focused on how embedded research programs can be implemented in health systems for greatest impact.
METHODS: Based on a pre-conference survey, participants indicating interest in accelerating implementation were invited to join the SOTA workgroup. Workgroup participants (N = 26) developed recommendations using consensus-building methods. Ideas were grouped into thematic clusters and voted on to identify top recommendations. A summary was presented to the full SOTA membership. Following the conference, the workgroup facilitators (LJD, CDH, NR) summarized the workgroup findings, member-checked them with workgroup participants, and used them to develop the recommendations reported here.
RESULTS: The workgroup developed 12 recommendations to optimize the impact of embedded researchers within health systems. The group highlighted the tension between "ROI vs. R01" goals: health systems focus on achieving a return on their investments (ROI), while embedded researchers focus on obtaining research funding (R01). Recommendations are targeted to three key stakeholder groups: researchers, funders, and health systems. The group reached consensus that an ideal foundation for embedded research (1) maximizes learning; (2) aligns goals across all three stakeholder groups; and (3) implements EBIs in a consistent and timely fashion.
CONCLUSIONS: Four cases illustrate a variety of ways that embedded research can be structured and conducted within systems, demonstrating key embedded-research values that enable collaborations with academic affiliates to generate actionable knowledge and meaningfully accelerate implementation of EBIs to benefit patients.
IMPLICATIONS: Embedded research approaches have the potential to transform health systems and improve patient health. Accelerating embedded research should be a focused priority for funding agencies to maximize a collective return on investment.
Affiliation(s)
- Laura J Damschroder
- VA Center for Clinical Management Research, VA Ann Arbor Healthcare System, 2800 Plymouth Rd. Building 16, Floor 3, (152), Ann Arbor, MI, 48105, USA.
- Andrew J Knighton
- Healthcare Delivery Institute, Intermountain Healthcare, 5026 South State Street, 3rd Floor, Murray, UT, 84107, USA.
- Emily Griese
- Sanford Research, Sanford Health, 2301 E 60th Street N, Sioux Falls, SD, 57106, USA.
- Sarah M Greene
- Health Care Systems Research Network, 1249 NE 89th Street, Seattle, WA, 98115, USA.
- Paula Lozano
- Kaiser Permanente Washington Health Research Institute, 1730 Minor Avenue, Suite 1600, Seattle, WA, 98101, USA.
- Amy M Kilbourne
- Quality Enhancement Research Initiative (QUERI), U.S. Dept of Veterans Affairs, 810 N Vermont Avenue (10X2), Washington, DC, 20420, USA; Learning Health Science, University of Michigan Medical School, North Campus Research Complex, 2800 Plymouth Road, Bldg 16, Ann Arbor, MI, 48198, USA.
- Diana S M Buist
- Kaiser Permanente Washington Health Research Institute, 1730 Minor Avenue, Suite 1600, Seattle, WA, 98101, USA.
- Karen Crotty
- RTI International, 3040 E. Cornwallis Road, Hobbs 139, P.O. Box 12194, Durham, NC, 27709, USA.
- A Rani Elwy
- VA Center for Healthcare Organization and Implementation Research, Edith Nourse Rogers Memorial Veterans Hospital, 200 Springs Road (152), Bedford, MA, 01730, USA; Department of Psychiatry and Human Behavior, Alpert Medical School, Brown University, Box G-BH, Providence, RI, 02912, USA.
- Lee A Fleisher
- Department of Anesthesiology and Critical Care, Leonard Davis Institute of Health Economics, University of Pennsylvania, 3400 Spruce Street, Dulles 680, Philadelphia, PA, 19104, USA.
- Ralph Gonzales
- Division of General Internal Medicine, Department of Medicine, UCSF, 350 Parnassus Avenue, Box 0361, San Francisco, CA, 94117-0361, USA.
- Amy G Huebschmann
- University of Colorado (CU) School of Medicine, Department of Medicine, Division of General Internal Medicine, 12631 E. 17th Ave., Mailstop B180, Aurora, CO, 80045, USA.
- Heather M Limper
- Vanderbilt University Medical Center, 2525 West End Ave, Nashville, TN, 37203, USA.
- NithyaPriya S Ramalingam
- Department of Family Medicine, Oregon Health & Science University, 3181 Sam Jackson Park Rd, Portland, OR, 97239, USA.
- Katherine Wilemon
- 680 East Colorado Boulevard, Suite #180, Pasadena, CA 91101-6144, USA.
- P Michael Ho
- Cardiology Section, Rocky Mountain Regional VA Medical Center, 1700 N. Wheeling St, Aurora, CO 80045, USA.
- Christian D Helfrich
- Seattle-Denver Center of Innovation for Veteran-Centered Value-Driven Care, 1660 South Columbian Way, S-152, Seattle, WA, 98108, USA.
4
Abstract
The third paper in a series on how learning health systems can use routinely collected electronic health data (EHD) to advance knowledge and support continuous learning, this review describes how analytical methods for individual-level EHD, including regression approaches, interrupted time series (ITS) analyses, instrumental variables, and propensity score methods, can also be used to address the question of whether the intervention “works.” The two major potential sources of bias in non-experimental studies of health care interventions are that the treatment groups compared do not have the same probability of treatment or exposure, and that unmeasured covariates may confound the results. Although very different, the approaches presented in this review are all based on assumptions about data, causal relationships, and biases. For instance, regression approaches assume that the relationship among the treatment, the outcome, and other variables is properly specified; that all of the variables are available for analysis (i.e., there are no unobserved confounders) and measured without error; and that the error term is independent and identically distributed. The instrumental variables approach requires identifying an instrument that is related to the assignment of treatment but otherwise has no direct effect on the outcome. Propensity score approaches, on the other hand, assume that there are no unobserved confounders. The epidemiological designs discussed also make assumptions, for instance that individuals can serve as their own controls. To properly address these assumptions, analysts should conduct sensitivity analyses within the assumptions of each method to assess the potential impact of what cannot be observed. Researchers also should analyze the same data with different analytical approaches that make alternative assumptions, and apply the same methods to different data sets. Finally, different analytical methods, each subject to different biases, should be used in combination and together with different designs to limit the potential for bias in the final results.
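As a concrete illustration of one of the methods surveyed above, the sketch below estimates a treatment effect by inverse-probability weighting on an estimated propensity score. It is a minimal sketch under the abstract's stated assumption of no unobserved confounders; the DataFrame layout and column names (treated, outcome, and the confounder list) are hypothetical, not drawn from the paper.

```python
# Minimal propensity-score (inverse probability weighting) sketch.
# Assumes a pandas DataFrame with a binary "treated" column, a numeric
# "outcome" column, and measured confounders; names are illustrative.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

CONFOUNDERS = ["age", "comorbidity_score", "prior_visits"]  # assumed names

def ipw_effect(df: pd.DataFrame) -> float:
    """Estimate the average treatment effect via IPW (Hajek estimator)."""
    X = df[CONFOUNDERS].to_numpy()
    t = df["treated"].to_numpy()
    y = df["outcome"].to_numpy()
    # Step 1: model the probability of treatment given measured confounders.
    ps = LogisticRegression(max_iter=1000).fit(X, t).predict_proba(X)[:, 1]
    ps = np.clip(ps, 0.01, 0.99)  # trim extreme scores to stabilize weights
    # Step 2: weight each subject by the inverse probability of the
    # treatment actually received; valid only under the no-unobserved-
    # confounders assumption the abstract highlights.
    w = t / ps + (1 - t) / (1 - ps)
    treated_mean = np.average(y[t == 1], weights=w[t == 1])
    control_mean = np.average(y[t == 0], weights=w[t == 0])
    return treated_mean - control_mean
```

A sensitivity analysis of the kind the review recommends would rerun this estimate while varying the assumed strength of an unmeasured confounder, and compare the result against a regression or instrumental-variable analysis of the same data.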
5
Abstract
Learning health systems use routinely collected electronic health data (EHD) to advance knowledge and support continuous learning. Even without randomization, observational studies can play a central role as the nation’s health care system embraces comparative effectiveness research and patient-centered outcomes research. However, neither the breadth, timeliness, and volume of the available information nor sophisticated analytics allow analysts to confidently infer causal relationships from observational data. Nevertheless, depending on the research question, careful study design and appropriate analytical methods can improve the utility of EHD. The introduction to a series of four papers, this review begins with a discussion of the kinds of research questions that EHD can help address, noting how different evidence and assumptions are needed for each. We argue that when the question involves describing the current (and likely future) state of affairs, causal inference is not relevant, so randomized clinical trials (RCTs) are not necessary. When the question is whether an intervention improves outcomes of interest, causal inference is critical, but appropriately designed and analyzed observational studies can yield valid results that better balance internal and external validity than typical RCTs. When the question is one of translation and spread of innovations, a different set of questions comes into play: How and why does the intervention work? How can a model be amended or adapted to work in new settings? In these “delivery system science” settings, causal inference is not the main issue, so a range of quantitative, qualitative, and mixed research designs are needed. We then describe why RCTs are regarded as the gold standard for assessing cause and effect, how alternative approaches relying on observational data can be used to the same end, and how observational studies of EHD can effectively complement RCTs. We also describe how RCTs can serve as a model for designing rigorous observational studies, building an evidence base through iterative studies that build upon one another (i.e., confirmation across multiple investigations).
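The distinction the authors draw between descriptive and causal questions is conventionally stated in potential-outcomes notation; the following is the standard textbook formalization, not notation from the paper itself.

```latex
% Potential outcomes: Y_i(1), Y_i(0) are unit i's outcomes with and
% without the intervention; only one of the two is ever observed.
\[
  \text{ATE} \;=\; \mathbb{E}\bigl[\,Y_i(1) - Y_i(0)\,\bigr]
\]
% Randomization guarantees the independence that observational studies
% must instead assume, e.g., ignorability given measured covariates X_i:
\[
  \bigl(Y_i(1),\, Y_i(0)\bigr) \;\perp\!\!\!\perp\; T_i \;\big|\; X_i
\]
```

Descriptive questions need only the observed outcomes, which is why the review argues RCTs are unnecessary there; causal questions require estimating the unobserved half of each pair, which is where design and analysis assumptions carry the load.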
6
Abstract
The second paper in a series on how learning health systems can use routinely collected electronic health data (EHD) to advance knowledge and support continuous learning, this review summarizes study design approaches, including choosing appropriate data sources, and methods for the design and analysis of natural and quasi-experiments. The primary strength of the study design approaches described here is that they examine the impact of a deliberate intervention in real-world settings, which is critical for external validity. These evaluation designs address estimating the counterfactual: what would have happened if the intervention had not been implemented. At the individual level, epidemiologic designs focus on identifying situations in which bias is minimized. Natural and quasi-experiments focus on situations where the change in assignment breaks the usual links that could lead to confounding, reverse causation, and so forth. Because these observational studies typically use data gathered for patient management or administrative purposes, the possibility of observation bias is minimized. The disadvantages are that one cannot necessarily attribute the effect to the intervention (as opposed to other things that might have changed at the same time), and the results do not indicate what about the intervention made a difference. Because they cannot rely on randomization to establish causality, program evaluation methods demand more careful consideration of the "theory" of the intervention and how it is expected to play out. A logic model describing this theory can help to design appropriate comparisons, account for all influential variables in a model, and ensure that evaluation studies focus on the critical intermediate and long-term outcomes as well as possible confounders.
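To make the counterfactual logic concrete, the sketch below fits a segmented regression for an interrupted time series, one of the quasi-experimental designs summarized above. It is a minimal sketch assuming a single known interruption point and an evenly spaced outcome series; the variable names and lag choice are illustrative, not taken from the paper.

```python
# Minimal interrupted time series (segmented regression) sketch.
import numpy as np
import statsmodels.api as sm

def fit_its(y: np.ndarray, intervention_idx: int):
    """Fit pre-trend, level-change, and slope-change terms around the
    interruption; the pre-intervention trend serves as the counterfactual."""
    time = np.arange(len(y))                           # secular trend
    post = (time >= intervention_idx).astype(float)    # level change
    time_after = post * (time - intervention_idx)      # slope change
    X = sm.add_constant(np.column_stack([time, post, time_after]))
    # Newey-West (HAC) standard errors allow for autocorrelated residuals,
    # a common feature of routinely collected health data.
    return sm.OLS(y, X).fit(cov_type="HAC", cov_kwds={"maxlags": 3})

# In the fitted model: const = baseline level, x1 = pre-intervention
# trend, x2 = immediate level shift, x3 = change in trend afterward.
```

The level-shift and slope-change coefficients quantify departures from the projected pre-intervention trend, but, as the abstract cautions, they cannot by themselves attribute the change to the intervention rather than to something else that changed at the same time.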