1
Landis-Lewis Z, Janda AM, Chung H, Galante P, Cao Y, Krumm AE. Precision feedback: A conceptual model. Learn Health Syst 2024; 8:e10419. PMID: 39036537; PMCID: PMC11257058; DOI: 10.1002/lrh2.10419.
Abstract
Introduction When performance data are provided as feedback, healthcare professionals may use them to significantly improve care quality. However, the question of how to provide effective feedback remains unanswered, as decades of evidence have produced a consistent pattern of effects, with wide variation. From a coaching perspective, feedback is often based on a learner's objectives and goals. Furthermore, when coaches provide feedback, it is ideally informed by their understanding of the learner's needs and motivation. We anticipate that a "coaching"-informed approach to feedback may improve its effectiveness in two ways: first, by aligning feedback with healthcare professionals' chosen goals and objectives, and second, by enabling large-scale feedback systems to use new types of data to learn what kind of performance information is motivating in general. Our objective is to propose a conceptual model of precision feedback to support these anticipated enhancements to feedback interventions. Methods We iteratively represented models of feedback's influence from theories of motivation and behavior change, visualization, and human-computer interaction. Through cycles of discussion and reflection, application to clinical examples, and software development, we implemented and refined the models in a software application that generates precision feedback messages from performance data for anesthesia providers. Results We propose that precision feedback is feedback that is prioritized according to its motivational potential for a specific recipient. We identified three factors that influence motivational potential: (1) the motivating information in a recipient's performance data, (2) the surprisingness of the motivating information, and (3) a recipient's preferences for motivating information and its visual display. Conclusions We propose a model of precision feedback that is aligned with leading theories of feedback interventions to support learning about the success of feedback interventions. We plan to evaluate this model in a randomized controlled trial of a precision feedback system that enhances feedback emails to anesthesia providers.
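The article describes the model conceptually rather than computationally. As a rough illustration only, a minimal sketch of how candidate feedback messages might be prioritized by motivational potential, assuming hypothetical message and recipient structures and a simple additive scoring rule (none of which are specified by the authors), could look like this:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class CandidateMessage:
    """A hypothetical feedback message derived from a recipient's performance data."""
    text: str              # e.g., "You reached the top 10% of peers for measure X"
    motivating_info: str   # kind of motivating information, e.g., "achievement" or "performance gap"
    surprisingness: float  # 0-1: how unexpected the information is for this recipient
    display_format: str    # e.g., "bar chart" or "line chart"

@dataclass
class Recipient:
    """A hypothetical recipient profile capturing preferences for information and its display."""
    info_preferences: Dict[str, float] = field(default_factory=dict)     # weight per kind of motivating info
    display_preferences: Dict[str, float] = field(default_factory=dict)  # weight per display format

def motivational_potential(msg: CandidateMessage, recipient: Recipient) -> float:
    """Score one message using the three factors named in the abstract: the motivating
    information, its surprisingness, and the recipient's preferences for content and display."""
    info_weight = recipient.info_preferences.get(msg.motivating_info, 0.0)
    display_weight = recipient.display_preferences.get(msg.display_format, 0.0)
    return info_weight * msg.surprisingness + display_weight

def prioritize(messages: List[CandidateMessage], recipient: Recipient) -> List[CandidateMessage]:
    """Order candidate messages by estimated motivational potential, highest first."""
    return sorted(messages, key=lambda m: motivational_potential(m, recipient), reverse=True)
```

The actual precision feedback system prioritizes messages according to the conceptual model and recipient data; the additive weighting above is purely illustrative.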
Affiliation(s)
- Zach Landis‐Lewis
- Department of Learning Health Sciences, University of Michigan, Ann Arbor, Michigan, USA
- Allison M. Janda
- Department of Anesthesiology, University of Michigan, Ann Arbor, Michigan, USA
- Hana Chung
- School of Information, University of Michigan, Ann Arbor, Michigan, USA
- Patrick Galante
- Department of Learning Health Sciences, University of Michigan, Ann Arbor, Michigan, USA
- Yidan Cao
- Department of Learning Health Sciences, University of Michigan, Ann Arbor, Michigan, USA
- Andrew E. Krumm
- Department of Learning Health Sciences, University of Michigan, Ann Arbor, Michigan, USA
- School of Information, University of Michigan, Ann Arbor, Michigan, USA
- Department of Surgery, University of Michigan, Ann Arbor, Michigan, USA
2
Shuldiner J, Kiran T, Agarwal P, Daneshvarfard M, Eldridge K, Kim S, Greiver M, Jokhio I, Ivers N. Developing an Audit and Feedback Dashboard for Family Physicians: User-Centered Design Process. JMIR Hum Factors 2023; 10:e47718. PMID: 37943586; PMCID: PMC10667970; DOI: 10.2196/47718.
Abstract
BACKGROUND Audit and feedback (A&F), the summary and provision of clinical performance data, is a common quality improvement strategy. Successful design and implementation of A&F, or of any quality improvement strategy, should incorporate evidence-informed best practices as well as context-specific end user input. OBJECTIVE We used A&F theory and user-centered design to inform the development of a web-based primary care A&F dashboard. We describe the design process and how it influenced the design of the dashboard. METHODS Our design process included 3 phases: prototype development based on A&F theory and input from clinical improvement leaders; a workshop with family physician quality improvement leaders to develop personas (ie, fictional users representing archetypes of our key users) and apply those personas to design decisions; and user-centered interviews with family physicians to learn about their reactions to the revised dashboard. RESULTS The team applied A&F best practices to the dashboard prototype. Personas were used to characterize target groups, their challenges, and their behaviors as a tool for informed design decision-making. Our workshop produced 3 user personas, Dr Skeptic, Frazzled Physician, and Eager Implementer, representing common users based on the team's experience with A&F. Interviews were conducted to further validate findings from the persona workshop and found that (1) physicians were interested in how they compared with peers; however, if performance was above average, they were not motivated to improve even if gaps remained relative to other standards of care; (2) burnout levels were high as physicians tried to catch up on care missed during the pandemic and were therefore less motivated to act on the data; and (3) additional desired features included integration within the electronic medical record and more up-to-date and accurate data. CONCLUSIONS We found that carefully incorporating data from user interviews helped operationalize generic best practices for A&F to achieve an acceptable dashboard that could meet the needs and goals of physicians. We demonstrate such a design process in this paper. A&F dashboards should address physicians' data skepticism, present data in a way that spurs action, and support physicians to have the time and capacity to engage in quality improvement work; the steps we followed may help those responsible for quality improvement strategy implementation achieve these aims.
Affiliation(s)
- Tara Kiran
- Department of Family and Community Medicine, University of Toronto, Toronto, ON, Canada
- St Michael's Hospital, Unity Health Toronto, Toronto, ON, Canada
- MAP Centre for Urban Health Solutions, St Michael's Hospital, Toronto, ON, Canada
- Institute of Health Policy, Management and Evaluation, University of Toronto, Toronto, ON, Canada
- Payal Agarwal
- Women's College Hospital, Toronto, ON, Canada
- Department of Family and Community Medicine, University of Toronto, Toronto, ON, Canada
- Maryam Daneshvarfard
- MAP Centre for Urban Health Solutions, St Michael's Hospital, Toronto, ON, Canada
- Kirsten Eldridge
- Department of Family and Community Medicine, University of Toronto, Toronto, ON, Canada
- Susie Kim
- Department of Family and Community Medicine, University of Toronto, Toronto, ON, Canada
- Women's College Academic Family Health Team, Women's College Hospital, Toronto, ON, Canada
- Michelle Greiver
- North York General Hospital Office of Research and Innovation, Toronto, ON, Canada
- Noah Ivers
- Women's College Hospital, Toronto, ON, Canada
- Department of Family and Community Medicine, University of Toronto, Toronto, ON, Canada
- Institute of Health Policy, Management and Evaluation, University of Toronto, Toronto, ON, Canada
3
Kushniruk A, Reis C, Ivers N, Desveaux L. Characterizing the Gaps Between Best-Practice Implementation Strategies and Real-world Implementation: Qualitative Study Among Family Physicians Who Engaged With Audit and Feedback Reports. JMIR Hum Factors 2023; 10:e38736. PMID: 36607715; PMCID: PMC9947922; DOI: 10.2196/38736.
Abstract
BACKGROUND In Ontario, Canada, a government agency known as Ontario Health is responsible for making audit and feedback reports available to all family physicians to encourage ongoing quality improvement. The confidential report provides summary data on 3 key areas of practice: safe prescription, cancer screening, and diabetes management. OBJECTIVE This report was redesigned to improve its usability in line with evidence. The objective of this study was to explore how the redesign was perceived, with an emphasis on recipients' understanding of the report and their engagement with it. METHODS We conducted qualitative semistructured interviews with family physicians who had experience with both versions of the report, recruited through purposeful and snowball sampling. We analyzed the transcripts following an emergent and iterative approach. RESULTS Saturation was reached after 17 family physicians participated. In total, 2 key themes emerged as factors that affected the perceived usability of the report: alignment between the report and the recipients' expectations and capacity to engage in quality improvement. Family physicians expected the report and its quality indicators to reflect best practices and to be valid and accurate. They also expected the report to offer feedback on the clinical activities they perceived to be within their control to change. Furthermore, family physicians expected the goal of the report to be aligned with their perspective on feasible quality improvement activities. Most of these expectations were not met, limiting the perceived usability of the report. The capacity to engage with audit and feedback was hindered by several organizational and physician-level barriers, including the lack of fit with the existing workflow, competing priorities, time constraints, and insufficient skills for bridging the gaps between their data and the corresponding desired actions. CONCLUSIONS Despite recognized improvements in the design of the report to better align with best practices, it was not perceived as highly usable. Improvements in the presentation of the data could not overcome misalignment with family physicians' expectations or the limited capacity to engage with the report. Integrating iterative evaluations informed by user-centered design can complement evidence-based guidance for implementation strategies. Creating a space for bringing together audit and feedback designers and recipients may help improve usability and effectiveness.
Affiliation(s)
- Catherine Reis
- Institute for Health System Solutions and Virtual Care, Women's College Hospital, Toronto, ON, Canada
- Noah Ivers
- Institute for Health System Solutions and Virtual Care, Women's College Hospital, Toronto, ON, Canada
- Institute of Health Policy, Management and Evaluation, University of Toronto, Toronto, ON, Canada
- Department of Family and Community Medicine, University of Toronto, Toronto, ON, Canada
- Laura Desveaux
- Institute for Health System Solutions and Virtual Care, Women's College Hospital, Toronto, ON, Canada
- Institute of Health Policy, Management and Evaluation, University of Toronto, Toronto, ON, Canada
- Institute for Better Health, Trillium Health Partners, Mississauga, ON, Canada
4
de Bekker PJGM, de Weerdt V, Vink MDH, van der Kolk AB, Donker MH, van der Hijden EJE. 'Give me something meaningful': GPs perspectives on how to improve an audit and feedback report provided by health insurers - an exploratory qualitative study. BMJ Open Qual 2022; 11:bmjoq-2022-002006. PMID: 36375859; PMCID: PMC9664288; DOI: 10.1136/bmjoq-2022-002006.
Abstract
BACKGROUND Audit and feedback (A&F) is a valuable quality improvement strategy, which can contribute to de-implementation of low-value care. In the Netherlands, all health insurers collaboratively provide A&F to general practitioners (GPs) in the form of the 'Primary Care Practice Report' (PCPR). Unfortunately, use of this report by GPs is limited. This study examined GPs' thoughts on the usability of the PCPR and their recommendations for improving it. METHOD We used an interpretative qualitative design, with think-aloud tasks to uncover GPs' thoughts on the usability of the PCPR and semistructured interview questions to elicit their recommendations for improving it. Interviews were audiorecorded and transcribed verbatim. Data were analysed using thematic content analysis. RESULTS We identified two main themes: 'poor usability of the PCPR' and 'minimal motivation to change based on the PCPR'. The GPs found the usability of the PCPR poor because the feedback was not clinically meaningful; the data were not recent, individual, or reliable; the performance comparators offered insufficient guidance for assessing clinical performance; the results were not discussed with peers; and the definitions and visuals were unclear. The GPs recommended improving these issues. The GPs' motivation to change based on the PCPR was minimal. CONCLUSIONS The GPs evaluated the PCPR as poorly usable and were minimally motivated to change. The PCPR appears to have been developed from the perspective of the report's commissioners, the health insurers, and does not meet known criteria for effective A&F design and user-centred design. Importantly, the GPs did state that well-designed feedback could contribute to their motivation to improve clinical performance. Furthermore, the GPs stated that they receive a multitude of A&F reports, which they hardly use. We therefore see a need for policy makers to invest in fewer, but more usable, A&F reports.
Affiliation(s)
- P J G M de Bekker
- Department of Health Economics & Talma Institute, Vrije Universiteit Amsterdam, Amsterdam, Netherlands
- Zorgvuldig Advies, Utrecht, Netherlands
- V de Weerdt
- Department of Health Economics & Talma Institute, Vrije Universiteit Amsterdam, Amsterdam, Netherlands
- Amsterdam University Medical Centres, Holendrecht, Netherlands
- M D H Vink
- Department of Health Economics & Talma Institute, Vrije Universiteit Amsterdam, Amsterdam, Netherlands
- Gynaecology, Amsterdam Universitair Medische Centra, Duivendrecht, Netherlands
- A B van der Kolk
- Talma Institute, Vrije Universiteit Amsterdam, Amsterdam, Netherlands
- M H Donker
- Department of Health Sciences, Vrije Universiteit Amsterdam, Amsterdam, Netherlands
- E J E van der Hijden
- Department of Health Economics & Talma Institute, Vrije Universiteit Amsterdam, Amsterdam, Netherlands
- Zilveren Kruis Health Insurance, Zeist, Netherlands
5
de Lusignan S, Liyanage H, Sherlock J, Ferreira F, Munro N, Feher M, Hobbs R. Atrial fibrillation dashboard evaluation using the think aloud protocol. BMJ Health Care Inform 2020; 27:bmjhci-2020-100191. PMID: 33087337; PMCID: PMC7580041; DOI: 10.1136/bmjhci-2020-100191.
Abstract
BACKGROUND Atrial fibrillation (AF) is a common cardiac arrhythmia that is a major risk factor for stroke, transient ischaemic attacks and increased mortality. Primary care management of AF can significantly reduce these risks. We carried out an evaluation to assess the usability of an AF dashboard developed to improve data quality and the quality of care. METHOD We developed an online dashboard about the quality of AF management for general practices of the Oxford Royal College of General Practitioners Research and Surveillance Centre network. The dashboard displays (1) case ascertainment, (2) a calculation of stroke and haemorrhage risk to assess whether the benefits of anticoagulants outweigh their risks, (3) prescriptions of different types of anticoagulant and (4) whether the prescribed anticoagulant is at the correct dose. We conducted a think-aloud evaluation involving 24 dashboard users to improve its usability. RESULTS Analysis of the 24 transcripts produced 120 individual feedback items (ie, verbalised tasks) that were mapped across five usability problem classes. We enhanced the dashboard based on the evaluation feedback to encourage adoption by general practices participating in the sentinel network. CONCLUSIONS The think-aloud evaluation provided useful insights into important usability issues that require further development. Our enhanced AF dashboard was acceptable to clinicians, and its impact on data quality and care should be assessed in a formal study.
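The abstract does not name the risk scores behind item (2). As a hedged sketch, assuming the dashboard's stroke-risk calculation follows the standard CHA2DS2-VASc score widely used in AF management (the patient fields and function names below are illustrative, not the dashboard's actual schema):

```python
from dataclasses import dataclass

@dataclass
class AFPatient:
    """Illustrative patient fields relevant to CHA2DS2-VASc scoring."""
    age: int
    female: bool
    heart_failure: bool
    hypertension: bool
    diabetes: bool
    prior_stroke_or_tia: bool   # includes prior thromboembolism
    vascular_disease: bool      # e.g., prior MI, peripheral artery disease

def cha2ds2_vasc(p: AFPatient) -> int:
    """Compute the CHA2DS2-VASc stroke-risk score (0-9) for a patient with AF."""
    score = 0
    score += 1 if p.heart_failure else 0
    score += 1 if p.hypertension else 0
    score += 2 if p.age >= 75 else (1 if p.age >= 65 else 0)
    score += 1 if p.diabetes else 0
    score += 2 if p.prior_stroke_or_tia else 0
    score += 1 if p.vascular_disease else 0
    score += 1 if p.female else 0
    return score

# Example: a dashboard indicator might flag higher-risk patients (e.g., score >= 2)
# for review of whether anticoagulation benefits outweigh bleeding risk.
example = AFPatient(age=78, female=True, heart_failure=False, hypertension=True,
                    diabetes=False, prior_stroke_or_tia=False, vascular_disease=False)
print(cha2ds2_vasc(example))  # -> 4
```

A corresponding haemorrhage-risk score (such as HAS-BLED) would typically be shown alongside it, though the published evaluation focuses on usability rather than the scoring details.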
Affiliation(s)
- Simon de Lusignan
- Nuffield Department of Primary Care Health Sciences, University of Oxford, Oxford, UK
- Royal College of General Practitioners Research and Surveillance Centre, Royal College of General Practitioners, London, UK
- Harshana Liyanage
- Nuffield Department of Primary Care Health Sciences, University of Oxford, Oxford, UK
- Julian Sherlock
- Nuffield Department of Primary Care Health Sciences, University of Oxford, Oxford, UK
- Filipa Ferreira
- Nuffield Department of Primary Care Health Sciences, University of Oxford, Oxford, UK
- Neil Munro
- Nuffield Department of Primary Care Health Sciences, University of Oxford, Oxford, UK
- Michael Feher
- Nuffield Department of Primary Care Health Sciences, University of Oxford, Oxford, UK
- Richard Hobbs
- Nuffield Department of Primary Care Health Sciences, University of Oxford, Oxford, UK
6
Landis-Lewis Z, Kononowech J, Scott WJ, Hogikyan RV, Carpenter JG, Periyakoil VS, Miller SC, Levy C, Ersek M, Sales A. Designing clinical practice feedback reports: three steps illustrated in Veterans Health Affairs long-term care facilities and programs. Implement Sci 2020; 15:7. PMID: 31964414; PMCID: PMC6975062; DOI: 10.1186/s13012-019-0950-y.
Abstract
Background User-centered design (UCD) methods are well-established techniques for creating useful artifacts, but few studies illustrate their application to clinical feedback reports. When feedback reports are used as an implementation strategy, their content depends on a foundational audit process involving performance measures and data, but these important relationships have not been adequately described. Better guidance on UCD methods for designing feedback reports is needed. Our objective is to describe the feedback report design method for refining the content of prototype reports. Methods We propose a three-step feedback report design method (refinement of measures, data, and display). The three steps follow dependencies such that refinement of measures can require changes to data, which in turn may require changes to the display. We believe this method can be used effectively with a broad range of UCD techniques. Results We illustrate the three-step method as used in the implementation of goals-of-care conversations in long-term care settings in the U.S. Veterans Health Administration. Using iterative usability testing, the feedback report content evolved over cycles of the three steps. Following the steps of the proposed method through 12 iterations with 13 participants, we improved the usability of the feedback reports. Conclusions UCD methods can improve feedback report content through an iterative process. When designing feedback reports, refining measures, data, and display may enable report designers to improve the user-centeredness of feedback reports.
- Zach Landis-Lewis
- Department of Learning Health Sciences, University of Michigan Medical School, 1161 J NIB, 300 N. Ingalls Street, SPC 5403, Ann Arbor, Michigan, 48109-5403, USA.
- Robert V Hogikyan
- VA Ann Arbor Healthcare System, Ann Arbor, Michigan, USA
- Department of Internal Medicine, University of Michigan Medical School, Ann Arbor, Michigan, USA
- Joan G Carpenter
- Corporal Michael J. Crescenz VAMC, Philadelphia, Pennsylvania, USA
- School of Nursing, University of Pennsylvania, Philadelphia, Pennsylvania, USA
- V S Periyakoil
- VA Palo Alto Health Care System, Palo Alto, California, USA
- School of Medicine, Stanford University, Palo Alto, California, USA
- Susan C Miller
- Brown University School of Public Health, Providence, Rhode Island, USA
- Cari Levy
- Eastern Colorado Health Care System, Aurora, Colorado, USA
- School of Medicine, University of Colorado Anschutz Campus, Aurora, Colorado, USA
- Mary Ersek
- Corporal Michael J. Crescenz VAMC, Philadelphia, Pennsylvania, USA
- School of Nursing, University of Pennsylvania, Philadelphia, Pennsylvania, USA
- Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania, USA
- Anne Sales
- Department of Learning Health Sciences, University of Michigan Medical School, 1161 J NIB, 300 N. Ingalls Street, SPC 5403, Ann Arbor, Michigan, 48109-5403, USA
- VA Ann Arbor Healthcare System, Ann Arbor, Michigan, USA
7
Wagner DJ, Durbin J, Barnsley J, Ivers NM. Measurement without management: qualitative evaluation of a voluntary audit & feedback intervention for primary care teams. BMC Health Serv Res 2019; 19:419. PMID: 31234916; PMCID: PMC6591867; DOI: 10.1186/s12913-019-4226-7.
Abstract
Background The use of clinical performance feedback to support quality improvement (QI) activities is based on the sound rationale that measurement is necessary to improve quality of care. However, concerns persist about the reliability of this strategy, known as Audit and Feedback (A&F), to support QI. If successfully implemented, A&F should reflect an iterative, self-regulating QI process. Whether and how real-world A&F initiatives result in this type of feedback loop is scarcely reported. This study aimed to identify barriers or facilitators to implementation in a team-based primary care context. Methods Semi-structured interviews were conducted with key informants from team-based primary care practices in Ontario, Canada. At the time of data collection, practices could have received up to three iterations of the voluntary A&F initiative. Interviews explored whether, how, and why practices used the feedback to guide their QI activities. The Consolidated Framework for Implementation Research was used to code transcripts, and the resulting frameworks were analyzed inductively to generate key themes. Results Twenty-five individuals representing 18 primary care teams participated in the study. Analysis of how the A&F intervention was used revealed that implementation reflected an incomplete feedback loop. Participation was facilitated by reliance on an external resource to conduct the practice audit. The frequency of feedback, concerns about data validity, the design of the feedback report, the resource requirements to participate, and the team relationship were all identified as barriers to implementation of A&F. Conclusions The implementation of a real-world, voluntary A&F initiative did not lead to desired QI activities despite substantial investments in performance measurement. In small primary care teams, it may take long periods of time to develop capacity for QI, and future evaluations may reveal shifts in the implementation state of the initiative. Findings from the present study demonstrate that the potential mechanism of action of A&F may be deceptively clear; in practice, moving from measurement to action can be complex.
Affiliation(s)
- Daniel J Wagner
- Department of Community Health Sciences, Cumming School of Medicine, University of Calgary, 3280 Hospital Drive NW, Calgary, Alberta, T2N 4Z6, Canada.
- Janet Durbin
- Centre for Addiction and Mental Health, 33 Russell Street, Toronto, Ontario, M5S 2S1, Canada
- Department of Psychiatry, University of Toronto, Toronto, Ontario, Canada
- Jan Barnsley
- Institute of Health Policy, Management and Evaluation, University of Toronto, Suite 425, 155 College Street, Toronto, Ontario, M5T 3M6, Canada
- Noah M Ivers
- Institute of Health Policy, Management and Evaluation, University of Toronto, Suite 425, 155 College Street, Toronto, Ontario, M5T 3M6, Canada
- Department of Family and Community Medicine, University of Toronto, Toronto, Ontario, Canada
- Family Practice Health Centre, Institute for Health Systems Solutions and Women's College Hospital Research Institute, Women's College Hospital, 76 Grenville Street, Toronto, Ontario, M5S 1B2, Canada