1. Donnelly LF, Guimaraes CV. Event-Based Learning and Improvement: Radiology's Move From Peer Review to Peer Learning. Semin Ultrasound CT MR 2024;45:161-169. PMID: 38373672; DOI: 10.1053/j.sult.2024.02.005.
Abstract
Over the past 15 years, the radiology community has made great progress moving from a system of score-based peer review to one of peer learning. Much has been learned along the way. In peer learning, cases in which learning opportunities are identified are reviewed solely for the purpose of fostering learning and improvement. This article defines peer learning and peer review and emphasizes the difference; looks back at the 20-year history of score-based peer review and transition to peer learning; outlines the problems with score-based peer review and the key elements of peer learning; discusses the current state of peer learning; and outlines future challenges and opportunities.
Affiliation(s)
- Lane F Donnelly
- Department of Radiology, University of North Carolina School of Medicine, Chapel Hill, NC; Department of Pediatrics, University of North Carolina School of Medicine, Chapel Hill, NC.
- Carolina V Guimaraes
- Department of Radiology, University of North Carolina School of Medicine, Chapel Hill, NC
2. Kwak DH, Yang L, Hu-Wang E, Seetharam S, Nijhawan K, Chung JH, Patel P. Peer learning is both preferable and less expensive than score-based peer review: Initial experience at a tertiary academic center. Clin Imaging 2024;106:110065. PMID: 38113549; DOI: 10.1016/j.clinimag.2023.110065.
Abstract
PURPOSE To examine radiologist experiences and perceptions during a transition from score-based peer review to a peer learning program, and to assess differences in time-cost efficiency between the two models of quality improvement. METHODS Differences in Likert scale survey responses from radiologists (N = 27) in a multispecialty group at a single tertiary academic center before and following the intervention were evaluated by Mann-Whitney U test. Multiple variable linear regression analysis assessed independent variables and program preference. RESULTS All positive impacts were rated significantly higher for the peer learning program, and workflow disruption was rated significantly lower. 70.4% (19 of 27) preferred the new program, and 25.9% (7 of 27) preferred the old program. Only the "worth investment" questionnaire score demonstrated a significant correlation with program preference, with the greatest effect among all variables (Beta = 1.11, p = 0.02). Significantly less time per month was used to complete peer learning exercises (0.76 ± 0.45 h, N = 27) versus peer review exercises (1.71 ± 1.84 h, N = 34, p = 0.011). The result was a difference of 0.95 ± 1.89 h/month (11.4 ± 22.7 h/year), translating to an estimated direct salary time-cost saving of $1653.68/year/radiologist and a direct productivity time-cost saving of $3469.39/year/radiologist when utilizing the peer learning program. CONCLUSIONS There was a strongly positive perception of the new peer learning program, and a substantial implied direct time-cost saving from the transition. PRECIS The peer learning model emphasizes learning from errors via feedback in a non-punitive environment. This model was positively perceived and demonstrated substantial implied direct time-cost saving.
Affiliation(s)
- Daniel H Kwak
- Department of Radiology, The University of Chicago Medical Center, 5841 S. Maryland Ave, Chicago, IL 60637, United States of America.
- Lindsay Yang
- Department of Radiology, The University of Chicago Medical Center, 5841 S. Maryland Ave, Chicago, IL 60637, United States of America
- Eileen Hu-Wang
- Department of Radiology, The University of Chicago Medical Center, 5841 S. Maryland Ave, Chicago, IL 60637, United States of America
- Sachin Seetharam
- Department of Radiology, The University of Chicago Medical Center, 5841 S. Maryland Ave, Chicago, IL 60637, United States of America
- Karan Nijhawan
- Department of Radiology, The University of Chicago Medical Center, 5841 S. Maryland Ave, Chicago, IL 60637, United States of America
- Jonathan H Chung
- Department of Radiology, The University of Chicago Medical Center, 5841 S. Maryland Ave, Chicago, IL 60637, United States of America
- Pritesh Patel
- Department of Radiology, Medical College of Wisconsin, 9200 W. Wisconsin Ave, Milwaukee, WI 53226, United States of America
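The annual savings reported in this abstract follow directly from the measured time difference. A minimal Python sketch of that arithmetic; the hourly rates are back-calculated from the abstract's dollar figures and are illustrative assumptions, not values stated by the study:

```python
# Time-cost arithmetic for the peer review -> peer learning transition (reference 2).
# Monthly hours spent on each program, as reported in the abstract.
peer_review_h_per_month = 1.71
peer_learning_h_per_month = 0.76

monthly_saving_h = peer_review_h_per_month - peer_learning_h_per_month  # 0.95 h/month
annual_saving_h = monthly_saving_h * 12                                 # 11.4 h/year

# Hourly rates are back-calculated from the abstract's dollar figures and are
# hypothetical; the study's underlying salary data are not given in the abstract.
salary_rate_usd_per_h = 145.06
productivity_rate_usd_per_h = 304.33

print(f"Annual hours saved per radiologist: {annual_saving_h:.1f} h")
print(f"Direct salary saving:       ${annual_saving_h * salary_rate_usd_per_h:,.2f}/year")   # ~ $1,653.68
print(f"Direct productivity saving: ${annual_saving_h * productivity_rate_usd_per_h:,.2f}/year")  # ~ $3,469.4
```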
3. Siewert B, Brook OR, Kruskal JB. Peer learning in abdominal radiology: iterative process improvements over a 20-year experience. Abdom Radiol (NY) 2024;49:662-677. PMID: 38093102; DOI: 10.1007/s00261-023-04118-2.
Abstract
PURPOSE After a slow and challenging transition period, peer learning and improvement (PLI) is now being more widely adopted by practices as an option for continuous personal and practice performance improvement. In addition to gaps in the understanding of what PLI is and how it should be practiced, wide variation exists in how the process is implemented and administered, how outcomes are measured, and what strategies are employed to engage radiologists. This report aims to describe lessons learned from our 20-year experience with the design, implementation, and continuous improvement of a PLI program in a large academic practice. METHODS Since initial implementation in 2004, an oversight team prospectively documented iterative process improvements and data submission trends in our PLI process. Process data included strategies for engaging radiologists in the PLI process (fostering case submission, PLI meeting participation), steps for achieving regulatory compliance, and template content for facilitating the value and impact of PLI meetings (case analysis, review of contributing factors, identification of improvement opportunities). RESULTS Submission trends, submitted case content, and improvement opportunities varied by clinical section. Process improvements that fostered engagement included closing the loop with participants, expanding criteria for case submission beyond interpretive disagreements (e.g., great pickups, near misses), minimizing impacts to workflow, and using evidence-based templates for case and contributor categorization, bias analysis, and identification of improvement opportunities. CONCLUSION Implementing an effective PLI program requires sustained communication, education, and continuous process improvement. While PLI can certainly lead to process and individual performance improvement, the program requires trained champions, designated time, effort, resources, education, and patience to be effectively implemented.
Affiliation(s)
- Bettina Siewert
- Department of Radiology, Beth Israel Deaconess Medical Center, 1 Deaconess Rd, Boston, MA, 02215, USA
- Olga R Brook
- Department of Radiology, Beth Israel Deaconess Medical Center, 1 Deaconess Rd, Boston, MA, 02215, USA
- Jonathan B Kruskal
- Department of Radiology, Beth Israel Deaconess Medical Center, 1 Deaconess Rd, Boston, MA, 02215, USA.
4. Panagides JC, Hancel K, Kalva S, Schenker M, Saini S, Glazer DI, Khorasani R, Daye D. Interventional Radiology Peer Learning Platform and Adverse Event Reporting (IR-PEER): Initial Experience Implementing a Team-based Novel Peer Learning System in Interventional Radiology. J Am Coll Radiol 2024;21:93-102. PMID: 37659453; DOI: 10.1016/j.jacr.2023.07.022.
Abstract
Although the transition from peer review to peer learning has had favorable outcomes in diagnostic radiology, experience with implementing a team-based peer learning system in interventional radiology (IR) remains limited. Peer learning systems benefit diverse IR teams composed of multiple clinical roles and could add value by archiving events with educational potential. With input from multiple stakeholders across clinical roles within the IR division at our institution (ie, radiologic technologists, nurses, advanced practice providers, residents, fellows, and attending physicians), we launched a HIPAA-compliant, secure IR complication and learning opportunity reporting platform in April 2022. Case submissions were monitored over the subsequent 24 weeks, with monthly dashboard reports provided to departmental leadership. Preintervention and postintervention surveys were used to assess the impact of the peer learning platform and adverse event reporting in IR (IR-PEER) on perceptions of complication reporting in the IR division across clinical roles. Ninety-two peer learning submissions were collected, a weekly average (± standard error) of 3.8 ± 0.6 submissions, and an additional 26 submissions were collected as part of the division's ongoing monthly complication review conference, for a total of 98 unique case references. A total of 64.1% of submissions (59 of 92) involved a complication and/or adverse event, and 35.9% of submissions (33 of 92) identified a learning opportunity (no complication or adverse event). Nurses reported that IR-PEER made the complication-reporting process easier (P = .01), and all clinical roles reported that IR-PEER improved the overall process of complication reporting. Peer learning frameworks such as IR-PEER provide a more equitable communication platform for multidisciplinary teams to capture and archive learning opportunities that support quality and safety improvement efforts.
Affiliation(s)
- Kayesha Hancel
- Department of Interventional Radiology, Massachusetts General Hospital, Boston, Massachusetts
- Sanjeeva Kalva
- Department of Interventional Radiology, Massachusetts General Hospital, Boston, Massachusetts
- Matthew Schenker
- Department of Radiology, Brigham and Women's Hospital, Boston, Massachusetts
- Sanjay Saini
- Department of Radiology, Brigham and Women's Hospital, Boston, Massachusetts
- Daniel I Glazer
- Department of Radiology, Brigham and Women's Hospital, Boston, Massachusetts
- Ramin Khorasani
- Department of Radiology, Brigham and Women's Hospital, Boston, Massachusetts
- Dania Daye
- Department of Interventional Radiology, Massachusetts General Hospital, Boston, Massachusetts.
5. Donnelly LF, Podberesky DJ, Towbin AJ, Loh L, Basta KH, Platchek TS, Vossmeyer MT, Shook JE. The Joint Commission's Ongoing Professional Practice Evaluation Process: Costly, Ineffective, and Potentially Harmful to Safety Culture. J Am Coll Radiol 2024;21:61-69. PMID: 37683817; DOI: 10.1016/j.jacr.2023.08.031.
Abstract
OBJECTIVE To evaluate the estimated labor costs and effectiveness of Ongoing Professional Practice Evaluation (OPPE) processes at identifying outlier performers in a large sample of providers across multiple health care systems and to extrapolate costs and effectiveness nationally. METHODS Six hospital systems partnered to evaluate their labor expenses related to conducting OPPE. Estimates for mean labor hours and wages were created for the following: data analysts, medical staff office professionals, department physician leaders, and administrative assistants. The total number of outlier performers who were identified by OPPE metrics alone and whose identification resulted in lack of renewal, limitation, or revoking of hospital privileges during the past annual OPPE cycle (2022) was recorded. National costs of OPPE were extrapolated. A literature review of the effect of OPPE on safety culture in radiology was performed. RESULTS The six systems had a combined 12,854 privileged providers evaluated by OPPE. The total estimated annual recurring labor cost per provider was $50.20. Zero of 12,854 providers evaluated were identified as outlier performers solely through the OPPE process. The total estimated annual recurring cost of administering OPPE nationally was $78.54 million. In radiology over the past 15 years, the use of error rates based on score-based peer review as an OPPE metric has been perceived as punitive and has had an adverse effect on safety culture. CONCLUSION OPPE is expensive to administer, inefficient at identifying outlier performers, diverts human resources away from potentially more effective improvement work, and has been associated with an adverse impact on safety culture in radiology.
Affiliation(s)
- Lane F Donnelly
- Professor of Radiology and Pediatrics, Departments of Radiology and Pediatrics, University of North Carolina School of Medicine, Chapel Hill, North Carolina; Executive Medical Director, Pediatric Population Health and Quality, UNC Health; Director of Quality, UNC Children's Hospital; member, ACR Peer Learning Committee.
- Daniel J Podberesky
- Vice President and Chief Medical Officer, Nemours Children's Health, Orlando, Florida, and Professor of Radiology, University of Central Florida, College of Medicine, Orlando, Florida
- Alexander J Towbin
- Associate Chief, Associate Chief Medical Information Officer, and Neil D. Johnson Chair of Radiology Informatics, Department of Radiology, Cincinnati Children's Hospital, Cincinnati, Ohio; Professor of Radiology, Department of Radiology, University of Cincinnati College of Medicine, Cincinnati, Ohio; ACR Roles: Informatics Commission, Councilor-at-Large (2023), Data Science Institute Non-Interpretive Panel Cochair, LI-RADS Steering Committee-Pediatric LI-RADS, Relevance and Impact Workgroup, Pediatric Measures Committee, ACR Annual Meeting Abstract Reviewers, Pediatric AI Workgroup
- Ling Loh
- Director, Analytics and Clinical Effectiveness, Center for Pediatric and Maternal Value, Stanford Medicine Children's Health, Palo Alto, California
- Kathryne H Basta
- Assistant Director, Quality and Patient Safety, Department of Quality and Safety, Texas Children's Hospital, Houston, Texas
- Terry S Platchek
- Vice President for Performance Improvement and Associate Chief Quality Officer, Center for Pediatric and Maternal Value, Stanford Medicine Children's Health, Palo Alto, California; Professor, Pediatrics and Internal Medicine, and Fellowship Director, Clinical Excellence Research Center, Department of Pediatrics, Stanford University School of Medicine, Palo Alto, California
- Michael T Vossmeyer
- Department of Pediatrics, Cincinnati Children's Hospital, Cincinnati, Ohio; Associate Professor, Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, Ohio; Chair, Utilization Review Committee; Chair, Focused Professional Practice Evaluation/OPPE Committee; member, Credentials Committee; member, Medical Executive Committee, Cincinnati Children's Hospital
- Joan E Shook
- Center for Pediatric and Maternal Value, Stanford Medicine Children's Health, Palo Alto, California; Professor of Pediatrics-Emergency Medicine, Department of Pediatrics, Baylor College of Medicine, Houston, Texas; Chief Safety Officer, Deputy Chief Quality Officer, Texas Children's Hospital
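The national figure in this abstract is a simple scaling of the per-provider labor cost. A minimal sketch of that extrapolation; the national provider count is a hypothetical value back-calculated from the $78.54 million estimate, not a figure given in the abstract:

```python
# National OPPE cost extrapolation (reference 5).
cost_per_provider_usd = 50.20  # annual recurring labor cost per provider (reported)

# Hypothetical national count of privileged providers, back-calculated from the
# abstract's $78.54M national estimate; the study's actual input is not given here.
national_providers = 1_564_500

national_cost_usd = cost_per_provider_usd * national_providers
print(f"Estimated national annual OPPE cost: ${national_cost_usd / 1e6:.2f} million")  # ~ $78.54 million
```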
6. Parrott EH, Saeedipour S, Walker CM, Best SR, Harn NR, Ash RM. Transition from Peer Review to Peer Learning: Lessons Learned. Curr Probl Diagn Radiol 2023;52:223-229. PMID: 37069021; DOI: 10.1067/j.cpradiol.2023.03.003.
Abstract
Landmark publications, such as To Err is Human, confronted the healthcare community with the egregious toll medical errors take on both patient safety and overall healthcare costs. This heralded a paradigm shift and a call to action by professional organizations to enact methods to ensure physician competency and quality assurance. The American College of Radiology similarly convened a task force to discuss these concerns and how best to address quality assurance in radiology practice, leading to the development of RADPEER, a score-based peer review system. However, critics were quick to point out the deficiencies of this model, highlighting it as punitive and a poor evaluator of physician performance. The recognized deficiencies in score-based peer review prompted the pursuit of an alternate model that would instead emphasize learning and improvement. Peer learning was proposed, highlighting the necessity of an inclusive and collaborative environment where colleagues can discuss errors as learning opportunities without fear of punitive consequences. This paper explores peer learning, its benefits and challenges, as well as how to identify specific learning opportunities by utilizing case examples.
7. Morris DV, Sasson AL, Delman BN, Margolies LR. Investing in Peer Learning as a Qualifying Assessment Model in Breast Imaging: A Paradigm Shift from Peer Review to Peer Learning. Curr Radiol Rep 2022. DOI: 10.1007/s40134-022-00409-6.
8. Mojtahed A, Kilcoyne A, Crowley C, Furtado F, Anderson MA, Catalano OA, Gee MS, Kambadakone A, Saini S, Pandharipande PV. Introduction of a daily peer learning process with added value for faculty and trainees. Clin Imaging 2022;92:83-87. DOI: 10.1016/j.clinimag.2022.10.001.
9. Biddle G, Assadsangabi R, Broadhead K, Hacein-Bey L, Ivanovic V. Diagnostic Errors in Cerebrovascular Pathology: Retrospective Analysis of a Neuroradiology Database at a Large Tertiary Academic Medical Center. AJNR Am J Neuroradiol 2022;43:1271-1278. PMID: 35926887; PMCID: PMC9451623; DOI: 10.3174/ajnr.a7596.
Abstract
BACKGROUND AND PURPOSE Diagnostic errors affect 2%-8% of neuroradiology studies, resulting in significant potential morbidity and mortality. This retrospective analysis of a large database at a single tertiary academic institution focuses on diagnostic misses in cerebrovascular pathology and suggests error-reduction strategies. MATERIALS AND METHODS CT and MR imaging reports from a consecutive database spanning 2015-2020 were searched for attending physician errors involving cerebrovascular pathology. Data were collected on missed findings, study types, and interpretation settings. Errors were categorized as ischemic, arterial, venous, hemorrhagic, and "other." RESULTS A total of 245,762 CT and MR imaging neuroradiology examinations were interpreted during the study period. Vascular diagnostic errors were present in 165 reports, with a mean of 49.6 (SD, 23.3) studies on shifts when an error was made, compared with 34.9 (SD, 19.2) on shifts without detected errors (P < .0001). Seventy percent of the examinations occurred in the hospital setting; 93.3% of errors were perceptual and 6.7% were interpretive; and 93.9% (n = 155) were clinically significant (RADPEER 2B or 3B). The distribution of errors was arterial and ischemic, each with 33.3%; hemorrhagic with 21.8%; and venous with 7.5%. Most errors involved brain MR imaging (30.3%), followed by head CTA (27.9%) and noncontrast head CT (26.1%). The most common misses were acute/subacute infarcts (25.1%), followed by aneurysms (13.7%) and subdural hematomas (9.7%). CONCLUSIONS Most cerebrovascular diagnostic errors were perceptual and clinically significant, occurred in the emergency/inpatient setting, and were associated with higher-volume shifts. Diagnostic errors could be minimized by adjusting search patterns to ensure vigilance for frequently missed pathologies.
Affiliation(s)
- G Biddle
- From the Neuroradiology Division (G.B., L.H.-B.), Department of Radiology, University of California Davis School of Medicine, Sacramento, California
- R Assadsangabi
- Neuroradiology Division (R.A.), Department of Radiology, University of Southern California, Los Angeles, California
- K Broadhead
- Department of Statistics (K.B.), University of California Davis, Davis, California
- L Hacein-Bey
- From the Neuroradiology Division (G.B., L.H.-B.), Department of Radiology, University of California Davis School of Medicine, Sacramento, California
- V Ivanovic
- Neuroradiology Division (V.I.), Department of Radiology, Medical College of Wisconsin, Milwaukee, Wisconsin
10. Sayyouh MMH, Sella EC, Shankar PR, Marshall GE, Quint LE, Agarwal PP. Lessons Learned from Peer Learning Conference in Cardiothoracic Radiology. Radiographics 2022;42:579-593. PMID: 35148241; DOI: 10.1148/rg.210125.
Abstract
Medical errors may lead to patient harm and may also have a devastating effect on medical providers, who may suffer from guilt and the personal impact of a given error (second victim experience). While it is important to recognize and remedy errors, it should be done in a way that leads to long-standing practice improvement and focuses on systems-level opportunities rather than in a punitive fashion. Traditional peer review systems are score based and have some undesirable attributes. The authors discuss the differences between traditional peer review systems and peer learning approaches and offer practical suggestions for transitioning to peer learning conferences. Peer learning conferences focus on learning opportunities and embrace errors as an opportunity to learn. The authors also discuss various types and sources of errors relevant to the practice of radiology and how discussions in peer learning conferences can lead to widespread system improvement. In the authors' experience, these strategies have resulted in practice improvement not only at a division level in radiology but in a broader multidisciplinary setting as well. The online slide presentation from the RSNA Annual Meeting is available for this article. ©RSNA, 2022.
Affiliation(s)
- Mohamed M H Sayyouh
- From the Cardiothoracic Imaging Division, Department of Radiology, University of Michigan, Taubman Center B1-132D, 1500 E Medical Center Dr, Ann Arbor, MI 48109-5302 (M.M.H.S., E.C.S., G.E.M., L.E.Q., P.P.A.); and Abdominal Imaging Division and Michigan Radiology Quality Collaborative, Department of Radiology, University of Michigan, Ann Arbor, Mich (P.R.S.)
- Edith C Sella
- From the Cardiothoracic Imaging Division, Department of Radiology, University of Michigan, Taubman Center B1-132D, 1500 E Medical Center Dr, Ann Arbor, MI 48109-5302 (M.M.H.S., E.C.S., G.E.M., L.E.Q., P.P.A.); and Abdominal Imaging Division and Michigan Radiology Quality Collaborative, Department of Radiology, University of Michigan, Ann Arbor, Mich (P.R.S.)
- Prasad R Shankar
- From the Cardiothoracic Imaging Division, Department of Radiology, University of Michigan, Taubman Center B1-132D, 1500 E Medical Center Dr, Ann Arbor, MI 48109-5302 (M.M.H.S., E.C.S., G.E.M., L.E.Q., P.P.A.); and Abdominal Imaging Division and Michigan Radiology Quality Collaborative, Department of Radiology, University of Michigan, Ann Arbor, Mich (P.R.S.)
- Giselle E Marshall
- From the Cardiothoracic Imaging Division, Department of Radiology, University of Michigan, Taubman Center B1-132D, 1500 E Medical Center Dr, Ann Arbor, MI 48109-5302 (M.M.H.S., E.C.S., G.E.M., L.E.Q., P.P.A.); and Abdominal Imaging Division and Michigan Radiology Quality Collaborative, Department of Radiology, University of Michigan, Ann Arbor, Mich (P.R.S.)
- Leslie E Quint
- From the Cardiothoracic Imaging Division, Department of Radiology, University of Michigan, Taubman Center B1-132D, 1500 E Medical Center Dr, Ann Arbor, MI 48109-5302 (M.M.H.S., E.C.S., G.E.M., L.E.Q., P.P.A.); and Abdominal Imaging Division and Michigan Radiology Quality Collaborative, Department of Radiology, University of Michigan, Ann Arbor, Mich (P.R.S.)
- Prachi P Agarwal
- From the Cardiothoracic Imaging Division, Department of Radiology, University of Michigan, Taubman Center B1-132D, 1500 E Medical Center Dr, Ann Arbor, MI 48109-5302 (M.M.H.S., E.C.S., G.E.M., L.E.Q., P.P.A.); and Abdominal Imaging Division and Michigan Radiology Quality Collaborative, Department of Radiology, University of Michigan, Ann Arbor, Mich (P.R.S.)
11. Phalak KA, Gerlach K, Parikh JR. Peer learning in breast imaging. Clin Imaging 2022;85:60-63. PMID: 35247790; DOI: 10.1016/j.clinimag.2022.02.027.
Abstract
With the increasing focus on quality and safety in medicine, radiology practices are increasingly transitioning from traditional score-based peer review to peer learning. Participation in a peer learning program can increase learning, practice improvement, and cultivation of interpersonal relationships in a non-punitive environment. Because breast imaging errors are among the most frequently cited in medical malpractice cases, attention to and reduction of these errors are especially important. We describe the strengths of a peer learning program, the process of implementing one in a breast imaging practice, challenges to overcome, and strategies to support success.
Affiliation(s)
- Kanchan A Phalak
- Department of Radiology, University MD Anderson Cancer Center, Houston, TX, USA.
- Karen Gerlach
- Department of Radiology, University MD Anderson Cancer Center, Houston, TX, USA.
- Jay R Parikh
- Department of Radiology, University MD Anderson Cancer Center, Houston, TX, USA.
12. Torres FS, Costa AF, Kagoma Y, Arrigan M, Scott M, Yemen B, Hurrell C, Kielar A. CAR Peer Learning Guide. Can Assoc Radiol J 2022;73:491-498. PMID: 35077247; DOI: 10.1177/08465371211065454.
Abstract
Peer learning is a quality initiative used to identify potential areas of practice improvement, both on a patient level and on a systemic level. Opportunities for peer learning include review of prior imaging studies, evaluation of cases from multidisciplinary case conferences, and review of radiology trainees' call cases. Peer learning is non-punitive and focuses on promoting life-long learning. In contrast to traditional peer review, it seeks to identify and disseminate learning opportunities and areas for systems improvement. Learning opportunities arise from peer learning both through individual communication of cases reviewed during routine work and through anonymous presentation of aggregate cases in an educational format. In conjunction with other tools such as root cause analysis, peer learning can be used to guide future practice improvement opportunities. This guide provides definitions of terms and a synthesis of the evidence regarding peer review and peer learning, as well as medicolegal and jurisdictional considerations. Important aspects of what makes an effective peer learning program and best practices for implementing such a program are presented. The guide is intended to be a living document that will be updated regularly as new data emerge and peer learning continues to evolve in radiology practices.
Affiliation(s)
- Felipe Soares Torres
- Joint Department of Medical Imaging, Toronto General Hospital, University of Toronto, Toronto, ON, Canada
- Andreu F Costa
- Department of Radiology, Queen Elizabeth II Health Sciences Centre, Dalhousie University, Halifax, NS, Canada
- Yoan Kagoma
- Hamilton Health Sciences, McMaster University Faculty of Health Sciences, Hamilton, ON, Canada
- Malcolm Scott
- Misericordia Community Hospital, University of Alberta, Edmonton, AB, Canada
- Brian Yemen
- Hamilton Health Sciences, McMaster University, Hamilton, ON, Canada
- Casey Hurrell
- Canadian Association of Radiologists, Ottawa, ON, Canada
- Ania Kielar
- Joint Department of Medical Imaging, University of Toronto, Toronto, ON, Canada
13. Bowman AW, Tan N, Adamo DA, Chen F, Venkatesh SK, Baumgarten DA. Implementation of peer learning conferences throughout a multi-site abdominal radiology practice. Abdom Radiol (NY) 2021;46:5489-5499. PMID: 33999282; DOI: 10.1007/s00261-021-03114-8.
Abstract
PURPOSE To initiate a peer learning conference for our abdominal radiology division across multiple geographically separated sites and different time zones, and to determine radiologist preference for peer learning versus traditional score-based peer review. METHODS We implemented a monthly peer learning videoconference for our abdominal radiology division. Surveys of radiologist opinion regarding traditional peer review and the new peer learning conferences were conducted before and after 6 months of conferences. RESULTS Peer learning conferences were well attended across our multiple sites, with an average of 43 participants per conference. Radiologist opinion of peer review was poor, with surveyed radiologists responding positively to only 1 of 12 process questions. Opinion of peer learning was extremely favorable, with radiologists responding positively to all 12 of the same process questions. After 6 months of peer learning conferences, 87.9% of surveyed radiologists wished to continue them in some fashion, and none preferred to return to score-based peer review alone. CONCLUSION We successfully implemented a peer learning conference for our abdominal radiology division spread across multiple geographic sites. Our radiologists strongly preferred peer learning conferences over our traditional peer review system for quality control.
Affiliation(s)
- Andrew W Bowman
- Department of Radiology, Mayo Clinic, 4500 San Pablo Rd, Jacksonville, FL, 32224, USA.
- Nelly Tan
- Department of Radiology, Mayo Clinic, 5777 Mayo Blvd, Phoenix, AZ, 85054, USA
- Daniel A Adamo
- Department of Radiology, Mayo Clinic, 200 First St SW, Rochester, MN, 55905, USA
- Frederick Chen
- Department of Radiology, Mayo Clinic, 5777 Mayo Blvd, Phoenix, AZ, 85054, USA
- Sudhakar K Venkatesh
- Department of Radiology, Mayo Clinic, 200 First St SW, Rochester, MN, 55905, USA
- Deborah A Baumgarten
- Department of Radiology, Mayo Clinic, 4500 San Pablo Rd, Jacksonville, FL, 32224, USA
14. Schafer LE, Perry H, Fishman MD, Jakomin BV, Slanetz PJ. Incorporating Peer Learning Into Your Breast Imaging Practice. J Breast Imaging 2021;3:491-497. PMID: 38424796; DOI: 10.1093/jbi/wbab043.
Abstract
Traditional score-based peer review has come under scrutiny in recent years, as studies have demonstrated it to be generally ineffective at improving quality. Many practices and programs are transitioning to a peer learning model to replace or supplement traditional peer review. Peer learning differs from traditional score-based peer review in that the emphasis is on sharing learning opportunities and creating an environment that fosters discussion of errors in a nonpunitive forum with the goal of improved patient care. Creating a just culture is central to fostering successful peer learning. In a just culture, mistakes can be discussed without shame or fear of retribution and the focus is on systems improvement rather than individual blame. Peer learning, as it pertains to breast imaging, can occur in many forms and venues. Examples of the various formats in which peer learning can occur include through individual colleague interaction, as well as divisional, multidisciplinary, department-wide, and virtual conferences, and with the assistance of artificial intelligence. Incorporating peer learning into the practice of breast imaging aims to reduce delayed diagnoses of breast cancer and optimize patient care.
Affiliation(s)
- Leah E Schafer
- Boston Medical Center and Boston University School of Medicine, Department of Radiology, Boston, MA, USA
- Hannah Perry
- University of Vermont Medical Center and Larner College of Medicine at the University of Vermont, Department of Radiology, Burlington, VT, USA
- Michael DC Fishman
- Boston Medical Center and Boston University School of Medicine, Department of Radiology, Boston, MA, USA
- Bernadette V Jakomin
- Boston Medical Center and Boston University School of Medicine, Department of Radiology, Boston, MA, USA
- Priscilla J Slanetz
- Boston Medical Center and Boston University School of Medicine, Department of Radiology, Boston, MA, USA
15. Lamoureux C, Hanna TN, Sprecher D, Weber S, Callaway E. Radiologist errors by modality, anatomic region, and pathology for 1.6 million exams: what we have learned. Emerg Radiol 2021;28:1135-1141. PMID: 34328592; DOI: 10.1007/s10140-021-01959-6.
Abstract
PURPOSE To evaluate the feasibility of adding pathology to recent radiologist error characterization schemes of modality and anatomic region, and the potential of these data to more specifically inform peer review and peer learning. METHODS Quality assurance data originating from 349 radiologists in a national teleradiology practice were collected for 2019. Interpretive errors were categorized as major or minor; reporting or communication errors were classified as administrative errors. Interpretive errors were then divided by modality and anatomic region and placed into one of 64 pathologic categories. RESULTS Out of 1,628,464 studies, the discrepancy rate was 0.5% (8181/1,634,201). The 8181 total errors consisted of 2992 major errors (0.18%) and 5189 minor errors (0.32%); 3.1% (257/8181) of total errors were administrative. Of major interpretive errors, 75.5% occurred on CT, with CT abdomen and pelvis accounting for 40.4%. The most common pathologic discrepancy for all exams was in the category of mass, nodule, or adenopathy (1583/8181), the majority of which were minor (1315/1583). The most common pathologic discrepancy for the 2937 major interpretive errors was fracture or dislocation (27%; 793/2937), followed by bleed (10.7%; 315/2937). CONCLUSION The addition of error-related pathology to peer review is both feasible and practical and provides a more detailed guide to targeted individual and practice-wide peer learning quality improvement efforts. Future research is needed to determine whether there are measurable improvements in detection or interpretation of specific pathologies following error feedback and educational interventions.
Affiliation(s)
- Tarek N Hanna
- Division of Emergency Radiology, Department of Radiology and Imaging Sciences, Emory University, 550 Peachtree Rd, Atlanta, GA, 30308, USA
- Devin Sprecher
- Virtual Radiologic, 11995 Singletree Ln #500, Eden Prairie, MN, 55344, USA
- Scott Weber
- Virtual Radiologic, 11995 Singletree Ln #500, Eden Prairie, MN, 55344, USA
- Edward Callaway
- Virtual Radiologic, 11995 Singletree Ln #500, Eden Prairie, MN, 55344, USA
16. Peer Learning Through Multi-Institutional Case Conferences: Abdominal and Cardiothoracic Radiology Experience. Acad Radiol 2021;28:255-260. PMID: 32061469; DOI: 10.1016/j.acra.2020.01.015.
Abstract
RATIONALE AND OBJECTIVES We describe a model of multi-institutional, multisociety, online case conferences: case-based group discussions of selected (nonrandom) cases that are subsequently hosted on social media and online platforms (e.g., YouTube, websites) to be available to a wider audience. MATERIALS AND METHODS Using online conferencing software (Zoom, GoToMeeting), abdominal and cardiothoracic radiologists engage in separate, subspecialty one-hour meetings discussing a variety of meaningful cases. Participants take turns presenting their cases to the group and discuss significant findings, interpretations, differential diagnoses, and any other teaching points. All of the case conferences for both societies are recorded and edited for upload to YouTube and the societies' respective websites. RESULTS Participants in these conferences log in from 14 institutions in 7 states across the United States, and the YouTube videos reach thousands of people around the world. The abdominal case conference has 90 videos uploaded to YouTube with almost 1,300 views. The thoracic case conference (Society of Thoracic Radiology) has been running for over 7 years, with 226 videos uploaded to YouTube, 38,200 views, 1426 subscribers, and a total watch time of over 525,800 minutes. Twitter has been utilized by both groups to promote online viewership. CONCLUSION Our model is feasible and effective compared with traditional peer review. The cases selected are deliberate and focused on quality improvement and/or education. We harness online engagement, specifically a social media presence, which has opened new opportunities to educate our peers and reach a global audience, including the nonradiologic community, to learn about radiology and unique practices.
17. Radiologist Opinions of a Quality Assurance Program: The Interaction Between Error, Emotion, and Preventative Action. Acad Radiol 2021;28:e54-e61. PMID: 32139303; DOI: 10.1016/j.acra.2020.01.027.
Abstract
RATIONALE AND OBJECTIVES To investigate inter-relationships between radiologist opinions of a quality assurance (QA) program, QA Committee communications, negative emotions, self-identified risk factors, and preventive actions taken following major errors. MATERIALS AND METHODS A 48-question electronic survey was distributed to all 431 radiologists within the same teleradiology organization between June 15 and July 3, 2018. Two reminders were sent during the survey period. Descriptive statistics were generated, and comparisons were made with the Fisher exact test. The significance level was set at p < 0.05. RESULTS The response rate was 67.5% (291/431), and 72.5% of respondents completed all survey questions. A total of 64.3% (187/291) of respondents were male, and the highest proportion of radiologists (28.9%) had been in practice >20 years. Preventative actions following an error were positively correlated with a higher opinion of the QA process, self-identification of personal risk factors for error, and greater negative emotions following an error (all p < 0.05). A higher opinion of communications with the QA committee was associated with a positive opinion of the QA process (p < 0.001). An inverse relationship existed between negative emotion and opinion of QA committee communications (p < 0.05), and between negative emotion and opinion of the QA process (p < 0.05). Radiologist gender and full-time versus part-time status had a significant effect on perception of the QA process (p < 0.05). CONCLUSION Radiologist opinions of their institutional QA process were related to the number of negative emotions experienced and preventative actions taken following major errors. Nurturing trust and incorporating more positive feedback into the QA process may improve interactions with QA Committees and mitigate future errors.
18. Larson DB, Broder JC, Bhargavan-Chatfield M, Donnelly LF, Kadom N, Khorasani R, Sharpe RE, Pahade JK, Moriarity AK, Tan N, Siewert B, Kruskal JB. Transitioning From Peer Review to Peer Learning: Report of the 2020 Peer Learning Summit. J Am Coll Radiol 2020;17:1499-1508. DOI: 10.1016/j.jacr.2020.07.016.
19. Collaborative Learning in Radiology: From Peer Review to Peer Learning and Peer Coaching. Acad Radiol 2020;27:1261-1267. PMID: 31636005; DOI: 10.1016/j.acra.2019.09.021.
Abstract
BACKGROUND A Radiology Research Alliance Task Force was assembled in 2018 to review the literature on peer review and report on best practices for peer learning and peer coaching. FINDINGS This report provides a historical perspective on peer review and the transition to peer collaborative learning and peer coaching. Most forms of current peer review have fulfilled regulatory requirements but have failed to significantly impact quality improvement or learning opportunities. Peer learning involves joint intellectual efforts by two or more individuals to study best practices and review error collaboratively. Peer coaching is a process in which individuals in a trusted environment work to expand, refine, and build new skills in order to facilitate self-directed learning and professional growth. We discuss the value in creating opportunities for peer learning and peer coaching. CONCLUSION Peer collaborative learning combined with peer coaching provides opportunities for teams to learn and grow together, benefit from each other's expertise and experience, improve faculty morale, and provide more opportunities for collaborations between faculty.
20. Maurer MH, Brönnimann M, Schroeder C, Ghadamgahi E, Streitparth F, Heverhagen JT, Leichtle A, de Bucourt M, Meyl TP. Time Requirement and Feasibility of a Systematic Quality Peer Review of Reporting in Radiology. Fortschr Röntgenstr 2021;193:160-167. PMID: 32698235; DOI: 10.1055/a-1178-1113.
Abstract
OBJECTIVE To estimate the human resources required for a retrospective quality review of different percentages of all routine diagnostic procedures in the Department of Radiology at Bern University Hospital, Switzerland. MATERIALS AND METHODS Three board-certified radiologists retrospectively evaluated the quality of the radiological reports of a total of 150 examinations (5 different examination types: abdominal CT, chest CT, mammography, conventional X-ray images, and abdominal MRI). Each report was assigned a RADPEER score of 1 to 3 (score 1: concur with previous interpretation; score 2: discrepancy in interpretation/not ordinarily expected to be made; score 3: discrepancy in interpretation/should be made most of the time). The time (in seconds, s) required for each review was documented and compared. A sensitivity analysis was conducted to calculate the total workload for reviewing different percentages of the total annual reporting volume of the clinic. RESULTS Among the total of 450 reviews analyzed, 91.1% (410/450) were assigned a score of 1 and 8.9% (40/450) were assigned scores of 2 or 3. The average time required for a peer review was 60.4 s (min. 5 s, max. 245 s). The reviewer with the greatest clinical experience needed significantly less time to review the reports than the two reviewers with less clinical expertise (p < 0.05). Average review times were longer for discrepant ratings with a score of 2 or 3 (p < 0.05). The total time requirement calculated for reviewing all 5 types of examination for one year would be more than 1200 working hours. CONCLUSION A retrospective peer review of reports of radiological examinations using the RADPEER system requires considerable human resources. However, to improve quality, it seems feasible to peer review at least a portion of the total yearly reporting volume. KEY POINTS
- A systematic retrospective assessment of the content of radiological reports using the RADPEER system involves high personnel costs.
- The retrospective assessment of all reports of a clinic or practice seems unrealistic due to the lack of highly specialized personnel.
- At least a portion of all reports should be reviewed with the aim of improving report quality.
Affiliation(s)
- Martin H Maurer
- Department of Diagnostic, Interventional and Paediatric Radiology, Inselspital, Bern University Hospital, University of Bern, Switzerland
- Michael Brönnimann
- Department of Diagnostic, Interventional and Paediatric Radiology, Inselspital, Bern University Hospital, University of Bern, Switzerland
- Christophe Schroeder
- Department of Diagnostic, Interventional and Paediatric Radiology, Inselspital, Bern University Hospital, University of Bern, Switzerland
- Johannes T Heverhagen
- Department of Diagnostic, Interventional and Paediatric Radiology, Inselspital, Bern University Hospital, University of Bern, Switzerland
- Alexander Leichtle
- Institute of Clinical Chemistry, Inselspital, Bern University Hospital, University of Bern, Switzerland
- Maximilian de Bucourt
- Institute for Diagnostic and Interventional Radiology, Charité University Medicine Berlin, Germany
- Tobias Philipp Meyl
- Medical Department, Medical Strategy, Inselspital, Bern University Hospital, University of Bern, Switzerland
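The sensitivity analysis described in reference 20 amounts to scaling the mean review time by the fraction of annual volume reviewed. A minimal sketch; the annual reporting volume is a hypothetical value back-calculated from the abstract's ">1200 working hours" figure, not a number given in the abstract:

```python
# Peer-review workload sensitivity analysis in the spirit of reference 20.
mean_review_s = 60.4  # mean time per review, as reported

# Hypothetical annual reporting volume for the five examination types,
# back-calculated from the abstract's ">1200 working hours" figure.
annual_reports = 72_000

for pct in (100, 50, 25, 10, 5):
    hours = annual_reports * (pct / 100) * mean_review_s / 3600
    print(f"Reviewing {pct:>3}% of reports ≈ {hours:,.0f} working hours/year")
# 100% of ~72,000 reports at 60.4 s each comes to ~1,208 hours, i.e. >1200 h/year.
```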
21. Integration of Peer Review in PACS Results in a Marked Increase in the Discrepancies Reported. AJR Am J Roentgenol 2020;214:613-617. PMID: 31846375; DOI: 10.2214/ajr.19.21952.
Abstract
OBJECTIVE. The objective of this article is to assess the impact of integrating peer review in PACS on the reporting of discrepancies. Our hypothesis is that a PACS-integrated machine-randomized and semiblinded peer review tool leads to an increase in discrepancies reported. MATERIALS AND METHODS. A PACS tool was implemented to prompt radiologists to perform peer review of prior comparison studies in a randomized fashion. The reviewed radiologist's name was omitted from the prior report in PACS. Before this implementation, radiologists entered peer reviews directly on the RADPEER website. Three academic subspecialty sections comprising 24 radiologists adopted the tool (adopters group). Three sections comprising 14 radiologists did not adopt the tool (nonadopters group). Peer review submissions were analyzed for 4 months before and 4 months after the implementation. The mean rate of significant discrepancies (RADPEER score 2b or higher) reported per radiologist was calculated and the discrepancy rates of the periods before and after the implementation were compared. RESULTS. The mean significant discrepancy rate reported per radiologist in the adopters group increased from 0.19% ± 0.46% (SD) before the implementation to 0.93% ± 1.45% after implementation (p = 0.01). No significant discrepancies were reported by the nonadopters group in either period. CONCLUSION. In this single institutional retrospective analysis, integrating peer review in PACS resulted in a fivefold increase in reported significant discrepancies. These results suggest that peer review data are influenced by the design of the tool used including PACS integration, randomization, and blinding.
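The "fivefold increase" in this abstract is simply the ratio of the two reported mean discrepancy rates, as the following one-line check shows:

```python
# Fold change in reported significant-discrepancy rates (reference 21).
rate_before = 0.19  # mean % per radiologist, pre-implementation (reported)
rate_after = 0.93   # mean % per radiologist, post-implementation (reported)

print(f"Fold increase: {rate_after / rate_before:.1f}x")  # ~4.9, i.e. roughly fivefold
```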
22. Peer learning on a shoe string: success of a distributive model for peer learning in a community radiology practice. Clin Imaging 2020;59:114-118. DOI: 10.1016/j.clinimag.2019.10.012.
23. Chaudhry H, Del Gaizo AJ, Frigini LA, Goldberg-Stein S, Long SD, Metwalli ZA, Morgan JA, Nguyen XV, Parker MS, Abujudeh H. Forty-One Million RADPEER Reviews Later: What We Have Learned and Are Still Learning. J Am Coll Radiol 2020;17:779-785. PMID: 31991118; DOI: 10.1016/j.jacr.2019.12.023.
Abstract
ACR RADPEER® is the leading method of radiologic peer review in the United States. The program has evolved since its inception in 2002 and was most recently updated in 2016. In 2018, a survey was sent to RADPEER participants to gauge the current state of the program and explore opportunities for continued improvement. A total of 26 questions were included, and more than 300 practices responded. In this report, the ACR RADPEER Committee authors summarize the survey results and discuss opportunities for future iterations of the RADPEER program.
Affiliation(s)
- Humaira Chaudhry
- Department of Radiology, Rutgers - New Jersey Medical School, Newark, New Jersey.
- Andrew J Del Gaizo
- Department of Radiology, Department of Veterans Affairs National Teleradiology Program, Durham, North Carolina and Wake Forest University Baptist Medical Center, Winston-Salem, North Carolina
- Shlomit Goldberg-Stein
- Montefiore Medical Center, The University Hospital at Albert Einstein College of Medicine, Bronx, New York
- Scott D Long
- Southern Illinois University School of Medicine, Springfield, Illinois
- Zeyad A Metwalli
- Department of Interventional Radiology, MD Anderson Cancer Center, Houston, Texas
- Xuan V Nguyen
- Department of Radiology, The Ohio State University College of Medicine, Columbus, Ohio
- Mark S Parker
- Thoracic Imaging Division, VCU Health Systems, Richmond, Virginia
- Hani Abujudeh
- Detroit Medical Center, Envision Physician Services, Detroit, Michigan
24. Current Status and Future Wish List of Peer Review: A National Questionnaire of U.S. Radiologists. AJR Am J Roentgenol 2020;214:493-497. PMID: 31939700; DOI: 10.2214/ajr.19.22194.
Abstract
OBJECTIVE. Most peer review programs focus on error detection, numeric scoring, and radiologist-specific error rates. The effectiveness of this method on learning and systematic improvement is uncertain at best. Radiologists have been pushing for a transition from an individually punitive peer review system to a peer-learning model. This national questionnaire of U.S. radiologists aims to assess the current status of peer review and opportunities for improvement. MATERIALS AND METHODS. A 21-question multiple-choice questionnaire was developed and face validity assessed by the ARRS Performance Quality Improvement subcommittee. The questionnaire was e-mailed to 17,695 ARRS members and open for 4 weeks; two e-mail reminders were sent. Response collection was anonymous. Only responses from board-certified, practicing radiologists participating in peer review were analyzed. RESULTS. The response rate was 4.2% (742/17,695), and 73.7% (547/742) met inclusion criteria. Most responders were in private practice (51.7%, 283/547) with a group size of 11-50 radiologists (50.5%) and in an urban setting (61.6%). Significant diversity was noted in peer review systems, with RADPEER used by less than half (45.0%) and cases selected most commonly by commercial software (36.2%) or manually (31.2%). There was no consensus on the number of required peer reviews per month (10-20 cases, 32.1%; > 20 cases, 29.1%; < 10 cases, 21.7%). Less than half (43.7%) did not use peer review for group education. Whereas most (67.7%) were notified of their peer review results individually, 21.5% were not notified at all. Around half were dissatisfied (44.5%) because of insufficient learning (94.0%) and inaccurate representation of their performance improvement (75.5%). Overall, the group discrepancy rates were unknown to most radiologists who participate in peer review (54.3%). Submission bias was the main reason for underreporting of serious discrepancies (49.0%). Most found four peer-learning methods feasible in daily practice: incidental observation, 65.1%; focused practice review, 52.9%; professional auditing, 45.8%; and blinded double reading, 35.4%. CONCLUSION. More than half of participants reported that peer review data are used for educational purposes. However, significant diversity remains in current peer review practice with no agreement on number of required reviews, method of case selection, and oversight of results. Nearly half of the radiologists reported insufficient learning, although most feel a better system would be feasible in daily practice.
25. Brown SD, Bruno MA, Shyu JY, Eisenberg R, Abujudeh H, Norbash A, Gallagher TH. Error Disclosure and Apology in Radiology: The Case for Further Dialogue. Radiology 2019;293:30-35. DOI: 10.1148/radiol.2019190126.
26. Utility of an Automated Radiology-Pathology Feedback Tool. J Am Coll Radiol 2019;16:1211-1217. DOI: 10.1016/j.jacr.2019.03.001.
27
Pfeifer CM. Overcoming Barriers to Effective Peer Review. J Am Coll Radiol 2019; 16:886-888. [DOI: 10.1016/j.jacr.2018.12.002]
28
Iyer RS, Swenson DW, Anand N, Blumfield E, Chandra T, Chavhan GB, Goodman TR, Khan N, Moore MM, Ngo TD, Sammet CL, Sze RW, Vera CD, Stanescu AL. Survey of peer review programs among pediatric radiologists: report from the SPR Quality and Safety Committee. Pediatr Radiol 2019; 49:517-525. [PMID: 30923884 DOI: 10.1007/s00247-018-4289-3]
Abstract
During the last 15 years, peer review has been widely incorporated into radiology quality improvement programs. However, current implementations are variable and carry concerns, including subjectivity of numerical scores and a sense of merely satisfying regulatory requirements. The Society for Pediatric Radiology (SPR) Quality and Safety Committee sought to evaluate the state of peer review programs in pediatric radiology practices, including implementation methods, perceived functions, strengths and weaknesses, and opportunities for improvement. We distributed an online 16-question survey to SPR members. Questions pertained to the type of peer review system, the use of numerical scores and comments, how feedback on discordances is given and received, and the use of peer learning conferences. We collected 219 responses (15% of survey invitations), 80% of which were from children's hospitals. Fifty percent of respondents said they use a picture archiving and communication system (PACS)-integrated peer review system. Comment-enhanced feedback for interpretive discordances was either very important or somewhat important to performance improvement in 86% of responses, compared to 48% with a similar perception of numerical scores. Sixty-eight percent of respondents said they either rarely or never check their numerical scores, and 82% either strongly or somewhat agreed that comments are more effective feedback than numerical scores. Ninety-three percent either strongly or somewhat agreed that peer learning conferences would be beneficial to their practice. Forty-eight percent thought that their current peer review system should be modified. Survey results demonstrate that peer review systems in pediatric radiology practices are implemented variably, and nearly half of respondents believe their systems should be modified. Most respondents prefer feedback in the form of comments and peer learning conferences, which are thought to be more beneficial for performance improvement than numerical scores.
Affiliation(s)
- Ramesh S Iyer: Department of Radiology, MA.7.220, Seattle Children's Hospital, University of Washington School of Medicine, 4800 Sand Point Way NE, Seattle, WA, 98105, USA
- David W Swenson: Department of Radiology, Warren Alpert School of Medicine, Brown University, Providence, RI, USA
- Neil Anand: Department of Diagnostic Radiology, Morristown Medical Center, Morristown, NJ, USA
- Einat Blumfield: Department of Radiology, Children's Hospital of Montefiore, Albert Einstein College of Medicine, Bronx, NY, USA
- Tushar Chandra: Department of Medical Imaging, Nemours Children's Hospital, Orlando, FL, USA
- Govind B Chavhan: Department of Radiology, The Hospital for Sick Children, Toronto, ON, Canada
- Naeem Khan: Department of Diagnostic Imaging, IWK Health Center, Halifax, NS, Canada
- Michael M Moore: Department of Radiology, Pennsylvania State University, Hershey, PA, USA
- Thang D Ngo: Department of Medical Imaging, Nemours Children's Hospital, Orlando, FL, USA
- Christina L Sammet: Department of Radiology, Ann & Robert H. Lurie Children's Hospital, Chicago, IL, USA
- Raymond W Sze: Department of Radiology, Children's Hospital of Philadelphia, Philadelphia, PA, USA
- Chido D Vera: Department of Radiology, Children's Hospital of Pittsburgh, Pittsburgh, PA, USA
- A Luana Stanescu: Department of Radiology, MA.7.220, Seattle Children's Hospital, University of Washington School of Medicine, 4800 Sand Point Way NE, Seattle, WA, 98105, USA
29
Practical considerations when implementing peer learning conferences. Pediatr Radiol 2019; 49:526-530. [PMID: 30923885 DOI: 10.1007/s00247-018-4305-7]
Abstract
Peer learning represents a shift away from traditional peer review. Peer learning focuses on improving diagnostic performance rather than on scoring suboptimal performance. Shifting the focus away from random case selection and toward identification of cases with valuable teaching points can encourage more active radiologist engagement in the learning process. An effective peer learning program relies on a trusting environment that lessens the fear of embarrassment or punitive action. Here we describe the shortcomings of traditional peer review and the benefits of peer learning, and we provide tips for a successful peer learning program along with examples of implementation.
30
Luo M, Berkowitz S, Nguyen Q, Yam CS, Faintuch S, Ahmed M, Collares F, Sarwar A, Weinstein JL, Brook OR. Electronic IR Group Peer Review and Learning Performed during Daily Clinical Rounds. J Vasc Interv Radiol 2019; 30:594-600. [DOI: 10.1016/j.jvir.2018.09.005]
31
Charkhchi P, Wang B, Caffo B, Yousem DM. Bias in Neuroradiology Peer Review: Impact of a "Ding" on "Dinging" Others. AJNR Am J Neuroradiol 2018; 40:19-24. [PMID: 30523137 DOI: 10.3174/ajnr.a5908]
Abstract
BACKGROUND AND PURPOSE The validity of radiology peer review requires an unbiased assessment of studies in an environment that values the process. We assessed radiologists' behavior when reviewing colleagues' reports. We hypothesized that when a radiologist receives a discrepant peer review, he or she is more likely to submit a discrepant review about another radiologist. MATERIALS AND METHODS We analyzed the anonymous peer review submissions of 13 neuroradiologists in semimonthly blocks of time from 2016 to 2018. We defined a discrepant review as any one of the following: 1) detection miss, clinically significant; 2) detection miss, clinically not significant; 3) interpretation miss, clinically significant; or 4) interpretation miss, clinically not significant. We used random-effects Poisson regression analysis to determine whether a neuroradiologist was more likely to submit a discrepant report during the semimonthly block in which he or she received one versus the semimonthly block thereafter. RESULTS Four hundred sixty-eight discrepant peer review reports were submitted; 161 were submitted in the same semimonthly block as receipt of a discrepant report and 325 were not. Receiving a discrepant report had a positive effect on submitting discrepant reports: an expected relative increase of 14% (95% CI, 8%-21%). Notably, receiving a clinically not significant discrepant report (coefficient = 0.13; 95% CI, 0.05-0.22) significantly and positively correlated with submitting a discrepant report within the same time block, but this was not true of clinically significant reports. CONCLUSIONS The receipt of a clinically not significant discrepant report leads to a greater likelihood of submitting a discrepant report. The motivation for such an increase should be explored for potential bias.
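The underlying analysis is a count regression with per-radiologist grouping. A sketch of a comparable model in Python on simulated data with hypothetical column names; a GEE Poisson regression clustered on radiologist stands in here for the paper's random-effects Poisson model and is not the authors' exact method:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical data: one row per radiologist per semimonthly block.
# n_submitted = discrepant reviews submitted in the block;
# received_discrepant = 1 if a discrepant review was received that block.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "radiologist": np.repeat(np.arange(13), 48),
    "received_discrepant": rng.integers(0, 2, 13 * 48),
})
df["n_submitted"] = rng.poisson(1.0 + 0.14 * df["received_discrepant"])

# GEE Poisson clustered on radiologist, a stand-in for the paper's
# random-effects Poisson regression.
model = sm.GEE.from_formula(
    "n_submitted ~ received_discrepant",
    groups="radiologist",
    data=df,
    family=sm.families.Poisson(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
result = model.fit()

# exp(beta) is the rate ratio; the paper reports about 1.14, i.e. a 14%
# relative increase in discrepant submissions after receiving one.
print(np.exp(result.params["received_discrepant"]))
```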
Affiliation(s)
- P Charkhchi: Department of Radiology and Radiological Science, Division of Neuroradiology, Johns Hopkins Medical Institutions, Baltimore, Maryland
- B Wang: Department of Biostatistics, Johns Hopkins Bloomberg School of Public Health, Baltimore, Maryland
- B Caffo: Department of Biostatistics, Johns Hopkins Bloomberg School of Public Health, Baltimore, Maryland
- D M Yousem: Department of Radiology and Radiological Science, Division of Neuroradiology, Johns Hopkins Medical Institutions, Baltimore, Maryland
32
Part-Time Pediatric Radiology: The Realities and Perceptions of Part-Time Employment in the Academic Setting. AJR Am J Roentgenol 2018; 211:971-977. [DOI: 10.2214/ajr.18.19922]
33
Implementation of a Peer Learning Program Replacing Score-Based Peer Review in a Multispecialty Integrated Practice. AJR Am J Roentgenol 2018; 211:949-956. [DOI: 10.2214/ajr.18.19891]
34
Simplified Readability Metric Drives Improvement of Radiology Reports: an Experiment on Ultrasound Reports at a Pediatric Hospital. J Digit Imaging 2018; 30:710-717. [PMID: 28484918 DOI: 10.1007/s10278-017-9972-7]
Abstract
Highly complex medical documents, including ultrasound reports, are greatly mismatched with patient literacy levels. Although improving the readability of radiology reports is a longstanding concern, few articles have objectively measured the effectiveness of physician training on readability. We hypothesized that writing style can be evaluated with an objective two-dimensional measure and that writing training can improve the writing styles of radiologists. To test this, we developed a simplified "grade vs. length" readability metric based on factor analysis of ten readability metrics applied to more than 500,000 radiology reports. To test the short-term effectiveness of a writing workshop, we measured writing style before and after the training and found statistically significant improvement. Although the degree of improvement varied across measures, the results indicate that targeted training can improve readability. The simplified grade vs. length metric could enable future clinical decision support systems to quantitatively guide physicians to improve their writing styles through writing workshops.
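The two-dimensional idea is simple to operationalize: score each report by a readability grade and by its length. A self-contained sketch, assuming Flesch-Kincaid grade as the grade axis (the paper derives its axis from factor analysis, so this is an approximation) and a crude vowel-group syllable counter:

```python
import re

def count_syllables(word: str) -> int:
    """Crude heuristic: count groups of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def grade_vs_length(report: str) -> tuple[float, int]:
    """Return (Flesch-Kincaid grade level, word count) for a report."""
    sentences = max(1, len(re.findall(r"[.!?]+", report)))
    words = re.findall(r"[A-Za-z']+", report)
    n_words = max(1, len(words))
    n_syllables = sum(count_syllables(w) for w in words)
    # Standard Flesch-Kincaid grade-level formula.
    grade = 0.39 * (n_words / sentences) + 11.8 * (n_syllables / n_words) - 15.59
    return round(grade, 1), len(words)

report = ("The kidneys are normal in size and echogenicity. "
          "No hydronephrosis. No focal lesion is identified.")
print(grade_vs_length(report))  # lower grade and shorter length read more easily
```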
35
Peer Review to Peer Learning in Radiology: Where Have We Been, What Have We Learned and Where Are We Headed? Curr Radiol Rep 2018. [DOI: 10.1007/s40134-018-0292-6]
36
Olthof AW, de Groot JC, Zorgdrager AN, Callenbach PMC, van Ooijen PMA. Perception of radiology reporting efficacy by neurologists in general and university hospitals. Clin Radiol 2018; 73:675.e1-675.e7. [PMID: 29622361 DOI: 10.1016/j.crad.2018.03.001]
Abstract
AIM To investigate how neurologists perceive the value of the radiology report and to analyse the relation with the neurologists' own expertise in radiology and the level of subspecialisation of the radiologists. MATERIALS AND METHODS A web-based survey was distributed to neurologists. The level of subspecialisation was assessed by the percentage of fellowship-trained radiologists and the percentage of radiologists who were members of the Dutch Society of Neuroradiology. RESULTS Most neurologists interpret all computed tomography (CT) and magnetic resonance imaging (MRI) studies themselves, and their self-confidence in making correct interpretations is high. Residents gave higher scores than neurologists for "Radiologist report answers the question" (p=0.039) and for "Radiologist reports give helpful advice" (p=0.001). Neurologists from university hospitals stated more frequently than neurologists from general hospitals that the report answered their questions (p=0.008). The general appreciation of radiology reports was also higher among neurologists from university hospitals than among those from general hospitals (8.2 versus 7.2; p=0.003). Radiologists at university hospitals have a higher level of subspecialisation than those at general hospitals. CONCLUSION Subspecialisation of radiologists leads to higher quality of radiology reporting as perceived by neurologists. Because of their expertise in radiology, neurologists are valuable sources of feedback for radiologists. Paying attention to the clinical questions and giving advice tailored to the needs of the referring physicians are opportunities to improve radiology reporting.
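The pairwise p values quoted here are the kind produced by rank-based comparisons of ordinal survey scores. A minimal sketch with SciPy using made-up scores; the abstract does not state the exact test used, so the Mann-Whitney U test is an illustrative choice:

```python
from scipy.stats import mannwhitneyu

# Hypothetical 1-10 appreciation scores for radiology reports, mirroring the
# university-vs-general-hospital comparison (values are made up).
university = [9, 8, 8, 9, 7, 8, 9, 8]
general = [7, 6, 8, 7, 7, 6, 8, 7]

# Likert/ordinal data: compare distributions with a rank-based test rather
# than a t-test, since the scores are not interval-scaled.
stat, p = mannwhitneyu(university, general, alternative="two-sided")
print(f"U = {stat}, p = {p:.3f}")
```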
Affiliation(s)
- A W Olthof: Department of Radiology, Treant Health Care Group, Dr. G.H. Amshoffweg 1, Hoogeveen, The Netherlands
- J C de Groot: Department of Radiology, University of Groningen, University Medical Center Groningen, Hanzeplein 1, Groningen, The Netherlands
- A N Zorgdrager: Department of Neurology, Treant Health Care Group, Dr. G.H. Amshoffweg 1, Hoogeveen, The Netherlands
- P M C Callenbach: Treant Health Care Group, Research Bureau, Dr. G.H. Amshoffweg 1, Hoogeveen, The Netherlands
- P M A van Ooijen: Department of Radiology, University of Groningen, University Medical Center Groningen, Hanzeplein 1, Groningen, The Netherlands; Center for Medical Imaging North East Netherlands (CMI-NEN), University of Groningen, University Medical Center Groningen, Hanzeplein 1, Groningen, The Netherlands
37
Added value of double reading in diagnostic radiology: a systematic review. Insights Imaging 2018; 9:287-301. [PMID: 29594850 PMCID: PMC5990995 DOI: 10.1007/s13244-018-0599-0]
Abstract
Objectives Double reading in diagnostic radiology can find discrepancies in the original report, but a systematic program of double reading is resource-intensive, and there are conflicting opinions on its value. The purpose of the current study was to perform a systematic review of the value of double reading. Methods A systematic review was performed to find studies calculating the rate of misses and overcalls, with the aim of establishing the added value of double reading by human observers. Results The literature search resulted in 1610 hits. After abstract and full-text reading, 46 articles were selected for analysis. The rate of discrepancy varied from 0.4 to 22% depending on study setting. Double reading by a sub-specialist, in general, led to high rates of changed reports. Conclusions The systematic review found rather low discrepancy rates. The benefit of double reading must be balanced against the considerable number of working hours a systematic double-reading scheme requires. A more cost-effective scheme might be to use systematic double reading for selected, high-risk examination types. A second conclusion is that sub-specialisation appears to increase report quality; consistent implementation of this would have far-reaching organisational effects. Key Points • In double reading, two or more radiologists read the same images. • A systematic literature review was performed. • The discrepancy rates varied from 0.4 to 22% across studies. • Double reading by sub-specialists found high discrepancy rates.
38
Abstract
OBJECTIVE The purpose of this article is to outline practical steps that a department can take to transition to a peer learning model. CONCLUSION The 2015 Institute of Medicine report on improving diagnosis emphasized that organizations and industries that embrace error as an opportunity to learn tend to outperform those that do not. To meet this charge, radiology must transition from a peer review to a peer learning approach.
39
Scali EP, Harris AC, Martin ML. Peer Review in Radiology: How Can We Learn From Our Mistakes? Can Assoc Radiol J 2017; 68:368-370. [PMID: 28818363 DOI: 10.1016/j.carj.2017.04.002]
Affiliation(s)
- Elena P Scali: Department of Radiology, University of British Columbia, Vancouver, British Columbia, Canada
- Alison C Harris: Department of Radiology, University of British Columbia, Vancouver, British Columbia, Canada
- Michael L Martin: Department of Radiology, University of British Columbia, Vancouver, British Columbia, Canada
40
Goldberg-Stein S, Frigini LA, Long S, Metwalli Z, Nguyen XV, Parker M, Abujudeh H. ACR RADPEER Committee White Paper with 2016 Updates: Revised Scoring System, New Classifications, Self-Review, and Subspecialized Reports. J Am Coll Radiol 2017; 14:1080-1086. [DOI: 10.1016/j.jacr.2017.03.023]
41
Larson DB, Donnelly LF, Podberesky DJ, Merrow AC, Sharpe RE, Kruskal JB. Peer Feedback, Learning, and Improvement: Answering the Call of the Institute of Medicine Report on Diagnostic Error. Radiology 2017; 283:231-241. [DOI: 10.1148/radiol.2016161254]
Affiliation(s)
- David B. Larson: Department of Radiology, Stanford University School of Medicine, 300 Pasteur Dr, Stanford, CA 94305-5105
- Lane F. Donnelly: Texas Children's Hospital, Houston, Tex
- Daniel J. Podberesky: Nemours Children's Health System, Orlando, Fla
- Arnold C. Merrow: Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio
- Richard E. Sharpe: Kaiser Permanente, Denver, Colo
- Jonathan B. Kruskal: Beth Israel Deaconess Medical Center, Boston, Mass
42
Chatterjee AR, Stalcup S, Sharma A, Sato TS, Gupta P, Lee YZ, Malone C, McBee M, Hotaling EL, Kansagra AP. Image Sharing in Radiology: A Primer. Acad Radiol 2017; 24:286-294. [PMID: 28193378 DOI: 10.1016/j.acra.2016.12.002]
Abstract
By virtue of its information technology-oriented infrastructure, the specialty of radiology is uniquely positioned to be at the forefront of efforts to promote data sharing across the healthcare enterprise, including particularly image sharing. The potential benefits of image sharing for clinical, research, and educational applications in radiology are immense. In this work, our group, the Association of University Radiologists (AUR) Radiology Research Alliance Task Force on Image Sharing, reviews the benefits of implementing image sharing capability, introduces current image sharing platforms and details their unique requirements, and presents emerging platforms that may see greater adoption in the future. By understanding this complex ecosystem of image sharing solutions, radiologists can become important advocates for the successful implementation of these powerful image sharing resources.
43
Radiology Research in Quality and Safety: Current Trends and Future Needs. Acad Radiol 2017; 24:263-272. [PMID: 28193376 DOI: 10.1016/j.acra.2016.07.021]
Abstract
Promoting quality and safety research is now essential for radiology as reimbursement is increasingly tied to measures of quality, patient safety, efficiency, and appropriateness of imaging. This article provides an overview of key features necessary to promote successful quality improvement efforts in radiology. Emphasis is given to current trends and future opportunities for directing research. Establishing and maintaining a culture of safety is paramount to organizations wishing to improve patient care. The correct culture must be in place to support quality initiatives and create accountability for patient care. Focused educational curricula are necessary to teach quality and safety-related skills and behaviors to trainees, staff members, and physicians. The increasingly complex healthcare landscape requires that organizations build effective data infrastructures to support quality and safety research. Incident reporting systems designed specifically for medical imaging will benefit quality improvement initiatives by identifying and learning from system errors, enhancing knowledge about safety, and creating safer systems through the implementation of standardized practices and standards. Finally, validated performance measures must be developed to accurately reflect the value of the care we provide for our patients and referring providers. Common metrics used in radiology are reviewed with focus on current and future opportunities for investigation.
44
Meaningful Peer Review in Radiology: A Review of Current Practices and Potential Future Directions. J Am Coll Radiol 2016; 13:1519-1524. [DOI: 10.1016/j.jacr.2016.08.005]
45
Heller RE. An Analysis of Quality Measures in Diagnostic Radiology with Suggestions for Future Advancement. J Am Coll Radiol 2016; 13:1182-1187. [DOI: 10.1016/j.jacr.2016.05.024]
46
Harvey HB, Alkasab TK, Prabhakar AM, Halpern EF, Rosenthal DI, Pandharipande PV, Gazelle GS. Radiologist Peer Review by Group Consensus. J Am Coll Radiol 2016; 13:656-662. [DOI: 10.1016/j.jacr.2015.11.013]
47
Kruskal JB, Eisenberg RL, Brook O, Siewert B. Transitioning from peer review to peer learning for abdominal radiologists. Abdom Radiol (NY) 2016; 41:416-428. [PMID: 26940330 DOI: 10.1007/s00261-016-0675-1]
48
O'Keeffe MM, Davis TM, Siminoski K. Performance results for a workstation-integrated radiology peer review quality assurance program. Int J Qual Health Care 2016; 28:294-298. [PMID: 26892609 DOI: 10.1093/intqhc/mzw017]
Abstract
OBJECTIVE To assess review completion rates, RADPEER score distribution, and sources of disagreement when using a workstation-integrated radiology peer review program, and to evaluate radiologist perceptions of the program. DESIGN Retrospective review of prospectively collected data. SETTING Large private outpatient radiology practice. PARTICIPANTS Radiologists (n = 66) with a mean of 16.0 (standard deviation, 9.2) years of experience. INTERVENTIONS Prior studies and reports of cases being actively reported were randomly selected for peer review using the RADPEER scoring system (a 4-point scale, with a score of 1 indicating agreement and scores of 2-4 indicating increasing levels of disagreement). MAIN OUTCOME MEASURES Assigned peer review completion rates, review scores, sources of disagreement and radiologist survey responses. RESULTS Of 31,293 assigned cases, 29,044 (92.8%; 95% CI 92.5-93.1%) were reviewed. Discrepant scores (score = 2, 3 or 4) were given in 0.69% (95% CI 0.60-0.79%) of cases and clinically significant discrepancy (score = 3 or 4) was assigned in 0.42% (95% CI 0.35-0.50%). The most common cause of disagreement was missed diagnosis (75.2%; 95% CI 66.8-82.1%). By anonymous survey, 94% of radiologists felt that peer review was worthwhile, 90% reported that the scores they received were appropriate and 78% felt that the received feedback was valuable. CONCLUSION Workstation-based peer review can increase completion rates and levels of radiologist acceptance while producing RADPEER scores similar to those previously reported. This approach may be one way to increase radiologist engagement in peer review quality assurance.
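Rates with 95% CIs such as 0.69% (95% CI 0.60-0.79%) are binomial proportion intervals. A sketch of that style of calculation with statsmodels; the discrepant count is back-calculated from the reported rate, and the interval method is an assumption, since the abstract does not name one:

```python
from statsmodels.stats.proportion import proportion_confint

reviewed = 29044   # completed reviews, per the abstract
discrepant = 200   # approximate: back-calculated from the reported 0.69%

rate = discrepant / reviewed
# Wilson score interval; an illustrative choice, not the authors' stated method.
low, high = proportion_confint(discrepant, reviewed, alpha=0.05, method="wilson")
print(f"{rate:.2%} (95% CI {low:.2%}-{high:.2%})")
```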
Affiliation(s)
- Margaret M O'Keeffe: Department of Radiology and Diagnostic Imaging, University of Alberta, Edmonton, Alberta, Canada; Medical Imaging Consultants, 11010-101 Street, Edmonton, Alberta, Canada T5H 4B9
- Todd M Davis: Intelerad, Montreal, Quebec, Canada; present address: 295 Midpark Way SE, Suite 380, Calgary, Alberta, Canada T2X 2A8
- Kerry Siminoski: Department of Radiology and Diagnostic Imaging, University of Alberta, Edmonton, Alberta, Canada; Medical Imaging Consultants, 11010-101 Street, Edmonton, Alberta, Canada T5H 4B9
49
Big Data and the Future of Radiology Informatics. Acad Radiol 2016; 23:30-42. [PMID: 26683510 DOI: 10.1016/j.acra.2015.10.004]
Abstract
Rapid growth in the amount of data that is electronically recorded as part of routine clinical operations has generated great interest in the use of Big Data methodologies to address clinical and research questions. These methods can efficiently analyze and deliver insights from high-volume, high-variety, and high-growth rate datasets generated across the continuum of care, thereby forgoing the time, cost, and effort of more focused and controlled hypothesis-driven research. By virtue of an existing robust information technology infrastructure and years of archived digital data, radiology departments are particularly well positioned to take advantage of emerging Big Data techniques. In this review, we describe four areas in which Big Data is poised to have an immediate impact on radiology practice, research, and operations. In addition, we provide an overview of the Big Data adoption cycle and describe how academic radiology departments can promote Big Data development.
50
Abstract
Peer review in radiology means an assessment of the accuracy of a report issued by another radiologist. Inevitably, this involves a judgement opinion from the reviewing radiologist. Peer feedback is the means by which any form of peer review is communicated back to the original author of the report. This article defines terms, discusses the current status, identifies problems, and provides some recommendations on the way forward, concentrating on the software requirements for efficient peer review and peer feedback of reported imaging studies. Radiologists undertake routine peer review in their everyday clinical practice, particularly when reporting and preparing for multidisciplinary team meetings. More formal peer review of reported imaging studies has been advocated as a quality assurance measure to promote good clinical practice. It is also a way of assessing the competency of reporting radiologists referred for investigation to bodies such as the General Medical Council (GMC). The literature shows, firstly, that reported discrepancy rates vary widely across studies, which have used a variety of non-comparable methodologies; and secondly, that applying scoring systems in formal peer review is often meaningless or unhelpful and can even be detrimental. There is currently a lack of electronic peer feedback software on the market that informs radiologists when their work has been reviewed or that provides them with clinical outcome information on cases they have previously reported. Learning opportunities are therefore missed. Radiologists should actively engage with the medical informatics industry to design optimal peer review and feedback software with features that meet their needs. Such a system should be easy to use, be fully integrated with the radiological information and picture archiving systems used clinically, and contain a free-text comment box, without a numerical scoring system. It should form a temporary record that cannot be permanently archived. It must provide automated feedback to the original author. Peer feedback, as part of everyday reporting, should enhance daily learning for radiologists. Software requirements for everyday peer feedback differ from those needed for a formal peer review process, which might only be necessary in the setting of a formal GMC enquiry into a particular radiologist's reporting competence, for example.
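The requirements listed here map naturally onto a small data model. A hypothetical sketch of such a feedback record in Python; every name and field is invented to illustrate the stated features (free-text comment only, no numerical score, automated author notification, a temporary non-archivable record):

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Assumed retention window; the article only requires that records be temporary.
RETENTION = timedelta(days=90)

@dataclass
class PeerFeedback:
    """One peer feedback event: free text only, deliberately no numerical score."""
    study_id: str
    original_author: str
    reviewer: str
    comment: str  # free-text comment box, per the article's recommendation
    created: datetime = field(default_factory=datetime.utcnow)

    @property
    def expired(self) -> bool:
        # Temporary record: discard after the retention window rather than
        # archiving it permanently.
        return datetime.utcnow() - self.created > RETENTION

def notify_author(fb: PeerFeedback) -> None:
    # Stand-in for the automated feedback hook; a real system would push this
    # into the RIS/PACS worklist or e-mail the original author.
    print(f"To {fb.original_author}: your report on {fb.study_id} was reviewed.")

fb = PeerFeedback("ACC-12345", "dr_smith", "dr_jones",
                  "Subtle C2 fracture, see follow-up CT.")
notify_author(fb)
```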
Affiliation(s)
- N H Strickland: Imaging Department, Hammersmith Hospital, Imperial College Healthcare NHS Trust, Du Cane Road, London W12 0HS, UK