1. McKnight L, Schultz A, Vidic N, Palmer EE, Jaffe A. Learning to make a difference for chILD: Value creation through network collaboration and team science. Pediatr Pulmonol 2024; 59:2257-2266. PMID: 36855907; DOI: 10.1002/ppul.26377.
Abstract
Addressing the recognized challenges and inequalities in providing high quality healthcare for rare diseases such as children's interstitial lung disease (chILD) requires collaboration across institutional, geographical, discipline, and system boundaries. The Children's Interstitial Lung Disease Respiratory Network of Australia and New Zealand (chILDRANZ) is an example of a clinical network that brings together multidisciplinary health professionals for collaboration, peer learning, and advocacy with the goal of improving the diagnosis and management of this group of rare and ultra-rare conditions. This narrative review explores the multifaceted benefits arising from social learning spaces within rare disease clinical networks by applying the value creation framework. The operation of the chILDRANZ network is used as an example across the framework to highlight how value is generated, realized, and transferred within such collaborative clinical and research networks. The community of practice formed in the chILDRANZ multidisciplinary meetings provides a strong example of social learning that engages with the uncertainty inherent in rare disease diagnosis and management and works to generate new knowledge and best practice to make a difference for children and families living with chILD. This review underscores international calls for further investment in, and support of, collaborative clinical networks and virtual centers of excellence for rare disease.
Affiliation(s)
- Lauren McKnight: Discipline of Paediatrics and Child Health, School of Clinical Medicine, UNSW Sydney, Kensington, New South Wales, Australia
- André Schultz: Telethon Kids Institute, University of Western Australia, Perth, Western Australia, Australia; Department of Respiratory and Sleep Medicine, Perth Children's Hospital, Perth, Western Australia, Australia
- Nada Vidic: Discipline of Paediatrics and Child Health, School of Clinical Medicine, UNSW Sydney, Kensington, New South Wales, Australia
- Elizabeth E Palmer: Discipline of Paediatrics and Child Health, School of Clinical Medicine, UNSW Sydney, Kensington, New South Wales, Australia; Centre for Clinical Genetics, Sydney Children's Hospital, Randwick, New South Wales, Australia
- Adam Jaffe: Discipline of Paediatrics and Child Health, School of Clinical Medicine, UNSW Sydney, Kensington, New South Wales, Australia; Respiratory Department, Sydney Children's Hospital, Randwick, New South Wales, Australia
2. Larson DB. Invited Commentary: Understanding and Addressing Cognitive Biases in Radiology. Radiographics 2024; 44:e230244. PMID: 38843096; DOI: 10.1148/rg.230244.
Affiliation(s)
- David B Larson: Department of Radiology, Stanford University Medical Center, 453 Quarry Rd, MC 5659, Stanford, CA 94305
3. Coelho FMA, Baroni RH. Strategies for improving image quality in prostate MRI. Abdom Radiol (NY) 2024. PMID: 38940911; DOI: 10.1007/s00261-024-04396-4.
Abstract
Prostate magnetic resonance imaging (MRI) stands as the cornerstone in diagnosing prostate cancer (PCa), offering superior detection capabilities while minimizing unnecessary biopsies. Despite its critical role, global disparities in MRI diagnostic performance persist, stemming from variations in image quality and radiologist expertise. This manuscript reviews the challenges and strategies for enhancing image quality in prostate MRI, spanning patient preparation, MRI unit optimization, and radiology team engagement. Quality assurance (QA) and quality control (QC) processes are pivotal, emphasizing standardized protocols, meticulous patient evaluation, MRI unit workflow, and radiology team performance. Additionally, artificial intelligence (AI) advancements offer promising avenues for improving image quality and reducing acquisition times. The Prostate-Imaging Quality (PI-QUAL) scoring system emerges as a valuable tool for assessing MRI image quality. A comprehensive approach addressing technical, procedural, and interpretative aspects is essential to ensure consistent and reliable prostate MRI outcomes.
Affiliation(s)
- Ronaldo Hueb Baroni: Department of Radiology, Hospital Israelita Albert Einstein, 627 Albert Einstein Ave., Sao Paulo, SP, 05652-900, Brazil
4. Butler JM, Taft T, Taber P, Rutter E, Fix M, Baker A, Weir C, Nevers M, Classen D, Cosby K, Jones M, Chapman A, Jones BE. Pneumonia diagnosis performance in the emergency department: a mixed-methods study about clinicians' experiences and exploration of individual differences and response to diagnostic performance feedback. J Am Med Inform Assoc 2024; 31:1503-1513. PMID: 38796835; PMCID: PMC11187426; DOI: 10.1093/jamia/ocae112. Open access.
Abstract
OBJECTIVES We sought to (1) characterize the process of diagnosing pneumonia in an emergency department (ED) and (2) examine clinician reactions to a clinician-facing diagnostic discordance feedback tool. MATERIALS AND METHODS We designed a diagnostic feedback tool, using electronic health record data from ED clinicians' patients to establish concordance or discordance between ED diagnosis, radiology reports, and hospital discharge diagnosis for pneumonia. We conducted semistructured interviews with 11 ED clinicians about pneumonia diagnosis and reactions to the feedback tool. We administered surveys measuring individual differences in mindset beliefs, comfort with feedback, and feedback tool usability. We qualitatively analyzed interview transcripts and descriptively analyzed survey data. RESULTS Thematic results revealed: (1) the diagnostic process for pneumonia in the ED is characterized by diagnostic uncertainty and may be secondary to goals to treat and dispose the patient; (2) clinician diagnostic self-evaluation is a fragmented, inconsistent process of case review and follow-up that a feedback tool could fill; (3) the feedback tool was described favorably, with task and normative feedback harnessing clinician values of high-quality patient care and personal excellence; and (4) strong reactions to diagnostic feedback varied from implicit trust to profound skepticism about the validity of the concordance metric. Survey results suggested a relationship between clinicians' individual differences in learning and failure beliefs, feedback experience, and usability ratings. DISCUSSION AND CONCLUSION Clinicians value feedback on pneumonia diagnoses. Our results highlight the importance of feedback about diagnostic performance and suggest directions for considering individual differences in feedback tool design and implementation.
Affiliation(s)
- Jorie M Butler: Department of Biomedical Informatics, University of Utah Spencer Fox Eccles School of Medicine, Salt Lake City, UT 84108, United States; Department of Internal Medicine, Division of Geriatrics, University of Utah Spencer Fox Eccles School of Medicine, Salt Lake City, UT 84132, United States; Salt Lake City VA Informatics Decision-Enhancement and Analytic Sciences (IDEAS) Center for Innovation, Salt Lake City, UT 84148, United States; Geriatrics Research, Education, and Clinical Center (GRECC), VA Salt Lake City Health Care System, Salt Lake City, UT 84148, United States
- Teresa Taft: Department of Biomedical Informatics, University of Utah Spencer Fox Eccles School of Medicine, Salt Lake City, UT 84108, United States
- Peter Taber: Department of Biomedical Informatics, University of Utah Spencer Fox Eccles School of Medicine, Salt Lake City, UT 84108, United States; Salt Lake City VA Informatics Decision-Enhancement and Analytic Sciences (IDEAS) Center for Innovation, Salt Lake City, UT 84148, United States
- Elizabeth Rutter: Department of Emergency Medicine, University of Utah Spencer Fox Eccles School of Medicine, Salt Lake City, UT 84108, United States
- Megan Fix: Department of Emergency Medicine, University of Utah Spencer Fox Eccles School of Medicine, Salt Lake City, UT 84108, United States
- Alden Baker: Department of Family and Preventive Medicine, Division of Physician Assistant Studies, University of Utah Spencer Fox Eccles School of Medicine, Salt Lake City, UT 84108, United States
- Charlene Weir: Department of Biomedical Informatics, University of Utah Spencer Fox Eccles School of Medicine, Salt Lake City, UT 84108, United States
- McKenna Nevers: Department of Internal Medicine, Division of Epidemiology, University of Utah Spencer Fox Eccles School of Medicine, Salt Lake City, UT 84108, United States
- David Classen: Department of Internal Medicine, Division of Epidemiology, University of Utah Spencer Fox Eccles School of Medicine, Salt Lake City, UT 84108, United States
- Karen Cosby: Department of Emergency Medicine, Cook County Hospital, Rush Medical College, Chicago, IL 60612, United States
- Makoto Jones: Salt Lake City VA Informatics Decision-Enhancement and Analytic Sciences (IDEAS) Center for Innovation, Salt Lake City, UT 84148, United States; Department of Internal Medicine, Division of Epidemiology, University of Utah Spencer Fox Eccles School of Medicine, Salt Lake City, UT 84108, United States
- Alec Chapman: Department of Population Health Sciences, University of Utah Spencer Fox Eccles School of Medicine, Salt Lake City, UT 84108, United States
- Barbara E Jones: Salt Lake City VA Informatics Decision-Enhancement and Analytic Sciences (IDEAS) Center for Innovation, Salt Lake City, UT 84148, United States; Department of Internal Medicine, Division of Pulmonology, University of Utah Spencer Fox Eccles School of Medicine, Salt Lake City, UT 84108, United States
5. Photopoulos GS, Wilson DS, Clarke SE, Costa AF. Reinterpretation of Hepatopancreaticobiliary Imaging Exams: Assessment of Clinical Impact, Peer Learning, and Physician Satisfaction. Acad Radiol 2024; 31:1870-1877. PMID: 38052671; DOI: 10.1016/j.acra.2023.10.047.
Abstract
OBJECTIVES To assess the impact on clinical management, potential for peer learning, and referring physician satisfaction with subspecialist reinterpretations of hepatopancreaticobiliary (HPB) imaging examinations. MATERIALS AND METHODS HPB CTs and MRIs from outside hospitals were reinterpreted by two subspecialty radiologists between March 2021 and August 2022. Reinterpretation reports were mailed to radiologists that issued primary reports. The electronic record was reviewed to assess for changes in clinical management based on the reinterpretations (yes/no/unavailable). To assess the potential for peer learning, a survey using a 5-point Likert scale was sent to radiologists who issued primary reports. A separate survey was sent to referring physicians to assess satisfaction with reinterpretations. RESULTS Two hundred fifty imaging examinations (122 CT, 128 MRI) were reinterpreted at the request of 19 referring physicians. Ninety-six radiologists issued primary reports. RADPEER scores 1-3 were assigned to 131/250 (52%), 86/250 (34%), and 33/250 (13%) examinations, respectively. Of 213 reinterpretations with adequate records for assessment, 75/213 (35%) were associated with a change in management; of these, 71/75 (95%) were classified as RADPEER 2 or 3. Most radiologists agreed or strongly agreed with the following: prefer to receive reinterpretations (34/36, 94%); reinterpretations changed practice of reporting HPB imaging examinations (23/36, 64%); and reinterpretations offer opportunities for peer learning (34/36, 94%). Referring physicians agreed or strongly agreed (7/7, 100%) that reinterpretations are valuable and often change or clarify management of patients with complex HPB disease, and offer an opportunity for peer learning. CONCLUSION Radiologists and referring physicians strongly agree that HPB imaging reinterpretations help support peer learning and patient management, respectively.
Affiliation(s)
- Gregory S Photopoulos: Faculty of Medicine, Dalhousie University, 5849 University Avenue, Halifax, NS B3H 4R2, Canada
- Darcie S Wilson: Faculty of Medicine, Dalhousie University, 5849 University Avenue, Halifax, NS B3H 4R2, Canada
- Sharon E Clarke: Faculty of Medicine, Dalhousie University, 5849 University Avenue, Halifax, NS B3H 4R2, Canada; Department of Diagnostic Radiology, Queen Elizabeth II Health Sciences Centre, Victoria General Building, 3rd floor, 1276 South Park Street, Halifax, NS B3H 2Y9, Canada
- Andreu F Costa: Faculty of Medicine, Dalhousie University, 5849 University Avenue, Halifax, NS B3H 4R2, Canada; Department of Diagnostic Radiology, Queen Elizabeth II Health Sciences Centre, Victoria General Building, 3rd floor, 1276 South Park Street, Halifax, NS B3H 2Y9, Canada
6. Donnelly LF, Guimaraes CV. Event-Based Learning and Improvement: Radiology's Move From Peer Review to Peer Learning. Semin Ultrasound CT MR 2024; 45:161-169. PMID: 38373672; DOI: 10.1053/j.sult.2024.02.005.
Abstract
Over the past 15 years, the radiology community has made great progress moving from a system of score-based peer review to one of peer learning. Much has been learned along the way. In peer learning, cases in which learning opportunities are identified are reviewed solely for the purpose of fostering learning and improvement. This article defines peer learning and peer review and emphasizes the difference; looks back at the 20-year history of score-based peer review and transition to peer learning; outlines the problems with score-based peer review and the key elements of peer learning; discusses the current state of peer learning; and outlines future challenges and opportunities.
Affiliation(s)
- Lane F Donnelly: Department of Radiology, University of North Carolina School of Medicine, Chapel Hill, NC; Department of Pediatrics, University of North Carolina School of Medicine, Chapel Hill, NC
- Carolina V Guimaraes: Department of Radiology, University of North Carolina School of Medicine, Chapel Hill, NC
7. Czerminski J, Pahade JK, Davis MA, Mezrich JL. The disproportionate impact of peer learning on emergency radiology. Emerg Radiol 2024; 31:133-139. PMID: 38261134; DOI: 10.1007/s10140-024-02207-3.
Abstract
PURPOSE The use of peer learning methods in radiology continues to grow as a means to constructively learn from past mistakes. This study examined whether emergency radiologists receive a disproportionate amount of peer learning feedback entered as potential learning opportunities (PLO), which could play a significant role in stress and career satisfaction. Our institution offers 24/7 attending coverage, with emergency radiologists interpreting a wide range of X-ray, ultrasound and CT exams on both adults and pediatric patients. MATERIALS AND METHODS Peer learning submissions entered as PLO at a single large academic medical center over a span of 3 years were assessed by subspecialty distribution and correlated with the number of attending radiologists in each section. Total number of studies performed on emergency department patients and throughout the hospital system were obtained for comparison purposes. Data was assessed using analysis of variance and post hoc analysis. RESULTS Emergency radiologists received significantly more (2.5 times) PLO submissions than the next closest subspeciality division and received more yearly PLO submissions per attending compared to other subspeciality divisions. This was found to still be true when normalizing for increased case volumes; Emergency radiologists received more PLO submissions per 1000 studies compared to other divisions in our department (1.59 vs. 0.85, p = 0.04). CONCLUSION Emergency radiologists were found to receive significantly more PLO submissions than their non-emergency colleagues. Presumed causes for this discrepancy may include a higher error rate secondary to wider range of studies interpreted, demand for shorter turn-around times, higher volumes of exams read per shift, and hindsight bias in the setting of follow-up review.
Affiliation(s)
- Jan Czerminski: Department of Radiology and Biomedical Imaging, Yale School of Medicine, 333 Cedar Street, TE2, New Haven, CT 06520, USA
- Jay K Pahade: Department of Radiology and Biomedical Imaging, Yale School of Medicine, 333 Cedar Street, TE2, New Haven, CT 06520, USA
- Melissa A Davis: Department of Radiology and Biomedical Imaging, Yale School of Medicine, 333 Cedar Street, TE2, New Haven, CT 06520, USA
- Jonathan L Mezrich: Department of Radiology and Biomedical Imaging, Yale School of Medicine, 333 Cedar Street, TE2, New Haven, CT 06520, USA
8. Kwak DH, Yang L, Hu-Wang E, Seetharam S, Nijhawan K, Chung JH, Patel P. Peer learning is both preferable and less expensive than score-based peer review: Initial experience at a tertiary academic center. Clin Imaging 2024; 106:110065. PMID: 38113549; DOI: 10.1016/j.clinimag.2023.110065.
Abstract
PURPOSE To examine radiologist experiences and perceptions during a transition from score-based peer review to a peer learning program, and to assess differences in time-cost efficiency between the two models of quality improvement. METHODS Differences in Likert scale survey responses from radiologists (N = 27) in a multispecialty group at a single tertiary academic center before and following intervention were evaluated by Mann-Whitney U test. Multiple variable linear regression analysis assessed independent variables and program preference. RESULTS All positive impacts were rated significantly higher for the peer learning program. Workflow disruption for the peer learning program was rated significantly lower. 70.4 % (19 of 27) preferred the new program, and 25.9 % (7 of 27) preferred the old program. Only the "worth investment" questionnaire score demonstrated a significant correlation to program preference and with an effect that was greatest among all variables (Beta = 1.11, p = 0.02). There was a significantly decreased amount of time per month used to complete peer learning exercises (0.76 ± 0.45 h, N = 27) versus peer review exercises (1.71 ± 1.84 h, N = 34, p = 0.011). The result was a difference of 0.95 ± 1.89 h/month (11.4 ± 22.7 h/year), translating to an estimated direct salary time-cost saving of $1653.68/year/radiologist and a direct productivity time-cost saving of $3469.39/year/radiologist when utilizing the peer learning program. CONCLUSIONS There was a strongly positive perception of the new peer learning program. There was a substantial implied direct time-cost saving from the transition to the peer learning program. PRECIS The peer learning model emphasizes learning from errors via feedback in a non-punitive environment. This model was positively perceived and demonstrated substantial implied direct time-cost saving.
Affiliation(s)
- Daniel H Kwak: Department of Radiology, The University of Chicago Medical Center, 5841 S. Maryland Ave, Chicago, IL 60637, United States of America
- Lindsay Yang: Department of Radiology, The University of Chicago Medical Center, 5841 S. Maryland Ave, Chicago, IL 60637, United States of America
- Eileen Hu-Wang: Department of Radiology, The University of Chicago Medical Center, 5841 S. Maryland Ave, Chicago, IL 60637, United States of America
- Sachin Seetharam: Department of Radiology, The University of Chicago Medical Center, 5841 S. Maryland Ave, Chicago, IL 60637, United States of America
- Karan Nijhawan: Department of Radiology, The University of Chicago Medical Center, 5841 S. Maryland Ave, Chicago, IL 60637, United States of America
- Jonathan H Chung: Department of Radiology, The University of Chicago Medical Center, 5841 S. Maryland Ave, Chicago, IL 60637, United States of America
- Pritesh Patel: Department of Radiology, Medical College of Wisconsin, 9200 W. Wisconsin Ave, Milwaukee, WI 53226, United States of America
9. Panagides JC, Hancel K, Kalva S, Schenker M, Saini S, Glazer DI, Khorasani R, Daye D. Interventional Radiology Peer Learning Platform and Adverse Event Reporting (IR-PEER): Initial Experience Implementing a Team-based Novel Peer Learning System in Interventional Radiology. J Am Coll Radiol 2024; 21:93-102. PMID: 37659453; DOI: 10.1016/j.jacr.2023.07.022.
Abstract
Although the transition from peer review to peer learning has had favorable outcomes in diagnostic radiology, experience with implementing a team-based peer review system in interventional radiology (IR) remains limited. Peer learning systems benefit diverse IR teams composed of multiple clinical roles and could contribute value in archiving events that have potential educational value. With multiple stakeholder input from clinical roles within the IR division at our institution (ie, radiologic technologists, nurses, advanced practice providers, residents, fellows, and attending physicians), we launched a HIPAA-compliant secure IR complication and learning opportunity reporting platform in April 2022. Case submissions were monitored over the subsequent 24 weeks, with monthly dashboard reports provided to departmental leadership. Preintervention and postintervention surveys were used to assess the impact of the peer learning platform and adverse event reporting in IR (IR-PEER) on perceptions of complication reporting in the IR division across clinical roles. Ninety-two peer learning submissions were collected for a weekly average ± standard error of 3.8 ± 0.6 submissions per week, and an additional 26 submissions were collected as part of the division's ongoing monthly complication review conference, for a total of 98 unique case references. A total of 64.1% of submissions (59 of 92) involved a complication and/or adverse event, and 35.9% of submissions (33 of 92) identified a learning opportunity (no complication or adverse event). Nurses reported that IR-PEER made the complication-reporting process easier (P = .01), and all clinical roles reported that IR-PEER improved the overall process of complication reporting. Peer learning frameworks such as IR-PEER provide a more equitable communication platform for multidisciplinary teams to capture and archive learning opportunities that support quality and safety improvement efforts.
Affiliation(s)
- Kayesha Hancel: Department of Interventional Radiology, Massachusetts General Hospital, Boston, Massachusetts
- Sanjeeva Kalva: Department of Interventional Radiology, Massachusetts General Hospital, Boston, Massachusetts
- Matthew Schenker: Department of Radiology, Brigham and Women's Hospital, Boston, Massachusetts
- Sanjay Saini: Department of Radiology, Brigham and Women's Hospital, Boston, Massachusetts
- Daniel I Glazer: Department of Radiology, Brigham and Women's Hospital, Boston, Massachusetts
- Ramin Khorasani: Department of Radiology, Brigham and Women's Hospital, Boston, Massachusetts
- Dania Daye: Department of Interventional Radiology, Massachusetts General Hospital, Boston, Massachusetts
10. Donnelly LF, Podberesky DJ, Towbin AJ, Loh L, Basta KH, Platchek TS, Vossmeyer MT, Shook JE. The Joint Commission's Ongoing Professional Practice Evaluation Process: Costly, Ineffective, and Potentially Harmful to Safety Culture. J Am Coll Radiol 2024; 21:61-69. PMID: 37683817; DOI: 10.1016/j.jacr.2023.08.031.
Abstract
OBJECTIVE To evaluate the estimated labor costs and effectiveness of Ongoing Professional Practice Evaluation (OPPE) processes at identifying outlier performers in a large sample of providers across multiple health care systems and to extrapolate costs and effectiveness nationally. METHODS Six hospital systems partnered to evaluate their labor expenses related to conducting OPPE. Estimates for mean labor hours and wages were created for the following: data analysts, medical staff office professionals, department physician leaders, and administrative assistants. The total number of outlier performers who were identified by OPPE metrics alone and that resulted in lack of renewal, limitation, or revoking of hospital privileges during the past annual OPPE cycle (2022) was recorded. National costs of OPPE were extrapolated. Literature review of the effect of OPPE on safety culture in radiology was performed. RESULTS The evaluated systems had 12,854 privileged providers evaluated by OPPE. The total estimated annual recurring labor cost per provider was $50.20. Zero of 12,854 providers evaluated were identified as outlier performers solely through the OPPE process. The total estimated annual recurring cost of administering OPPE nationally was $78.54 million. In radiology over the past 15 years, the use of error rates based on score-based peer review as an OPPE metric has been perceived as punitive and had an adverse effect on safety culture. CONCLUSION OPPE is expensive to administer, inefficient at identifying outlier performers, diverts human resources away from potentially more effective improvement work, and has been associated with an adverse impact on safety culture in radiology.
Affiliation(s)
- Lane F Donnelly: Professor of Radiology and Pediatrics, Departments of Radiology and Pediatrics, University of North Carolina School of Medicine, Chapel Hill, North Carolina; Executive Medical Director, Pediatric Population Health and Quality, UNC Health; Director of Quality, UNC Children's Hospital; member, ACR Peer Learning Committee
- Daniel J Podberesky: Vice President and Chief Medical Officer, Nemours Children's Health, Orlando, Florida; Professor of Radiology, University of Central Florida College of Medicine, Orlando, Florida
- Alexander J Towbin: Associate Chief, Associate Chief Medical Information Officer, and Neil D. Johnson Chair of Radiology Informatics, Department of Radiology, Cincinnati Children's Hospital, Cincinnati, Ohio; Professor of Radiology, Department of Radiology, University of Cincinnati College of Medicine, Cincinnati, Ohio; ACR roles: Informatics Commission, Councilor-at-Large (2023), Data Science Institute Non-Interpretive Panel Cochair, LI-RADS Steering Committee-Pediatric LI-RADS, Relevance and Impact Workgroup, Pediatric Measures Committee, ACR Annual Meeting Abstract Reviewers, Pediatric AI Workgroup
- Ling Loh: Director, Analytics and Clinical Effectiveness, Center for Pediatric and Maternal Value, Stanford Medicine Children's Health, Palo Alto, California
- Kathryne H Basta: Assistant Director, Quality and Patient Safety, Department of Quality and Safety, Texas Children's Hospital, Houston, Texas
- Terry S Platchek: Vice President for Performance Improvement and Associate Chief Quality Officer, Center for Pediatric and Maternal Value, Stanford Medicine Children's Health, Palo Alto, California; Professor, Pediatrics and Internal Medicine, and Fellowship Director, Clinical Excellence Research Center, Department of Pediatrics, Stanford University School of Medicine, Palo Alto, California
- Michael T Vossmeyer: Department of Pediatrics, Cincinnati Children's Hospital, Cincinnati, Ohio; Associate Professor, Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, Ohio; Chair, Utilization Review Committee; Chair, Focused Professional Practice Evaluation/OPPE Committee; member, Credentials Committee; member, Medical Executive Committee, Cincinnati Children's Hospital
- Joan E Shook: Center for Pediatric and Maternal Value, Stanford Medicine Children's Health, Palo Alto, California; Professor of Pediatrics-Emergency Medicine, Department of Pediatrics, Baylor College of Medicine, Houston, Texas; Chief Safety Officer and Deputy Chief Quality Officer, Texas Children's Hospital
11. Goldberg-Stein S, Bhargavan-Chatfield M, Donnelly LF, Hernandez D, Kunst MM, Sharpe RE, Broder J. Applying Implementation Science Principles to Design the ACR Peer Learning Pathway: A Case Study. J Am Coll Radiol 2024; 21:103-106. PMID: 37944877; DOI: 10.1016/j.jacr.2023.11.002.
Affiliation(s)
- Shlomit Goldberg-Stein
- Associate Professor at Northwell Health of Hofstra Medical School, New York, New York; Chair, ACR Quality and Safety & Informatics Annual Conference Committee; Co-Chair of New York State Radiological Society Quality and Safety Committee.
| | | | - Lane F Donnelly
- Executive Medical Director of Pediatric Population Health and Quality, and Director of Children's Quality, University of North Carolina, Chapel Hill, North Carolina
| | - Dina Hernandez
- Senior Director for Accreditation, American College of Radiology, Reston, Virginia
- Mara M Kunst
- Neuroradiology Section Head, Beth Israel Lahey Health, Burlington, Massachusetts
- Richard E Sharpe
- Division Chair of Breast Imaging, Mayo Clinic, Phoenix, Arizona. https://twitter.com/RichSharpeJr
- Jennifer Broder
- Vice Chair, Radiology Quality and Safety, Lahey Hospital and Medical Center, Burlington, Massachusetts; Vice Chair, ACR Commission on Quality and Safety; and Chair, ACR Peer Learning Committee. https://twitter.com/jcbroderMD
12
Mani K, Shah K, Kadom N, Seidenwurm D, Nemeth AJ. Peer Learning in Neuroradiology: Not as Easy as It Sounds. AJNR Am J Neuroradiol 2023; 44:1109-1115. [PMID: 37793783] [PMCID: PMC10549937] [DOI: 10.3174/ajnr.a7973]
Affiliation(s)
- K Mani
- University Radiology Group; Rutgers University School of Medicine, Newark, New Jersey
- K Shah
- MD Anderson Cancer Center, Houston, Texas
- N Kadom
- Emory University School of Medicine; Children's Healthcare of Atlanta, Atlanta, Georgia
- A J Nemeth
- Northwestern University, Feinberg School of Medicine; Northwestern Memorial Hospital, Chicago, Illinois
13
Parrott EH, Saeedipour S, Walker CM, Best SR, Harn NR, Ash RM. Transition from Peer Review to Peer Learning: Lessons Learned. Curr Probl Diagn Radiol 2023; 52:223-229. [PMID: 37069021] [DOI: 10.1067/j.cpradiol.2023.03.003]
Abstract
Landmark publications, such as To Err is Human, confronted the healthcare community with the egregious toll that medical errors take on both patient safety and overall healthcare costs. This heralded a paradigm shift and a call for action by professional organizations to enact methods to ensure physician competency and quality assurance. The American College of Radiology similarly convened a task force to discuss these concerns and how best to address quality assurance in radiology practice, leading to the development of RADPEER, a score-based peer review system. However, critics were quick to point out the deficiencies of this model, highlighting it as punitive and a poor evaluator of physician performance. The recognized deficiencies in score-based peer review prompted the pursuit of an alternate model that would instead emphasize learning and improvement. Peer learning was proposed and highlighted the necessity of an inclusive and collaborative environment where colleagues could discuss case errors as learning opportunities without fear of punitive consequence. This paper explores peer learning, its benefits and challenges, as well as how to identify specific learning opportunities by utilizing case examples.
14
DiPiro PJ, Licaros A, Zhao AH, Glazer DI, Healey MJ, Curley PJ, Giess CS, Khorasani R. Frequency and Clinical Utility of Alerts for Intra-Institutional Radiologist Discrepant Opinions. J Am Coll Radiol 2023; 20:431-437. [PMID: 36841320] [DOI: 10.1016/j.jacr.2022.12.021]
Abstract
OBJECTIVE Determine the rate of documented notification, via an alert, for intra-institutional discrepant radiologist opinions and addended reports and resulting clinical management changes. METHODS This institutional review board-exempt, retrospective study was performed at a large academic medical center. We defined an intra-institutional discrepant opinion as when a consultant radiologist provides a different interpretation from that formally rendered by a colleague at our institution. We implemented a discrepant opinion policy requiring closed-loop notification of the consulting radiologist's second opinion to the original radiologist, who must acknowledge this alert within 30 days. This study included all discrepant opinion alerts created December 1, 2019, to December 31, 2021, of which two radiologists and an internal medicine physician performed consensus review. Primary outcomes were degree of discrepancy and percent of discrepant opinions leading to change in clinical management. Secondary outcome was report addendum rate compared with an existing peer learning program using Fisher's exact test. RESULTS Of 114 discrepant opinion alerts among 1,888,147 reports generated during the study period (0.006%), 58 alerts were categorized as major (50.9%), 41 as moderate (36.0%), and 15 as minor discrepancies (13.1%). Clinical management change occurred in 64 of 114 cases (56.1%). Report addendum rate for discrepant opinion alerts was 4-fold higher than for peer learning alerts at our institution (66 of 315 = 21% versus 432 of 8,273 = 5.2%; P < .0001). DISCUSSION Although discrepant intra-institutional radiologist second opinions were rare, they frequently led to changes in clinical management. Capturing these discrepancies by encouraging alert use may help optimize patient care and document what was communicated to the referring or consulting care team by consulting radiologists.
Affiliation(s)
- Pamela J DiPiro
- Radiology Quality and Safety Officer, Department of Radiology, Center for Evidence-Based Imaging, Brigham and Women's Hospital, Boston, Massachusetts; Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts; and Department of Imaging, Dana-Farber Cancer Institute, Harvard Medical School, Boston, Massachusetts.
- Andro Licaros
- Department of Radiology, Center for Evidence-Based Imaging, Brigham and Women's Hospital, Boston, Massachusetts; and Oncologic Imaging Fellow, Department of Imaging, Dana-Farber Cancer Institute, Harvard Medical School, Boston, Massachusetts
- Anna H Zhao
- Department of Radiology, Center for Evidence-Based Imaging, Brigham and Women's Hospital, Boston, Massachusetts; and Radiology Resident, Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts
- Daniel I Glazer
- Department of Radiology, Center for Evidence-Based Imaging, Brigham and Women's Hospital, Boston, Massachusetts; Medical Director of CT and Director, Cross-Sectional Interventional Radiology (CSIR), Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts; and Department of Imaging, Dana-Farber Cancer Institute, Harvard Medical School, Boston, Massachusetts
- Michael J Healey
- Department of Radiology, Center for Evidence-Based Imaging, Brigham and Women's Hospital, Boston, Massachusetts; and Associate Chief Medical Officer, Department of Medicine, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts
- Patrick J Curley
- Executive Director, Quality, Safety, Equity & Experience, Enterprise Radiology, Department of Radiology, Center for Evidence-Based Imaging, Brigham and Women's Hospital, Boston, Massachusetts
- Catherine S Giess
- Department of Radiology, Center for Evidence-Based Imaging, Brigham and Women's Hospital, Boston, Massachusetts; Deputy Chair, Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts; and Department of Imaging, Dana-Farber Cancer Institute, Harvard Medical School, Boston, Massachusetts
- Ramin Khorasani
- Vice Chair, Radiology Quality and Safety, Mass General Brigham; Vice Chair, Department of Radiology; Distinguished Chair, Medical Informatics; Director, Center for Evidence Based Imaging; Department of Radiology, Center for Evidence-Based Imaging, Brigham and Women's Hospital, Boston, Massachusetts; Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts; and Department of Imaging, Dana-Farber Cancer Institute, Harvard Medical School, Boston, Massachusetts
15
Morris DV, Sasson AL, Delman BN, Margolies LR. Investing in Peer Learning as a Qualifying Assessment Model in Breast Imaging: A Paradigm Shift from Peer Review to Peer Learning. Curr Radiol Rep 2022. [DOI: 10.1007/s40134-022-00409-6]
16
Little D, Hardwick M, Graham R, Cheesewright J, Redman S. Learning from error: a nuclear medicine events and learning meeting. Nucl Med Commun 2022; 43:855-859. [PMID: 35506287] [DOI: 10.1097/mnm.0000000000001574]
Affiliation(s)
- David Little
- Department of Nuclear Medicine, Royal United Hospitals Bath, Bath, UK
17
Giardina TD, Choi DT, Upadhyay DK, Korukonda S, Scott TM, Spitzmueller C, Schuerch C, Torretti D, Singh H. Inviting patients to identify diagnostic concerns through structured evaluation of their online visit notes. J Am Med Inform Assoc 2022; 29:1091-1100. [PMID: 35348688] [PMCID: PMC9093029] [DOI: 10.1093/jamia/ocac036]
Abstract
BACKGROUND The 21st Century Cures Act mandates patients' access to their electronic health record (EHR) notes. To our knowledge, no previous work has systematically invited patients to proactively report diagnostic concerns while documenting and tracking their diagnostic experiences through EHR-based clinician note review. OBJECTIVE To test if patients can identify concerns about their diagnosis through structured evaluation of their online visit notes. METHODS In a large integrated health system, patients aged 18-85 years actively using the patient portal and seen between October 2019 and February 2020 were invited to respond to an online questionnaire if an EHR algorithm detected any recent unexpected return visit following an initial primary care consultation ("at-risk" visit). We developed and tested an instrument (Safer Dx Patient Instrument) to help patients identify concerns related to several dimensions of the diagnostic process based on notes review and recall of recent "at-risk" visits. Additional questions assessed patients' trust in their providers and their general feelings about the visit. The primary outcome was a self-reported diagnostic concern. Multivariate logistic regression tested whether the primary outcome was predicted by instrument variables. RESULTS Of 293,566 visits, the algorithm identified 1282 eligible patients, of whom 486 responded. After applying exclusion criteria, 418 patients were included in the analysis. Fifty-one patients (12.2%) identified a diagnostic concern. Patients were more likely to report a concern if they disagreed with statements "the care plan the provider developed for me addressed all my medical concerns" (odds ratio [OR], 2.65; 95% confidence interval [CI], 1.45-4.87) and "I trust the provider that I saw during my visit" (OR, 2.10; 95% CI, 1.19-3.71) and agreed with the statement "I did not have a good feeling about my visit" (OR, 1.48; 95% CI, 1.09-2.01).
CONCLUSION Patients can identify diagnostic concerns based on a proactive online structured evaluation of visit notes. This surveillance strategy could potentially improve transparency in the diagnostic process.
Affiliation(s)
- Traber D Giardina
- Center for Innovations in Quality, Effectiveness and Safety, Michael E. DeBakey Veterans Affairs Medical Center, and Baylor College of Medicine, Houston, Texas, USA
- Debra T Choi
- Center for Innovations in Quality, Effectiveness and Safety, Michael E. DeBakey Veterans Affairs Medical Center, and Baylor College of Medicine, Houston, Texas, USA
- Taylor M Scott
- Center for Innovations in Quality, Effectiveness and Safety, Michael E. DeBakey Veterans Affairs Medical Center, and Baylor College of Medicine, Houston, Texas, USA
- Hardeep Singh
- Center for Innovations in Quality, Effectiveness and Safety, Michael E. DeBakey Veterans Affairs Medical Center, and Baylor College of Medicine, Houston, Texas, USA
18
Ludwig DR, Strnad BS, Bierhals AJ, Mellnick VM. Implementation of a peer-learning program in an academic abdominal radiology practice and comparison with a traditional peer-review system. Abdom Radiol (NY) 2022; 47:2509-2519. [PMID: 35482105] [DOI: 10.1007/s00261-022-03523-3]
Abstract
OBJECTIVE The purpose of this study was to transition from a traditional score-based peer-review system to an education-oriented peer-learning program in our academic abdominal radiology practice. MATERIAL AND METHODS This retrospective study compared our experience with a score-based peer-review model used prior to September 2020 and a peer-learning model implemented and used exclusively beginning in October of 2020. In peer review, a web-based peer-review tool randomly generated a list of cases, which were blindly reviewed in consensus. Comparison of the consensus interpretation with the original report was used to categorize each reviewed case and to calculate the rates of significant and minor discrepancies. Only cases with a discrepancy were considered to represent a learning opportunity. In peer learning, faculty prospectively identified and submitted cases for review in several categories, including case interpretations with a discrepancy from subsequent opinion or result, interpretations considered to represent a great call, and interesting or challenging cases meriting further discussion. The peer-learning coordinator showed each case to the group in a manner which blinded the group to both submitting and interpreting radiologist and invited discussion during various stages of the case. RESULTS During peer review, a total of 172 cases were reviewed over 16 sessions occurring between April 2016 and September 2020. Only 3 cases (1.8%) yielded significant discrepancies whereas 13 (7.6%) yielded minor discrepancies, representing a total of 16 learning opportunities (3.6 per year). In peer learning, 64 cases were submitted and 52 reviewed over 7 sessions occurring between October 2020 and October 2021. 29 (56%) were submitted as an interesting or challenging case meriting further discussion, 18 (35%) were submitted for a discrepancy, and 5 (10%) were submitted for a great call. All 52 presented cases represented learning opportunities (48 per year). 
CONCLUSION An education-focused peer-learning program provided a platform for continuous quality improvement and yielded substantially more learning opportunities compared to score-based peer review.
Affiliation(s)
- Daniel R Ludwig
- Mallinckrodt Institute of Radiology, Washington University School of Medicine, 510 S. Kingshighway Blvd, Campus Box 8131, Saint Louis, MO, 63110, USA.
- Benjamin S Strnad
- Mallinckrodt Institute of Radiology, Washington University School of Medicine, 510 S. Kingshighway Blvd, Campus Box 8131, Saint Louis, MO, 63110, USA
- Andrew J Bierhals
- Mallinckrodt Institute of Radiology, Washington University School of Medicine, 510 S. Kingshighway Blvd, Campus Box 8131, Saint Louis, MO, 63110, USA
- Vincent M Mellnick
- Mallinckrodt Institute of Radiology, Washington University School of Medicine, 510 S. Kingshighway Blvd, Campus Box 8131, Saint Louis, MO, 63110, USA
19
Schmidt E, Lo HS, Saghir A. Peer learning in emergency radiology: effects on learning, error identification, and radiologist experience. Emerg Radiol 2022; 29:655-661. [PMID: 35391565] [DOI: 10.1007/s10140-022-02040-6]
Abstract
PURPOSE We established and evaluated a peer learning program in an emergency radiology (ER) division. Peer learning is an alternative to peer review focusing on non-punitive error reporting to mitigate consequences of inevitable human error. The central component is the peer learning conference, where cases are presented, key teaching points are discussed, and process improvement ideas are solicited. METHODS We established a prior imaging-based case identification system and a bimonthly remote videoconference where ER faculty discuss 5-15 cases selected for learning or process improvement opportunities. Case identification and conference characteristics were captured. A survey focused on learning and performance outcomes was administered to faculty at program launch and again at 6 months, by which time scores had improved. RESULTS Cases selected for conference favored perception errors (46%), with great calls (17%) and process improvement (15%) the next most common categories. A variety of anatomical regions were represented, with abdominal (35%) and musculoskeletal (29%) most common. Error detection was improved over peer review. All participants found the system easy to use and preferred peer learning to peer review for learning and process improvement. CONCLUSION A peer learning program can be successfully implemented within a busy academic emergency radiology division, as evidenced by increasing buy-in and engagement scores over time. When tied to a departmental peer learning infrastructure, interdisciplinary expertise and robust case identification can be leveraged to increase learning opportunities.
Affiliation(s)
- Eric Schmidt
- Department of Radiology, University of Massachusetts Medical School, Worcester, MA, 01605, USA
- Hao S Lo
- Department of Radiology, University of Massachusetts Medical School, Worcester, MA, 01605, USA
- Amina Saghir
- Department of Radiology, University of Massachusetts Medical School, Worcester, MA, 01605, USA.
20
Kunst M, Elentuck D, Broder J. Leveraging the Peer Learning Conference to Establish and Maintain a Peer Learning Program. Curr Probl Diagn Radiol 2022; 51:686-690. [DOI: 10.1067/j.cpradiol.2022.04.010]
21
Sayyouh MMH, Sella EC, Shankar PR, Marshall GE, Quint LE, Agarwal PP. Lessons Learned from Peer Learning Conference in Cardiothoracic Radiology. Radiographics 2022; 42:579-593. [PMID: 35148241] [DOI: 10.1148/rg.210125]
Abstract
Medical errors may lead to patient harm and may also have a devastating effect on medical providers, who may suffer from guilt and the personal impact of a given error (second victim experience). While it is important to recognize and remedy errors, it should be done in a way that leads to long-standing practice improvement and focuses on systems-level opportunities rather than in a punitive fashion. Traditional peer review systems are score based and have some undesirable attributes. The authors discuss the differences between traditional peer review systems and peer learning approaches and offer practical suggestions for transitioning to peer learning conferences. Peer learning conferences focus on learning opportunities and embrace errors as an opportunity to learn. The authors also discuss various types and sources of errors relevant to the practice of radiology and how discussions in peer learning conferences can lead to widespread system improvement. In the authors' experience, these strategies have resulted in practice improvement not only at a division level in radiology but in a broader multidisciplinary setting as well. The online slide presentation from the RSNA Annual Meeting is available for this article. ©RSNA, 2022.
Affiliation(s)
- Mohamed M H Sayyouh, Edith C Sella, Prasad R Shankar, Giselle E Marshall, Leslie E Quint, Prachi P Agarwal
- From the Cardiothoracic Imaging Division, Department of Radiology, University of Michigan, Taubman Center B1-132D, 1500 E Medical Center Dr, Ann Arbor, MI 48109-5302 (M.M.H.S., E.C.S., G.E.M., L.E.Q., P.P.A.); and Abdominal Imaging Division and Michigan Radiology Quality Collaborative, Department of Radiology, University of Michigan, Ann Arbor, Mich (P.R.S.)
22
Phalak KA, Gerlach K, Parikh JR. Peer learning in breast imaging. Clin Imaging 2022; 85:60-63. [PMID: 35247790] [DOI: 10.1016/j.clinimag.2022.02.027]
Abstract
With the increasing focus on quality and safety in medicine, radiology practices are increasingly transitioning from traditional score-based peer review to peer learning. Participation in a peer learning program can increase learning, practice improvement, and cultivation of interpersonal relationships in a non-punitive environment. As breast imaging errors are the most frequently cited in medical malpractice cases, learning from, attention to, and reduction of these errors are especially important. We describe the strengths of a peer learning program, the implementation process in a breast imaging program, challenges to overcome, and strategies to support success.
Affiliation(s)
- Kanchan A Phalak
- Department of Radiology, The University of Texas MD Anderson Cancer Center, Houston, TX, USA
- Karen Gerlach
- Department of Radiology, The University of Texas MD Anderson Cancer Center, Houston, TX, USA
- Jay R Parikh
- Department of Radiology, The University of Texas MD Anderson Cancer Center, Houston, TX, USA
23
Torres FS, Costa AF, Kagoma Y, Arrigan M, Scott M, Yemen B, Hurrell C, Kielar A. CAR Peer Learning Guide. Can Assoc Radiol J 2022; 73:491-498. [PMID: 35077247] [DOI: 10.1177/08465371211065454]
Abstract
Peer learning is a quality initiative used to identify potential areas of practice improvement, both on a patient level and on a systemic level. Opportunities for peer learning include review of prior imaging studies, evaluation of cases from multidisciplinary case conferences, and review of radiology trainees' call cases. Peer learning is non-punitive and focuses on promoting life-long learning. In contrast to traditional peer review, it seeks to identify and disseminate learning opportunities and areas for systems improvement. Learning opportunities arise from peer learning through both individual communication of cases reviewed during routine work, as well as through anonymous presentation of aggregate cases in an educational format. In conjunction with other tools such as root cause analysis, peer learning can be used to guide future practice improvement opportunities. This guide provides definitions of terms and a synthesis of evidence regarding peer review and peer learning, as well as medicolegal and jurisdictional considerations. Important aspects of what makes an effective peer learning program and best practices for implementing such a program are presented. The guide is intended to be a living document that will be updated regularly as new data emerge and peer learning continues to evolve in radiology practices.
Affiliation(s)
- Felipe Soares Torres
- Joint Department of Medical Imaging, Toronto General Hospital, University of Toronto, Toronto, ON, Canada
- Andreu F Costa
- Department of Radiology, Queen Elizabeth II Health Sciences Centre, Dalhousie University, Halifax, NS, Canada
- Yoan Kagoma
- Hamilton Health Sciences, McMaster University Faculty of Health Sciences, Hamilton, ON, Canada
- Malcolm Scott
- Misericordia Community Hospital, University of Alberta, Edmonton, AB, Canada
- Brian Yemen
- Hamilton Health Sciences, McMaster University, Hamilton, ON, Canada
- Casey Hurrell
- Canadian Association of Radiologists, Ottawa, ON, Canada
- Ania Kielar
- Joint Department of Medical Imaging, University of Toronto, Toronto, ON, Canada
24
Bowman AW, Tan N, Adamo DA, Chen F, Venkatesh SK, Baumgarten DA. Implementation of peer learning conferences throughout a multi-site abdominal radiology practice. Abdom Radiol (NY) 2021; 46:5489-5499. [PMID: 33999282] [DOI: 10.1007/s00261-021-03114-8]
Abstract
PURPOSE To initiate a peer learning conference for our abdominal radiology division across multiple geographically separated sites and different time zones, and to determine radiologist preference for peer learning versus traditional score-based peer review. METHODS We implemented a monthly peer learning videoconference for our abdominal radiology division. Surveys regarding radiologist opinion of traditional peer review and the new peer learning conferences were conducted before and after 6 months of conferences. RESULTS Peer learning conferences were well attended across our multiple sites, with an average of 43 participants per conference. Radiologist opinion of peer review was poor, with surveyed radiologists responding positively to only 1 of 12 process questions. Opinion of peer learning was extremely favorable, with radiologists responding positively to all 12 of the same process questions. After 6 months of peer learning conferences, 87.9% of surveyed radiologists wished to continue them in some fashion, and no one preferred to return to score-based peer review alone. CONCLUSION We successfully implemented a peer learning conference for our abdominal radiology division spread across multiple geographic sites. Our radiologists strongly preferred peer learning conferences over our traditional peer review system for quality control.
Affiliation(s)
- Andrew W Bowman
- Department of Radiology, Mayo Clinic, 4500 San Pablo Rd, Jacksonville, FL, 32224, USA.
- Nelly Tan
- Department of Radiology, Mayo Clinic, 5777 Mayo Blvd, Phoenix, AZ, 85054, USA
- Daniel A Adamo
- Department of Radiology, Mayo Clinic, 200 First St SW, Rochester, MN, 55905, USA
- Frederick Chen
- Department of Radiology, Mayo Clinic, 5777 Mayo Blvd, Phoenix, AZ, 85054, USA
- Sudhakar K Venkatesh
- Department of Radiology, Mayo Clinic, 200 First St SW, Rochester, MN, 55905, USA
- Deborah A Baumgarten
- Department of Radiology, Mayo Clinic, 4500 San Pablo Rd, Jacksonville, FL, 32224, USA
25
Glazer DI, Zhao AH, Lacson R, Burk KS, DiPiro PJ, Kapoor N, Khorasani R. Use of a PACS Embedded System for Communicating Radiologist to Technologist Learning Opportunities and Patient Callbacks. Curr Probl Diagn Radiol 2021; 51:511-516. [PMID: 34836721] [DOI: 10.1067/j.cpradiol.2021.09.007]
Abstract
OBJECTIVE This study aimed to determine effect of modality, care setting, and radiology subspecialty on frequency of diagnostic image quality issues identified by radiologists during image interpretation. METHODS This Institutional Review Board-exempt retrospective study was performed 10/1/18-6/30/20 at an academic radiology practice performing 700,000+ examinations annually. A closed-loop communication tool integrated in PACS workflow enabled radiologists to alert technologists to image quality issues. Radiologists categorized communications as requiring patient callback, or as technologist learning opportunities if image quality was adequate to generate a diagnostic report. Fisher's exact test assessed impact of imaging modality, radiology subspecialty, and care setting on radiologist-identified image quality issues. RESULTS 976,915 imaging examinations were performed during the study period. Radiologists generated 1,935 technologist learning opportunities (0.20%) and 208 callbacks (0.02%). Learning opportunity rates were highest for MRI (0.60%) when compared to CT (0.26%) and radiography (0.08%) (p<0.0001). The same was true for patient callbacks (MRI 0.13%, CT 0.02%, radiography 0.0006%; p<0.0001). Outpatient examinations generated more learning opportunities (1479/637,092; 0.23%) vs. inpatient (305/200,206; 0.15%) and Emergency Department (151/139,617; 0.11%) (p<0.0001). Abdominal subspecialists were most likely to generate learning opportunities when compared to other subspecialists and cardiovascular imagers were most likely to call a patient back. CONCLUSIONS Image quality issues identified by radiologists during the interpretation process were rare and 10 times more commonly categorized as learning opportunities not interfering with a clinically adequate report than as requiring patient callback. Further work is necessary to determine if creating learning opportunities leads to fewer patients requiring repeat examinations.
Affiliation(s)
- Daniel I Glazer
- Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, MA; Center for Evidence-Based Imaging, Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Brookline, MA
- Anna H Zhao
- Center for Evidence-Based Imaging, Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Brookline, MA
- Ronilda Lacson
- Center for Evidence-Based Imaging, Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Brookline, MA
- Kristine S Burk
- Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, MA; Center for Evidence-Based Imaging, Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Brookline, MA
- Pamela J DiPiro
- Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, MA; Center for Evidence-Based Imaging, Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Brookline, MA
- Neena Kapoor
- Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, MA; Center for Evidence-Based Imaging, Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Brookline, MA
- Ramin Khorasani
- Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, MA; Center for Evidence-Based Imaging, Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Brookline, MA
26. Perspective: in pursuit of a learning culture. Abdom Radiol (NY) 2021; 46:5017-5020. [PMID: 34075467] [DOI: 10.1007/s00261-021-03156-y]
Abstract
Transitioning from peer review to peer learning is an important step forward in developing a learning culture. Additional measures are going to be required to meet this goal. Ideas toward establishing a learning culture are detailed in this perspective.
27. Tee QX, Nambiar M, Stuckey S. Error and cognitive bias in diagnostic radiology. J Med Imaging Radiat Oncol 2021; 66:202-207. [PMID: 34467643] [DOI: 10.1111/1754-9485.13320]
Affiliation(s)
- Qiao Xin Tee
- Department of Diagnostic Imaging, Monash Health, Clayton, Victoria, Australia
- Mithun Nambiar
- Department of Diagnostic Imaging, Monash Health, Clayton, Victoria, Australia
- Stephen Stuckey
- Department of Diagnostic Imaging, Monash Health, Clayton, Victoria, Australia
- School of Clinical Sciences at Monash Health, Faculty of Medicine, Nursing and Health Sciences, Monash University, Melbourne, Victoria, Australia
28. Virarkar M, Morani AC, Bhosale P, Wagner-Bartak NA, Carter BW, Lano E. Peer Learning and Operationalizing During COVID-19 Pandemic and Beyond. Cureus 2021; 13:e16568. [PMID: 34430170] [PMCID: PMC8378281] [DOI: 10.7759/cureus.16568]
Abstract
The main objective of the article is to describe the changes in managing the peer learning system in the Department of Abdominal Imaging at our institution during the pandemic and its restrictions. The pandemic poses diverse challenges to academic institutions across the country, including radiology education and peer learning. The health sector in some areas of the country has been stretched by the number of coronavirus disease 2019 (COVID-19) patients. In March 2020, our institution cancelled all in-person conferences per guidelines from the Centers for Disease Control and Prevention to mitigate the spread of COVID-19, and the conferences were shifted to virtual platforms. Our recent peer learning approach allowed us to practice appropriate social distancing while following the institutional and national guidelines with minimal disruption. Other institutions that are facing similar challenges can adopt or modify our framework of a successful and efficient virtual peer learning process.
Affiliation(s)
- Mayur Virarkar
- Radiology, The University of Texas Health Science Center at Houston, Houston, USA
- Ajaykumar C Morani
- Abdominal Imaging, The University of Texas MD Anderson Cancer Center, Houston, USA
- Priya Bhosale
- Abdominal Imaging, The University of Texas MD Anderson Cancer Center, Houston, USA
- Brett W Carter
- Thoracic Imaging, The University of Texas MD Anderson Cancer Center, Houston, USA
- Elizabeth Lano
- Abdominal Imaging, The University of Texas MD Anderson Cancer Center, Houston, USA
29. Schafer LE, Perry H, Fishman MD, Jakomin BV, Slanetz PJ. Incorporating Peer Learning Into Your Breast Imaging Practice. J Breast Imaging 2021; 3:491-497. [PMID: 38424796] [DOI: 10.1093/jbi/wbab043]
Abstract
Traditional score-based peer review has come under scrutiny in recent years, as studies have demonstrated it to be generally ineffective at improving quality. Many practices and programs are transitioning to a peer learning model to replace or supplement traditional peer review. Peer learning differs from traditional score-based peer review in that the emphasis is on sharing learning opportunities and creating an environment that fosters discussion of errors in a nonpunitive forum with the goal of improved patient care. Creating a just culture is central to fostering successful peer learning. In a just culture, mistakes can be discussed without shame or fear of retribution and the focus is on systems improvement rather than individual blame. Peer learning, as it pertains to breast imaging, can occur in many forms and venues. Examples of the various formats in which peer learning can occur include through individual colleague interaction, as well as divisional, multidisciplinary, department-wide, and virtual conferences, and with the assistance of artificial intelligence. Incorporating peer learning into the practice of breast imaging aims to reduce delayed diagnoses of breast cancer and optimize patient care.
Affiliation(s)
- Leah E Schafer
- Boston Medical Center and Boston University School of Medicine, Department of Radiology, Boston, MA, USA
- Hannah Perry
- University of Vermont Medical Center and Larner College of Medicine at the University of Vermont, Department of Radiology, Burlington, VT, USA
- Michael D C Fishman
- Boston Medical Center and Boston University School of Medicine, Department of Radiology, Boston, MA, USA
- Bernadette V Jakomin
- Boston Medical Center and Boston University School of Medicine, Department of Radiology, Boston, MA, USA
- Priscilla J Slanetz
- Boston Medical Center and Boston University School of Medicine, Department of Radiology, Boston, MA, USA
30. Lamoureux C, Hanna TN, Sprecher D, Weber S, Callaway E. Radiologist errors by modality, anatomic region, and pathology for 1.6 million exams: what we have learned. Emerg Radiol 2021; 28:1135-1141. [PMID: 34328592] [DOI: 10.1007/s10140-021-01959-6]
Abstract
PURPOSE To evaluate the feasibility of adding pathology to recent radiologist error characterization schemes of modality and anatomic region and the potential of this data to more specifically inform peer review and peer learning. METHODS Quality assurance data originating from 349 radiologists in a national teleradiology practice were collected for 2019. Interpretive errors were simply categorized as major or minor. Reporting or communication errors were classified as administrative errors. Interpretive errors were then divided by modality, anatomic region and placed into one of 64 pathologic categories. RESULTS Out of 1,628,464 studies, the discrepancy rate was 0.5% (8181/1,634,201). The 8181 total errors consisted of 2992 major errors (0.18%) and 5189 minor errors (0.32%). Precisely, 3.1% (257/8181) of total errors were administrative. Of major interpretive errors, 75.5% occurred on CT, with CT abdomen and pelvis accounting for 40.4%. The most common pathologic discrepancy for all exams was in the category of mass, nodule, or adenopathy (1583/8181), the majority of which were minor (1315/1583). The most common pathologic discrepancy for the 2937 major interpretive errors was fracture or dislocation (27%; 793/2937), followed by bleed (10.7%; 315/2937). CONCLUSION The addition of error-related pathology to peer review is both feasible and practical and provides a more detailed guide to targeted individual and practice-wide peer learning quality improvement efforts. Future research is needed to determine if there are measurable improvements in detection or interpretation of specific pathologies following error feedback and educational interventions.
Affiliation(s)
- Tarek N Hanna
- Division of Emergency Radiology, Department of Radiology and Imaging Sciences, Emory University, 550 Peachtree Rd, Atlanta, GA, 30308, USA
- Devin Sprecher
- Virtual Radiologic, 11995 Singletree Ln #500, Eden Prairie, MN, 55344, USA
- Scott Weber
- Virtual Radiologic, 11995 Singletree Ln #500, Eden Prairie, MN, 55344, USA
- Edward Callaway
- Virtual Radiologic, 11995 Singletree Ln #500, Eden Prairie, MN, 55344, USA
31. Availability of a final abdominopelvic CT report before emergency department disposition: risk-adjusted outcomes in patients with abdominal pain. Abdom Radiol (NY) 2021; 46:2900-2907. [PMID: 33386916] [DOI: 10.1007/s00261-020-02899-4]
Abstract
OBJECTIVE To determine whether availability of a final radiologist report versus an experienced senior resident preliminary report prior to disposition affects major care outcomes in emergency department (ED) patients presenting with abdominal pain undergoing abdominopelvic CT. MATERIALS AND METHODS This single-institution, IRB-approved, HIPAA-compliant retrospective cohort study included 5019 ED patients with abdominal pain undergoing abdominopelvic CT from October 2015 to April 2019. Patients were categorized as being dispositioned after either an experienced senior resident preliminary report (i.e., overnight model) or the final attending radiologist interpretation (i.e., daytime model) of the CT was available. Multivariable regression models were built accounting for demographic data, clinical factors (vital signs, ED triage score, laboratory data), and disposition timing to analyze the impact on four important patient outcomes: inpatient admission (primary outcome), readmission (within 30 days), second operation within 30 days, and death. RESULTS In the setting of an available experienced senior resident preliminary report, timing of the final radiologist report (before vs. after disposition) was not a significant multivariable predictor of inpatient admission (p = 0.63), readmission within 30 days (p = 0.66), second operation within 30 days (p = 0.09), or death (p = 0.63). Unadjusted event rates for overnight vs daytime reports, respectively, were 37.2% vs. 38.0% (inpatient admission), 15.9% vs. 16.5% (30-day readmission), 0.65% vs. 0.3% (second operation within 30 days), and 0.85% vs. 1.3% (death). CONCLUSION Given the presence of an experienced senior resident preliminary report, availability of a final radiology report prior to ED disposition did not affect four major clinical care outcomes of patients with abdominal pain undergoing abdominopelvic CT.
32. Prostate Imaging and Data Reporting System Version 2 as a Radiology Performance Metric: An Analysis of 18 Abdominal Radiologists. J Am Coll Radiol 2021; 18:1069-1076. [PMID: 33848507] [DOI: 10.1016/j.jacr.2021.02.032]
Abstract
PURPOSE To determine expected trained provider performance dispersion in Prostate Imaging and Data Reporting System version 2 (PI-RADS v2) positive predictive values (PPVs). METHODS This single-center quality assurance retrospective cohort study evaluated 5,556 consecutive prostate MRIs performed on 4,593 patients. Studies were prospectively interpreted from October 8, 2016, to July 31, 2020, by 18 subspecialty-trained abdominal radiologists (1-22 years' experience; median MRIs per radiologist: 232, first-to-third quartile range [Q1-Q3]: 128-440; 13 interpreted at least 30 MRIs with a reference standard). Maximum prospectively reported whole-gland PI-RADS v2 score was compared to post-MRI biopsy histopathology obtained within 2 years. The primary outcome was PPV of MRI by provider stratified by maximum whole-gland PI-RADS v2 score. RESULTS Median provider-level PPVs for the radiologists who interpreted ≥30 MRIs with a reference standard were PI-RADS 3 (22.1%; Q1-Q3: 10.0%-28.6%), PI-RADS 4 (49.2%; Q1-Q3: 41.4%-50.0%), PI-RADS 5 (81.8%; Q1-Q3: 77.1%-84.4%). Overall, the maximum whole-gland PI-RADS v2 score was PI-RADS 1 to 2 (34.6% [1,925]), PI-RADS 3 (8.5% [474]), PI-RADS 4 (21.0% [1,166]), PI-RADS 5 (18.3% [1,018]), no PI-RADS score (17.5% [973]). System-level (all providers) PPVs for maximum PI-RADS v2 scores were 20.0% (95% confidence interval [CI]: 15.7%-24.9%) for PI-RADS 3, 48.5% (95% CI: 44.8%-52.2%) for PI-RADS 4, and 80.1% for PI-RADS 5 (95% CI: 75.7%-83.9%). CONCLUSION Subspecialty-trained abdominal radiologists with a wide range of experience can obtain consistent positive predictive values for PI-RADS v2 scores of 3 to 5. These data can be used for quality assurance benchmarking.
33. Yacoub JH, Swanson CE, Jay AK, Cooper C, Spies J, Krishnan P. The Radiology Virtual Reading Room: During and Beyond the COVID-19 Pandemic. J Digit Imaging 2021; 34:308-319. [PMID: 33620622] [PMCID: PMC7901504] [DOI: 10.1007/s10278-021-00427-4]
Abstract
The COVID-19 pandemic has disrupted the radiology reading room with a potentially lasting impact. This disruption could introduce the risk of obviating the need for the reading room, which would be detrimental to many of the roles of radiology that occur in and around the reading room. This disruption could also create the opportunity for accelerated evolution of the reading room to meet the strategic needs of radiology and health care through thoughtful re-design of the virtual reading room. In this article, we overview the impact of the COVID-19 pandemic on radiology in our institution and across the country, specifically on the dynamics of the radiology reading room. We introduce the concept of the virtual reading room, which is a redesigned alternative to the physical reading room that can serve the diverse needs of radiology and healthcare during and beyond the pandemic.
Affiliation(s)
- Joseph H Yacoub
- MedStar Georgetown University Hospital, 3800 Reservoir Rd NW, Washington, DC 20007, USA
- Carl E Swanson
- MedStar Georgetown University Hospital, 3800 Reservoir Rd NW, Washington, DC 20007, USA
- Ann K Jay
- MedStar Georgetown University Hospital, 3800 Reservoir Rd NW, Washington, DC 20007, USA
- Cirrelda Cooper
- MedStar Georgetown University Hospital, 3800 Reservoir Rd NW, Washington, DC 20007, USA
- James Spies
- MedStar Georgetown University Hospital, 3800 Reservoir Rd NW, Washington, DC 20007, USA
- Pranay Krishnan
- MedStar Georgetown University Hospital, 3800 Reservoir Rd NW, Washington, DC 20007, USA
34. Peer Learning in Radiology: Effect of a Pay-for-Performance Initiative on Clinical Impact and Usage. AJR Am J Roentgenol 2021; 216:1659-1667. [PMID: 33787297] [DOI: 10.2214/ajr.20.23253]
Abstract
OBJECTIVE. The purpose of this article is to assess the effects of a pay-for-performance (PFP) initiative on clinical impact and usage of a radiology peer learning tool. MATERIALS AND METHODS. This retrospective study was performed at a large academic hospital. On May 1, 2017, a peer learning tool was implemented to facilitate radiologist peer feedback including clinical follow-up, positive feedback, and consultation. Subsequently, PFP target numbers for peer learning tool alerts by subspecialty divisions (October 1, 2017) and individual radiologists (October 1, 2018) were set. The primary outcome was report addendum rate (percent of clinical follow-up alerts with addenda), which was a proxy for peer learning tool clinical impact. Secondary outcomes were peer learning tool usage rate (number of peer learning tool alerts per 1000 radiology reports) and proportion of clinical follow-up alerts (percent of clinical follow-ups among all peer learning tool alerts). Outcomes were assessed biweekly using ANOVA and statistical process control analyses. RESULTS. Among 1,265,839 radiology reports from May 1, 2017, to September 29, 2019, a total of 20,902 peer learning tool alerts were generated. The clinical follow-up alert addendum rate was not significantly different between the period before the PFP initiative (9.9%) and the periods including division-wide (8.3%) and individual (7.9%) PFP initiatives (p = .55; ANOVA). Peer learning tool usage increased from 2.2 alerts per 1000 reports before the PFP initiative to 12.6 per 1000 during the division-wide PFP period (5.7-fold increase; 12.6/2.2), to 25.2 in the individual PFP period (11.5-fold increase vs before PFP; twofold increase vs division-wide) (p < .001). The clinical follow-up alert proportion decreased from 37.5% before the PFP initiative, to 34.4% in the division-wide period, to 31.3% in the individual PFP period. CONCLUSION. A PFP initiative improved radiologist engagement in peer learning by marked increase in peer learning tool usage rate without a change in report addendum rate as a proxy for clinical impact.
35. Funaro K, Niell B. Variability in Mammography Quality Assessment After Implementation of Enhancing Quality Using the Inspection Program (EQUIP). J Breast Imaging 2021; 3:168-175. [PMID: 38424823] [DOI: 10.1093/jbi/wbaa117]
Abstract
OBJECTIVE Analyze mammography quality and deficiencies, including variability in quality assessment among subspecialized breast radiologists, after implementing the Enhancing Quality Using the Inspection Program (EQUIP). METHODS After IRB approval, this single institution study retrospectively queried data prospectively entered into our automated reporting software after implementing EQUIP (October 2017-March 2019). Screening and diagnostic combination (digital mammography with tomosynthesis) mammograms were reviewed by seven breast radiologists. Quality was assessed as excellent, good, adequate, or problems found. Of those with problems found, the deficiency and corrective action were evaluated. The interpreting radiologist, EQUIP radiologist, and performing technologist were recorded. P values were calculated using Fisher exact test and chi-square analyses. RESULTS Of 17 312 mammograms, 529 (3%) underwent EQUIP review. Of 43 (8%) with problems found, 23 (53%) did not include sufficient tissue, 9 (21%) had motion degradation, 3 (7%) had artifacts, 2 each (4.7% each) had the nipple not in profile or skin folds, and 4 (9%) were categorized as "other." Nine (9/529, 1.7%) required recall for repeat imaging. The lead interpreting physician (LIP) was more likely to categorize mammograms as technically inadequate compared to other radiologists (P < 0.00001), and there were also statistically significant differences in how the remaining radiologists stratified cases (P < 0.00001) even when excluding the LIP. CONCLUSION Insufficient tissue was the most common problem identified in the EQUIP-reviewed mammograms with deficiencies. Significant variability was present among radiologist EQUIP designations. Ongoing review of clinical image quality with EQUIP allows for opportunities to provide corrective feedback.
Affiliation(s)
- Kimberly Funaro
- H. Lee Moffitt Cancer Center and Research Institute, Department of Diagnostic Imaging and Interventional Radiology, Tampa, FL
- Bethany Niell
- H. Lee Moffitt Cancer Center and Research Institute, Department of Diagnostic Imaging and Interventional Radiology, Tampa, FL
36. Broder JC, Scheirey CD, Wald C. Step by Step: A Structured Approach for Proposing, Developing and Implementing a Radiology Peer Learning Program. Curr Probl Diagn Radiol 2021; 50:457-460. [PMID: 33663894] [DOI: 10.1067/j.cpradiol.2021.02.007]
Abstract
Similar to the experiences of other radiology practices, our radiology staff members felt that scored peer review identified few errors/learning opportunities while undermining team collegiality. They desired a more effective way to promote team collegiality and foster lifelong learning. We describe the steps our department took to transition from a peer review system to a peer learning program.
Affiliation(s)
- Jennifer C Broder
- Vice Chair Quality and Safety, Department of Radiology, Lahey Hospital and Medical Center, Burlington, MA
- Christopher D Scheirey
- Vice Chair Operations, Department of Radiology, Lahey Hospital and Medical Center, Burlington, MA
- Christoph Wald
- Chair, Department of Radiology, Lahey Hospital and Medical Center, Burlington, MA
37. Dick J, Darras KE, Lexa FJ, Denton E, Ehara S, Galloway H, Jankharia B, Kassing P, Kumamaru KK, Mildenberger P, Morozov S, Pyatigorskaya N, Song B, Sosna J, van Buchem M, Forster BB. An International Survey of Quality and Safety Programs in Radiology. Can Assoc Radiol J 2021; 72:135-141. [PMID: 32066249] [DOI: 10.1177/0846537119899195]
Abstract
PURPOSE The aim of this study was to determine the status of radiology quality improvement programs in a variety of selected nations worldwide. METHODS A survey was developed by select members of the International Economics Committee of the American College of Radiology on quality programs and was distributed to committee members. Members responded on behalf of their country. The 51-question survey asked about 12 different quality initiatives which were grouped into 4 themes: departments, users, equipment, and outcomes. Respondents reported whether a designated type of quality initiative was used in their country and answered subsequent questions further characterizing it. RESULTS The response rate was 100% and represented Australia, Canada, China, England, France, Germany, India, Israel, Japan, the Netherlands, Russia, and the United States. The most frequently reported quality initiatives were imaging appropriateness (91.7%) and disease registries (91.7%), followed by key performance indicators (83.3%) and morbidity and mortality rounds (83.3%). Peer review, equipment accreditation, radiation dose monitoring, and structured reporting were reported by 75.0% of respondents, followed by 58.3% of respondents for quality audits and critical incident reporting. The least frequently reported initiatives included Lean/Kaizen exercises and physician performance assessments, implemented by 25.0% of respondents. CONCLUSION There is considerable diversity in the quality programs used throughout the world, despite some influence by national and international organizations, from whom further guidance could increase uniformity and optimize patient care in radiology.
Affiliation(s)
- Jeremy Dick
- University of British Columbia, Vancouver, British Columbia, Canada
- Kathryn E Darras
- University of British Columbia, Vancouver, British Columbia, Canada
- Department of Radiology, Faculty of Medicine, University of British Columbia, Vancouver, British Columbia, Canada
- Frank J Lexa
- Department of Medical Imaging, University of Arizona College of Medicine, Tucson, AZ, USA
- The Radiology Leadership Institute and Commission on Leadership and Practice Development, American College of Radiology, Tucson, AZ, USA
- Erika Denton
- Norfolk & Norwich University Hospital, Norwich, Norfolk, United Kingdom
- Shigeru Ehara
- Department of Radiology, Tohoku Medical and Pharmaceutical University, Sendai, Tohoku, Japan
- Pam Kassing
- American College of Radiology, Reston, VA, USA
- Peter Mildenberger
- Department of Radiology, University Medical Center Mainz, Mainz, Germany
- Nadya Pyatigorskaya
- Department of Neuroradiology, Sorbonne University, Hôpital de la Pitié-Salpêtrière, Paris, France
- Bin Song
- West China Hospital, Sichuan University, Chengdu, Sichuan, China
- Jacob Sosna
- Department of Radiology, Hadassah Hebrew University Medical Center, Jerusalem, Israel
- Marcus van Buchem
- Department of Radiology, Leiden University Medical Center, Leiden, the Netherlands
- Bruce B Forster
- University of British Columbia, Vancouver, British Columbia, Canada
- Department of Radiology, Faculty of Medicine, University of British Columbia, Vancouver, British Columbia, Canada
38. Woznitza N, Steele R, Hussain A, Gower S, Groombridge H, Togher D, Lofton L, Lainchbury J, Compton E, Rowe S, Robertson K. Reporting radiographer peer review systems: A cross-sectional survey of London NHS Trusts. Radiography (Lond) 2021; 27:173-177. [DOI: 10.1016/j.radi.2020.07.014]
39. Peer Learning Through Multi-Institutional Case Conferences: Abdominal and Cardiothoracic Radiology Experience. Acad Radiol 2021; 28:255-260. [PMID: 32061469] [DOI: 10.1016/j.acra.2020.01.015]
Abstract
RATIONALE AND OBJECTIVES We describe a model of multi-institutional, multisociety, online case conferences that is a case-based group discussion of selected (nonrandom) cases which are subsequently hosted on social media and online platforms (e.g., YouTube, websites) to be available for a wider audience. MATERIALS AND METHODS Using online conferencing software (Zoom, GoToMeeting), abdominal and cardiothoracic radiologists engage in separate one-hour subspecialty meetings discussing a variety of meaningful cases. Participants take turns presenting their cases to the group and discuss significant findings, interpretations, differential diagnoses, and any other teaching points. All of the case conferences for both societies are recorded and edited to be uploaded on YouTube and their respective websites. RESULTS Participants from these conferences log in from 14 institutions in 7 states across the United States. The YouTube videos reach thousands of people around the world. The abdominal case conference on YouTube has received almost 1,300 views with 90 videos uploaded. The thoracic case conference (the Society of Thoracic Radiology) has been running for over 7 years, with 226 videos uploaded to YouTube and 38,200 views, 1426 subscribers, and a total watch time of over 525,800 minutes. Twitter has been utilized by both groups to promote online viewership. CONCLUSION Our model is feasible and effective compared to traditional peer review. The cases selected are deliberate and focused on quality improvement and/or education. We harness online engagement, specifically social media presence, which has opened new opportunities to educate our peers and reach a global audience, including the nonradiologic community, to learn about radiology and unique practices.
40. Radiologist Opinions of a Quality Assurance Program: The Interaction Between Error, Emotion, and Preventative Action. Acad Radiol 2021; 28:e54-e61. [PMID: 32139303] [DOI: 10.1016/j.acra.2020.01.027]
Abstract
RATIONALE AND OBJECTIVES To investigate inter-relationships between radiologist opinions of a quality assurance (QA) program, QA Committee communications, negative emotions, self-identified risk factors, and preventive actions taken following major errors. MATERIALS AND METHODS A 48-question electronic survey was distributed to all 431 radiologists within the same teleradiology organization between June 15 and July 3, 2018. Two reminders were sent during the survey time period. Descriptive statistics were generated, and comparisons were made with Fisher exact test. Significance level was set at p < 0.05. RESULTS Response rate was 67.5% (291/431), and 72.5% of respondents completed all survey questions. A total of 64.3% of respondents were male, and the highest proportion of radiologists (28.9%, 187/291) had been in practice >20 years. Preventative actions following an error were positively correlated to a higher opinion of the QA process, self-identification of personal risk factors for error, and greater negative emotions following an error (all p < 0.05). A higher opinion of communications with the QA committee was associated with a positive opinion of the QA process (p < 0.001). An inverse relationship existed between negative emotion and opinion of QA committee communications (p < 0.05) and negative emotion and opinion of the QA process (p < 0.05). Radiologist gender and full time versus part time status had a significant effect on perception of the QA process (p < 0.05). CONCLUSION Radiologist opinions of their institutional QA process were related to the number of negative emotions experienced and preventative actions taken following major errors. Nurturing trust and incorporating more positive feedback in the QA process may improve interactions with QA Committees and mitigate future errors.
41
Optimizing Professional Practice Evaluation to Enable a Nonpunitive Learning Health System Approach to Peer Review. Pediatr Qual Saf 2020; 6:e375. [PMID: 33409427] [PMCID: PMC7781295] [DOI: 10.1097/pq9.0000000000000375]
Abstract
Healthcare organizations are focused on 2 different and sometimes conflicting tasks: (1) accelerating the improvement of clinical care delivery and (2) collecting provider-specific data to determine the competency of providers. We describe the creation of a process that meets both of these aims while maintaining a culture that fosters improvement and teamwork.
42
Armstrong V, Tan N, Sekhar A, Richardson ML, Kanne JP, Sai V, Chernyak V, Godwin JD, Tammisetti VS, Eberhardt SC, Henry TS. Peer Learning Through Multi-Institutional Web-based Case Conferences: Perceived Value (and Challenges) From Abdominal, Cardiothoracic, and Musculoskeletal Radiology Case Conference Participants. Acad Radiol 2020; 27:1641-1646. [PMID: 31848074] [DOI: 10.1016/j.acra.2019.11.009]
Abstract
RATIONALE AND OBJECTIVES Peer learning is a case-based group-learning model intended to improve performance. In this descriptive paper, we describe multi-institutional, multi-subspecialty, web-based radiology case conferences and summarize the participants' experiences. MATERIALS AND METHODS A semi-structured, 27-question survey was administered to radiologists participating in abdominal, cardiothoracic, and musculoskeletal case conferences. Survey questions included demographics, perceived educational value, and challenges experienced. Survey question formats were continuous, binary, five-point Likert scale, or text-based. Measures of central tendency, proportions of responses, and patterns were tabulated. RESULTS Of 57 responders, 12/57 (21.1%) were abdominal, 16/57 (28.1%) were cardiothoracic, and 29/57 (50.9%) were musculoskeletal conference participants; 50/56 (89.3%) represented academic practice. Median age was 45 years (range 35-74); 43/57 (75.4%) were male. Geographically, 16/52 (30.8%) of participants were from the East Coast, 16/52 (30.8%) the Midwest, 18/52 (34.6%) the West Coast, and 2/52 (3.8%) international. The median reported educational value was 5/5 (interquartile range 5-5). Benefits of the case conference included education (50/95, 52.6%) and networking (39/95, 41.1%). Participants reported presenting the following cases: "great call" 32/48 (66.7%), learning opportunity 32/48 (66.7%), new knowledge 41/49 (83.7%), "zebras" 46/49 (93.9%), and procedural-based 16/46 (34.8%). All 51 (100%) responders reportedly gained new knowledge, 49/51 (96.1%) became more open to group discussion, 34/51 (66.7%) changed search patterns, and 50/51 (98.0%) would continue to participate. Reported challenges included time zone differences and departmental support for protected time to participate. CONCLUSION Peer learning through multi-institutional case conferences provides educational and networking opportunities. Current challenges and desires include department-supported protected time and the ability to receive continuing medical education credit.
Affiliation(s)
- Nelly Tan
- Loma Linda University Medical Center, Loma Linda, California.
- Aarti Sekhar
- Emory University School of Medicine, Atlanta, Georgia.
43
Larson DB, Broder JC, Bhargavan-Chatfield M, Donnelly LF, Kadom N, Khorasani R, Sharpe RE, Pahade JK, Moriarity AK, Tan N, Siewert B, Kruskal JB. Transitioning From Peer Review to Peer Learning: Report of the 2020 Peer Learning Summit. J Am Coll Radiol 2020; 17:1499-1508. [DOI: 10.1016/j.jacr.2020.07.016]
44
Abstract
Purpose To describe the transition from a traditional peer review process to the peer learning system, as well as the issues that arose and subsequent actions taken. Methods Baseline peer review data were obtained over 1 year from our traditional peer review system and compared with data obtained over 1 year of using peer learning. Data included the number of discrepancies and a breakdown of discrepancy types. Staff radiologists were surveyed to assess their perception of the transition. Results There were 5 significant discrepancies submitted under the traditional peer review system and 416 cases submitted under the new peer learning methodology. The most commonly reported peer learning events were perception (45.0%) and great calls (35.1%). Surveys administered after the intervention period demonstrated that most radiologists felt peer learning contributed more to their professional development and offered more opportunities for learning compared with the traditional peer review system. Conclusion The benefits of instituting peer learning include increased radiologist engagement and education. There may be challenges in the transition from a traditional peer review system to peer learning; however, the process of solving these issues can also result in an overall improved system.
45
Liao M, Tan N. Collective Intelligence of Peer Learning: Promoting Culture of Learning and Improvement Among Radiologists. Curr Probl Diagn Radiol 2020; 50:761-763. [PMID: 33032854] [DOI: 10.1067/j.cpradiol.2020.09.017]
Abstract
The traditional Scoring-Based Peer Review system has been the predominant radiology performance quality assurance model, yet it can become a condemning, ineffective process. In 2015, the Institute of Medicine called for "policies and practices that promote a non-punitive culture that values open discussion and feedback on diagnostic performance." The development of Peer Learning (PL), a process that encompasses peer feedback, learning, and improvement, has positively impacted radiology through the recognition of success, the identification of mistakes as learning opportunities, and the development of a professional culture of trust. Furthermore, collective intelligence advances the PL process within the learning community, harnessing the abilities of a group, which outperform those of a single individual, especially in complex medical and diagnostic imaging decision-making. The objective of this review article is to highlight the collective intelligence aspect of the PL program, which allows PL to be more effective than the established peer review model.
Affiliation(s)
- Millie Liao
- Department of Radiology, Loma Linda University Medical Center, Loma Linda, CA
- Nelly Tan
- Department of Radiology, Mayo Clinic - Arizona, Phoenix, AZ.
46
Ganeshan D, Rosenkrantz AB, Bassett RL, Williams L, Lenchik L, Yang W. Burnout in Academic Radiologists in the United States. Acad Radiol 2020; 27:1274-1281. [PMID: 32037261] [DOI: 10.1016/j.acra.2019.12.029]
Abstract
RATIONALE AND OBJECTIVES To assess the prevalence and associated factors of burnout among U.S. academic radiologists. MATERIALS AND METHODS An online survey was sent to radiologists who were full members of the Association of University Radiologists in December 2018. Burnout was measured using the abbreviated Maslach Burnout Inventory Human Services Survey. Survey respondents were also asked to complete questions on demographics, potential professional stressors, sense of calling, and career satisfaction. Associations between survey participants' characteristics and burnout were tested using a logistic regression model. RESULTS The survey response rate was 27% (228/831). Twenty-nine percent met all three criteria for high burnout: high emotional exhaustion, high depersonalization, and low personal accomplishment. Seventy-nine percent had one or more symptoms of burnout. Numerous factors, including work overload, inability to balance personal and professional life, lack of autonomy, and lack of appreciation from patients and other medical staff, were significantly associated (p < 0.05) with high burnout. Older age (OR 0.95; 95% CI 0.92-0.98; p < 0.05), more years of experience practicing as a radiologist (OR 0.95; 95% CI 0.92-0.98; p < 0.05), and holding the academic rank of professor (OR 0.25; 95% CI 0.11-0.56; p < 0.05) were associated with lower odds of experiencing burnout. Radiologists with high burnout were more likely to be dissatisfied with their career (OR 2.28; 95% CI 1.70-3.07; p < 0.0001) and less likely to identify medicine as a calling. CONCLUSION Multiple factors contribute to high burnout in academic radiologists. Familiarity with these factors may help academic radiology departments develop strategies to promote the health and wellness of their faculty.
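The per-year odds ratios reported above compound multiplicatively over an age or experience gap; a minimal arithmetic sketch, assuming the per-year OR of 0.95 is constant across the range:

```python
# OR 0.95 per additional year compounds multiplicatively: over a
# 10-year age difference, the odds of high burnout are about
# 0.95**10 times those of the younger colleague.
per_year_or = 0.95
ten_year_or = per_year_or ** 10
print(round(ten_year_or, 2))  # 0.6
```

So a decade of additional age or experience corresponds to roughly a 40% reduction in the odds of high burnout under this model.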
47
Collaborative Learning in Radiology: From Peer Review to Peer Learning and Peer Coaching. Acad Radiol 2020; 27:1261-1267. [PMID: 31636005] [DOI: 10.1016/j.acra.2019.09.021]
Abstract
BACKGROUND A Radiology Research Alliance Task Force was assembled in 2018 to review the literature on peer review and report on best practices for peer learning and peer coaching. FINDINGS This report provides a historical perspective on peer review and the transition to peer collaborative learning and peer coaching. Most forms of current peer review have fulfilled regulatory requirements but have failed to significantly impact quality improvement or learning opportunities. Peer learning involves joint intellectual efforts by two or more individuals to study best practices and review error collaboratively. Peer coaching is a process in which individuals in a trusted environment work to expand, refine, and build new skills in order to facilitate self-directed learning and professional growth. We discuss the value in creating opportunities for peer learning and peer coaching. CONCLUSION Peer collaborative learning combined with peer coaching provides opportunities for teams to learn and grow together, benefit from each other's expertise and experience, improve faculty morale, and provide more opportunities for collaborations between faculty.
48
Moriarty AK, Cedeno-Kelly K, Hioe T. Private Practice Radiologists' Perceptions of Peer Learning. J Am Coll Radiol 2020; 17:1509-1514. [PMID: 32771495] [DOI: 10.1016/j.jacr.2020.07.014]
Affiliation(s)
- Andrew K Moriarty
- Advanced Radiology Services, Grand Rapids, Michigan; Division of Radiology and Biomedical Imaging, Michigan State University College of Human Medicine, Grand Rapids, Michigan.
- Tanya Hioe
- Division of Radiology and Biomedical Imaging, Michigan State University College of Human Medicine, Grand Rapids, Michigan; Spectrum Health-Michigan State University Radiology, Grand Rapids, Michigan.
49
Bronner J, Kottler N, Chauvin D, Heller RE, Zaidi S, Huang A, Liang J, Rawal U. Scale Can Improve the Clinical Value of Radiology Practices. J Am Coll Radiol 2020; 17:355-360. [DOI: 10.1016/j.jacr.2019.12.010]
50
Integration of Peer Review in PACS Results in a Marked Increase in the Discrepancies Reported. AJR Am J Roentgenol 2020; 214:613-617. [PMID: 31846375] [DOI: 10.2214/ajr.19.21952]
Abstract
OBJECTIVE. The objective of this article is to assess the impact of integrating peer review in PACS on the reporting of discrepancies. Our hypothesis is that a PACS-integrated machine-randomized and semiblinded peer review tool leads to an increase in discrepancies reported. MATERIALS AND METHODS. A PACS tool was implemented to prompt radiologists to perform peer review of prior comparison studies in a randomized fashion. The reviewed radiologist's name was omitted from the prior report in PACS. Before this implementation, radiologists entered peer reviews directly on the RADPEER website. Three academic subspecialty sections comprising 24 radiologists adopted the tool (adopters group). Three sections comprising 14 radiologists did not adopt the tool (nonadopters group). Peer review submissions were analyzed for 4 months before and 4 months after the implementation. The mean rate of significant discrepancies (RADPEER score 2b or higher) reported per radiologist was calculated and the discrepancy rates of the periods before and after the implementation were compared. RESULTS. The mean significant discrepancy rate reported per radiologist in the adopters group increased from 0.19% ± 0.46% (SD) before the implementation to 0.93% ± 1.45% after implementation (p = 0.01). No significant discrepancies were reported by the nonadopters group in either period. CONCLUSION. In this single institutional retrospective analysis, integrating peer review in PACS resulted in a fivefold increase in reported significant discrepancies. These results suggest that peer review data are influenced by the design of the tool used including PACS integration, randomization, and blinding.
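The "fivefold" figure in this abstract follows directly from the two reported per-radiologist rates; a trivial check using the values quoted above:

```python
# Mean significant-discrepancy rate per radiologist (from the abstract)
rate_before = 0.19  # percent, before PACS-integrated peer review
rate_after = 0.93   # percent, after implementation
fold_change = rate_after / rate_before
print(round(fold_change, 1))  # 4.9 -- roughly fivefold
```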