1. Luo M, Berkowitz S, Nguyen Q, Yam CS, Faintuch S, Ahmed M, Collares F, Sarwar A, Weinstein JL, Brook OR. Electronic IR Group Peer Review and Learning Performed during Daily Clinical Rounds. J Vasc Interv Radiol 2019; 30:594-600. DOI: 10.1016/j.jvir.2018.09.005
2. Kruskal JB, Eisenberg RL, Ahmed M, Siewert B. Ongoing Professional Practice Evaluation of Radiologists: Strategies and Tools for Simplifying a Complex Process. Radiographics 2018; 38:1593-1608. DOI: 10.1148/rg.2018180163
Affiliation(s)
- Jonathan B. Kruskal, Ronald L. Eisenberg, Muneeb Ahmed, Bettina Siewert: Department of Radiology, Beth Israel Deaconess Medical Center, One Deaconess Rd, Boston, MA 02215
3. Loftus ML. OPPE, FPPE, QPS, and why the alphabet soup of physician assessment is essential for safer patient care. Clin Imaging 2018; 47:v-vii. DOI: 10.1016/j.clinimag.2017.11.003
4. Automated annotation and classification of BI-RADS assessment from radiology reports. J Biomed Inform 2017; 69:177-187. PMID: 28428140. DOI: 10.1016/j.jbi.2017.04.011
Abstract
The Breast Imaging Reporting and Data System (BI-RADS) was developed to reduce variation in the descriptions of findings. Manual analysis of breast radiology report data is challenging but is necessary for clinical and healthcare quality assurance activities. The objective of this study is to develop a natural language processing (NLP) system for automated extraction of BI-RADS categories from breast radiology reports. We evaluated an existing rule-based NLP algorithm, and then we developed and evaluated our own method using a supervised machine learning approach. We divided the BI-RADS category extraction task into two specific tasks: (1) annotation of all BI-RADS category values within a report, and (2) classification of the laterality of each BI-RADS category value. We used one algorithm for task 1 and evaluated three algorithms for task 2. Across all evaluations and model training, we used a total of 2159 radiology reports from 18 hospitals, from 2003 to 2015. Performance with the existing rule-based algorithm was not satisfactory. Conditional random fields showed high performance for task 1, with an F1 measure of 0.95. The rules-from-partial-decision-trees (PART) algorithm showed the best performance across classes for task 2, with a weighted F1 measure of 0.91 for BI-RADS 0-6 and 0.93 for BI-RADS 3-5. Classification performance by class improved for all classes from Naïve Bayes to support vector machine (SVM), and again from SVM to PART. Our system is able to annotate and classify all BI-RADS mentions present in a single radiology report and can serve as the foundation for future studies that will leverage automated BI-RADS annotation to provide feedback to radiologists as part of a learning health system loop.
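The two-task pipeline described in this abstract (sequence labeling of BI-RADS mentions, then laterality classification of each mention) can be sketched with off-the-shelf tools. The snippet below is a minimal illustration only, not the authors' system: it assumes the sklearn-crfsuite package for task 1 and uses a scikit-learn decision tree as a stand-in for the Weka PART rule learner in task 2; the features, labels, and training sentences are invented.

```python
# Minimal sketch of the two-task pipeline described in the abstract.
# Assumptions: sklearn-crfsuite for task 1 (mention annotation) and a
# scikit-learn decision tree as a stand-in for Weka's PART rule learner
# for task 2 (laterality classification). All data shown is invented.
import sklearn_crfsuite
from sklearn.feature_extraction import DictVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

def token_features(tokens, i):
    """Simple per-token features for the CRF (task 1)."""
    tok = tokens[i]
    return {
        "lower": tok.lower(),
        "is_digit": tok.isdigit(),
        "prev": tokens[i - 1].lower() if i > 0 else "<s>",
        "next": tokens[i + 1].lower() if i < len(tokens) - 1 else "</s>",
    }

# Task 1: label each token of a report sentence as part of a BI-RADS
# category mention (BIO tags) or not.
sentences = [["BI-RADS", "category", "2", "benign", "finding", "left", "breast"]]
labels = [["B-BR", "I-BR", "I-BR", "O", "O", "O", "O"]]
X_crf = [[token_features(s, i) for i in range(len(s))] for s in sentences]
crf = sklearn_crfsuite.CRF(algorithm="lbfgs", max_iterations=50)
crf.fit(X_crf, labels)

# Task 2: classify the laterality of each extracted mention from
# bag-of-words features of its surrounding text.
mention_contexts = [{"left": 1, "breast": 1}, {"right": 1, "mass": 1}]
laterality = ["left", "right"]
clf = make_pipeline(DictVectorizer(), DecisionTreeClassifier(max_depth=5))
clf.fit(mention_contexts, laterality)

print(crf.predict(X_crf), clf.predict([{"left": 1}]))
```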
5. Larson DB, Donnelly LF, Podberesky DJ, Merrow AC, Sharpe RE, Kruskal JB. Peer Feedback, Learning, and Improvement: Answering the Call of the Institute of Medicine Report on Diagnostic Error. Radiology 2017; 283:231-241. DOI: 10.1148/radiol.2016161254
Affiliation(s)
- David B. Larson: Department of Radiology, Stanford University School of Medicine, 300 Pasteur Dr, Stanford, CA 94305-5105
- Lane F. Donnelly: Texas Children's Hospital, Houston, Tex
- Daniel J. Podberesky: Nemours Children's Health System, Orlando, Fla
- Arnold C. Merrow: Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio
- Richard E. Sharpe: Kaiser Permanente, Denver, Colo
- Jonathan B. Kruskal: Beth Israel Deaconess Medical Center, Boston, Mass
6. Walker EA, Petscavage-Thomas JM, Fotos JS, Bruno MA. Quality metrics currently used in academic radiology departments: results of the QUALMET survey. Br J Radiol 2017; 90:20160827. PMID: 28118038. DOI: 10.1259/bjr.20160827
Abstract
OBJECTIVE: We present the results of the 2015 quality metrics (QUALMET) survey, which was designed to assess the commonalities and variability of selected quality and productivity metrics currently employed by a large sample of academic radiology departments representing all regions of the USA.
METHODS: The survey of key radiology metrics was distributed in March-April 2015 via personal e-mail to 112 academic radiology departments.
RESULTS: There was a 34.8% institutional response rate. We found that most academic departments of radiology commonly utilize metrics of hand hygiene, report turnaround time (RTAT), relative value unit (RVU) productivity, patient satisfaction and participation in peer review. RTAT targets were found to vary widely. The implementation of radiology peer review, the variety of ways in which peer review results are used within academic radiology departments, the use of clinical decision support tools and the requirements for radiologist participation in Maintenance of Certification also varied. Policies for hand hygiene and critical results communication were very similar across all reporting institutions, and most departments utilized some form of missed-case/difficult-case conference as part of their quality and safety programme, as well as some form of periodic radiologist performance review.
CONCLUSION: Results of the QUALMET survey suggest many similarities in tracking and utilization of the selected quality and productivity metrics included in our survey. Use of quality indicators is not a fully standardized process among academic radiology departments.
ADVANCES IN KNOWLEDGE: This article examines the current quality and productivity metrics in academic radiology.
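Among the metrics surveyed here, report turnaround time is the most directly computable from routine data. Purely as a hedged illustration (the timestamps, column names, and percentile choice are assumptions, not taken from the survey), RTAT can be summarized as follows:

```python
# Illustrative RTAT summary from exam-completion and report-finalization
# timestamps; the column names, values, and 90th-percentile target are
# assumptions made for this sketch only.
import pandas as pd

reports = pd.DataFrame({
    "exam_completed":   ["2015-03-01 08:05", "2015-03-01 09:10", "2015-03-01 10:00"],
    "report_finalized": ["2015-03-01 09:00", "2015-03-01 12:40", "2015-03-01 10:45"],
})
for col in ("exam_completed", "report_finalized"):
    reports[col] = pd.to_datetime(reports[col])

rtat_hours = (reports["report_finalized"] - reports["exam_completed"]).dt.total_seconds() / 3600
print(f"median RTAT: {rtat_hours.median():.1f} h, 90th percentile: {rtat_hours.quantile(0.9):.1f} h")
```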
Affiliation(s)
- Eric A. Walker: Department of Radiology, Penn State Milton S. Hershey Medical Center, Hershey, PA, USA, and Department of Radiology and Nuclear Medicine, Uniformed Services University of the Health Sciences, Bethesda, MD, USA
- Joseph S. Fotos, Michael A. Bruno: Department of Radiology, Penn State Milton S. Hershey Medical Center, Hershey, PA, USA
7.
Abstract
OBJECTIVE: The purpose of this article is to introduce the reader to basic concepts of quality and safety in radiology.
CONCLUSION: Concepts are introduced that are key to identifying, understanding, and utilizing certain quality tools with the aim of making process improvements. Challenges, opportunities, and change drivers can be mapped from the radiology quality perspective. Best practices, informatics, and benchmarks can profoundly affect the outcome of the quality improvement initiatives we all aim to achieve.
8. O'Keeffe MM, Davis TM, Siminoski K. Performance results for a workstation-integrated radiology peer review quality assurance program. Int J Qual Health Care 2016; 28:294-8. PMID: 26892609. DOI: 10.1093/intqhc/mzw017
Abstract
OBJECTIVE: To assess review completion rates, RADPEER score distribution, and sources of disagreement when using a workstation-integrated radiology peer review program, and to evaluate radiologist perceptions of the program.
DESIGN: Retrospective review of prospectively collected data.
SETTING: Large private outpatient radiology practice.
PARTICIPANTS: Radiologists (n = 66) with a mean of 16.0 (standard deviation, 9.2) years of experience.
INTERVENTIONS: Prior studies and reports of cases being actively reported were randomly selected for peer review using the RADPEER scoring system (a 4-point scale, with a score of 1 indicating agreement and scores of 2-4 indicating increasing levels of disagreement).
MAIN OUTCOME MEASURES: Assigned peer review completion rates, review scores, sources of disagreement and radiologist survey responses.
RESULTS: Of 31,293 assigned cases, 29,044 (92.8%; 95% CI 92.5-93.1%) were reviewed. Discrepant scores (score of 2, 3 or 4) were given in 0.69% (95% CI 0.60-0.79%) of cases, and a clinically significant discrepancy (score of 3 or 4) was assigned in 0.42% (95% CI 0.35-0.50%). The most common cause of disagreement was missed diagnosis (75.2%; 95% CI 66.8-82.1%). By anonymous survey, 94% of radiologists felt that peer review was worthwhile, 90% reported that the scores they received were appropriate and 78% felt that the received feedback was valuable.
CONCLUSION: Workstation-based peer review can increase completion rates and levels of radiologist acceptance while producing RADPEER scores similar to those previously reported. This approach may be one way to increase radiologist engagement in peer review quality assurance.
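The rates reported above are binomial proportions with 95% confidence intervals. As a worked illustration only, the interval for the overall discrepancy rate can be approximated as below; the count of discrepant reviews is back-calculated from the reported 0.69% of 29,044 cases, and the Wilson method is an assumption rather than the method stated in the paper.

```python
# Reproducing a rate such as "0.69% (95% CI 0.60-0.79%)" from counts.
# The count of ~200 discrepant reviews is back-calculated from the
# reported 0.69% of 29,044 reviewed cases, so it is approximate, and
# the Wilson interval is an assumed (not stated) method choice.
from statsmodels.stats.proportion import proportion_confint

reviewed = 29044
discrepant = 200  # approx. 0.69% of reviewed cases
rate = discrepant / reviewed
low, high = proportion_confint(discrepant, reviewed, alpha=0.05, method="wilson")
print(f"discrepancy rate {rate:.2%}, 95% CI {low:.2%}-{high:.2%}")
```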
Affiliation(s)
- Margaret M. O'Keeffe, Kerry Siminoski: Department of Radiology and Diagnostic Imaging, University of Alberta, and Medical Imaging Consultants, 11010-101 Street, Edmonton, Alberta, Canada T5H 4B9
- Todd M. Davis: Intelerad, Montreal, Quebec, Canada (present address: 295 Midpark Way SE, Suite 380, Calgary, Alberta, Canada T2X 2A8)
9. Stanescu AL, Parisi MT, Weinberger E, Ferguson MR, Otto RK, Iyer RS. Peer Review: Lessons Learned in a Pediatric Radiology Department. Curr Probl Diagn Radiol 2015; 45:139-48. PMID: 26489791. DOI: 10.1067/j.cpradiol.2015.09.001
Abstract
The purpose of this article is to illustrate the types of diagnostic errors and feedback given to radiologists, using cases to support and clarify these categories. A comment-enhanced peer review system may be leveraged to generate a comprehensive feedback categorization scheme. The categories include errors of observation, errors of interpretation, inadequate patient data gathering, errors of communication, interobserver variability, informational feedback, and compliments. Much of this feedback is captured through comments associated with interpretative agreements.
Affiliation(s)
- A. Luana Stanescu, Marguerite T. Parisi, Edward Weinberger, Mark R. Ferguson, Randolph K. Otto, Ramesh S. Iyer: Department of Radiology, Seattle Children's Hospital, University of Washington, Seattle, WA
10. Blankenship JC, Rosenfield K, Jennings HS. Privileging and credentialing for interventional cardiology procedures. Catheter Cardiovasc Interv 2015; 86:655-63. PMID: 25534235. DOI: 10.1002/ccd.25793
Abstract
Local, institution-specific credentialing and privileging for procedures is an important process for ensuring the quality of care provided by interventional cardiologists. Recently revised standards for coronary intervention and the blossoming of structural heart disease programs have generated controversy over these processes. How standards are set for credentialing and privileging is poorly understood by most interventional cardiologists, including those responsible for credentialing and privileging. Requirements from The Joint Commission dictate how credentialing and privileging are performed at the hospitals it accredits. Physicians must be recredentialed every 2 years at each hospital, with privileges renewed at that time. Hospitals must review the quality of physicians even more frequently using Ongoing Professional Practice Evaluations, and must also evaluate the performance of physicians when they join a hospital staff or when they begin performing new procedures using Focused Professional Practice Evaluations. Cardiology department directors and catheterization laboratory directors are responsible for recredentialing and reprivileging members of their departments. Individual physicians are responsible for cooperating with these processes, and for periodic recertification with specialty boards and governmental agencies. We provide specific guidance to help physicians navigate these processes.
Affiliation(s)
- Kenneth Rosenfield: Department of Cardiology, Massachusetts General Hospital, Boston, Massachusetts
- Henry S. Jennings: Vanderbilt Heart and Vascular Institute, Vanderbilt University School of Medicine, Nashville, Tennessee
11. Brook OR, Romero J, Brook A, Kruskal JB, Yam CS, Levine D. The Complementary Nature of Peer Review and Quality Assurance Data Collection. Radiology 2015; 274:221-9. DOI: 10.1148/radiol.14132931
12. Alkasab TK, Harvey HB, Gowda V, Thrall JH, Rosenthal DI, Gazelle GS. Consensus-oriented group peer review: a new process to review radiologist work output. J Am Coll Radiol 2013; 11:131-8. PMID: 24139321. DOI: 10.1016/j.jacr.2013.04.013
Abstract
The Joint Commission and other regulatory bodies have mandated that health care organizations implement processes for ongoing physician performance review. Software solutions, such as RADPEER™, have been created to meet this need efficiently. However, the authors believe that available systems are not optimally designed to produce changes in practice and overlook many important aspects of quality by excessive focus on diagnosis. The authors present a new model of peer review known as consensus-oriented group review, which is based on group discussion of cases in a conference setting and places greater emphasis on feedback than traditional systems of radiology peer review. By focusing on the process of peer review, consensus-oriented group review is intended to optimize performance improvement and foster group standards of practice. The authors also describe the software tool developed to implement this process of enriched peer review.
Affiliation(s)
- Tarik K. Alkasab, James H. Thrall, Daniel I. Rosenthal: Department of Radiology, Massachusetts General Hospital, Boston, Massachusetts; Harvard Medical School, Boston, Massachusetts
- H. Benjamin Harvey: Department of Radiology, Massachusetts General Hospital, Boston, Massachusetts
- Vrushab Gowda: Institute for Technology Assessment, Massachusetts General Hospital, Boston, Massachusetts
- G. Scott Gazelle: Department of Radiology and Institute for Technology Assessment, Massachusetts General Hospital, Boston, Massachusetts; Harvard Medical School, Boston, Massachusetts
13. Duncan JR, Street M, Strother M, Picus D. Optimizing radiation use during fluoroscopic procedures: a quality and safety improvement project. J Am Coll Radiol 2013; 10:847-53. PMID: 24035122. DOI: 10.1016/j.jacr.2013.05.008
Abstract
PURPOSE: The ionizing radiation used during fluoroscopically guided medical interventions carries risk. The teams performing these procedures seek to minimize those risks while preserving each procedure's benefits. This report describes a data-driven optimization strategy.
METHODS: Manual and automated data capture systems were used to collect a series of different metrics, including fluoroscopy time, kerma-area product, and reference-point air kerma, from both adult and pediatric interventional radiologic procedures. Tools from statistical process control were used to identify opportunities for improvement and to assess which changes led to improvement.
RESULTS: Initial efforts focused on creating a system capable of reliably capturing fluoroscopy time from all interventional radiologic procedures. Ongoing data analysis and feedback to frontline teams led to the development of a manual workflow that reliably captured fluoroscopy time. Data capture was later supplemented by automatic capture of electronic records. This process exploited the standardized format (DICOM Structured Reporting) that newer fluoroscopy units use to record radiation metrics. Data analysis found marked differences between the imaging protocols used for adults and children. Revision of the adult protocols led to a stable twofold reduction in average exposure per adult procedure. Analysis of balancing measures found no impact on workflow.
CONCLUSIONS: A systematic approach to improving radiation use during procedures led to a substantial and sustained reduction in risk with no reduction in benefits. Data were readily captured by both manual and automated processes. Concepts from cognitive psychology and information theory provided a theoretical basis for both data analysis and improvement opportunities.
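The statistical-process-control tooling referred to above typically takes the form of a control chart of a per-procedure dose metric. The sketch below is a generic individuals (XmR) chart, not the authors' implementation, and the kerma-area-product values are invented for illustration.

```python
# Generic individuals (XmR) control chart for a per-procedure dose metric,
# in the spirit of the statistical process control described above.
# The kerma-area-product (KAP) values below are invented for illustration.
import numpy as np

kap = np.array([45.0, 52.0, 38.0, 61.0, 47.0, 95.0, 44.0, 50.0, 41.0, 48.0])  # Gy*cm^2

center = kap.mean()
mean_moving_range = np.abs(np.diff(kap)).mean()
sigma_hat = mean_moving_range / 1.128      # d2 constant for subgroups of size 2
ucl = center + 3 * sigma_hat               # upper control limit
lcl = max(center - 3 * sigma_hat, 0.0)     # dose cannot be negative

for i, value in enumerate(kap, start=1):
    flag = " <-- investigate" if value > ucl or value < lcl else ""
    print(f"procedure {i:2d}: KAP {value:6.1f} (limits {lcl:.1f}-{ucl:.1f}){flag}")
```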
Affiliation(s)
- James R. Duncan: Mallinckrodt Institute of Radiology, Washington University School of Medicine, St Louis, Missouri
14. Mucci B, Murray H, Downie A, Osborne K. Interrater variation in scoring radiological discrepancies. Br J Radiol 2013; 86:20130245. PMID: 23833035. DOI: 10.1259/bjr.20130245
Abstract
OBJECTIVE: Discrepancy meetings are an important aspect of clinical governance. The Royal College of Radiologists has published advice on how to conduct such meetings, suggesting that discrepancies be scored using the scale 0 = no error, 1 = minor error, 2 = moderate error and 3 = major error. We have noticed variation in the scores attributed to individual cases by radiologists and have sought to quantify the variation in scoring at our meetings.
METHODS: The scores from six discrepancy meetings, totalling 161 scored events, were collected. The reliability of scoring was measured using Fleiss' kappa, which calculates the degree of agreement in classification.
RESULTS: The number of cases rated at the six meetings ranged from 18 to 31 (mean 27). The number of raters ranged from 11 to 16 (mean 14). Only cases scored by all raters were included in the analysis. The Fleiss' kappa statistic ranged from 0.12 to 0.20, and the mean kappa for the six meetings was 0.17.
CONCLUSION: A kappa of 1.0 indicates perfect agreement above chance and 0.0 indicates agreement equal to chance. A rule of thumb is that a kappa ≥0.70 indicates adequate interrater agreement. Our mean result of 0.172 shows poor agreement between scorers. This could indicate a problem with the scoring system or may indicate a need for more formal training and agreement in how scores are applied.
ADVANCES IN KNOWLEDGE: Scoring of radiology discrepancies is highly subjective and shows poor interrater agreement.
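Fleiss' kappa, as used in this study, is available in standard statistics libraries. The snippet below is a minimal reproduction of the calculation using statsmodels; the score matrix is invented (the study's meetings had 11-16 raters per case), so it illustrates the computation only.

```python
# Fleiss' kappa for multiple radiologists scoring the same discrepancy
# cases on the 0-3 RCR scale. The scores below are invented for
# illustration and use statsmodels, not the authors' software.
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# rows = cases, columns = raters, values = assigned score (0-3)
scores = np.array([
    [1, 1, 2, 1],
    [0, 1, 0, 0],
    [2, 3, 2, 2],
    [1, 2, 2, 3],
    [0, 0, 1, 0],
])

table, _categories = aggregate_raters(scores)   # cases x categories count table
kappa = fleiss_kappa(table, method="fleiss")
print(f"Fleiss' kappa = {kappa:.2f}")
```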
Affiliation(s)
- B. Mucci: Department of Radiology, South Glasgow University Hospitals, Southern General Hospital, Glasgow, Scotland, UK
15. O'Keeffe MM, Davis TM, Siminoski K. A workstation-integrated peer review quality assurance program: pilot study. BMC Med Imaging 2013; 13:19. PMID: 23822583. PMCID: PMC3711932. DOI: 10.1186/1471-2342-13-19
Abstract
Background: The surrogate indicator of radiological excellence that has become accepted is consistency of assessments between radiologists, and the technique that has become the standard for evaluating concordance is peer review. This study describes the results of a workstation-integrated peer review program in a busy outpatient radiology practice.
Methods: Workstation-based peer review was performed using the software program Intelerad Peer Review. Cases for review were randomly chosen from those being actively reported. If an appropriate prior study was available, and if the reviewing radiologist and the original interpreting radiologist had not exceeded review targets, the case was scored using the modified RADPEER system.
Results: There were 2,241 cases randomly assigned for peer review. Of the selected cases, 1,705 (76%) were interpreted. Reviewing radiologists agreed with prior reports in 99.1% of assessments. Positive feedback (score 0) was given in three cases (0.2%) and concordance (scores of 0 to 2) was assigned in 99.4%, similar to reported rates of 97.0% to 99.8%. Clinically significant discrepancies (scores of 3 or 4) were identified in 10 cases (0.6%). Eighty-eight percent of reviewed radiologists found the reviews worthwhile, 79% found the scores appropriate, and 65% felt the feedback was appropriate. Two-thirds of radiologists found case rounds discussing significant discrepancies to be valuable.
Conclusions: The workstation-based computerized peer review process used in this pilot project was seamlessly incorporated into the normal workday and met most criteria for an ideal peer review system. Clinically significant discrepancies were identified in 0.6% of cases, similar to published outcomes using the RADPEER system. Reviewed radiologists felt the process was worthwhile.
Affiliation(s)
- Margaret M. O'Keeffe: Department of Radiology and Diagnostic Imaging, University of Alberta, and Medical Imaging Consultants, 11010-101 Street, Edmonton, AB T5H 4B9, Canada
16. Harned RK. Familiarity with current practices of granting and maintaining privileges in pediatric interventional radiology: a worldwide survey of the members of the Society for Pediatric Interventional Radiology (SPIR). Pediatr Radiol 2012; 42:1316-21. PMID: 22854847. DOI: 10.1007/s00247-012-2456-5
Abstract
BACKGROUND: Physician credentialing is a complex process driven by the demand for quality improvement in health care. In the U.S., the Joint Commission Standard of 2007 has tied hospital accreditation to credentialing through mandated use of the Focused Professional Practice Evaluation (FPPE) and the Ongoing Professional Practice Evaluation (OPPE).
OBJECTIVE: To assess pediatric interventional radiologists' knowledge of how institutions grant them privileges.
MATERIALS AND METHODS: Members of the Society for Pediatric Interventional Radiology (SPIR) were sent a web-based survey regarding credentialing.
RESULTS: Of 122 members from 19 countries, 81 (66%) responded, and of these 81, 59 (73%) were familiar with their hospital's privileging process. Of 49 U.S. respondents and 32 non-U.S. respondents, 37 (76%) and 17 (53%), respectively, stated that interventional radiology credentialing was different from diagnostic radiology credentialing. Of the 49 U.S. respondents, 24 (49%) reported an OPPE, and of the 32 non-U.S. respondents, 8 (25%) reported an ongoing evaluation. The U.S. OPPE is performed at shorter intervals than its international equivalent.
CONCLUSION: Four years after the Joint Commission defined the FPPE and OPPE, separate credentialing of pediatric interventional radiology from pediatric diagnostic radiology is more likely in the U.S. than internationally, and U.S. pediatric interventional radiologists are more likely to have a defined ongoing professional evaluation and to be evaluated every 6 months or more frequently. Many SPIR members do not know how they obtain privileges and/or are not knowingly subject to an OPPE. This lack of knowledge may affect future education of interventional radiologists as well as the definition of pediatric interventional radiology practices within individual institutions.
Affiliation(s)
- Roger K. Harned: Department of Radiology, Children's Hospital Colorado, University of Colorado School of Medicine, 13123 East 16th Avenue, B-125, Aurora, CO 80045, USA
17. Rubin DL. Informatics in radiology: Measuring and improving quality in radiology: meeting the challenge with informatics. Radiographics 2011; 31:1511-27. PMID: 21997979. DOI: 10.1148/rg.316105207
Abstract
Quality is becoming a critical issue for radiology. Measuring and improving quality is essential not only to ensure optimum effectiveness of care and comply with increasing regulatory requirements, but also to combat current trends leading to commoditization of radiology services. A key challenge to implementing quality improvement programs is to develop methods to collect knowledge related to quality care and to deliver that knowledge to practitioners at the point of care. There are many dimensions to quality in radiology that need to be measured, monitored, and improved, including examination appropriateness, procedure protocol, accuracy of interpretation, communication of imaging results, and measuring and monitoring performance improvement in quality, safety, and efficiency. Informatics provides the key technologies that can enable radiologists to measure and improve quality. However, few institutions recognize the opportunities that informatics methods provide to improve safety and quality. The information technology infrastructure in most hospitals is limited, and they have suboptimal adoption of informatics techniques. Institutions can tackle the challenges of assessing and improving quality in radiology by means of informatics.
Affiliation(s)
- Daniel L. Rubin: Department of Radiology, Stanford University, Richard M. Lucas Center, 1201 Welch Rd, Office P285, Stanford, CA 94305-5488, USA. dlrubin@stanford.edu
18. Duncan JR, Balter S, Becker GJ, Brady J, Brink JA, Bulas D, Chatfield MB, Choi S, Connolly BL, Dixon RG, Gray JE, Kee ST, Miller DL, Robinson DW, Sands MJ, Schauer DA, Steele JR, Street M, Thornton RH, Wise RA. Optimizing radiation use during fluoroscopic procedures: proceedings from a multidisciplinary consensus panel. J Vasc Interv Radiol 2011; 22:425-9. PMID: 21463753. DOI: 10.1016/j.jvir.2010.12.008
Affiliation(s)
- James R. Duncan: Mallinckrodt Institute of Radiology, Washington University School of Medicine, 510 S. Kingshighway Blvd., St. Louis, MO 63110, USA
19. Larson DB, Nance JJ. Rethinking Peer Review: What Aviation Can Teach Radiology about Performance Improvement. Radiology 2011; 259:626-32. DOI: 10.1148/radiol.11102222
20. The Role of RADPEER™ in the Joint Commission Ongoing Practice Performance Evaluation. J Am Coll Radiol 2011; 8:6-7. DOI: 10.1016/j.jacr.2010.08.025