1. Morales Santos Á, Del Cura Rodríguez JL, Antúnez Larrañaga N. Teleradiology: good practice guide. Radiologia 2023; 65:133-148. [PMID: 37059579] [DOI: 10.1016/j.rxeng.2022.11.005]
Abstract
Teleradiology is the electronic transmission of radiological images from one location to another, primarily for diagnostic interpretation or consultation, and it must be governed by codes of conduct agreed upon by professional societies. This guide analyzes the content of fourteen teleradiology best-practice guidelines. Their guiding principles are: the best interest and benefit of the patient; quality and safety standards equivalent to those of the local radiology service; and use of teleradiology as a complement to and support for that service. Legal obligations include guaranteeing patients' rights by applying the principle of the patient's country of origin and establishing requirements for international teleradiology and civil liability insurance. Regarding the radiological process: integration with the local service's workflow, guaranteed quality of images and reports, access to previous studies and reports, and compliance with the principles of radiation protection. Regarding professional requirements: compliance with the required registrations, licenses, and qualifications; training and qualification of the radiologist and technician; prevention of fraudulent practices; respect for labor standards; and remuneration of the radiologist. Subcontracting must be justified, managing the risk of commoditization, and the system's technical standards must be met.
Affiliation(s)
- Á Morales Santos: Servicio de Radiología, Hospital Universitario Donostia, San Sebastián, Spain.

2. Telerradiología: guía de buenas prácticas [Teleradiology: good practice guide]. Radiologia 2023. [DOI: 10.1016/j.rx.2022.11.006]

3. Ludwig DR, Strnad BS, Bierhals AJ, Mellnick VM. Implementation of a peer-learning program in an academic abdominal radiology practice and comparison with a traditional peer-review system. Abdom Radiol (NY) 2022; 47:2509-2519. [PMID: 35482105] [DOI: 10.1007/s00261-022-03523-3]
Abstract
OBJECTIVE The purpose of this study was to transition from a traditional score-based peer-review system to an education-oriented peer-learning program in our academic abdominal radiology practice. MATERIALS AND METHODS This retrospective study compared our experience with a score-based peer-review model used before September 2020 and a peer-learning model implemented and used exclusively beginning in October 2020. In peer review, a web-based tool randomly generated a list of cases, which were blindly reviewed in consensus. Comparison of the consensus interpretation with the original report was used to categorize each reviewed case and to calculate rates of significant and minor discrepancies; only cases with a discrepancy were considered learning opportunities. In peer learning, faculty prospectively identified and submitted cases for review in several categories: interpretations with a discrepancy from a subsequent opinion or result, interpretations considered a great call, and interesting or challenging cases meriting further discussion. The peer-learning coordinator showed each case to the group in a manner that blinded the group to both the submitting and interpreting radiologists and invited discussion at various stages of the case. RESULTS During peer review, 172 cases were reviewed over 16 sessions between April 2016 and September 2020. Only 3 cases (1.8%) yielded significant discrepancies and 13 (7.6%) yielded minor discrepancies, for a total of 16 learning opportunities (3.6 per year). In peer learning, 64 cases were submitted and 52 reviewed over 7 sessions between October 2020 and October 2021: 29 (56%) were submitted as interesting or challenging cases meriting further discussion, 18 (35%) for a discrepancy, and 5 (10%) for a great call. All 52 presented cases represented learning opportunities (48 per year).
CONCLUSION An education-focused peer-learning program provided a platform for continuous quality improvement and yielded substantially more learning opportunities than score-based peer review.
Affiliation(s)
- Daniel R Ludwig, Benjamin S Strnad, Andrew J Bierhals, Vincent M Mellnick: Mallinckrodt Institute of Radiology, Washington University School of Medicine, 510 S. Kingshighway Blvd, Campus Box 8131, Saint Louis, MO, 63110, USA.

4. A Guide to Performance Evaluation for the Intensivist: Ongoing Professional Practice Evaluation and Focused Professional Practice Evaluation in the ICU. Crit Care Med 2021; 48:1521-1527. [PMID: 32750247] [DOI: 10.1097/ccm.0000000000004441]
Abstract
OBJECTIVES In 2008, The Joint Commission implemented a new standard mandating detailed evaluation of a provider's performance. The Ongoing Professional Practice Evaluation was designed to provide continuous rather than periodic performance evaluation, while the Focused Professional Practice Evaluation was designed to evaluate providers who are new to the medical staff or who are requesting new privileges. To date, we have been unable to find critical-care-specific literature on the implementation of Ongoing Professional Practice Evaluation/Focused Professional Practice Evaluation. The purpose of this concise definitive review is to familiarize the reader with The Joint Commission standards and their application to Ongoing Professional Practice Evaluation/Focused Professional Practice Evaluation design and implementation, to review the literature in the noncritical care setting, and to discuss future process optimization and automation. DATA SOURCES Studies were identified through a MEDLINE search using a variety of search phrases related to Ongoing Professional Practice Evaluation, Focused Professional Practice Evaluation, critical care medicine, healthcare quality, and The Joint Commission. Additional articles were identified through review of the reference lists of identified articles. STUDY SELECTION Original articles, review articles, and systematic reviews were considered. DATA EXTRACTION Manuscripts were selected for inclusion based on expert opinion of well-designed or key studies and review articles. DATA SYNTHESIS There are limited data on the implementation of Ongoing Professional Practice Evaluation and Focused Professional Practice Evaluation in critical care medicine. Key recommendations exist from The Joint Commission but leave it to healthcare institutions to implement them; the process and metrics can be tailored to specific institutions and departments.
CONCLUSIONS Currently, there is no standard process for developing Ongoing Professional Practice Evaluation and Focused Professional Practice Evaluation processes in critical care medicine. Departments and institutions can tailor metrics and processes, but it might be useful to standardize some metrics to assure the overall quality of care. In the future, newer technologies such as software applications may make this process less time-intensive.
5. Peer Review to Peer Learning in Radiology: Where Have We Been, What Have We Learned and Where Are We Headed? Curr Radiol Rep 2018. [DOI: 10.1007/s40134-018-0292-6]

6. Harvey HB, Alkasab TK, Prabhakar AM, Halpern EF, Rosenthal DI, Pandharipande PV, Gazelle GS. Radiologist Peer Review by Group Consensus. J Am Coll Radiol 2016; 13:656-62. [DOI: 10.1016/j.jacr.2015.11.013]

7. O'Keeffe MM, Davis TM, Siminoski K. Performance results for a workstation-integrated radiology peer review quality assurance program. Int J Qual Health Care 2016; 28:294-8. [PMID: 26892609] [DOI: 10.1093/intqhc/mzw017]
Abstract
OBJECTIVE To assess review completion rates, RADPEER score distribution, and sources of disagreement when using a workstation-integrated radiology peer review program, and to evaluate radiologist perceptions of the program. DESIGN Retrospective review of prospectively collected data. SETTING Large private outpatient radiology practice. PARTICIPANTS Radiologists (n = 66) with a mean of 16.0 (standard deviation, 9.2) years of experience. INTERVENTIONS Prior studies and reports of cases being actively reported were randomly selected for peer review using the RADPEER scoring system (a 4-point scale, with a score of 1 indicating agreement and scores of 2-4 indicating increasing levels of disagreement). MAIN OUTCOME MEASURES Assigned peer review completion rates, review scores, sources of disagreement and radiologist survey responses. RESULTS Of 31 293 assigned cases, 29 044 (92.8%; 95% CI 92.5-93.1%) were reviewed. Discrepant scores (score = 2, 3 or 4) were given in 0.69% (95% CI 0.60-0.79%) of cases and clinically significant discrepancy (score = 3 or 4) was assigned in 0.42% (95% CI 0.35-0.50%). The most common cause of disagreement was missed diagnosis (75.2%; 95% CI 66.8-82.1%). By anonymous survey, 94% of radiologists felt that peer review was worthwhile, 90% reported that the scores they received were appropriate and 78% felt that the received feedback was valuable. CONCLUSION Workstation-based peer review can increase completion rates and levels of radiologist acceptance while producing RADPEER scores similar to those previously reported. This approach may be one way to increase radiologist engagement in peer review quality assurance.
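The discrepancy rates and intervals reported above are easy to sanity-check. The sketch below computes a simple normal-approximation (Wald) confidence interval for the discrepancy proportion; the discrepant-case count of 200 is inferred from 0.69% of 29,044 (the abstract does not state it directly), and the published interval was presumably derived with a different method (e.g. exact binomial), so the endpoints may differ in the last digit.

```python
import math

def discrepancy_rate_ci(discrepant: int, reviewed: int, z: float = 1.96):
    """Return (proportion, CI lower, CI upper) using a normal approximation."""
    p = discrepant / reviewed
    se = math.sqrt(p * (1 - p) / reviewed)  # standard error of a proportion
    return p, p - z * se, p + z * se

# ~200 of 29,044 reviewed cases scored 2-4 (discrepant) in the study above.
p, lo, hi = discrepancy_rate_ci(200, 29044)
print(f"{p:.2%} (95% CI {lo:.2%}-{hi:.2%})")  # prints: 0.69% (95% CI 0.59%-0.78%)
```

This reproduces the reported 0.69% point estimate; the small mismatch at the interval endpoints (0.59-0.78% vs. the published 0.60-0.79%) is within one rounding step of the Wald approximation.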
Affiliation(s)
- Margaret M O'Keeffe, Kerry Siminoski: Department of Radiology and Diagnostic Imaging, University of Alberta, and Medical Imaging Consultants, 11010-101 Street, Edmonton, Alberta, Canada T5H 4B9.
- Todd M Davis: Intelerad, Montreal, Quebec, Canada (present address: 295 Midpark Way SE, Suite 380, Calgary, Alberta, Canada T2X 2A8).

8. Abujudeh H, Pyatt RS, Bruno MA, Chetlen AL, Buck D, Hobbs SK, Roth C, Truwit C, Agarwal R, Kennedy ST, Glenn L. RADPEER Peer Review: Relevance, Use, Concerns, Challenges, and Direction Forward. J Am Coll Radiol 2014; 11:899-904. [DOI: 10.1016/j.jacr.2014.02.004]

9. Alkasab TK, Harvey HB, Gowda V, Thrall JH, Rosenthal DI, Gazelle GS. Consensus-oriented group peer review: a new process to review radiologist work output. J Am Coll Radiol 2013; 11:131-8. [PMID: 24139321] [DOI: 10.1016/j.jacr.2013.04.013]
Abstract
The Joint Commission and other regulatory bodies have mandated that health care organizations implement processes for ongoing physician performance review. Software solutions, such as RADPEER™, have been created to meet this need efficiently. However, the authors believe that available systems are not optimally designed to produce changes in practice and overlook many important aspects of quality by excessive focus on diagnosis. The authors present a new model of peer review known as consensus-oriented group review, which is based on group discussion of cases in a conference setting and places greater emphasis on feedback than traditional systems of radiology peer review. By focusing on the process of peer review, consensus-oriented group review is intended to optimize performance improvement and foster group standards of practice. The authors also describe the software tool developed to implement this process of enriched peer review.
Affiliation(s)
- Tarik K Alkasab, James H Thrall, Daniel I Rosenthal: Department of Radiology, Massachusetts General Hospital, Boston, Massachusetts; Harvard Medical School, Boston, Massachusetts.
- H Benjamin Harvey: Department of Radiology, Massachusetts General Hospital, Boston, Massachusetts.
- Vrushab Gowda: Institute for Technology Assessment, Massachusetts General Hospital, Boston, Massachusetts.
- G Scott Gazelle: Department of Radiology and Institute for Technology Assessment, Massachusetts General Hospital, Boston, Massachusetts; Harvard Medical School, Boston, Massachusetts.

10. O'Keeffe MM, Davis TM, Siminoski K. A workstation-integrated peer review quality assurance program: pilot study. BMC Med Imaging 2013; 13:19. [PMID: 23822583] [PMCID: PMC3711932] [DOI: 10.1186/1471-2342-13-19]
Abstract
Background Consistency of assessments between radiologists has become the accepted surrogate indicator of radiological excellence, and peer review has become the standard technique for evaluating concordance. This study describes the results of a workstation-integrated peer review program in a busy outpatient radiology practice. Methods Workstation-based peer review was performed using the software program Intelerad Peer Review. Cases for review were randomly chosen from those being actively reported. If an appropriate prior study was available, and if the reviewing radiologist and the original interpreting radiologist had not exceeded review targets, the case was scored using the modified RADPEER system. Results There were 2,241 cases randomly assigned for peer review, of which 1,705 (76%) were interpreted. Reviewing radiologists agreed with prior reports in 99.1% of assessments. Positive feedback (score 0) was given in three cases (0.2%), and concordance (scores of 0 to 2) was assigned in 99.4%, similar to reported rates of 97.0% to 99.8%. Clinically significant discrepancies (scores of 3 or 4) were identified in 10 cases (0.6%). Eighty-eight percent of reviewed radiologists found the reviews worthwhile, 79% found the scores appropriate, and 65% felt the feedback was appropriate. Two-thirds of radiologists found case rounds discussing significant discrepancies to be valuable. Conclusions The workstation-based computerized peer review process used in this pilot project was seamlessly incorporated into the normal workday and met most criteria for an ideal peer review system. Clinically significant discrepancies were identified in 0.6% of cases, similar to published outcomes using the RADPEER system. Reviewed radiologists felt the process was worthwhile.
Affiliation(s)
- Margaret M O'Keeffe: Department of Radiology and Diagnostic Imaging, University of Alberta, and Medical Imaging Consultants, 11010-101 Street, Edmonton, AB T5H 4B9, Canada.

11. Silva E, Breslau J, Barr RM, Liebscher LA, Bohl M, Hoffman T, Boland GWL, Sherry C, Kim W, Shah SS, Tilkin M. ACR white paper on teleradiology practice: a report from the Task Force on Teleradiology Practice. J Am Coll Radiol 2013; 10:575-85. [PMID: 23684535] [DOI: 10.1016/j.jacr.2013.03.018]
Abstract
Teleradiology services are now embedded into the workflow of many radiology practices in the United States, driven largely by an expanding corporate model of services. This has brought opportunities and challenges to both providers and recipients of teleradiology services and has heightened the need to create best-practice guidelines for teleradiology to ensure patient primacy. To this end, the ACR Task Force on Teleradiology Practice has created this white paper to update the prior ACR communication on teleradiology and discuss the current and possible future state of teleradiology in the United States. This white paper proposes comprehensive best-practice guidelines for the practice of teleradiology, with recommendations offered regarding future actions.
12. Assessing physician competency: an update on the Joint Commission requirement for ongoing and focused professional practice evaluation. Adv Anat Pathol 2012; 19:388-400. [PMID: 23060064] [DOI: 10.1097/pap.0b013e318273f97e]
Abstract
In 2008, the Joint Commission created 2 new hospital standards that revolve around competency assessment of credentialed and privileged healthcare practitioners: Ongoing Professional Practice Evaluation (OPPE) and Focused Professional Practice Evaluation (FPPE). As many pathologists work as members of a hospital medical staff, either through primary employment or contract relationships, pathology departments and groups need to have OPPE and FPPE policies in place and should be using them to evaluate physicians for competency as part of a regular cycle. There are many subtleties in the standards, and careful attention to the details of the policies is essential. Furthermore, because credentialing and privileging decisions may be made based upon the assessments carried out in OPPE and FPPE, it is quite important to follow internal policies consistently. This review describes OPPE and FPPE in detail, with an analysis of the Standards and the Elements of Performance. It also provides scenarios to illustrate the concepts, along with charts that can be used to create OPPE and FPPE documents for a practice.