1.
Keegan MT, Harman AE, McLoughlin TM, Macario A, Deiner SG, Gaiser RR, Warner DO, Suresh S, Sun H. Administration of the American Board of Anesthesiology's virtual APPLIED Examination: successes, challenges, and lessons learned. BMC Medical Education 2024; 24:749. PMID: 38992662; PMCID: PMC11241991; DOI: 10.1186/s12909-024-05694-7
Abstract
In response to the COVID-19 pandemic, the American Board of Anesthesiology transitioned from in-person to virtual administration of its APPLIED Examination, remotely assessing more than 3000 candidates for certification in 2021. Four hundred examiners were involved in delivering and scoring Standardized Oral Examinations (SOEs) and Objective Structured Clinical Examinations (OSCEs). More than 80% of candidates started their exams on time and stayed connected throughout without any problems. Only 74 (2.5%) SOE and 45 (1.5%) OSCE candidates required rescheduling due to technical difficulties. Among those who experienced "significant issues," concerns with the OSCE technical stations (interpretation of monitors and interpretation of echocardiograms) were reported most frequently (6% of candidates). In contrast, 23% of examiners "sometimes" lost connectivity during their multiple exam sessions, with effects ranging from minor inconvenience to inability to continue. Of the candidates, 84% (SOE) and 89% (OSCE) described "smooth" interactions with examiners and standardized patients/standardized clinicians, respectively; however, only 71% of SOE candidates and 75% of OSCE candidates considered themselves able to demonstrate their knowledge and skills without obstacles. Compared with their in-person experiences, approximately 40% of SOE examiners considered virtual evaluation more difficult and believed the remote format negatively affected their development as examiners. The virtual format was considered less secure by 56% of SOE and 40% of OSCE examiners. The retirement of exam materials used virtually, due to concern for compromise, had implications for subsequent exam development. The return to in-person exams in 2022 was prompted by multiple factors, especially concerns regarding standardization and security. The technology is not yet perfect, especially for testing in-person communication skills and displaying dynamic exam materials. Nevertheless, the American Board of Anesthesiology's experience demonstrated the feasibility of conducting large-scale, high-stakes oral and performance exams in a virtual format and highlighted the adaptability and dedication of candidates, examiners, and administering board staff.
Affiliation(s)
- Mark T Keegan
- Department of Anesthesiology and Perioperative Medicine, Mayo Clinic, Rochester, MN, 55905, USA
- Ann E Harman
- The American Board of Anesthesiology, 4200 Six Forks Rd, Suite 1100, Raleigh, NC, 27609, USA
- Thomas M McLoughlin
- Department of Anesthesiology, Lehigh Valley Health Network, Allentown, PA, 18103, USA
- Alex Macario
- Department of Anesthesiology, Perioperative and Pain Medicine, Stanford University, Stanford, CA, 94305, USA
- Stacie G Deiner
- Department of Anesthesiology, Dartmouth Hitchcock Medical Center, Lebanon, NH, 03756, USA
- Robert R Gaiser
- Department of Anesthesiology, Yale School of Medicine, New Haven, CT, 06510, USA
- David O Warner
- Department of Anesthesiology and Perioperative Medicine, Mayo Clinic, Rochester, MN, 55905, USA
- Santhanam Suresh
- Department of Anesthesiology, Northwestern University Feinberg School of Medicine, Chicago, IL, 60611, USA
- Huaping Sun
- The American Board of Anesthesiology, 4200 Six Forks Rd, Suite 1100, Raleigh, NC, 27609, USA
2.
Sadati L, Edalattalab F, Hajati N, Karami S, Bagheri AB, Bahri MH, Abjar R. OSABSS: An authentic examination for assessing basic surgical skills in surgical residents. Surg Open Sci 2024; 19:217-222. PMID: 38860004; PMCID: PMC11163168; DOI: 10.1016/j.sopen.2024.04.008
Abstract
Objectives This study aimed to develop and validate the OSABSS (Objective Structured Assessment of Basic Surgical Skills), a modified Objective Structured Clinical Examination (OSCE), to assess basic surgical skills in residents. Design A developmental study conducted in two phases: basic skills were identified through literature review and gap analysis, and the OSABSS was then designed as a modified OSCE. Setting This study took place at Alborz University of Medical Sciences in Iran. Interventions The OSABSS was created using Harden's OSCE methodology. Scenarios, checklists, and station configurations were developed through expert panels. The exam was piloted and implemented with residents as participants and faculty as evaluators. Participants 32 surgical residents in gynecology, general surgery, orthopedics, and neurosurgery participated; 22 faculty members served as evaluators. Primary and secondary outcome measures The primary outcome was OSABSS exam scores. Secondary outcomes were written exam scores and national residency entrance ranks. Main results The mean OSABSS score was 16.59 ± 0.19 across all stations. Criterion validity was demonstrated through correlations between OSABSS scores, written exam scores, and entrance ranks. Reliability was high, with a Cronbach's alpha of 0.87. No significant inter-rater score differences were found. Conclusions The rigorous OSABSS development process produced an exam demonstrating strong validity and reliability for assessing basic surgical skills. The comprehensive station variety evaluates diverse technical and non-technical competencies. Further research should expand participant samples across surgical disciplines.
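The reliability result above is reported as a Cronbach's alpha of 0.87. For readers unfamiliar with the index, a minimal sketch of how it is computed from a respondents-by-items score matrix; the `scores` data below are illustrative only, not taken from the study:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix."""
    k = scores.shape[1]                          # number of items (stations)
    item_vars = scores.var(axis=0, ddof=1)       # per-item sample variance
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Illustrative data: 6 residents scored on 4 stations (hypothetical values)
scores = np.array([
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 5],
    [3, 2, 3, 2],
], dtype=float)
print(round(cronbach_alpha(scores), 2))  # → 0.92
```

Alpha rises toward 1 as the stations rank residents consistently, which is the sense in which the OSABSS's 0.87 indicates high internal consistency.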
Affiliation(s)
- Leila Sadati
- Department of Operating Room, School of Paramedical Sciences, Alborz University of Medical Sciences, Karaj, Iran
- Fatemeh Edalattalab
- School of Paramedical Sciences, Alborz University of Medical Sciences, Karaj, Iran
- Niloofar Hajati
- Department of Operating Room, School of Paramedical Sciences, Alborz University of Medical Sciences, Karaj, Iran
- Sahar Karami
- Department of Operating Room, School of Paramedical Sciences, Alborz University of Medical Sciences, Karaj, Iran
- Medical Education Department, School of Medicine, Tehran University of Medical Sciences, Tehran, Iran
- Ali Baradaran Bagheri
- Department of Neurosurgery, School of Medicine, Shahid Madani Hospital, Alborz University of Medical Sciences, Karaj, Iran
- Mohammad Hadi Bahri
- Department of Surgery, Shahid Madani Hospital, School of Medicine, Alborz University of Medical Sciences, Karaj, Iran
- Rana Abjar
- Department of Operating Room, School of Paramedical Sciences, Alborz University of Medical Sciences, Karaj, Iran
3.
Burch V, McGuire J, Buch E, Sathekge M, M'bouaffou F, Senkubuge F, Fagan J. Feasibility and Acceptability of Web-Based Structured Oral Examinations for Postgraduate Certification: Mixed Methods Preliminary Evaluation. JMIR Form Res 2024; 8:e40868. PMID: 38064633; PMCID: PMC10919348; DOI: 10.2196/40868
Abstract
BACKGROUND The COVID-19 pandemic disrupted postgraduate certification examinations globally. The Colleges of Medicine of South Africa continued hosting certification examinations through the pandemic by effecting a rapid transition from in-person to web-based certification examinations. OBJECTIVE This formative evaluation explored candidates' acceptance of web-based structured oral examinations (SOEs) hosted via Zoom (Zoom Communications Inc). We also report the audiovisual quality and technical challenges encountered while using Zoom and candidates' overall experience with these examinations conducted during the early part of the COVID-19 pandemic. Additionally, performance in web-based certification examinations was compared with previous in-person certification examinations. METHODS This mixed methods, single-arm evaluation anonymously gathered candidates' perceptions of web-based SOE acceptability, audiovisual quality, and overall experience with Zoom using a web-based survey. Pass rates of web-based and previous in-person certification examinations were compared using chi-square tests with a Yates correction. A thematic analysis approach was adopted for qualitative data. RESULTS Between June 2020 and June 2021, 3105 candidates registered for certification examinations; 293 (9.4%) withdrew, 2812 (90.6%) wrote, 2799 (99.9%) passed, and 1525 (54.2%) were invited to a further web-based SOE. Examination participation was 96.2% (n=1467). During the first web-based examination cycle (2020), 542 (87.1%) of 622 web-based SOE candidates completed the web-based survey. They reported web-based SOEs as fair (374/542, 69%) and as adequately testing their clinical reasoning and insight (396/542, 73.1%). Few would have preferred real patient encounters (173/542, 31.9%) or in-person oral examinations (152/542, 28%). Most found Zoom acceptable (434/542, 80%) and fair (396/542, 73.1%) for hosting web-based SOEs. Web-based SOEs resulted in financial (434/542, 80%) and time (428/542, 79%) savings for candidates. Many (336/542, 62%) supported the ongoing use of web-based certification examinations. Only 169 technical challenges in using Zoom were reported, including connectivity-related issues, poor audio quality, and poor image quality. The thematic analysis identified 4 themes of positive and negative experiences related to web-based SOE station design and content, examination station environment, examiner-candidate interactions, and personal benefits for candidates. Our qualitative analysis identified 10 improvements for future web-based SOEs. Candidates achieved high pass rates in web-based certification examinations in 2020 (1583/1732, 91.39%) and 2021 (850/1067, 79.66%). These were significantly higher (2020: N=8635, χ²(1)=667, P<.001; 2021: N=7988, χ²(1)=178, P<.001) than the previous in-person certification examination pass rate of 58.23% (4030/6921; 2017-2019). CONCLUSIONS Web-based SOEs conducted by the Colleges of Medicine of South Africa during the COVID-19 pandemic were well received by candidates, and few technical difficulties were encountered while using Zoom. Better performance was observed in web-based examinations than in previous in-person certification examinations. These early findings support the ongoing use of this assessment method.
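The 2020 pass-rate comparison can be reconstructed from the reported counts (1583/1732 web-based vs. 4030/6921 in-person). A sketch using SciPy, whose `chi2_contingency` applies the Yates continuity correction to 2x2 tables when `correction=True`; the choice of SciPy is ours, not the paper's:

```python
from scipy.stats import chi2_contingency

# 2x2 table of pass/fail counts taken from the abstract:
# rows = [web-based 2020, in-person 2017-2019], cols = [pass, fail]
table = [[1583, 1732 - 1583],
         [4030, 6921 - 4030]]

# correction=True applies the Yates continuity correction (1 df)
chi2, p, dof, expected = chi2_contingency(table, correction=True)
print(round(chi2), dof, p < 0.001)  # chi-square ≈ 667 on 1 df, as reported
```

Recovering the published statistic (χ² ≈ 667) from the raw counts is a quick sanity check on the reported comparison.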
Affiliation(s)
- Vanessa Burch
- The Colleges of Medicine of South Africa, Cape Town, South Africa
- The University of Cape Town, Cape Town, South Africa
- Eric Buch
- The Colleges of Medicine of South Africa, Cape Town, South Africa
- University of Pretoria, Pretoria, South Africa
- Mike Sathekge
- The Colleges of Medicine of South Africa, Cape Town, South Africa
- University of Pretoria, Pretoria, South Africa
- Flavia Senkubuge
- The Colleges of Medicine of South Africa, Cape Town, South Africa
- University of Pretoria, Pretoria, South Africa
- Johannes Fagan
- The Colleges of Medicine of South Africa, Cape Town, South Africa
- The University of Cape Town, Cape Town, South Africa
4.
Sun H, Deiner SG, Harman AE, Isaak RS, Keegan MT. A comparison of the American Board of Anesthesiology's in-person and virtual objective structured clinical examinations. J Clin Anesth 2023; 91:111258. PMID: 37734196; DOI: 10.1016/j.jclinane.2023.111258
Abstract
BACKGROUND The American Board of Anesthesiology's Objective Structured Clinical Examination (OSCE), a component of its initial certification process, had been administered in person in a dedicated assessment center from its launch in 2018 until March 2020. Due to the COVID-19 pandemic, a virtual format of the exam was piloted in December 2020 and administered in 2021. This study aimed to compare candidate performance, examiner grading severity, and scenario difficulty between the two formats of the OSCE. METHODS The Many-Facet Rasch Model was used to estimate candidate performance, examiner grading severity, and scenario difficulty for the in-person and virtual OSCEs separately. The virtual OSCE was equated to the in-person OSCE through common examiners and common scenarios. An independent-samples t-test was used to compare candidate performance, and partially overlapping samples t-tests were applied to compare examiner grading severity and scenario difficulty between the in-person and virtual OSCEs. RESULTS The in-person (n = 3235) and virtual (n = 2934) first-time candidates were comparable in age, sex, race/ethnicity, and proportion of U.S. medical school graduates. The virtual scenarios (n = 35; mean ± SD, 0.21 ± 0.38 logits) were more difficult than the in-person scenarios (n = 93; 0.00 ± 0.69; Welch's partially overlapping samples t-test, p = 0.01); there were no statistically significant differences in examiner severity (n = 390, -0.01 ± 0.82 vs. n = 304, -0.02 ± 0.93; Welch's partially overlapping samples t-test, p = 0.81) or candidate performance (2.19 ± 0.93 vs. 2.18 ± 0.92; Welch's independent-samples t-test, p = 0.83) between the in-person and virtual OSCEs. CONCLUSIONS Our retrospective analyses of first-time OSCEs found comparable candidate performance and examiner grading severity between the in-person and virtual formats, despite the virtual scenarios being more difficult than the in-person scenarios. These results provide assurance that the virtual OSCE functioned reasonably well in a high-stakes setting.
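The candidate-performance comparison used a Welch's independent-samples t-test (the examiner and scenario comparisons used partially overlapping samples t-tests, which are specialized and not shown here). A sketch with simulated ability estimates drawn to match the reported means and SDs; the data are illustrative only, not the study's:

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
# Simulated candidate ability estimates in logits (illustrative only),
# drawn to match the reported group sizes, means, and SDs
in_person = rng.normal(2.19, 0.93, 3235)
virtual = rng.normal(2.18, 0.92, 2934)

# equal_var=False requests Welch's t-test (no equal-variance assumption)
t, p = ttest_ind(in_person, virtual, equal_var=False)
print(round(t, 2), round(p, 2))
```

With group means this close relative to their SDs, the test typically fails to reject equality, mirroring the paper's p = 0.83 for candidate performance.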
Affiliation(s)
- Huaping Sun
- The American Board of Anesthesiology, Raleigh, NC, USA
- Stacie G Deiner
- Department of Anesthesiology, Dartmouth Hitchcock Medical Center, Lebanon, NH, USA
- Ann E Harman
- The American Board of Anesthesiology, Raleigh, NC, USA
- Robert S Isaak
- Department of Anesthesiology, The University of North Carolina at Chapel Hill, Chapel Hill, NC, USA
- Mark T Keegan
- Department of Anesthesiology and Perioperative Medicine, Mayo Clinic, Rochester, MN, USA
5.
Chen D, Toutkoushian E, Sun H, Warner DO, Macario A, Deiner SG, Keegan MT. Career decisions, training priorities, and perceived challenges for anesthesiology residents in the United States. J Clin Anesth 2023; 89:111155. PMID: 37290294; DOI: 10.1016/j.jclinane.2023.111155
Abstract
STUDY OBJECTIVE This study sought to understand the timing of, and important factors identified by, residents' decisions to pursue a career in anesthesiology; the training areas deemed important to their future success; the perceived greatest challenges facing the profession of anesthesiology; and their post-residency plans. DESIGN The American Board of Anesthesiology administered voluntary, anonymous, repeated cross-sectional surveys to residents who began clinical anesthesia training in the U.S. from 2013 to 2016, following them yearly until the completion of their residency. The analyses included data from 12 surveys (4 cohorts across clinical anesthesia years 1 to 3), comprising multiple-choice questions, rankings, Likert scales, and free-text responses. Free responses were analyzed using an iterative inductive coding process to determine the main themes. MAIN RESULTS The overall response rate was 36% (6480 responses to 17,793 invitations). Forty-five percent of residents chose anesthesiology during the 3rd year of medical school. "Nature of the clinical practice of anesthesiology" was the most important factor influencing their decision (average ranking of 5.93 out of 8 factors, 1 [least important] to 8 [most important]), followed by "ability to use pharmacology to acutely manipulate physiology" (5.75) and "favorable lifestyle" (5.22). "Practice management" and "political advocacy for anesthesiologists" (average ratings 4.46 and 4.42, respectively, on a scale of 1 [very unimportant] to 5 [very important]) were considered the most important non-traditional training areas, followed by "anesthesiologists as leaders of the perioperative surgical home" (4.32), "structure and financing of the healthcare system" (4.27), and "principles of quality improvement" (4.26). Three of five residents desired to pursue a fellowship; pain medicine, pediatric anesthesiology, and cardiac anesthesiology were the most popular choices, each accounting for approximately 20% of prospective fellows. The perceived greatest challenges facing the profession included workforce competition from non-physician anesthesia providers and lack of advocacy for anesthesiologists' values (referenced by 96% of respondents), changes and uncertainty in healthcare systems (30%), and personal challenges such as psychological well-being (3%). CONCLUSIONS Most residents identified anesthesiology as their career choice during medical school. Interest in non-traditional subjects and fellowship training was common. Competition from non-physician providers, healthcare system changes, and compromised psychological well-being were perceived concerns.
Affiliation(s)
- Dandan Chen
- The American Board of Anesthesiology, Raleigh, NC, USA
- Huaping Sun
- The American Board of Anesthesiology, Raleigh, NC, USA
- David O Warner
- Department of Anesthesiology and Perioperative Medicine, Mayo Clinic, Rochester, MN, USA
- Alex Macario
- Department of Anesthesiology, Perioperative and Pain Medicine, Stanford University, Stanford, CA, USA
- Stacie G Deiner
- Department of Anesthesiology, Dartmouth Hitchcock Medical Center, Lebanon, NH, USA
- Mark T Keegan
- Department of Anesthesiology and Perioperative Medicine, Mayo Clinic, Rochester, MN, USA
6.
Staudenmann D, Waldner N, Lörwald A, Huwendiek S. Medical specialty certification exams studied according to the Ottawa Quality Criteria: a systematic review. BMC Medical Education 2023; 23:619. PMID: 37649019; PMCID: PMC10466740; DOI: 10.1186/s12909-023-04600-x
Abstract
BACKGROUND Medical specialty certification exams are high-stakes summative assessments used to determine which doctors have the necessary skills, knowledge, and attitudes to treat patients independently. Such exams are crucial for patient safety, candidates' career progression, and accountability to the public, yet they vary significantly among medical specialties and countries. It is therefore of paramount importance that the quality of specialty certification exams is studied in the scientific literature. METHODS In this systematic literature review, we used the PICOS framework and searched seven databases for papers concerning medical specialty certification exams published in English between 2000 and 2020, using a diverse set of search term variations. Papers were screened by two researchers independently and scored for methodological quality and relevance to this review. Finally, they were categorized by country, medical specialty, and the following seven Ottawa Criteria of good assessment: validity, reliability, equivalence, feasibility, acceptability, catalytic effect, and educational effect. RESULTS After removal of duplicates, 2852 papers were screened for inclusion, of which 66 met all relevant criteria. Over 43 different exams and more than 28 different specialties from 18 jurisdictions were studied. Around 77% of all eligible papers were based in English-speaking countries, with 55% of publications centered on just the UK and USA. General practice was the most frequently studied specialty among certification exams, with the UK general practice exam having been particularly broadly analyzed. Papers received an average of 4.2/6 points on the quality score. Eligible studies analyzed 2.1/7 Ottawa Criteria on average, the most frequently studied criteria being reliability, validity, and acceptability. CONCLUSIONS The present systematic review shows a growing number of studies analyzing medical specialty certification exams over time, encompassing a wider range of medical specialties, countries, and Ottawa Criteria. Because of its reliance on multiple assessment methods and data points, programmatic assessment suggests a promising way forward in the development of medical specialty certification exams that fulfill all seven Ottawa Criteria. Further research is needed to confirm these results, particularly analyses of examinations held outside the Anglosphere as well as studies analyzing entire certification exams or comparing multiple examination methods.
Affiliation(s)
- Noemi Waldner
- University of Bern, Institute for Medical Education, Bern, Switzerland
- Andrea Lörwald
- University of Bern, Institute for Medical Education, Bern, Switzerland
- Sören Huwendiek
- University of Bern, Institute for Medical Education, Bern, Switzerland
7.
Putnam EM, Baetzel AE, Leis A. Paediatric anaesthesiology education: simulation-based 'attending boot camp' for fellows shows feasibility and value in the early years of attendings' careers. BJA Open 2022; 4:100115. PMID: 37588785; PMCID: PMC10430843; DOI: 10.1016/j.bjao.2022.100115
Abstract
Background Established simulation-based 'boot camps' utilise adult learning theory to engage and teach technical and non-technical skills to medical graduates transitioning into residency or fellowship. However, the transition from trainee to the attending role has not been well studied. The primary aim of this study was to design and execute a simulation-based educational day, exposing senior trainees in paediatric anaesthesia to commonly encountered challenges and teaching critical technical skills relevant to their new role. Secondary aims included assessment of its value and relevance in early years of graduated fellows' new careers as attendings. Methods An 'attending boot camp' day comprised the following: two crisis simulations, an otolaryngologist-taught cadaver cricothyroidotomy laboratory, and a difficult conversations workshop. There was a debriefing after each section. Data were collected using end-of-day and early-career e-mail surveys for five consecutive fellow cohorts from 2016 to 2020. Results Forty fellows participated; overall feedback was positive. The end-of-day surveys revealed planned changes in practice for 89% (25/28) of fellows, and 54% (15/28) highlighted communication skills as 'most beneficial'. Early-career follow-up surveys found 96% (23/24) identified increased confidence in skill acquisition because of the day; 79% (19/24) experienced scenarios in real life similar to those simulated. The qualitative analysis revealed four high-value themes: delegation, leadership, clinical skills, and difficult communication. Conclusions The transition from senior trainee to attending physician remains under-researched. A tailored simulation-based 'attending boot camp' was feasible and valued and may be useful in bridging this transition. Participants identified leadership practice, life-saving technical skills, and difficult communication practice as valuable and relevant in their early careers.
Affiliation(s)
- Elizabeth M. Putnam
- Department of Anesthesiology, University of Michigan Health Systems, Ann Arbor, MI, USA
- Department of Learning Health Sciences, University of Michigan Health Systems, Ann Arbor, MI, USA
- Anne E. Baetzel
- Department of Anesthesiology, University of Michigan Health Systems, Ann Arbor, MI, USA
- Aleda Leis
- Department of Epidemiology, University of Michigan Health Systems, Ann Arbor, MI, USA
8.
Kinney CL, Raddatz MM, Robinson LR, Garrison CJ, Sabharwal S. Interrater Reliability in the American Board of Physical Medicine and Rehabilitation Part II Certification Examination: Impact of a New Assessment Design. Am J Phys Med Rehabil 2022; 101:468-472. PMID: 34347627; DOI: 10.1097/phm.0000000000001859
Abstract
OBJECTIVE The design of medical board certification examinations continues to evolve with advances in testing innovations and psychometric analysis. The potential for subjectivity is inherent in the design of oral board examinations, making improvements in reliability and validity especially important. The purpose of this quality improvement study was to analyze the impact of using two examiners on the overall reliability of the oral certification examination in physical medicine and rehabilitation. DESIGN This was a retrospective quality improvement study of 422 candidates for the American Board of Physical Medicine and Rehabilitation Part II Examination in 2020. Candidates were examined by examiner pairs, each of whom submitted independent scores. Training for all 116 examiners included examination case review, scoring guidelines, and bias mitigation. Examiner performance was analyzed for both internal consistency (intrarater reliability) and agreement with their paired examiner (interrater reliability). RESULTS The reliability of the Part II Examination was high, ranging from 0.93 to 0.94 over three administrations. The analysis also demonstrated high interrater agreement and examiner internal consistency. CONCLUSIONS A high degree of interrater agreement was found using a new, two-examiner format. Comprehensive examiner training is likely the most significant factor for this finding. The two-examiner format improved the overall reliability and validity of the Part II Examination.
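The abstract does not name the specific interrater agreement index used for the paired-examiner analysis. As one common choice, Cohen's kappa for a pair of raters can be sketched as follows; the ratings below are hypothetical, not study data:

```python
from collections import Counter

def cohen_kappa(r1, r2):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(r1)
    observed = sum(a == b for a, b in zip(r1, r2)) / n   # raw agreement
    c1, c2 = Counter(r1), Counter(r2)
    cats = set(r1) | set(r2)
    # Agreement expected by chance from each rater's marginal frequencies
    expected = sum((c1[c] / n) * (c2[c] / n) for c in cats)
    return (observed - expected) / (1 - expected)

# Hypothetical pass/fail ratings from one examiner pair (not study data)
e1 = ["pass", "pass", "fail", "pass", "fail", "pass", "pass", "fail"]
e2 = ["pass", "pass", "fail", "pass", "pass", "pass", "pass", "fail"]
print(round(cohen_kappa(e1, e2), 2))  # → 0.71
```

Kappa corrects raw percent agreement for chance, which matters when one category (e.g., "pass") dominates, as it typically does in certification examinations.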
Affiliation(s)
- Carolyn L Kinney
- From the American Board of Physical Medicine and Rehabilitation, Rochester, Minnesota (CLK, MMR); Department of Physical Medicine and Rehabilitation, Mayo Clinic, Phoenix, Arizona (CLK); Division of Physical Medicine and Rehabilitation, University of Toronto, Toronto, Ontario, Canada (LRR); Ascension Seton Healthcare Family, Dell Medical School, The University of Texas at Austin, Austin, Texas (CJG); and VA Boston Health Care System, Harvard Medical School, Boston, Massachusetts (SS)
9.
Keegan MT, McLoughlin TM, Patterson AJ, Fiadjoe JE, Pisacano MM, Warner DO, Sun H, Harman AE. A Coronavirus Disease 2019 Pandemic Pivot: Development of the American Board of Anesthesiology's Virtual APPLIED Examination. Anesth Analg 2021; 133:1331-1341. PMID: 34517394; DOI: 10.1213/ane.0000000000005750
Abstract
In 2020, the coronavirus disease 2019 (COVID-19) pandemic interrupted the administration of the APPLIED Examination, the final part of the American Board of Anesthesiology (ABA) staged examination system for initial certification. In response, the ABA developed, piloted, and implemented an Internet-based "virtual" form of the examination to allow administration of both components of the APPLIED Exam (Standardized Oral Examination and Objective Structured Clinical Examination) when it was impractical and unsafe for candidates and examiners to travel and have in-person interactions. This article describes the development of the ABA virtual APPLIED Examination, including its rationale, examination format, technology infrastructure, candidate communication, and examiner training. Although the logistics are formidable, we report a methodology for successfully introducing a large-scale, high-stakes, 2-element, remote examination that replicates previously validated assessments.
Affiliation(s)
- Mark T Keegan
- The American Board of Anesthesiology, Raleigh, North Carolina; Department of Anesthesiology and Perioperative Medicine, Mayo Clinic, Rochester, Minnesota
- Thomas M McLoughlin
- The American Board of Anesthesiology, Raleigh, North Carolina; Department of Anesthesiology, Lehigh Valley Health Network, Allentown, Pennsylvania
- Andrew J Patterson
- The American Board of Anesthesiology, Raleigh, North Carolina; Department of Anesthesiology, Emory University Hospital, Atlanta, Georgia
- John E Fiadjoe
- The American Board of Anesthesiology, Raleigh, North Carolina; Department of Anesthesiology, Boston Children's Hospital, Boston, Massachusetts
- Margaret M Pisacano
- The American Board of Anesthesiology, Raleigh, North Carolina; Office of Legal Counsel, University of Kentucky, Albert B. Chandler Hospital, Lexington, Kentucky
- David O Warner
- The American Board of Anesthesiology, Raleigh, North Carolina; Department of Anesthesiology and Perioperative Medicine, Mayo Clinic, Rochester, Minnesota
- Huaping Sun
- The American Board of Anesthesiology, Raleigh, North Carolina
- Ann E Harman
- The American Board of Anesthesiology, Raleigh, North Carolina
10.
Goudra B, Guthal A. US Residents' Perspectives on the Introduction, Conduct, and Value of American Board of Anesthesiology's Objective Structured Clinical Examination: Results of the 1st Nationwide Questionnaire Survey. Anesth Essays Res 2021; 15:87-100. PMID: 34667354; PMCID: PMC8462430; DOI: 10.4103/aer.aer_76_21
Abstract
Introduction: Passing the Objective Structured Clinical Examination (OSCE) is currently a requirement for the vast majority (though not all) of candidates seeking American Board of Anesthesiology (ABA) initial certification. Many publications from the ABA have attempted to justify its introduction, conduct, and value. However, the ABA has never attempted to understand the views of the residents. Methods: A total of 4237 residents at various training levels from 132 programs were surveyed, being asked to complete a Google questionnaire prospectively between March 8, 2021 and April 10, 2021. Every potential participant was sent an original email followed by 2 reminders. Results: The overall response rate was 17.26% (710 responses to 4112 invitations). On a 5-point Likert scale with 1 as "very inaccurate" and 5 as "very accurate," the mean accuracy of the OSCE in assessing communication skills and professionalism was 2.3 and 2.1, respectively. In terms of the usefulness of OSCE training for improving physicians' clinical practice, avoiding lawsuits, teaching effective communication with patients, and teaching effective communication with other providers, the means on a 5-point Likert scale with 1 as "not at all useful" and 5 as "very useful" were 1.86, 1.69, 1.79, and 1.82, respectively. Residents generally thought that factors such as culture, race/ethnicity, religion, and language adversely influence the assessment of communication skills; on a 5-point Likert scale with 1 as "not at all affected" and 5 as "very affected," the corresponding scores were 3.45, 3.19, 3.89, and 3.18, respectively. Interestingly, nationality and political affiliation were also thought to influence this assessment, although to a lesser extent. In addition, residents believed it was inappropriate to test non-cardiac anesthesiologists for transesophageal echocardiography (TEE) skills (2.39) but appropriate to test non-regional anesthesiologists for ultrasound skills (3.29). Lastly, nearly 80% of the residents thought that money was the primary motivating factor behind the ABA's introduction of the OSCE. Over 96% of residents thought that the OSCE should be stalled, either permanently scrapped (60.8%) or paused (35.8%). Conclusions: Anesthesiology residents in the United States overwhelmingly indicated that the OSCE does not serve any useful purpose and should be immediately halted.
Affiliation(s)
- Basavana Goudra
- Department of Anesthesiology and Critical Care Medicine, Perelman School of Medicine, Hospital of the University of Pennsylvania, Philadelphia, PA, USA
- Arjun Guthal
- Department of Molecular Biology, Princeton University, Princeton, NJ, USA
11
Wang T, Sun H, Zhou Y, Chen D, Harman AE, Isaak RS, Peterson-Layne C, Macario A, Fahy BG, Warner DO. Construct Validation of the American Board of Anesthesiology's APPLIED Examination for Initial Certification. Anesth Analg 2021; 133:226-232. [PMID: 33481404 DOI: 10.1213/ane.0000000000005364] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/05/2022]
Abstract
BACKGROUND The American Board of Anesthesiology administers the APPLIED Examination as part of initial certification; as of 2018, it includes 2 components: the Standardized Oral Examination (SOE) and the Objective Structured Clinical Examination (OSCE). The goal of this study was to investigate the measurement construct(s) of the APPLIED Examination to assess whether the SOE and the OSCE measure distinct constructs (ie, factors). METHODS Exploratory item factor analysis of candidates' performance ratings was used to determine the number of constructs, and confirmatory item factor analysis was used to estimate factor loadings within each construct and correlation(s) between the constructs. RESULTS In exploratory item factor analysis, the log-likelihood ratio test and Akaike information criterion index favored the 3-factor model, with factors reflecting the SOE, OSCE Communication and Professionalism, and OSCE Technical Skills. The Bayesian information criterion index favored the 2-factor model, with factors reflecting the SOE and the OSCE. In confirmatory item factor analysis, both models suggested moderate correlation between the SOE factor and the OSCE factor; the correlation was 0.49 (95% confidence interval [CI], 0.42-0.55) for the 3-factor model and 0.61 (95% CI, 0.54-0.64) for the 2-factor model. The factor loadings were lower for the Technical Skills stations of the OSCE (ranging from 0.11 to 0.25) compared with those of the SOE and the Communication and Professionalism stations of the OSCE (ranging from 0.36 to 0.50). CONCLUSIONS The analyses provide evidence that the SOE and the OSCE measure distinct constructs, supporting the rationale for administering both components of the APPLIED Examination for initial certification in anesthesiology.
Affiliation(s)
- Ting Wang
- From The American Board of Anesthesiology, Raleigh, North Carolina
- Huaping Sun
- From The American Board of Anesthesiology, Raleigh, North Carolina
- Yan Zhou
- From The American Board of Anesthesiology, Raleigh, North Carolina
- Dandan Chen
- From The American Board of Anesthesiology, Raleigh, North Carolina
- Ann E Harman
- From The American Board of Anesthesiology, Raleigh, North Carolina
- Robert S Isaak
- Department of Anesthesiology, The University of North Carolina at Chapel Hill, Chapel Hill, North Carolina
- Alex Macario
- Department of Anesthesiology, Perioperative and Pain Medicine, Stanford University, Stanford, California
- Brenda G Fahy
- Department of Anesthesiology, University of Florida, Gainesville, Florida
- David O Warner
- Department of Anesthesiology and Perioperative Medicine, Mayo Clinic, Rochester, Minnesota
12
Residency program directors' perceptions about the impact of the American Board of Anesthesiology's Objective Structured Clinical Examination. J Clin Anesth 2021; 75:110439. [PMID: 34293669 DOI: 10.1016/j.jclinane.2021.110439] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/12/2021] [Revised: 06/18/2021] [Accepted: 06/26/2021] [Indexed: 11/22/2022]
Abstract
STUDY OBJECTIVE To describe how the introduction of an Objective Structured Clinical Examination (OSCE) by the American Board of Anesthesiology (ABA) to its initial certification process affected anesthesiology residencies in the United States. DESIGN AND SETTING A sequential mixed-methods design with focus groups and an online survey among program directors of Accreditation Council for Graduate Medical Education-accredited anesthesiology residencies. PATIENTS No patients were included. INTERVENTION None. MEASUREMENTS A convenience sample of 34 program directors was interviewed to understand their perceptions of the ABA OSCE. Subsequently, an online survey, based on major themes identified from the focus groups, was sent to all 156 program directors. MAIN RESULTS Several themes emerged from the focus group discussions: (1) a mock OSCE was the most common method of preparing residents for the ABA OSCE; (2) the ABA OSCE led to changes in residency curricula; (3) the ABA OSCE assessed communication and professionalism skills well, but there was less agreement on how well it assessed technical skills. Survey results from 87 program directors (response rate = 56%) were mostly consistent with the themes generated by the focus groups. Eighty-one of 87 programs (93%) specifically prepared their residents for the ABA OSCE. Fifty-two of 81 program directors (64%) reported that the introduction of the ABA OSCE led to curricular changes. Of 79 program directors, 45 (57%) agreed that the ABA OSCE assesses skills essential to anesthesiology practice, and 40 (51%) considered that it added value to board certification. CONCLUSIONS The introduction of the OSCE by the ABA for board certification has affected the curricula of many residencies. Approximately 3 in 5 program directors perceived that the ABA OSCE measures skills essential to anesthesiologists' practice.
Future studies should assess residency graduates' perspectives on the usefulness of both mock OSCE preparation and the ABA OSCE, and whether ABA OSCE performance predicts future clinical practice.
13
14
15
Khan FA, Williams M, Napolitano CA. Resident education during Covid-19, virtual mock OSCE's via zoom: A pilot program. J Clin Anesth 2021; 69:110107. [PMID: 33248355 PMCID: PMC7577665 DOI: 10.1016/j.jclinane.2020.110107] [Citation(s) in RCA: 11] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/02/2020] [Revised: 09/30/2020] [Accepted: 10/10/2020] [Indexed: 11/29/2022]
Affiliation(s)
- Faiza A Khan
- Department of Anesthesiology, Suite 515, University of Arkansas for Medical Sciences, 4301 W Markham Street, Little Rock, AR 72205, USA
- Matthew Williams
- Department of Anesthesiology, Suite 515, University of Arkansas for Medical Sciences, 4301 W Markham Street, Little Rock, AR 72205, USA
- Charles A Napolitano
- Department of Anesthesiology, Suite 515, University of Arkansas for Medical Sciences, 4301 W Markham Street, Little Rock, AR 72205, USA
16
Sun H, Chen D, Warner DO, Zhou Y, Nemergut EC, Macario A, Keegan MT. Anesthesiology Residents' Experiences and Perspectives of Residency Training. Anesth Analg 2021; 132:1120-1128. [PMID: 33438965 DOI: 10.1213/ane.0000000000005316] [Citation(s) in RCA: 6] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/05/2022]
Abstract
BACKGROUND Anesthesiology residents' experiences and perspectives about their programs may be helpful in improving training. The goals of this repeated cross-sectional survey study were to determine: (1) the most important factors residents consider in choosing an anesthesiology residency; (2) the aspects of the clinical base year that best prepare residents for anesthesia clinical training, and what could be improved; (3) whether residents are satisfied with their anesthesiology residency and what their primary struggles are; and (4) whether residents believe their residency prepares them for proficiency in the 6 Accreditation Council for Graduate Medical Education (ACGME) Core Competencies and for independent practice. METHODS Anesthesiologists beginning their US residency training from 2013 to 2016 were invited to participate in anonymous, confidential, and voluntary self-administered online surveys. Resident cohort was defined by clinical anesthesia year 1, such that 9 survey administrations were included in this study: 3 surveys for the 2013 and 2014 cohorts (clinical anesthesia years 1-3), 2 surveys for the 2015 cohort (clinical anesthesia years 1-2), and 1 survey for the 2016 cohort (clinical anesthesia year 1). RESULTS The overall response rate was 36% (4707 responses to 12,929 invitations). On a 5-point Likert scale with 1 as "very unimportant" and 5 as "very important," quality of clinical experience (4.7-4.8 among the cohorts) and departmental commitment to education (4.3-4.5) were rated as the most important factors in anesthesiologists' choice of residency. Approximately 70% of first- and second-year residents agreed that their clinical base year prepared them well for anesthesiology residency, particularly clinical training experiences in critical care rotations, anesthesiology rotations, and surgery rotations/perioperative procedure management.
Overall, residents were satisfied with their choice of anesthesiology specialty (4.4-4.5 on a 5-point scale among cohort-training levels) and their residency programs (4.0-4.1). The residency training experiences mostly met their expectations (3.8-4.0). Senior residents who reported any struggles highlighted academic more than interpersonal or technical difficulties. Senior residents generally agreed that the residency adequately prepared them for independent practice (4.1-4.4). Of the 6 ACGME Core Competencies, residents had the highest confidence in professionalism (4.7-4.9) and interpersonal and communication skills (4.6-4.8). Areas in residency that could be improved include the provision of an appropriate balance between education and service and allowance for sufficient time off to search and interview for a postresidency position. CONCLUSIONS Anesthesiology residents in the United States indicated they most value quality of clinical training experiences and are generally satisfied with their choice of specialty and residency program.
Affiliation(s)
- Huaping Sun
- From the American Board of Anesthesiology, Raleigh, North Carolina
- Dandan Chen
- From the American Board of Anesthesiology, Raleigh, North Carolina
- David O Warner
- Department of Anesthesiology and Perioperative Medicine, Mayo Clinic, Rochester, Minnesota
- Yan Zhou
- From the American Board of Anesthesiology, Raleigh, North Carolina
- Edward C Nemergut
- Department of Anesthesiology, University of Virginia, Charlottesville, Virginia
- Alex Macario
- Department of Anesthesiology, Perioperative and Pain Medicine, Stanford University, Stanford, California
- Mark T Keegan
- Department of Anesthesiology and Perioperative Medicine, Mayo Clinic, Rochester, Minnesota
17
Martinelli SM, Chen F, Isaak RS, Huffmyer JL, Neves SE, Mitchell JD. Educating Anesthesiologists During the Coronavirus Disease 2019 Pandemic and Beyond. Anesth Analg 2021; 132:585-593. [PMID: 33201006 DOI: 10.1213/ane.0000000000005333] [Citation(s) in RCA: 15] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Abstract] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/30/2023]
Abstract
The coronavirus disease 2019 (COVID-19) pandemic has altered approaches to anesthesiology education by shifting educational paradigms. This vision article discusses pre-COVID-19 educational methodologies and best evidence, the adaptations required under COVID-19 and the evidence for these modifications, and future directions for anesthesiology education. Learning management systems provide structure to online learning. They have been increasingly utilized to improve access to didactic materials asynchronously. Despite some historic reservations, the pandemic has necessitated a rapid uptake across programs. Commercially available systems offer a wide range of peer-reviewed curricular options. The flipped classroom promotes learning foundational knowledge before teaching sessions, with a focus on application during structured didactics. There is growing evidence that this approach is preferred by learners and may increase knowledge gain. The flipped classroom works well with learning management systems to disseminate focused preclass work. Care must be taken to keep virtual sessions interactive. Simulation, already used in anesthesiology, has been critical in preparation for the care of COVID-19 patients. Multidisciplinary, in situ simulations allow for rapid dissemination of new team workflows. Physical distancing and reduced availability of providers have required more sessions. Early pandemic decreases in operating volumes have allowed for this; future planning will have to incorporate smaller groups, sanitizing of equipment, and attention to use of personal protective equipment. Effective technical skills training requires instruction to mastery levels, use of deliberate practice, and high-quality feedback. Reduced sizes of skill-training workshops and approaches for feedback that are not in-person will be required. Mock oral examinations and objective structured clinical examinations (OSCEs) allow for training and assessment of competencies often not addressed otherwise.
They provide formative and summative data and objective measurements of Accreditation Council for Graduate Medical Education (ACGME) milestones. They also allow for preparation for the American Board of Anesthesiology (ABA) APPLIED Examination. Adaptations to teleconferencing or videoconferencing can allow for continued use. Benefits of teaching in this new era include enhanced availability of asynchronous learning and opportunities to apply universal, expert-driven curricula. Burdens include decreased social interaction and a potential need for an increased number of smaller, live sessions. Acquiring learning management systems and holding more frequent simulation and skills sessions with fewer learners may increase costs. With the increasing dependency on multimedia and technology support for teaching and learning, one important focus of educational research is the development and evaluation of strategies that reduce extraneous processing and manage essential and generative processing in virtual learning environments. Collaboration to identify and implement best practices has the potential to improve education for all learners.
Affiliation(s)
- Susan M Martinelli
- From the Department of Anesthesiology, The University of North Carolina, Chapel Hill, North Carolina
- Fei Chen
- From the Department of Anesthesiology, The University of North Carolina, Chapel Hill, North Carolina
- Robert S Isaak
- From the Department of Anesthesiology, The University of North Carolina, Chapel Hill, North Carolina
- Julie L Huffmyer
- Department of Anesthesiology, University of Virginia, Charlottesville, Virginia
- Sara E Neves
- Department of Anesthesia, Critical Care and Pain Medicine, Beth Israel Deaconess Medical Center, Boston, MA
- John D Mitchell
- Department of Anesthesia, Critical Care and Pain Medicine, Beth Israel Deaconess Medical Center, Boston, MA
18
Warner DO, Lien CA, Wang T, Zhou Y, Isaak RS, Peterson-Layne C, Harman AE, Macario A, Gaiser RR, Suresh S, Culley DJ, Rathmell JP, Keegan MT, Cole DJ, Fahy BG, Dainer RJ, Sun H. First-Year Results of the American Board of Anesthesiology's Objective Structured Clinical Examination for Initial Certification. Anesth Analg 2020; 131:1412-1418. [PMID: 33079864 DOI: 10.1213/ane.0000000000005086] [Citation(s) in RCA: 9] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/05/2022]
Abstract
In 2018, the American Board of Anesthesiology (ABA) became the first US medical specialty certifying board to incorporate an Objective Structured Clinical Examination (OSCE) into its initial certification examination system. Previously, the ABA's staged examination system consisted of 2 written examinations (the BASIC and ADVANCED examinations) and the Standardized Oral Examination (SOE). The OSCE and the existing SOE are now 2 separate components of the APPLIED Examination. This report presents the results of the first-year OSCE administration. A total of 1410 candidates took both the OSCE and the SOE in 2018. Candidate performance approximated a normal distribution for both the OSCE and the SOE, and was not associated with the timing of the examination, including day of the week, morning versus afternoon session, and order of the OSCE and the SOE. Practice-based Learning and Improvement was the most difficult station, while Application of Ultrasonography was the least difficult. The correlation coefficient between SOE and OSCE scores was 0.35 (95% confidence interval [CI], 0.30-0.39; P < .001). Scores for the written ADVANCED Examination were modestly correlated with scores for the SOE (r = 0.29 [95% CI, 0.25-0.34]; P < .001) and the OSCE (r = 0.15 [95% CI, 0.10-0.20]; P < .001). Most of the candidates who failed the SOE passed the OSCE, and most of the candidates who failed the OSCE passed the SOE. Of the 1410 candidates, 77 (5.5%) failed the OSCE, 155 (11.0%) failed the SOE, and 25 (1.8%) failed both. Thus, 207 (14.7%) failed at least 1 component of the APPLIED Examination. Adding an OSCE to a board certification examination system is feasible. Preliminary evidence indicates that the OSCE measures aspects of candidate abilities distinct from those measured by other examinations used for initial board certification.
Affiliation(s)
- David O Warner
- From the Department of Anesthesiology and Perioperative Medicine, Mayo Clinic, Rochester, Minnesota
- Cynthia A Lien
- Department of Anesthesiology, Medical College of Wisconsin, Milwaukee, Wisconsin
- Ting Wang
- The American Board of Anesthesiology, Raleigh, North Carolina
- Yan Zhou
- The American Board of Anesthesiology, Raleigh, North Carolina
- Robert S Isaak
- Department of Anesthesiology, The University of North Carolina at Chapel Hill, Chapel Hill, North Carolina
- Ann E Harman
- The American Board of Anesthesiology, Raleigh, North Carolina
- Alex Macario
- Department of Anesthesiology, Perioperative and Pain Medicine, Stanford University, Stanford, California
- Robert R Gaiser
- Department of Anesthesiology, University of Kentucky, Lexington, Kentucky
- Santhanam Suresh
- Department of Pediatric Anesthesiology, Ann & Robert H. Lurie Children's Hospital of Chicago, Northwestern University, Chicago, Illinois
- Deborah J Culley
- Department of Anesthesiology, Perioperative and Pain Medicine, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts
- James P Rathmell
- Department of Anesthesiology, Perioperative and Pain Medicine, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts
- Mark T Keegan
- From the Department of Anesthesiology and Perioperative Medicine, Mayo Clinic, Rochester, Minnesota
- Daniel J Cole
- Department of Anesthesiology and Perioperative Medicine, University of California, Los Angeles, Los Angeles, California
- Brenda G Fahy
- Department of Anesthesiology, University of Florida, Gainesville, Florida
- Rupa J Dainer
- Department of Ambulatory Surgery, Pediatric Specialists of Virginia, Fairfax, Virginia
- Huaping Sun
- The American Board of Anesthesiology, Raleigh, North Carolina
19
Dabbagh A, Abtahi D, Aghamohammadi H, Ahmadizadeh SN, Ardehali SH. Relationship Between "Simulated Patient Scenarios and Role-Playing" Method and OSCE Performance in Senior Anesthesiology Residents: A Correlation Assessment Study. Anesth Pain Med 2020; 10:e106640. [PMID: 34150568 PMCID: PMC8207878 DOI: 10.5812/aapm.106640] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/23/2020] [Revised: 08/15/2020] [Accepted: 08/18/2020] [Indexed: 02/05/2023] Open
Abstract
BACKGROUND "Simulated-patient scenarios and role-playing" and the OSCE are among the many non-traditional education methods with variable results in different clinical settings. OBJECTIVES This cross-sectional study was performed to assess the correlation between the results of these two methods in senior anesthesiology residents, with a special focus on four of the six ACGME core competencies. METHODS Over two years, senior anesthesiology residents participated in "simulated patient scenario and role-playing" sessions. Two faculty members played the roles of the patient and of a relative. An objective checklist with 15 items was prepared to be rated by other department faculty members. Meanwhile, an ordered pattern of OSCE was prepared to cover the four core competencies most related to this academic process (of a total of six core competencies). The mean and standard deviation of the score of each of the 15 items in the checklist were calculated. The correlation between cumulative checklist scoring results and OSCE exam results was assessed. A P value of less than 0.05 was considered significant. RESULTS A total of 40 senior anesthesiology residents, with 344 assessments by faculty members in 40 sessions, were enrolled in the study. The questionnaire's Cronbach's alpha reliability was 0.74. No statistically significant disparity was detected between the results of the two assessment methods, and the results of the two assessments were significantly correlated (two-tailed correlation coefficient = 0.886; P value < 0.001). CONCLUSIONS There was an objective relationship between the results of "simulated patient scenario and role-playing" strategies and the results of OSCE exams using an observer-based rating method. Thus, they could be used as surrogates in the assessment of core clinical competencies of senior anesthesiology residents.
Affiliation(s)
- Ali Dabbagh
- Anesthesiology Department, School of Medicine, Shahid Beheshti University of Medical Sciences, Tehran, Iran
- Anesthesiology Research Center, Shahid Beheshti University of Medical Sciences, Tehran, Iran
- Corresponding Author: Anesthesiology Research Center, Shahid Beheshti University of Medical Sciences, Tehran, Iran.
- Dariush Abtahi
- Anesthesiology Department, School of Medicine, Shahid Beheshti University of Medical Sciences, Tehran, Iran
- Homayoun Aghamohammadi
- Anesthesiology Department, School of Medicine, Shahid Beheshti University of Medical Sciences, Tehran, Iran
- Seyed Hossein Ardehali
- Anesthesiology Department, School of Medicine, Shahid Beheshti University of Medical Sciences, Tehran, Iran
20
Andreae MH, Maman SR, Behnam AJ. An Electronic Medical Record-Derived Individualized Performance Metric to Measure Risk-Adjusted Adherence with Perioperative Prophylactic Bundles for Health Care Disparity Research and Implementation Science. Appl Clin Inform 2020; 11:497-514. [PMID: 32726836 PMCID: PMC7390620 DOI: 10.1055/s-0040-1714692] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/24/2020] [Accepted: 06/01/2020] [Indexed: 12/27/2022] Open
Abstract
BACKGROUND Health care disparity persists despite vigorous countermeasures. Clinician performance is paramount for equitable care processes and outcomes. However, precise and valid individual performance measures remain elusive. OBJECTIVES We sought to develop a generalizable, rigorous, risk-adjusted metric for individual clinician performance (MIP) derived directly from the electronic medical record (EMR) to provide visual, personalized feedback. METHODS We conceptualized MIP as risk responsiveness, i.e., administering an increasing number of interventions contingent on patient risk. We embedded MIP in a hierarchical statistical model, reflecting contemporary nested health care delivery. We tested MIP by investigating the adherence with prophylactic bundles to reduce the risk of postoperative nausea and vomiting (PONV), retrieving PONV risk factors and prophylactic antiemetic interventions from the EMR. We explored the impact of social determinants of health on MIP. RESULTS We extracted data from the EMR on 25,980 elective anesthesia cases performed at Penn State Milton S. Hershey Medical Center between June 3, 2018 and March 31, 2019. Limiting the data by anesthesia Current Procedural Terminology code and to complete cases with PONV risk and antiemetic interventions, we evaluated the performance of 83 anesthesia clinicians on 2,211 anesthesia cases. Our metric demonstrated considerable variance between clinicians in the adherence to risk-adjusted utilization of antiemetic interventions. Risk seemed to drive utilization only in few clinicians. We demonstrated the impact of social determinants of health on MIP, illustrating its utility for health science and disparity research. CONCLUSION The strength of our novel measure of individual clinician performance is its generalizability, as well as its intuitive graphical representation of risk-adjusted individual performance. 
However, accuracy, precision and validity, stability over time, sensitivity to system perturbations, and acceptance among clinicians remain to be evaluated.
Affiliation(s)
- Michael H. Andreae
- Department of Anesthesiology, Penn State Milton S. Hershey Medical Center, Hershey, Pennsylvania, United States
- Stephan R. Maman
- Department of Anesthesiology, Penn State Milton S. Hershey Medical Center, Hershey, Pennsylvania, United States
- Penn State College of Medicine, Hershey, Pennsylvania, United States
- Abrahm J. Behnam
- Department of Anesthesiology, Penn State Milton S. Hershey Medical Center, Hershey, Pennsylvania, United States