1
Chudnofsky CR, Reisdorff EJ, Joldersma KB, Ruff KC, Goyal DG, Gorgas DL. Early validity and reliability evidence for the American Board of Emergency Medicine Virtual Oral Examination. AEM Educ Train 2023;7:e10850. PMID: 36994316; PMCID: PMC10041069; DOI: 10.1002/aet2.10850
Abstract
Background The American Board of Emergency Medicine (ABEM) in-person Oral Certification Examination (OCE) was halted abruptly in 2020 due to the COVID-19 pandemic. The OCE was reconfigured for administration in a virtual environment starting in December 2020. Objectives The purpose of this investigation was to determine whether there was sufficient validity and reliability evidence to support the continued use of the ABEM virtual Oral Examination (VOE) for certification decisions. Methods This retrospective, descriptive study used multiple data sources to provide validity evidence and reliability data. Validity evidence focused on test content, response processes, internal structure (e.g., internal consistency and item response theory), and the consequences of testing. A multifaceted Rasch reliability coefficient was used to measure reliability. Study data were from two 2019 in-person OCEs and the first four VOE administrations. Results There were 2279 physicians who took the 2019 in-person OCE and 2153 physicians who took the VOE during the study period. Among the OCE group, 92.0% agreed or strongly agreed that the cases on the examination were cases that an emergency physician should be expected to see; 91.1% of the VOE group agreed or strongly agreed. A similar pattern of responses was given to a question about whether the cases on the examination were cases that they had seen. Additional evidence of validity was obtained from the use of the EM Model, the process for case development, the use of think-aloud protocols, and similar test performance patterns (e.g., pass rates). For reliability, the Rasch reliability coefficients for the OCE and the VOE during the study period were all >0.90. Conclusions There was substantial validity and reliability evidence to support ongoing use of the ABEM VOE to make confident and defensible certification decisions.
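The reliability criterion above (multifaceted Rasch reliability coefficients >0.90) can be illustrated with a minimal sketch. Fitting a multifaceted Rasch model requires specialized estimation software; the snippet below shows only the final separation-reliability computation, i.e., the share of observed variance in examinee measures that is not attributable to measurement error, using hypothetical ability estimates and standard errors rather than ABEM data.

```python
from statistics import pvariance

def rasch_reliability(measures, std_errors):
    """Rasch separation reliability: proportion of observed variance in
    examinee measures (logits) that is not measurement-error variance."""
    observed_var = pvariance(measures)                      # variance of ability estimates
    error_var = sum(se * se for se in std_errors) / len(std_errors)  # mean squared SE
    return (observed_var - error_var) / observed_var

# Hypothetical examinee ability estimates (logits) and their standard errors
measures = [1.8, 0.4, -0.3, 2.1, 0.9, -1.2, 1.5, 0.2]
errors = [0.25, 0.30, 0.28, 0.26, 0.31, 0.29, 0.27, 0.30]
print(round(rasch_reliability(measures, errors), 3))
```

A coefficient above 0.90, as reported for both the OCE and the VOE, indicates that examinee measures are reproducible enough to support pass/fail decisions.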
Affiliation(s)
- Carl R. Chudnofsky
- Keck School of Medicine of the University of Southern California, Los Angeles, California, USA
- Diane L. Gorgas
- Wexner Medical Center at The Ohio State University, Columbus, Ohio, USA
2
Buchanan JA, Moreira M, Taira T, Byyny R, Jarou Z, Taylor TA, Sungar WG, Angerhofer C, Dyer S, White M, Amin D, Lall MD, Caro D, Parsons ME, Smith TY. Defining "county": A mixed-methods inquiry of county emergency medicine residency programs. AEM Educ Train 2021;5:S87-S97. PMID: 34616979; PMCID: PMC8480508; DOI: 10.1002/aet2.10664
Abstract
INTRODUCTION There is no clear, unified definition of "county programs" in emergency medicine (EM). Key residency directories vary in this designation, despite it being one of the most important match factors for applicants. The Council of Residency Directors EM County Program Community of Practice consists of residency program leadership from a collective of programs that identify as "county." This paper's framework grew out of numerous group discussions aimed at better understanding the unifying themes that define county programs. METHODOLOGY This institutional review board-exempt work provides qualitative descriptive results via a mixed-methods inquiry using survey data and quantitative data from programs that self-designate as county. UNIQUE TREATMENT ANALYSIS AND CRITIQUE Most respondents work at, identify with, and trained at a county program. The majority defined county programs by a commitment to caring for the underserved, funding from the city or state, a low-resource environment, and an urban setting. Major qualitative themes included mission, clinical environment, research, training, and applicant recommendations. Comparing program attributes by self-described type of training environment, county programs are typically larger, older, located in central metro areas, more likely to be 4 years in duration, and higher in patient volume than community or university programs. Comparing hospital-level attributes of primary training sites, county programs are more likely to be owned and operated by local governments or governmental hospital districts and authorities and to see more disproportionate-share hospital patients.
IMPLICATIONS FOR EDUCATION AND TRAINING IN EM To be considered a county program, we recommend that some or most of the following attributes be present: a shared mission to serve medically underserved and vulnerable patients, an urban location with city or county funding, an ED with high patient volumes, support for resident autonomy, and research expertise focused on underserved populations.
Affiliation(s)
- Jennie A. Buchanan
- Denver Health & Hospital Authority Department of Emergency Medicine & University of Colorado Department of Emergency Medicine, Denver, Colorado, USA
- Maria Moreira
- Denver Health & Hospital Authority Department of Emergency Medicine & University of Colorado Department of Emergency Medicine, Denver, Colorado, USA
- Taku Taira
- Department of Emergency Medicine, LAC+USC Medical Center, Los Angeles, California, USA
- Zachary Jarou
- Section of Emergency Medicine, University of Chicago Department of Medicine, Chicago, Illinois, USA
- Todd Andrew Taylor
- Department of Emergency Medicine, Emory University School of Medicine, Atlanta, Georgia, USA
- W. Gannon Sungar
- Denver Health & Hospital Authority Department of Emergency Medicine & University of Colorado Department of Emergency Medicine, Denver, Colorado, USA
- Sean Dyer
- Department of Emergency Medicine, Cook County Health and Hospital System, Chicago, Illinois, USA
- Melissa White
- Department of Emergency Medicine, Emory University School of Medicine, Atlanta, Georgia, USA
- Dhara Amin
- Department of Emergency Medicine, Cook County Health and Hospital System, Chicago, Illinois, USA
- Michelle D. Lall
- Department of Emergency Medicine, Emory University School of Medicine, Atlanta, Georgia, USA
- David Caro
- Department of Emergency Medicine, University of Florida College of Medicine–Jacksonville, Jacksonville, Florida, USA
- Melissa E. Parsons
- Department of Emergency Medicine, University of Florida College of Medicine–Jacksonville, Jacksonville, Florida, USA
- Teresa Y. Smith
- Department of Graduate Medical Education, Kings County Hospital, SUNY Downstate Health Sciences University, Brooklyn, New York, USA
- Department of Emergency Medicine, SUNY Downstate Health Sciences University, Brooklyn, New York, USA
3
Margus C, Brown N, Hertelendy AJ, Safferman MR, Hart A, Ciottone GR. Emergency Physician Twitter Use in the COVID-19 Pandemic as a Potential Predictor of Impending Surge: Retrospective Observational Study. J Med Internet Res 2021;23:e28615. PMID: 34081612; PMCID: PMC8281822; DOI: 10.2196/28615
Abstract
Background The early conversations on social media by emergency physicians offer a window into the ongoing response to the COVID-19 pandemic. Objective This retrospective observational study of emergency physician Twitter use details how the health care crisis has influenced emergency physician discourse online and how this discourse may serve as a harbinger of ensuing surge. Methods Followers of the three main emergency physician professional organizations were identified using Twitter's application programming interface. They and their followers were included in the study if they identified explicitly as US-based emergency physicians. Statuses, or tweets, were obtained between January 4, 2020, when the new disease was first reported, and December 14, 2020, when vaccination first began. Original tweets underwent sentiment analysis using the previously validated Valence Aware Dictionary and Sentiment Reasoner (VADER) tool, as well as topic modeling using latent Dirichlet allocation unsupervised machine learning. Sentiment and topic trends were then correlated with daily change in new COVID-19 cases and inpatient bed utilization. Results A total of 3463 emergency physicians produced 334,747 unique English-language tweets during the study period. Of the 3463 participants, 910 (26.3%) stated that they were in training, and 466 of the 902 (51.7%) who provided their gender identified as men. Mean tweet volume rose from 481.9 (SD 72.7) daily tweets before March 2020 to 1065.5 (SD 257.3) daily tweets thereafter. Parameter and topic-number tuning led to 20 tweet topics, with a topic coherence of 0.49. Except for a week in June and 4 days in November, discourse was dominated by the health care system (45,570/334,747, 13.6%). Discussions of pandemic response, epidemiology, and clinical care were jointly found to correlate moderately with COVID-19 hospital bed utilization (Pearson r=0.41), as was the occurrence of "covid," "coronavirus," or "pandemic" in tweet texts (r=0.47). Momentum in COVID-19 tweets, demonstrated by a sustained crossing of the 7- and 28-day moving averages, occurred an average of 45.0 (SD 12.7) days before peak COVID-19 hospital bed utilization across the country and in the four most contributory states. Conclusions COVID-19 Twitter discussion among emergency physicians correlates with, and may precede, rising hospital burden. This study therefore begins to depict the extent to which the ongoing pandemic has affected emergency medicine discourse online and suggests a potential avenue for identifying predictors of surge.
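The momentum signal described in the Results, a sustained crossing of the 7- and 28-day moving averages of COVID-19 tweet counts, can be sketched as follows. The daily counts here are fabricated for illustration, and the three-day `sustain` threshold is an assumption; the abstract does not spell out the paper's exact crossing rule.

```python
from collections import deque

def moving_average(xs, window):
    """Trailing moving average; emits None until the window fills."""
    out, buf, total = [], deque(), 0.0
    for x in xs:
        buf.append(x)
        total += x
        if len(buf) > window:
            total -= buf.popleft()
        out.append(total / window if len(buf) == window else None)
    return out

def crossover_day(daily_counts, short=7, long=28, sustain=3):
    """First day on which the short-window average rises above the
    long-window average and stays above it for `sustain` straight days."""
    fast = moving_average(daily_counts, short)
    slow = moving_average(daily_counts, long)
    run = 0
    for day, (a, b) in enumerate(zip(fast, slow)):
        if a is not None and b is not None and a > b:
            run += 1
            if run == sustain:
                return day - sustain + 1  # first day of the sustained crossing
        else:
            run = 0
    return None

# Hypothetical daily COVID-19 tweet counts: flat baseline, then a steady ramp
counts = [100.0] * 30 + [100.0 + 10 * i for i in range(1, 21)]
print(crossover_day(counts))
```

In the study, this kind of crossing preceded peak hospital bed utilization by roughly 45 days, which is what motivates treating tweet momentum as a leading indicator.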
Affiliation(s)
- Colton Margus
- Division of Disaster Medicine, Department of Emergency Medicine, Beth Israel Deaconess Medical Center, Boston, MA, United States; Department of Emergency Medicine, Harvard Medical School, Boston, MA, United States
- Natasha Brown
- Division of Disaster Medicine, Department of Emergency Medicine, Beth Israel Deaconess Medical Center, Boston, MA, United States; Department of Emergency Medicine, Harvard Medical School, Boston, MA, United States
- Attila J Hertelendy
- Division of Disaster Medicine, Department of Emergency Medicine, Beth Israel Deaconess Medical Center, Boston, MA, United States; Department of Information Systems and Business Analytics, College of Business, Florida International University, Miami, FL, United States
- Michelle R Safferman
- Department of Emergency Medicine, Icahn School of Medicine at Mount Sinai, New York, NY, United States; Department of Emergency Medicine, Mount Sinai Morningside-West, New York, NY, United States
- Alexander Hart
- Division of Disaster Medicine, Department of Emergency Medicine, Beth Israel Deaconess Medical Center, Boston, MA, United States; Department of Emergency Medicine, Harvard Medical School, Boston, MA, United States
- Gregory R Ciottone
- Division of Disaster Medicine, Department of Emergency Medicine, Beth Israel Deaconess Medical Center, Boston, MA, United States; Department of Emergency Medicine, Harvard Medical School, Boston, MA, United States
4
Marco CA, Wahl RP, Thomas JD, Johnson RW, Ma OJ, Harvey AL, Reisdorff EJ. Emergency medicine practice environment and impact on ConCert examination performance. Am J Emerg Med 2018;37:859-863. PMID: 30078653; DOI: 10.1016/j.ajem.2018.07.055
Abstract
OBJECTIVE The ABEM ConCert Examination is a summative examination that ABEM-certified physicians are required to pass once in every 10-year cycle to maintain certification. This study was undertaken to identify the practice settings of emergency physicians and to determine whether performance on the 2017 ConCert differed between physicians of differing practice types and settings. METHODS This was a mixed-methods, cross-sectional study using a post-examination survey and test performance data. All physicians taking the 2017 ConCert Examination who completed three survey questions pertaining to practice type, practice location, and teaching were included. These three questions address different aspects of academia: self-identification, an academic setting, and whether the physician teaches. RESULTS Among 2796 test administrations of the 2017 ConCert Examination, 2693 (96.3%) completed the three survey questions about practice environment. The majority (N = 2054; 76.3%) self-identified primarily as community physicians, 528 (19.6%) as academic, and 111 (4.1%) as other. The average ConCert Examination score was 83.5 (95% CI, 83.3-83.8) for community physicians, 84.8 (95% CI, 84.3-85.3) for the academic group, and 82.3 (95% CI, 81.1-83.6) for the other group. After controlling for initial ability as measured by the Qualifying Examination score, there was no significant difference in performance between academic and community physicians (p = .10). CONCLUSIONS Academic emergency physicians and community emergency physicians scored similarly on the ConCert. Working at a community teaching hospital was associated with higher examination performance. Teaching medical learners, especially non-emergency medicine residents, was also associated with better examination performance.
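The adjustment for initial ability described in the Results can be illustrated by residualizing ConCert scores on Qualifying Examination scores and then comparing group means of the residuals. This is a simplified stand-in for the study's actual analysis, and every score below is invented.

```python
from statistics import mean

def linreg(x, y):
    """Ordinary least-squares slope and intercept for a single predictor."""
    mx, my = mean(x), mean(y)
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

def adjusted_group_means(qe, concert, group):
    """Residualize ConCert scores on Qualifying Exam scores, then average
    the residuals per group; near-zero group means suggest no difference
    once initial ability is accounted for."""
    slope, intercept = linreg(qe, concert)
    resid = [c - (slope * q + intercept) for q, c in zip(qe, concert)]
    out = {}
    for g, r in zip(group, resid):
        out.setdefault(g, []).append(r)
    return {g: mean(rs) for g, rs in out.items()}

# Invented scores: ConCert closely tracks QE, so adjusted group means are small
qe      = [78, 82, 85, 90, 80, 84, 88, 92]
concert = [80, 83, 86, 91, 81, 85, 89, 93]
group   = ["community"] * 4 + ["academic"] * 4
print(adjusted_group_means(qe, concert, group))
```

The design choice mirrors the study's logic: comparing raw means would conflate practice setting with baseline ability, whereas comparing residuals isolates whatever the practice environment adds beyond it.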
Affiliation(s)
- Catherine A Marco
- Department of Emergency Medicine, Wright State University, Dayton, OH, United States of America
- Robert P Wahl
- Department of Emergency Medicine, Wayne State University, Detroit, MI, United States of America
- James D Thomas
- Department of Emergency Medicine, Good Samaritan Hospital, Brockton, MA, United States of America
- Ramon W Johnson
- Department of Emergency Medicine, Mission Hospital, Mission Viejo, CA, United States of America
- O John Ma
- Department of Emergency Medicine, Oregon Health & Science University, Portland, OR, United States of America
- Anne L Harvey
- American Board of Emergency Medicine, East Lansing, MI, United States of America
- Earl J Reisdorff
- American Board of Emergency Medicine, East Lansing, MI, United States of America