1. Tat S, Shaukat H, Zaveri P, Kou M, Jarvis L. Developing and Integrating Asynchronous Web-Based Cases for Discussing and Learning Clinical Reasoning: Repeated Cross-sectional Study. JMIR Medical Education 2022;8:e38427. [PMID: 36480271] [PMCID: PMC9782361] [DOI: 10.2196/38427]
Abstract
BACKGROUND Trainees rely on clinical experience to learn clinical reasoning in pediatric emergency medicine (PEM). Outside of clinical experience, graduate medical education provides only a handful of explicit activities focused on developing clinical reasoning skills. OBJECTIVE In this paper, we describe the development, use, and changing perceptions of a web-based asynchronous tool to facilitate clinical reasoning discussion for PEM providers. METHODS We created a case-based web-based discussion tool with which PEM clinicians and fellows could post and discuss cases. We examined website analytics for site use and collected user survey data over a 3-year period to assess the use and acceptability of the tool. RESULTS The learning tool received more than 30,000 site visits and 172 case comments for the 55 published cases over 3 years. Self-reported engagement with the learning tool varied inversely with clinical experience in PEM. Most respondents found the tool relevant to clinical practice and useful for learning PEM. The most experienced clinicians were more likely than fellows to report posting commentary, although the absolute rate of commentary was low. CONCLUSIONS An asynchronous method of case presentation and web-based commentary may offer an acceptable way to supplement clinical experience and traditional education methods for sharing clinical reasoning.
Affiliation(s)
- Sonny Tat
- Division of Pediatric Emergency Medicine, Benioff Children's Hospitals, University of California, San Francisco, San Francisco, CA, United States
- Haroon Shaukat
- Division of Emergency Medicine, Children's National Health System, Washington, DC, United States
- Pavan Zaveri
- Division of Emergency Medicine, Children's National Health System, Washington, DC, United States
- Maybelle Kou
- Graduate Medical Education, Inova Fairfax Medical Campus, Fairfax, VA, United States
- Lenore Jarvis
- Division of Emergency Medicine, Children's National Health System, Washington, DC, United States
2. Lewis JJ, Balaji L, Grossestreuer AV, Ullman E, Rosen C, Dubosh NM. Correlation of attending and patient assessment of resident communication skills in the emergency department. AEM Education and Training 2021;5:e10629. [PMID: 34485802] [PMCID: PMC8391985] [DOI: 10.1002/aet2.10629]
Abstract
BACKGROUND Communication and interpersonal skills constitute one of the Accreditation Council for Graduate Medical Education's six core competencies, yet validated methods for assessing them among trainees are lacking. Educators have developed various communication assessment tools from both the supervising attending and the patient perspectives, but how these different assessment methods and tools compare with each other remains unknown. The goal of this study was to determine the degree of agreement between attending and patient assessment of resident communication skills. METHODS This was a retrospective study of emergency medicine (EM) residents at an academic medical center. From July 2017 to June 2018, residents were assessed on communication skills during their emergency department shifts by both their supervising attending physicians and their patients. The attendings rated residents' communication with patients, colleagues, and nursing/ancillary staff on a 1-to-5 Likert scale. Patients completed the modified Communication Assessment Tool (CAT), a 14-item questionnaire also scored on a 1-to-5 Likert scale. Mean attending ratings and patient CAT scores were calculated for each resident. Because scores were nonparametrically distributed, means were divided into tertiles. Agreement between attending and patient ratings was measured using Cohen's kappa for each attending evaluation question, with weights assigning partial agreement to adjacent tertiles. RESULTS During the study period, 1,097 attending evaluations and 952 patient evaluations were completed for 26 residents. Attending scores and CAT scores showed slight to fair agreement in three domains: communication with patients (κ = 0.21), communication with colleagues (κ = 0.21), and communication with nursing/ancillary staff (κ = 0.26). CONCLUSIONS Attending and patient ratings of EM residents' communication skills show slight to fair agreement. The use of different types of raters may be beneficial in fully assessing trainees' communication skills.
Affiliation(s)
- Jason J. Lewis
- Department of Emergency Medicine, Beth Israel Deaconess Medical Center/Harvard Affiliated Emergency Medicine Residency, Boston, Massachusetts, USA
- Lakshman Balaji
- Department of Emergency Medicine, Beth Israel Deaconess Medical Center, Boston, Massachusetts, USA
- Anne V. Grossestreuer
- Department of Emergency Medicine, Beth Israel Deaconess Medical Center, Boston, Massachusetts, USA
- Edward Ullman
- Department of Emergency Medicine, Beth Israel Deaconess Medical Center/Harvard Affiliated Emergency Medicine Residency, Boston, Massachusetts, USA
- Carlo Rosen
- Department of Emergency Medicine, Beth Israel Deaconess Medical Center/Harvard Affiliated Emergency Medicine Residency, Boston, Massachusetts, USA
- Nicole M. Dubosh
- Department of Emergency Medicine, Beth Israel Deaconess Medical Center/Harvard Affiliated Emergency Medicine Residency, Boston, Massachusetts, USA
3. Gottlieb M, Jordan J, Siegelman JN, Cooney R, Stehman C, Chan TM. Direct Observation Tools in Emergency Medicine: A Systematic Review of the Literature. AEM Education and Training 2021;5:e10519. [PMID: 34041428] [PMCID: PMC8138102] [DOI: 10.1002/aet2.10519]
Abstract
OBJECTIVES Direct observation is important for assessing the competency of medical learners. Multiple tools have been described in other fields, but the extent of the emergency medicine-specific literature is unclear. This review sought to summarize the current literature on direct observation tools in the emergency department (ED) setting. METHODS We searched PubMed, Scopus, CINAHL, the Cochrane Central Register of Clinical Trials, the Cochrane Database of Systematic Reviews, ERIC, PsycINFO, and Google Scholar from 2012 to 2020 for publications on direct observation tools in the ED setting. Data were dual extracted into a predefined worksheet, and quality analysis was performed using the Medical Education Research Study Quality Instrument. RESULTS We identified 38 publications comprising 2,977 learners. Fifteen different tools were described. The most commonly assessed tools were the Milestones (nine studies), Observed Structured Clinical Exercises (seven studies), the McMaster Modular Assessment Program (six studies), the Queen's Simulation Assessment Test (five studies), and the mini-Clinical Evaluation Exercise (four studies). Most studies were performed at a single institution, and few reported validity or reliability assessments. CONCLUSIONS The number of publications on direct observation tools for the ED setting has markedly increased. However, there remains a need for stronger internal and external validity data.
Affiliation(s)
- Michael Gottlieb
- Department of Emergency Medicine, Rush University Medical Center, Chicago, IL, USA
- Jaime Jordan
- Department of Emergency Medicine, Ronald Reagan UCLA Medical Center, Los Angeles, CA, USA
- Robert Cooney
- Department of Emergency Medicine, Geisinger Medical Center, Danville, PA, USA
- Teresa M. Chan
- Department of Medicine, Division of Emergency Medicine, McMaster University, Hamilton, Ontario, Canada
4. Innovating Pediatric Emergency Care and Learning Through Interprofessional Briefing and Workplace-Based Assessment: A Qualitative Study. Pediatr Emerg Care 2020;36:575-581. [PMID: 32868619] [PMCID: PMC7709919] [DOI: 10.1097/pec.0000000000002218]
Abstract
BACKGROUND Managing pediatric emergencies can be both clinically and educationally challenging, and little existing research addresses how to improve resident involvement. Moreover, nursing input is frequently ignored. We report here on an innovation using interprofessional briefing (iB) and workplace-based assessment (iWBA) to improve the delivery of care, the involvement of residents, and their assessment. METHODS Over a period of 3 months, we implemented an innovation using iB and iWBA for residents providing emergency pediatric care. A constructivist thematic analysis approach was used to collect and analyze data from 4 focus groups (N = 18): nurses (4), supervisors (5), and 2 groups of residents (4 + 5). RESULTS Residents, supervisors, and nurses all felt that iB had positive impacts on learning, teamwork, and patient care. Moreover, when used, iB seemed to play an important role in enhancing the impact of iWBA. Although iB and iWBA seemed to be accepted and participants described important impacts on emergency department culture, conducting both iB and iWBA, as opposed to iB alone, could sometimes be challenging, mainly because of time constraints. CONCLUSIONS Interprofessional briefing and iWBA are promising approaches not only for resident involvement and learning during pediatric emergencies but also for enhancing team function and patient care. Nursing involvement was pivotal to the success of the innovation, enhancing both care and resident learning.
5. Greenberg L. Can the Recruitment of Senior Transitioning Clinician Educators Enhance the Number and Quality of Resident Observations? Thinking Outside the Box. Teaching and Learning in Medicine 2020;32:569-574. [PMID: 32841577] [DOI: 10.1080/10401334.2020.1801442]
Abstract
Issue: The Accreditation Council for Graduate Medical Education's Next Accreditation System has forever changed the way faculty evaluate residents, fellows, and medical students, mandating direct observation of trainee performance by faculty. Evidence: The literature suggests that institutional culture does not support trainee observation and that faculty perceive they have limited time to observe trainees efficiently and effectively. These factors contribute to an inadequate number of trainee observations, limiting faculty's ability to assess trainees' achievement of competency. Hiring more faculty to increase observations has not been feasible or a priority, nor have faculty development programs been universally effective in recruiting faculty to enhance observations. Implications: To alleviate this important problem, the author proposes recruiting senior clinician educators who are transitioning to retirement. These are faculty who, in their full-time careers, have established themselves in major teaching roles and might be interested in continuing their relationship with the academic health center. The number of such physicians is increasing, so there will be a larger pool seeking an opportunity to continue their commitment to education. Recruiting senior clinician educators transitioning to retirement could significantly increase the number and quality of resident observations, addressing a previously insoluble problem with a significant return on investment for the academic health center.
Affiliation(s)
- Larrie Greenberg
- Children's National Medical Center, The George Washington University School of Medicine and Health Sciences, Potomac, Maryland, USA
6. Ruiz Moral R, García de Leonardo C, Cerro Pérez A, Caballero Martínez F, Monge Martín D. Barriers to teaching communication skills in Spanish medical schools: a qualitative study with academic leaders. BMC Medical Education 2020;20:41. [PMID: 32041592] [PMCID: PMC7011270] [DOI: 10.1186/s12909-020-1944-9]
Abstract
BACKGROUND In recent years, Spanish medical schools (MSs) have incorporated training in communication skills (CS), but how this training is carried out has not yet been evaluated. OBJECTIVE To identify the barriers to the introduction and development of CS teaching in Spanish MSs. METHODS In a previous study, 34 MSs (83% of all MSs in Spain) were invited to participate in a study exploring the factual aspects of CS teaching in these schools. For the present study, the person responsible for teaching CS at each school was contacted again and asked to respond to a single open-ended question. Two researchers independently conducted a thematic analysis of the responses. RESULTS We received responses from 30 MSs (85.7% of those contacted and 73% of all MSs in Spain). Five main thematic areas were identified, each with different sub-areas: negative attitudes of teachers and academic leaders; the organisation, structure, and presence of CS training in the curriculum; negative attitudes of students; a lack of trained teachers; and problems linked to teaching methods and the necessary educational logistics. CONCLUSIONS The identified barriers and problems indicate that there are areas for improvement in CS teaching in most Spanish MSs. There seems to be a vicious circle arising from the dynamic relationship and interdependence of these problems; breaking it will require different strategies, a significant cultural shift, and decisive institutional support at the local and national levels. Incorporating CS training into MS curricula is a major challenge that must be addressed so that students learn CS more effectively and do not develop negative attitudes towards learning CS.
Affiliation(s)
- Roger Ruiz Moral
- Department of Medical Education, School of Medicine, Faculty of Health Sciences, Universidad Francisco de Vitoria (UFV), Edificio E. Ctra M-515 Pozuelo-Majadahonda, 3028 Madrid, Spain
- Diana Monge Martín
- Family and Preventive Medicine, Epidemiology and Statistics, School of Health Sciences (UFV), Madrid, Spain
7. Wang EE, Yin Y, Gurvich I, Kharasch MS, Rice C, Novack J, Babcock C, Ahn J, Bowman SH, Van Mieghem JA. Resident Supervision and Patient Care: A Comparative Time Study in a Community-Academic Versus a Community Emergency Department. AEM Education and Training 2019;3:308-316. [PMID: 31637347] [PMCID: PMC6795365] [DOI: 10.1002/aet2.10334]
Abstract
OBJECTIVE The objective was to compare attending emergency physician (EP) time spent on direct and indirect patient care activities in emergency departments (EDs) with and without emergency medicine (EM) residents. METHODS We performed an observational, time-motion study of 25 EPs who worked in a community-academic ED and a nonacademic community ED. Two observations of each EP were performed at each site. Average time spent per 240-minute observation on main-category activities is reported as percentages. We report descriptive statistics (medians and interquartile ranges) for the number of minutes EPs spent per subcategory activity, in total and per patient, and used a Wilcoxon two-sample test to assess differences in time spent between the two EDs. RESULTS The 25 observed EPs executed 34,358 tasks in the two EDs. At the community-academic ED, EPs spent 14.2% of their time supervising EM residents. Supervision activities included data presentation, medical decision making, and treatment. The time spent on supervision was offset by a decrease in time spent by EPs on indirect patient care (specifically communication and electronic health record work) at the community-academic ED compared with the nonacademic community ED. There was no statistical difference in direct patient care time expenditure between the two EDs, and only a nonstatistically significant difference in attending patient load between sites. CONCLUSIONS EPs in our study spent 14.2% of their time (8.5 minutes/hour) supervising residents. The time spent supervising residents was largely offset by time savings in indirect patient care activities rather than compromising direct patient care.
Affiliation(s)
- Ernest E. Wang
- Division of Emergency Medicine, NorthShore University HealthSystem, Evanston, IL
- Yue Yin
- Department of Operations, Kellogg School of Management, Northwestern University, Evanston, IL
- Itai Gurvich
- Cornell Tech and School of Operations Research and Information Engineering, Cornell University, New York, NY
- Morris S. Kharasch
- Division of Emergency Medicine, NorthShore University HealthSystem, Evanston, IL
- Clifford Rice
- Division of Emergency Medicine, NorthShore University HealthSystem, Evanston, IL
- Jared Novack
- Division of Emergency Medicine, NorthShore University HealthSystem, Evanston, IL
- Christine Babcock
- Section of Emergency Medicine, University of Chicago Pritzker School of Medicine, Chicago, IL
- James Ahn
- Section of Emergency Medicine, University of Chicago Pritzker School of Medicine, Chicago, IL
- Steven H. Bowman
- Department of Emergency Medicine, Cook County Health and Hospital System, Rush Medical College, Chicago, IL
- Jan A. Van Mieghem
- Department of Operations, Kellogg School of Management, Northwestern University, Evanston, IL
8. Sheng AY. Trials and Tribulations in Implementation of the Emergency Medicine Milestones from the Frontlines. West J Emerg Med 2019;20:647-650. [PMID: 31316705] [PMCID: PMC6625677] [DOI: 10.5811/westjem.2019.4.42061]
Affiliation(s)
- Alexander Y Sheng
- Boston University School of Medicine, Boston Medical Center, Department of Emergency Medicine, Boston, Massachusetts
9. Linsenmeyer M, Wimsatt L, Speicher M, Powers J, Miller S, Katsaros E. Assessment Considerations for Core Entrustable Professional Activities for Entering Residency. J Osteopath Med 2018;118:243-251. [DOI: 10.7556/jaoa.2018.049]
Abstract
Context
In the process of analyzing entrustable professional activities (EPAs) for use in medical education, ten Cate and others identified challenges, including the need for valid and reliable EPA assessment strategies.
Objective
To provide osteopathic medical schools with a database of assessment tools compiled from the literature to assist them with the development and implementation of robust, evidence-based assessment methods.
Methods
MEDLINE, ERIC, PubMed, and other relevant databases were searched using MeSH keywords for articles outlining robust, evidence-based assessment tools that could be used in designing assessments for EPAs 1 through 6.
Results
A total of 55 publications were included in the content analysis and reporting. All but 2 of the included studies were conducted in an undergraduate or graduate medical education setting. The majority of the 55 articles related to assessment of competencies affiliated with EPA 2 (16 articles) and EPA 4 (15 articles). Four articles focused on EPA 3.
Conclusion
Osteopathic medical schools can use this database of assessment tools to support the development of EPA-specific assessment plans that match the unique context and needs of their institution.
10. Chang YC, Lee CH, Chen CK, Liao CH, Ng CJ, Chen JC, Chaou CH. Exploring the influence of gender, seniority and specialty on paper and computer-based feedback provision during mini-CEX assessments in a busy emergency department. Advances in Health Sciences Education: Theory and Practice 2017;22:57-67. [PMID: 27112960] [PMCID: PMC5306427] [DOI: 10.1007/s10459-016-9682-9]
Abstract
The mini-clinical evaluation exercise (mini-CEX) is a well-established method of assessing trainees' clinical competence in the workplace. In order to improve the quality of clinical learning, factors that influence the provision of feedback are worthy of further investigation. A retrospective analysis of documented feedback provided by assessors using the mini-CEX in a busy emergency department (ED) was conducted. The assessors comprised emergency physicians (EPs) and trauma surgeons; the trainees were all postgraduate year one (PGY1) residents. The completion rate and word count for each of three feedback components (positive feedback, suggestions for development, and an agreed action plan) were recorded. Other variables included observation time, feedback time, the format used (paper versus computer-based), and the seniority, gender, and specialty of the assessor. The components of feedback provided by the assessors and the influence of these contextual and demographic factors were also analyzed. During a 26-month study period, 1101 mini-CEX assessments (from 273 PGY1 residents and 67 assessors) were collected. The overall completion rate for the feedback components was 85.3% (positive feedback), 54.8% (suggestions for development), and 29.5% (agreed action plan). In only 22.9% of the mini-CEX assessments were all three aspects of feedback completed, and 7.4% contained no feedback. In the univariate analysis, the mini-CEX format and the seniority and specialty of the assessor were identified as influencing the completion of all three components of feedback; in the multivariate analysis, only the mini-CEX format and the seniority of the assessor were statistically significant. In a subgroup analysis, the feedback-facilitating effect of the computer-based format was uneven across junior and senior EPs. In addition, feedback provision showed a primacy effect: assessors tended to provide only the first or second feedback components in a busy ED setting. In summary, the authors explored the influence of gender, seniority and specialty on paper and computer-based feedback provision during mini-CEX assessments for PGY1 residency training in a busy ED. Junior assessors were more likely than senior assessors to provide all three aspects of written feedback in the mini-CEX, and the computer-based format facilitated the completion of feedback among EPs.
Affiliation(s)
- Yu-Che Chang
- Chang Gung Medical Education Research Center (CGMERC), No. 5, Fusing St., Gueishan Township, 333, Taoyuan City, Taiwan (R.O.C.)
- Department of Emergency Medicine, Chang Gung Memorial Hospital, Linkou, and Chang Gung University College of Medicine, Taoyuan City, Taiwan (R.O.C.)
- Department of Medical Education, Chang Gung Memorial Hospital, Linkou, Taoyuan City, Taiwan (R.O.C.)
- Ching-Hsing Lee
- Department of Emergency Medicine, Chang Gung Memorial Hospital, Linkou, and Chang Gung University College of Medicine, Taoyuan City, Taiwan (R.O.C.)
- Chien-Kuang Chen
- Department of Emergency Medicine, Chang Gung Memorial Hospital, Linkou, and Chang Gung University College of Medicine, Taoyuan City, Taiwan (R.O.C.)
- Chien-Hung Liao
- Department of Traumatology and Emergency Surgery, Chang Gung Memorial Hospital, Linkou, and Chang Gung University College of Medicine, Taoyuan City, Taiwan (R.O.C.)
- Chip-Jin Ng
- Department of Emergency Medicine, Chang Gung Memorial Hospital, Linkou, and Chang Gung University College of Medicine, Taoyuan City, Taiwan (R.O.C.)
- Jih-Chang Chen
- Department of Emergency Medicine, Chang Gung Memorial Hospital, Linkou, and Chang Gung University College of Medicine, Taoyuan City, Taiwan (R.O.C.)
- Chung-Hsien Chaou
- Chang Gung Medical Education Research Center (CGMERC), No. 5, Fusing St., Gueishan Township, 333, Taoyuan City, Taiwan (R.O.C.)
- Department of Emergency Medicine, Chang Gung Memorial Hospital, Linkou, and Chang Gung University College of Medicine, Taoyuan City, Taiwan (R.O.C.)
- Department of Medical Education, Chang Gung Memorial Hospital, Linkou, Taoyuan City, Taiwan (R.O.C.)
11. Peabody MR, O'Neill TR, Peterson LE. Examining the Functioning and Reliability of the Family Medicine Milestones. J Grad Med Educ 2017;9:46-53. [PMID: 28261393] [PMCID: PMC5319627] [DOI: 10.4300/jgme-d-16-00172.1]
Abstract
BACKGROUND The Family Medicine (FM) Milestones are a framework designed to assess development of residents in key dimensions of physician competency. Residency programs use the milestones in semiannual reviews of resident performance from entry toward graduation. OBJECTIVE To examine the functioning and reliability of the FM Milestones and to determine whether they measure the amount of a latent trait (eg, knowledge or ability) possessed by a resident or simply indicate where a resident falls along the training sequence. METHODS This study utilized the Rasch Partial Credit model to examine academic year 2014-2015 ratings for 10,563 residents from 476 residency programs (postgraduate year [PGY] 1 = 3639; PGY-2 = 3562; PGY-3 = 3351; PGY-4 = 11). RESULTS Reliability was exceptionally high at 0.99. Mean scores were 3.2 (SD = 1.3) for PGY-1; 5.0 (SD = 1.3) for PGY-2; 6.7 (SD = 1.2) for PGY-3; and 7.4 (SD = 1.0) for PGY-4. Keyform analysis showed a rating on 1 item was likely to be similar for all other items. CONCLUSIONS Our findings suggest that FM Milestones seem to largely function as intended. Lack of spread in item difficulty and lack of variation in category probabilities show that FM Milestones do not measure the amount of a latent trait possessed by a resident, but rather describe where a resident falls along the training sequence. High reliability indicates residents are being rated in a stable manner as they progress through residency, and individual residents deviating from this rating structure warrant consideration by program leaders.
Affiliation(s)
- Michael R. Peabody
- Corresponding author: Michael R. Peabody, PhD, American Board of Family Medicine, 1648 McGrathiana Parkway, Suite 550, Lexington, KY 40511, 859.269.5626, ext 1226, fax 859.335.7501,
12. Smith J, Jacobs E, Li Z, Vogelman B, Zhao Y, Feldstein D. Successful Implementation of a Direct Observation Program in an Ambulatory Block Rotation. J Grad Med Educ 2017;9:113-117. [PMID: 28261405] [PMCID: PMC5319609] [DOI: 10.4300/jgme-d-16-00167.1]
Abstract
BACKGROUND Direct observation of clinical skills is a cornerstone of competency-based education and training, yet ensuring consistent direct observation has been a significant challenge for residency programs. OBJECTIVE The purpose of this study was to evaluate the effects of a novel evaluation system designed to achieve ongoing direct observation of residents, to examine changes in resident observation practices, and to understand faculty attitudes toward direct observation and the evaluation system. METHODS Internal medicine residents on an ambulatory block rotation participated in a new evaluation system, which replaced a single end-of-rotation summative evaluation with 9 formative evaluations based on direct observation. Faculty received training in direct observation and use of the forms, and residents were given responsibility for collecting 9 observations per rotation. Faculty members contacted residents at the beginning and middle of the rotation to ensure completion of the observations. Residents and faculty also completed postrotation surveys to gauge the impact of the new system. RESULTS A total of 507 patient encounters were directly observed, and 52 of 57 (91%) residents completed all 9 observations. Residents reported considerably more direct observation than before the intervention, and most reported changes to their clinical skills based on faculty feedback. Faculty reported improved attitudes, increased use of direct observation, and a preference for the new system over the old one. CONCLUSIONS A novel evaluation system replacing summative evaluations with multiple formative evaluations based on direct observation achieved high rates of observation and improved faculty attitudes toward direct observation.
Affiliation(s)
- Jeremy Smith
- Corresponding author: Jeremy Smith, MD, University of Wisconsin School of Medicine and Public Health, General Internal Medicine Administrative Office, 2828 Marshall Court, Suite 100, Madison, WI 53705, 608.263.3010,
13. Gee DW, Phitayakorn R, Khatri A, Butler K, Mullen JT, Petrusa ER. A Pilot Study to Gauge Effectiveness of Standardized Patient Scenarios in Assessing General Surgery Milestones. Journal of Surgical Education 2016;73:e1-e8. [PMID: 27886969] [DOI: 10.1016/j.jsurg.2016.08.012]
Abstract
PURPOSE Some General Surgery Milestones can be difficult to assess in traditional clinical settings and especially difficult to assess in junior residents. The purpose of this pilot study was to gauge the effectiveness of standardized patient (SP) scenarios in assessing General Surgery Milestones. METHODS A total of 9 categorical interns participated in a comprehensive, 4-module SP scenario on the evaluation and management of right upper quadrant pain. SP checklist scores (SP%) were converted to Milestone-equivalent scores (SP-C) for direct comparison. Milestone scores were analyzed from 3 different sources: SPs, faculty (FAC), and the clinical competency committee (CCC). Interns completed course evaluations at the end of each session. Spearman's rho was used to determine correlations, and Wilcoxon signed rank tests were used to test for differences between scores from different sources. RESULTS Individual intern Milestone scores from the 3 sources (SP-C, FAC, and CCC) did not correlate. All 7 mean Milestone scores from SPs were significantly higher than those from FAC and CCC. FAC and CCC scores were statistically equivalent except for Systems-Based Practice 1 (SBP1) and Patient Care 3 (PC3), where CCC scores were significantly higher than FAC scores. Mean SP% scores for PC1 were significantly lower than for PROF1, MK1, MK2, and ICS1 (p < 0.05). Interns felt the modules were moderately to very useful. CONCLUSIONS Developing an SP scenario for Milestones evaluation is feasible. SPs, faculty observers, and the CCC each use different data and so provide unique sources of Milestone assessment. SP scenarios may be ideally suited to assessing specific resident strengths and weaknesses and providing individualized feedback, thus augmenting traditional evaluations. Additional SP scenarios assessing a broader range of skills and Milestones are advisable for more reliable estimates of resident performance.
Affiliation(s)
- Denise W Gee, Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
- Roy Phitayakorn, Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
- Avni Khatri, Laboratory of Computer Science, Massachusetts General Hospital, Boston, Massachusetts
- Kathryn Butler, Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
- John T Mullen, Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
- Emil R Petrusa, Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts

14
Klamen DL, Williams RG, Roberts N, Cianciolo AT. Competencies, milestones, and EPAs - Are those who ignore the past condemned to repeat it? Med Teach 2016;38:904-910. [PMID: 26805785 DOI: 10.3109/0142159x.2015.1132831]
Abstract
BACKGROUND The idea of competency-based education sounds great on paper. Who wouldn't argue for a standardized set of performance-based assessments to assure competency in graduating students and residents? Even so, conceptual concerns have already been raised about this new system and there is yet no evidence to refute their veracity. AIMS We argue that practical concerns deserve equal consideration, and present evidence strongly suggesting these concerns should be taken seriously. METHOD Specifically, we share two historical examples that illustrate what happened in two disparate contexts (K-12 education and the Department of Defense [DOD]) when competency (or outcomes-based) assessment frameworks were implemented. We then examine how observation and assessment of clinical performance stands currently in medical schools and residencies, since these methodologies will be challenged to a greater degree by expansive lists of competencies and milestones. RESULTS/CONCLUSIONS We conclude with suggestions as to a way forward, because clearly the assessment of competency and the ability to guarantee that graduates are ready for medical careers is of utmost importance. Hopefully the headlong rush to competencies, milestones, and core entrustable professional activities can be tempered before even more time, effort, frustration and resources are invested in an endeavor which history suggests will collapse under its own weight.
15
Williams RG, Dunnington GL, Mellinger JD, Klamen DL. Placing constraints on the use of the ACGME milestones: a commentary on the limitations of global performance ratings. Acad Med 2015;90:404-407. [PMID: 25295965 DOI: 10.1097/acm.0000000000000507]
Abstract
As part of the outcomes-based accreditation process, the Accreditation Council for Graduate Medical Education (ACGME) now requires that medical specialties formulate and use educational milestones to assess residents' performance. These milestones are specialty-specific achievements that residents are expected to demonstrate at established intervals in their training. In this Commentary, the authors argue that the pressure to efficiently use program directors' and faculty members' time, particularly in the increasingly clinical-revenue-dependent model of the academic medical center, will lead program directors to meet these new accreditation expectations solely by adding items that assess these competencies to global end-of-rotation rating forms. This approach will increase the workload of faculty but will not provide new and useful information about residents' competence. These same concerns could apply if assessment committees attempt to measure these new performance dimensions without using direct observation to evaluate residents' performance. In these circumstances, the milestones movement will fall short of its intention and potential. In this Commentary, the authors outline and provide evidence from the literature for their concerns. They discuss the role that human judges play in measuring performance, the measurement characteristics of global performance ratings, and the problems associated with simply adding items to existing global rating forms.
Affiliation(s)
- Reed G Williams
- Dr. Williams is adjunct professor of surgery, Indiana University School of Medicine, Indianapolis, Indiana, and J. Roland Folse, MD, Professor of Surgical Education Research and Development Emeritus, Southern Illinois University School of Medicine, Springfield, Illinois. He served as a member of the General Surgery Milestones Development Committee. Dr. Dunnington is chairman, Department of Surgery, and Jay L. Grosfeld Professor of Surgery, Indiana University School of Medicine, Indianapolis, Indiana. He served as a member of the committee that developed the original ACGME competencies. Dr. Mellinger is J. Roland Folse, MD, Chair and professor, Division of General Surgery, and program director, General Surgery Residency Program, Southern Illinois University School of Medicine, Springfield, Illinois. Dr. Klamen is associate dean for education and curriculum and chair, Department of Medical Education, Southern Illinois University School of Medicine, Springfield, Illinois.

16
Duong DK, Oyama LC, Smith JL, Narang AT, Spector J. Medical student perceptions on the instruction of the emergency medicine oral case presentation. J Emerg Med 2014;48:337-43. [PMID: 25453857 DOI: 10.1016/j.jemermed.2014.09.035]
Abstract
BACKGROUND The emergency medicine oral case presentation (EM OCP) is the clinician's communication tool to justify whether urgent intervention is required, to argue for ruling out emergent disease states, and to propose safe disposition plans in the context of triaging patients for medical care and prioritization of resources. The EM OCP represents the practice of emergency medicine, yet we do not know the current level of effectiveness of its instruction. OBJECTIVES We aimed to document medical student perceptions and expectations of the instruction of the EM OCP. METHODS We surveyed medical students from five institutions after their emergency medicine clerkship on their instruction of the EM OCP. Analysis included univariate descriptive statistics and chi-squared analyses for interactions. RESULTS One hundred fifty-five medical students (82%) completed the survey. Most medical students reported the EM OCP to be unique compared to that of other disciplines (86%), integral to their clerkship evaluation (77%), and felt that additional teaching was required beyond their current medical school instruction (78%). A minority reported being specifically taught the EM OCP (37%), that their instruction was consistent (29%), or that expectations of the EM OCP were clear (21%). Respondents felt that brief instruction during their orientation (65%) and reading with a portable summary card (45%) would improve their EM OCP skills, whereas other modalities would be less helpful. CONCLUSION This study identifies a need for additional specific and consistent teaching of the EM OCP to medical students and their preferences for how to receive this instruction.
Affiliation(s)
- David K Duong, Department of Emergency Medicine, University of California, San Francisco, California
- Leslie C Oyama, Department of Emergency Medicine, University of California, San Diego, California
- Jessica L Smith, Department of Emergency Medicine, Alpert Medical School of Brown University, Providence, Rhode Island
- Aneesh T Narang, Department of Emergency Medicine, Boston University School of Medicine, Boston, Massachusetts
- Jordan Spector, Department of Emergency Medicine, Albert Einstein Medical Center, Philadelphia, Pennsylvania

17
Chang TP, Pham PK, Sobolewski B, Doughty CB, Jamal N, Kwan KY, Little K, Brenkert TE, Mathison DJ. Pediatric emergency medicine asynchronous e-learning: a multicenter randomized controlled Solomon four-group study. Acad Emerg Med 2014;21:912-9. [PMID: 25154469 DOI: 10.1111/acem.12434]
Abstract
OBJECTIVES Asynchronous e-learning allows for targeted teaching, particularly advantageous when bedside and didactic education is insufficient. An asynchronous e-learning curriculum has not been studied across multiple centers in the context of a clinical rotation. We hypothesize that an asynchronous e-learning curriculum during the pediatric emergency medicine (EM) rotation improves medical knowledge among residents and students across multiple participating centers. METHODS Trainees on pediatric EM rotations at four large pediatric centers from 2012 to 2013 were randomized in a Solomon four-group design. The experimental arms received an asynchronous e-learning curriculum consisting of nine Web-based, interactive, peer-reviewed Flash/HTML5 modules. Postrotation testing and in-training examination (ITE) scores quantified improvements in knowledge. A 2 × 2 analysis of covariance (ANCOVA) tested interaction and main effects, and Pearson's correlation tested associations between module usage, scores, and ITE scores. RESULTS A total of 256 of 458 participants completed all study elements; 104 had access to asynchronous e-learning modules, and 152 were controls who used the current education standards. No pretest sensitization was found (p = 0.75). Use of asynchronous e-learning modules was associated with an improvement in posttest scores (p < 0.001), from a mean score of 18.45 (95% confidence interval [CI] = 17.92 to 18.98) to 21.30 (95% CI = 20.69 to 21.91), a large effect (partial η² = 0.19). Posttest scores correlated with ITE scores (r² = 0.14, p < 0.001) among pediatric residents. CONCLUSIONS Asynchronous e-learning is an effective educational tool to improve knowledge in a clinical rotation. Web-based asynchronous e-learning is a promising modality to standardize education among multiple institutions with common curricula, particularly in clinical rotations where scheduling difficulties, seasonality, and variable experiences limit in-hospital learning.
Affiliation(s)
- Todd P Chang, Division of Emergency Medicine and Transport, Children's Hospital Los Angeles, and University of Southern California Keck School of Medicine, Los Angeles, CA
- Phung K Pham, Division of Emergency Medicine and Transport, Children's Hospital Los Angeles, Los Angeles, CA
- Brad Sobolewski, Division of Emergency Medicine, Cincinnati Children's Hospital and Medical Center, and University of Cincinnati, Cincinnati, OH
- Cara B Doughty, Division of Emergency Medicine, Texas Children's Hospital, and Baylor College of Medicine, Houston, TX
- Nazreen Jamal, Division of Emergency Medicine and Trauma Center, Children's National Medical Center and George Washington University, Washington, DC
- Karen Y Kwan, Division of Emergency Medicine and Transport, Children's Hospital Los Angeles, and University of Southern California Keck School of Medicine, Los Angeles, CA
- Kim Little, Division of Emergency Medicine, Texas Children's Hospital, and Baylor College of Medicine, Houston, TX
- Timothy E Brenkert, Division of Emergency Medicine, Cincinnati Children's Hospital and Medical Center, and University of Cincinnati, Cincinnati, OH
- David J Mathison, Division of Emergency Medicine and Trauma Center, Children's National Medical Center and George Washington University, Washington, DC

18
Time-motion analysis of emergency radiologists and emergency physicians at an urban academic medical center. Emerg Radiol 2013;20:409-16. [DOI: 10.1007/s10140-013-1129-5]
19
Hamden K, Jeanmonod D, Gualtieri D, Jeanmonod R. Comparison of resident and mid-level provider productivity in a high-acuity emergency department setting. Emerg Med J 2013;31:216-9. [DOI: 10.1136/emermed-2012-201904]
20
Celenza A, Rogers IR. Comparison of visual analogue and Likert scales in evaluation of an emergency department bedside teaching programme. Emerg Med Australas 2012;23:68-75. [PMID: 21284816 DOI: 10.1111/j.1742-6723.2010.01352.x]
Abstract
The present study compares visual analogue scale (VAS) to Likert-type scale (LTS) instruments in evaluating perceptions of an ED bedside clinical teaching programme. A prospective study was conducted in the ED of an urban, adult tertiary hospital. A teaching consultant and a registrar were prospectively paired and relatively quarantined from normal clinical duties. Registrars received 3 months of the teaching intervention and 3 months without the intervention in a cross-over fashion. Evaluation questionnaires were completed using both the LTS and a 100 mm horizontal VAS for each question. Correlation between VAS and LTS gave a measure of validity, and test-retest stability and internal consistency gave measures of reliability. Registrar perceptions of the teaching programme were positive, but no differences were found between the pre- and post-intervention groups. The test-retest reliabilities (intraclass correlation coefficient) for the questionnaires were 0.51 and 0.54 for the VAS, and 0.58 and 0.58 for the LTS. Cronbach's alpha varied between 0.79 and 0.91 for the VAS, and 0.79 and 0.81 for the LTS. Correlations between the two methods varied from 0.35 to 0.94 for each question. A linear regression equation describing the relationship approximated VAS = 19.5 × LTS − 9, with overall r = 0.89. An ED bedside teaching programme is perceived to be a beneficial educational intervention. The VAS is a reliable and valid alternative to the LTS for educational evaluation and might provide advantages in educational measurement. Further research into the significance of extreme values and educationally important changes in scores is required.
Affiliation(s)
- Antonio Celenza, Emergency Department, Sir Charles Gairdner Hospital, Nedlands, Western Australia 6009, Australia
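The linear relationship reported in the abstract above can be sketched as a small helper. The coefficients (19.5 and −9) come directly from the reported regression; the 1-5 Likert rating range and the function name are illustrative assumptions, not part of the source.

```python
def vas_from_lts(lts: float) -> float:
    """Estimate a 100 mm visual analogue scale (VAS) score from a
    Likert-type scale (LTS) rating via the linear fit reported by
    Celenza and Rogers: VAS = 19.5 * LTS - 9 (overall r = 0.89).
    The 1-5 LTS range check is an illustrative assumption."""
    if not 1 <= lts <= 5:
        raise ValueError("expected an LTS rating between 1 and 5")
    return 19.5 * lts - 9.0
```

For example, a top Likert rating of 5 would map to an estimated 88.5 mm on the VAS under this fit.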
21
Madan R, Conn D, Dubo E, Voore P, Wiesenfeld L. The enablers and barriers to the use of direct observation of trainee clinical skills by supervising faculty in a psychiatry residency program. Can J Psychiatry 2012;57:269-72. [PMID: 22480593 DOI: 10.1177/070674371205700411]
Abstract
OBJECTIVE Studies have reported that medical trainees do not get sufficient direct observation. Our study aimed to determine the frequency of direct observation and the enablers and barriers to direct observation in the department of psychiatry at a large Canadian university. METHOD Focus groups and interviews explored the role and use of direct observation, followed by a survey both of faculty and of residents. RESULTS Direct observation was used in various contexts in the residents' last rotation. Missed opportunities are identified. Enablers include financial compensation, guidelines, and a discussion at the beginning of each clinical rotation. Barriers are identified at the resident, faculty, and administrative levels. CONCLUSIONS Direct observation is used in many contexts in psychiatric training. While there are barriers which limit its use, our data indicate numerous potential enablers and missed opportunities for more observation.
Affiliation(s)
- Robert Madan, Centre for Mental Health, Baycrest, Toronto, Ontario

22
Wang EE, Dyne PL, Du H. Systems-based practice: summary of the 2010 Council of Emergency Medicine Residency Directors Academic Assembly consensus workgroup--teaching and evaluating the difficult-to-teach competencies. Acad Emerg Med 2011;18 Suppl 2:S110-20. [PMID: 21999553 DOI: 10.1111/j.1553-2712.2011.01160.x]
Abstract
OBJECTIVES The development of robust Accreditation Council for Graduate Medical Education (ACGME) systems-based practice (SBP) training and validated evaluation tools has been generally challenging for emergency medicine (EM) residency programs. The purpose of this paper is to report the results of a consensus workgroup session of the 2010 Council of Emergency Medicine Residency Directors (CORD) Academic Assembly with the following objectives: 1) to discuss current and preferred local and regional methods for teaching and assessing SBP and 2) to develop consensus within the CORD community using the modified Delphi method with respect to EM-specific SBP domains and link these domains to specific SBP educational and evaluative methods. METHODS Consensus was developed using a modified Delphi method. Previously described taxonomy generation methodology was used to create an SBP taxonomy of EM domain-specific knowledge, skills, and attitudes (KSA). The steps in the process consisted of: 1) an 11-question preconference survey, 2) a vetting process conducted at the 2010 CORD Academic Assembly, and 3) the development and ranking of domain-specific SBP educational activities and evaluation criteria for the specialty of EM. RESULTS Rank-order lists were created for preferred SBP education and evaluation methods. Expert modeling, informal small group discussion, and formal small group activities were considered to be the optimal methods to teach SBP. Kruskal-Wallis testing revealed that these top three items were rated significantly higher than self-directed learning projects and lectures (p = 0.0317). Post hoc permutation testing revealed a significant difference between expert modeling and formal small group activity (adjusted p = 0.028), indicating that expert modeling was rated significantly higher than formal small group activity. Direct observation methods were the preferred methods for evaluation. Multiple barriers to training and evaluation were elucidated. We developed a consensus taxonomy of domains that were felt to be most essential and reflective of the practice of EM: multitasking, disposition, and patient safety. Learning formats linked to the domains were created and specific examples of local best practices collected. Domain-specific anchors of observable actions for the three domains were created. CONCLUSIONS This consensus process resulted in the development of a taxonomy of EM-specific domains for teaching and observable tasks for evaluating SBP. The concept of SBP is interlinked with the other general competencies and difficult to separate. Rather than develop specific SBP evaluation tools to measure the competency directly, SBP competency evaluation should be considered one element of a coordinated effort to teach and evaluate the six ACGME general competencies.
Affiliation(s)
- Ernest E Wang, Department of Emergency Medicine, NorthShore University HealthSystem Research Institute (HD), NorthShore University HealthSystem, Evanston, IL, USA

23
Ryan JG, Barlas D, Sharma M. Direct observation evaluations by emergency medicine faculty do not provide data that enhance resident assessment when compared to summative quarterly evaluations. Acad Emerg Med 2010;17 Suppl 2:S72-7. [PMID: 21199088 DOI: 10.1111/j.1553-2712.2010.00878.x]
Abstract
OBJECTIVES The purpose of this study was to compare quarterly global evaluations with direct observation evaluations to determine if direct observation evaluations provide unique data compared to those obtained from quarterly global evaluations. METHODS This observational cohort study was performed at a 3-year emergency medicine (EM) residency program with 10 residents per year. Faculty used an online Web-based evaluation system to complete quarterly global evaluations and patient-specific direct observation evaluations. Two scores were collected for each resident within each quarterly evaluation period: 1) the quarterly evaluation score was the mean score across all faculty who performed a quarterly evaluation, and 2) the direct observation score was the mean score across all faculty who performed a direct observation evaluation. Pearson correlation coefficients were computed across these two groups of evaluations. RESULTS Over the 4-year period of the study, 296 complete data sets were available for the analysis. When the quarterly evaluation score was correlated with the direct observation score for each resident at the same evaluation period, we found a very high correlation for each of the eight evaluation questions (r = 0.95-0.96, p < 0.0001). When these evaluations were stratified based on the number of direct observation evaluations that were performed during the evaluation period of interest, the correlation between the quarterly evaluation and the direct observation scores increased as the number of direct observations in the evaluation period increased. The evaluation scores from the faculty who had performed both direct observation and quarterly evaluation methods during the same resident evaluation period were highly correlated even with small numbers of evaluators. CONCLUSIONS Direct observations are highly correlated with quarterly evaluations when more than three direct observation evaluations are completed; however, this correlation drops significantly when the number of direct observations is lower. Direct observation evaluations provide similar data to those obtained from quarterly global evaluations.
Affiliation(s)
- James G Ryan, Department of Emergency Medicine, New York Hospital Queens, Flushing, NY, USA

24
Memon MA, Brigden D, Subramanya MS, Memon B. Assessing the surgeon's technical skills: analysis of the available tools. Acad Med 2010;85:869-880. [PMID: 20520044 DOI: 10.1097/acm.0b013e3181d74bad]
Abstract
The concept of assessing competency in surgical practice is not new and has taken on an added urgency in view of the recent high-profile inquiries into "botched cases" involving surgeons of various levels in different parts of the world. Until very recently, surgeons in the United Kingdom and other parts of the world, although required to undergo formal and compulsory examinations to test their factual knowledge and decision making, were not required to demonstrate technical ability. Therefore, there existed (and still exist) no objective assessment criteria to test trainees' surgical skill, especially during the exit examination, which, if passed, provides unrestricted license to surgeons to practice their specialties. However, with the introduction of a new curriculum by various surgical societies and a demand from the lay community for better standards, new assessment tools are emerging that focus on technical competency and that could objectively and reliably measure surgical skills. Furthermore, training authorities and hospitals are keen to embrace these changes for satisfactory accreditation and reaccreditation processes and to assure the public of the safety of the public and private health care systems. In the United Kingdom, two new surgical tools (Surgical Direct Observation of Procedural Skill, and Procedure Based Assessments) have been simultaneously introduced to assess surgical trainees. The authors describe these two assessment methods, provide an overview of other assessment tools currently or previously used to assess surgical skills, critically analyze the two new assessment tools, and reflect on the merit of simultaneously introducing them.
25
Chong A, Weiland TJ, Mackinlay C, Jelinek GA. The capacity of Australian ED to absorb the projected increase in intern numbers. Emerg Med Australas 2010;22:100-7. [DOI: 10.1111/j.1742-6723.2010.01268.x]
26
Quinn A, Brunett P. Service versus education: finding the right balance: a consensus statement from the Council of Emergency Medicine Residency Directors 2009 Academic Assembly "Question 19" working group. Acad Emerg Med 2009;16 Suppl 2:S15-8. [PMID: 20053203 DOI: 10.1111/j.1553-2712.2009.00599.x]
Abstract
Many emergency medicine (EM) residency programs have recently received citations for their residents' responses to Question 19 of the Accreditation Council for Graduate Medical Education (ACGME) annual survey, which asks residents to rate their program's emphasis on clinical education over service obligations. To the best of our knowledge, no prior investigations or consensus statements exist that specifically address the appropriate balance between educational activity and clinical service in EM residency training. The objective of this project was to create a consensus statement based on the shared insights of academic faculty and educators in EM, with specific recommendations to improve the integration of education with clinical service in EM residency training programs. More than 80 EM program directors (PDs), associate and assistant PDs, and other academic EM faculty attending an annual conference of EM educators met to address this issue in a discussion session and working group. Participants examined the current literature on resident service and education and shared with the conference at large their collective insight and experience and possible solutions to this challenge. A consensus statement of specific recommendations and effective educational techniques aimed at balancing service and education requirements was created, based on the contributions of a diverse group of academic emergency physicians. Recommendations included identifying the teachable moment in all clinical service; promoting resident understanding of program goals and expectations from the beginning; educating residents about the ACGME resident survey; and engaging hospitals, institutional graduate medical education departments, and residents in finding solutions.
Affiliation(s)
- Antonia Quinn, Department of Emergency Medicine, Kings County Hospital Center/SUNY Downstate Medical Center, Brooklyn, NY, USA

27
Gillespie C, Paik S, Ark T, Zabar S, Kalet A. Residents' perceptions of their own professionalism and the professionalism of their learning environment. J Grad Med Educ 2009;1:208-15. [PMID: 21975980 PMCID: PMC2931244 DOI: 10.4300/jgme-d-09-00018.1]
Abstract
BACKGROUND The competency of professionalism encompasses a range of behaviors in multiple domains. Residency programs are struggling to integrate and effectively assess professionalism. We report results from a survey assessing residents' perceptions of their professional competence and the professionalism of their learning environment. METHODS A survey was developed to assess specific behaviors reflecting professionalism based on the conceptualizations of key accrediting bodies. Residents rated their ability to perform the behaviors and reported the frequency with which they observed their fellow residents failing to perform the behaviors. Eighty-five senior residents in emergency medicine, internal medicine, pediatrics, psychiatry, and surgery specialties completed the survey (response rate = 77%). Differences among domains (and among items within domains) were assessed. Correlations between perceived professionalism and the professionalism of the learning environment were described. RESULTS Cronbach alpha was .93 for professionalism competence and .86 for professionalism in the learning environment. Residents reported feeling most competent in being accountable (mean score = 51.4%; F = 10.3, p < .001) and in demonstrating respect. Some residents reported having trouble being sensitive to patients (n = 5 to 23). Disrespectful behaviors were the most frequently witnessed professionalism lapse in the learning environment (mean = 41.1%; F = 8.1, p < .001). While serious lapses in professionalism were not witnessed with great frequency in the learning environment, instances of over-representing qualifications were reported. Problems in accountability in the learning environment were negatively associated with residents' perceived competence. CONCLUSIONS Residents reported being able to perform professionally most of the time, especially in terms of accountability and respect. However, disrespect was a feature of the learning environment for many residents, and several serious lapses were witnessed by a small number of residents. Accountability in the learning environment may be an important indicator of or influence on residents' professionalism.
Affiliation(s)
- Colleen Gillespie
- Corresponding author: Colleen Gillespie, PhD, New York University School of Medicine, VA New York Harbor Health System, 423 East 23rd Street, 15 Floor North (15028AN), New York, NY 10010, 212.263.4247

28
Falvo T, McKniff S, Smolin G, Vega D, Amsterdam JT. The business of emergency medicine: a nonclinical curriculum proposal for emergency medicine residency programs. Acad Emerg Med 2009;16:900-7. [PMID: 19689483 DOI: 10.1111/j.1553-2712.2009.00506.x]
Abstract
Over the course of their postgraduate medical education, physicians are expected not only to acquire an extensive knowledge of clinical medicine and sound procedural skills, but also to develop competence in their other professional roles as communicator, collaborator, mediator, manager, teacher, and patient advocate. Although the need for physicians to develop stronger service delivery skills is well recognized, residency programs may underemphasize formal training in nonclinical proficiencies. As a result, graduates can begin their professional careers with an incomplete understanding of the operation of health care systems and how to utilize system resources in the manner best suited to their patients' needs. This article proposes the content, educational strategy, and needs assessment for an academic program entitled The Business of Emergency Medicine (BOEM). Developed as an adjunct to the (predominantly) clinical content of traditional emergency medicine (EM) training programs, BOEM is designed to enhance the existing academic curricula with additional learning opportunities by which EM residents can acquire a fundamental understanding of the nonclinical skills of their specialty.
Affiliation(s)
- Thomas Falvo
- Health Services Design Section, Department of Emergency Medicine, York Hospital, WellSpan Health System, York, PA, USA.
29
Lurie SJ, Mooney CJ, Lyness JM. Measurement of the general competencies of the Accreditation Council for Graduate Medical Education: a systematic review. Acad Med 2009; 84:301-9. [PMID: 19240434] [DOI: 10.1097/acm.0b013e3181971f08]
Abstract
PURPOSE To evaluate published evidence that the Accreditation Council for Graduate Medical Education's six general competencies can each be measured in a valid and reliable way. METHOD In March 2008, the authors conducted searches of Medline and ERIC using combinations of the search terms "ACGME," "Accreditation Council for Graduate Medical Education," "core competencies," and "general competencies," as well as the specific competencies "systems-based practice" (SBP) and "practice-based learning and improvement" (PBLI). Included were all publications since 1999 presenting new qualitative or quantitative data about specific assessment modalities related to the general competencies; opinion pieces, review articles, and reports of consensus conferences were excluded. The search yielded 127 articles, of which 56 met inclusion criteria. Articles were subdivided into four categories: (1) quantitative/psychometric evaluations, (2) preliminary studies, (3) studies of SBP and PBLI, and (4) surveys. RESULTS Quantitative/psychometric studies of evaluation tools failed to develop measures reflecting the six competencies in a reliable or valid way. Few preliminary studies led to published quantitative data regarding reliability or validity. Only two published surveys met quality criteria. Studies of SBP and PBLI generally operationalized these competencies as properties of systems, not of individual trainees. CONCLUSIONS The peer-reviewed literature provides no evidence that current measurement tools can assess the competencies independently of one another. Because further efforts are unlikely to be successful, the authors recommend using the competencies to guide and coordinate specific evaluation efforts rather than attempting to develop instruments to measure the competencies directly.
Affiliation(s)
- Stephen J Lurie
- Office of Curriculum and Assessment, University of Rochester School of Medicine and Dentistry, Rochester, New York 14642, USA.
30

31
Chisholm CD, Weaver CS, Whenmouth LF, Giles B, Brizendine EJ. A comparison of observed versus documented physician assessment and treatment of pain: the physician record does not reflect the reality. Ann Emerg Med 2008; 52:383-9. [DOI: 10.1016/j.annemergmed.2008.01.004]
32
Use of an automated electronic case log to assess fellowship training: tracking the pediatric emergency medicine experience. Pediatr Emerg Care 2008; 24:75-82. [PMID: 18277842] [DOI: 10.1097/pec.0b013e318163db3c]
Abstract
The Accreditation Council for Graduate Medical Education has mandated the assessment of medical training across 6 core competencies. The patient care competency is at the core of medical training. With the introduction of patient tracking systems used in emergency departments, patient-physician encounters can be systematically studied. The combination of tracking data with other clinical information systems can be used to create an electronic case log to quantify the experience of fellows, thereby offering a summative measure of the patient care competency. We used an automated case log to assess clinical exposure in our pediatric emergency medicine fellowship.
33
Baker RC, Klein M, Samaan Z, Brinkman W. Exam room presentations and teaching in outpatient pediatrics: effects on visit duration and parent, attending physician, and resident perceptions. Ambul Pediatr 2007; 7:354-9. [PMID: 17870643] [DOI: 10.1016/j.ambp.2007.05.006]
Abstract
OBJECTIVE To examine the effects of exam room presentations and teaching (ERPT) in a busy outpatient pediatric setting on visit duration and on parent, preceptor, and resident perceptions. METHODS This 8-week, 2-method crossover study compared first-year pediatric resident patient presentations and attending physician teaching and discussion in the exam room (ERPT) with conference area presentation and teaching (CAPT). Outcome measures included visit duration, parent satisfaction, and resident/attending physician perceptions. Differences were analyzed using chi-square tests (parent surveys), t tests (visit duration), and signed rank tests (attending physician and resident surveys). RESULTS Three hundred forty patient encounters were studied (151 ERPT vs 189 CAPT), involving 15 first-year pediatric residents and 15 attending physicians. Visit durations were equivalent. Parent satisfaction was high with both methods. Attending physicians favored ERPT for adding opportunities to evaluate resident competencies, provide informed feedback, and role model. Attending physicians felt that ERPT decreased residents' comfort level when discussing sensitive topics. Residents were less comfortable with ERPT for discussing sensitive topics and felt somewhat embarrassed when they did not know the answer to attending physicians' questions. Residents reported that ERPT permitted attending physicians to demonstrate more physical exam skills and to observe interactions, enabling more informed feedback. CONCLUSIONS ERPT and CAPT require similar time and result in high parent satisfaction. Although residents are somewhat less comfortable with ERPT, attending physicians are better able to observe, evaluate, and give feedback on resident skills and to role model and teach physical diagnosis.
Affiliation(s)
- Raymond C Baker
- Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati Children's Hospital Medical Center, Cincinnati, Ohio 45229, USA.
34
Kelly SP, Shapiro N, Woodruff M, Corrigan K, Sanchez LD, Wolfe RE. The effects of clinical workload on teaching in the emergency department. Acad Emerg Med 2007; 14:526-31. [PMID: 17483400] [DOI: 10.1197/j.aem.2007.01.024]
Abstract
BACKGROUND Academic emergency physicians have expressed concern that increased clinical workload and overcrowding adversely affect clinical teaching. OBJECTIVES To evaluate the influence of clinical workload and attending physicians' teaching characteristics on clinical teaching in the emergency department (ED). METHODS This was a prospective observational study using learner satisfaction assessment tools to evaluate bedside teaching. On days when a research assistant was available, all ED residents and attending physicians were queried. A total of 335 resident surveys were administered over nine months (89% response). Clinical workload was measured by perception and patient volume. Teaching quality and characteristics were rated on ten-point scales. A linear mixed-effects model was used to obtain adjusted estimates of the impact of clinical workload and teaching attributes on teaching scores while controlling for individual attending physicians' teaching ability and residents' grading tendencies. RESULTS No clinical workload parameter had a significant effect on teaching scores: residents' workload perception (beta estimate, 0.024; p = 0.55), attending physicians' workload perception (beta estimate, -0.05; p = 0.28), patient volume in patients per hour (beta estimate, -0.010; p = 0.36), and shift type (beta estimate, -0.19; p = 0.28). The individual attending physician effect was significant (p < 0.001) and adjusted for in each case. In another model, the learning environment established by the attending physician (beta estimate, 0.12; p = 0.005), clinical teaching skills (beta estimate, 0.36; p < 0.001), willingness to teach (beta estimate, 0.25; p < 0.001), and interpersonal skills (beta estimate, 0.19; p < 0.001) affected teaching scores, but the attending physicians' availability to teach had no significant effect (beta estimate, 0.007; p = 0.35). CONCLUSIONS Clinical workload and attending physicians' availability had little effect on teaching scores. Attending physicians' clinical teaching skills, willingness to teach, interpersonal skills, and the learning environment they established were the important determinants of overall scores. Skilled instructors received higher scores, regardless of how busy they were.
Affiliation(s)
- Sean P Kelly
- Department of Emergency Medicine, Beth Israel Deaconess Medical Center and Harvard Medical School, Boston, MA, USA.
35
Stahmer SA, Ellison SR, Jubanyik KK, Felten S, Doty C, Binder L, Jouriles NJ. Integrating the core competencies: proceedings from the 2005 Academic Assembly consortium. Acad Emerg Med 2007; 14:80-94. [PMID: 17079791] [DOI: 10.1197/j.aem.2006.06.050]
Abstract
The Accreditation Council for Graduate Medical Education mandated the integration of the core competencies into residency training in 2001. To this end, educators in emergency medicine (EM) have been proactive in their approach, using collaborative efforts to develop methods that teach and assess the competencies. The first steps toward a collaborative approach occurred during the proceedings of the Council of Emergency Medicine Residency Directors (CORD-EM) academic assembly in 2002. Three years later, the competencies were revisited by working groups of EM program directors and educators at the 2005 Academic Assembly. This report provides a summary discussion of the status of integration of the competencies into EM training programs in 2005.
Affiliation(s)
- Sarah A Stahmer
- Cooper University Hospital/Robert Wood Johnson-University of Medicine and Dentistry, New Jersey, Camden, NJ, USA.
36
Nagler J, Harper MB, Bachur RG. An automated electronic case log: using electronic information systems to assess training in emergency medicine. Acad Emerg Med 2006; 13:733-9. [PMID: 16723724] [DOI: 10.1197/j.aem.2006.02.010]
Abstract
As part of the Outcome Project of the Accreditation Council for Graduate Medical Education, training programs are required to evaluate trainees across six general competencies. Assessment of the patient-care competency by direct observation can be supplemented with a quantification of overall experience through the use of case logs. However, manual entry of information into such registries frequently is incomplete. The authors report on the development of an automated electronic case log as a novel tool for evaluating the experience of individual trainees or an entire training program. Specific examples of use of the case log are provided. The authors use a pediatric emergency medicine fellowship as a paradigm to demonstrate the potential utility across all emergency medicine training programs. In addition, the authors discuss how additional information technologies might be incorporated to further these evaluative efforts in the future.
Affiliation(s)
- Joshua Nagler
- Division of Emergency Medicine, Department of Medicine, Children's Hospital, Boston, MA 02115, USA.
37
Shayne P, Gallahue F, Rinnert S, Anderson CL, Hern G, Katz E. Reliability of a core competency checklist assessment in the emergency department: the Standardized Direct Observation Assessment Tool. Acad Emerg Med 2006; 13:727-32. [PMID: 16636361] [DOI: 10.1197/j.aem.2006.01.030]
Abstract
OBJECTIVES A Council of Emergency Medicine Residency Directors task force developed the Standardized Direct Observation Assessment Tool (SDOT), a 26-item checklist assessment tool to evaluate Accreditation Council for Graduate Medical Education resident core competencies by direct observation. Each of the checklist items is assigned to one or more of five core competencies. The objective of this study was to test the interrater measurement properties of the SDOT instrument. METHODS Two videos of simulated patient-resident-attending physician encounters were produced. Academic emergency medicine faculty members not involved in the development of the form viewed the two encounters and completed the SDOT for each. Faculty demographic data were collected. Data were collected from 82 faculty members at 16 emergency medicine residency programs. The checklist items were used to generate a composite score for each core competency of patient care, medical knowledge, interpersonal and communication skills, professionalism, and systems-based practice. RESULTS Univariate analysis demonstrated a high degree of agreement between evaluators in evaluating residents for both videos. Multivariate analysis found no differences in rating by faculty when examined by experience, academic title, site, or previous use of the SDOT. CONCLUSIONS Faculty from 16 emergency medicine residency programs had a high interrater agreement when using the SDOT to evaluate resident core competency performance. This study did not test the validity of the tool. This data analysis is mainly descriptive, and scripted video scenarios may not approximate direct observation in the emergency department.
Affiliation(s)
- Philip Shayne
- Department of Emergency Medicine, Emory University, Atlanta, GA 30305, USA.
38
Affiliation(s)
- Clare Atzema
- Royal College Emergency Medicine Residency Training Program, University of Toronto, Ontario, Canada.