1
Cevik AA, Cakal ED, Kwan J, Chu S, Mtombeni S, Anantharaman V, Jouriles N, Peng DTK, Singer A, Cameron P, Ducharme J, Wai A, Manthey DE, Hobgood C, Mulligan T, Menendez E, Jakubaszko J. IFEM model curriculum: emergency medicine learning outcomes for undergraduate medical education. Int J Emerg Med 2024; 17:98. [PMID: 39103797 DOI: 10.1186/s12245-024-00671-9]
Abstract
BACKGROUND The International Federation for Emergency Medicine (IFEM) published its model curriculum for medical student education in emergency medicine in 2009. Because the principles of emergency medicine and medical education continue to evolve, driven by societal, professional, and educational developments, the IFEM recommendations needed updating. The main objectives of the update were to create Intended Learning Outcomes (ILOs) and to provide tier-based recommendations. METHODS A consensus methodology combining nominal group and modified Delphi methods was used. The nominal group had 15 members representing eight countries in six regions. The process began with a review of the 2009 curriculum by IFEM Core Curriculum and Education Committee (CCEC) members, followed by a three-phase update process: survey creation (the final survey comprised 55 items in four sections: participant and context information, 16 items; intended learning outcomes, 6 items; principles unique to emergency medicine, 20 items; and content unique to emergency medicine, 13 items), participant selection from IFEM member countries and survey implementation, and data analysis to create the recommendations. RESULTS Of 112 invitees (CCEC members and IFEM member country nominees), 57 (50.9%) participants from 27 countries took part. Eighteen (31.6%) participants were from low- and middle-income countries (LMICs), while 39 (68.4%) were from high-income countries (HICs). Forty-four (77.2%) participants had been involved with medical students' emergency medicine training for more than five years in their careers, and 56 (98.2%) had been involved with medical students' training in the last five years. Thirty-five (61.4%) participants had completed some form of training in medical education. The exercise resulted in the formulation of tiered ILO recommendations: Tier 1 ILOs are recommended for all medical schools, Tier 2 ILOs are recommended for medical schools based on perceived local healthcare system needs and/or adequate resources, and Tier 3 ILOs should be considered by medical schools based on perceived local healthcare system needs and/or adequate resources. CONCLUSION The updated IFEM ILO recommendations are designed to be applicable across diverse educational and healthcare settings. They aim to provide a clear framework for medical schools to prepare graduates with essential emergency care capabilities immediately after completing medical school. The successful dissemination and implementation of these recommendations hinge on support from faculty and administrators, ensuring that future healthcare professionals are well prepared for emergency medical care.
Affiliation(s)
- Arif Alper Cevik
- Emergency Medicine Section, Department of Internal Medicine, College of Medicine and Health Sciences, United Arab Emirates University, Al Ain, United Arab Emirates.
- Department of Emergency Medicine, Tawam Hospital, Al Ain, UAE.
- Elif Dilek Cakal
- Emergency Department, Leicester Royal Infirmary, University Hospitals of Leicester NHS Trust, Leicester, UK
- James Kwan
- Department of Emergency Medicine, Tan Tock Seng Hospital, Singapore, Singapore
- Yong Loo Lin School of Medicine, National University of Singapore, Singapore, Singapore
- Simon Chu
- University of Adelaide, Lyell McEwin Hospital, Elizabeth Vale, SA, Australia
- Sithembile Mtombeni
- Department of Emergency Medicine, University of Namibia, Northern Campus, Oshakati, Namibia
- Nicholas Jouriles
- Department of Emergency Medicine, Northeast Ohio Medical University, Rootstown, Ohio, USA
- Andrew Singer
- Australian Government Department of Health and Aged Care, Canberra, ACT, Australia
- Australian National University Medical School, Acton, ACT, Australia
- Peter Cameron
- Department of Epidemiology and Preventive Medicine, School of Public Health and Preventive Medicine, Monash University, Clayton, Australia
- The Alfred Hospital, Emergency and Trauma Centre, Melbourne, Australia
- Abraham Wai
- Department of Emergency Medicine, School of Clinical Medicine, The University of Hong Kong, Hong Kong, Hong Kong
- David Edwin Manthey
- Department of Emergency Medicine, Wake Forest School of Medicine, Winston Salem, NC, USA
- Cherri Hobgood
- Department of Emergency Medicine, University of North Carolina School of Medicine, Chapel Hill, NC, USA
- Terrence Mulligan
- Department of Emergency Medicine, University of Maryland School of Medicine, Baltimore, MD, USA
- Edgardo Menendez
- Department of Emergency Medicine, Churruca Hospital UBA, Buenos Aires, Argentina
- Juliusz Jakubaszko
- Department of Emergency Medicine, Wroclaw University of Medicine, Wroclaw, Poland
2
Lin HJ, Wu JH, Lin WH, Nien KW, Wang HT, Tsai PJ, Chen CY. Using ACGME milestones as a formative assessment for the internal medicine clerkship: a consecutive two-year outcome and follow-up after graduation. BMC Medical Education 2024; 24:238. [PMID: 38443912 PMCID: PMC10916194 DOI: 10.1186/s12909-024-05108-8]
Abstract
BACKGROUND This study evaluated the utility of the Accreditation Council for Graduate Medical Education (ACGME) Milestones as a formative assessment tool for fifth- and sixth-year medical students' performance in their internal medicine (IM) clerkship and for the same students' performance in their post-graduate year (PGY) IM training. METHODS Retrospective data were collected from 65 medical students completing the two-year IM clerkship in the academic years 2019 and 2020, 26 of whom completed their PGY-1 training at the same university hospital in the academic year 2021. Data included the assessment results of 7 of the ACGME IM Milestones, information on admitted patients assigned to the students, and surveys of student satisfaction. RESULTS The analysis included 390 assessment results from the IM clerkship and 78 assessment results from PGY-1 training. Clinical teachers commonly rated medical students at level 3 in the IM clerkship, with the PC-2 subcompetency receiving the lowest rating among the seven subcompetencies. The levels of most subcompetencies remained stationary over the two-year IM clerkship. Significant improvement was observed in all subcompetencies during PGY-1 training. Medical students in the second year of the IM clerkship expressed higher satisfaction with the implementation of Milestones than in their first year and perceived the Milestones assessments as useful learning feedback. CONCLUSIONS Using the ACGME Milestones as a formative assessment tool in the IM clerkship yielded promising outcomes. Longitudinal follow-up of subcompetencies facilitated tracking students' development and providing constructive feedback.
Affiliation(s)
- Hsiao-Ju Lin
- Department of Internal Medicine, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan, Taiwan
- Education Center, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan, Taiwan
- Jhong-Han Wu
- Department of Internal Medicine, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan, Taiwan
- Wei-Hung Lin
- Department of Internal Medicine, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan, Taiwan
- Education Center, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan, Taiwan
- Kai-Wen Nien
- Education Center, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan, Taiwan
- Huei-Ting Wang
- Education Center, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan, Taiwan
- Pei-Jen Tsai
- Education Center, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan, Taiwan
- Chiung-Yu Chen
- Department of Internal Medicine, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan, Taiwan
- Education Center, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan, Taiwan
3
Love J, Zeidan A, Khatri U, Samuels-Kalow M, Mills A, Hsu C. WOMen profEssioNal developmenT oUtcome Metrics in Academic Emergency Medicine: Results from the WOMENTUM Modified Delphi Study. West J Emerg Med 2022; 23:660-671. [PMID: 36205680 PMCID: PMC9541981 DOI: 10.5811/westjem.2022.6.56608]
Abstract
Introduction To address persistent gender inequities in academic medicine, women's professional development groups (PDGs) have been developed to support the advancement of women in medicine. While these programs have shown promising outcomes, long-term evaluative metrics do not currently exist. The objective of this study was to establish metrics to assess women's PDGs. Methods This was a modified Delphi study that included an expert panel of current and past emergency department (ED) chairs and Academy for Women in Academic Emergency Medicine (AWAEM) presidents. The panel completed three iterative surveys to develop and rank metrics for assessing women's PDGs. Metrics established by the expert panel were also distributed for member-checking to women emergency medicine (EM) faculty. Results The expert panel ranked 11 metrics with high to moderate consensus, with three metrics receiving greater than 90% consensus: gender equity strategy and plan; recruitment; and compensation. Members ranked 12 metrics with high consensus, with three metrics receiving greater than 90% consensus: gender equity strategy and plan; compensation; and gender equity in promotion rates among faculty. Participants emphasized that departments should be responsible for leading gender equity efforts, with PDGs providing a supportive role. Conclusion In this study, we identified metrics that can be used to assess academic EDs' gender equity initiatives and the advisory efforts of a departmental women's PDG. These metrics can be tailored to individual departmental/institutional needs, as well as to a PDG's mission. Importantly, PDGs can use metrics to develop and assess programming, acknowledging that many metrics are the responsibility of the department rather than the PDG.
Affiliation(s)
- Jennifer Love
- Icahn School of Medicine at Mount Sinai, Department of Emergency Medicine, New York, New York
- Amy Zeidan
- Emory University School of Medicine, Department of Emergency Medicine, Atlanta, Georgia
- Utsha Khatri
- Icahn School of Medicine at Mount Sinai, Department of Emergency Medicine, New York, New York; Icahn School of Medicine at Mount Sinai, Department of Population Health Science and Policy, New York, New York
- Margaret Samuels-Kalow
- Harvard Medical School, Massachusetts General Hospital, Department of Emergency Medicine, Boston, Massachusetts
- Angela Mills
- Columbia University College of Physicians and Surgeons, Department of Emergency Medicine, New York, New York
- Cindy Hsu
- University of Michigan Medical School, Department of Emergency Medicine, Ann Arbor, Michigan
4
Peng CR, Schertzer KA, Caretta-Weyer HA, Sebok-Syer SS, Lu W, Tansomboon C, Gisondi MA. Assessment of Entrustable Professional Activities Using a Web-Based Simulation Platform During Transition to Emergency Medicine Residency: Mixed Methods Pilot Study. JMIR Medical Education 2021; 7:e32356. [PMID: 34787582 PMCID: PMC8663509 DOI: 10.2196/32356]
Abstract
BACKGROUND The 13 core entrustable professional activities (EPAs) are key competency-based learning outcomes in the transition from undergraduate to graduate medical education in the United States. Five of these EPAs (EPA2: prioritizing differentials, EPA3: recommending and interpreting tests, EPA4: entering orders and prescriptions, EPA5: documenting clinical encounters, and EPA10: recognizing urgent and emergent conditions) are uniquely suited for web-based assessment. OBJECTIVE In this pilot study, we created cases on a web-based simulation platform for the diagnostic assessment of these EPAs and examined the feasibility and acceptability of the platform. METHODS Four simulation cases underwent 3 rounds of consensus panels and pilot testing. Incoming emergency medicine interns (N=15) completed all cases. A maximum of 4 "look for" statements, each mapped to specific EPAs, was generated for each participant: (1) performing harmful or missing actions, (2) narrowing the differential or giving a wrong final diagnosis, (3) errors in documentation, and (4) lack of recognition and stabilization of urgent diagnoses. Finally, we interviewed a sample of interns (n=5) and residency leadership (n=5) and analyzed the responses using thematic analysis. RESULTS All participants had at least one missing critical action, and 40% (6/15) of the participants performed at least one harmful action across the 4 cases. The final diagnosis was not included in the differential diagnosis in more than half of the assessments (8/15, 54%). Other errors included selecting incorrect documentation passages (6/15, 40%) and indiscriminately applying oxygen (9/15, 60%). The interview themes included psychological safety of the interface, ability to assess learning, and fidelity of cases. The most valuable feature cited was the ability to place orders in a realistic electronic medical record interface.
CONCLUSIONS This study demonstrates the feasibility and acceptability of a web-based platform for diagnostic assessment of specific EPAs. The approach rapidly identifies potential areas of concern for incoming interns using an asynchronous format, provides feedback in a manner appreciated by residency leadership, and informs individualized learning plans.
Affiliation(s)
- Cynthia R Peng
- Department of Emergency Medicine, Stanford University School of Medicine, Palo Alto, CA, United States
- Kimberly A Schertzer
- Department of Emergency Medicine, Stanford University School of Medicine, Palo Alto, CA, United States
- Holly A Caretta-Weyer
- Department of Emergency Medicine, Stanford University School of Medicine, Palo Alto, CA, United States
- Stefanie S Sebok-Syer
- Department of Emergency Medicine, Stanford University School of Medicine, Palo Alto, CA, United States
- William Lu
- Cornell University College of Engineering, Ithaca, NY, United States
5
Wolff M, Deiorio NM, Miller Juve A, Richardson J, Gazelle G, Moore M, Santen SA, Hammoud MM. Beyond advising and mentoring: Competencies for coaching in medical education. Medical Teacher 2021; 43:1210-1213. [PMID: 34314291 DOI: 10.1080/0142159x.2021.1947479]
Abstract
BACKGROUND Coaching supports academic goals, professional development, and wellbeing in medical education. Scant literature exists on training and assessing coaches and on evaluating coaching programs. To begin filling this gap, we created a set of coach competencies for medical education using a modified Delphi approach. METHODS A team of seven experts in the field of coaching was assembled, and a modified Delphi approach was used to develop the competencies. RESULTS Fifteen competencies in five domains resulted: coaching process and structure, relational skills, coaching skills, coaching theories and models, and coach development. CONCLUSION These competencies delineate the essential features of a coach in medical education. Next steps include creating faculty development and assessment tools for coaching.
Affiliation(s)
- Meg Wolff
- Departments of Emergency Medicine and Pediatrics, University of Michigan Medical School, Ann Arbor, MI, USA
- Nicole M Deiorio
- Department of Emergency Medicine, Virginia Commonwealth University, Richmond, VA, USA
- Amy Miller Juve
- Department of Anesthesiology and Perioperative Medicine, Oregon Health and Science University, Portland, OR, USA
- Judee Richardson
- Medical Education Strategy Unit, American Medical Association, Chicago, IL, USA
- Gail Gazelle
- Department of Medicine, Brigham and Women's Hospital, Boston, MA, USA
- Margaret Moore
- Institute of Coaching, McLean Hospital, Harvard Medical School Affiliate, Belmont, MA, USA
- Sally A Santen
- Department of Emergency Medicine, Virginia Commonwealth University, Richmond, VA, USA
- Maya M Hammoud
- Department of Obstetrics and Gynecology, University of Michigan Medical School, Ann Arbor, MI, USA
6
Morrison Ponce DP, Wolff M. Defining a Focused Pediatric Emergency Medicine Curriculum for Emergency Medicine Residents: A Case Study at Michigan Medicine. AEM Education and Training 2021; 5:70-74. [PMID: 33521493 PMCID: PMC7821070 DOI: 10.1002/aet2.10455]
Abstract
OBJECTIVES Emergency medicine (EM) is dedicated to the treatment of urgent and emergent illness, requiring physicians to evaluate, treat, and diagnose patients of all ages. EM residency provides the foundation of knowledge enabling trainees to care for any patient. However, specific pediatric curriculum guidance from governing bodies is limited, and the two potential curricula in the literature are cumbersome to implement. Our primary objective was to identify the components of these curricula that were specific to pediatric emergency medicine (PEM). Secondary objectives were to provide a methods framework and to compare the results with the American Board of Emergency Medicine Model of Clinical Practice (EM Model). METHODS Using the modified Delphi technique, iterative rounds of expert panels sought to reach consensus on PEM-specific topics. We used the published curricula as the foundation and focused this list with a group of local experts. Predetermined consensus was defined as 80% agreement. RESULTS The literature-derived list of 190 topics was reviewed by the expert panel. Experts identified 92 PEM-specific topics; the remaining 98 topics were deemed adequately covered by general EM curricula. All topics reached consensus after three rounds. The final list was sorted in accordance with the EM Model categories. Redundant topics were consolidated, resulting in 68 PEM topics. Of these 68 topics, we identified 20 (five of which are critical) that were incompletely covered by the EM Model. CONCLUSIONS Emergency medicine residency programs should focus their PEM curriculum by deliberately assessing their coverage of key PEM topics. The methods of this study can be replicated to yield locally applicable results in other EM programs. Additionally, the next iteration of the EM Model of Clinical Practice should draw its PEM topics from the available curricula in the literature.
Affiliation(s)
- Daphne P. Morrison Ponce
- Department of Emergency Medicine, Division of Pediatric Emergency Medicine, University of Michigan, Ann Arbor, MI
- Margaret Wolff
- Department of Emergency Medicine, Division of Pediatric Emergency Medicine, University of Michigan, Ann Arbor, MI
7
St-Onge C, Vachon Lachiver É, Langevin S, Boileau E, Bernier F, Thomas A. Lessons from the implementation of developmental progress assessment: A scoping review. Medical Education 2020; 54:878-887. [PMID: 32083743 DOI: 10.1111/medu.14136]
Abstract
OBJECTIVES Educators and researchers recently implemented developmental progress assessment (DPA) in the context of competency-based education. To reap its anticipated benefits, much still remains to be understood about its implementation. In this study, we aimed to determine the nature and extent of the current evidence on DPA, in an effort to broaden our understanding of the major goals and intended outcomes of DPA, as well as the lessons learned from how it has been executed in, or applied across, educational contexts. METHODS We conducted a scoping study based on the methodology of Arksey and O'Malley. Our search strategy yielded 2494 articles. These articles were screened for inclusion and exclusion (90% agreement), and numerical and qualitative data were extracted from 56 articles based on a pre-defined set of charting categories. The thematic analysis of the qualitative data was completed with iterative consultations and discussions until consensus was achieved on the interpretation of the results. RESULTS Tools used to document DPA include scales, milestones, and portfolios. Performances were observed in clinical or standardised contexts. We identified seven major themes in our qualitative thematic analysis: (a) underlying aims of DPA; (b) sources of information; (c) barriers; (d) contextual factors that can act as barriers or facilitators to the implementation of DPA; (e) facilitators; (f) observed outcomes; and (g) documented validity evidence. CONCLUSIONS Developmental progress assessment seems to fill a need in the training of future competent health professionals. However, in moving forward with widespread implementation of DPA, factors such as lack of access to user-friendly technology and time to observe performance may render its operationalisation burdensome in the context of competency-based medical education.
Affiliation(s)
- Christina St-Onge
- Department of Medicine, Faculty of Medicine and Health Sciences, University of Sherbrooke, Sherbrooke, Québec, Canada
- Élise Vachon Lachiver
- Faculty of Medicine and Health Sciences, University of Sherbrooke, Sherbrooke, Québec, Canada
- Serge Langevin
- Department of Medicine, Faculty of Medicine and Health Sciences, University of Sherbrooke, Sherbrooke, Québec, Canada
- Elisabeth Boileau
- Department of Family and Emergency Medicine, Faculty of Medicine and Health Sciences, University of Sherbrooke, Sherbrooke, Québec, Canada
- Frédéric Bernier
- Department of Medicine, Faculty of Medicine and Health Sciences, University of Sherbrooke, Sherbrooke, Québec, Canada
- Faculty of Medicine and Health Sciences, University of Sherbrooke, Sherbrooke, Québec, Canada
- Research Center - Sherbrooke University Hospital Center (CHUS), Integrated Health and Social Service Centers (CISSS) and Integrated University Health and Social Service Centres (CIUSSS), Sherbrooke, Québec, Canada
- Aliki Thomas
- School of Physical and Occupational Therapy, McGill University, Montreal, Québec, Canada
8
Tay KT, Ng S, Hee JM, Chia EWY, Vythilingam D, Ong YT, Chiam M, Chin AMC, Fong W, Wijaya L, Toh YP, Mason S, Krishna LKR. Assessing Professionalism in Medicine - A Scoping Review of Assessment Tools from 1990 to 2018. Journal of Medical Education and Curricular Development 2020; 7:2382120520955159. [PMID: 33150208 PMCID: PMC7580192 DOI: 10.1177/2382120520955159]
Abstract
BACKGROUND Medical professionalism enhances doctor-patient relationships and advances patient-centric care. However, despite its pivotal role, the concept of medical professionalism remains diversely understood and taught, and is thus poorly assessed, with Singapore lacking a linguistically sensitive, context-specific, and culturally appropriate assessment tool. A scoping review of assessments of professionalism in medicine was thus carried out to better guide its understanding. METHODS Arksey and O'Malley's (2005) approach to scoping reviews was used to identify appropriate publications featured in four databases and published between 1 January 1990 and 31 December 2018. Seven members of the research team employed thematic analysis to evaluate the selected articles. RESULTS 3799 abstracts were identified, 138 full-text articles were reviewed, and 74 studies were included. The two themes identified were the context-specific nature of assessments and competency-based stages in medical professionalism. CONCLUSIONS Prevailing assessments of professionalism in medicine must contend with differences in setting, context, and level of professional development, as these explain the variances found in existing assessment criteria and approaches. However, acknowledging the significance of context-specific, competency-based stages in medical professionalism will allow guiding principles to be put forward to aid the design of a culturally sensitive and practical approach to assessing professionalism.
Affiliation(s)
- Kuang Teck Tay
- Yong Loo Lin School of Medicine, National University of Singapore, Singapore
- Shea Ng
- Yong Loo Lin School of Medicine, National University of Singapore, Singapore
- Jia Min Hee
- National University Hospital, National University Health System, Singapore
- Divya Vythilingam
- School of Medicine, International Medical University Malaysia, Kuala Lumpur, Malaysia
- Yun Ting Ong
- Yong Loo Lin School of Medicine, National University of Singapore, Singapore
- Min Chiam
- Division of Cancer Education, National Cancer Centre Singapore, Singapore
- Annelissa Mien Chew Chin
- Medical Library, National University of Singapore Libraries, National University of Singapore, Singapore
- Warren Fong
- Yong Loo Lin School of Medicine, National University of Singapore, Singapore
- Department of Rheumatology and Immunology, Singapore General Hospital, Singapore
- Duke-NUS Graduate Medical School, Singapore
- Limin Wijaya
- Yong Loo Lin School of Medicine, National University of Singapore, Singapore
- Department of Infectious Diseases, Singapore General Hospital, Singapore
- Duke-NUS Graduate Medical School, Singapore
- Ying Pin Toh
- Department of Family Medicine, National University Health System, Singapore
- Stephen Mason
- Palliative Care Institute Liverpool, Academic Palliative & End of Life Care Centre, University of Liverpool, Liverpool, UK
- Lalit Kumar Radha Krishna
- Yong Loo Lin School of Medicine, National University of Singapore, Singapore
- Division of Cancer Education, National Cancer Centre Singapore, Singapore
- Palliative Care Institute Liverpool, Academic Palliative & End of Life Care Centre, University of Liverpool, Liverpool, UK
- Division of Supportive and Palliative Care, National Cancer Centre Singapore, Singapore
- Centre for Biomedical Ethics, National University of Singapore, Singapore
- Duke-NUS Graduate Medical School, Singapore
- PalC, The Palliative Care Centre for Excellence in Research and Education
- Lalit Kumar Radha Krishna, Palliative Care Institute Liverpool, Academic Palliative & End of Life Care Centre, Cancer Research Centre, University of Liverpool, 200 London Road, Liverpool, L3 9TA, UK.
9
Olaf MF. Pupil Prose Appraisal: Four Practical Solutions to Medical Student Documentation and Feedback in the Emergency Department. AEM Education and Training 2019; 3:403-407. [PMID: 31637360 PMCID: PMC6795385 DOI: 10.1002/aet2.10384]
Abstract
Documentation is a foundational skill in the undergraduate medical education curriculum. New compliance rules from the Centers for Medicare and Medicaid Services will affect student documentation practices. Common barriers to student documentation include limited access to the electronic medical record, variable clerkship documentation expectations, variable advice on using the electronic medical record, and limited time for feedback delivery. Potential solutions to these barriers are suggested to foster documentation skill development. Recommendations are also given to mitigate compliance and legal risk.
Affiliation(s)
- Mark F. Olaf
- Geisinger Commonwealth School of Medicine, Geisinger Health, Danville, PA
10
Smith C, Likourezos A, Schiller J. Focused Teaching Improves Medical Student Professionalism and Data Gathering Skills in the Emergency Department. Cureus 2019; 11:e5765. [PMID: 31723524 PMCID: PMC6825500 DOI: 10.7759/cureus.5765]
Abstract
INTRODUCTION Leaders in medical education have developed milestones and core competencies in an attempt to ensure that relational skills, such as communication and professionalism, are emphasized alongside the usual skills of medical knowledge, data gathering, and emergency stabilization during students' emergency medicine (EM) education. Providers facile in each of these areas have better patient outcomes, better patient experiences, and a decreased incidence of malpractice cases. The authors attempted to demonstrate that, through deliberate teaching of these skills during an EM medical student clerkship, students could significantly improve their clinical performance. METHODS This prospective, randomized, single-blinded cohort study was performed at an academic, tertiary, urban emergency department (ED) to investigate the effects of a one-on-one preceptor shift on the clinical performance of fourth-year medical students. Students were randomized into two groups and assessed by pre- and post-intervention objective structured clinical encounters (OSCEs) with standardized patients (SPs) at weeks one and three. A crossover design was employed so that students in the control group participated in a preceptor shift after their second OSCE. Measurements were based on a five-point Likert scale assessment linked to early EM milestones as defined by the Accreditation Council for Graduate Medical Education (ACGME). RESULTS The mean improvement in total overall score was significantly greater in the intervention group: 4.31 versus 2.57 (Cohen's d = 0.57, p = 0.029). When each milestone was assessed individually, students in the intervention group improved significantly in data gathering (Cohen's d = 0.47, p = 0.048) and professionalism (Cohen's d = 0.66, p = 0.011). There was a statistically nonsignificant improvement in the intervention group compared with the control group in emergency management and communication skills. There was no improvement in either group in medical knowledge.
CONCLUSION A one-on-one preceptor shift can result in a statistically significant improvement in data gathering and professionalism skills as measured by OSCEs.
Affiliation(s)
- Colleen Smith
- Assistant Professor, Emergency Medicine, Medical Education and Simulation, Elmhurst Hospital, Icahn School of Medicine at Mount Sinai, New York, USA
- Joshua Schiller
- Emergency Medicine, Maimonides Medical Center, Brooklyn, USA
11
Fostering leadership through the changing practice of the emergency nurse practitioner specialty. J Am Assoc Nurse Pract 2018; 30:475-477. [PMID: 30211819 DOI: 10.1097/jxx.0000000000000132]
12
Cevik AA, Cakal ED, Abu-Zidan FM. Emergency medicine clerkship curriculum in a high-income developing country: methods for development and application. Int J Emerg Med 2018; 11:31. [PMID: 29882065 PMCID: PMC5991107 DOI: 10.1186/s12245-018-0190-y]
Abstract
Background The published recommendations for international emergency medicine curricula cover content but exclude teaching and learning methods, assessment, and evaluation. We aim to provide an overview of available emergency medicine clerkship curricula and to report the development and application of our own curriculum. Methods Our curriculum is outcome-based, enriched by e-learning and various up-to-date pedagogic principles. Results Teaching and learning methods, assessment, and evaluation are described. The theory behind our practice is discussed in the light of recent literature, aiming to give colleagues from developing countries a clear map for developing and tailoring their own curricula to their needs. The details of our emergency medicine clerkship may serve as an example for developing and developed countries with immature undergraduate emergency medicine clerkship curricula; however, these recommendations will differ across settings depending on available resources. Conclusions The main concept of curriculum development is to create a curriculum whose learning outcomes and content are relevant to the local context, and then to align the teaching and learning activities, assessments, and evaluations so that they are in harmony. This may assure a favorable educational outcome even in resource-limited settings.
Affiliation(s)
- Arif Alper Cevik
- Department of Internal Medicine, Emergency Medicine Clerkship, College of Medicine and Health Sciences, United Arab Emirates University, Al Ain, 17666, United Arab Emirates; Department of Emergency Medicine, Tawam-John Hopkins Hospital, Al Ain, United Arab Emirates
- Elif Dilek Cakal
- Department of Emergency Medicine, Mersin State Hospital, Mersin, Turkey
- Fikri M Abu-Zidan
- Department of Surgery, College of Medicine and Health Sciences, United Arab Emirates University, Al Ain, United Arab Emirates
13
Jung J, Franzen D, Lawson L, Manthey D, Tews M, Dubosh N, Fisher J, Haughey M, House JB, Trainor A, Wald DA, Hiller K. The National Clinical Assessment Tool for Medical Students in the Emergency Department (NCAT-EM). West J Emerg Med 2017; 19:66-74. [PMID: 29383058 PMCID: PMC5785203 DOI: 10.5811/westjem.2017.10.34834]
Abstract
Introduction Clinical assessment of medical students in emergency medicine (EM) clerkships is a highly variable process that presents unique challenges and opportunities. Currently, clerkship directors use institution-specific tools with unproven validity and reliability that may or may not address competencies valued most highly in the EM setting. Standardization of assessment practices and development of a common, valid, specialty-specific tool would benefit EM educators and students. Methods A two-day national consensus conference was held in March 2016 in the Clerkship Directors in Emergency Medicine (CDEM) track at the Council of Residency Directors in Emergency Medicine (CORD) Academic Assembly in Nashville, TN. The goal of this conference was to standardize assessment practices and to create a national clinical assessment tool for use in EM clerkships across the country. Conference leaders synthesized the literature, articulated major themes and questions pertinent to clinical assessment of students in EM, clarified the issues, and outlined the consensus-building process prior to consensus-building activities. Results The first day of the conference was dedicated to developing consensus on these key themes in clinical assessment. The second day of the conference was dedicated to discussing and voting on proposed domains to be included in the national clinical assessment tool. A modified Delphi process was initiated after the conference to reconcile questions and items that did not reach an a priori level of consensus. Conclusion The final tool, the National Clinical Assessment Tool for Medical Students in Emergency Medicine (NCAT-EM) is presented here.
Affiliation(s)
- Julianna Jung
- Johns Hopkins University, Department of Emergency Medicine, Baltimore, Maryland
- Douglas Franzen
- University of Washington, Department of Emergency Medicine, Seattle, Washington
- Luan Lawson
- East Carolina University, Department of Emergency Medicine, Greenville, North Carolina
- David Manthey
- Wake Forest University, Department of Emergency Medicine, Winston-Salem, North Carolina
- Matthew Tews
- Medical College of Georgia, Department of Emergency Medicine, Augusta, Georgia
- Nicole Dubosh
- Beth Israel Deaconess Medical Center/Harvard Medical School, Boston, Massachusetts
- Jonathan Fisher
- University of Arizona, Phoenix, Department of Emergency Medicine, Phoenix, Arizona
- Marianne Haughey
- City University of New York, Department of Emergency Medicine, New York, New York
- Joseph B House
- University of Michigan, Department of Emergency Medicine, Ann Arbor, Michigan
- Arleigh Trainor
- University of South Dakota, Department of Emergency Medicine, Vermillion, South Dakota
- David A Wald
- Temple University, Department of Emergency Medicine, Philadelphia, Pennsylvania
- Katherine Hiller
- University of Arizona, Department of Emergency Medicine, Tucson, Arizona
14
Almeland SK, Lindford A, Berg JO, Hansson E. A core undergraduate curriculum in plastic surgery – a Delphi consensus study in Scandinavia. J Plast Surg Hand Surg 2017; 52:97-105. [DOI: 10.1080/2000656x.2017.1343190]
Affiliation(s)
- Stian K. Almeland
- Department of Plastic and Reconstructive Surgery, Haukeland University Hospital, Bergen, Norway
- Department of Clinical Medicine, University of Bergen, Bergen, Norway
- Andrew Lindford
- Department of Plastic and Reconstructive Surgery, Helsinki University Hospital, Helsinki, Finland
- Jais Oliver Berg
- Department of Plastic and Reconstructive Surgery, Herlev-Gentofte Hospital, University of Copenhagen, Copenhagen, Denmark
- Emma Hansson
- Department of Plastic and Reconstructive Surgery, Sahlgrenska University Hospital, Gothenburg, Sweden
- Department of Clinical Sciences Malmö, Lund University, Lund, Sweden
15
Bruce AN, Kumar A, Malekzadeh S. Procedural Skills of the Entrustable Professional Activities: Are Graduating US Medical Students Prepared to Perform Procedures in Residency? J Surg Educ 2017; 74:589-595. [PMID: 28126380 DOI: 10.1016/j.jsurg.2017.01.002]
Abstract
PURPOSE Competency-based medical education has been successfully instituted in graduate medical education through the development of Milestones. Consequently, the Association of American Medical Colleges implemented the core entrustable professional activities initiative to complement this framework in undergraduate medical education. We sought to determine its efficacy by examining the experiences and confidence of recent medical school graduates with general procedural skills (entrustable professional activity 12). METHOD We administered an electronic survey to the MedStar Georgetown University Hospital intern class assessing their experiences with learning and evaluation, as well as their confidence with procedural skills training, during medical school. Simple linear regression was used to compare respondent confidence with the presence of formal evaluation in medical school. RESULTS We received 28 complete responses, a 33% response rate. Although most respondents indicated that basic cardiopulmonary resuscitation, bag/mask ventilation, and universal precautions were important to and evaluated by their medical school, this emphasis was not present for venipuncture, intravenous catheter placement, and arterial puncture. Mean summed confidence scores for each skill indicated a statistically significant association between confidence and evaluation of universal precaution skills. CONCLUSIONS More advanced procedural skills are not considered as important for graduating medical students and are less likely to be taught and formally evaluated before graduation. Formal evaluation of some procedural skills is associated with increased learner confidence.
Affiliation(s)
- Adrienne N Bruce
- Department of Student Research, Georgetown University School of Medicine, Washington, DC.
- Anagha Kumar
- Department of Biostatistics, MedStar Health Research Institute, Hyattsville, Maryland
- Sonya Malekzadeh
- Department of Otolaryngology-Head and Neck Surgery, Georgetown University School of Medicine, Washington, DC
16
Hall MM, Dubosh NM, Ullman E. Distribution of Honors Grades Across Fourth-year Emergency Medicine Clerkships. AEM Educ Train 2017; 1:81-86. [PMID: 30051015 PMCID: PMC6001727 DOI: 10.1002/aet2.10018]
Abstract
BACKGROUND Medical student grades during emergency medicine (EM) rotations are a key factor in resident selection. The variability in grading among EM clerkships is not well understood. OBJECTIVE The objective was to describe the current grade distribution of fourth-year EM clerkships. METHODS This was an observational study at an EM residency program. We identified grade distributions by reviewing the standard letter of evaluation from individuals applying to our residency program for the 2016 match. Descriptive statistics of proportions, standard deviations (SDs), and p-values were calculated. RESULTS A total of 1,075 applications from 236 individual clerkships were reviewed. Thirty-four programs did not give an honors grade during the previous year; four of these distributed a highest grade of "high pass" and 30 gave only "pass" and/or "fail." Of the remaining 202 programs, the percentage of grades given as honors ranged from 1% to 87%, with a mean (±SD) of 25% (±17.2%). Of the 202 programs that granted honors grades, 63 (31.2%) gave honors to 1% to 14.9% of students, 69 (34.2%) to 15% to 29.9%, 27 (13.4%) to 30% to 44.9%, and 24 (11.9%) to greater than 45%. Medical schools required an EM rotation at 82 (40.6%) sites. Among these programs, honors grades were given to a mean (±SD) of 24% (±16.7%) of students (range 4% to 85%), while programs that did not require the clerkship gave honors to a mean (±SD) of 26% (±17.5%) (range 1% to 87%; p = 0.54). CONCLUSIONS Honors grade distribution varies markedly across U.S. fourth-year EM clerkship sites. Requiring EM clerkships does not affect honors percentages. A minority of sites give only pass/fail grades. Program directors should consider this marked variation in grades when reviewing EM residency applications.
Affiliation(s)
- Matthew M. Hall
- Department of Emergency Medicine, Beth Israel Deaconess Medical Center, Boston, MA
- Nicole M. Dubosh
- Department of Emergency Medicine, Beth Israel Deaconess Medical Center, Boston, MA
- Edward Ullman
- Department of Emergency Medicine, Beth Israel Deaconess Medical Center, Boston, MA
17
Dubosh NM, Fisher J, Lewis J, Ullman EA. Faculty Evaluations Correlate Poorly with Medical Student Examination Performance in a Fourth-Year Emergency Medicine Clerkship. J Emerg Med 2017; 52:850-855. [PMID: 28341085 DOI: 10.1016/j.jemermed.2016.09.018]
Abstract
BACKGROUND Clerkship directors routinely evaluate medical students using multiple modalities, including faculty assessment of clinical performance and written examinations. Both forms of evaluation often play a prominent role in the final clerkship grade. The degree to which these modalities correlate in an emergency medicine (EM) clerkship is unclear. OBJECTIVE We sought to correlate faculty clinical evaluations with medical student performance on a written, standardized EM examination of medical knowledge. METHODS This is a retrospective study of fourth-year medical students in a 4-week EM elective at one academic medical center. EM faculty performed end-of-shift evaluations of students via a blinded online system using a 5-point Likert scale for 8 domains: data acquisition, data interpretation, medical knowledge base, professionalism, patient care and communication, initiative/reliability/dependability, procedural skills, and overall evaluation. All students completed the National EM M4 Examination in EM. Means, medians, and standard deviations for end-of-shift evaluation scores were calculated, and correlations with examination scores were assessed using Spearman's rank correlation coefficient. RESULTS Thirty-nine medical students with 224 discrete faculty evaluations were included. The median number of evaluations completed per student was 6. The mean score (±SD) on the examination was 78.6% ± 6.1%. The examination score correlated poorly with faculty evaluations across all 8 domains (ρ = 0.074-0.316). CONCLUSION Faculty evaluations of medical students across multiple domains of competency correlate poorly with written examination performance during an EM clerkship. Educators need to consider the limitations of examination score in assessing students' ability to provide quality clinical patient care.
Affiliation(s)
- Nicole M Dubosh
- Department of Emergency Medicine, Harvard Medical School, Boston, Massachusetts; Department of Emergency Medicine, Beth Israel Deaconess Medical Center, Boston, Massachusetts
- Jonathan Fisher
- Department of Emergency Medicine, Maricopa Medical Center, Phoenix, Arizona
- Jason Lewis
- Department of Emergency Medicine, Harvard Medical School, Boston, Massachusetts; Department of Emergency Medicine, Beth Israel Deaconess Medical Center, Boston, Massachusetts
- Edward A Ullman
- Department of Emergency Medicine, Harvard Medical School, Boston, Massachusetts; Department of Emergency Medicine, Beth Israel Deaconess Medical Center, Boston, Massachusetts
18
Hiller KM, Franzen D, Lawson L, Manthey D, Fisher J, Haughey M, Tews M, Dubosh N, House J, Trainor A, Wald D, Jung J. Clinical Assessment of Medical Students in the Emergency Department, a National Consensus Conference. West J Emerg Med 2016; 18:82-83. [PMID: 28116013 PMCID: PMC5226769 DOI: 10.5811/westjem.2016.11.32686]
Affiliation(s)
- Katherine M Hiller
- University of Arizona, Department of Emergency Medicine, Tucson, Arizona
- Douglas Franzen
- University of Washington, Department of Medicine, Division of Emergency Medicine, Seattle, Washington
- Luan Lawson
- East Carolina University, Department of Emergency Medicine, Greenville, North Carolina
- David Manthey
- Wake Forest University, Department of Emergency Medicine, Winston-Salem, North Carolina
- Jonathan Fisher
- University of Arizona, Department of Emergency Medicine, Tucson, Arizona
- Marianne Haughey
- St. Barnabas Medical Center, Department of Emergency Medicine, Bronx, New York
- Matthew Tews
- Medical College of Wisconsin, Department of Emergency Medicine, Milwaukee, Wisconsin
- Nicole Dubosh
- Harvard University, Department of Emergency Medicine, Cambridge, Massachusetts
- Joseph House
- University of Michigan, Department of Emergency Medicine, Ann Arbor, Michigan
- Arleigh Trainor
- University of South Dakota, Department of Emergency Medicine, Vermillion, South Dakota
- David Wald
- Lewis Katz School of Medicine, Philadelphia, Pennsylvania
- Julianna Jung
- Johns Hopkins University, Department of Emergency Medicine, Baltimore, Maryland
19
Lawson L, Jung J, Franzen D, Hiller K. Clinical Assessment of Medical Students in Emergency Medicine Clerkships: A Survey of Current Practice. J Emerg Med 2016; 51:705-711. [PMID: 27614539 DOI: 10.1016/j.jemermed.2016.06.045]
Abstract
BACKGROUND Assessment practices in emergency medicine (EM) clerkships have not been previously described. Clinical assessment frequently relies on global ratings of clinical performance, or "shift cards," although these tools have not been standardized or studied. OBJECTIVE We sought to characterize assessment practices in EM clerkships, with particular attention to shift cards. METHODS A survey regarding assessment practices was administered to a national sample of EM clerkship directors (CDs). Descriptive statistics were compiled and regression analyses were performed. RESULTS One hundred seventy-two CDs were contacted, and 100 (58%) agreed to participate. The most heavily weighted assessment methods in final grades were shift cards (66%) and written examinations (21-26%), but there was considerable variability in grading algorithms. EM faculty (100%) and senior residents (69%) were most commonly responsible for assessment, and assessors were often preassigned (71%). Forty-four percent of CDs reported immediate completion of shift cards, 27% within 1 to 2 days, and 20% within a week. Only 40% reported return rates >75%. Thirty percent of CDs do not permit students to review individual evaluations, and 54% of the remainder deidentify evaluations before student review. Eighty-six percent had never performed psychometric analysis on their assessment tools. Sixty-five percent of CDs were satisfied with their shift cards, but 90% supported the development of a national tool. CONCLUSION There is substantial variability in assessment practices between EM clerkships, raising concern regarding the comparability of grades between institutions. CDs rely on shift cards in grading despite the lack of evidence of validity and inconsistent process variables. Standardization of assessment practices may improve the assessment of EM students.
Affiliation(s)
- Luan Lawson
- Department of Emergency Medicine, East Carolina University Brody School of Medicine, Greenville, North Carolina
- Julianna Jung
- Department of Emergency Medicine, Johns Hopkins University School of Medicine, Baltimore, Maryland
- Douglas Franzen
- Division of Emergency Medicine, University of Washington School of Medicine, Seattle, Washington
- Katherine Hiller
- Department of Emergency Medicine, University of Arizona College of Medicine, Tucson, Arizona
20
Milestones for medical students completing a clinical genetics elective. Genet Med 2016; 19:236-239. [PMID: 27584909 DOI: 10.1038/gim.2016.89]
Abstract
PURPOSE We are not aware of any competency-based evaluation method specifically designed for a genetics elective for medical students. Here, we aimed to create a milestone template to improve evaluation and to use the feedback from the template to improve the elective. METHODS Through an iterative process using feedback from eight medical students and eight attendings, we crafted a milestone template for the medical student genetics rotation. A "scavenger hunt" of activities was developed to address several gaps discovered through this process. RESULTS All participants felt that the milestone template was complete for the student level and that it improved evaluation. In response to faculty feedback, we modified the evaluation process so that several evaluators rated students in only selected domains. Scavenger hunt activities were designed to address five domains that the students reported to be inadequately covered. CONCLUSION Developing a milestone template has taken us a step closer to meaningful assessment of students completing the genetics elective and has simultaneously allowed us to strengthen the elective. Meaningful elective experiences in genetics that provide individual feedback within a learner-centered assessment of progress, along with flexible out-of-classroom activities, may contribute to lifelong learning and interest in genetics and genomics.
21
Quinn SM, Worrilow CC, Jayant DA, Bailey B, Eustice E, Kohlhepp J, Rogers R, Kane BG. Using Milestones as Evaluation Metrics During an Emergency Medicine Clerkship. J Emerg Med 2016; 51:426-431. [PMID: 27473442 DOI: 10.1016/j.jemermed.2016.06.014]
Abstract
BACKGROUND The Accreditation Council for Graduate Medical Education (ACGME) Milestones presume that graduating medical students will enter residency proficient at Milestone level 1 for 23 skills. The Next Accreditation System now includes Milestones for each postgraduate specialty, and it is unlikely that schools will document every emergency medicine (EM) applicant's EM-specific skills in their performance evaluation. OBJECTIVES The goals of this research were to determine whether assessment of the Milestones is feasible during a medical student clerkship and to examine the proportion of medical students performing at Milestone level 1. METHODS This study was conducted at a center with Liaison Committee on Medical Education-approved medical training and a 4-year EM residency. Using the traditional clerkship, we studied the feasibility of an ACGME EM Milestones-based clerkship assessment. The resulting data led to a redesign of the clerkship and its evaluation process, including revision of all level 1 anchors on the on-shift assessment forms to add "occasionally" (>60%), "usually" (>80%), and "always" (100%) on a Likert scale. RESULTS During the feasibility phase (2013-14), 75 students rotated through the clerkship; 55 evaluations were issued and 50 contained the Milestone summary. Eight deficiencies were noted in Milestone 12 and three in Milestone 14. After the changes, 49 students rotated under the new evaluation rubric. Of 575 completed on-shift evaluations, 16 Milestone deficiencies were noted. Of 41 institutional evaluations issued, only one student had deficiencies noted, all of which pertained to patient care. All evaluations in this second cohort contained each student's Milestone proficiency. CONCLUSIONS Assessment of the Milestones is feasible. Communication of ACGME EM Milestone proficiency may identify students who require early observation or remediation.
The majority of students meet the anchors for the Milestones, suggesting that clerkship assessment with the ACGME EM Milestones does not adequately differentiate students.
Affiliation(s)
- Shawn M Quinn
- Department of Emergency Medicine, Lehigh Valley Health Network/University of South Florida Morsani College of Medicine, Allentown, Pennsylvania
- Charles C Worrilow
- Department of Emergency Medicine, Lehigh Valley Health Network/University of South Florida Morsani College of Medicine, Allentown, Pennsylvania
- Deepak A Jayant
- Department of Emergency Medicine, Lehigh Valley Health Network/University of South Florida Morsani College of Medicine, Allentown, Pennsylvania
- Blake Bailey
- Department of Emergency Medicine, Lehigh Valley Health Network/University of South Florida Morsani College of Medicine, Allentown, Pennsylvania
- Eric Eustice
- Department of Emergency Medicine, Lehigh Valley Health Network/University of South Florida Morsani College of Medicine, Allentown, Pennsylvania
- Jared Kohlhepp
- Department of Emergency Medicine, Lehigh Valley Health Network/University of South Florida Morsani College of Medicine, Allentown, Pennsylvania
- Ryan Rogers
- Department of Emergency Medicine, Lehigh Valley Health Network/University of South Florida Morsani College of Medicine, Allentown, Pennsylvania
- Bryan G Kane
- Department of Emergency Medicine, Lehigh Valley Health Network/University of South Florida Morsani College of Medicine, Allentown, Pennsylvania
22
Tews MC, Treat RW, Nanes M. Increasing Completion Rate of an M4 Emergency Medicine Student End-of-Shift Evaluation Using a Mobile Electronic Platform and Real-Time Completion. West J Emerg Med 2016; 17:478-83. [PMID: 27429704 PMCID: PMC4944810 DOI: 10.5811/westjem.2016.5.29384]
Abstract
Introduction Medical students on an emergency medicine rotation are traditionally evaluated at the end of each shift with paper-based forms, and data are often missing because forms are not turned in or completed. Because students' grades depend on these evaluations, change was needed to increase the forms' rate of return. We analyzed a new electronic evaluation form and modified completion process to determine whether it would increase the completion rate without altering how faculty scored student performance. Methods During fall 2013, 29 faculty completed N=339 paper evaluations consisting of seven competencies for 33 students. In fall 2014, an electronic evaluation form with the same competencies was designed using an electronic platform and completed N=319 times by 27 faculty using 25 students' electronic devices. Feedback checkboxes were added to facilitate collection of common comments. Data were analyzed with IBM® SPSS® 21.0 using multi-factor analysis of variance with the students' global rating (GR) as the outcome. Inter-item reliability was determined with Cronbach's alpha. Results There was a significantly higher completion rate (p=0.001) for electronic forms (98% vs. 69% paper), a lower (p=0.001) missed-GR rate (1% electronic vs. 12% paper), and a higher mean score (p=0.001) for the GR with the electronic (7.0±1.1) vs. paper (6.8±1.2) form. Feedback checkboxes were completed on every form. The inter-item reliability for the electronic and paper forms was alpha=0.95 for each. Conclusion The use of a new electronic form and a modified completion process for evaluating students at the end of shift demonstrated a higher faculty completion rate, a lower missed-data rate, a higher global rating, and consistent collection of common feedback. The electronic form and the process for obtaining the information made our end-of-shift evaluation process more reliable and provided more accurate, up-to-date information for student feedback and for determining student grades.
Affiliation(s)
- Matthew C Tews
- Medical College of Wisconsin, Department of Emergency Medicine, Milwaukee, Wisconsin
- Robert W Treat
- Medical College of Wisconsin, Department of Emergency Medicine, Milwaukee, Wisconsin
- Maxwell Nanes
- ProHealth Waukesha Memorial Hospital, Emergency Medicine Associates of Waukesha, LLC, Waukesha, Wisconsin
23
Chiu DT, Solano JJ, Ullman E, Pope J, Tibbles C, Horng S, Nathanson LA, Fisher J, Rosen CL. The Integration of Electronic Medical Student Evaluations Into an Emergency Department Tracking System is Associated With Increased Quality and Quantity of Evaluations. J Emerg Med 2016; 51:432-439. [PMID: 27372377 DOI: 10.1016/j.jemermed.2016.05.008]
Abstract
BACKGROUND Medical student evaluations are essential for determining clerkship grades. Electronic evaluations have various advantages over paper evaluations, such as increased ease of collection, asynchronous reporting, and decreased likelihood of becoming lost. OBJECTIVES To determine whether electronic medical student evaluations (EMSEs) provide more evaluations and content compared with paper shift card evaluations. METHODS This before-and-after cohort study was conducted over a 2.5-year period at an academic hospital affiliated with a medical school and emergency medicine residency program. EMSEs replaced the paper shift evaluations that had previously been used halfway through the study period. A random sample of the free-text comments on both paper and EMSEs were blindly judged by medical student clerkship directors for their helpfulness and usefulness. Logistic regression was used to test for any relationship between quality and quantity of words. RESULTS A total of 135 paper evaluations for 30 students and then 570 EMSEs for 62 students were collected. An average of 4.8 (standard deviation [SD] 3.2) evaluations were completed per student using the paper version compared to 9.0 (SD 3.8) evaluations completed per student electronically (p < 0.001). There was an average of 8.8 (SD 8.5) words of free-text evaluation on paper evaluations compared with 22.5 (SD 28.4) words for EMSEs (p < 0.001). A statistically significant (p < 0.02) association between the quality of an evaluation and the word count existed. CONCLUSIONS EMSEs that were integrated into the emergency department tracking system significantly increased the number of evaluations completed compared to paper evaluations. In addition, the EMSEs captured more "helpful/useful" information about the individual students, as evidenced by the longer free-text entries per evaluation.
Affiliation(s)
- All authors: Department of Emergency Medicine, Beth Israel Deaconess Medical Center, Boston, Massachusetts; Department of Emergency Medicine, Harvard Medical School, Boston, Massachusetts

24
Dinh VA, Lakoff D, Hess J, Bahner DP, Hoppmann R, Blaivas M, Pellerito JS, Abuhamad A, Khandelwal S. Medical Student Core Clinical Ultrasound Milestones: A Consensus Among Directors in the United States. J Ultrasound Med 2016; 35:421-434. [PMID: 26782162 DOI: 10.7863/ultra.15.07080] [Citation(s) in RCA: 63] [Impact Index Per Article: 7.9] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/30/2015] [Accepted: 10/20/2015] [Indexed: 06/05/2023]
Abstract
OBJECTIVES Many medical schools are implementing point-of-care ultrasound in their curricula to augment teaching of the physical examination, anatomy, and ultimately clinical management. However, point-of-care ultrasound milestones for medical students remain undefined. The purpose of this study was to formulate a consensus on core medical student clinical point-of-care ultrasound milestones across allopathic and osteopathic medical schools in the United States. Directors leading the integration of ultrasound in medical education (USMED) at their respective institutions were surveyed. METHODS An initial list of 205 potential clinical ultrasound milestones was developed through a literature review. An expert panel of 34 USMED directors across the United States produced consensus on clinical ultrasound milestones through 2 rounds of a modified Delphi technique, an established anonymous process for obtaining consensus through multiple rounds of quantitative questionnaires. RESULTS There was a 100% response rate from the 34 USMED directors in both rounds 1 and 2 of the modified Delphi protocol. After the first round, 2 milestones were revised to improve clarity and 9 were added on the basis of comments from the USMED directors, resulting in 214 milestones forwarded to round 2. After the second round, 90 milestones reached a high level of agreement and were included in the final medical student core clinical ultrasound milestones. CONCLUSIONS This study established, based on consensus from 34 USMED directors, 90 core clinical milestones that all medical students should achieve before graduation. These core milestones can serve as a guide for curriculum deans initiating ultrasound curricula at their institutions. The exact method of implementation and competency assessment needs further investigation.
Affiliation(s)
- Loma Linda University School of Medicine, Loma Linda, California, USA (V.A.D.); Icahn School of Medicine at Mount Sinai, New York, New York, USA (D.L.); University of Wisconsin School of Medicine and Public Health, Madison, Wisconsin, USA (J.H.); The Ohio State University College of Medicine, Columbus, Ohio, USA (D.P.B., S.K.); University of South Carolina School of Medicine, Columbia, South Carolina, USA (R.H., M.B.); Hofstra University North Shore-LIJ School of Medicine, Hempstead, New York, USA (J.S.P.); and Eastern Virginia Medical School, Norfolk, Virginia, USA (A.A.)

25
Conner BJ, Behar-Horenstein LS, Su Y. Comparison of Two Clinical Teaching Models for Veterinary Emergency and Critical Care Instruction. J Vet Med Educ 2016; 43:58-63. [PMID: 26751912 DOI: 10.3138/jvme.0415-069r1] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/05/2023]
Abstract
Standards to oversee the implementation and assessment of clinical teaching of emergency and critical care for veterinary students do not exist. The purpose of this study was to assess differences in the learning environment between two veterinary emergency and critical care clinical rotations (one required, one elective) with respect to caseload, technical/procedural opportunities, direct faculty contact time, client communication opportunities, and students' perception of practice readiness. The authors designed a 22-item survey to assess differences in the learning environment between the two rotations. It was sent electronically to 35 third- and fourth-year veterinary medicine students. Bivariate analyses, including the Wilcoxon signed-rank test and the t-test, were used to compare differences between pre-test and post-test scores among students. Twenty-six student responses were included from the required rotation and nine from the elective rotation. Students preferred the elective community emergency department setting to the required academic setting and reported statistically significantly more positive experiences related to the variables of interest. Students also saw significantly more cases in the community emergency department setting. These findings offer guidance for assessing students' emergency department rotations, suggest how teaching interactions can be modified for optimal learning experiences, and help ensure that students receive maximal opportunities to treat patients representative of what they will encounter in practice.
26
Bradley KE, Andolsek KM. A pilot study of orthopaedic resident self-assessment using a milestones' survey just prior to milestones implementation. Int J Med Educ 2016; 7:11-18. [PMID: 26752012 PMCID: PMC4715902 DOI: 10.5116/ijme.5682.6dfd] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 09/26/2015] [Accepted: 12/29/2015] [Indexed: 05/17/2023]
Abstract
OBJECTIVE To pilot test whether Orthopaedic Surgery residents could self-assess their performance using the newly created milestones defined by the Accreditation Council for Graduate Medical Education. METHODS In June 2012, an email was sent to the Program Directors and administrative coordinators of the 154 accredited Orthopaedic Surgery programs, asking them to send their residents a link to an online survey. The survey was adapted from the Orthopaedic Surgery Milestone Project. Completed surveys were aggregated in an anonymous, confidential database. SAS 9.3 was used to perform the analyses. RESULTS Responses from 71 residents were analyzed. First- and second-year residents indicated through self-assessment that they had substantially achieved the Level 1 and Level 2 milestones. Third-year residents reported they had substantially achieved 30 of 41 Level 3 milestones, and fourth-year residents reported achieving all of them. Fifth-year (graduating) residents reported they had substantially achieved 17 Level 4 milestones and were extremely close on another 15. No milestone was rated at Level 5, the maximum possible. Earlier in training, Patient Care and Medical Knowledge milestones were rated lower than the milestones reflecting the other four competencies of Practice-Based Learning and Improvement, Systems-Based Practice, Professionalism, and Interpersonal Communication; this gap closed by the fourth year. CONCLUSIONS Residents were able to self-assess using the 41 Orthopaedic Surgery milestones, and respondents rated their proficiency as improving over time. Graduating residents reported they had substantially, or nearly substantially, achieved all Level 4 milestones. Milestone self-assessment may be a useful component of a program's overall performance assessment strategy.
Affiliation(s)
- Kendall E. Bradley: Orthopaedic Surgery Residency Training Program, Duke University Hospital, Box 3000 DUMC, Durham, NC 27710, USA
- Kathryn M. Andolsek: Community and Family Medicine, Duke University School of Medicine, Box 3648 DUMC, 201 Trent Drive, Durham, NC 27710, USA

27
Perry M, Hopson L, House JB, Fischer JP, Dooley-Hash S, Hauff S, Wolff MS, Sozener C, Nypaver M, Moll J, Losman ED, Carney M, Santen SA. Model for Developing Educational Research Productivity: The Medical Education Research Group. West J Emerg Med 2015; 16:947-51. [PMID: 26594297 PMCID: PMC4651601 DOI: 10.5811/westjem.2015.9.27306] [Citation(s) in RCA: 10] [Impact Index Per Article: 1.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/01/2015] [Revised: 09/09/2015] [Accepted: 09/26/2015] [Indexed: 11/11/2022] Open
Abstract
Introduction Education research and scholarship are essential for faculty promotion as well as for dissemination of new educational practices. Educational faculty frequently spend the majority of their time on administrative and teaching commitments and, as a result, often fall behind on scholarship and research. The objective of this educational advance is to promote scholarly productivity as a template for others to follow. Methods We formed the Medical Education Research Group (MERG) of education leaders from our emergency medicine residency, fellowship, and clerkship programs, as well as residents with a focus on education. First, we incorporated scholarship into the required activities of our education missions by evaluating the impact of programmatic changes and then submitting the curricula or processes as peer-reviewed work. Second, we worked as a team, sharing projects in a way that improved motivation, accountability, and work completion. Third, our monthly meetings served as brainstorming sessions for new projects, research skill building, and tracking work completion. Lastly, we employed a work-study graduate student to assist with basic but time-consuming manuscript tasks. Results The MERG has been highly productive, achieving the following over a three-year period: 102 abstract presentations, 46 journal article publications, 13 MedEdPORTAL publications, 35 national didactic presentations, and five faculty promotions to the next academic level. Conclusion An intentional focus on scholarship has produced a collaborative group of educators who improved their scholarship through team productivity, ultimately leading to faculty promotions and dissemination of innovations in education.
Affiliation(s)
- Marcia Perry, Laura Hopson, Joseph B House, Suzanne Dooley-Hash, Samantha Hauff, Cemal Sozener, Joel Moll, Eve D Losman, Sally A Santen: University of Michigan, Department of Emergency Medicine, Ann Arbor, Michigan
- Jonathan P Fischer: University of Michigan, Department of Public Health, Ann Arbor, Michigan
- Margaret S Wolff, Michele Nypaver, Michele Carney: University of Michigan, Department of Emergency Medicine, Children's Emergency Services, Ann Arbor, Michigan

28
Ford J, Pambrun C. Exit competencies in pathology and laboratory medicine for graduating medical students: the Canadian approach. Hum Pathol 2015; 46:637-42. [DOI: 10.1016/j.humpath.2015.01.016] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 11/13/2014] [Revised: 01/19/2015] [Accepted: 01/30/2015] [Indexed: 11/28/2022]