1
Kaushik JS, Ramachandran P, Kukreja S, Gupta P, Singh T. Delivering Electives the Clerkship Way: Consolidating the Student Doctor Method of Training. Indian Pediatr 2022. DOI: 10.1007/s13312-022-2600-8.
2
Medical student perceptions of assessment systems, subjectivity, and variability on introductory dermatology clerkships. Int J Womens Dermatol 2021; 7:323-330. PMID: 34222591; PMCID: PMC8243165; DOI: 10.1016/j.ijwd.2021.01.003.
Abstract
Background Elective introductory clerkships in dermatology serve a critical function in providing formative experiences to medical students interested in the field. Although dermatology clerkships play a pivotal role in students' career choices and residency preparation, the assessment systems used to evaluate students on these clerkships vary widely and likely affect student experiences. Objective This study aimed to explore the relationship between dermatology clerkship assessment systems and student experiences through interviews with students about their clerkship reflections and perceptions of assessment. Methods The authors contacted clerkship directors via the Association of Professors of Dermatology mailing list and invited them to provide a description of the assessment system at their institution. The authors, via the contacted clerkship directors, then invited students who had completed an introductory dermatology clerkship between 2018 and 2019 to provide a description of the assessment system at their institution and to participate in a qualitative interview about their experiences with assessment systems. The authors then iteratively synthesized interview transcripts using phenomenological analysis, in which a templated approach was used to achieve comprehensive thematic categorization. Results Prior to clerkship onset, students expressed a limited understanding of their clinical role and the assessment system. During the clerkship, students endorsed variable expectations across preceptors, limited feedback experiences, and pressure to perform for evaluators. After their clerkship, students continued to perceive assessment systems as nontransparent, subjective, and preordained. Conclusion Medical students perceived assessment systems on introductory dermatology clerkships to be unclear and arbitrary.
Encouragingly, students also viewed these challenges in assessment as malleable, identifying several opportunities for educational reform in dermatology clerkships.
3
Tobias A, Sobehart R, Doshi AA, Suffoletto B. Implementation of a Web-Based Tool With Text Message Prompts to Improve End-of-Shift Assessments for Emergency Medicine Residents. J Grad Med Educ 2020; 12:753-758. PMID: 33391600; PMCID: PMC7771609; DOI: 10.4300/jgme-d-20-00204.1.
Abstract
BACKGROUND End-of-shift assessments (ESA) can provide representative data on medical trainee performance but do not occur routinely and are not documented systematically. OBJECTIVE To evaluate the implementation of a web-based tool with text message prompts to assist mobile ESA (mESA) in an emergency medicine (EM) residency program. METHODS mESA used timed text messages to prompt faculty and trainees to expect an in-person qualitative ESA in a milestone content area and to prompt faculty to record descriptive performance data through a web-based platform. We assessed implementation between January 2018 and November 2019 using the RE-AIM framework (reach, effectiveness, adoption, implementation, and maintenance). RESULTS Reach: 96 faculty and 79 trainees participated in the mESA program. Effectiveness: From surveys, approximately 72% of faculty and 58% of trainees reported increases in providing and receiving ESA feedback after program implementation. From ESA submissions, trainees reported receiving in-person feedback on 90% of shifts. Residency leadership confirmed the perceived utility of the mESA program. Adoption: mESA prompts were sent on 7792 unique shifts across 4 EDs, on all days of the week, and at different times of day. Faculty electronically submitted ESA feedback on 45% of shifts. Implementation quality: No technological errors occurred. Maintenance: Completion of in-person ESA feedback and electronic submission of feedback by faculty were stable over time. CONCLUSIONS We found mixed evidence in support of using a web-based tool with text message prompts for mESA for EM trainees.
Affiliation(s)
- Adam Tobias
- Associate Professor of Emergency Medicine, University of Pittsburgh School of Medicine
- Robert Sobehart
- Medical Director, Department of Emergency Medicine, Allegheny Health Network
- Ankur A Doshi
- Associate Professor of Emergency Medicine, University of Pittsburgh School of Medicine
- Brian Suffoletto
- Associate Professor, Department of Emergency Medicine, Stanford University School of Medicine
4
Abstract
Supervision of resident physicians is a high-risk area of emergency medicine, and what constitutes appropriate supervision is a complex question. In this article, policies and procedures for appropriate supervision of resident physicians and the implications for billing are reviewed. Recommendations on supervision of resident physicians in the emergency department are detailed, with attention to the challenge of balancing patient safety against resident autonomy and education during patient care and graduate medical education.
Affiliation(s)
- Alexander Y Sheng
- Department of Emergency Medicine, Boston Medical Center, Boston University School of Medicine, 800 Harrison Avenue, BCD Building, 1st Floor, Boston, MA 02118, USA.
- Avery Clark
- Department of Emergency Medicine, Boston Medical Center, Boston University School of Medicine, 800 Harrison Avenue, BCD Building, 1st Floor, Boston, MA 02118, USA
- Cristopher Amanti
- Department of Emergency Medicine, Boston Medical Center, Boston University School of Medicine, 800 Harrison Avenue, BCD Building, 1st Floor, Boston, MA 02118, USA
5
Chaou CH, Chang YC, Yu SR, Tseng HM, Hsiao CT, Wu KH, Monrouxe LV, Ling RNY. Clinical learning in the context of uncertainty: a multi-center survey of emergency department residents' and attending physicians' perceptions of clinical feedback. BMC Med Educ 2019; 19:174. PMID: 31142306; PMCID: PMC6542138; DOI: 10.1186/s12909-019-1597-8.
Abstract
BACKGROUND Feedback is an essential part of clinical teaching and learning, yet it is often perceived as unsatisfactory in busy clinical settings. Clinical teachers need to balance the competing demands of clinical duty and feedback provision. The influence of the clinical environment and the mutual relationship between feedback giving and seeking have been inadequately investigated. This study therefore aimed to quantify the adequacy, perceptions, and influential factors of feedback provision during resident training in emergency departments (EDs). METHODS A multicenter online questionnaire study was undertaken. The respondents comprised ED residents and clinical teachers from four teaching hospitals in Taiwan. The questionnaire was developed by an expert panel, and a pilot study ensured validity. Ninety clinical teachers and 54 residents participated. RESULTS The respondents reported that the majority of feedback, which usually lasted 1-5 min, was initiated by the clinical teachers. Feedback satisfaction was significantly lower for the clinical teachers than for the residents (clinical teachers M = 13.8, SD = 1.83; residents M = 15.3, SD = 2.14; p < 0.0001), and positive feedback was provided infrequently in clinical settings (31.1%). Both groups of participants admitted hesitating between providing or seeking feedback and completing clinical work. Being busy, the teachers' clinical abilities, the learners' attitudes, and the relationship between both parties were reported as the most influential factors in feedback provision. CONCLUSION ED clinical feedback provision is often short, circumstantial, and initiated by clinical teachers. Providing or seeking feedback appears to be an important part of clinical learning in the context of uncertainty. The importance of the relationship between the feedback seeker and the provider highlights the interactive, reciprocal nature of clinical feedback provision.
Affiliation(s)
- Chung-Hsien Chaou
- Chang-Gung Medical Education Research Centre, Chang Gung Memorial Hospital, Taoyuan, Taiwan
- Department of Emergency Medicine, Chang Gung Memorial Hospital, Linkou and Chang Gung University College of Medicine, Taoyuan, Taiwan
- Yu-Che Chang
- Chang-Gung Medical Education Research Centre, Chang Gung Memorial Hospital, Taoyuan, Taiwan
- Department of Emergency Medicine, Chang Gung Memorial Hospital, Linkou and Chang Gung University College of Medicine, Taoyuan, Taiwan
- Shiuan-Ruey Yu
- Chang-Gung Medical Education Research Centre, Chang Gung Memorial Hospital, Taoyuan, Taiwan
- Hsu-Min Tseng
- Chang-Gung Medical Education Research Centre, Chang Gung Memorial Hospital, Taoyuan, Taiwan
- Department of Health Care Management, Chang Gung University, Taoyuan, Taiwan
- Cheng-Ting Hsiao
- Chang-Gung Medical Education Research Centre, Chang Gung Memorial Hospital, Taoyuan, Taiwan
- Department of Emergency Medicine, Chang Gung Memorial Hospital, Chiayi and Chang Gung University College of Medicine, Taoyuan, Taiwan
- Kuan-Han Wu
- Department of Emergency Medicine, Chang Gung Memorial Hospital, Kaohsiung and Chang Gung University College of Medicine, Taoyuan, Taiwan
- Lynn Valerie Monrouxe
- Chang-Gung Medical Education Research Centre, Chang Gung Memorial Hospital, Taoyuan, Taiwan
- Roy Ngerng Yi Ling
- Chang-Gung Medical Education Research Centre, Chang Gung Memorial Hospital, Taoyuan, Taiwan
6
Hoonpongsimanont W, Feldman M, Bove N, Sahota PK, Velarde I, Anderson CL, Wiechmann W. Improving feedback by using first-person video during the emergency medicine clerkship. Adv Med Educ Pract 2018; 9:559-565. PMID: 30127651; PMCID: PMC6091253; DOI: 10.2147/amep.s169511.
Abstract
PURPOSE Providing feedback to students in the emergency department during their emergency medicine clerkship can be challenging due to time constraints, the logistics of direct observation, and privacy limitations. The authors aimed to evaluate the effectiveness of first-person video, captured via Google Glass™, in enhancing feedback quality in medical student education. MATERIALS AND METHODS As a clerkship requirement, students asked patients and attending physicians to wear the Google Glass™ device to record patient encounters and patient presentations, respectively. Afterwards, students reviewed the recordings with faculty, who provided formative and summative feedback during a private, one-on-one session. We introduced the intervention to 45 fourth-year medical students who completed their mandatory emergency medicine clerkships at a United States medical school during the 2015-2016 academic year. RESULTS Students assessed their performances before and after the review sessions using standardized medical school evaluation forms. We compared students' self-assessment scores to faculty assessment scores in 14 categories using descriptive statistics and symmetry tests. The overall mean scores for each of the 14 categories ranged between 3 and 4 (out of 5) on the self-assessment forms. When evaluating the propensity of self-assessment scores to shift toward the faculty assessment scores, we found no significant changes in any of the 14 categories. Although the change was not statistically significant, one fifth of students shifted their perceptions of their clinical skills (history taking, performing physical exams, presenting cases, and developing differential diagnoses and plans) toward the faculty assessments after reviewing the video recordings. CONCLUSION First-person video recording nevertheless initiated the feedback process, allocated specific time and space for feedback, and possibly substituted for direct observation. Additional studies, with different outcomes and larger sample sizes, are needed to understand the effectiveness of first-person video in improving feedback quality.
Affiliation(s)
- Wirachin Hoonpongsimanont
- Department of Emergency Medicine, University of California, Irvine School of Medicine, Irvine, CA, USA
- Maja Feldman
- Department of Emergency Medicine, University of California, Irvine School of Medicine, Irvine, CA, USA
- Nicholas Bove
- Department of Emergency Medicine, University of California, Irvine School of Medicine, Irvine, CA, USA
- Preet Kaur Sahota
- Department of Emergency Medicine, University of California, Irvine School of Medicine, Irvine, CA, USA
- Irene Velarde
- Department of Emergency Medicine, University of California, Irvine School of Medicine, Irvine, CA, USA
- Touro University College of Osteopathic Medicine, Vallejo, CA, USA
- Craig L Anderson
- Department of Emergency Medicine, University of California, Irvine School of Medicine, Irvine, CA, USA
- Warren Wiechmann
- Department of Emergency Medicine, University of California, Irvine School of Medicine, Irvine, CA, USA
7
Cevik AA, Shaban S, El Zubeir M, Abu-Zidan FM. The role of emergency medicine clerkship e-Portfolio to monitor the learning experience of students in different settings: a prospective cohort study. Int J Emerg Med 2018; 11:24. PMID: 29651758; PMCID: PMC5897274; DOI: 10.1186/s12245-018-0184-9.
Abstract
Background Although emergency departments provide acute care learning opportunities for medical students, student exposure to recommended curriculum presentations and procedures is limited. Accordingly, clinical environments that provide learning opportunities for students should be monitored as part of an ongoing quality improvement process. This study aims to analyze student exposures and their involvement levels in two different hospitals (Tawam and Al Ain) so as to improve teaching and learning activities. Methods This is a prospective study of all 76 final-year medical students' electronic logbooks (e-Portfolio) for the academic year 2016/2017. Results Students recorded 5087 chief complaints and 3721 procedures. The average patient and procedure exposures per student per shift in Al Ain Hospital compared with Tawam Hospital were 7.2 vs 6.4 and 5.8 vs 4.3, respectively. The highest full involvement with presentations was seen in the pediatric unit (67.1%, P < 0.0001). Urgent care shifts demonstrated the highest proportion of "full involvement" with procedures for our students (73.2%, P < 0.0001). Students' highest involvement with presentations and procedures occurred during the night shifts (P < 0.0001; 66.5 and 75.1%, respectively). Conclusions The electronic portfolio proved a very useful tool for defining the learning activities of final-year medical students during their emergency medicine clerkship and for comparing activities in two different clinical settings. Data collected and analyzed using this e-Portfolio have the potential to help medical educators and curriculum designers improve emergency medicine teaching and learning activities.
Affiliation(s)
- Arif Alper Cevik
- Department of Internal Medicine, Emergency Medicine Clerkship, College of Medicine and Health Sciences, United Arab Emirates University, Al Ain, 17666, United Arab Emirates
- Department of Emergency Medicine, Tawam-Johns Hopkins Hospital, Al Ain, UAE
- Sami Shaban
- Department of Medical Education, College of Medicine and Health Sciences, United Arab Emirates University, Al Ain, United Arab Emirates
- Margret El Zubeir
- Department of Medical Education, College of Medicine and Health Sciences, United Arab Emirates University, Al Ain, United Arab Emirates
- Fikri M Abu-Zidan
- Department of Surgery, College of Medicine and Health Sciences, United Arab Emirates University, Al Ain, United Arab Emirates
8
Moreau KA, Eady K, Tang K, Jabbour M, Frank JR, Campbell M, Hamstra SJ. The development of the PARENTS: a tool for parents to assess residents' non-technical skills in pediatric emergency departments. BMC Med Educ 2017; 17:210. PMID: 29137674; PMCID: PMC5686846; DOI: 10.1186/s12909-017-1042-9.
Abstract
BACKGROUND Parents can assess residents' non-technical skills (NTS) in pediatric emergency departments (EDs), yet no assessment tools with validity evidence exist for parental use in pediatric EDs. The purpose of this study was to develop the Parents' Assessment of Residents Enacting Non-Technical Skills (PARENTS) educational assessment tool and to collect three sources of validity evidence (i.e., content, response process, internal structure) for it. METHODS We established content evidence for the PARENTS through interviews with physician-educators and residents, focus groups with parents, a literature review, and a modified nominal group technique with experts. We collected response process evidence through cognitive interviews with parents. To examine the internal structure evidence, we administered the PARENTS and performed exploratory factor analysis. RESULTS Initially, a 20-item PARENTS was developed. Cognitive interviews led to the removal of one closed-ended item, the addition of resident photographs, and wording/formatting changes. Thirty-seven residents and 434 parents participated in the administration of the resulting 19-item PARENTS. Following factor analysis, a one-factor model prevailed. CONCLUSIONS The study presents initial validity evidence for the PARENTS. It also highlights strategies for potentially: (a) involving parents in the assessment of residents, (b) improving the assessment of NTS in pediatric EDs, and
Affiliation(s)
- Katherine A. Moreau
- Faculty of Education, University of Ottawa, 145 Jean-Jacques-Lussier Private, Ottawa, ON K1N 6N5 Canada
- Children’s Hospital of Eastern Ontario Research Institute, University of Ottawa, 401 Smyth Road, Ottawa, ON K1H 8L1 Canada
- Kaylee Eady
- Children’s Hospital of Eastern Ontario Research Institute, University of Ottawa, 401 Smyth Road, Ottawa, ON K1H 8L1 Canada
- School of Rehabilitation Sciences, Faculty of Health Sciences, University of Ottawa, 451 Smyth Road, Ottawa, ON K1H 8M5 Canada
- Kenneth Tang
- Children’s Hospital of Eastern Ontario Research Institute, University of Ottawa, 401 Smyth Road, Ottawa, ON K1H 8L1 Canada
- Mona Jabbour
- Department of Emergency Medicine, Faculty of Medicine, University of Ottawa, 1053 Carling Avenue, Ottawa, ON K1Y 4E9 Canada
- Department of Pediatrics, Faculty of Medicine, University of Ottawa, 401 Smyth Road, Ottawa, ON K1H 8L1 Canada
- Children’s Hospital of Eastern Ontario, University of Ottawa, 401 Smyth Road, Ottawa, ON K1H 8L1 Canada
- Jason R. Frank
- Department of Emergency Medicine, Faculty of Medicine, University of Ottawa, 1053 Carling Avenue, Ottawa, ON K1Y 4E9 Canada
- Royal College of Physicians and Surgeons of Canada, 774 Echo Drive, Ottawa, ON K1S 5N8 Canada
- Meaghan Campbell
- Children’s Hospital of Eastern Ontario, University of Ottawa, 401 Smyth Road, Ottawa, ON K1H 8L1 Canada
- Stanley J. Hamstra
- Faculty of Education, University of Ottawa, 145 Jean-Jacques-Lussier Private, Ottawa, ON K1N 6N5 Canada
- Accreditation Council for Graduate Medical Education, 401 North Michigan Avenue, Chicago, IL 60611 USA
9
Kornegay JG, Kraut A, Manthey D, Omron R, Caretta-Weyer H, Kuhn G, Martin S, Yarris LM. Feedback in Medical Education: A Critical Appraisal. AEM Educ Train 2017; 1:98-109. PMID: 30051017; PMCID: PMC6001508; DOI: 10.1002/aet2.10024.
Abstract
OBJECTIVE The objective was to review and critically appraise the medical education literature pertaining to feedback and to highlight influential papers that inform our current understanding of the role of feedback in medical education. METHODS A search of the English-language literature querying Education Resources Information Center (ERIC), PsycINFO, PubMed, and Scopus identified 327 feedback-related papers using quantitative methods (hypothesis-testing or observational investigations of educational interventions), qualitative methods (exploring important phenomena in emergency medicine [EM] education), or review methods. Two reviewers independently screened each category of publications using previously established exclusion criteria. Six reviewers then independently scored the remaining 54 publications using a qualitative, quantitative, or review paper scoring system. Each scoring system consisted of nine criteria and used parallel scoring metrics that have been previously used in critical appraisals of education research. RESULTS Fifty-four feedback papers (25 quantitative studies, 24 qualitative studies, five review papers) met the a priori criteria for inclusion and were reviewed. Eight quantitative studies, nine qualitative studies, and three review papers were ranked highly by the reviewers and are summarized in this article. CONCLUSIONS This inaugural Council of Emergency Medicine Residency Directors Academy critical appraisal highlights 20 papers on feedback in medical education that describe the current state of the feedback literature. A summary of current factors that influence feedback effectiveness is discussed, along with practical implications for EM educators and next steps for research.
Affiliation(s)
- Joshua G. Kornegay
- Department of Emergency Medicine, Oregon Health & Science University, Portland, OR
- Aaron Kraut
- BerbeeWalsh Department of Emergency Medicine, University of Wisconsin School of Medicine and Public Health, Madison, WI
- David Manthey
- Department of Emergency Medicine, Wake Forest University Baptist Health, Winston-Salem, NC
- Rodney Omron
- Department of Emergency Medicine, Johns Hopkins School of Medicine, Baltimore, MD
- Holly Caretta-Weyer
- Department of Emergency Medicine, Oregon Health & Science University, Portland, OR
- Gloria Kuhn
- Department of Emergency Medicine, Wayne State University, Detroit, MI
- Sandra Martin
- Department of Emergency Medicine, Wayne State University, Detroit, MI
- Lalena M. Yarris
- Department of Emergency Medicine, Oregon Health & Science University, Portland, OR
10
Bernard AW, Ceccolini G, Feinn R, Rockfeld J, Rosenberg I, Thomas L, Cassese T. Medical students' review of formative OSCE scores, checklists, and videos improves with student-faculty debriefing meetings. Med Educ Online 2017; 22:1324718. PMID: 28521646; PMCID: PMC5508650; DOI: 10.1080/10872981.2017.1324718.
Abstract
BACKGROUND Performance feedback is considered essential to clinical skills development. Formative objective structured clinical exams (F-OSCEs) often include immediate feedback from standardized patients. Students can also be given access to performance metrics, including scores, checklists, and video recordings, after the F-OSCE to supplement this feedback. How often students choose to review these data and how review affects future performance have not been documented. OBJECTIVE We suspect student review of F-OSCE performance data is variable. We hypothesize that students who review these data perform better on subsequent F-OSCEs than those who do not. We also suspect that the frequency of data review can be improved with faculty involvement in the form of student-faculty debriefing meetings. DESIGN Simulation recording software tracks and time-stamps student review of performance data. We investigated a cohort of first- and second-year medical students from the 2015-16 academic year. Basic descriptive statistics were used to characterize the frequency of data review, and a linear mixed-model analysis was used to determine relationships between data review and future F-OSCE performance. RESULTS Students reviewed scores (64%), checklists (42%), and videos (28%) with decreasing frequency. The frequency of review of all metrics and modalities improved when student-faculty debriefing meetings were conducted (p<.001). Among 92 first-year students, checklist review was associated with improved performance on subsequent F-OSCEs (p=0.038) by 1.07 percentage points on a scale of 0-100. Among 86 second-year students, no review modality was associated with improved performance on subsequent F-OSCEs. CONCLUSION Medical students review F-OSCE checklists and video recordings less than 50% of the time when not prompted. Student-faculty debriefing meetings increased student data review. First-year students' review of checklists on F-OSCEs was associated with increased performance on subsequent F-OSCEs; however, this outcome was not observed among second-year students.
Affiliation(s)
- Aaron W. Bernard
- Frank H. Netter MD School of Medicine at Quinnipiac, Hamden, CT, USA
- Richard Feinn
- Frank H. Netter MD School of Medicine at Quinnipiac, Hamden, CT, USA
- Jennifer Rockfeld
- Frank H. Netter MD School of Medicine at Quinnipiac, Hamden, CT, USA
- Ilene Rosenberg
- Frank H. Netter MD School of Medicine at Quinnipiac, Hamden, CT, USA
- Listy Thomas
- Frank H. Netter MD School of Medicine at Quinnipiac, Hamden, CT, USA
- Todd Cassese
- Frank H. Netter MD School of Medicine at Quinnipiac, Hamden, CT, USA
11
Quinn SM, Worrilow CC, Jayant DA, Bailey B, Eustice E, Kohlhepp J, Rogers R, Kane BG. Using Milestones as Evaluation Metrics During an Emergency Medicine Clerkship. J Emerg Med 2016; 51:426-431. PMID: 27473442; DOI: 10.1016/j.jemermed.2016.06.014.
Abstract
BACKGROUND The Accreditation Council for Graduate Medical Education's (ACGME) Milestones presume that graduating medical students will enter residency proficient at Milestone level 1 for 23 skills. The Next Accreditation System now includes Milestones for each postgraduate specialty, and it is unlikely that schools will document every emergency medicine (EM) applicant's EM-specific skills in their performance evaluation. OBJECTIVES The goals of this research were to determine whether assessment of the Milestones is feasible during a medical student clerkship and to examine the proportion of medical students performing at Milestone level 1. METHODS This study was conducted at a center with Liaison Committee on Medical Education-approved medical training and a 4-year EM residency. Using the traditional clerkship, we studied the feasibility of an ACGME EM Milestones-based clerkship assessment. The resulting data led to a redesign of the clerkship and its evaluation process, including modification of all level 1 anchor(s) to add "occasionally" (>60%), "usually" (>80%), and "always" (100%) on a Likert scale on the on-shift assessment forms. RESULTS During the feasibility phase (2013-14), 75 students rotated through the clerkship; 55 evaluations were issued and 50 contained the Milestone summary. Eight deficiencies were noted in Milestone 12 and three in Milestone 14. After the changes, 49 students rotated under the new evaluation rubric. Of 575 completed on-shift evaluations, 16 Milestone deficiencies were noted. Of 41 institutional evaluations issued, only one student had deficiencies noted, all of which pertained to patient care. All evaluations in this second cohort contained each student's Milestone proficiency. CONCLUSIONS Assessment of the Milestones is feasible. Communication of ACGME EM Milestone proficiency may identify students who require early observation or remediation. The majority of students meet the anchors for the Milestones, suggesting that clerkship assessment with the ACGME EM Milestones does not adequately differentiate students.
Affiliation(s)
- Shawn M Quinn
- Department of Emergency Medicine, Lehigh Valley Health Network/University of South Florida Morsani College of Medicine, Allentown, Pennsylvania
- Charles C Worrilow
- Department of Emergency Medicine, Lehigh Valley Health Network/University of South Florida Morsani College of Medicine, Allentown, Pennsylvania
- Deepak A Jayant
- Department of Emergency Medicine, Lehigh Valley Health Network/University of South Florida Morsani College of Medicine, Allentown, Pennsylvania
- Blake Bailey
- Department of Emergency Medicine, Lehigh Valley Health Network/University of South Florida Morsani College of Medicine, Allentown, Pennsylvania
- Eric Eustice
- Department of Emergency Medicine, Lehigh Valley Health Network/University of South Florida Morsani College of Medicine, Allentown, Pennsylvania
- Jared Kohlhepp
- Department of Emergency Medicine, Lehigh Valley Health Network/University of South Florida Morsani College of Medicine, Allentown, Pennsylvania
- Ryan Rogers
- Department of Emergency Medicine, Lehigh Valley Health Network/University of South Florida Morsani College of Medicine, Allentown, Pennsylvania
- Bryan G Kane
- Department of Emergency Medicine, Lehigh Valley Health Network/University of South Florida Morsani College of Medicine, Allentown, Pennsylvania
12
Tews MC, Treat RW, Nanes M. Increasing Completion Rate of an M4 Emergency Medicine Student End-of-Shift Evaluation Using a Mobile Electronic Platform and Real-Time Completion. West J Emerg Med 2016; 17:478-83. PMID: 27429704; PMCID: PMC4944810; DOI: 10.5811/westjem.2016.5.29384.
Abstract
Introduction Medical students on an emergency medicine rotation are traditionally evaluated at the end of each shift with paper-based forms, and data are often missing because forms are not completed or turned in. Because students' grades depend on these evaluations, a change was needed to increase the rate of return. We analyzed a new electronic evaluation form and a modified completion process to determine whether they would increase the completion rate without altering how faculty scored student performance. Methods During fall 2013, 29 faculty completed N=339 paper evaluations covering seven competencies for 33 students. In fall 2014, an electronic evaluation form with the same competencies was built on an electronic platform and completed N=319 times by 27 faculty using 25 students' electronic devices. Feedback checkboxes were added to facilitate collection of common comments. Data were analyzed with IBM SPSS 21.0 using multi-factor analysis of variance with the students' global rating (GR) as the outcome; inter-item reliability was determined with Cronbach's alpha. Results The electronic forms had a significantly higher completion rate (98% vs. 69% paper, p=0.001), a lower rate of missed GRs (1% vs. 12% paper, p=0.001), and a higher mean GR (7.0±1.1 vs. 6.8±1.2 paper, p=0.001). Feedback checkboxes were completed on every form. Inter-item reliability was alpha=0.95 for both the electronic and paper forms. Conclusion The new electronic form and modified completion process yielded a higher faculty completion rate, less missing data, a higher global rating, and consistent collection of common feedback, making our end-of-shift evaluation process more reliable and providing more accurate, up-to-date information for student feedback and for determining student grades.
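The inter-item reliability statistic reported above, Cronbach's alpha, can be reproduced from a matrix of raw ratings. A minimal sketch in Python; the scores below are hypothetical (the study's evaluation data are not public), not the actual dataset:

```python
from statistics import variance

def cronbach_alpha(rows):
    """Cronbach's alpha for a list of respondent rows (each row = one evaluation's item scores)."""
    k = len(rows[0])                               # number of items
    cols = list(zip(*rows))                        # per-item score lists
    item_var_sum = sum(variance(c) for c in cols)  # sum of sample variances of each item
    total_var = variance(sum(r) for r in rows)     # sample variance of total scores
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Hypothetical end-of-shift ratings: 5 evaluations x 4 competencies
scores = [
    [7, 6, 7, 7],
    [5, 5, 6, 5],
    [8, 7, 8, 8],
    [6, 6, 6, 7],
    [4, 5, 4, 5],
]
print(round(cronbach_alpha(scores), 2))  # → 0.96
```

Values near 0.95, as both the paper and electronic forms showed, indicate highly consistent ratings across the seven competency items.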
Affiliation(s)
- Matthew C Tews, Robert W Treat: Medical College of Wisconsin, Department of Emergency Medicine, Milwaukee, Wisconsin
- Maxwell Nanes: ProHealth Waukesha Memorial Hospital, Emergency Medicine Associates of Waukesha, LLC, Waukesha, Wisconsin
|
13
|
Tews MC, Ditz Wyte CM, Coltman M, Hiller K, Jung J, Oyama LC, Jubanyik K, Khandelwal S, Goldenberg W, Wald DA, Zun LS, Zinzuwadia S, Pandit K, An C, Ander DS. Implementing a third-year emergency medicine medical student curriculum. J Emerg Med 2015; 48:732-743.e8. [PMID: 25825161] [DOI: 10.1016/j.jemermed.2014.12.063]
Abstract
BACKGROUND Emergency medicine (EM) is commonly introduced in the fourth year of medical school because of a perceived need to have more experienced students in the complex and dynamic environment of the emergency department. However, there is no evidence supporting the optimal time or duration for an EM rotation, and a number of institutions offer third-year rotations. OBJECTIVE A recently published syllabus provides areas of knowledge, skills, and attitudes that third-year EM rotation directors can use to develop curricula. This article expands on that syllabus by providing a comprehensive curricular guide for the third-year medical student rotation with a focus on implementation. DISCUSSION Included are consensus-derived learning objectives, discussion of educational methods, considerations for implementation, and information on feedback and evaluation as proposed by the Clerkship Directors in Emergency Medicine Third-Year Curriculum Work Group. External validation results, derived from a survey of third-year rotation directors, are provided in the form of a content validity index for each content area. CONCLUSIONS This consensus-derived curricular guide can be used by faculty who are developing or revising a third-year EM medical student rotation and provide guidance for implementing this curriculum at their institution.
Affiliation(s)
- Matthew C Tews: Department of Emergency Medicine, Medical College of Wisconsin, Milwaukee, Wisconsin
- Collette Marie Ditz Wyte, Marion Coltman: Department of Emergency Medicine, Oakland University, William Beaumont School of Medicine, Royal Oak, Michigan
- Kathy Hiller: Department of Emergency Medicine, University of Arizona Health Network, Tucson, Arizona
- Julianna Jung: Department of Emergency Medicine, Johns Hopkins University School of Medicine, Baltimore, Maryland
- Leslie C Oyama: UCSD Emergency Medicine, University of California, San Diego, San Diego, California
- Karen Jubanyik: Department of Emergency Medicine, Yale-New Haven Hospital, New Haven, Connecticut
- Sorabh Khandelwal: Department of Emergency Medicine, The Ohio State University Medical Center, Columbus, Ohio
- William Goldenberg: Department of Emergency Medicine, Naval Medical Center, San Diego, California
- David A Wald: Department of Emergency Medicine, Temple University School of Medicine, Philadelphia, Pennsylvania
- Leslie S Zun: Department of Emergency Medicine, Mount Sinai Hospital, Chicago Medical School, Chicago, Illinois
- Shreni Zinzuwadia: Department of Emergency Medicine, New Jersey Medical School-University Hospital, Newark, New Jersey
- Kiran Pandit: Department of Emergency Medicine, Columbia University, New York, New York
- Charlene An: Department of Emergency Medicine, SUNY Downstate Medical Center, Brooklyn, New York
- Douglas S Ander: Department of Emergency Medicine, Emory University School of Medicine, Atlanta, Georgia
|
14
|
Alquraini MM, Baig L, Magzoub M, Omair A. Evaluation of off-service rotations at National Guard Health Affairs: Results from a perception survey of off-service residents. J Family Community Med 2013; 20:123-9. [PMID: 23983565] [PMCID: PMC3748647] [DOI: 10.4103/2230-8229.114773]
Abstract
Context: “Off-service” clinical rotations are a required component of many residency training programs. Because these rotations are off-service, little attention is given to their structure and quality of training, which often leads to a suboptimal educational experience for the residents on them. Aims: The aim of this study was to assess medical residents’ perceptions, opinions, and levels of satisfaction with their “off-service” rotations at a major residency training site in Saudi Arabia, and to evaluate the reliability and validity of a questionnaire used for quality assurance in these rotations. Improved reliability and validity of this questionnaire may help to improve the educational experience of residents in their “off-service” rotations. Materials and Methods: A close-ended questionnaire was developed, pilot tested, and distributed to 110 off-service residents in training programs of different specialties at King Fahad National Guard Hospital and King Abdulaziz Medical City, Riyadh, Saudi Arabia, between September 2011 and December 2011. Results: A total of 80 of the 110 residents completed and returned the questionnaire. Only 33% of these residents had a clear set of goals and educational learning objectives to direct their training before the beginning of their off-service rotations. Surgical specialties had a low mean satisfaction score of 57.2 (11.9) compared with emergency medicine at 70.7 (16.2), P = 0.03. The reliability of the questionnaire (Cronbach’s alpha) was 0.57. The factor analysis yielded a four-factor solution (educational environment, educational balance, educational goals and objectives, and learning ability), accounting for 51% of the variance in the data. Conclusion: Our data suggest that there were significant weaknesses in the curriculum for off-service clinical rotations at KAMC and that residents were not completely satisfied with their training.
Affiliation(s)
- Mustafa M Alquraini: Department of Emergency Medicine, King Abdul-Aziz Medical City for National Guard Health Affairs, Riyadh, Kingdom of Saudi Arabia
|
15
|
Dekker H, Schönrock-Adema J, Snoek JW, van der Molen T, Cohen-Schotanus J. Which characteristics of written feedback are perceived as stimulating students' reflective competence: an exploratory study. BMC Med Educ 2013; 13:94. [PMID: 23829790] [PMCID: PMC3750500] [DOI: 10.1186/1472-6920-13-94]
Abstract
BACKGROUND Teacher feedback on student reflective writing is recommended to improve learners' reflective competence. To be able to improve teacher feedback on reflective writing, it is essential to gain insight into which characteristics of written feedback stimulate students' reflection processes. Therefore, we investigated (1) which characteristics can be distinguished in written feedback comments on reflective writing and (2) which of these characteristics are perceived to stimulate students' reflection processes. METHODS We investigated written feedback comments from forty-three teachers on their students' reflective essays. In Study 1, twenty-three medical educators grouped the comments into distinct categories. We used Multiple Correspondence Analysis to determine dimensions in the set of comments. In Study 2, another group of twenty-one medical educators individually judged whether the comments stimulated reflection by rating them on a five-point scale. We used t-tests to investigate whether comments classified as stimulating and not stimulating reflection differed in their scores on the dimensions. RESULTS Our results showed that characteristics of written feedback comments can be described in three dimensions: format of the feedback (phrased as statement versus question), focus of the feedback (related to the levels of students' reflections) and tone of the feedback (positive versus negative). Furthermore, comments phrased as a question and in a positive tone were judged as stimulating reflection more than comments at the opposite side of those dimensions (t(14.5) = 6.48, p < .001 and t(15) = -1.80, p < .10, respectively). The effect sizes were large for format of the feedback comment (r = .86) and medium for tone of the feedback comment (r = .42).
CONCLUSIONS This study suggests that written feedback comments on students' reflective essays should be formulated as a question, positive in tone and tailored to the individual student's reflective level in order to stimulate students to reflect on a slightly higher level. Further research is needed to examine whether incorporating these characteristics into teacher training helps to improve the quality of written feedback comments on reflective writing.
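The effect sizes in this abstract can be recovered from the reported t statistics and degrees of freedom via the standard conversion r = sqrt(t² / (t² + df)). A quick check in Python, assuming this is the conversion the authors used:

```python
import math

def t_to_r(t, df):
    """Effect size r derived from a t statistic and its degrees of freedom."""
    return math.sqrt(t ** 2 / (t ** 2 + df))

# Values reported in the abstract
print(round(t_to_r(6.48, 14.5), 2))  # format dimension → 0.86
print(round(t_to_r(-1.80, 15), 2))   # tone dimension → 0.42
```

Both values match the r = .86 and r = .42 reported in the Results, so the abstract's statistics are internally consistent.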
Affiliation(s)
- Hanke Dekker, Jos W Snoek: Institute for Medical Education, University of Groningen and University Medical Center Groningen, A. Deusinglaan 1, FC40, 9713 AV, Groningen, the Netherlands
- Johanna Schönrock-Adema, Janke Cohen-Schotanus: Center for Research and Innovation of Medical Education, University of Groningen and University Medical Center Groningen, Groningen, the Netherlands
- Thys van der Molen: Department of Primary Care, University of Groningen and University Medical Center Groningen, Groningen, the Netherlands
|