1
McCarrick CA, Moynihan A, Khan MF, Lennon F, Stokes M, Donnelly S, Heneghan H, Cahill RA. Impact of Simulation Training on Core Skill Competency of Undergraduate Medical Students. J Surg Educ 2024; 81:1222-1228. PMID: 38981819. DOI: 10.1016/j.jsurg.2024.06.006.
Abstract
INTRODUCTION Simulation-based medical training (SBMT) is gaining traction for undergraduate learning and development. We designed, implemented, and independently assessed the impact of an SBMT programme on competency in surgical history taking and clinical examination for senior clinical students. METHODS With institutional ethical approval, and after an initial pilot study of student volunteers to ensure format appropriateness, we implemented an SBMT programme weekly for ten weeks during the core surgery module of our Medicine degree programme. Groups of 5 students collaboratively undertook an observed focused history and physical examination while simultaneously directing care on a simulated surgical patient (actor) with acute abdominal pain. This was conducted in a nonclinical, standardised, tutor-supervised environment and was followed by a group debriefing, led by both the simulated patient and the tutor, discussing student interaction and competency. All students undertook the Southampton Medical Assessment Tool (SMAT) on a surgical inpatient prior to (baseline) and within 2 weeks after SBMT. Students without simulation training functioned as a control group, and randomised cluster sampling was utilised for group selection. Second assessments were by independent surgical academics blinded to student group. Feedback was collected via anonymous questionnaire from those who undertook SBMT. RESULTS One hundred students took part, fifty of whom undertook SBMT. Global mean SMAT scores were similar between the control and intervention groups at baseline (p > 0.05). Scores on the second assessment were significantly higher (p = 0.0006) for those who had undertaken SBMT vs. controls; 94% of students taking SBMT reported benefit via questionnaire, with 85% stating increased confidence in history-taking and 78% reporting improved abdominal examination. CONCLUSIONS Undergraduate simulation training at scale is feasible and positively impacts undergraduate student core task competency.
Affiliation(s)
- Cathleen A McCarrick
- Department of Surgery, School of Medicine, University College Dublin; Department of Surgery, Mater Misericordiae University Hospital
- Alice Moynihan
- Department of Surgery, School of Medicine, University College Dublin; Department of Surgery, Mater Misericordiae University Hospital
- Mohammad Faraz Khan
- Department of Surgery, School of Medicine, University College Dublin; Department of Surgery, Mater Misericordiae University Hospital
- Finbar Lennon
- Department of Surgery, School of Medicine, University College Dublin
- Maurice Stokes
- Department of Surgery, School of Medicine, University College Dublin; Department of Surgery, Mater Misericordiae University Hospital
- Suzanne Donnelly
- Department of Surgery, School of Medicine, University College Dublin; Department of Surgery, Mater Misericordiae University Hospital
- Helen Heneghan
- Department of Surgery, School of Medicine, University College Dublin
- Ronan A Cahill
- Department of Surgery, School of Medicine, University College Dublin; Department of Surgery, Mater Misericordiae University Hospital
2
Raikhel AV, Starks H, Berger G, Redinger J. Through the Looking Glass: Comparing Hospitalists' and Internal Medicine Residents' Perceptions of Feedback. Cureus 2024; 16:e63459. PMID: 39077307. PMCID: PMC11285250. DOI: 10.7759/cureus.63459.
Abstract
INTRODUCTION Feedback is critical for resident growth and is most effective when the relationship between residents and attendings is collaborative, with shared expectations for the purpose, timing, and manner of communication for feedback. Within internal medicine, there is limited work exploring resident and hospitalist perspectives on whether key elements are included in feedback sessions. METHODS We surveyed internal medicine residents and supervising hospitalists at a large urban training program about their perspectives on four components of effective feedback: specificity, timeliness, respectful communication, and actionability. RESULTS We received surveys from 130/184 internal medicine residents and 74/129 hospitalists (71% and 57% response rates, respectively). Residents and hospitalists differed in their perspectives about specificity and timeliness: 54% (70/129) of residents reported they did not receive specific feedback while 90% (65/72) of hospitalists reported they delivered specific feedback (p<0.01), and 33% (43/129) of residents compared with 82% (59/72) of hospitalists perceived feedback as timely (p<0.01). Internal medicine residents and hospitalists reported concordant rates of feedback sessions consisting of a two-way conversation (84%, 109/129; 89%, 64/72, respectively, p=0.82) and of communication delivered in a respectful manner (95%, 122/129; 97%, 70/72, respectively, p=0.57). CONCLUSIONS We observed discordance between internal medicine resident and supervising hospitalist perspectives on the inclusion of two critical components of feedback: specificity and timing. The hospitalist cohort reported delivering more components of effective feedback than the resident cohort reported receiving. The etiology of this discordance is likely multifactorial and requires further investigation.
Affiliation(s)
- Andrew V Raikhel
- Department of Hospital Medicine, VA (Veterans Affairs) Puget Sound Healthcare System, Seattle Division, Seattle, USA
- Department of General Internal Medicine, University of Washington, Seattle, USA
- Helene Starks
- Department of Bioethics and Humanities, University of Washington, Seattle, USA
- Gabrielle Berger
- Department of General Internal Medicine, University of Washington, Seattle, USA
- Jeffrey Redinger
- Department of Medicine, University of Washington School of Medicine, Seattle, USA
- Department of Hospital Medicine, VA (Veterans Affairs) Puget Sound Healthcare System, Seattle Division, Seattle, USA
3
Stefanidis D, Cook D, Kalantar-Motamedi SM, Muret-Wagstaff S, Calhoun AW, Lauridsen KG, Paige JT, Lockey A, Donoghue A, Hall AK, Patocka C, Palaganas J, Gross IT, Kessler D, Vermylen J, Lin Y, Aebersold M, Chang TP, Duff J, Kolbe M, Rutherford-Hemming T, Decker S, Collings A, Toseef Ansari M. Society for Simulation in Healthcare Guidelines for Simulation Training. Simul Healthc 2024; 19:S4-S22. PMID: 38240614. DOI: 10.1097/sih.0000000000000776.
Abstract
BACKGROUND Simulation has become a staple in the training of healthcare professionals, with accumulating evidence on its effectiveness. However, guidelines for optimal methods of simulation training do not currently exist. METHODS Systematic reviews of the literature on 16 identified key questions were conducted, and expert panel consensus recommendations were determined using the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) methodology. OBJECTIVE These evidence-based guidelines from the Society for Simulation in Healthcare intend to support healthcare professionals in decisions on the most effective methods for simulation training in healthcare. RESULTS Twenty recommendations on 16 questions were determined using GRADE. Four expert recommendations were also provided. CONCLUSIONS The first evidence-based guidelines for simulation training are provided to guide instructors and learners on the most effective use of simulation in healthcare.
Affiliation(s)
- Dimitrios Stefanidis
- From the Department of Surgery (D.S., S.-M.K.-M.), Indiana University School of Medicine, Indianapolis, IN; Department of Internal Medicine (D.C.), Mayo Clinic, Rochester, MN; Department of Surgery (S.M.-W.), Emory University, Atlanta, GA; Department of Pediatrics (A.W.C.), University of Louisville School of Medicine and Norton Children's Medical Group, Louisville, KY; Department of Medicine (K.G.L.), Randers Regional Hospital, Randers, Denmark; Research Center for Emergency Medicine (K.G.L.), Aarhus University, Aarhus, Denmark; Department of Surgery (J.T.P.), LSU Health New Orleans School of Medicine, New Orleans, LA; Emergency Department (A.L.), Calderdale and Huddersfield NHS Trust, Halifax; School of Human and Health Sciences (A.L.), University of Huddersfield, Huddersfield, UK; Critical Care Medicine and Pediatrics (A.D.), University of Pennsylvania Perelman School of Medicine, Philadelphia, PA; Department of Emergency Medicine (A.K.H.), University of Ottawa, Ottawa, Ontario, Canada; Department of Emergency Medicine (C.P.), Cumming School of Medicine, University of Calgary, Calgary, AB, Canada; Department of Health Professions Education (J.P.), School of Healthcare Leadership, MGH Institute of Health Professions, Boston, MA; Department of Pediatrics (I.T.G.), Section of Emergency Medicine, Yale University, New Haven, CT; Department of Emergency Medicine (D.K.), Columbia University Vagelos College of Physicians and Surgeons, New York, NY; Department of Medicine and Medical Education (J.V.), Feinberg School of Medicine, Northwestern University, Chicago, IL; KidSIM Simulation Research Program (Y.L.), Alberta Children's Hospital, Calgary, Canada; University of Michigan School of Nursing (M.A.), Ann Arbor, MI; Las Madrinas Simulation Center, Children's Hospital (T.C.), University of Southern California, Los Angeles, CA; Department of Pediatrics (J.D.), University of Alberta, Edmonton, Alberta, Canada; Simulation Center (M.K.), University Hospital Zurich, ETH Zurich, Switzerland; Department of Nursing (T.R.-H.), University of North Carolina, Chapel Hill, NC; Department of Nursing (S.D.), Texas Tech University Health Sciences Center, Lubbock, TX; Department of Surgery (A.C.), University of Louisville, Louisville, KY; and Independent Methodologist (M.T.A.), Ottawa, Ontario, Canada
4
Patocka C, Pandya A, Brennan E, Lacroix L, Anderson I, Ganshorn H, Hall AK. The Impact of Just-in-Time Simulation Training for Healthcare Professionals on Learning and Performance Outcomes: A Systematic Review. Simul Healthc 2024; 19:S32-S40. PMID: 38240616. DOI: 10.1097/sih.0000000000000764.
Abstract
ABSTRACT Although just-in-time training (JIT) is increasingly used in simulation-based health professions education, its impact on learning, performance, and patient outcomes remains uncertain. The aim of this study was to determine whether JIT simulation training leads to improved learning and performance outcomes. We included randomized or nonrandomized interventional studies assessing the impact of JIT simulation training (training conducted in temporal or spatial proximity to performance) on learning outcomes among health professionals (trainees or practitioners). Of 4077 citations screened, 28 studies were eligible for inclusion. JIT simulation training has been evaluated for a variety of medical, resuscitation, and surgical procedures. Most JIT simulation training occurred immediately before procedures and lasted between 5 and 30 minutes. Despite the very low certainty of evidence, this systematic review suggests JIT simulation training can improve learning and performance outcomes, in particular time to complete skills. Data remain limited on patient outcomes and collateral educational effects.
Affiliation(s)
- Catherine Patocka
- From the Department of Emergency Medicine (C.P., A.P.), University of Calgary Cumming School of Medicine, Calgary, Canada; Department of Emergency Medicine (E.B.), Queen's University, Kingston, Canada; Department of Emergency Medicine (L.L., A.K.H.), University of Ottawa, Ottawa, Canada; Department of Pediatric Emergency Medicine (I.A.), Rainbow Babies and Children's Hospital, Case Western Reserve University, Cleveland, OH; Royal College of Physicians and Surgeons of Canada (A.K.H.), Ottawa, Canada; Libraries and Cultural Resources (H.G.), University of Calgary, Calgary, Canada
5
Mok SF, Tan TMD, Seow CJ. Modified endocrinology script concordance test: evaluating the reliability and construct validity for assessing clinical reasoning. Singapore Med J 2023:384045. PMID: 37675672. DOI: 10.4103/singaporemedj.smj-2021-230.
Affiliation(s)
- Shao Feng Mok
- Department of Medicine, National University Hospital, Singapore
- Cherng Jye Seow
- Department of Endocrinology, Tan Tock Seng Hospital, Singapore
6
Bamber H. Evaluation of the Workplace-Based Assessment Anaesthesia-Clinical Evaluation Exercise (A-CEX) and Its Role in the Royal College of Anaesthetists 2021 Curriculum. Cureus 2023; 15:e37402. PMID: 37181999. PMCID: PMC10171902. DOI: 10.7759/cureus.37402.
Abstract
The workplace-based assessment (WPBA) Anaesthesia-Clinical Evaluation Exercise (A-CEX) is used in anaesthetic training under the Royal College of Anaesthetists 2021 curriculum. WPBAs are part of a multimodal approach to assessing competencies but can be limited by their granularity. They are an essential component of assessment and are used in both a formative and a summative capacity. The A-CEX is a form of WPBA that evaluates the knowledge, behaviours and skills of anaesthetists in training across a variety of 'real world' situations. An entrustment scale is assigned to the evaluation, which has implications for future practice and ongoing supervision requirements. Despite being a key component of the curriculum, the A-CEX has drawbacks. Its qualitative nature results in variation in the feedback provided amongst assessors, which may have ongoing implications for clinical practice. Furthermore, completion of an A-CEX can be viewed as a 'tick box' exercise and does not guarantee that learning has taken place. No direct evidence currently exists for the benefit of the A-CEX in anaesthetic training, but extrapolated data from other studies may show validity. The assessment nevertheless remains a key part of the 2021 curriculum. Future areas for consideration include education for those assessing trainees via the A-CEX, altering the matrix of assessment to a less granular approach, and a longitudinal study of the utility of the A-CEX in anaesthetic training.
7
Alomar AZ. Perception and Satisfaction of Undergraduate Medical Students of the Mini Clinical Evaluation Exercise Implementation in Orthopedic Outpatient Setting. Adv Med Educ Pract 2022; 13:1159-1170. PMID: 36176422. PMCID: PMC9514777. DOI: 10.2147/amep.s375693.
Abstract
PURPOSE The Mini Clinical Evaluation Exercise (mini-CEX) is a brief, direct observational assessment of trainee-patient interactions that helps to assess several clinical domains. There is limited evidence on mini-CEX implementation in orthopedics and on undergraduate perceptions of such an approach. This study investigated the perception of the mini-CEX among undergraduate medical students through a questionnaire-based survey in an orthopedic outpatient setting. PATIENTS AND METHODS Undergraduate medical students completing their orthopedic clinical posting were invited to participate in an anonymous, self-administered questionnaire written in English to evaluate their perceptions of mini-CEX implementation in the orthopedic outpatient setting for the 2016-2017 academic session. The questionnaire comprised 28 closed-ended questions with a five-point Likert rating scale and five open-ended questions. The survey responses were analyzed for reliability and validity, both quantitatively and qualitatively. RESULTS A total of 350 students completed the questionnaire, which proved valid and reliable. The closed-ended questions were designed to assess knowledge of the mini-CEX as an assessment tool. The participants demonstrated a satisfactory understanding of the mini-CEX's methodology, purpose, clarity, comprehensiveness, and role as a self-assessment tool for undergraduate medical students. Instructor support for the implementation of the mini-CEX appeared inadequate and was rated with low confidence by most students. Most participants reported better clinical skills, reflected in improvements in clinical exam preparation, the Objective Structured Clinical Examination, and clinical judgment. CONCLUSION Undergraduate medical students perceived the mini-CEX as an effective tool for clinical teaching in an outpatient orthopedic setting. However, most students indicated suboptimal instructor involvement in the teaching and assessment process; this raises concerns regarding inadequate direct observation and limited feedback on student performance. Additional measures are needed to ensure high-quality clinical encounters, teacher training, integration with other assessment tools, and standardized coverage of mini-CEX implementation in orthopedics.
Affiliation(s)
- Abdulaziz Z Alomar
- Department of Orthopaedic Surgery, College of Medicine, King Saud University, Riyadh, Kingdom of Saudi Arabia
8
Jefferies K. Factors that may improve paediatric workplace-based assessments: an exploratory study. Arch Dis Child 2022; 107:941-946. PMID: 35768176. DOI: 10.1136/archdischild-2022-323937.
Abstract
OBJECTIVES To establish if paediatric trainees are satisfied with the current workplace-based assessment (WBA) process. To identify factors that contribute both positively and negatively to the educational experience during WBAs. To find out if trainees and their supervisors experience any challenges conducting WBAs. To establish potential ways to improve future assessments. DESIGN Qualitative semistructured interviews. SETTING Participants included fifteen trainees (ST1-8) in general paediatric and subspecialty posts and four consultants or equivalent across five hospital sites in the Thames Valley Deanery. All participants had regular exposure to WBAs. INTERVENTIONS Interviews were undertaken between June 2020 and January 2021 via video link. Data collection and analysis were conducted iteratively using constant comparison until theoretical sufficiency was achieved. MAIN OUTCOME MEASURE Using Constructivist Grounded Theory, a theoretical framework, grounded in the data, was developed that depicted the core elements that should be present to optimise WBAs. RESULTS A number of key components were reported to affect the educational value of WBAs. A positive departmental culture towards education and training is essential. Chosen cases should be challenging, and direct observation or in-depth discussion, depending on the assessment type, is fundamental. Timely constructive feedback and immediate completion of the assessment form are also imperative. CONCLUSION Some trainees experienced WBAs where these key components aligned, but many did not, and this negatively affected their learning. Three main challenges or future targets for further improvements include increasing time, improving training and optimising technology.
Affiliation(s)
- Kimberley Jefferies
- Centre for Medical Education, School of Medicine, University of Dundee, Dundee, UK
9
Liang Y, Noble LM. Chinese doctors' views on workplace-based assessment: trainee and supervisor perspectives of the mini-CEX. Med Educ Online 2021; 26:1869393. PMID: 33380291. PMCID: PMC7782920. DOI: 10.1080/10872981.2020.1869393.
Abstract
Purpose: This study investigated whether the mini-clinical evaluation exercise (mini-CEX) has been successfully integrated into the Chinese context, following its introduction as part of the national general training programme. Materials and methods: Online questionnaires (N = 91) and interviews (N = 22) were conducted with Year 1 trainee doctors and clinical supervisors at a cancer hospital in China to explore users' experiences, attitudes and opinions of the mini-CEX. Results: Trainees were more likely than supervisors to report understanding the purpose of the mini-CEX and to agree that it encouraged reflection and helped improve overall performance. Both trainees and supervisors felt that it provided a framework for learning, that it was useful in identifying underperformance, and that it informed learning progression. Groups were equally positive about the commitment of their counterpart in the process and valued the focus on detailed feedback. It was perceived as cultivating the learner-teacher relationship. Overall, both groups felt they 'bought in' to using the mini-CEX. However, concerns were raised about the subjectivity of ratings and the lack of benchmarking against expected standards of care. Conclusions: Chinese trainees and supervisors generally perceived the mini-CEX as an acceptable and valuable medical training tool, although both groups suggested enhancements to improve its efficacy.
Affiliation(s)
- Yuying Liang
- Department of Medical Education, Affiliated Cancer Hospital and Institute of Guangzhou Medical University, Guangzhou, China
- UCL Medical School, University College London, London, UK
10
Bashir K, Arshad W, Azad AM, Alfalahi S, Kodumayil A, Elmoheen A. Acceptability and Feasibility of Mini Clinical Evaluation Exercise (Mini-CEX) in the Busy Emergency Department. Open Access Emerg Med 2021; 13:481-486. PMID: 34803409. PMCID: PMC8594889. DOI: 10.2147/oaem.s321161.
Abstract
Background The Mini Clinical Evaluation Exercise (Mini-CEX) has been adapted to different specialties in clinical practice, but there is very little documented evidence about its use for residency training in the emergency department (ED). This study aims to assess its acceptability and feasibility as a formative tool in the busy emergency department. Materials and Methods Both the faculty members and the emergency medicine residents were sent a validated questionnaire using Google Forms, and the results were analyzed using simple statistical tools. Results Forty-nine residents and 58 faculty participated in the survey. The study was carried out over a period of 4 months. The residents' completion rate was 96% (49 out of 51), while the faculty completion rate was 96% (58 out of 60). The time for Mini-CEX completion ranged from 10 to 20 minutes. Most of the residents were satisfied with the Mini-CEX as an assessment tool. Twelve residents expressed concern about the time available during busy clinical shifts. Most of the faculty agreed with the benefits of using the Mini-CEX as a formative assessment tool. Several of them commented that they need "protected time" and "more training" to use this tool to provide maximum benefit to the residents. Conclusion Despite the busy nature of the ED, the Mini-CEX has been identified as an acceptable learning tool for residents in emergency medicine. Based on the faculty's feedback and comments, several faculty development workshops were conducted to improve faculty skills in carrying out assessments using the Mini-CEX, and protected time is provided to some faculty members to carry out these formative assessments for the benefit of the residents.
11
Martinsen SSS, Espeland T, Berg EAR, Samstad E, Lillebo B, Slørdahl TS. Examining the educational impact of the mini-CEX: a randomised controlled study. BMC Med Educ 2021; 21:228. PMID: 33882913. PMCID: PMC8061047. DOI: 10.1186/s12909-021-02670-3.
Abstract
BACKGROUND The purpose of this study is to evaluate the mini-Clinical Evaluation Exercise (mini-CEX) as a formative assessment tool among undergraduate medical students, in terms of student perceptions, effects on direct observation and feedback, and educational impact. METHODS Cluster randomised study of 38 fifth-year medical students during a 16-week clinical placement. Hospitals were randomised to provide a minimum of 8 mini-CEXs per student (intervention arm) or to continue with ad-hoc feedback (control arm). After finishing their clinical placement, students completed an Objective Structured Clinical Examination (OSCE), a written test and a survey. RESULTS All participants in the intervention group completed the pre-planned number of assessments, and 60% found them to be useful during their clinical placement. Overall, there were no statistically significant differences between groups in the reported quantity or quality of direct observation and feedback. Observed mean scores were marginally, but not statistically significantly, higher on the OSCE and written test in the intervention group. CONCLUSIONS There is considerable potential in assessing medical students during clinical placements and routine practice, but the educational impact of formative assessments remains mostly unknown. This study contributes a robust study design and may serve as a basis for future research.
Affiliation(s)
- Torvald Espeland
- Department of Circulation and Medical Imaging, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
- Clinic of Cardiology, St. Olavs Hospital, Trondheim University Hospital, Trondheim, Norway
- Erik Andreas Rye Berg
- Department of Circulation and Medical Imaging, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
- Clinic of Thoracic and Occupational Medicine, St. Olavs Hospital, Trondheim University Hospital, Trondheim, Norway
- Eivind Samstad
- Department of Clinical and Molecular Medicine, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
- Clinic of Medicine and Rehabilitation, Ålesund Hospital, Møre og Romsdal Hospital Trust, Ålesund, Norway
- Børge Lillebo
- Department of Circulation and Medical Imaging, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
- Clinic of Medicine and Rehabilitation, Levanger Hospital, Nord-Trøndelag Hospital Trust, Levanger, Norway
- Tobias S Slørdahl
- Department of Clinical and Molecular Medicine, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
- Department of Haematology, St. Olavs Hospital, Trondheim University Hospital, Trondheim, Norway
12
Bock A, Peters F, Elvers D, Wittenborn J, Kniha K, Gerressen M, Hölzle F, Modabber A. Introduction of mini-clinical evaluation exercise in teaching dental radiology: a pilot study. Eur J Dent Educ 2020; 24:695-705. PMID: 32558047. DOI: 10.1111/eje.12558.
Abstract
INTRODUCTION Workplace-based assessments are methods that can be applied for assessing competence and performance. One of these methods is the mini-clinical evaluation exercise (mini-CEX). This study was conducted to determine the role of the mini-CEX in assessing students' performance on panoramic X-ray reporting in a dental radiology course. MATERIALS AND METHODS A workshop to train the assessors and the participants was conducted before the primary test. All participants (n = 36) were randomly allocated into six groups. Each group had three seminars in which every student reported a panoramic X-ray. Students were directly observed and rated by an assessor on a modified mini-CEX rating form. Then, a self-assessment by the students and a systematic feedback session were performed. Finally, the students and the assessors were evaluated for acceptability of, and satisfaction with, this tool. RESULTS The mean duration of the assessment and the feedback decreased significantly from the first seminar to the third seminar (P < .0001). Comparison of the mini-CEX results across all three assessments showed that students displayed significantly better performance in evaluating the upper jaw and the soft tissue (P < .05). There was no significant improvement for the other aspects of the rating form. Overall, both students and assessors reported a high level of satisfaction in using the mini-CEX rating form. CONCLUSION Due to the objectivity and transparency of the assessment, the mini-CEX helped to improve performance on reporting panoramic X-rays. In addition, the structured feedback had a major impact on the improvement. Therefore, the mini-CEX may be an effective method for performing workplace-based assessments to evaluate students' performance on reporting panoramic X-rays.
Affiliation(s)
- Anna Bock
- Department of Oral and Maxillofacial Surgery, University Hospital of Aachen University, Aachen, Germany
- Florian Peters
- Department of Oral and Maxillofacial Surgery, University Hospital of Aachen University, Aachen, Germany
- Dirk Elvers
- Department of Oral and Maxillofacial Surgery, University Hospital of Aachen University, Aachen, Germany
- Julian Wittenborn
- Department of Oral and Maxillofacial Surgery, University Hospital of Aachen University, Aachen, Germany
- Kristian Kniha
- Department of Oral and Maxillofacial Surgery, University Hospital of Aachen University, Aachen, Germany
- Marcus Gerressen
- Department of Oral, Maxillofacial and Plastic Facial Surgery, Heinrich Braun Hospital, Zwickau, Germany
- Frank Hölzle
- Department of Oral and Maxillofacial Surgery, University Hospital of Aachen University, Aachen, Germany
- Ali Modabber
- Department of Oral and Maxillofacial Surgery, University Hospital of Aachen University, Aachen, Germany
13
Quality of dictated feedback associated with SIMPL operative assessments of pediatric surgical trainees. Am J Surg 2020; 221:303-308. [PMID: 33051067 DOI: 10.1016/j.amjsurg.2020.10.014] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/14/2020] [Revised: 08/23/2020] [Accepted: 10/04/2020] [Indexed: 11/22/2022]
Abstract
BACKGROUND SIMPL is a workplace-based operative performance assessment tool which allows for dictated feedback (DF). To better understand the value of DF, we sought to characterize the type and quality of DF generated during SIMPL evaluations. METHODS Thematic analysis of DF from SIMPL assessments between June 2017 and December 2018 at a single pediatric surgery fellowship program was performed. Comments were categorized as specific, encouraging or corrective. Categories were combined to determine DF quality as effective, mediocre or ineffective. RESULTS Of 781 SIMPL assessments (21 faculty, 5 trainees), 451 (57%) had DF. Most comments were encouraging (93%) and specific (65%). Only 21% were corrective, 17% had entrustment features, and 8% had an explicit learning plan. Feedback quality was deemed mediocre (45%), ineffective (33%) and effective (21%). CONCLUSION SIMPL dictated feedback was mostly encouraging and specific. To improve quality, feedback should incorporate learning plans as well as corrective and entrustment features.
14
Wu Y, Gong M, Zhang D, Zhang C. Educational impact of the mini-Clinical Evaluation Exercise in resident standardization training: a comparative study between resident and professional degree postgraduate trainees. J Int Med Res 2020; 48:300060520920052. [PMID: 32459121 PMCID: PMC7278105 DOI: 10.1177/0300060520920052] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/21/2022] Open
Abstract
Objective We aimed to explore differences in the educational impact of the mini-Clinical Evaluation Exercise (mini-CEX) on resident (RE) and professional degree postgraduate (PDPG) trainees, as well as the influencing factors, to provide suggestions for hospital managers, trainers, and trainees. Methods We performed a retrospective analysis of all scores among first-year resident standardization training trainees registered from 2017 to 2019 at Xinqiao Hospital of Army Medical University to identify differences in mini-CEX outcomes between REs and PDPGs. Results We collected data from 154 registered trainees for retrospective analysis, including 57 PDPG trainees and 97 RE trainees. The mean (standard deviation) overall performance score of PDPGs was 84.18 (4.25), which was higher than that of REs (81.48 (3.35)). In the domain analysis, PDPG trainees performed significantly better than REs in history taking, physical examination, clinical diagnosis/treatment regimen, and the knowledge examination; communication skills/humanistic care were comparable between the groups. Conclusions PDPGs performed better than REs in overall competency, history taking, physical examination, clinical diagnosis/treatment regimen, and the knowledge examination. A better knowledge base, a supervisor-dominated one-to-one teaching mode, higher self-esteem and learning goals, and more sophisticated responses to feedback were potential contributors to the superior educational impact of the mini-CEX in this group.
Affiliation(s)
- Yali Wu, Department of Medical Service, Xinqiao Hospital, Army Medical University, Chongqing, China
- Mingfu Gong, Department of Radiology, Xinqiao Hospital, Army Medical University, Chongqing, China
- Dong Zhang, Department of Radiology, Xinqiao Hospital, Army Medical University, Chongqing, China
- Chun Zhang, Department of Medical Service, Xinqiao Hospital, Army Medical University, Chongqing, China
15
Simulated patient-based teaching of medical students improves pre-anaesthetic assessment. Eur J Anaesthesiol 2020; 37:387-393. [DOI: 10.1097/eja.0000000000001139] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/26/2022]
16
Buckley C, Natesan S, Breslin A, Gottlieb M. Finessing Feedback: Recommendations for Effective Feedback in the Emergency Department. Ann Emerg Med 2020; 75:445-451. [DOI: 10.1016/j.annemergmed.2019.05.016] [Citation(s) in RCA: 13] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/30/2018] [Indexed: 01/11/2023]
17
Mortaz Hejri S, Jalili M, Masoomi R, Shirazi M, Nedjat S, Norcini J. The utility of mini-Clinical Evaluation Exercise in undergraduate and postgraduate medical education: A BEME review: BEME Guide No. 59. MEDICAL TEACHER 2020; 42:125-142. [PMID: 31524016 DOI: 10.1080/0142159x.2019.1652732] [Citation(s) in RCA: 12] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/10/2023]
Abstract
Background: This BEME review aims to explore, analyze, and synthesize the evidence on the utility of the mini-CEX for assessing undergraduate and postgraduate medical trainees, specifically as it relates to reliability, validity, educational impact, acceptability, and cost. Methods: This registered BEME review applied a systematic search strategy in seven databases to identify studies on the validity, reliability, educational impact, acceptability, or cost of the mini-CEX. Data extraction and quality assessment were carried out by two authors; discrepancies were resolved by a third reviewer. Descriptive synthesis was mainly used to address the review questions, and a meta-analysis was performed for Cronbach's alpha. Results: Fifty-eight papers were included. Only two studies evaluated all five utility criteria. Forty-seven (81%) of the included studies met seven or more of the quality criteria. Cronbach's alpha ranged from 0.58 to 0.97 (weighted mean = 0.90). Reported G coefficients, standard errors of measurement, and confidence intervals were diverse and varied with the number of encounters and the nested or crossed design of the study. The calculated number of encounters needed for a desirable G coefficient also varied greatly. Content coverage was reported as satisfactory in several studies. The mini-CEX discriminated between various levels of competency. Factor analyses revealed a single dimension. The six competencies showed high, statistically significant correlations with overall competence. Moderate to high correlations between mini-CEX scores and other clinical exams were reported, and the mini-CEX improved students' performance in other examinations. By providing a framework for structured observation and feedback, the mini-CEX exerts a favorable educational impact. Included studies revealed that feedback was provided in most encounters, but its quality was questionable. Completion rates were generally above 50%, and feasibility and high satisfaction were reported. Conclusion: The mini-CEX has reasonable validity, reliability, and educational impact. Acceptability and feasibility should be interpreted in light of the required number of encounters.
Affiliation(s)
- Sara Mortaz Hejri, Department of Medical Education, School of Medicine, Tehran University of Medical Sciences, Tehran, Iran
- Mohammad Jalili, Department of Medical Education and Department of Emergency Medicine, School of Medicine, Tehran University of Medical Sciences, Tehran, Iran
- Rasoul Masoomi, Department of Medical Education, School of Medicine, Tehran University of Medical Sciences, Tehran, Iran
- Mandana Shirazi, Department of Medical Education, School of Medicine, Tehran University of Medical Sciences, Tehran, Iran; Department of Clinical Science and Education at SOS Hospital, Karolinska Institutet, Stockholm, Sweden
- Saharnaz Nedjat, Department of Epidemiology and Biostatistics, School of Public Health, Tehran University of Medical Sciences, Tehran, Iran
- John Norcini, Foundation for Advancement of International Medical Education and Research (FAIMER), Philadelphia, PA, USA
18
Joshi MK, Singh T, Badyal DK. Acceptability and feasibility of mini-clinical evaluation exercise as a formative assessment tool for workplace-based assessment for surgical postgraduate students. J Postgrad Med 2019; 63:100-105. [PMID: 28272063 PMCID: PMC5414419 DOI: 10.4103/0022-3859.201411] [Citation(s) in RCA: 18] [Impact Index Per Article: 3.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022] Open
Abstract
Background: Despite an increasing emphasis on workplace-based assessment (WPBA) during medical training, the existing assessment system relies largely on summative assessment, while formative assessment is less valued. Various tools have been described for WPBA, the mini-clinical evaluation exercise (mini-CEX) being one of them. The mini-CEX is well accepted in Western countries; however, reports of its use in India are scarce. We conducted this study to assess the acceptability and feasibility of the mini-CEX as a formative assessment tool for WPBA of surgical postgraduate students in an Indian setting. Methods: Faculty members and 2nd-year surgical residents were sensitized to the mini-CEX, and the requisite number of exercises was conducted. Difficulties in conducting these exercises were identified and recorded, and appropriate measures were taken to address them. At the conclusion, the opinions of residents and faculty members regarding their experience with the mini-CEX were collected using a questionnaire, and the results were analyzed using simple statistical tools. Results: Nine of the 11 faculty members approached (81.8%) and all 16 2nd-year postgraduate surgical residents (100%) participated. Sixty mini-CEX encounters were conducted over 7 months, with each resident undergoing 3–5 encounters. The mean observation time was 12.3 min (range 8–30 min) and the mean feedback time was 4.2 min (range 3–10 min). The faculty reported good overall satisfaction with the mini-CEX and found it acceptable as a formative assessment tool. Three faculty members (33.3%) found the mini-CEX time-consuming, and 2 (22.2%) found it difficult to carry out the exercises frequently. All residents accepted the mini-CEX, and most reported good to high satisfaction with the exercises. Conclusions: The mini-CEX is well accepted by residents and faculty as a formative assessment tool, and it is feasible to use it for WPBA of postgraduate students of surgery.
Affiliation(s)
- M K Joshi, Department of Surgical Disciplines, All India Institute of Medical Sciences, New Delhi, India
- T Singh, Department of Paediatrics, CMCL-FAIMER Regional Institute, Medical Council of India Nodal Centre for Faculty Development, Christian Medical College and Hospital, Ludhiana, Punjab, India
- D K Badyal, Department of Pharmacology, CMCL-FAIMER Regional Institute, Medical Council of India Nodal Centre for Faculty Development, Christian Medical College and Hospital, Ludhiana, Punjab, India
19
Lörwald AC, Lahner FM, Mooser B, Perrig M, Widmer MK, Greif R, Huwendiek S. Influences on the implementation of Mini-CEX and DOPS for postgraduate medical trainees' learning: A grounded theory study. MEDICAL TEACHER 2019; 41:448-456. [PMID: 30369283 DOI: 10.1080/0142159x.2018.1497784] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/08/2023]
Abstract
Introduction: For the Mini-Clinical Evaluation Exercise (Mini-CEX) and Direct Observation of Procedural Skills (DOPS) to have a positive effect on trainees' learning, the way in which the tools are implemented is of key importance; however, many factors influence their implementation. In this study, we aimed to develop a comprehensive model of such factors. Methods: Using a constructivist grounded theory approach, we performed eight focus groups. Participants were postgraduate trainees and supervisors from three different specialties; all were experienced with Mini-CEX and/or DOPS. Data were analyzed for recurring themes, underlying concepts, and their interactions using constant comparison. Results: We developed a model demonstrating how the implementation of Mini-CEX and DOPS for trainees' learning is influenced by 13 interacting factors relating to four categories: organizational culture (e.g. value of teaching and feedback), work structure (e.g. time for Mini-CEX and DOPS, faculty development), instruments (e.g. content of assessment), and users (e.g. relationship between trainees and supervisors). Conclusions: Consideration of this model of influencing factors might support successful implementation and trainees' learning with Mini-CEX and DOPS.
Affiliation(s)
- Andrea C Lörwald, Department of Assessment and Evaluation, Institute of Medical Education, Bern, Switzerland
- Felicitas-Maria Lahner, Department of Assessment and Evaluation, Institute of Medical Education, Bern, Switzerland
- Bettina Mooser, Department of Assessment and Evaluation, Institute of Medical Education, Bern, Switzerland
- Martin Perrig, Department of General Internal Medicine, Bern University Hospital, University of Bern, Bern, Switzerland
- Matthias K Widmer, Department of Cardiovascular Surgery, Bern University Hospital, University of Bern, Bern, Switzerland
- Robert Greif, Department of Anaesthesiology and Pain Therapy, Bern University Hospital, University of Bern, Bern, Switzerland
- Sören Huwendiek, Department of Assessment and Evaluation, Institute of Medical Education, Bern, Switzerland
20
LeBeau L, Morgan C, Heath D, Pazdernik VK. Assessing Competency in Family Medicine Residents Using the Osteopathic Manipulative Medicine Mini-Clinical Evaluation Exercise. J Osteopath Med 2019; 119:2721358. [PMID: 30644522 DOI: 10.7556/jaoa.2019.013] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/24/2022]
Abstract
CONTEXT The Mini-Clinical Evaluation Exercise (Mini-CEX) is one example of a direct observation tool used for workplace-based skills assessment. The Mini-CEX has been validated as a useful formative evaluation tool in graduate medical education. However, no Mini-CEX has been reported in the literature that specifically assesses the osteopathic manipulative medicine (OMM) skills of family medicine residents. Therefore, the authors created and studied an OMM Mini-CEX to fill this skills assessment gap. OBJECTIVE To determine whether the OMM Mini-CEX is perceived as an effective evaluation tool for assessing the OMM core competencies of family medicine residents. METHODS Faculty and residents of The Wright Center for Graduate Medical Education National Family Medicine Residency program participated in the study. Each resident was evaluated at least once using the OMM Mini-CEX. Surveys were used to assess faculty and resident perceptions of the usefulness and effectiveness of the OMM Mini-CEX for assessing OMM competencies. RESULTS Eighty-one responses were received during 2 survey cycles within a 7-month period. The internal consistency of the survey responses had a high reliability (Cronbach α=0.93). Considering respondents who agreed that they had a clear understanding of the general purpose of a Mini-CEX, the perceived effectiveness score for the OMM Mini-CEX was higher among those who agreed that a Mini-CEX was a useful part of training than among those who disagreed or were unsure of its usefulness (median score, 4.0 vs 3.4, respectively; P=.047). CONCLUSIONS The results suggest the OMM Mini-CEX can be a useful direct observation evaluation tool to assess OMM core competencies in family medicine residents. Additional research is needed to determine its perceived effectiveness in other clinical specialties and in undergraduate medical education.
21
Green LMC, Friend AJ, Bardgett RJM, Darling JC. Including children and young people in assessments: a practical guide. Arch Dis Child Educ Pract Ed 2018; 103:267-273. [PMID: 29150423 DOI: 10.1136/archdischild-2017-313368] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 05/12/2017] [Revised: 10/02/2017] [Accepted: 10/11/2017] [Indexed: 11/03/2022]
Abstract
The ability to interact with children and young people (CYP), to examine them appropriately, and to interpret signs competently is an essential skill for many medical practitioners and allied healthcare professionals; yet how do we ensure competence in our students and trainees? One method is to include CYP in both formative and summative assessments; this provides an invaluable opportunity for examiners not only to evaluate the clinical interaction but also to gain an understanding of the CYP experience and of the characteristics they value in a 'good children's doctor'. This paper explores the benefits and challenges of involving CYP in assessments and provides practical advice for course organisers, assessors and students when encountering CYP in assessment.
Affiliation(s)
- Lydia M C Green, Division of Women's and Children's Health, Leeds Institute of Medical Education, School of Medicine, University of Leeds, Leeds, UK
- Amanda J Friend, Division of Women's and Children's Health, Leeds Institute of Medical Education, School of Medicine, University of Leeds, Leeds, UK
- Rebecca J M Bardgett, Children's and Adolescent Services, Bradford Teaching Hospitals NHS Foundation Trust, Bradford, UK
- Jonathan C Darling, Division of Women's and Children's Health, Leeds Institute of Medical Education, School of Medicine, University of Leeds, Leeds, UK
22
Lörwald AC, Lahner FM, Nouns ZM, Berendonk C, Norcini J, Greif R, Huwendiek S. The educational impact of Mini-Clinical Evaluation Exercise (Mini-CEX) and Direct Observation of Procedural Skills (DOPS) and its association with implementation: A systematic review and meta-analysis. PLoS One 2018; 13:e0198009. [PMID: 29864130 PMCID: PMC5986126 DOI: 10.1371/journal.pone.0198009] [Citation(s) in RCA: 49] [Impact Index Per Article: 8.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/28/2017] [Accepted: 05/11/2018] [Indexed: 11/19/2022] Open
Abstract
Introduction Mini-Clinical Evaluation Exercise (Mini-CEX) and Direct Observation of Procedural Skills (DOPS) are used as formative assessments worldwide. Since an up-to-date comprehensive synthesis of the educational impact of Mini-CEX and DOPS is lacking, we performed a systematic review. Moreover, as the educational impact might be influenced by characteristics of the setting in which Mini-CEX and DOPS take place or their implementation status, we additionally investigated these potential influences. Methods We searched Scopus, Web of Science, and Ovid, including All Ovid Journals, Embase, ERIC, Ovid MEDLINE(R), and PsycINFO, for original research articles investigating the educational impact of Mini-CEX and DOPS on undergraduate and postgraduate trainees from all health professions, published in English or German from 1995 to 2016. Educational impact was operationalized and classified using Barr's adaptation of Kirkpatrick's four-level model. Where applicable, outcomes were pooled in meta-analyses, separately for Mini-CEX and DOPS. To examine potential influences, we used Fisher's exact test for count data. Results We identified 26 articles demonstrating heterogeneous effects of Mini-CEX and DOPS on learners' reactions (Kirkpatrick Level 1) and positive effects of Mini-CEX and DOPS on trainees' performance (Kirkpatrick Level 2b; Mini-CEX: standardized mean difference (SMD) = 0.26, p = 0.014; DOPS: SMD = 3.33, p < 0.001). No studies were found on higher Kirkpatrick levels. Regarding potential influences, we found two implementation characteristics, "quality" and "participant responsiveness", to be associated with the educational impact. Conclusions Despite the limited evidence, the meta-analyses demonstrated positive effects of Mini-CEX and DOPS on trainee performance. Additionally, we found implementation characteristics to be associated with the educational impact. Hence, we assume that considering implementation characteristics could increase the educational impact of Mini-CEX and DOPS.
Affiliation(s)
- Andrea C. Lörwald, Department of Assessment and Evaluation, Institute of Medical Education, University of Bern, Bern, Switzerland
- Felicitas-Maria Lahner, Department of Assessment and Evaluation, Institute of Medical Education, University of Bern, Bern, Switzerland
- Zineb M. Nouns, Department of Assessment and Evaluation, Institute of Medical Education, University of Bern, Bern, Switzerland
- Christoph Berendonk, Department of Assessment and Evaluation, Institute of Medical Education, University of Bern, Bern, Switzerland
- John Norcini, FAIMER, Philadelphia, Pennsylvania, United States of America
- Robert Greif, Department of Anaesthesiology and Pain Therapy, Bern University Hospital, University of Bern, Bern, Switzerland
- Sören Huwendiek, Department of Assessment and Evaluation, Institute of Medical Education, University of Bern, Bern, Switzerland
23
Lörwald AC, Lahner FM, Greif R, Berendonk C, Norcini J, Huwendiek S. Factors influencing the educational impact of Mini-CEX and DOPS: A qualitative synthesis. MEDICAL TEACHER 2018; 40:414-420. [PMID: 29188739 DOI: 10.1080/0142159x.2017.1408901] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/22/2023]
Abstract
INTRODUCTION The educational impact of Mini-CEX and DOPS varies greatly and can be influenced by several factors; however, no comprehensive analysis and synthesis of these influencing factors exists. METHODS To fill this gap, we chose a two-step approach. First, we performed a systematic literature review and selected articles describing factors influencing the educational impact of Mini-CEX and DOPS. Second, we performed a qualitative synthesis of these factors. RESULTS Twelve articles were included, from which a model consisting of four themes and nine subthemes of influencing factors emerged. The theme context comprises "time for Mini-CEX/DOPS" and "usability of the tools", and influences the users. The theme users comprises "supervisors' knowledge about how to use Mini-CEX/DOPS", "supervisors' attitude to Mini-CEX/DOPS", "trainees' knowledge about Mini-CEX/DOPS", and "trainees' perception of Mini-CEX/DOPS". These influence the implementation of Mini-CEX and DOPS, including "observation" and "feedback". The theme implementation directly influences the theme outcome, which, in addition to the educational impact, encompasses "trainees' appraisal of feedback". CONCLUSIONS Our model of influencing factors might help to further improve the use of Mini-CEX and DOPS and serve as a basis for future research.
Affiliation(s)
- Andrea C Lörwald, Department of Assessment and Evaluation, Institute of Medical Education, University of Bern, Bern, Switzerland
- Felicitas-Maria Lahner, Department of Assessment and Evaluation, Institute of Medical Education, University of Bern, Bern, Switzerland
- Robert Greif, Department of Anaesthesiology and Pain Medicine, Bern University Hospital, Bern, Switzerland
- Christoph Berendonk, Department of Assessment and Evaluation, Institute of Medical Education, University of Bern, Bern, Switzerland
- John Norcini, Foundation for Advancement of International Medical Education and Research (FAIMER), Philadelphia, PA, USA
- Sören Huwendiek, Department of Assessment and Evaluation, Institute of Medical Education, University of Bern, Bern, Switzerland
24
Alkureishi MA, Lee WW, Lyons M, Wroblewski K, Farnan JM, Arora VM. Electronic-clinical evaluation exercise (e-CEX): A new patient-centered EHR use tool. PATIENT EDUCATION AND COUNSELING 2018; 101:481-489. [PMID: 29042145 DOI: 10.1016/j.pec.2017.10.005] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 03/29/2017] [Revised: 10/04/2017] [Accepted: 10/07/2017] [Indexed: 05/26/2023]
Abstract
INTRODUCTION Electronic Health Record (EHR) use can enhance or weaken patient-provider communication. Despite widespread EHR adoption, no validated tool exists to assess EHR communication skills. We aimed to develop and validate such a tool. METHODS The Electronic-Clinical Evaluation Exercise (e-CEX) is a 10-item tool based on a systematic literature review and pilot-testing. Second-year students (MS2s) participated in an EHR-use lecture and an Objective Structured Clinical Examination (OSCE); untrained third-year students (MS3s) participated in the same OSCE. OSCEs were scored with the e-CEX and compared against a standardized patient (SP) tool. Internal consistency, discriminant validity, and concurrent validity were analyzed. RESULTS Three investigators used the e-CEX to rate 70 videos (20 MS2, 50 MS3). Reliability testing indicated high internal consistency (Cronbach's alpha=0.89). MS2s scored significantly higher than untrained MS3s on the e-CEX [55 (10.7) vs. 44.9 (12.7), P=0.003], providing evidence of discriminant validity. The correlation between e-CEX and SP scores was high (Pearson correlation=0.74, P<0.001), providing concurrent validity evidence. Item reduction suggested a three-item tool had similar explanatory power (R-squared=0.85 vs 0.86). CONCLUSION The e-CEX is a reliable, valid tool for assessing medical students' patient-centered EHR communication skills. PRACTICE IMPLICATIONS While validation with other healthcare providers is needed, the e-CEX may help improve provider behaviors and enhance patients' overall experience of EHR use in their care.
Affiliation(s)
- Wei Wei Lee, Department of Medicine, University of Chicago, Chicago, United States
- Maureen Lyons, Division of General Internal Medicine, Saint Louis University School of Medicine, St. Louis, United States
- Kristen Wroblewski, Department of Public Health Sciences, University of Chicago, Chicago, United States
- Jeanne M Farnan, Department of Medicine, University of Chicago, Chicago, United States
- Vineet M Arora, Department of Medicine, University of Chicago, Chicago, United States
25
Kogan JR, Hatala R, Hauer KE, Holmboe E. Guidelines: The do's, don'ts and don't knows of direct observation of clinical skills in medical education. PERSPECTIVES ON MEDICAL EDUCATION 2017; 6:286-305. [PMID: 28956293 PMCID: PMC5630537 DOI: 10.1007/s40037-017-0376-7] [Citation(s) in RCA: 86] [Impact Index Per Article: 12.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/12/2023]
Abstract
INTRODUCTION Direct observation of clinical skills is a key assessment strategy in competency-based medical education. The guidelines presented in this paper synthesize the literature on direct observation of clinical skills. The goal is to provide a practical list of Do's, Don'ts and Don't Knows about direct observation for supervisors who teach learners in the clinical setting and for educational leaders who are responsible for clinical training programs. METHODS We built consensus through an iterative approach in which each author, based on their medical education and research knowledge and expertise, independently developed a list of Do's, Don'ts, and Don't Knows about direct observation of clinical skills. Lists were compiled, discussed and revised. We then sought and compiled evidence to support each guideline and determine the strength of each guideline. RESULTS A final set of 33 Do's, Don'ts and Don't Knows is presented along with a summary of evidence for each guideline. Guidelines focus on two groups: individual supervisors and the educational leaders responsible for clinical training programs. Guidelines address recommendations for how to focus direct observation, select an assessment tool, promote high quality assessments, conduct rater training, and create a learning culture conducive to direct observation. CONCLUSIONS High frequency, high quality direct observation of clinical skills can be challenging. These guidelines offer important evidence-based Do's and Don'ts that can help improve the frequency and quality of direct observation. Improving direct observation requires focus not just on individual supervisors and their learners, but also on the organizations and cultures in which they work and train. Additional research to address the Don't Knows can help educators realize the full potential of direct observation in competency-based education.
Affiliation(s)
- Jennifer R Kogan, Perelman School of Medicine at the University of Pennsylvania, Philadelphia, PA, USA
- Rose Hatala, University of British Columbia, Vancouver, British Columbia, Canada
- Karen E Hauer, University of California San Francisco, San Francisco, CA, USA
- Eric Holmboe, Accreditation Council for Graduate Medical Education, Chicago, IL, USA
26
Mortaz Hejri S, Jalili M, Shirazi M, Masoomi R, Nedjat S, Norcini J. The utility of mini-Clinical Evaluation Exercise (mini-CEX) in undergraduate and postgraduate medical education: protocol for a systematic review. Syst Rev 2017; 6:146. [PMID: 28720128 PMCID: PMC5516345 DOI: 10.1186/s13643-017-0539-y] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 02/06/2017] [Accepted: 07/03/2017] [Indexed: 11/12/2022] Open
Abstract
BACKGROUND One of the most frequently used assessment tools for measuring trainees' performance in the workplace is the mini-Clinical Evaluation Exercise (mini-CEX), in which an expert observes and rates the actual performance of trainees. Several primary studies have evaluated the effectiveness of the mini-CEX by assessing its educational and psychometric properties. The objective of this BEME review is to explore, analyze, and synthesize the evidence on the utility of the mini-CEX for assessing undergraduate and postgraduate medical trainees. METHODS Studies that report on the mini-CEX in undergraduate and postgraduate medical education and provide empirical data on one or more of its validity, reliability, educational impact, acceptability, or cost will be included in the review. No restrictions on study design, publication date, or language will be applied. To ensure the comprehensiveness of our search, we will use several approaches: in addition to an electronic search of bibliographic databases, we will conduct forward and backward citation searching, contact leading authors in the field of the mini-CEX, and search the gray literature. Data extraction will be done independently by two coders using a standard form; any discordance will be resolved by a third author. Quality assessment will also be done independently by two team members, based on critical appraisal checklists. In answering our original research questions, we will use meta-analysis or meta-synthesis. DISCUSSION The findings of this study can be transferred to medical education stakeholders such as administrators of medical schools, residency program directors, and faculty members. We also hope that publication of this review will encourage stakeholders who have already adopted the mini-CEX to evaluate and report its different characteristics. Lastly, we expect to identify gaps in knowledge in this field and suggest areas for future research.
Affiliation(s)
- Sara Mortaz Hejri
- Department of Medical Education, Health Professions Education Research Center, Tehran University of Medical Sciences, Tehran, Iran
- Mohammad Jalili
- Department of Medical Education, Department of Emergency Medicine, Tehran University of Medical Sciences, 57, Hojjatdoust St, Keshavarz Blvd, Tehran, Iran
- Mandana Shirazi
- Education Development Center, Department of Medical Education, Tehran University of Medical Sciences, Tehran, Iran; Department of Clinical Science and Education, Karolinska Institutet, Stockholm, Sweden
- Rasoul Masoomi
- Department of Medical Education, Tehran University of Medical Sciences, Tehran, Iran
- Saharnaz Nedjat
- Department of Epidemiology and Biostatistics, School of Public Health, Tehran University of Medical Sciences, Tehran, Iran
- John Norcini
- Foundation for Advancement of International Medical Education and Research (FAIMER), Philadelphia, PA, USA

27
Walsh E, Foley T, Sinnott C, Boyle S, Smithson WH. Developing and piloting a resource for training assessors in use of the Mini-CEX (mini clinical evaluation exercise). Education for Primary Care 2017; 28:243-245. [PMID: 28110625] [DOI: 10.1080/14739879.2017.1280694]
Affiliation(s)
- Elaine Walsh
- Department of General Practice, University College Cork, Cork, Ireland
- Tony Foley
- Department of General Practice, University College Cork, Cork, Ireland
- Carol Sinnott
- Department of General Practice, University College Cork, Cork, Ireland
- Siobhan Boyle
- Department of General Practice, University College Cork, Cork, Ireland
- W Henry Smithson
- Department of General Practice, University College Cork, Cork, Ireland

28
Weller JM, Castanelli DJ, Chen Y, Jolly B. Making robust assessments of specialist trainees' workplace performance. Br J Anaesth 2017; 118:207-214. [PMID: 28100524] [DOI: 10.1093/bja/aew412]
Abstract
BACKGROUND Workplace-based assessments should provide a reliable measure of trainee performance but have met with mixed success. We proposed that using an entrustability scale, where supervisors score trainees on the level of supervision required for the case, would improve the utility of compulsory mini-clinical evaluation exercise (mini-CEX) assessments in a large anaesthesia training program. METHODS We analysed mini-CEX scores from all Australian and New Zealand College of Anaesthetists trainees submitted to an online database over a 12-month period. Supervisors' scores were adjusted for the expected supervision requirement for the case for trainees at different stages of training. We used generalisability theory to determine score reliability. RESULTS A total of 7808 assessments were available for analysis. Supervision requirements decreased significantly (P < 0.05) with increased duration and level of training, supporting validity. We found moderate reliability (G > 0.7) with a feasible number of assessments. Adjusting scores against the expected supervision requirement considerably improved reliability, with G > 0.8 achieved with only nine assessments. Three per cent of trainees generated average mini-CEX scores below the expected standard. CONCLUSIONS Using an entrustment scoring system, where supervisors score trainees on the level of supervision required, mini-CEX scores demonstrated moderate reliability within a feasible number of assessments, and evidence of validity. When scores were adjusted against an expected standard, underperforming trainees could be identified and reliability was much improved. Taken together with other evidence of trainee ability, the mini-CEX is of sufficient reliability for inclusion in high-stakes decisions on trainee progression towards independent specialist practice.
Affiliation(s)
- J M Weller
- Centre for Medical and Health Sciences Education, School of Medicine, University of Auckland, New Zealand; Department of Anaesthesia, Auckland City Hospital, New Zealand
- D J Castanelli
- Department of Anaesthesia and Perioperative Medicine, Monash Health, Victoria, Australia; Department of Anaesthesia and Perioperative Medicine, Monash University, Clayton, Victoria, Australia
- Y Chen
- Centre for Medical and Health Sciences Education, School of Medicine, University of Auckland, New Zealand
- B Jolly
- Medical Education Unit, School of Medicine and Public Health, Faculty of Health and Medicine, University of Newcastle, New South Wales, Australia

29
Lewis TL, Sagmeister ML, Miller GW, Boissaud-Cooke MA, Abrahams PH. Anatomy, radiology, and practical procedure education for foundation doctors in England: A National Observational Study. Clin Anat 2016; 29:982-990. [DOI: 10.1002/ca.22783]
Affiliation(s)
- Thomas L. Lewis
- St George's Hospital, Blackshaw Road, London SW17 0QT, United Kingdom
- George W. Miller
- King's College London School of Medicine, Strand, London WC2R 2LS, United Kingdom
- Peter H. Abrahams
- Warwick Medical School, University of Warwick, Gibbet Hill Road, Coventry CV4 7AL, United Kingdom

30
Watling C, LaDonna KA, Lingard L, Voyer S, Hatala R. 'Sometimes the work just needs to be done': socio-cultural influences on direct observation in medical training. Medical Education 2016; 50:1054-64. [PMID: 27628722] [DOI: 10.1111/medu.13062]
Abstract
CONTEXT Direct observation promises to strengthen both coaching and assessment, and calls for its increased use in medical training abound. Despite its apparent potential, the uptake of direct observation in medical training remains surprisingly limited outside the formal assessment setting. The limited uptake of observation raises questions about cultural barriers to its use. In this study, we explore the influence of professional culture on the use of direct observation within medical training. METHODS Using a constructivist grounded theory approach, we interviewed 22 residents or fellows (10 male, 12 female) about their experiences of being observed during training. Participants represented a range of specialties and training levels. Data collection and analysis were conducted iteratively. Themes were identified using constant comparative analysis. RESULTS Observation was used selectively; specialties tended to observe the clinical acts that they valued most. Despite these differences, we found two cultural values that consistently challenged the ready implementation of direct observation across specialties: (i) autonomy in learning and (ii) efficiency in health care provision. Furthermore, we found that direct observation was a primarily learner-driven activity, which left learners caught in the middle, wanting observation but also wanting to appear independent and efficient. CONCLUSIONS The cultural values of autonomy in learning and practice and efficiency in health care provision challenge the integration of direct observation into clinical training. Medical learners are often expected to ask for observation, but such requests are socially and culturally fraught, and likely to constrain the wider uptake of direct observation.
Affiliation(s)
- Christopher Watling
- Department of Clinical Neurological Sciences, Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada
- Kori A LaDonna
- Centre for Education Research & Innovation, Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada
- Lorelei Lingard
- Department of Medicine and Centre for Education Research & Innovation, Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada
- Stephane Voyer
- Department of Medicine, University of British Columbia, Vancouver, British Columbia, Canada
- Rose Hatala
- Department of Medicine, University of British Columbia, Vancouver, British Columbia, Canada

31
Gaunt A, Patel A, Royle J, Fallis S, Almond M, Mylvaganam S, Rusius V, Markham DH, Pawlikowska T. What do surgeons and trainees think of WBAs and how do they use them? The Bulletin of the Royal College of Surgeons of England 2016. [DOI: 10.1308/rcsbull.2016.408]
Abstract
Differences in how workplace-based assessments are viewed by trainees and those who train them.
Affiliation(s)
- A Gaunt
- Heart of England NHS Foundation Trust
- A Patel
- University Hospitals of Coventry and Warwickshire NHS Trust
- J Royle
- Shrewsbury and Telford Hospital NHS Trust
- M Almond
- Heart of England NHS Foundation Trust
- V Rusius
- Dudley Group NHS Foundation Trust

32
Maweni RM, Foley RW, Lupi M, Shier D, Ronan O'Connell P, Vig S. Surgical learning activities for house officers: do they improve the surgical experience? Ir J Med Sci 2016; 185:913-919. [PMID: 27585806] [DOI: 10.1007/s11845-016-1495-6]
Abstract
OBJECTIVE To ascertain whether house officers (HOs) attain a more satisfactory surgical rotation experience when they perform basic surgical learning activities. We also sought to establish how many, and which, learning activities HOs achieve and the effect on their surgical experience. METHODOLOGY A questionnaire listing 20 learning activities, with questions regarding satisfaction with the overall experience, was disseminated to HOs in the UK and Ireland who had completed ≥3 months of surgical rotations. Satisfaction with surgical experience was dichotomised in order to perform logistic regression using R Studio software v0.98. RESULTS The survey was completed by 150 respondents, with 26% completing at least 10 basic surgical learning activities during their surgical rotation. On multivariate analysis, the completion of these learning activities was significantly associated with a satisfactory rotation experience (p < 0.001). Furthermore, the use of a checklist of surgical activities provided to HOs was associated with a significant increase in the performance of learning activities (p = 0.003). CONCLUSION Surgical HOs who were informed about potential basic surgical learning activities that can be performed during their rotations performed significantly more of these activities, and these activities were associated with significantly greater satisfaction with surgical rotations. We therefore recommend facilitating HOs' completion of these activities, as this will ensure that basic surgical competencies are achieved and that HOs are more satisfied with their surgical experience.
Affiliation(s)
- R M Maweni
- Croydon University Hospital, London, UK
- R W Foley
- UCD School of Medicine and Medical Science, University College Dublin, Dublin, Ireland
- M Lupi
- Croydon University Hospital, London, UK
- D Shier
- Kingston Hospital, London, UK
- P Ronan O'Connell
- UCD School of Medicine and Medical Science, University College Dublin, Dublin, Ireland
- St Vincent's University Hospital, Dublin, Ireland
- S Vig
- Croydon University Hospital, London, UK

33
Massie J, Ali JM. Workplace-based assessment: a review of user perceptions and strategies to address the identified shortcomings. Advances in Health Sciences Education: Theory and Practice 2016; 21:455-73. [PMID: 26003590] [DOI: 10.1007/s10459-015-9614-0]
Abstract
Workplace-based assessments (WBAs) are now commonplace in postgraduate medical training. User acceptability and engagement are essential to the success of any medical education innovation. To this end, insight into trainee and trainer perceptions of WBAs will help identify the major problems, permitting strategies to be introduced to improve WBA implementation. A review of the literature was performed to identify studies examining trainee and trainer perceptions of WBAs. Studies were excluded if they were not in English or sampled a non-medical/dental population. The identified literature was synthesised for the purpose of this critical narrative review. It is clear that there is widespread negativity towards WBAs in the workplace. This has negatively impacted the effectiveness of WBA tools as learning aids. This negativity exists in trainees but also, to an extent, in their trainers. Insight gained from the literature reveals three dominant problems with WBA implementation: poor understanding of the purpose of WBAs; insufficient time available for undertaking these assessments; and inadequate training of trainers. Approaches to addressing these three problems are discussed; it is likely that a variety of solutions will be required. The prevalence of negativity towards WBAs is substantial in both trainees and trainers, eroding the effectiveness of the learning that is consequent upon them. The educational community must now listen to the concerns being raised by users and consider the range of strategies being proposed to improve the experiences of trainees and their trainers.
Affiliation(s)
- Jonathan Massie
- School of Clinical Medicine, University of Cambridge, Cambridge, UK
- Jason M Ali
- Department of Surgery, University of Cambridge, Box 202, Addenbrooke's Hospital, Cambridge, CB2 0QQ, UK

34
Foley T, Walsh E, Sweeney C, James M, Maher B, O'Flynn S. Training the assessors: A Mini-CEX workshop for GPs who assess undergraduate medical students. Education for Primary Care 2016; 26:446-7. [PMID: 26808954] [DOI: 10.1080/14739879.2015.1101860]
Affiliation(s)
- Tony Foley
- General Practitioner & Lecturer, Department of General Practice, University College Cork
- Elaine Walsh
- General Practitioner & Lecturer, Department of General Practice, University College Cork
- Catherine Sweeney
- Lecturer and Lead for Faculty Development, Medical Education Unit, University College Cork
- Mark James
- Lecturer in Ophthalmology, Medical Education Unit, University College Cork
- Bridget Maher
- Senior Lecturer, Medical Education Unit, University College Cork
- Siún O'Flynn
- Head of Medical Education Unit, University College Cork

35
Lefroy J, Watling C, Teunissen PW, Brand P. Guidelines: the do's, don'ts and don't knows of feedback for clinical education. Perspectives on Medical Education 2015; 4:284-299. [PMID: 26621488] [PMCID: PMC4673072] [DOI: 10.1007/s40037-015-0231-7]
Abstract
INTRODUCTION The guidelines offered in this paper aim to amalgamate the literature on formative feedback into practical Do's, Don'ts and Don't Knows for individual clinical supervisors and for the institutions that support clinical learning. METHODS The authors built consensus by an iterative process. Do's and Don'ts were proposed based on the authors' individual teaching experience and awareness of the literature; the amalgamated set of guidelines was then refined by all authors, and the evidence was summarized for each guideline. Don't Knows were identified as important questions for this international group of educators which, if answered, would change practice. The criteria for inclusion of evidence for these guidelines were not those of a systematic review, so indicators of the strength of these recommendations were developed which combine the evidence with the authors' consensus. RESULTS A set of 32 Do and Don't guidelines with the important Don't Knows was compiled, along with a summary of the evidence for each. These are divided into guidelines for the individual clinical supervisor giving feedback to their trainee (recommendations about both the process and the content of feedback) and guidelines for the learning culture (what elements of learning culture support the exchange of meaningful feedback, and what elements constrain it?). CONCLUSION Feedback is not easy to get right, but it is essential to learning in medicine, and there is a wealth of evidence supporting the Do's and warning against the Don'ts. Further research into the critical Don't Knows of feedback is required. A new definition is offered: helpful feedback is a supportive conversation that clarifies the trainee's awareness of their developing competencies, enhances their self-efficacy for making progress, challenges them to set objectives for improvement, and facilitates their development of strategies to enable that improvement to occur.
Affiliation(s)
- Janet Lefroy
- Keele University School of Medicine, Clinical Education Centre RSUH, ST4 6QG, Staffordshire, UK
- Chris Watling
- Schulich School of Medicine and Dentistry, Western University, Ontario, Canada
- Pim W Teunissen
- Maastricht University and VU University Medical Center, Amsterdam, The Netherlands
- Paul Brand
- Isala Klinieken, Zwolle, The Netherlands

36
Rogausch A, Beyeler C, Montagne S, Jucker-Kupper P, Berendonk C, Huwendiek S, Gemperli A, Himmel W. The influence of students' prior clinical skills and context characteristics on mini-CEX scores in clerkships - a multilevel analysis. BMC Medical Education 2015; 15:208. [PMID: 26608836] [PMCID: PMC4658793] [DOI: 10.1186/s12909-015-0490-3]
Abstract
BACKGROUND In contrast to objective structured clinical examinations (OSCEs), mini-clinical evaluation exercises (mini-CEXs) take place at the clinical workplace. As both mini-CEXs and OSCEs assess clinical skills, but within different contexts, this study analyzed to what degree students' mini-CEX scores can be predicted by their recent OSCE scores and/or context characteristics. METHODS Medical students participated in an end-of-Year-3 OSCE and in 11 mini-CEXs during 5 different clerkships in Year 4. Each student's mean score across 9 clinical skills OSCE stations, and mean 'overall' and 'domain' mini-CEX scores averaged over all of that student's mini-CEXs, were computed. Linear regression analyses including random effects were used to predict mini-CEX scores from OSCE performance and characteristics of clinics, trainers, students and assessments. RESULTS A total of 512 trainers in 45 clinics provided 1783 mini-CEX ratings for 165 students; OSCE results were available for 144 students (87%). Most influential for the prediction of 'overall' mini-CEX scores was the trainers' clinical position, with a regression coefficient of 0.55 (95% CI: 0.26-0.84; p < .001) for residents compared with heads of department. Highly complex tasks and assessments taking place in large clinics also significantly enhanced 'overall' mini-CEX scores. In contrast, high OSCE performance did not significantly increase 'overall' mini-CEX scores. CONCLUSION In our study, mini-CEX scores depended on context characteristics rather than on students' clinical skills as demonstrated in an OSCE. Ways to either enhance the scores' validity or to rely on narrative comments alone are discussed.
Affiliation(s)
- Anja Rogausch
- Department of Assessment and Evaluation, Institute of Medical Education, University of Bern, Bern, Switzerland
- Clinic Sonnenhalde, Riehen, Switzerland
- Christine Beyeler
- Department of Assessment and Evaluation, Institute of Medical Education, University of Bern, Bern, Switzerland
- Stephanie Montagne
- Department of Assessment and Evaluation, Institute of Medical Education, University of Bern, Bern, Switzerland
- Patrick Jucker-Kupper
- Department of Assessment and Evaluation, Institute of Medical Education, University of Bern, Bern, Switzerland
- Christoph Berendonk
- Department of Assessment and Evaluation, Institute of Medical Education, University of Bern, Bern, Switzerland
- Sören Huwendiek
- Department of Assessment and Evaluation, Institute of Medical Education, University of Bern, Bern, Switzerland
- Armin Gemperli
- Department of Health Sciences and Health Policy, University of Lucerne, Lucerne, Switzerland
- Swiss Paraplegic Research, Nottwil, Switzerland
- Wolfgang Himmel
- Department of General Practice, University Medical Center, Göttingen, Germany

37
Nesbitt C, Phillips AW, Searle R, Stansby G. Student Views on the Use of 2 Styles of Video-Enhanced Feedback Compared to Standard Lecture Feedback During Clinical Skills Training. Journal of Surgical Education 2015; 72:969-973. [PMID: 26143520] [DOI: 10.1016/j.jsurg.2015.04.017]
Abstract
BACKGROUND Feedback plays an important role in the learning process. However, it is often delivered in an unstructured fashion that can detract from its potential benefit. Further, students may have different preferences for how feedback should be delivered, which may be influenced by which method they feel will lead to the most effective learning. The aim of this study was to evaluate student views on 3 different modes of feedback, particularly in relation to the benefit each conferred. METHODS Undergraduate medical students participating in a surgical suturing study were asked to give feedback using a semi-structured questionnaire. Discrete questions using a Likert scale and open responses were solicited. Students received either standard lecture feedback (SLF), individualized video feedback (IVF), or enhanced unsupervised video feedback (UVF). RESULTS Students had a strong preference for IVF over UVF or SLF. These responses correlated with their perception of how much each type of feedback improved their performance. However, there was no statistical difference in suturing skill improvement between IVF and UVF, both of which were significantly better than SLF. CONCLUSION Students have a strong preference for IVF. This relates to a perception that it will lead to the greatest level of skill improvement. However, an equal improvement can be achieved using the less resource-demanding UVF.
Affiliation(s)
- Craig Nesbitt
- Colorectal Surgery, Sunderland Royal Hospital, Sunderland, United Kingdom
- Alex W Phillips
- Colorectal Surgery, Sunderland Royal Hospital, Sunderland, United Kingdom
- Roger Searle
- Colorectal Surgery, Sunderland Royal Hospital, Sunderland, United Kingdom
- Gerard Stansby
- Colorectal Surgery, Sunderland Royal Hospital, Sunderland, United Kingdom

38
Kirby J, Baird D, Burton K, Taylor E. Practitioner research and formative assessment. Clinical Teacher 2015; 13:28-32. [PMID: 26177776] [DOI: 10.1111/tct.12346]
Abstract
BACKGROUND Several workplace-based formative assessment (WFA) tools exist; however, their text-based feedback is minimal, even though such feedback is valuable because residents have difficulty extracting meaning from numerical scores or rankings. Our programme lacked a formal WFA programme, so we aimed to develop and assess a primarily text-based tool, named formative assessment of skills in training (FAST), using action research. METHODS We used action research (AR) methods, including iterative plan-act-observe-reflect cycles involving the FAST tool and our clinical context. Thirteen residents and 11 faculty members performed 133 assessments during three study cycles and responded to post-use surveys. RESULTS Overall, 83 per cent of participants indicated that FAST should be added to the resident curriculum. Time was perceived to be a barrier; however, time studies found that FAST did not prolong resident waits or patient visits. Changes were made to increase the space for comments on the FAST form, and comment specificity and length subsequently increased. DISCUSSION The FAST facilitates WFA in our programme and allows for specific written feedback, with a copy retained by the resident. On average, it adds about 3 minutes to a clinic visit. The AR method facilitated WFA use, stakeholder buy-in, and FAST tool improvements.
Affiliation(s)
- Joslyn Kirby
- Department of Dermatology, Penn State Hershey Medical Center, Hershey, Pennsylvania, USA
- David Baird
- Department of Dermatology, Penn State Hershey Medical Center, Hershey, Pennsylvania, USA
- Kaleen Burton
- Penn State College of Medicine, Hershey, Pennsylvania, USA
- Edward Taylor
- Department of Behavioral Sciences and Education, Penn State Harrisburg, Middletown, Pennsylvania, USA

39
Heeneman S, Oudkerk Pool A, Schuwirth LWT, van der Vleuten CPM, Driessen EW. The impact of programmatic assessment on student learning: theory versus practice. Medical Education 2015; 49:487-98. [PMID: 25924124] [DOI: 10.1111/medu.12645]
Abstract
CONTEXT It is widely acknowledged that assessment can affect student learning. In recent years, attention has been called to 'programmatic assessment', which is intended to optimise both learning functions and decision functions at the programme level of assessment, rather than according to individual methods of assessment. Although the concept is attractive, little research into its intended effects on students and their learning has been conducted. OBJECTIVES This study investigated the elements of programmatic assessment that students perceived as supporting or inhibiting learning, and the factors that influenced the active construction of their learning. METHODS The study was conducted in a graduate-entry medical school that implemented programmatic assessment. Thus, all assessment information, feedback and reflective activities were combined into a comprehensive, holistic programme of assessment. We used a qualitative approach and interviewed students (n = 17) in the pre-clinical phase of the programme about their perceptions of programmatic assessment and learning approaches. Data were scrutinised using theory-based thematic analysis. RESULTS Elements from the comprehensive programme of assessment, such as feedback, portfolios, assessments and assignments, were found to have both supporting and inhibiting effects on learning. These supporting and inhibiting elements influenced students' construction of learning. Findings showed that: (i) students perceived formative assessment as summative; (ii) programmatic assessment was an important trigger for learning, and (iii) the portfolio's reflective activities were appreciated for their generation of knowledge, the lessons drawn from feedback, and the opportunities for follow-up. Some students, however, were less appreciative of reflective activities. For these students, the elements perceived as inhibiting seemed to dominate the learning response. CONCLUSIONS The active participation of learners in their own learning is possible when learning is supported by programmatic assessment. Certain features of the comprehensive programme of assessment were found to influence student learning, and this influence can either support or inhibit students' learning responses.
Affiliation(s)
- Sylvia Heeneman
- Department of Pathology, Faculty of Health, Medicine and Life Sciences, Maastricht University, Maastricht, The Netherlands; School of Health Professions Education, Faculty of Health, Medicine and Life Sciences, Maastricht University, Maastricht, The Netherlands

40
Christen HJ, Kordonouri O, Lange K, Berendonk C. Pilotstudie zum interprofessionellen Feedback in der pädiatrischen Weiterbildung [Pilot study on interprofessional feedback in paediatric postgraduate training]. Monatsschr Kinderheilkd 2015. [DOI: 10.1007/s00112-015-3324-9]
41
Gingerich A, Kogan J, Yeates P, Govaerts M, Holmboe E. Seeing the 'black box' differently: assessor cognition from three research perspectives. Medical Education 2014; 48:1055-68. [PMID: 25307633] [DOI: 10.1111/medu.12546]
Abstract
CONTEXT Performance assessments, such as workplace-based assessments (WBAs), represent a crucial component of assessment strategy in medical education. Persistent concerns about rater variability in performance assessments have resulted in a new field of study focusing on the cognitive processes used by raters, or more inclusively, by assessors. METHODS An international group of researchers met regularly to share and critique key findings in assessor cognition research. Through iterative discussions, they identified the prevailing approaches to assessor cognition research and noted that each of them was based on nearly disparate theoretical frameworks and literatures. This paper aims to provide a conceptual review of the different perspectives used by researchers in this field using the specific example of WBA. RESULTS Three distinct, but not mutually exclusive, perspectives on the origins and possible solutions to variability in assessment judgements emerged from the discussions within the group of researchers: (i) the assessor as trainable: assessors vary because they do not apply assessment criteria correctly, use varied frames of reference and make unjustified inferences; (ii) the assessor as fallible: variations arise as a result of fundamental limitations in human cognition that mean assessors are readily and haphazardly influenced by their immediate context, and (iii) the assessor as meaningfully idiosyncratic: experts are capable of making sense of highly complex and nuanced scenarios through inference and contextual sensitivity, which suggests assessor differences may represent legitimate experience-based interpretations. CONCLUSIONS Although each of the perspectives discussed in this paper advances our understanding of assessor cognition and its impact on WBA, every perspective has its limitations. Following a discussion of areas of concordance and discordance across the perspectives, we propose a coexistent view in which researchers and practitioners utilise aspects of all three perspectives with the goal of advancing assessment quality and ultimately improving patient care.
Affiliation(s)
- Andrea Gingerich
- Northern Medical Program, University of Northern British Columbia, Prince George, British Columbia, Canada
42
Rees CE, Cleland JA, Dennis A, Kelly N, Mattick K, Monrouxe LV. Supervised learning events in the foundation programme: a UK-wide narrative interview study. BMJ Open 2014; 4:e005980. [PMID: 25324323 PMCID: PMC4202004 DOI: 10.1136/bmjopen-2014-005980] [Citation(s) in RCA: 26] [Impact Index Per Article: 2.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 11/29/2022] Open
Abstract
OBJECTIVES To explore Foundation trainees' and trainers' understandings and experiences of supervised learning events (SLEs), compared with workplace-based assessments (WPBAs), and their suggestions for developing SLEs. DESIGN A narrative interview study based on 55 individual and 19 group interviews. SETTING UK-wide study across three sites in England, Scotland and Wales. PARTICIPANTS Using maximum-variation sampling, 70 Foundation trainees and 40 trainers were recruited, shared their understandings and experiences of SLEs/WPBAs and made recommendations for future practice. METHODS Data were analysed using thematic and discourse analysis and narrative analysis of one exemplar personal incident narrative. RESULTS While participants volunteered understandings of SLEs as learning and assessment, they typically volunteered understandings of WPBAs as assessment. Trainers seemed more likely to describe SLEs as assessment and a 'safety net' to protect patients than trainees. We identified 333 personal incident narratives in our data (221 SLEs; 72 WPBAs). There was perceived variability in the conduct of SLEs/WPBAs in terms of their initiation, tools used, feedback and finalisation. Numerous factors at individual, interpersonal, cultural and technological levels were thought to facilitate/hinder learning. SLE narratives were more likely to be evaluated positively than WPBA narratives overall and by trainees specifically. Participants made sense of their experiences, emotions, identities and relationships through their narratives. They provided numerous suggestions for improving SLEs at individual, interpersonal, cultural and technological levels. CONCLUSIONS Our findings provide tentative support for the shift to formative learning with the introduction of SLEs, albeit raising concerns around trainees' and trainers' understandings about SLEs. We identify five key educational recommendations from our study. 
Additional research is now needed to explore further the complexities around SLEs within workplace learning.
Affiliation(s)
- Charlotte E Rees
- Centre for Medical Education, Medical Education Institute, School of Medicine, University of Dundee, Dundee, UK
- Jennifer A Cleland
- Division of Medical and Dental Education, University of Aberdeen, Aberdeen, UK
- Ashley Dennis
- Centre for Medical Education, Medical Education Institute, School of Medicine, University of Dundee, Dundee, UK
- Narcie Kelly
- University of Exeter Medical School, University of Exeter, Exeter, UK
- Karen Mattick
- University of Exeter Medical School, University of Exeter, Exeter, UK
- Lynn V Monrouxe
- Office of Research and Scholarship, Institute of Medical Education, Cardiff University, Cardiff, UK
43
Oliphant R, Drummond R, Jackson A, Ross J, Blackhall V, Ridley-Fink E, Parcell S, Renwick A. The use of mini-CEX in UK foundation training six years following its introduction: lessons still to be learned and the benefit of formal teaching regarding its utility. MEDICAL TEACHER 2014; 36:916. [PMID: 24787134 DOI: 10.3109/0142159x.2014.909922] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/03/2023]
Affiliation(s)
- Raymond Oliphant
- Department of General Surgery, Royal Alexandra Hospital, Corsebar Road, Paisley PA2 9PN, Scotland, UK
44
Fishpool SJC, Stew B, Roberts C. Otolaryngology WBAs in the Wales Deanery: the first six years. THE BULLETIN OF THE ROYAL COLLEGE OF SURGEONS OF ENGLAND 2014. [DOI: 10.1308/rcsbull.2014.96.5.164] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022]
Abstract
‘The provision of excellent care for the surgical patient, delivered safely, is at the heart of the curriculum.’1 Workplace-based assessments (WBAs) are an integral part of the assessment component of the UK’s intercollegiate surgical curriculum. The curriculum is web-based and accessed through www.iscp.ac.uk.
Affiliation(s)
- SJC Fishpool
- Skull Base Fellow, Department of Otolaryngology and Head and Neck Surgery, University of Wales, Cardiff
- B Stew
- Otolaryngology ST6, Department of Otolaryngology and Head and Neck Surgery, Royal Gwent Hospital, Newport
- C Roberts
- Otolaryngology Consultant, Department of Otolaryngology and Head and Neck Surgery, Princess of Wales Hospital, Bridgend