1
Rai P, Goel A, Bhat SK, Singh A, Srivastava R, Singh S. Assessing Residents in the Department of Surgery at a Tertiary Care Centre Using Mini-Clinical Evaluation Exercise (Mini-CEX). Cureus 2024; 16:e58011. [PMID: 38606026; PMCID: PMC11007447; DOI: 10.7759/cureus.58011]
Abstract
OBJECTIVE This study aimed to introduce, sensitize, and train the postgraduate students and faculty of the department of general surgery in the use of the mini-Clinical Evaluation Exercise (mini-CEX) and to assess the perception of students and faculty towards it. MATERIAL AND METHODS A cross-sectional observational study was conducted over a period of four months. Ten surgery residents in the department volunteered to participate, and five professors conducted the sessions. Five nine-point mini-CEX sessions were conducted for each resident in different settings of the out-patient department (OPD) and in-patient department (IPD). A total of five skills were tested. Feedback from faculty and residents regarding their perception of the mini-CEX was also collected. RESULTS A statistically significant difference in mean scores of all domains was observed between the first and last assessments (p<0.05). All residents scored in the superior category (7-9) in all domains at the final assessment, whereas most scored in the satisfactory grade at the first assessment. The time taken for assessment decreased significantly from the first to the last assessment in both OPD and IPD settings (p=0.001). The mini-CEX received unanimously positive feedback from faculty in terms of skill improvement, method, attitude of residents, and ability to identify gaps in knowledge. However, one assessor felt that the time given for assessment was inadequate and that more effort was required than with traditional assessment methods. The problem most commonly identified by residents (50%) was the limited time given during assessment; overall, however, residents also found the tool valid, effective, and helpful in identifying knowledge gaps and improving clinical and communication skills.
CONCLUSION The mini-CEX improves the learning environment in residency and leads to improvement in medical interviewing skills, physical examination skills, humanistic qualities/professionalism, and counseling skills. It can therefore be used for residency training in clinical departments.
Affiliation(s)
- Priyanka Rai: General Surgery, Dr. Ram Manohar Lohia Institute of Medical Sciences, Lucknow, IND
- Apul Goel: Urology, King George's Medical University, Lucknow, IND
- Sanjay K Bhat: Surgery, Dr. Ram Manohar Lohia Institute of Medical Sciences, Lucknow, IND
- Amarjot Singh: Surgery, Dr. Ram Manohar Lohia Institute of Medical Sciences, Lucknow, IND
- Rohit Srivastava: General Surgery, Dr. Ram Manohar Lohia Institute of Medical Sciences, Lucknow, IND
- Sunil Singh: General Surgery, Dr. Ram Manohar Lohia Institute of Medical Sciences, Lucknow, IND
2
Gupta SK, Srivastava T. Assessment in Undergraduate Competency-Based Medical Education: A Systematic Review. Cureus 2024; 16:e58073. [PMID: 38738047; PMCID: PMC11088485; DOI: 10.7759/cureus.58073]
Abstract
BACKGROUND Few studies have methodically compiled the body of research on the competency-based medical education (CBME) assessment process or pinpointed knowledge gaps in the structure of that process. The goals of this study were therefore to examine the CBME assessment framework thoroughly and to create a model assessment framework applicable in the Indian setting. METHODS PubMed, MEDLINE (Ovid), EMBASE (Ovid), Scopus, Web of Science, and Google Scholar were searched. The search was restricted to English-language publications on competency-based education and assessment methods published between January 2006 and December 2020. A descriptive overview of the included studies (in tabular form) served as the foundation for the data synthesis. RESULTS The databases yielded 732 records, of which 36 fulfilled the inclusion and exclusion criteria. The 36 studies comprised a mix of randomized controlled trials, focus group interviews, and questionnaire studies, including cross-sectional studies, qualitative studies (3), and mixed-method studies. The papers were published in 10 different journals, with the greatest number (18) in BMC Medical Education. The average quality score of the included studies was 62.53% (range: 35.71-83.33%). Most authors were from the UK (7), followed by the USA (5).
The included studies were grouped into seven categories based on their dominant focus: moving from a behavioristic to a constructive approach to assessment (1 study), formative assessment (FA) and feedback (10 studies), hurdles in the implementation of feedback (4 studies), computer- or online-based formative tests with automated feedback (5 studies), video feedback (2 studies), e-learning platforms for formative assessment (4 studies), and workplace-based assessment (WBA)/mini-clinical evaluation exercise (mini-CEX)/direct observation of procedural skills (DOPS) (10 studies). CONCLUSIONS Various constructivist techniques, such as concept maps, portfolios, and rubrics, can be used for assessment. Self-regulated learning, peer feedback, online formative assessment, computer-based formative tests with automated feedback, computerized web-based objective structured clinical examination (OSCE) evaluation systems, and narrative feedback instead of numerical scores in the mini-CEX can all increase student involvement in the design and implementation of formative assessment.
Affiliation(s)
- Sandeep K Gupta: Pharmacology, Heritage Institute of Medical Sciences, Varanasi, IND
3
Guttormsen S, Gogollari A, Huynh-Do U, Schaufelberger M, Huwendiek S, Kunz A, Lahner FM. Developing an Instrument to Evaluate Undergraduate Healthcare Students' Professionalism. Praxis 2022; 111:863-870. [PMID: 36415987; DOI: 10.1024/1661-8157/a003934]
Abstract
Professionalism is a multidimensional quality acquired over time, and the undergraduate years lay the foundation for its development. Tools for monitoring students' professional development are needed. Our tool development followed three phases: 1) identifying meaningful criteria for professionalism adapted to the education level, 2) developing an evaluation instrument in a process maximising construct validity, and 3) testing the evaluation instrument in an interprofessional study. The evaluation instrument proved applicable in the field and meets validity standards. Some differences between professions were found and discussed. Professionalism starts to develop during education, and early monitoring is important to support students' optimal development. The evaluation instrument supports both self- and expert evaluation of healthcare students' professional development.
Affiliation(s)
- Sissi Guttormsen: Institute for Medical Education, Medical Faculty University of Bern, Bern, Switzerland
- Artemisa Gogollari: Institute for Medical Education, Medical Faculty University of Bern, Bern, Switzerland
- Uyen Huynh-Do: Division of Nephrology and Hypertension, University Hospital Bern Inselspital, Bern, Switzerland
- Sören Huwendiek: Institute for Medical Education, Medical Faculty University of Bern, Bern, Switzerland
- Alexandra Kunz: Institute for Medical Education, Medical Faculty University of Bern, Bern, Switzerland; Amt für Justizvollzug of Canton Bern, Bern, Switzerland
4
Teichgräber U, Ingwersen M, Bürckenmeyer F, Malouhi A, Arndt C, Herzog A, Franiel T, Mentzel HJ, Aschenbach R. Structured work-based learning in undergraduate clinical radiology immersion experience. BMC Medical Education 2021; 21:167. [PMID: 33731088; PMCID: PMC7972199; DOI: 10.1186/s12909-021-02592-0]
Abstract
BACKGROUND Practical courses in undergraduate medical training often lack a didactic concept, so active participation and learning success largely depend on chance. This study was initiated to evaluate a novel concept of structured work-based learning (WBL) in the course of students' half-day radiology immersion experience (IE). METHODS This prospective, single-centre cohort study included 228 third-year students of the 2019 summer semester who underwent the obligatory radiology IE at a university hospital. The course was based on a novel structured WBL concept that applied established didactic concepts, including blended learning; the FAIR principles of feedback, activity, individualization, and relevance; and Peyton's four-step approach. The outcomes of equal weight were student and supervisor satisfaction with the clinical radiology IE, assessed by paper-based and online surveys, respectively. The secondary outcome was achievement of intended learning outcomes, assessed by means of mini clinical evaluation exercises and personal interviews. RESULTS Satisfaction with structured WBL was high in 99.0% of students; their expectations were exceeded, and they felt taken seriously at the professional level. Dissatisfaction was attributed to the quality of the learning videos (0.6%), little support from supervisors (0.5%), or inadequate feedback (0.6%). Supervising resident physicians rated achievement of intended learning outcomes regarding cognitive and psychomotor competences as excellent for all students. Personal interviews revealed achievement of affective competence in some students. Twelve of 16 (75.0%) supervising physicians were satisfied with the focus on intended learning outcomes and student preparation for the IE. Two of 15 (13.3%) supervisors were unsatisfied with the time spent, and 4 of 16 (25.0%) with the approach to assessment.
CONCLUSIONS This study demonstrated that both students and supervisors were satisfied with the novel concept of structured WBL within the scope of clinical radiology IE. Achievement of intended learning outcomes was promising.
Affiliation(s)
- Ulf Teichgräber, Maja Ingwersen, Florian Bürckenmeyer, Amer Malouhi, Clemens Arndt, Aimée Herzog, Tobias Franiel, Hans-Joachim Mentzel, René Aschenbach: Department of Diagnostic and Interventional Radiology, Friedrich-Schiller-University Jena, Jena University Hospital, Am Klinikum 1, 07747, Jena, Germany
5
Ko JJ, Ballard MS, Shenkier T, Simon J, Roze des Ordons A, Fyles G, Lefresne S, Hawley P, Chen C, McKenzie M, Ghement I, Sanders JJ, Bernacki R, Jones S. Serious Illness Conversation-Evaluation Exercise: A Novel Assessment Tool for Residents Leading Serious Illness Conversations. Palliat Med Rep 2020; 1:280-290. [PMID: 34223487; PMCID: PMC8241377; DOI: 10.1089/pmr.2020.0086]
Abstract
Background/Objectives: The serious illness conversation (SIC) is an evidence-based framework for conversations with patients about a serious illness diagnosis. The objective of our study was to develop and validate a novel tool, the SIC-evaluation exercise (SIC-Ex), to facilitate assessment of resident-led conversations with oncology patients. Design: We developed the SIC-Ex based on the SIC and on the Royal College of Canada Medical Oncology milestones. Seven resident trainees and 10 evaluators were recruited. Each trainee conducted an SIC with a patient, which was videotaped. The evaluators watched the videos and evaluated each trainee using the novel SIC-Ex and the reference Calgary-Cambridge guide (CCG) at months zero and three. We used Kane's validity framework to assess validity. Results: Intra-class correlation using average SIC-Ex scores showed a moderate level of inter-evaluator agreement (range 0.523–0.822). Most evaluators rated a given resident close to the group average, with one to two evaluator outliers in each domain. Test–retest reliability showed a moderate level of consistency between SIC-Ex scores at months zero and three. Global ratings at zero and three months showed fair to good/very good inter-evaluator correlation. Pearson correlation coefficients comparing total SIC-Ex and CCG scores were high for most evaluators. Trainees' self-scores did not correlate well with evaluators' scores. Conclusions: The SIC-Ex is the first assessment tool that provides evidence for incorporating the SIC guide framework into the evaluation of resident competence. The SIC-Ex is conceptually related to, but more specific than, the CCG in evaluating serious illness conversation skills.
Affiliation(s)
- Jenny J Ko: Department of Medical Oncology, University of British Columbia, BC Cancer-Abbotsford, Abbotsford, British Columbia, Canada
- Mark S Ballard: Department of Internal Medicine, Chilliwack General Hospital, Chilliwack, British Columbia, Canada
- Tamara Shenkier: Department of Medical Oncology, BC Cancer-Vancouver, Vancouver, British Columbia, Canada
- Jessica Simon: Department of Oncology, University of Calgary, Calgary, Alberta, Canada
- Gillian Fyles: BC Centre for Palliative Care, Vancouver, British Columbia, Canada
- Shilo Lefresne: Department of Radiation Oncology, BC Cancer-Vancouver, Vancouver, British Columbia, Canada
- Philippa Hawley: Pain and Symptom Management/Palliative Care Program, BC Cancer-Vancouver, Vancouver, British Columbia, Canada
- Charlie Chen: Department of Oncology, University of Calgary, Calgary, Alberta, Canada
- Michael McKenzie: Department of Radiation Oncology, BC Cancer-Vancouver, Vancouver, British Columbia, Canada
- Justin J Sanders: Ariadne Labs, Dana-Farber Cancer Institute, Boston, Massachusetts, USA
- Rachelle Bernacki: Ariadne Labs, Dana-Farber Cancer Institute, Boston, Massachusetts, USA
- Scott Jones: Vancouver Coastal Health, Vancouver, British Columbia, Canada
6
Graddy R, Reynolds SS, Wright SM. Longitudinal resident coaching in the outpatient setting: A novel intervention to improve ambulatory consultation skills. Perspectives on Medical Education 2020; 9:186-190. [PMID: 32232781; PMCID: PMC7283426; DOI: 10.1007/s40037-020-00573-5]
Abstract
BACKGROUND Direct observation with feedback to learners should be a mainstay in resident education, yet it is infrequently done and its impact on consultation skills has rarely been assessed. APPROACH This project presents the framework and implementation of a longitudinal low-frequency, high-intensity direct observation and coaching intervention, and elaborates on insights learned. Internal medicine interns at one residency training program were randomized to an ambulatory coaching intervention or usual precepting. Over one year, coached interns had three complete primary care visits directly observed by a faculty clinician-coach who provided feedback informed by a behavior checklist. Immediately after each of the coached patient encounters, interns completed a structured self-assessment and coaches led a 30-minute feedback session informed by intern self-reflection and checklist items. Interns with usual precepting had two mini-CEX observations over the course of the year without other formal direct observation in the ambulatory setting. EVALUATION As part of the post-intervention assessment, senior faculty members blinded to intervention and control group assignments evaluated videotaped encounters. Coached interns completed an average of 21/23 behaviors from the checklist, while interns from the control group completed 18 (p < 0.05). The median overall grade for coached interns was B+, compared to B-/C+ for controls (p < 0.05). REFLECTION Coaching interns longitudinally using a behavior checklist is feasible and associated with improved consultation performance. Direct observation of complete clinical encounters followed by systematic coaching is educationally valuable, but time and resource intensive.
Affiliation(s)
- Ryan Graddy: Division of Addiction Medicine, Department of Medicine, Johns Hopkins Bayview Medical Center, Johns Hopkins University School of Medicine, Baltimore, MD, USA
- Stasia S Reynolds: Division of General Internal Medicine, Department of Medicine, Johns Hopkins Bayview Medical Center, Johns Hopkins University School of Medicine, Baltimore, MD, USA
- Scott M Wright: Division of General Internal Medicine, Department of Medicine, Johns Hopkins Bayview Medical Center, Johns Hopkins University School of Medicine, Baltimore, MD, USA
7
Simulated patient-based teaching of medical students improves pre-anaesthetic assessment. Eur J Anaesthesiol 2020; 37:387-393. [DOI: 10.1097/eja.0000000000001139]
8
Mortaz Hejri S, Jalili M, Masoomi R, Shirazi M, Nedjat S, Norcini J. The utility of mini-Clinical Evaluation Exercise in undergraduate and postgraduate medical education: A BEME review: BEME Guide No. 59. Medical Teacher 2020; 42:125-142. [PMID: 31524016; DOI: 10.1080/0142159x.2019.1652732]
Abstract
Background: This BEME review aims to explore, analyze, and synthesize the evidence on the utility of the mini-CEX for assessing undergraduate and postgraduate medical trainees, specifically its reliability, validity, educational impact, acceptability, and cost. Methods: This registered BEME review applied a systematic search strategy in seven databases to identify studies on the validity, reliability, educational impact, acceptability, or cost of the mini-CEX. Data extraction and quality assessment were carried out by two authors; discrepancies were resolved by a third reviewer. Descriptive synthesis was mainly used to address the review questions, and a meta-analysis was performed for Cronbach's alpha. Results: Fifty-eight papers were included. Only two studies evaluated all five utility criteria. Forty-seven (81%) of the included studies met seven or more of the quality criteria. Cronbach's alpha ranged from 0.58 to 0.97 (weighted mean = 0.90). Reported G coefficients, standard errors of measurement, and confidence intervals were diverse and varied with the number of encounters and the nested or crossed design of the study. The calculated number of encounters needed for a desirable G coefficient also varied greatly. Content coverage was reported as satisfactory in several studies. The mini-CEX discriminated between various levels of competency. Factor analyses revealed a single dimension. The six competencies correlated highly and significantly with overall competence. Moderate to high correlations between mini-CEX scores and other clinical exams were reported. The mini-CEX improved students' performance in other examinations. By providing a framework for structured observation and feedback, the mini-CEX exerts a favorable educational impact. The included studies revealed that feedback was provided in most encounters, but its quality was questionable. Completion rates were generally above 50%. Feasibility and high satisfaction were reported. Conclusion: The mini-CEX has reasonable validity, reliability, and educational impact. Acceptability and feasibility should be interpreted in light of the required number of encounters.
Affiliation(s)
- Sara Mortaz Hejri: Department of Medical Education, School of Medicine, Tehran University of Medical Sciences, Tehran, Iran
- Mohammad Jalili: Department of Medical Education, School of Medicine, Tehran University of Medical Sciences, Tehran, Iran; Department of Emergency Medicine, School of Medicine, Tehran University of Medical Sciences, Tehran, Iran
- Rasoul Masoomi: Department of Medical Education, School of Medicine, Tehran University of Medical Sciences, Tehran, Iran
- Mandana Shirazi: Department of Medical Education, School of Medicine, Tehran University of Medical Sciences, Tehran, Iran; Department of Clinical Science and Education at SOS Hospital, Karolinska Institutet, Stockholm, Sweden
- Saharnaz Nedjat: Department of Epidemiology and Biostatistics, School of Public Health, Tehran University of Medical Sciences, Tehran, Iran
- John Norcini: Foundation for Advancement of International Medical Education and Research (FAIMER), Philadelphia, PA, USA