1
Cardella L, Lang V, Cross W, Mooney C. Applying a Competency-Based Medical Education Framework to Development of Residents' Feedback Skills. Academic Psychiatry 2024; 48:329-333. PMID: 38740718. DOI: 10.1007/s40596-024-01973-z.
Abstract
OBJECTIVE Feedback is a critically important tool in medical education. This pilot program applies and evaluates a competency-based approach to develop residents' skills in providing feedback to medical students. METHODS In 2018-2019, a competency-based resident feedback skills program incorporating videorecording of skills, multi-source feedback using assessment tools with validity evidence, and sequential deliberate practice was piloted in a single-center, prospective study at the University of Rochester. Study participants included eight second-year psychiatry residents and 23 third-year clerkship students. After an introduction to foundational feedback concepts in didactic sessions, residents were videorecorded providing feedback to medical students. Recordings were reviewed with a faculty member for feedback. Skills were assessed by students who had received resident feedback, residents, and faculty utilizing a tool with validity evidence. Observations were repeated a total of three times. RESULTS Mean feedback scores increased from 2.70 at the first feedback observation, to 2.77 at the second feedback observation, to 2.89 at the third feedback observation (maximum 3.00 points). The differences between the first and third sessions (0.19) and second and third sessions (0.12) were statistically significant (p values were < .001 and .007, with SE of 0.4 and 0.4, respectively). CONCLUSIONS The observed competency-based feedback skills training program for residents using sequential, multi-source review and feedback was feasible and effective. Direct observation is a key component of high-quality feedback, and videorecording is an efficient methodology for observations, enabling both direct observation by the assessor and opportunity for enhanced self-assessment by residents viewing themselves in the feedback encounter.
Affiliation(s)
- Laura Cardella
- University of Rochester School of Medicine and Dentistry, Rochester, NY, USA
- Valerie Lang
- University of Rochester School of Medicine and Dentistry, Rochester, NY, USA
- Wendi Cross
- University of Rochester School of Medicine and Dentistry, Rochester, NY, USA
- Christopher Mooney
- University of Rochester School of Medicine and Dentistry, Rochester, NY, USA
2
Kawaguchi S, Myers J, Li M, Kurahashi AM, Sirianni G, Siemens I. Entrustable Professional Activities in Palliative Medicine: A Faculty and Learner Development Activity. J Palliat Med 2024. PMID: 39007218. DOI: 10.1089/jpm.2023.0682.
Abstract
Background: Faculty development (FD) is critical to the implementation of competency-based medical education (CBME) and yet evidence to guide the design of FD activities is limited. Our aim with this study was to describe and evaluate an FD activity as part of CBME implementation. Methods: Palliative medicine faculty were introduced to entrustable professional activities (EPAs) and gained experience estimating a learner's level of readiness for entrustment by directly observing a simulated encounter. The variation that was found among assessments was discussed in facilitated debrief sessions. Attitudes and confidence levels were measured 1 week and 6 months following debriefs. Results: Participants were able to use the EPA framework when estimating the learner's readiness level for entrustment. Significant improvements in attitudes and level of confidence for several knowledge, skill, and behavior domains were maintained over time. Conclusions: Simulated direct observation and facilitated debriefs contributed to preparing both faculty and learners for CBME and EPA implementation.
Affiliation(s)
- Sarah Kawaguchi
- Division of Palliative Care, Department of Family and Community Medicine, Sinai Health System, University of Toronto, Toronto, Canada
- Jeff Myers
- Division of Palliative Care, Department of Family and Community Medicine, Sinai Health System, University of Toronto, Toronto, Canada
- Melissa Li
- Division of Palliative Care, Department of Family and Community Medicine, Toronto Western Hospital, University Health Network, University of Toronto, Toronto, Canada
- Allison M Kurahashi
- The Temmy Latner Centre for Palliative Care, Sinai Health System, Toronto, Canada
- Giovanna Sirianni
- Division of Palliative Care, Department of Family and Community Medicine, Sunnybrook Health Sciences Centre, University of Toronto, Toronto, Canada
- Isaac Siemens
- Division of Palliative Care, Department of Family and Community Medicine, Sinai Health System, University of Toronto, Toronto, Canada
3
Favier R, Proot J, Matiasovic M, Roos A, Knaake F, van der Lee A, den Toom M, Paes G, van Oostrom H, Verstappen F, Beukers M, van den Herik T, Bergknut N. Towards a flexible and personalised development of veterinarians and veterinary nurses working in a companion animal referral care setting. Vet Med Sci 2024; 10:e1518. PMID: 38952266. PMCID: PMC11217593. DOI: 10.1002/vms3.1518.
Abstract
In the Netherlands, the demand for veterinarians and veterinary nurses (VNs) working within referral care is rapidly growing and currently exceeds the number of available board-certified specialists. At the same time, a transparent structure to guide the training and development, and to assess the quality, of non-specialist veterinarians and VNs working in a referral setting is lacking. In response, we developed learning pathways guided by an entrustable professional activity (EPA) framework and programmatic assessment to support the personalised development and competence of veterinarians and VNs working in referral settings. Between 4 and 35 EPAs, varying per discipline (n = 11), were developed. To date, 20 trainees across five disciplines have been entrusted. Trainees from these learning pathways have proceeded to acquire new EPAs in addition to their already entrusted set, or have progressed to specialist training during (n = 3) or after successfully completing (n = 1) the learning pathway. Owing to their outcome-based approach, the learning pathways support flexible ways of development.
Affiliation(s)
- Joachim Proot
- Evidensia Dierenziekenhuis Barendrecht, Barendrecht, The Netherlands
- Arno Roos
- Evidensia Dierenziekenhuis Nieuwegein, Nieuwegein, The Netherlands
- Frans Knaake
- Evidensia Dierenziekenhuis Den Haag, Den Haag, The Netherlands
- Geert Paes
- IVC Evidensia the Netherlands, Vleuten, The Netherlands
- Hugo van Oostrom
- Evidensia Dierenziekenhuis Barendrecht, Barendrecht, The Netherlands
- Evidensia Dierenziekenhuis Arnhem, Arnhem, The Netherlands
- Martijn Beukers
- Evidensia Dierenziekenhuis Barendrecht, Barendrecht, The Netherlands
- Evidensia Dierenziekenhuis Hart van Brabant, Waalwijk, The Netherlands
- Niklas Bergknut
- Evidensia Dierenziekenhuis Hart van Brabant, Waalwijk, The Netherlands
4
Ekman N, Fors A, Moons P, Boström E, Taft C. Are the content and usability of a new direct observation tool adequate for assessing competency in delivering person-centred care: a think-aloud study with patients and healthcare professionals in Sweden. BMJ Open 2024; 14:e085198. PMID: 38950999. DOI: 10.1136/bmjopen-2024-085198.
Abstract
OBJECTIVE To evaluate the content and usability of a new direct observation tool for assessing competency in delivering person-centred care based on the Gothenburg Centre for Person-Centred Care (gPCC) framework. DESIGN This is a qualitative study using think-aloud techniques and retrospective probing interviews and analyzed using deductive content analysis. SETTING Sessions were conducted remotely via Zoom with participants in their homes or offices. PARTICIPANTS 11 participants with lengthy experience of receiving, delivering and/or implementing gPCC were recruited using purposeful sampling and selected to represent a broad variety of stakeholders and potential end-users. RESULTS Participants generally considered the content of the four main domains of the tool, that is, person-centred care activities, clinician manner, clinician skills and person-centred care goals, to be comprehensive and relevant for assessing person-centred care in general and gPCC in particular. Some participants pointed to the need to expand person-centred care activities to better reflect the emphasis on eliciting patient resources/capabilities and psychosocial needs in the gPCC framework. Think-aloud analyses revealed some usability issues primarily regarding difficulties or uncertainties in understanding several words and in using the rating scale. Probing interviews indicated that these problems could be mitigated by improving written instructions regarding response options and by replacing some words. Participants generally were satisfied with the layout and structure of the tool, but some suggested enlarging font size and text spacing to improve readability. CONCLUSION The tool appears to satisfactorily cover major person-centred care activities outlined in the gPCC framework. 
The inclusion of content concerning clinician manner and skills was seen as a relevant embellishment of the framework and as contributing to a more comprehensive assessment of clinician performance in the delivery of person-centred care. A revised version addressing observed content and usability issues will be tested for inter-rater and intra-rater reliability and for feasibility of use in healthcare education and quality improvement efforts.
Affiliation(s)
- Nina Ekman
- Institute of Health and Care Sciences, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
- Gothenburg Centre for Person-Centred Care (GPCC), University of Gothenburg, Gothenburg, Sweden
- Department of Public Health and Primary Care, Faculty of Medicine, KU Leuven, Leuven, Belgium
- Andreas Fors
- Institute of Health and Care Sciences, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
- Gothenburg Centre for Person-Centred Care (GPCC), University of Gothenburg, Gothenburg, Sweden
- Region Västra Götaland, Research, Education, Development and Innovation, Primary Health Care, Gothenburg, Sweden
- Philip Moons
- Institute of Health and Care Sciences, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
- Gothenburg Centre for Person-Centred Care (GPCC), University of Gothenburg, Gothenburg, Sweden
- Department of Public Health and Primary Care, Faculty of Medicine, KU Leuven, Leuven, Belgium
- Department of Pediatrics and Child Health, University of Cape Town, Cape Town, South Africa
- Eva Boström
- Department of Nursing, University of Umeå, Umeå, Sweden
- Charles Taft
- Institute of Health and Care Sciences, Sahlgrenska Academy, University of Gothenburg, Gothenburg, Sweden
- Gothenburg Centre for Person-Centred Care (GPCC), University of Gothenburg, Gothenburg, Sweden
5
Griffith M, Zvonar I, Garrett A, Bayaa N. Making goals count: A theory-informed approach to on-shift learning goals. AEM Education and Training 2024; 8:e10993. PMID: 38882241. PMCID: PMC11178521. DOI: 10.1002/aet2.10993.
Abstract
Supervisors often ask emergency medicine trainees for their learning goals at the start of a clinical shift, though they may do so without considering the reasons for this practice. Recognizing the underlying rationale for voicing on-shift learning goals and proactively considering solutions for some of the associated challenges can help learners and supervisors employ this practice to its full potential. Goal articulation is rooted in educational principles such as self-regulated learning, targeted performance feedback, and collaborative relationships between learner and supervisor. Despite the potential for on-shift learning goals to augment learning, there are numerous barriers that make it challenging for learners and supervisors alike to create or follow up on meaningful goals. Learner-related challenges include uncertainty about how to develop goals within an unpredictable clinical environment and creating goals too narrow or broad in scope. Supervisor-related challenges include difficulties integrating direct observation into the clinical workflow and a desire to avoid negative feedback. The learning environment also presents inherent challenges, such as lack of longitudinal supervisor-learner relationships, time constraints, space limitations, and incentives for learners to conceal their knowledge gaps. The authors discuss these challenges to effective on-shift learning goals and propose solutions that target the learner's approach, the supervisor's approach, and the learning environment itself.
Affiliation(s)
- Max Griffith
- Department of Emergency Medicine, University of Washington, Seattle, Washington, USA
- Ivan Zvonar
- Department of Emergency Medicine, University of Washington, Seattle, Washington, USA
- Alexander Garrett
- Department of Emergency Medicine, University of Washington, Seattle, Washington, USA
- Naeem Bayaa
- Department of Emergency Medicine, University of Washington, Seattle, Washington, USA
6
Richardson D, Landreville JM, Trier J, Cheung WJ, Bhanji F, Hall AK, Frank JR, Oswald A. Coaching in Competence by Design: A New Model of Coaching in the Moment and Coaching Over Time to Support Large Scale Implementation. Perspectives on Medical Education 2024; 13:33-43. PMID: 38343553. PMCID: PMC10854464. DOI: 10.5334/pme.959.
Abstract
Coaching is an increasingly popular means to provide individualized, learner-centered, developmental guidance to trainees in competency based medical education (CBME) curricula. Aligned with CBME's core components, coaching can assist in leveraging the full potential of this educational approach. With its focus on growth and improvement, coaching helps trainees develop clinical acumen and self-regulated learning skills. Developing a shared mental model for coaching in the medical education context is crucial to facilitate integration and subsequent evaluation of success. This paper describes the Royal College of Physicians and Surgeons of Canada's coaching model, one that is theory based, evidence informed, principle driven, and iteratively developed by a multidisciplinary team. The coaching model was specifically designed to be fit for purpose in the postgraduate medical education (PGME) context and was implemented as part of Competence by Design (CBD), a new competency based PGME program. This coaching model differentiates two coaching roles, which reflect the different contexts in which postgraduate trainees learn and develop skills. Both roles are supported by the RX-OCR process: developing Relationship/Rapport, setting eXpectations, Observing, a Coaching conversation, and Recording/Reflecting. The CBD Coaching Model and its associated RX-OCR faculty development tool support the implementation of coaching in CBME. Coaching in the moment and coaching over time offer important mechanisms by which CBD brings value to trainees. For sustained change to occur and for learners and coaches to experience the model's intended benefits, ongoing professional development efforts are needed. Early post-implementation reflections and lessons learned are provided.
Affiliation(s)
- Denyse Richardson
- Department of Physical Medicine and Rehabilitation, Queen’s University, Kingston, ON, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Jessica Trier
- Department of Physical Medicine and Rehabilitation, Queen’s University, Kingston, ON, Canada
- Warren J. Cheung
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Department of Emergency Medicine, University of Ottawa, Ottawa, ON, Canada
- Farhan Bhanji
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Education, Faculty of Medicine and Health Sciences, McGill University, Montreal, QC, Canada
- Andrew K. Hall
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Department of Emergency Medicine, University of Ottawa, Ottawa, ON, Canada
- Jason R. Frank
- University of Ottawa Faculty of Medicine, Ottawa, ON, Canada
- Anna Oswald
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- Division of Rheumatology, Department of Medicine, Faculty of Medicine and Dentistry, University of Alberta, Edmonton, AB, Canada
- Competency Based Medical Education, University of Alberta, Edmonton, AB, Canada
7
Mitchell EC, Ott M, Ross D, Grant A. Development of a Tool to Assess Surgical Resident Competence On-Call: The Western University Call Assessment Tool (WUCAT). Journal of Surgical Education 2024; 81:106-114. PMID: 38008642. DOI: 10.1016/j.jsurg.2023.10.001.
Abstract
BACKGROUND A central tenet of competency-based medical education is the formative assessment of trainees. There are currently no assessments designed to examine resident competence on-call, despite the on-call period being a significant component of residency, characterized by less direct supervision compared to daytime. The purpose of this study was to design a formative on-call assessment tool and collect valid evidence on its application. METHODS Nominal group technique was used to identify critical elements of surgical resident competence on-call to inform tool development. The tool was piloted over six months in the Division of Plastic & Reconstructive Surgery at our institution. Quantitative and qualitative evidence was collected to examine tool validity. RESULTS A ten-item tool was developed based on the consensus group results. Sixty-three assessments were completed by seven staff members on ten residents during the pilot. The tool had a reliability coefficient of 0.67 based on a generalizability study and internal item consistency was 0.92. Scores were significantly associated with years of training. We found the tool improved the quantity and structure of feedback given and that the tool was considered feasible and acceptable by both residents and staff members. CONCLUSIONS The Western University Call Assessment Tool (WUCAT) has multiple sources of evidence supporting its use in assessing resident competence on-call.
Affiliation(s)
- Eric C Mitchell
- Department of Surgery, Western University, London, Ontario, Canada
- Michael Ott
- Department of Surgery, Western University, London, Ontario, Canada
- Douglas Ross
- Department of Surgery, Western University, London, Ontario, Canada
- Aaron Grant
- Department of Surgery, Western University, London, Ontario, Canada
8
Marty AP, Linsenmeyer M, George B, Young JQ, Breckwoldt J, Ten Cate O. Mobile technologies to support workplace-based assessment for entrustment decisions: Guidelines for programs and educators: AMEE Guide No. 154. Medical Teacher 2023; 45:1203-1213. PMID: 36706225. DOI: 10.1080/0142159x.2023.2168527.
Abstract
With the rise of competency-based medical education and workplace-based assessment (WBA) since the turn of the century, much has been written about methods of assessment. Direct observation and other sources of information have become standard in many clinical programs. Entrustable professional activities (EPAs) have also become a central focus of assessment in the clinical workplace. Paper and pencil (one of the earliest mobile technologies!) to document observations have become almost obsolete with the advent of digital technology. Typically, clinical supervisors are asked to document assessment ratings using forms on computers. However, accessing these forms can be cumbersome and is not easily integrated into existing clinical workflows. With a call for more frequent documentation, this practice is hardly sustainable, and mobile technology is quickly becoming indispensable. Documentation of learner performance at the point of care merges WBA with patient care, and WBA increasingly uses smartphone applications for this purpose. This AMEE Guide was developed to support institutions and programs that wish to use mobile technology to implement EPA-based assessment and, more generally, any type of workplace-based assessment. It covers the background of WBA, EPAs, and entrustment decision-making; provides guidance for choosing or developing mobile technology; discusses challenges; and describes best practices.
Affiliation(s)
- Machelle Linsenmeyer
- West Virginia School of Osteopathic Medicine, Lewisburg, WV, United States of America
- Brian George
- Surgery and Learning Health Sciences, University of Michigan, Ann Arbor, Michigan, United States of America
- John Q Young
- Department of Psychiatry, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell and Zucker Hillside Hospital, NY, United States of America
- Jan Breckwoldt
- Institute of Anesthesia at the University Hospital Zurich, Switzerland
- Olle Ten Cate
- Utrecht Center for Research and Development of Health Professions Education at UMC Utrecht, The Netherlands
9
Hauer KE, Chang A, van Schaik SM, Lucey C, Cowell T, Teherani A. "It's All About the Trust And Building A Foundation:" Evaluation of a Longitudinal Medical Student Coaching Program. Teaching and Learning in Medicine 2023; 35:550-564. PMID: 35996842. DOI: 10.1080/10401334.2022.2111570.
Abstract
Coaching is increasingly implemented in medical education to support learners' growth, learning, and wellbeing. Data demonstrating the impact of longitudinal coaching programs are needed. We developed and evaluated a comprehensive longitudinal medical student coaching program designed to achieve three aims for students: fostering personal and professional development, advancing physician skills with a growth mindset, and promoting student wellbeing and belonging within an inclusive learning community. We also sought to advance coaches' development as faculty through satisfying education roles with structured training. Students meet with coaches weekly for the first 17 months of medical school for patient care and health systems skills learning, and at least twice yearly throughout the remainder of medical school for individual progress and planning meetings and small-group discussions about professional identity. Using the developmental evaluation framework, we iteratively evaluated the program over the first five years of implementation with multiple quantitative and qualitative measures of students' and coaches' experiences related to the three aims. The University of California, San Francisco, School of Medicine, developed a longitudinal coaching program in 2016 for medical students alongside reform of the four-year curriculum. The coaching program addressed unmet student needs for a longitudinal, non-evaluative relationship with a coach to support their development, shape their approach to learning, and promote belonging and community. In surveys and focus groups, students reported high satisfaction with coaching in measures of the three program aims. They appreciated coaches' availability and guidance for the range of academic, personal, career, and other questions they had throughout medical school. Students endorsed the value of a longitudinal relationship and coaches' ability to meet their changing needs over time. 
Students rated coaches' teaching of foundational clinical skills highly. Students observed coaches learning some clinical skills with them - skills outside a coach's daily practice. Students also raised some concerns about variability among coaches. Attention to wellbeing and belonging to a learning community were program highlights for students. Coaches benefited from relationships with students and other coaches and welcomed the professional development to equip them to support all student needs. Students perceive that a comprehensive medical student coaching program can achieve aims to promote their development and provide support. Within a non-evaluative longitudinal coach relationship, students build skills in driving their own learning and improvement. Coaches experience a satisfying yet challenging role. Ongoing faculty development within a coach community and funding for the role seem essential for coaches to fulfill their responsibilities.
Affiliation(s)
- Karen E Hauer
- Department of Medicine, University of California, San Francisco, San Francisco, California, USA
- Anna Chang
- Department of Medicine, University of California, San Francisco, San Francisco, California, USA
- Sandrijn M van Schaik
- Department of Medicine, University of California, San Francisco, San Francisco, California, USA
- Catherine Lucey
- Department of Medicine, University of California, San Francisco, San Francisco, California, USA
- Tami Cowell
- Department of Medicine, University of California, San Francisco, San Francisco, California, USA
- Arianne Teherani
- Department of Medicine, University of California, San Francisco, San Francisco, California, USA
10
McCarthy N, Neville K, Pope A, Barry L, Livingstone V. Effectiveness of a proficiency-based progression e-learning approach to training in communication in the context of clinically deteriorating patients: a multi-arm randomised controlled trial. BMJ Open 2023; 13:e072488. PMID: 37536965. PMCID: PMC10401258. DOI: 10.1136/bmjopen-2023-072488.
Abstract
OBJECTIVE To determine the effectiveness of proficiency-based progression (PBP) e-learning in training in communication concerning clinically deteriorating patients. DESIGN Single-centre multi-arm randomised double-blind controlled trial with three parallel arms. RANDOMISATION, SETTING AND PARTICIPANTS A computer-generated program randomised and allocated 120 final year medical students in an Irish university into three trial groups. INTERVENTION Each group completed the standard Identification, Situation, Background, Assessment, Recommendation communication e-learning. Group 1, the Health Service Executive course group (HSE), performed this alone; group 2 (PBP) performed additional e-learning using PBP scenarios with expert-determined proficiency benchmarks composed of weighted marking schemes with cut-offs for steps, errors and critical errors; group 3 (S) (self-directed, no PBP) performed additional e-learning with scenarios identical to those of (PBP) but without PBP. MAIN OUTCOME MEASURES Primary analysis was based on 114 students, comparing the ability to reach the expert-determined predefined proficiency benchmark in a standardised low-fidelity simulation assessment before and after completion of each group's e-learning requirements. Performance was recorded and scored by two independent blinded assessors. RESULTS Post-intervention, proficiency in each group in the low-fidelity simulation environment improved, with a statistically significant difference in proficiency between groups (p<0.001). Proficiency was highest in (PBP) (81.1%, 30/37). Post hoc pairwise comparisons revealed statistically significant differences between (PBP) and self-directed (S) (p<0.001) and (HSE) (p<0.001). No statistically significant difference existed between (S) and (HSE) (p=0.479). Changes in proficiency from pre-intervention to post-intervention were significantly different between the three groups (p=0.001). Post-intervention, an extra 67.6% (25/37) in (PBP) achieved proficiency in the low-fidelity simulation. Post hoc pairwise comparisons revealed statistically significant differences between (PBP) and both (S) (p=0.020) and (HSE) (p<0.001). No statistically significant difference was found between (S) and (HSE) (p=0.156). CONCLUSIONS PBP e-learning is a more effective way to train in communication concerning clinically deteriorating patients than standard e-learning or e-learning without PBP. TRIAL REGISTRATION NUMBER NCT02937597.
Affiliation(s)
- Nora McCarthy
- Medical Education Unit, School of Medicine, University College Cork, Cork, Ireland
- Karen Neville
- Department of Business Information Systems, Cork University Business School, University College Cork, Cork, Ireland
- Andrew Pope
- Department of Business Information Systems, Cork University Business School, University College Cork, Cork, Ireland
- Lee Barry
- ESA-BIC, Tyndall Institute, University College Cork National University of Ireland, Cork, Ireland
- Vicki Livingstone
- INFANT Centre, University College Cork National University of Ireland, Cork, Ireland
11
Miller KA, Nagler J, Wolff M, Schumacher DJ, Pusic MV. It Takes a Village: Optimal Graduate Medical Education Requires a Deliberately Developmental Organization. Perspectives on Medical Education 2023; 12:282-293. PMID: 37520509. PMCID: PMC10377742. DOI: 10.5334/pme.936.
Abstract
Coaching is proposed as a means of improving the learning culture of medicine. By fostering trusting teacher-learner relationships, learners are encouraged to embrace feedback and make the most of failure. This paper posits that a cultural shift is necessary to fully harness the potential of coaching in graduate medical education. We introduce the deliberately developmental organization framework, a conceptual model focusing on three core dimensions: developmental communities, developmental aspirations, and developmental practices. These dimensions broaden the scope of coaching interactions. Implementing this organizational change within graduate medical education might be challenging, yet we argue that embracing deliberately developmental principles can embed coaching into everyday interactions and foster a culture in which discussing failure to maximize learning becomes acceptable. By applying the dimensions of developmental communities, aspirations, and practices, we present a six-principle roadmap towards transforming graduate medical education training programs into deliberately developmental organizations.
Affiliation(s)
- Kelsey A. Miller
- Pediatrics and Emergency Medicine, Harvard Medical School, Boston, MA, USA
- Joshua Nagler
- Pediatrics and Emergency Medicine, Harvard Medical School, Boston, MA, USA
- Margaret Wolff
- Emergency Medicine and Pediatrics, University of Michigan Medical School, Ann Arbor, MI, USA
- Daniel J. Schumacher
- Cincinnati Children’s Hospital Medical Center and the University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Martin V. Pusic
- Pediatrics and Emergency Medicine, Harvard Medical School, Boston, MA, USA

12
Berger S, Stalmeijer RE, Marty AP, Berendonk C. Exploring the Impact of Entrustable Professional Activities on Feedback Culture: A Qualitative Study of Anesthesiology Residents and Attendings. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2023; 98:836-843. [PMID: 36812061 DOI: 10.1097/acm.0000000000005188] [Citation(s) in RCA: 1] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/18/2023]
Abstract
PURPOSE Entrustable professional activities (EPAs) were introduced as a potential way to optimize workplace-based assessments. Yet, recent studies suggest that EPAs have not yet overcome all of the challenges to implementing meaningful feedback. The aim of this study was to explore the extent to which the introduction of EPAs via mobile app impacts feedback culture as experienced by anesthesiology residents and attending physicians. METHOD Using a constructivist grounded theory approach, the authors interviewed a purposive and theoretical sample of residents (n = 11) and attendings (n = 11) at the Institute of Anaesthesiology, University Hospital of Zurich, where EPAs had recently been implemented. Interviews took place between February and December 2021. Data collection and analysis were conducted iteratively. The authors used open, axial, and selective coding to gain knowledge and understanding on the interplay of EPAs and feedback culture. RESULTS Participants reflected on a number of changes in their day-to-day experience of feedback culture after the implementation of EPAs. Three main mechanisms were instrumental in this process: lowering the feedback threshold, a change in feedback focus, and gamification. Participants perceived a lower threshold to seeking and giving feedback: feedback conversations became more frequent, shorter, and more focused on a specific topic, while feedback content concentrated more on technical skills and more attention was given to average performances. Residents indicated that the app-based approach fostered a game-like motivation to "climb levels," while attendings did not perceive a game-like experience. CONCLUSIONS EPAs may offer a solution to the problem of infrequent feedback and invite attention to average performances and technical competencies, but this may come at the expense of feedback on nontechnical skills. This study suggests that feedback culture and feedback instruments mutually influence each other.
Affiliation(s)
- Sabine Berger
- S. Berger is a third-year medical resident, Internal Medicine Training Program, St. Claraspital, Basel, Switzerland
- Renee E Stalmeijer
- R.E. Stalmeijer is associate professor, Department of Educational Development and Research, Faculty of Health, Medicine and Life Sciences, Maastricht University, Maastricht, The Netherlands
- Adrian P Marty
- A.P. Marty is currently senior attending physician and team lead for education, Institute of Anaesthesiology, Intensive Care and Pain Medicine, Orthopedic University Hospital Balgrist, Zurich, Switzerland. At the time of writing, he was attending physician, Institute of Anaesthesiology, University of Zurich, University Hospital of Zurich, Zurich, Switzerland
- Christoph Berendonk
- C. Berendonk is senior lecturer in medical education, Institute for Medical Education, University of Bern, Bern, Switzerland

13
Paterson QS, Alrimawi H, Sample S, Bouwsema M, Anjum O, Vincent M, Cheung WJ, Hall A, Woods R, Martin LJ, Chan T. Examining enablers and barriers to entrustable professional activity acquisition using the theoretical domains framework: A qualitative framework analysis study. AEM EDUCATION AND TRAINING 2023; 7:e10849. [PMID: 36994315 PMCID: PMC10041073 DOI: 10.1002/aet2.10849] [Citation(s) in RCA: 1] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/04/2022] [Revised: 01/20/2023] [Accepted: 01/29/2023] [Indexed: 06/19/2023]
Abstract
Background Without a clear understanding of the factors contributing to the effective acquisition of high-quality entrustable professional activity (EPA) assessments, trainees, supervising faculty, and training programs may lack appropriate strategies for successful EPA implementation and utilization. The purpose of this study was to identify barriers and facilitators to acquiring high-quality EPA assessments in Canadian emergency medicine (EM) training programs. Methods We conducted a qualitative framework analysis study utilizing the Theoretical Domains Framework (TDF). Semistructured interviews of EM resident and faculty participants were audio recorded, deidentified, and coded line-by-line by two authors to extract themes and subthemes across the domains of the TDF. Results From 14 interviews (eight faculty and six residents) we identified, within the 14 TDF domains, major themes and subthemes for barriers and facilitators to EPA acquisition for both faculty and residents. The two most cited domains (and their frequencies) among residents and faculty were environmental context and resources (56) and behavioral regulation (48). Example strategies for improving EPA acquisition include orienting residents to the competency-based medical education (CBME) paradigm, recalibrating expectations relating to "low ratings" on EPAs, engaging in continuous faculty development to ensure familiarity and fluency with EPAs, and implementing longitudinal coaching programs between residents and faculty to encourage repeated longitudinal interactions and high-quality, specific feedback. Conclusions We identified key strategies to support residents, faculty, programs, and institutions in overcoming barriers and improving EPA assessment processes. This is an important step toward ensuring the successful implementation of CBME and the effective operationalization of EPAs within EM training programs.
Affiliation(s)
- Quinten S. Paterson
- Department of Emergency Medicine, University of Saskatchewan, Saskatoon, Saskatchewan, Canada
- Hussein Alrimawi
- Emergency Medicine Division, Department of Medicine, McMaster University, Hamilton, Ontario, Canada
- Spencer Sample
- Emergency Medicine Division, Department of Medicine, McMaster University, Hamilton, Ontario, Canada
- Melissa Bouwsema
- Department of Emergency Medicine, Queen's University, Kingston, Ontario, Canada
- Omar Anjum
- Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Maggie Vincent
- Emergency Medicine Division, Department of Medicine, McMaster University, Hamilton, Ontario, Canada
- Warren J. Cheung
- Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Andrew Hall
- Department of Emergency Medicine, Queen's University, Kingston, Ontario, Canada
- Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Rob Woods
- Department of Emergency Medicine, University of Saskatchewan, Saskatoon, Saskatchewan, Canada
- Lynsey J. Martin
- Department of Emergency Medicine, University of Saskatchewan, Saskatoon, Saskatchewan, Canada
- Teresa Chan
- Emergency Medicine Division, Department of Medicine, McMaster University, Hamilton, Ontario, Canada

14
Booth GJ, Ross B, Cronin WA, McElrath A, Cyr KL, Hodgson JA, Sibley C, Ismawan JM, Zuehl A, Slotto JG, Higgs M, Haldeman M, Geiger P, Jardine D. Competency-Based Assessments: Leveraging Artificial Intelligence to Predict Subcompetency Content. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2023; 98:497-504. [PMID: 36477379 DOI: 10.1097/acm.0000000000005115] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/17/2023]
Abstract
PURPOSE Faculty feedback on trainees is critical to guiding trainee progress in a competency-based medical education framework. The authors aimed to develop and evaluate a Natural Language Processing (NLP) algorithm that automatically categorizes narrative feedback into corresponding Accreditation Council for Graduate Medical Education Milestone 2.0 subcompetencies. METHOD Ten academic anesthesiologists analyzed 5,935 narrative evaluations on anesthesiology trainees at 4 graduate medical education (GME) programs between July 1, 2019, and June 30, 2021. Each sentence (n = 25,714) was labeled with the Milestone 2.0 subcompetency that best captured its content or was labeled as demographic or not useful. Inter-rater agreement was assessed by Fleiss' Kappa. The authors trained an NLP model to predict feedback subcompetencies using data from 3 sites and evaluated its performance at a fourth site. Performance metrics included area under the receiver operating characteristic curve (AUC), positive predictive value, sensitivity, F1, and calibration curves. The model was implemented at 1 site in a self-assessment exercise. RESULTS Fleiss' Kappa for subcompetency agreement was moderate (0.44). Model performance was good for professionalism, interpersonal and communication skills, and practice-based learning and improvement (AUC 0.79, 0.79, and 0.75, respectively). Subcompetencies within medical knowledge and patient care ranged from fair to excellent (AUC 0.66-0.84 and 0.63-0.88, respectively). Performance for systems-based practice was poor (AUC 0.59). Performances for demographic and not useful categories were excellent (AUC 0.87 for both). In approximately 1 minute, the model interpreted several hundred evaluations and produced individual trainee reports with organized feedback to guide a self-assessment exercise. The model was built into a web-based application. CONCLUSIONS The authors developed an NLP model that recognized the feedback language of anesthesiologists across multiple GME programs. The model was operationalized in a self-assessment exercise. It is a powerful tool that rapidly organizes large amounts of narrative feedback.
Affiliation(s)
- Gregory J Booth
- G.J. Booth is assistant professor, Uniformed Services University of the Health Sciences, and residency program director, Department of Anesthesiology and Pain Medicine, Naval Medical Center Portsmouth, Portsmouth, Virginia
- Benjamin Ross
- William A Cronin
- Angela McElrath
- Kyle L Cyr
- John A Hodgson
- Charles Sibley
- J Martin Ismawan
- Alyssa Zuehl
- James G Slotto
- Maureen Higgs
- Matthew Haldeman
- Phillip Geiger
- Dink Jardine

15
Hill AE, Bartle E, Copley JA, Olson R, Dunwoodie R, Barnett T, Zuber A. The VOTIS, part 1: development and pilot trial of a tool to assess students' interprofessional skill development using video-reflexive ethnography. J Interprof Care 2023; 37:223-231. [PMID: 35403549 DOI: 10.1080/13561820.2022.2052270] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 02/06/2023]
Abstract
This paper explores the development and evaluation of the Video Observation Tool for Interprofessional Skills (VOTIS). We describe the development of an authentic interprofessional assessment tool that incorporates video reflection and allows formative and summative assessment of individual learners' interprofessional skills within an authentic interprofessional context. We then investigate its validity and reliability. The VOTIS was developed using a modified Delphi technique. The tool was piloted with 61 students and 11 clinical educators who completed the VOTIS following team meetings where students interacted about their interprofessional clinical work. The following were calculated: internal consistency; students' proficiency levels; inter-rater reliability between students and clinical educators; and inter-rater reliability between clinical educators and an independent rater. Results indicate that the VOTIS has acceptable internal consistency and moderate reliability and has value in evaluating students' interprofessional skills. Study outcomes highlight the need for more explicit wording of tool content and instructions and further clinical educator training to increase the utility and reliability of the VOTIS as a learning and assessment tool.
Affiliation(s)
- Anne E Hill
- School of Health and Rehabilitation Sciences, The University of Queensland, St. Lucia, Qld, Australia
- Emma Bartle
- School of Dentistry, The University of Queensland, St. Lucia, Qld, Australia
- Jodie A Copley
- School of Health and Rehabilitation Sciences, The University of Queensland, St. Lucia, Qld, Australia
- Rebecca Olson
- Sociology, School of Social Science, The University of Queensland, St. Lucia, Qld, Australia
- Ruth Dunwoodie
- School of Health and Rehabilitation Sciences, The University of Queensland, St. Lucia, Qld, Australia
- Tessa Barnett
- School of Health and Rehabilitation Sciences, The University of Queensland, St. Lucia, Qld, Australia
- Alice Zuber
- School of Health and Rehabilitation Sciences, The University of Queensland, St. Lucia, Qld, Australia

16
Rietmeijer CBT, van Esch SCM, Blankenstein AH, van der Horst HE, Veen M, Scheele F, Teunissen PW. A phenomenology of direct observation in residency: Is Miller's 'does' level observable? MEDICAL EDUCATION 2023; 57:272-279. [PMID: 36515981 PMCID: PMC10107098 DOI: 10.1111/medu.15004] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Subscribe] [Scholar Register] [Received: 04/11/2022] [Revised: 12/08/2022] [Accepted: 12/10/2022] [Indexed: 06/17/2023]
Abstract
INTRODUCTION Guidelines on direct observation (DO) present DO as an assessment of Miller's 'does' level, that is, the learner's ability to function independently in clinical situations. The literature, however, indicates that residents may behave 'inauthentically' when observed. To minimise this 'observer effect', learners are encouraged to 'do what they would normally do' so that they can receive feedback on their actual work behaviour. Recent phenomenological research on patients' experiences with DO challenges this approach; patients needed-and caused-some participation of the observing supervisor. Although guidelines advise supervisors to minimise their presence, we are poorly informed on how some deliberate supervisor participation affects residents' experience in DO situations. Therefore, we investigated what residents essentially experienced in DO situations. METHODS We performed an interpretive phenomenological interview study, including six general practice (GP) residents. We collected and analysed our data, using the four phenomenological lenses of lived body, lived space, lived time and lived relationship. We grouped our open codes by interpreting what they revealed about common structures of residents' pre-reflective experiences. RESULTS Residents experienced the observing supervisor not just as an observer or assessor. They also experienced them as both a senior colleague and as the patient's familiar GP, which led to many additional interactions. When residents tried to act as if the supervisor was not there, they could feel insecure and handicapped because the supervisor was there, changing the situation. DISCUSSION Our results indicate that the 'observer effect' is much more material than was previously understood. Consequently, observing residents' 'authentic' behaviour at Miller's 'does' level, as if the supervisor was not there, seems impossible and a misleading concept: misleading, because it may frustrate residents and cause supervisors to neglect patients' and residents' needs in DO situations. We suggest that one-way DO is better replaced by bi-directional DO in working-and-learning-together sessions.
Affiliation(s)
- Chris B. T. Rietmeijer
- Department of General Practice, Amsterdam UMC, location Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Suzanne C. M. van Esch
- Department of General Practice, Amsterdam UMC, location University of Amsterdam, Amsterdam, The Netherlands
- Annette H. Blankenstein
- Department of General Practice, Amsterdam UMC, location Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Henriëtte E. van der Horst
- Department of General Practice, Amsterdam UMC, location Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Mario Veen
- Department of General Practice, Erasmus Medical Center, Rotterdam, The Netherlands
- Fedde Scheele
- School of Medical Sciences, Athena Institute for Transdisciplinary Research, Amsterdam UMC, location Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
- Pim W. Teunissen
- School of Health Professions Education, Maastricht University, Maastricht, The Netherlands

17
Adam P, Mauksch LB, Brandenburg DL, Danner C, Ross VR. Optimal training in communication model (OPTiCOM): A programmatic roadmap. PATIENT EDUCATION AND COUNSELING 2023; 107:107573. [PMID: 36410312 DOI: 10.1016/j.pec.2022.107573] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/26/2022] [Revised: 11/13/2022] [Accepted: 11/16/2022] [Indexed: 06/16/2023]
Abstract
OBJECTIVES Teaching primary care residents patient communication skills is essential, complex, and impeded by barriers. We find no models guiding faculty in how to train residents in the workplace that integrate the necessary system components, the science of physician-patient communication training, and competency-based medical education. The aim of this project is to create such a model. METHODS We created OPTiCOM using four steps: (1) communication educator interviews, analysis and theme development; (2) initial model construction; (3) model refinement using expert feedback; (4) structured literature review to validate, refine and finalize the model. RESULTS Our model contains ten interdependent building blocks organized into four developmental tiers. The Foundational value tier has one building block: Naming relationship as a core value. The Expertise and resources tier includes four building blocks addressing: Curricular expertise, Curricular content, Leadership, and Time. The four building blocks in the Application and development tier are Observation form, Faculty development, Technology, and Formative assessment. The Language and culture tier identifies the final building block, Culture promoting continuous improvement in teaching communication. CONCLUSIONS OPTiCOM organizes ten interdependent systems building blocks to maximize and sustain resident learning of communication skills. PRACTICE IMPLICATIONS Residency faculty can use OPTiCOM for self-assessment, program creation, and revision.
Affiliation(s)
- Patricia Adam
- Department of Family Medicine and Community Health, University of Minnesota, Smiley's Clinic, 2020 East 28th Street, Minneapolis, MN 55407, USA
- Larry B Mauksch
- Emeritus, Department of Family Medicine, University of Washington, 6026 30th Ave NE, Seattle, WA 98115, USA
- Dana L Brandenburg
- Department of Family Medicine and Community Health, University of Minnesota, Smiley's Clinic, 2020 East 28th Street, Minneapolis, MN 55407, USA
- Christine Danner
- Department of Family Medicine and Community Health, University of Minnesota, Bethesda Clinic, 580 Rice St, St Paul, MN 55103, USA
- Valerie R Ross
- University of Washington Department of Family Medicine, Family Medicine Residency Program, Box 356390, 331 N.E. Thornton Place, Seattle, WA 98125, USA

18
Kogan JR, Conforti LN, Holmboe ES. Faculty Perceptions of Frame of Reference Training to Improve Workplace-Based Assessment. J Grad Med Educ 2023; 15:81-91. [PMID: 36817545 PMCID: PMC9934818 DOI: 10.4300/jgme-d-22-00287.1] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 04/06/2022] [Revised: 06/27/2022] [Accepted: 12/06/2022] [Indexed: 02/17/2023] Open
Abstract
BACKGROUND Workplace-based assessment (WBA) is a key assessment strategy in competency-based medical education. However, its full potential has not been actualized secondary to concerns with reliability, validity, and accuracy. Frame of reference training (FORT), a rater training technique that helps assessors distinguish between learner performance levels, can improve the accuracy and reliability of WBA, but the effect size is variable. Understanding FORT's benefits and challenges helps improve this rater training technique. OBJECTIVE To explore faculty's perceptions of the benefits and challenges associated with FORT. METHODS Subjects were internal medicine and family medicine physicians (n=41) who participated in a rater training intervention in 2018 consisting of in-person FORT followed by asynchronous online spaced learning. We assessed participants' perceptions of FORT in post-workshop focus groups and an end-of-study survey. Focus group and survey free-text responses were coded using thematic analysis. RESULTS All subjects participated in 1 of 4 focus groups and completed the survey. Four benefits of FORT were identified: (1) the opportunity to apply skills frameworks via deliberate practice; (2) demonstration of the importance of certain evidence-based clinical skills; (3) practice that improved the ability to discriminate between resident skill levels; and (4) highlighting the importance of direct observation and the dangers of using proxy information in assessment. Challenges included time constraints and task repetitiveness. CONCLUSIONS Participants believe that FORT serves multiple purposes, including helping them distinguish between learner skill levels while demonstrating the impact of evidence-based clinical skills and the importance of direct observation.
Affiliation(s)
- Jennifer R. Kogan
- Jennifer R. Kogan, MD, is Associate Dean, Student Success and Professional Development, and Professor of Medicine, Perelman School of Medicine, University of Pennsylvania
- Lisa N. Conforti
- Lisa N. Conforti, MPH, is Research Associate for Milestones Evaluation, Accreditation Council for Graduate Medical Education (ACGME)
- Eric S. Holmboe
- Eric S. Holmboe, MD, is Chief Research, Milestone Development, and Evaluation Officer, ACGME

19
Kogan JR, Dine CJ, Conforti LN, Holmboe ES. Can Rater Training Improve the Quality and Accuracy of Workplace-Based Assessment Narrative Comments and Entrustment Ratings? A Randomized Controlled Trial. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2023; 98:237-247. [PMID: 35857396 DOI: 10.1097/acm.0000000000004819] [Citation(s) in RCA: 1] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/15/2023]
Abstract
PURPOSE Prior research evaluating workplace-based assessment (WBA) rater training effectiveness has not measured improvement in narrative comment quality and accuracy, nor accuracy of prospective entrustment-supervision ratings. The purpose of this study was to determine whether rater training, using performance dimension and frame of reference training, could improve WBA narrative comment quality and accuracy. A secondary aim was to assess impact on entrustment rating accuracy. METHOD This single-blind, multi-institution, randomized controlled trial of a multifaceted, longitudinal rater training intervention consisted of in-person training followed by asynchronous online spaced learning. In 2018, investigators randomized 94 internal medicine and family medicine physicians involved with resident education. Participants assessed 10 scripted standardized resident-patient videos at baseline and follow-up. Differences in holistic assessment of narrative comment accuracy and specificity, accuracy of individual scenario observations, and entrustment rating accuracy were evaluated with t tests. Linear regression assessed impact of participant demographics and baseline performance. RESULTS Seventy-seven participants completed the study. At follow-up, the intervention group (n = 41), compared with the control group (n = 36), had higher scores for narrative holistic specificity (2.76 vs 2.31, P < .001, Cohen V = .25), accuracy (2.37 vs 2.06, P < .001, Cohen V = .20) and mean quantity of accurate (6.14 vs 4.33, P < .001), inaccurate (3.53 vs 2.41, P < .001), and overall observations (2.61 vs 1.92, P = .002, Cohen V = .47). In aggregate, the intervention group had more accurate entrustment ratings (58.1% vs 49.7%, P = .006, Phi = .30). Baseline performance was significantly associated with performance on final assessments. CONCLUSIONS Quality and specificity of narrative comments improved with rater training; the effect was mitigated by inappropriate stringency. Training improved accuracy of prospective entrustment-supervision ratings, but the effect was more limited. Participants with lower baseline rating skill may benefit most from training.
Affiliation(s)
- Jennifer R Kogan
- J.R. Kogan is associate dean, Student Success and Professional Development, and professor of medicine, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania; ORCID: https://orcid.org/0000-0001-8426-9506
- C Jessica Dine
- C.J. Dine is associate dean, Evaluation and Assessment, and associate professor of medicine, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania; ORCID: https://orcid.org/0000-0001-5894-0861
- Lisa N Conforti
- L.N. Conforti is research associate for milestones evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-7317-6221
- Eric S Holmboe
- E.S. Holmboe is chief, research, milestones development and evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois; ORCID: https://orcid.org/0000-0003-0108-6021

20
Erumeda NJ, George AZ, Jenkins LS. Evaluating postgraduate family medicine supervisor feedback in registrars' learning portfolios. Afr J Prim Health Care Fam Med 2022; 14:e1-e10. [PMID: 36546494 PMCID: PMC9772774 DOI: 10.4102/phcfm.v14i1.3744] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/12/2022] [Revised: 10/19/2022] [Accepted: 10/25/2022] [Indexed: 12/24/2022] Open
Abstract
BACKGROUND Postgraduate supervision forms a vital component of decentralised family medicine training. While the components of effective supervisory feedback have been explored in high-income countries, how this construct is delivered in resource-constrained low- to middle-income countries has not been investigated adequately. AIM This article evaluated supervisory feedback in family medicine registrars' learning portfolios (LPs) as captured in their learning plans and mini-Clinical Evaluation Exercise (mini-CEX) forms and whether the training district or the year of training affected the nature of the feedback. SETTING Registrars' LPs from 2020 across five decentralised sites affiliated with the University of the Witwatersrand in South Africa were analysed. METHODS Two modified tools were used to evaluate the quantity of the written feedback in 38 learning plans and 57 mini-CEX forms. Descriptive statistics, Fisher's exact and Wilcoxon rank-sum tests were used for analysis. Content analysis was used to derive counts of areas of feedback. RESULTS Most learning plans (61.2%) did not refer to registrars' clinical knowledge or offer an improvement strategy (86.1%). The 'extent of supervisors' feedback' was rated as 'poor' (63.2%), with only 14.0% rated as 'good.' The 'some' and 'no' feedback categories in the mini-CEX competencies (p < 0.001 to p = 0.014) and the 'extent of supervisors' feedback' (p < 0.001) were significantly associated with training district. Feedback focused less on clinical reasoning and negotiation skills. CONCLUSION Supervisors should provide specific and constructive narrative feedback and an action plan to improve registrars' future performance. CONTRIBUTION Supervisory feedback in postgraduate family medicine training needs overall improvement to develop skilled family physicians.
Affiliation(s)
- Neetha J. Erumeda
- Department of Family Medicine and Primary Care, Faculty of Health Sciences, University of the Witwatersrand, Johannesburg, South Africa; Gauteng Department of Health, Ekurhuleni District Health Services, Germiston, South Africa
- Ann Z. George
- Centre of Health Science Education, Faculty of Health Sciences, University of the Witwatersrand, Johannesburg, South Africa
- Louis S. Jenkins
- Division of Family Medicine and Primary Care, Department of Family and Emergency Medicine, Faculty of Health Sciences, Stellenbosch University, Cape Town, South Africa; George Hospital, Western Cape Department of Health, George, South Africa; Primary Health Care Directorate, Department of Family, Community and Emergency Care, Faculty of Health Sciences, University of Cape Town, Cape Town, South Africa
21
Core Entrustable Professional Activities for Entering Residency: A National Survey of Graduating Medical Students' Self-Assessed Skills by Specialty. J Am Coll Surg 2022; 235:940-951. [PMID: 36102502] [PMCID: PMC9653107] [DOI: 10.1097/xcs.0000000000000395]
Abstract
BACKGROUND The Association of American Medical Colleges described 13 Core Entrustable Professional Activities (EPAs) that graduating students should be prepared to perform under indirect supervision on day one of residency. Surgery program directors recently recommended entrustability in these Core EPAs for incoming surgery interns. We sought to determine whether graduating students intending to enter surgery agreed that they had the skills to perform these Core EPAs. STUDY DESIGN Using de-identified, individual-level data collected from and about 2019 Association of American Medical Colleges Graduation Questionnaire respondents, latent profile analysis was used to group respondents based on the response patterns of their self-assessed Core EPA skills. Associations between intended specialty, among other variables, and latent profile analysis group were assessed using independent-sample t-tests, chi-square tests, and multivariable logistic regression methods. RESULTS Among 12,308 Graduation Questionnaire respondents, latent profile analysis identified 2 respondent groups: 7,863 (63.9%) in a high skill acquisition agreement (SAA) group and 4,445 (36.1%) in a moderate SAA group. Specialty was associated with SAA group membership (p < 0.001), with general surgery, orthopaedic surgery, and emergency medicine respondents (among others) overrepresented in the high SAA group. In the multivariable logistic regression models, intention to enter anesthesiology, ophthalmology, pediatrics, psychiatry, or radiology (vs general surgery) was associated with lower odds of high SAA group membership. CONCLUSION Graduating students' self-assessed Core EPA skills were higher for those intending general surgery than for those intending some other specialties. Our findings can inform collaborative efforts to ensure graduates' acquisition of the skills expected of them at the start of residency.
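Latent profile analysis, as used above, assigns respondents to groups by fitting a finite mixture model to their response patterns, typically in dedicated software (e.g., Mplus or R's tidyLPA package). As a toy one-dimensional stand-in, and not the authors' actual model or data, a two-component Gaussian mixture can be fit by expectation-maximization:

```python
import math, random

def em_two_gaussians(xs, iters=200):
    """Fit a two-component 1-D Gaussian mixture by EM. A toy stand-in for
    latent profile analysis: the two fitted components play the role of the
    'high' and 'moderate' agreement profiles (illustrative only)."""
    xs = sorted(xs)
    n = len(xs)
    mu = [xs[n // 4], xs[3 * n // 4]]  # crude init at the quartiles
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each observation
        resp = []
        for x in xs:
            w = [pi[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            s = sum(w)
            resp.append([wk / s for wk in w])
        # M-step: update mixing weights, means, and variances
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / n
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, xs)) / nk, 1e-6)
    return pi, mu, var

# synthetic 'skill agreement' scores drawn from two well-separated profiles
random.seed(0)
scores = ([random.gauss(2.0, 0.3) for _ in range(200)]
          + [random.gauss(5.0, 0.3) for _ in range(200)])
weights, means, variances = em_two_gaussians(scores)
```

Real LPA operates on multivariate response patterns and compares model fit across different numbers of profiles; this sketch only shows the EM machinery behind a two-group split.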
22
Safavi AH, Papadakos J, Papadakos T, Quartey NK, Lawrie K, Klein E, Storer S, Croke J, Millar BA, Jang R, Bezjak A, Giuliani ME. Feedback Delivery in an Academic Cancer Centre: Reflections From an R2C2-based Microlearning Course. J Cancer Educ 2022; 37:1790-1797. [PMID: 34169464] [DOI: 10.1007/s13187-021-02028-9]
Abstract
Feedback delivery and training have not been characterized in the context of academic cancer centres. The purpose of this study was to assess the feasibility and utility of a microlearning course based on the R2C2 (Relationship, Reaction, Content, Coaching) feedback model and to characterize multidisciplinary healthcare provider (HCP) perspectives on existing feedback practices in an academic cancer centre. Five HCPs (two radiation oncologists, one medical oncologist, and two allied health professionals) with supervisory roles were selected by purposive sampling to participate in a prospective longitudinal qualitative study. Each participant completed a web-based multimedia course. Semi-structured one-on-one interviews were conducted with each participant at four time points: pre-course, immediately post-course, and at one and three months post-course. All participants found the course feasible in terms of time, completing it in 10-20 min. Participants expressed that the course fulfilled their need for feedback training and that its adoption may normalize a feedback culture in the cancer centre. Three themes were identified regarding perceptions of existing feedback practices: (1) hierarchical and interdisciplinary relationships modulate feedback delivery, (2) interest in feedback delivery varies by duration of the supervisory relationship, and (3) the transactionality of supervisor-trainee relationships influences feedback delivery. This study demonstrates the perceived feasibility and utility of a digital microlearning approach for the development of feedback competencies in an academic cancer centre, perceptions of cultural barriers to feedback delivery, and the need for organizational commitment to developing a feedback culture.
Affiliation(s)
- Amir H Safavi
- Department of Radiation Oncology, University of Toronto, Toronto, ON, M5T 1P5, Canada
- Janet Papadakos
- Cancer Education, Princess Margaret Cancer Centre, 585 University Ave, Munk Building B-PMB 130, Toronto, ON, M5G 2M9, Canada
- Patient Education, Ontario Health, Toronto, ON, M5S 1A1, Canada
- Institute of Health Policy, Management and Evaluation, University of Toronto, 155 College St 4th Floor, Toronto, ON, M5T 3M6, Canada
- Tina Papadakos
- Cancer Education, Princess Margaret Cancer Centre, 585 University Ave, Munk Building B-PMB 130, Toronto, ON, M5G 2M9, Canada
- Patient Education, Ontario Health, Toronto, ON, M5S 1A1, Canada
- Naa Kwarley Quartey
- Cancer Education, Princess Margaret Cancer Centre, 585 University Ave, Munk Building B-PMB 130, Toronto, ON, M5G 2M9, Canada
- Karen Lawrie
- Cancer Education, Princess Margaret Cancer Centre, 585 University Ave, Munk Building B-PMB 130, Toronto, ON, M5G 2M9, Canada
- Eden Klein
- Cancer Education, Princess Margaret Cancer Centre, 585 University Ave, Munk Building B-PMB 130, Toronto, ON, M5G 2M9, Canada
- Sarah Storer
- Cancer Education, Princess Margaret Cancer Centre, 585 University Ave, Munk Building B-PMB 130, Toronto, ON, M5G 2M9, Canada
- Jennifer Croke
- Department of Radiation Oncology, University of Toronto, Toronto, ON, M5T 1P5, Canada
- Radiation Medicine Program, Princess Margaret Cancer Centre, 700 University Ave 7th Floor, Toronto, ON, M5G 2M9, Canada
- Barbara-Ann Millar
- Department of Radiation Oncology, University of Toronto, Toronto, ON, M5T 1P5, Canada
- Radiation Medicine Program, Princess Margaret Cancer Centre, 700 University Ave 7th Floor, Toronto, ON, M5G 2M9, Canada
- Raymond Jang
- Division of Medical Oncology and Hematology, University of Toronto, 700 University Ave 7th Floor, Toronto, ON, M5G 2M9, Canada
- Andrea Bezjak
- Department of Radiation Oncology, University of Toronto, Toronto, ON, M5T 1P5, Canada
- Radiation Medicine Program, Princess Margaret Cancer Centre, 700 University Ave 7th Floor, Toronto, ON, M5G 2M9, Canada
- Meredith E Giuliani
- Department of Radiation Oncology, University of Toronto, Toronto, ON, M5T 1P5, Canada
- Cancer Education, Princess Margaret Cancer Centre, 585 University Ave, Munk Building B-PMB 130, Toronto, ON, M5G 2M9, Canada
- Radiation Medicine Program, Princess Margaret Cancer Centre, 700 University Ave 7th Floor, Toronto, ON, M5G 2M9, Canada
23
Kinnear B, Schumacher DJ, Driessen EW, Varpio L. How argumentation theory can inform assessment validity: A critical review. Med Educ 2022; 56:1064-1075. [PMID: 35851965] [PMCID: PMC9796688] [DOI: 10.1111/medu.14882]
Abstract
INTRODUCTION Many health professions education (HPE) scholars frame assessment validity as a form of argumentation in which interpretations and uses of assessment scores must be supported by evidence. However, what are purported to be validity arguments are often merely clusters of evidence without a guiding framework to evaluate, prioritise, or debate their merits. Argumentation theory is a field of study dedicated to understanding the production, analysis, and evaluation of arguments (spoken or written). The aim of this study was to describe argumentation theory, articulate the unique insights it can offer to HPE assessment, and present how different argumentation orientations can help reconceptualise the nature of validity in generative ways. METHODS The authors followed a five-step critical review process consisting of iterative cycles of focusing, searching, appraising, sampling, and analysing the argumentation theory literature. The authors generated and synthesised a corpus of manuscripts on argumentation orientations deemed to be most applicable to HPE. RESULTS We selected two argumentation orientations that we considered particularly constructive for informing HPE assessment validity: new rhetoric and informal logic. In new rhetoric, the goal of argumentation is to persuade, with a focus on an audience's values and standards. Informal logic centres on identifying, structuring, and evaluating arguments in real-world settings, with a variety of normative standards used to evaluate argument validity. DISCUSSION Both new rhetoric and informal logic provide philosophical, theoretical, or practical groundings that can advance HPE validity argumentation. New rhetoric's foregrounding of audience aligns with HPE's social imperative to be accountable to specific stakeholders such as the public and learners. Informal logic provides tools for identifying and structuring validity arguments for analysis and evaluation.
Affiliation(s)
- Benjamin Kinnear
- Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, Ohio, USA
- School of Health Professions Education (SHE), Maastricht University, Maastricht, The Netherlands
- Daniel J. Schumacher
- Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, Ohio, USA
- Erik W. Driessen
- School of Health Professions Education, Faculty of Health, Medicine and Life Sciences, Maastricht University, Maastricht, The Netherlands
- Lara Varpio
- Uniformed Services University of the Health Sciences, Bethesda, Maryland, USA
24
Phinney LB, Fluet A, O'Brien BC, Seligman L, Hauer KE. Beyond Checking Boxes: Exploring Tensions With Use of a Workplace-Based Assessment Tool for Formative Assessment in Clerkships. Acad Med 2022; 97:1511-1520. [PMID: 35703235] [DOI: 10.1097/acm.0000000000004774]
Abstract
PURPOSE To understand the role of a workplace-based assessment (WBA) tool in facilitating feedback for medical students, this study explored changes and tensions in a clerkship feedback activity system through the lens of cultural historical activity theory (CHAT) over 2 years of tool implementation. METHOD This qualitative study uses CHAT to explore WBA use in core clerkships by identifying feedback activity system elements (e.g., community, tools, rules, objects) and tensions among these elements. University of California, San Francisco core clerkship students were invited to participate in semistructured interviews eliciting experience with a WBA tool intended to enhance direct observation and feedback in year 1 (2019) and year 2 (2020) of implementation. In year 1, the WBA tool required supervisor completion in the school's evaluation system on a computer. In year 2, both students and supervisors had WBA completion abilities and could access the form via a smartphone separate from the school's evaluation system. RESULTS Thirty-five students participated in interviews. The authors identified tensions that shifted with time and tool iterations. Year 1 students described tensions related to cumbersome tool design, fear of burdening supervisors, confusion over WBA purpose, WBA as checking boxes, and WBA usefulness depending on clerkship context and culture. Students perceived dissatisfaction with the year 1 tool version among peers and supervisors. The year 2 mobile-based tool and student completion capabilities helped to reduce many of the tensions noted in year 1. Students expressed wider WBA acceptance among peers and supervisors in year 2 and reported understanding WBA to be for low-stakes feedback, thereby supporting formative assessment for learning. CONCLUSIONS Using CHAT to explore changes in a feedback activity system with WBA tool iterations revealed elements important to WBA implementation, including designing technology for tool efficiency and affording students autonomy to document feedback with WBAs.
Affiliation(s)
- Lauren B Phinney
- L.B. Phinney is a first-year internal medicine resident, Department of Medicine, University of California, San Francisco School of Medicine, San Francisco, California
- Angelina Fluet
- A. Fluet is a fourth-year medical student, University of California, San Francisco School of Medicine, San Francisco, California
- Bridget C O'Brien
- B.C. O'Brien is professor of medicine and education scientist, Department of Medicine and Center for Faculty Educators, University of California, San Francisco School of Medicine, San Francisco, California
- Lee Seligman
- L. Seligman is a second-year internal medicine resident, Department of Medicine, New York-Presbyterian Hospital, Columbia University Irving Medical Center, New York, New York
- Karen E Hauer
- K.E. Hauer is associate dean for competency assessment and professional standards and professor, Department of Medicine, University of California, San Francisco School of Medicine, San Francisco, California
25
Shafqat S, Tejani I, Ali M, Tariq H, Sabzwari S. Feasibility and Effectiveness of Mini-Clinical Evaluation Exercise (Mini-CEX) in an Undergraduate Medical Program: A Study From Pakistan. Cureus 2022; 14:e29563. [PMID: 36312643] [PMCID: PMC9595266] [DOI: 10.7759/cureus.29563]
Abstract
Background In clinical settings, direct observation (DO) with feedback is an effective method to assess and improve learner performance. One tool used for DO is the mini-clinical evaluation exercise (Mini-CEX). We conducted a study to assess the effectiveness and feasibility of the Mini-CEX for medical students at Aga Khan University, Karachi. Methods Utilizing a purposive sampling technique, a total of 199 students in six core clerkships of Years 3 and 4 were selected for this study. Participating faculty underwent training workshops on the use of the Mini-CEX and feedback strategies. Each student was assessed twice by one faculty member, using a modified version of the Mini-CEX, which assessed four domains of clinical skills: Data Gathering, Communication, Diagnosis/Differential, and Management Plan and Organization. Feedback was given after each encounter. Faculty and students also provided detailed feedback regarding the process of DO. Data were analyzed using Statistical Package for Social Sciences (SPSS) version 26 (IBM Corp., Armonk, NY, USA), with categorical variables presented as frequencies and percentages. The Chi-squared test was used for further statistical analyses, and a P-value of < 0.05 was considered statistically significant. Effectiveness was assessed via the change in student performance between the first and second Mini-CEX, and feasibility was assessed via qualitative feedback. Results Of 523 Mini-CEX forms collected, 350 evaluations were included for analysis (216 from Year 3 and 134 from Year 4), along with feedback on DO from 70 students and 18 faculty. Year 3 students performed significantly better in all foci of the Mini-CEX between the first and second assessment (P ≤ 0.001), whereas in Year 4, significant improvement was limited to two domains of the Mini-CEX [Communication of History/Physical Examination (P = 0.040) and Diagnosis/Differential and Management Plan (P < 0.001)]. Students (65.7%) and faculty (94.4%) felt this exercise improved their interaction. While 83.3% of faculty recommended its formal implementation, only 27.1% of students did, citing challenges such as time constraints, logistics, the subjectivity of assessment, and varying interest by faculty. Conclusion Direct observation using the Mini-CEX is effective in improving the clinical and diagnostic skills of medical students and strengthens student-faculty interaction. While challenges exist in its implementation, the strategic placement of the Mini-CEX may enhance its utility in measuring student competency.
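The chi-squared comparisons above test whether proportions (for example, the share of students meeting a domain criterion on the first versus second encounter) differ between groups. For a 2x2 table the Pearson statistic and its p-value (df = 1) reduce to a few lines of stdlib Python; the counts below are hypothetical, not the study's data:

```python
import math

def chi2_2x2(table):
    """Pearson chi-squared test of independence for a 2x2 table
    (e.g., pass/fail counts on the first vs second Mini-CEX encounter).
    Returns the statistic and its p-value for 1 degree of freedom."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row = [a + b, c + d]
    col = [a + c, b + d]
    chi2 = 0.0
    for i, obs_row in enumerate(table):
        for j, obs in enumerate(obs_row):
            exp = row[i] * col[j] / n  # expected count under independence
            chi2 += (obs - exp) ** 2 / exp
    # survival function of chi-squared with 1 df: P(X > x) = erfc(sqrt(x/2))
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# hypothetical counts: students passing/failing on encounter 1 vs encounter 2
stat, p = chi2_2x2([[30, 70], [55, 45]])
```

The closed-form p-value holds only for df = 1 (a 2x2 table); larger tables need the incomplete gamma function or a library such as scipy.stats.chi2_contingency.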
26
Landreville JM, Wood TJ, Frank JR, Cheung WJ. Does direct observation influence the quality of workplace-based assessment documentation? AEM Educ Train 2022; 6:e10781. [PMID: 35903424] [PMCID: PMC9305723] [DOI: 10.1002/aet2.10781]
Abstract
BACKGROUND A key component of competency-based medical education (CBME) is direct observation of trainees. Direct observation has been emphasized as integral to workplace-based assessment (WBA), yet previously identified challenges may limit its successful implementation. Given these challenges, it is imperative to fully understand the value of direct observation within a CBME program of assessment. Specifically, it is not known whether the quality of WBA documentation is influenced by observation type (direct or indirect). METHODS The objective of this study was to determine the influence of observation type (direct or indirect) on the quality of entrustable professional activity (EPA) assessment documentation within a CBME program. EPA assessments were scored by four raters using the Quality of Assessment for Learning (QuAL) instrument, a previously published three-item quantitative measure of the quality of written comments associated with a single clinical performance score. An analysis of variance was performed to compare mean QuAL scores between the direct and indirect observation groups. The reliability of the QuAL instrument for EPA assessments was calculated using a generalizability analysis. RESULTS A total of 244 EPA assessments (122 direct observation, 122 indirect observation) were rated for quality using the QuAL instrument. No difference in mean QuAL score was identified between the direct and indirect observation groups (p = 0.17). The reliability of the QuAL instrument for EPA assessments was 0.84. CONCLUSIONS Observation type (direct or indirect) did not influence the quality of EPA assessment documentation. This finding raises the question of how direct and indirect observation truly differ, and what the implications are for meta-raters such as competence committees responsible for making judgments related to trainee promotion.
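The analysis of variance above compares mean QuAL scores between the direct and indirect observation groups; the underlying F statistic is the ratio of between-group to within-group mean squares. A minimal sketch with invented scores (group sizes and values are not from the study; with only two groups, F is simply the square of the two-sample t statistic):

```python
def one_way_anova_F(groups):
    """F statistic for a one-way ANOVA across k groups of scores:
    (between-group mean square) / (within-group mean square)."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # between-group sum of squares: group sizes times squared mean deviations
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # within-group sum of squares: deviations from each group's own mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# hypothetical QuAL scores for directly vs indirectly observed assessments
direct = [2.0, 3.0, 3.5, 2.5]
indirect = [2.0, 2.5, 1.5, 2.0]
F = one_way_anova_F([direct, indirect])
```

Turning F into a p-value requires the F distribution's survival function, which is not in the stdlib; scipy.stats.f_oneway returns both the statistic and the p-value.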
Affiliation(s)
- Timothy J. Wood
- Department of Innovation in Medical Education, University of Ottawa, Ottawa, Ontario, Canada
- Jason R. Frank
- Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada
- Warren J. Cheung
- Department of Emergency Medicine, University of Ottawa, Ottawa, Ontario, Canada
27
Gordon LB, Zelaya-Floyd M, White P, Hallen S, Varaklis K, Tavakolikashi M. Interprofessional bedside rounding improves quality of feedback to resident physicians. Med Teach 2022; 44:907-913. [PMID: 35373712] [DOI: 10.1080/0142159x.2022.2049735]
Abstract
PURPOSE Obtaining high-quality feedback in residency education is challenging, in part due to limited opportunities for faculty observation of authentic clinical work. This study reviewed the impact of interprofessional bedside rounds ('iPACE™') on the length and quality of faculty narrative evaluations of residents as compared to usual inpatient teaching rounds. METHODS Narrative comments from faculty evaluations of Internal Medicine (IM) residents on both the usual teaching service and the iPACE™ service (spanning 2017-2020) were reviewed and coded using a deductive content analysis approach. RESULTS Six hundred ninety-two narrative evaluations by 63 attendings of 103 residents were included. Evaluations of iPACE™ residents were significantly longer than those of residents on usual teams (109 vs. 69 words, p < 0.001). iPACE™ evaluations contained a higher average occurrence of direct observations of patient/family interactions (0.72 vs. 0.32, p < 0.001), references to interprofessionalism (0.17 vs. 0.05, p < 0.001), and specific (3.21 vs. 2.26, p < 0.001), actionable (1.01 vs. 0.69, p < 0.001), and corrective feedback (1.2 vs. 0.88, p = 0.001) per evaluation. CONCLUSIONS This study suggests that the iPACE™ model, which prioritizes interprofessional bedside rounds, had a positive impact on the quantity and quality of feedback, as measured via narrative comments on weekly evaluations.
Affiliation(s)
- Lesley B Gordon
- Tufts University School of Medicine, Boston, MA, USA
- Department of Medicine, Maine Medical Center, Portland, ME, USA
- Patricia White
- Department of Medical Education, Maine Medical Center, Portland, ME, USA
- Sarah Hallen
- Tufts University School of Medicine, Boston, MA, USA
- Division of Geriatrics, Maine Medical Center, Portland, ME, USA
- Kalli Varaklis
- Tufts University School of Medicine, Boston, MA, USA
- Department of Medical Education, Maine Medical Center, Portland, ME, USA
- Department of Obstetrics and Gynecology, Maine Medical Center, Portland, ME, USA
- Motahareh Tavakolikashi
- Department of Medical Education, Maine Medical Center, Portland, ME, USA
- Department of System Science and Industrial Engineering, Binghamton University, Binghamton, NY, USA
28
Reimagining the Clinical Competency Committee to Enhance Education and Prepare for Competency-Based Time-Variable Advancement. J Gen Intern Med 2022; 37:2280-2290. [PMID: 35445932] [PMCID: PMC9021365] [DOI: 10.1007/s11606-022-07515-3]
Abstract
Assessing residents and clinical fellows is a high-stakes activity. Effective assessment is important throughout training so that identified areas of strength and weakness can guide educational planning to optimize outcomes. Assessment has historically been underemphasized although medical education oversight organizations have strengthened requirements in recent years. Growing acceptance of competency-based medical education and its logical extension to competency-based time-variable (CB-TV) graduate medical education (GME) further highlights the importance of implementing effective evidence-based approaches to assessment. The Clinical Competency Committee (CCC) has emerged as a key programmatic structure in graduate medical education. In the context of launching a multi-specialty pilot of CB-TV GME in our health system, we have examined several programs' CCC processes and reviewed the relevant literature to propose enhancements to CCCs. We recommend that all CCCs fulfill three core goals, regularly applied to every GME trainee: (1) discern and describe the resident's developmental status to individualize education, (2) determine readiness for unsupervised practice, and (3) foster self-assessment ability. We integrate the literature and observations from GME program CCCs in our institutions to evaluate how current CCC processes support or undermine these goals. Obstacles and key enablers are identified. Finally, we recommend ways to achieve the stated goals, including the following: (1) assess and promote the development of competency in all trainees, not just outliers, through a shared model of assessment and competency-based advancement; (2) strengthen CCC assessment processes to determine trainee readiness for independent practice; and (3) promote trainee reflection and informed self-assessment. The importance of coaching for competency, robust workplace-based assessments, feedback, and co-production of individualized learning plans are emphasized. Individual programs and their CCCs must strengthen assessment tools and frameworks to realize the potential of competency-oriented education.
29
Peterson BD, Magee CD, Martindale JR, Dreicer JJ, Mutter MK, Young G, Sacco MJ, Parsons LC, Collins SR, Warburton KM, Parsons AS. REACT: Rapid Evaluation Assessment of Clinical Reasoning Tool. J Gen Intern Med 2022; 37:2224-2229. [PMID: 35710662] [PMCID: PMC9202973] [DOI: 10.1007/s11606-022-07513-5]
Abstract
INTRODUCTION Clinical reasoning encompasses the process of data collection, synthesis, and interpretation to generate a working diagnosis and make management decisions. Situated cognition theory suggests that knowledge is relative to contextual factors, and clinical reasoning in urgent situations is framed by the pressure of consequential, time-sensitive decision-making for diagnosis and management. These unique aspects of urgent clinical care may limit the effectiveness of traditional tools to assess, teach, and remediate clinical reasoning. METHODS Using two validated frameworks, a multidisciplinary group of clinicians trained to remediate clinical reasoning and experienced in urgent clinical care encounters designed the novel Rapid Evaluation Assessment of Clinical Reasoning Tool (REACT). REACT is a behaviorally anchored assessment tool that scores five domains and is used to provide formative feedback to learners evaluating patients during urgent clinical situations. A pilot study was performed to assess fourth-year medical students during simulated urgent clinical scenarios. Learners were scored using REACT by a separate, multidisciplinary group of clinician educators with no additional training in the clinical reasoning process. REACT scores were analyzed for internal consistency across raters and observations. RESULTS Overall internal consistency for the 41 patient simulations, as measured by Cronbach's alpha, was 0.86. A weighted kappa statistic was used to assess the overall score inter-rater reliability; moderate reliability was observed at 0.56. DISCUSSION To our knowledge, REACT is the first tool designed specifically for formative assessment of a learner's clinical reasoning performance during simulated urgent clinical situations. With evidence of reliability and content validity, this tool guides feedback to learners during high-risk urgent clinical scenarios, with the goal of reducing diagnostic and management errors to limit patient harm.
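Cronbach's alpha, reported above as 0.86 across the five REACT domains, is a simple function of item-level and total-score variances. A stdlib sketch with made-up domain ratings (not the study's data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.

    `items` is a list of item-score lists, one list per item (here, per
    REACT domain), each scored across the same set of encounters.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
    """
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var_sum = sum(var(item) for item in items)
    totals = [sum(items[i][j] for i in range(k)) for j in range(n)]
    return k / (k - 1) * (1 - item_var_sum / var(totals))

# five REACT-like domains rated across six hypothetical encounters
domains = [
    [3, 4, 4, 2, 5, 4],
    [3, 5, 4, 2, 4, 4],
    [2, 4, 5, 3, 5, 3],
    [3, 4, 4, 2, 5, 5],
    [4, 4, 3, 2, 5, 4],
]
alpha = cronbach_alpha(domains)
```

When the items move together (positive inter-item covariance), alpha approaches 1; two perfectly duplicated items give alpha of exactly 1.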
Affiliation(s)
- Charles D Magee
- University of Virginia School of Medicine, Charlottesville, VA, USA
- M Kathryn Mutter
- University of Virginia School of Medicine, Charlottesville, VA, USA
- Gregory Young
- University of Virginia School of Medicine, Charlottesville, VA, USA
- Laura C Parsons
- University of Virginia School of Medicine, Charlottesville, VA, USA
- Andrew S Parsons
- University of Virginia School of Medicine, Charlottesville, VA, USA
30
An Online Pattern Recognition-Oriented Workshop to Promote Interest among Undergraduate Students in How Mathematical Principles Could Be Applied within Veterinary Science. Sustainability 2022. [DOI: 10.3390/su14116768]
Abstract
Understanding the importance of mathematics and its relationship to veterinary medicine plays an important role for students. To promote interest in this relationship, we developed the workshop “Math in Nature,” which utilizes the surrounding environment to stimulate pattern-recognition and observational skills. It consisted of four sections: a talk by a professional researcher, a question-and-answer session, a mathematical pattern identification session, and a discussion of the ideas proposed by students. The workshop's effectiveness in raising interest in mathematics was evaluated using a questionnaire administered before and after the workshop. Following the course, more students agreed that biological phenomena can be explained and predicted by applying mathematics, and that it is possible to identify mathematical patterns in living beings. However, the students' perspectives regarding the importance of mathematics in their careers, as well as their interest in deepening their mathematical knowledge, did not change. Arguably, “Math in Nature” could have exerted a positive effect on the students' interest in mathematics. We thus recommend the application of similar workshops to improve interest and skills in relevant subjects among undergraduate students.
31
Carbajal MM, Dadiz R, Sawyer T, Kane S, Frost M, Angert R. Part 5: Essentials of Neonatal-Perinatal Medicine Fellowship: evaluation of competence and proficiency using Milestones. J Perinatol 2022; 42:809-814. [PMID: 35149835] [DOI: 10.1038/s41372-021-01306-0]
Abstract
The Accreditation Council for Graduate Medical Education (ACGME) Pediatric Subspecialty Milestone Project competencies are used for Neonatal-Perinatal Medicine (NPM) fellows. Milestones are longitudinal markers that range from novice to expert (levels 1-5). There is no standard approach to the required biannual evaluation of fellows by fellowship programs, resulting in significant variability among programs regarding procedural experience and exposure to pathology during clinical training. In this paper, we discuss the opportunities that Milestones provide, potential strategies to address challenges, and future directions.
Affiliation(s)
- Melissa M Carbajal
- Department of Pediatrics, Section of Neonatology, Baylor College of Medicine, Houston, TX, USA.
- Rita Dadiz
- Department of Pediatrics, University of Rochester School of Medicine and Dentistry, Rochester, NY, USA
- Taylor Sawyer
- Department of Pediatrics, University of Washington School of Medicine, Seattle, WA, USA
- Sara Kane
- Department of Pediatrics, Indiana University School of Medicine, Indianapolis, IN, USA
- Mackenzie Frost
- Department of Pediatrics, Perelman School of Medicine at the University of Pennsylvania, Philadelphia, PA, USA
- Robert Angert
- Department of Pediatrics, New York University Grossman School of Medicine, New York, NY, USA
32
Maggio LA, Haustein S, Costello JA, Driessen EW, Artino AR. Joining the meta-research movement: A bibliometric case study of the journal Perspectives on Medical Education. PERSPECTIVES ON MEDICAL EDUCATION 2022; 11:127-136. [PMID: 35727471 PMCID: PMC9210332 DOI: 10.1007/s40037-022-00717-9] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 04/15/2022] [Revised: 05/07/2022] [Accepted: 05/13/2022] [Indexed: 06/15/2023]
Abstract
PURPOSE To conduct a bibliometric case study of the journal Perspectives on Medical Education (PME) to provide insights into the journal's inner workings and to "take stock" of where PME is today, where it has been, and where it might go. METHODS Data, including bibliographic metadata, reviewer and author details, and downloads, were collected for manuscripts submitted to and published in PME from the journal's Editorial Manager and Web of Science. The gender of authors and reviewers was predicted using Genderize.io. To visualize and analyze collaboration patterns, citation relationships, and term co-occurrence, social network analyses (SNA) were conducted, and VOSviewer was used to visualize the social network maps. RESULTS Between 2012 and 2019, PME received, on average, 260 manuscripts annually (range = 73-402). Submissions came from authors in 81 countries, with the majority based in the United States (US), United Kingdom, and the Netherlands. PME published 518 manuscripts with authors based in 31 countries, the majority in the Netherlands, US, and Canada. PME articles were downloaded 717,613 times (mean per document: 1388). In total, 1201 (55% women) unique peer reviewers were invited and 649 (57% women) completed reviews; 1227 (49% women) unique authors published in PME. SNA revealed that PME authors were quite collaborative: most authored articles with others, and only a minority (n = 57) published as single authors. DISCUSSION This case study provides a glimpse into PME and offers evidence for PME's next steps. In the future, PME is committed to growing the journal thoughtfully: diversifying and educating editorial teams, authors, and reviewers, and liberating and sharing journal data.
Affiliation(s)
- Lauren A Maggio
- Uniformed Services University of the Health Sciences, Bethesda, MD, USA.
- Stefanie Haustein
- School of Information Studies (ÉSIS) and Scholarly Communications Lab, University of Ottawa, Ottawa, ON, Canada
- Anthony R Artino
- The George Washington University School of Medicine and Health Sciences, Washington, DC, USA
33
de Jonge LPJWM, Minkels FNE, Govaerts MJB, Muris JWM, Kramer AWM, van der Vleuten CPM, Timmerman AA. Supervisory dyads' communication and alignment regarding the use of workplace-based observations: a qualitative study in general practice residency. BMC MEDICAL EDUCATION 2022; 22:330. [PMID: 35484573 PMCID: PMC9052511 DOI: 10.1186/s12909-022-03395-7] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/09/2021] [Accepted: 04/21/2022] [Indexed: 06/14/2023]
Abstract
BACKGROUND In medical residency, performance observations are considered an important strategy to monitor competence development, provide feedback, and safeguard patient safety. The aim of this study was to gain insight into whether and how supervisor-resident dyads build a working repertoire regarding the use of observations, and how they discuss and align goals and approaches to observation in particular. METHODS We used a qualitative, social constructivist approach to explore if and how supervisory dyads work towards alignment of goals and preferred approaches to performance observations. We conducted semi-structured interviews with supervisor-resident dyads, performing a template analysis of the data thus obtained. RESULTS The supervisory dyads did not frequently communicate about the use of observations, except at the start of training or when triggered by internal or external factors. Their working repertoire regarding the use of observations seemed to be driven primarily by patient safety goals and institutional assessment requirements rather than by the provision of developmental feedback. Although intended as formative, the institutional test was perceived as summative by supervisors and residents, and led to teaching to the test rather than educating for purposes of competence development. CONCLUSIONS To unlock the full educational potential of performance observations, and to foster the development of an educational alliance, it is essential that supervisory dyads and the training institute communicate clearly about these observations and the role of assessment practices of and for learning, in order to align their goals and respective approaches.
Affiliation(s)
- Laury P J W M de Jonge
- Department of General Practice, Maastricht University, P.O. Box 616, 6200, MD, Maastricht, The Netherlands.
- Floor N E Minkels
- Department of General Practice, Maastricht University, P.O. Box 616, 6200 MD, Maastricht, The Netherlands
- Marjan J B Govaerts
- Department of Educational Research and Development, Maastricht University, Maastricht, The Netherlands
- Jean W M Muris
- Department of General Practice, Maastricht University, P.O. Box 616, 6200 MD, Maastricht, The Netherlands
- Anneke W M Kramer
- Department of Family Medicine, Leiden University, Leiden, The Netherlands
- Cees P M van der Vleuten
- Department of Educational Research and Development, Maastricht University, Maastricht, The Netherlands
- Angelique A Timmerman
- Department of General Practice, Maastricht University, P.O. Box 616, 6200 MD, Maastricht, The Netherlands
34
Swanberg M, Woodson-Smith S, Pangaro L, Torre D, Maggio L. Factors and Interactions Influencing Direct Observation: A Literature Review Guided by Activity Theory. TEACHING AND LEARNING IN MEDICINE 2022; 34:155-166. [PMID: 34238091 DOI: 10.1080/10401334.2021.1931871] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/29/2020] [Revised: 04/19/2021] [Accepted: 05/11/2021] [Indexed: 06/13/2023]
Abstract
Phenomenon: Ensuring that future physicians are competent to practice medicine is necessary for high-quality patient care and safety. The shift toward competency-based education has placed renewed emphasis on direct observation via workplace-based assessments in authentic patient care contexts. Despite this interest and multiple studies focused on improving direct observation, challenges regarding the objectivity of this assessment approach remain underexplored and unresolved. Approach: We conducted a literature review of direct observation in authentic patient contexts by systematically searching the databases PubMed, Embase, Web of Science, and ERIC. Included studies comprised original research conducted in the patient care context with authentic patients, either as a live encounter or a video recording of an actual encounter, that focused on factors affecting the direct observation of undergraduate medical education (UME) or graduate medical education (GME) trainees. Because the patient care context adds factors that contribute to the cognitive load of the learner and of the clinician-observer, we focused our question on such contexts, which are most useful in judgments about advancement to the next level of training or practice. We excluded articles or published abstracts not conducted in the patient care context (e.g., OSCEs) or those involving simulation, allied health professionals, or non-UME/GME trainees. We also excluded studies focused on end-of-rotation evaluations and in-training evaluation reports. We extracted key data from the studies and used Activity Theory as a lens to identify factors affecting these observations and the interactions between them. Activity Theory provides a framework to understand and analyze complex human activities, the systems in which people work, and the interactions or tensions between multiple associated factors. Findings: Nineteen articles were included in the analysis; 13 involved GME learners and 6 UME learners.
Of the 19, six studies were set in the operating room and four in the emergency department. Using Activity Theory, we discovered that while numerous studies focus on rater and tool influences, very few examine the impact of social elements: the rules that govern how the activity happens, the environment and the members of the community involved in the activity, and how completion of the activity is divided among those members. Insights: Viewing direct observation via workplace-based assessment through the lens of Activity Theory may enable educators to implement curricular changes that improve direct observation assessment. Activity Theory may also allow researchers to design studies that focus on the identified underexplored interactions and influences in relation to direct observation.
Affiliation(s)
- Margaret Swanberg
- Department of Neurology, Uniformed Services University, Bethesda, Maryland, USA
- Sarah Woodson-Smith
- Department of Neurology, Naval Medical Center Portsmouth, Portsmouth, Virginia, USA
- Louis Pangaro
- Department of Medicine, Uniformed Services University, Bethesda, Maryland, USA
- Dario Torre
- Department of Medicine, Uniformed Services University, Bethesda, Maryland, USA
- Center for Health Professions Education, Uniformed Services University, Bethesda, Maryland, USA
- Lauren Maggio
- Department of Medicine, Uniformed Services University, Bethesda, Maryland, USA
- Center for Health Professions Education, Uniformed Services University, Bethesda, Maryland, USA
35
Lacasse M, Renaud JS, Côté L, Lafleur A, Codsi MP, Dove M, Pélissier-Simard L, Pitre L, Rheault C. [Feedback Guide for direct observation of family medicine residents in Canada: a francophone tool]. CANADIAN MEDICAL EDUCATION JOURNAL 2022; 13:29-54. [PMID: 35321416 PMCID: PMC8909829 DOI: 10.36834/cmej.72587] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/14/2023]
Abstract
BACKGROUND There is no CanMEDS-FM-based milestone tool to guide feedback during direct observation (DO). We developed a guide to support documentation of feedback during DO in Canadian family medicine (FM) programs. METHODS The Guide was designed in three phases in collaboration with five Canadian FM programs that each have at least one French-speaking teaching site: 1) literature review and needs assessment; 2) development of the DO Feedback Guide; 3) testing of the Guide in a video simulation context with qualitative content analysis. RESULTS Phase 1 demonstrated the need for a narrative guide aimed at 1) specifying mutual expectations according to the resident's level of training and the clinical context, 2) providing the supervisor with tools and structure for their observations, and 3) facilitating documentation of feedback. Phase 2 produced the Guide, in paper and electronic formats, meeting the identified needs. In phase 3, 15 supervisors used the Guide across three residency levels. The Guide was adjusted after this testing to include reminders of the phases of the clinical encounter that were often forgotten during feedback (before the consultation, diagnosis, and follow-up), and to suggest types of formulation to favor (stimulating questions, clarifying questions, reflections). CONCLUSION Based on evidence and a collaborative approach, this Guide will equip French-speaking Canadian supervisors and residents performing DO in family medicine.
Affiliation(s)
- Luc Côté
- Université Laval, Québec, Canada
36
French JC, Pien LC. A Document Analysis of Nationally Available Faculty Assessment Forms of Resident Performance. J Grad Med Educ 2021; 13:833-840. [PMID: 35070096 PMCID: PMC8672836 DOI: 10.4300/jgme-d-21-00289.1] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 03/10/2021] [Revised: 06/28/2021] [Accepted: 09/07/2021] [Indexed: 11/06/2022] Open
Abstract
BACKGROUND Written feedback from faculty on resident performance is valuable when it includes components based on assessment for learning. However, it is not clear how often assessment forms include these components for summative and formative feedback. OBJECTIVE To analyze prompts used in forms for faculty assessment of resident performance, guided by best practices in survey research methodology, self-regulation theory, and competency-based assessment. METHODS A document analysis, a qualitative approach used to analyze the content and structure of texts, was completed on assessment forms nationally available in MedHub. Given the number of forms available, only internal medicine and surgery specialties were included. A document summary form was created to guide researchers through the analysis. RESULTS Forty-eight forms were reviewed, each from a unique residency program. All forms provided a textbox for comments, and 54% made this textbox required for assessment completion. Eighty-three percent of assessments placed the open textbox at the end of the form. One-third of forms contained a simple prompt, "Comments," for the narrative section. Fifteen percent of forms included a box to check if the information on the form had been discussed with the resident. For 50% of the assessments, it was unclear whether they were meant to be formative or summative in nature. CONCLUSIONS Our document analysis of assessment forms revealed that they do not always follow best practices in survey design for narrative sections, nor do they universally address elements deemed important for the promotion of self-regulation and competency-based assessment.
Affiliation(s)
- Judith C. French
- Judith C. French, PhD, is Surgical Educator, General Surgery Residency Program, Department of General Surgery, Cleveland Clinic, and Assistant Professor of Surgery, Cleveland Clinic Lerner College of Medicine of Case Western Reserve University
- Lily C. Pien
- Lily C. Pien, MD, MHPE, is Core Faculty, Allergy and Immunology Fellowship Program, Cleveland Clinic, and Associate Professor of Medicine, Cleveland Clinic Lerner College of Medicine of Case Western Reserve University
37
Bube SH, Kingo PS, Madsen MG, Vásquez JL, Norus TP, Olsen RG, Dahl C, Hansen RB, Konge L, Azawi NH. Validation of a novel assessment tool identifying proficiency in Transurethral Bladder Tumour Resection: The OSATURBS assessment tool. J Endourol 2021; 36:572-579. [PMID: 34731011 DOI: 10.1089/end.2021.0768] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/13/2022] Open
Abstract
BACKGROUND Competence in transurethral bladder tumour resection (TURB) is critical in bladder cancer management and should be ensured before independent practice. OBJECTIVE To develop an assessment tool for TURB and explore validity evidence in a clinical context. DESIGN, SETTING, AND PARTICIPANTS From July 2019 to March 2021, a total of 33 volunteer doctors from three hospitals were included. Participants performed two TURB procedures on patients with bladder tumours. A newly developed assessment tool (OSATURBS) was used for direct observation assessment, self-assessment, and blinded video assessment. OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS Cronbach's alpha and Pearson's r were calculated for across-items internal consistency reliability, inter-rater reliability, and test-retest reliability. The correlation between OSATURBS scores and operative experience was calculated with Pearson's r, and a pass/fail score was established. Differences in assessment scores were explored with paired t-tests and independent-samples t-tests. RESULTS AND LIMITATIONS Across-items internal consistency reliability (Cronbach's alpha) was 0.94 (n = 260, p < 0.001). Inter-rater reliability was 0.80 (n = 64, p < 0.001). Test-retest correlation was high, r = 0.71 (n = 32, p < 0.001), as was the relation to TURB experience, r = 0.71 (n = 32, p < 0.001). The pass/fail score was 19 points. Direct observation assessments were strongly correlated with video ratings (r = 0.85, p < 0.001) but showed a significant social bias, with lower scores for inexperienced and higher scores for experienced participants. Participants tended to overestimate their own performances. CONCLUSIONS The OSATURBS assessment tool for TURB can be used for assessment of surgical proficiency in the clinical setting. Direct observation assessment and self-assessment are biased, and blinded video assessment of TURB performances is advised.
Affiliation(s)
- Sarah Hjartbro Bube
- Department of Urology, Zealand University Hospital Roskilde, Roskilde, Zealand, Denmark; Faculty of Health and Medical Science, University of Copenhagen, Copenhagen, Denmark
- Mia Gebauer Madsen
- Department of Urology, Aarhus Universitetshospital, Aarhus, Denmark
- Juan Luis Vásquez
- Department of Urology, Zealand University Hospital Roskilde, Roskilde, Zealand, Denmark; Faculty of Health and Medical Science, University of Copenhagen, Copenhagen, Denmark
- Thomas Peter Norus
- Department of Urology, Zealand University Hospital Roskilde, Roskilde, Sjaelland, Denmark
- Rikke Groth Olsen
- Surgical Department, National Hospital of the Faroe Islands, Torshavn, Faroe Islands; CAMES - Copenhagen Academy for Medical Education and Simulation, Rigshospitalet, Copenhagen, Denmark
- Claus Dahl
- Department of Urology, Capio Ramsay Santé, Hellerup, Denmark
- Rikke Bølling Hansen
- Department of Urology, Herlev Hospital, Gentofte, Denmark; CAMES - Copenhagen Academy for Medical Education and Simulation, Rigshospitalet, Copenhagen, Denmark
- Lars Konge
- CAMES - Copenhagen Academy for Medical Education and Simulation, Rigshospitalet, Copenhagen, Denmark; Faculty of Health and Medical Science, University of Copenhagen, Copenhagen, Denmark
- Nessn H Azawi
- Department of Urology, Zealand University Hospital Roskilde, Roskilde, Zealand, Denmark; Faculty of Health and Medical Science, University of Copenhagen, Copenhagen, Denmark
38
Zickuhr L, Kolfenbach J, Bolster MB. Applying Educational Theory to Optimize Trainee Education in the Ambulatory Virtual Care Environment. MEDICAL SCIENCE EDUCATOR 2021; 31:1715-1722. [PMID: 34422453 PMCID: PMC8370462 DOI: 10.1007/s40670-021-01365-0] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Accepted: 07/30/2021] [Indexed: 05/10/2023]
Abstract
Virtual care (VC) encounters have become an essential part of outpatient clinical care. The theory of situated learning and legitimate peripheral participation posits that medical trainees learn best when they participate in authentic patient care experiences and engage effectively with their preceptors, members of the health care team, and the clinical learning environment. This theory can provide a framework from which to approach teaching in the VC setting, whereby preceptors may capitalize on the unique learning and assessment opportunities provided during VC encounters and optimize educational experiences for trainees as well as clinical outcomes for patients. In this monograph, we propose an approach grounded in situated learning and legitimate peripheral participation for teaching in the VC environment, particularly during real-time video visits.
Affiliation(s)
- Lisa Zickuhr
- Department of Medicine, Washington University School of Medicine, St. Louis, MO, USA
- Jason Kolfenbach
- Department of Medicine, University of Colorado School of Medicine, Aurora, CO, USA
- Marcy B. Bolster
- Department of Medicine, Massachusetts General Hospital, Boston, MA, USA
39
Rietmeijer CBT, Deves M, van Esch SCM, van der Horst HE, Blankenstein AH, Veen M, Scheele F, Teunissen PW. A phenomenological investigation of patients' experiences during direct observation in residency: busting the myth of the fly on the wall. ADVANCES IN HEALTH SCIENCES EDUCATION : THEORY AND PRACTICE 2021; 26:1191-1206. [PMID: 33765197 PMCID: PMC8452584 DOI: 10.1007/s10459-021-10044-z] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/14/2020] [Accepted: 03/07/2021] [Indexed: 05/30/2023]
Abstract
Direct observation (DO) of residents by supervisors is a highly recommended educational tool in postgraduate medical education, yet its uptake is poor. Residents and supervisors report various reasons for not engaging in DO. Some of these relate to their interaction with patients during DO. We do not know the patient perspectives on these interactions, nor, more broadly, what it is like to be a patient in a DO situation. Understanding the patient perspective may lead to a more complete understanding of the dynamics in DO situations, which may benefit patient wellbeing and improve the use of DO as an educational tool. We conducted a phenomenological interview study to investigate the experience of being a patient in a DO situation. Our analysis included multiple rounds of coding and identifying themes, and a final phase of phenomenological reduction to arrive at the essential elements of the experience. Constant reflexivity was at the heart of this process. Our results provide a new perspective on the role of the supervisor in DO situations. Patients were willing to address the resident, but sought moments of contact with, and some participation by, the supervisor. Consequently, conceptions of DO in which the supervisor thinks she is a fly on the wall rather than a part of the interaction should be critically reviewed. To that end, we propose the concept of participative direct observation in workplace learning, which also acknowledges the observer's role as participant. Embracing this concept may benefit both patients' wellbeing and residents' learning.
Affiliation(s)
- Chris B T Rietmeijer
- Department of General Practice/Family Medicine, Amsterdam University Medical Centers, location VUmc, De Boelelaan 1109, 1081 HV, Amsterdam, The Netherlands.
- Heemraadschapslaan 33, 1181 TZ, Amstelveen, The Netherlands.
- Mark Deves
- Department of General Practice/Family Medicine, Amsterdam University Medical Centers, location VUmc, De Boelelaan 1109, 1081 HV, Amsterdam, The Netherlands
- Suzanne C M van Esch
- Department of General Practice/Family Medicine, Amsterdam University Medical Centers, Location AMC, Meibergdreef 9, 1105 AZ, Amsterdam, The Netherlands
- Henriëtte E van der Horst
- Department of General Practice/Family Medicine, Amsterdam University Medical Centers, location VUmc, De Boelelaan 1109, 1081 HV, Amsterdam, The Netherlands
- Annette H Blankenstein
- Department of General Practice/Family Medicine, Amsterdam University Medical Centers, location VUmc, De Boelelaan 1109, 1081 HV, Amsterdam, The Netherlands
- Mario Veen
- Department of General Practice, Erasmus Medical Center, Dr. Molewaterplein 40, 3015 GD, Rotterdam, The Netherlands
- Fedde Scheele
- Amsterdam University Medical Centers, School of Medical Sciences, Athena Institute for Transdisciplinary Research, VU University, location VUmc, Amsterdam, The Netherlands
- Pim W Teunissen
- School of Health Professions Education, Maastricht University, Universiteitssingel 60, 6229 ER, Maastricht, The Netherlands
40
Kogan JR, Hauer KE, Holmboe ES. The Dissolution of the Step 2 Clinical Skills Examination and the Duty of Medical Educators to Step Up the Effectiveness of Clinical Skills Assessment. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2021; 96:1242-1246. [PMID: 34166235 DOI: 10.1097/acm.0000000000004216] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/13/2023]
Abstract
In this Invited Commentary, the authors explore the implications of the dissolution of the Step 2 Clinical Skills Examination (Step 2 CS) for medical student clinical skills assessment. The authors describe the need for medical educators (at both the undergraduate and graduate levels) to work collaboratively to improve medical student clinical skills assessment to assure the public that medical school graduates have the requisite skills to begin residency training. The authors outline 6 specific recommendations for how to capitalize on the discontinuation of Step 2 CS to improve clinical skills assessment: (1) defining national, end-of-clerkship, and transition-to-residency standards for required clinical skills and for levels of competence; (2) creating a national resource for standardized patient, augmented reality, and virtual reality assessments; (3) improving workplace-based assessment through local collaborations and national resources; (4) improving learner engagement in and coproduction of assessments; (5) requiring, as a new standard for accreditation, medical schools to establish and maintain competency committees; and (6) establishing a national registry of assessment data for research and evaluation. Together, these actions will help the medical education community earn the public's trust by enhancing the rigor of assessment to ensure the mastery of skills that are essential to providing safe, high-quality care for patients.
Affiliation(s)
- Jennifer R Kogan
- J.R. Kogan is associate dean, Student Success and Professional Development, and professor of medicine, Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania; ORCID: https://orcid.org/0000-0001-8426-9506
- Karen E Hauer
- K.E. Hauer is associate dean, Competency Assessment and Professional Standards, and professor of medicine, University of California, San Francisco, School of Medicine, San Francisco, California; ORCID: https://orcid.org/0000-0002-8812-4045
- Eric S Holmboe
- E.S. Holmboe is chief, Research, Milestones Development and Evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois; ORCID: https://orcid.org/0000-0003-0108-6021
41
Donovan SK, Herstein JJ, Prober CG, Kolars JC, Gordon JA, Boyers P, Gold J, Davies HD. Expansion of simulation and extended reality for undergraduate health professions education: A call to action. JOURNAL OF INTERPROFESSIONAL EDUCATION & PRACTICE 2021; 24:100436. [PMID: 36567809 PMCID: PMC9765302 DOI: 10.1016/j.xjep.2021.100436] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/26/2021] [Accepted: 04/27/2021] [Indexed: 12/27/2022]
Abstract
In the spring of 2020, the COVID-19 pandemic limited access for many health professions students to clinical settings amid concerns about availability of appropriate personal protective equipment as well as the desire to limit exposure in these high-risk settings. Furthermore, the pandemic led to a need to cancel clinics and inpatient rotations, with a major impact on training for health professions and interprofessional health delivery, the long-term effects of which are currently unknown. While problematic, this also presents an opportunity to reflect on challenges facing the traditional clinical training paradigm in a rapidly changing and complex health care system and develop sustainable, high-quality competency-based educational models that incorporate rapidly progressing technologies. We call for pilot studies to explore specific simulation-based inpatient and outpatient clinical rotations for professional and interprofessional training.
42
Yudkowsky R, Szauter K. Farewell to the Step 2 Clinical Skills Exam: New Opportunities, Obligations, and Next Steps. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2021; 96:1250-1253. [PMID: 34133347 DOI: 10.1097/acm.0000000000004209] [Citation(s) in RCA: 12] [Impact Index Per Article: 4.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/12/2023]
Abstract
The unexpected discontinuation of the United States Medical Licensing Examination Step 2 Clinical Skills (CS) exam in January 2021 carries both risks and opportunities for medical education in the United States. Step 2 CS had far-reaching effects on medical school curricula and school-based clinical skills assessments. Absent the need to prepare students for this high-stakes exam, will the rigor of foundational clinical skills instruction and assessment remain a priority at medical schools? In this article, the authors consider the potential losses and gains from the elimination of Step 2 CS and explore opportunities to expand local summative assessments beyond the narrow bounds of Step 2 CS. The responsibility for implementing a rigorous and credible summative assessment of clinical skills that are critical for patient safety as medical students transition to residency now lies squarely with medical schools. Robust human simulation (standardized patient) programs, including regional and virtual simulation consortia, can provide infrastructure and expertise for innovative and creative local assessments to meet this need. Novel applications of human simulation and traditional formative assessment methods, such as workplace-based assessments and virtual patients, can contribute to defensible summative decisions about medical students' clinical skills. The need to establish validity evidence for decisions based on these novel assessment methods comprises a timely and relevant focus for medical education research.
Affiliation(s)
- Rachel Yudkowsky
- R. Yudkowsky is professor and director of graduate studies, Department of Medical Education, University of Illinois at Chicago College of Medicine, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-2145-7582
- Karen Szauter
- K. Szauter is assistant dean of educational affairs, University of Texas Medical Branch, Galveston, Texas; ORCID: https://orcid.org/0000-0002-2064-3535
43
Rietmeijer CBT, Blankenstein AH, Huisman D, van der Horst HE, Kramer AWM, de Vries H, Scheele F, Teunissen PW. What happens under the flag of direct observation, and how that matters: A qualitative study in general practice residency. MEDICAL TEACHER 2021; 43:937-944. [PMID: 33765396 DOI: 10.1080/0142159x.2021.1898572] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/12/2023]
Abstract
INTRODUCTION In competency-based medical education, direct observation (DO) of residents' skills is scarce, notwithstanding its undisputed importance for credible feedback and assessment. A growing body of research is investigating this discrepancy. Strikingly, in this research, DO as a concrete educational activity tends to remain vague. In this study, we concretised DO of technical skills in postgraduate longitudinal training relationships. METHODS Informed by constructivist grounded theory, we performed a focus group study among general practice residents. We asked residents about their experiences with different manifestations of DO of technical skills. A framework describing different DO patterns with their varied impact on learning and the training relationship was constructed and refined until theoretical sufficiency was reached. RESULTS The dominant DO pattern was ad hoc, one-way DO. Importantly, in this pattern, various unpredictable, and sometimes unwanted, scenarios could occur. Residents hesitated to discuss unwanted scenarios with their supervisors, sometimes instead refraining from future requests for DO or even for help. Planned bi-directional DO sessions, though seldom practiced, contributed much to collaborative learning in a psychologically safe training relationship. DISCUSSION AND CONCLUSION Patterns matter in DO. Residents and supervisors should be made aware of this and educated in maintaining an open dialogue on how to use DO for the benefit of learning and the training relationship.
Affiliation(s)
- Chris B T Rietmeijer
- Department of General Practice, Location VUmc, Amsterdam University Medical Center, Amsterdam, The Netherlands
- Annette H Blankenstein
- Department of General Practice, Location VUmc, Amsterdam University Medical Center, Amsterdam, The Netherlands
- Daniëlle Huisman
- Department of General Practice, Location VUmc, Amsterdam University Medical Center, Amsterdam, The Netherlands
- Henriëtte E van der Horst
- Department of General Practice, Location VUmc, Amsterdam University Medical Center, Amsterdam, The Netherlands
- Anneke W M Kramer
- Department of Public Health and Primary Care, Leiden University, Leiden, The Netherlands
- Henk de Vries
- Department of General Practice, Location VUmc, Amsterdam University Medical Center, Amsterdam, The Netherlands
- Fedde Scheele
- School of Medical Sciences, Amsterdam University Medical Center, Location VUmc, Athena Institute for Transdisciplinary Research, VU University, Amsterdam, The Netherlands
- Pim W Teunissen
- School of Health Professions Education, Maastricht University, Universiteitssingel 60, Maastricht, The Netherlands
44
Bray MJ, Bradley EB, Martindale JR, Gusic ME. Implementing Systematic Faculty Development to Support an EPA-Based Program of Assessment: Strategies, Outcomes, and Lessons Learned. TEACHING AND LEARNING IN MEDICINE 2021; 33:434-444. [PMID: 33331171 DOI: 10.1080/10401334.2020.1857256] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/12/2023]
Abstract
Problem: Development of a novel, competency-based program of assessment requires creation of a plan to measure the processes that enable successful implementation. The principles of implementation science outline the importance of considering key drivers that support and sustain transformative change within an educational program. The introduction of Entrustable Professional Activities (EPAs) as a framework for assessment has underscored the need to create a structured plan to prepare assessors to engage in a new paradigm of assessment. Although approaches to rater training for workplace-based assessments have been described, specific strategies to prepare assessors to apply standards related to the level of supervision a student needs have not been documented. Intervention: We describe our systematic approach to preparing assessors, both faculty and postgraduate trainees, to complete EPA assessments for medical students during the clerkship phase of our curriculum. This institution-wide program is designed to build assessors' skills in direct observation of learners during authentic patient encounters. Assessors apply new knowledge and practice skills in using established performance expectations to determine the level of supervision a learner needs to perform clinical tasks. Assessors also learn to provide feedback and narrative comments to coach students and promote their ongoing clinical development. Data visualizations for assessors facilitate reinforcement of the tenets learned during training. Collaborative learning and peer feedback during faculty development sessions promote the formation of a community of practice among assessors. Context: Faculty development for assessors was implemented before the launch of the EPA program. Assessors in the program include residents/fellows who work closely with students, faculty with discipline-specific expertise, and a group of experienced clinicians, the Master Assessors, selected to serve as experts in competency-based EPA assessments. Training focused on creating a shared understanding about the application of criteria used to evaluate student performance. EPA assessments, based on the AAMC's Core Entrustable Professional Activities for Entering Residency, were completed in nine core clerkships. EPA assessments included a supervision rating based on a scale modified for use in undergraduate medical education. Impact: Data from EPA assessments completed during the first year of the program were analyzed to evaluate the effectiveness of the faculty development activities implemented to prepare assessors to consistently apply standards for assessment. A systematic approach to training, together with attention to the critical drivers that enabled institution-wide implementation, led to consistency in the supervision rating for students' first EPA assessment completed by any type of assessor, in ratings by assessors within a specific clinical context, and in ratings assigned by a group of specific assessors across clinical settings. Lessons learned: A systematic approach to faculty development, with a willingness to be flexible and to reach potential participants using existing infrastructure, can facilitate assessors' engagement in a new culture of assessment. Interaction among participants during training sessions not only promotes learning but also contributes to community building. A leadership group responsible for overseeing faculty development can ensure that the needs of stakeholders are addressed and that a change in assessment culture is sustained.
Affiliation(s)
- Megan J Bray
- Department of Obstetrics and Gynecology, Center for Medical Education Research and Scholarly Innovation, Office of Medical Education, University of Virginia School of Medicine, Charlottesville, Virginia, USA
- Elizabeth B Bradley
- Center for Medical Education Research and Scholarly Innovation, Office of Medical Education, University of Virginia School of Medicine, Charlottesville, Virginia, USA
- James R Martindale
- Center for Medical Education Research and Scholarly Innovation, Office of Medical Education, University of Virginia School of Medicine, Charlottesville, Virginia, USA
- Maryellen E Gusic
- Center for Medical Education Research and Scholarly Innovation, Office of Medical Education, Department of Pediatrics, University of Virginia School of Medicine, Charlottesville, Virginia, USA
45
Ginsburg S, Watling CJ, Schumacher DJ, Gingerich A, Hatala R. Numbers Encapsulate, Words Elaborate: Toward the Best Use of Comments for Assessment and Feedback on Entrustment Ratings. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2021; 96:S81-S86. [PMID: 34183607 DOI: 10.1097/acm.0000000000004089] [Citation(s) in RCA: 23] [Impact Index Per Article: 7.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/13/2023]
Abstract
The adoption of entrustment ratings in medical education is based on a seemingly simple premise: to align workplace-based supervision with resident assessment. Yet it has been difficult to operationalize this concept. Entrustment rating forms combine numeric scales with comments and are embedded in a programmatic assessment framework, which encourages the collection of a large quantity of data. The implicit assumption that more is better has led to an untamable volume of data that competency committees must grapple with. In this article, the authors explore the roles of numbers and words on entrustment rating forms, examining the intended and optimal use(s) of each, with particular attention to the words. They also unpack the problematic issue of dual-purposing words for both assessment and feedback. Words have enormous potential to elaborate, to contextualize, and to instruct; to realize this potential, educators must be crystal clear about their use. The authors set forth a number of possible ways to reconcile these tensions by more explicitly aligning words to purpose. For example, educators could focus written comments solely on assessment; create assessment encounters distinct from feedback encounters; or use different words collected from the same encounter to serve distinct feedback and assessment purposes. Finally, the authors address the tyranny of documentation created by programmatic assessment and urge caution in yielding to the temptation to reduce words to numbers to make them manageable. Instead, they encourage educators to preserve some educational encounters purely for feedback, and to consider that not all words need to become data.
Affiliation(s)
- Shiphra Ginsburg
- S. Ginsburg is professor of medicine, Department of Medicine, Sinai Health System and Faculty of Medicine, University of Toronto, scientist, Wilson Centre for Research in Education, University of Toronto, Toronto, Ontario, Canada, and Canada Research Chair in Health Professions Education; ORCID: http://orcid.org/0000-0002-4595-6650
- Christopher J Watling
- C.J. Watling is professor and director, Centre for Education Research and Innovation, Schulich School of Medicine & Dentistry, Western University, London, Ontario, Canada; ORCID: https://orcid.org/0000-0001-9686-795X
- Daniel J Schumacher
- D.J. Schumacher is associate professor of pediatrics, Cincinnati Children's Hospital Medical Center and University of Cincinnati College of Medicine, Cincinnati, Ohio; ORCID: https://orcid.org/0000-0001-5507-8452
- Andrea Gingerich
- A. Gingerich is assistant professor, Northern Medical Program, University of Northern British Columbia, Prince George, British Columbia, Canada; ORCID: https://orcid.org/0000-0001-5765-3975
- Rose Hatala
- R. Hatala is professor, Department of Medicine, and director, Clinical Educator Fellowship, Center for Health Education Scholarship, University of British Columbia, Vancouver, British Columbia, Canada; ORCID: https://orcid.org/0000-0003-0521-2590
46
Steinemann S, Korndorffer J, Dent D, Rucinski J, Newman RW, Blair P, Lupi LK, Sachdeva AK. Defining the need for faculty development in assessment. Am J Surg 2021; 222:679-684. [PMID: 34226039 DOI: 10.1016/j.amjsurg.2021.06.010] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/20/2021] [Revised: 06/04/2021] [Accepted: 06/20/2021] [Indexed: 12/29/2022]
Abstract
BACKGROUND High-quality workplace-based assessments are essential for competency-based surgical education. We explored education leaders' perceptions regarding faculty competence in assessment. METHODS Surgical education leaders were surveyed regarding the areas in which faculty needed improvement and their knowledge of assessment tools. Respondents were queried on specific skills regarding (a) importance in resident/medical student education and (b) competence of faculty in assessment and feedback. RESULTS Surveys (n = 636) were emailed; 103 responded. Most respondents felt faculty needed improvement in verbal (86%) and written (83%) feedback, assessing operative skill (49%), and assessing preparation for procedures (50%). Cholecystectomy, trauma laparotomy, and inguinal herniorrhaphy were rated "very-extremely important" in resident education (99%), but 21-24% thought faculty "moderately to not-at-all" competent in their assessment. This gap was larger for non-technical skills. Regarding assessment tools, 56% used the OSATS and 49% the Zwisch scale; most were unfamiliar with all of the non-technical tools. SUMMARY These data demonstrate a significant perceived gap in faculty competence in assessment and feedback, as well as unfamiliarity with assessment tools. These findings can inform faculty development to support competency-based surgical education.
Affiliation(s)
- Susan Steinemann
- Department of Surgery, University of Hawaii John A. Burns School of Medicine, 651 Ilalo Street, MEB223H, Honolulu, HI, 96813, USA.
- James Korndorffer
- Department of Surgery, Stanford University School of Medicine, 300 Pasteur Drive, Stanford, CA, 94305, USA.
- Daniel Dent
- Department of Surgery, University of Texas Health Science Center at San Antonio, 4502 Medical, San Antonio, TX, 78229, USA.
- James Rucinski
- Department of Surgery, New York-Presbyterian Brooklyn Methodist Hospital, 506 6th Street, Brooklyn, NY, 11215, USA.
- Rachel Williams Newman
- Division of Education, American College of Surgeons, 633 N. Saint Clair Street, Chicago, IL, 60611, USA.
- Patrice Blair
- Division of Education, American College of Surgeons, 633 N. Saint Clair Street, Chicago, IL, 60611, USA.
- Linda K Lupi
- Division of Education, American College of Surgeons, 633 N. Saint Clair Street, Chicago, IL, 60611, USA.
- Ajit K Sachdeva
- Division of Education, American College of Surgeons, 633 N. Saint Clair Street, Chicago, IL, 60611, USA.
47
Sleiman J, Savage DJ, Switzer B, Colbert CY, Chevalier C, Neuendorf K, Harris D. Teaching residents how to break bad news: piloting a resident-led curriculum and feedback task force as a proof-of-concept study. BMJ SIMULATION & TECHNOLOGY ENHANCED LEARNING 2021; 7:568-574. [DOI: 10.1136/bmjstel-2021-000897] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Accepted: 06/12/2021] [Indexed: 11/04/2022]
Abstract
BACKGROUND Breaking bad news (BBN) is a critically important skill set for residents. Limited formal supervision and the unpredictable timing of bad news delivery serve as barriers to the exchange of meaningful feedback. PURPOSE OF STUDY The goal of this educational innovation was to improve internal medicine residents' communication skills during challenging BBN encounters. A formal BBN training programme and an innovative on-demand task force were part of this two-phase project. STUDY DESIGN Internal medicine residents at a large academic medical centre participated in an interactive workshop focused on BBN. Workshop survey results served as a needs assessment for the development of a novel resident-led BBN task force. The task force was created to provide observations at the bedside and feedback after BBN encounters. Training of task force members incorporated video triggers and a feedback checklist. Inter-rater reliability was analysed prior to field testing, which provided data on real-world implementation challenges. RESULTS 148 residents were trained during the 2-hour communication skills workshop. Based on survey results, 73% (108 of 148) of the residents indicated enhanced confidence in BBN after participation. Field testing of the task force on a hospital ward revealed potential workflow barriers for residents requesting observations and prompted troubleshooting. Solutions were implemented based on field testing results. CONCLUSIONS A trainee-led BBN task force and communication skills workshop is offered as an innovative model for improving residents' interpersonal and communication skills in BBN. We believe the model is both sustainable and reproducible. Lessons learnt are offered to aid implementation in other settings.
48
Young JQ, Frank JR, Holmboe ES. Advancing Workplace-Based Assessment in Psychiatric Education: Key Design and Implementation Issues. Psychiatr Clin North Am 2021; 44:317-332. [PMID: 34049652 DOI: 10.1016/j.psc.2021.03.005] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Indexed: 01/18/2023]
Abstract
With the adoption of competency-based medical education, assessment has shifted from the traditional classroom domains of knows and knows how to the workplace domain of doing. This workplace-based assessment has 2 purposes: assessment of learning (summative feedback) and assessment for learning (formative feedback). What the trainee does becomes the basis for identifying growth edges and determining readiness for advancement and, ultimately, independent practice. High-quality workplace-based assessment programs require thoughtful choices about the framework of assessment, the tools themselves, the platforms used, and the contexts in which the assessments take place, with an emphasis on direct observation.
Affiliation(s)
- John Q Young
- Department of Psychiatry, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell and, Zucker Hillside Hospital at Northwell Health, 75-59 263rd Street, Kaufman Building, Glen Oaks, NY 11004, USA.
- Jason R Frank
- Department of Emergency Medicine, University of Ottawa, and Royal College of Physicians and Surgeons of Canada, 774 Echo Drive, Ottawa, Ontario K1S 5N8, Canada
- Eric S Holmboe
- Accreditation Council for Graduate Medical Education (ACGME), 401 North Michigan Avenue, Chicago, IL 60611, USA
49
Kinnear B, Kelleher M, Sall D, Schauer DP, Warm EJ, Kachelmeyer A, Martini A, Schumacher DJ. Development of Resident-Sensitive Quality Measures for Inpatient General Internal Medicine. J Gen Intern Med 2021; 36:1271-1278. [PMID: 33105001 PMCID: PMC8131459 DOI: 10.1007/s11606-020-06320-0] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 05/18/2020] [Revised: 07/20/2020] [Accepted: 10/14/2020] [Indexed: 11/28/2022]
Abstract
BACKGROUND Graduate medical education (GME) training has long-lasting effects on patient care quality. Despite this, few GME programs use clinical care measures as part of resident assessment. Furthermore, there is no gold standard to identify clinical care measures that are reflective of resident care. Resident-sensitive quality measures (RSQMs), defined as "measures that are meaningful in patient care and are most likely attributable to resident care," have been developed using consensus methodology and piloted in pediatric emergency medicine. However, this approach has not been tested in internal medicine (IM). OBJECTIVE To develop RSQMs for a general internal medicine (GIM) inpatient residency rotation using previously described consensus methods. DESIGN The authors used two consensus methods, nominal group technique (NGT) and a subsequent Delphi method, to generate RSQMs for a GIM inpatient rotation. RSQMs were generated for specific clinical conditions found on a GIM inpatient rotation, as well as for general care on a GIM ward. PARTICIPANTS NGT participants included nine IM and medicine-pediatrics (MP) residents and six IM and MP faculty members. The Delphi group included seven IM and MP residents and seven IM and MP faculty members. MAIN MEASURES The number and description of RSQMs generated during this process. KEY RESULTS Consensus methods resulted in 89 RSQMs with the following breakdown by condition: GIM general care-21, diabetes mellitus-16, hyperkalemia-14, COPD-13, hypertension-11, pneumonia-10, and hypokalemia-4. All RSQMs were process measures, with 48% relating to documentation and 51% relating to orders. Fifty-eight percent of RSQMs were related to the primary admitting diagnosis, while 42% could also be related to chronic comorbidities that require management during an admission. CONCLUSIONS Consensus methods resulted in 89 RSQMs for a GIM inpatient service. 
While all RSQMs were process measures, they may still hold value in learner assessment, formative feedback, and program evaluation.
Affiliation(s)
- Benjamin Kinnear
- Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Department of Internal Medicine, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Matthew Kelleher
- Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Department of Internal Medicine, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Dana Sall
- Department of Internal Medicine, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Daniel P Schauer
- Department of Internal Medicine, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Eric J Warm
- Department of Internal Medicine, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Andrea Kachelmeyer
- Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Abigail Martini
- Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
- Daniel J Schumacher
- Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, OH, USA
50
Martinsen SSS, Espeland T, Berg EAR, Samstad E, Lillebo B, Slørdahl TS. Examining the educational impact of the mini-CEX: a randomised controlled study. BMC MEDICAL EDUCATION 2021; 21:228. [PMID: 33882913 PMCID: PMC8061047 DOI: 10.1186/s12909-021-02670-3] [Citation(s) in RCA: 6] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 12/06/2020] [Accepted: 04/14/2021] [Indexed: 06/12/2023]
Abstract
BACKGROUND The purpose of this study is to evaluate the mini-Clinical Evaluation Exercise (mini-CEX) as a formative assessment tool among undergraduate medical students, in terms of student perceptions, effects on direct observation and feedback, and educational impact. METHODS Cluster randomised study of 38 fifth-year medical students during a 16-week clinical placement. Hospitals were randomised to provide a minimum of 8 mini-CEXs per student (intervention arm) or to continue with ad-hoc feedback (control arm). After finishing their clinical placement, students completed an Objective Structured Clinical Examination (OSCE), a written test and a survey. RESULTS All participants in the intervention group completed the pre-planned number of assessments, and 60% found them useful during their clinical placement. Overall, there were no statistically significant differences between groups in the reported quantity or quality of direct observation and feedback. Observed mean scores were marginally higher on the OSCE and written test in the intervention group, but the differences were not statistically significant. CONCLUSIONS There is considerable potential in assessing medical students during clinical placements and routine practice, but the educational impact of formative assessments remains mostly unknown. This study contributes a robust study design and may serve as a basis for future research.
Affiliation(s)
- Torvald Espeland
- Department of Circulation and Medical Imaging, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
- Clinic of Cardiology, St. Olavs Hospital, Trondheim University Hospital, Trondheim, Norway
- Erik Andreas Rye Berg
- Department of Circulation and Medical Imaging, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
- Clinic of Thoracic and Occupational Medicine, St. Olavs Hospital, Trondheim University Hospital, Trondheim, Norway
- Eivind Samstad
- Department of Clinical and Molecular Medicine, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
- Clinic of Medicine and Rehabilitation, Ålesund Hospital, Møre og Romsdal Hospital Trust, Ålesund, Norway
- Børge Lillebo
- Department of Circulation and Medical Imaging, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
- Clinic of Medicine and Rehabilitation, Levanger Hospital, Nord-Trøndelag Hospital Trust, Levanger, Norway
- Tobias S Slørdahl
- Department of Clinical and Molecular Medicine, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
- Department of Haematology, St. Olavs Hospital, Trondheim University Hospital, Trondheim, Norway