1. Nissen C, Ying J, Kalantari F, Patel M, Prabhu AV, Kesaria A, Kim T, Maraboyina S, Harrell L, Xia F, Lewis GD. A Prospective Study Measuring Resident and Faculty Contour Concordance: A Potential Tool for Quantitative Assessment of Residents' Performance in Contouring and Target Delineation in Radiation Oncology Residency. J Am Coll Radiol 2024;21:464-472. [PMID: 37844655] [DOI: 10.1016/j.jacr.2023.08.049]
Abstract
PURPOSE/OBJECTIVE(S) Accurate target delineation (ie, contouring) is essential for radiation treatment planning and radiotherapy efficacy. As a result, improving the quality of target delineation is an important goal in the education of radiation oncology residents. The purpose of this study was to track the concordance of radiation oncology residents' contours with those of faculty physicians over the course of 1 year to assess for patterns. MATERIALS/METHODS Residents in postgraduate year (PGY) levels 2 to 4 were asked to contour target volumes that were then compared with the finalized, faculty physician-approved contours. Concordance between resident and faculty physician contours was determined by calculating the Jaccard concordance index (JCI), ranging from 0 (no agreement) to 1 (complete agreement). Multivariate mixed-effect models were used to assess the association of JCI with the fixed effect of PGY level and its interactions with cancer type and other baseline characteristics. Post hoc means of JCI were compared between PGY levels after accounting for multiple comparisons using Tukey's method. RESULTS In total, 958 structures from 314 patients collected during the 2020-2021 academic year were studied. The mean JCI was 0.77, 0.75, and 0.61 for the PGY-4, PGY-3, and PGY-2 levels, respectively. The JCI score for PGY-2 was lower than those for PGY-3 and PGY-4 (both P < .001). No statistically significant difference in JCI score was found between the PGY-3 and PGY-4 levels. The average JCI score was lowest (0.51) for primary head and/or neck cancers and highest (0.80) for gynecologic cancers. CONCLUSIONS Tracking and comparing the concordance of resident contours with faculty physician contours is an intriguing method of assessing resident performance in contouring and target delineation and could potentially serve as a quantitative metric, which is currently lacking, in radiation oncology resident evaluation. However, additional study is necessary before this technique can be incorporated into residency assessments.
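The Jaccard concordance index described in this abstract is straightforward to compute from binary contour masks. A minimal sketch with toy 2D masks (the function name and example volumes are illustrative, not from the study):

```python
import numpy as np

def jaccard_index(resident_mask: np.ndarray, faculty_mask: np.ndarray) -> float:
    """Jaccard concordance index between two binary contour masks.

    Returns intersection over union: 0 = no overlap, 1 = identical contours.
    """
    resident = resident_mask.astype(bool)
    faculty = faculty_mask.astype(bool)
    union = np.logical_or(resident, faculty).sum()
    if union == 0:
        return 1.0  # both contours empty: treat as complete agreement
    intersection = np.logical_and(resident, faculty).sum()
    return intersection / union

# Toy 2D "slices": faculty contour is a 3x3 square, resident's is shifted by one voxel
faculty = np.zeros((6, 6), dtype=int)
faculty[1:4, 1:4] = 1          # 9 voxels
resident = np.zeros((6, 6), dtype=int)
resident[2:5, 2:5] = 1         # 9 voxels, 4 of which overlap the faculty contour
print(jaccard_index(resident, faculty))  # 4 / (9 + 9 - 4) = 4/14 ≈ 0.286
```

In practice the masks would be 3D voxel arrays rasterized from the treatment planning system's contours.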
Affiliation(s)
- Caleb Nissen: Department of Radiation Oncology, University of Arkansas for Medical Sciences, Little Rock, Arkansas
- Jun Ying: Department of Biostatistics, University of Arkansas for Medical Sciences, Little Rock, Arkansas
- Faraz Kalantari: Department of Radiation Oncology, University of Arkansas for Medical Sciences, Little Rock, Arkansas
- Mausam Patel: Department of Radiation Oncology, University of Arkansas for Medical Sciences, Little Rock, Arkansas
- Arpan V Prabhu: Department of Radiation Oncology, University of Arkansas for Medical Sciences, Little Rock, Arkansas
- Anam Kesaria: Department of Radiation Oncology, University of Arkansas for Medical Sciences, Little Rock, Arkansas
- Thomas Kim: Associate Program Director, Department of Radiation Oncology, Rush University, Chicago, Illinois
- Sanjay Maraboyina: Clinic Director, Department of Radiation Oncology, University of Arkansas for Medical Sciences, Little Rock, Arkansas
- Leslie Harrell: Department of Radiation Oncology, University of Arkansas for Medical Sciences, Little Rock, Arkansas
- Fen Xia: Department Chair, Department of Radiation Oncology, University of Arkansas for Medical Sciences, Little Rock, Arkansas
- Gary D Lewis: Department of Radiation Oncology, University of Arkansas for Medical Sciences, Little Rock, Arkansas
2. Liebert CA, Melcer EF, Keehl O, Eddington H, Trickey AW, Lee M, Tsai J, Camacho F, Merrell SB, Korndorffer JR, Lin DT. Validity Evidence for ENTRUST as an Assessment of Surgical Decision-Making for the Inguinal Hernia Entrustable Professional Activity (EPA). J Surg Educ 2022;79:e202-e212. [PMID: 35909070] [DOI: 10.1016/j.jsurg.2022.07.008]
Abstract
OBJECTIVE As the American Board of Surgery (ABS) moves toward implementation of Entrustable Professional Activities (EPAs), there is a growing need for objective evaluation of residents' readiness for entrustment. This requires assessment not only of technical skills and knowledge, but also of surgical decision-making in preoperative, intraoperative, and postoperative settings. We developed and piloted an Inguinal Hernia EPA Assessment on ENTRUST, a serious game-based online virtual patient simulation platform, to assess trainees' decision-making competence. DESIGN This is a prospective analysis of resident performance on the ENTRUST Inguinal Hernia EPA Assessment using bivariate analyses. SETTING This study was conducted at an academic institution in a proctored exam setting. PARTICIPANTS Forty-three surgical residents completed the ENTRUST Inguinal Hernia EPA Assessment. RESULTS Four case scenarios for the Inguinal Hernia EPA and corresponding scoring algorithms were iteratively developed by expert consensus aligned with ABS EPA descriptions and functions. ENTRUST Inguinal Hernia Grand Total Score was positively correlated with PGY-level (p < 0.0001). Preoperative, Intraoperative, and Postoperative Total Scores were also positively correlated with PGY-level (p = 0.001, p = 0.006, and p = 0.038, respectively). Total Case Scores were positively correlated with PGY-level for cases representing elective unilateral inguinal hernia (p = 0.0004), strangulated inguinal hernia (p < 0.0001), and elective bilateral inguinal hernia (p = 0.0003). Preoperative Sub-Scores were positively correlated with PGY-level for all cases (p < 0.01). Intraoperative Sub-Scores were positively correlated with PGY-level for strangulated inguinal hernia and bilateral inguinal hernia (p = 0.0007 and p = 0.0002, respectively). Grand Total Score and Intraoperative Sub-Score were correlated with prior operative experience (p < 0.0001). Prior video game experience did not correlate with performance on ENTRUST (p = 0.56). CONCLUSIONS Performance on the ENTRUST Inguinal Hernia EPA Assessment was positively correlated with PGY-level and prior inguinal hernia operative experience, providing initial validity evidence for its use as an objective assessment of surgical decision-making. The ENTRUST platform holds potential as a tool for assessment of ABS EPAs in surgical residency programs.
Affiliation(s)
- Cara A Liebert: Department of Surgery, Stanford University School of Medicine, Stanford, California; VA Palo Alto Health Care System, Surgical Services, Palo Alto, California
- Edward F Melcer: Department of Computational Media, University of California-Santa Cruz, Baskin School of Engineering, Santa Cruz, California
- Oleksandra Keehl: Department of Computational Media, University of California-Santa Cruz, Baskin School of Engineering, Santa Cruz, California
- Hyrum Eddington: Stanford-Surgery Policy Improvement Research and Education Center (S-SPIRE), Department of Surgery, Stanford University School of Medicine, Palo Alto, California
- Amber W Trickey: Stanford-Surgery Policy Improvement Research and Education Center (S-SPIRE), Department of Surgery, Stanford University School of Medicine, Palo Alto, California
- Melissa Lee: Stanford University School of Medicine, Stanford, California
- Jason Tsai: Department of Computational Media, University of California-Santa Cruz, Baskin School of Engineering, Santa Cruz, California
- Fatyma Camacho: Department of Computational Media, University of California-Santa Cruz, Baskin School of Engineering, Santa Cruz, California
- James R Korndorffer: Department of Surgery, Stanford University School of Medicine, Stanford, California; VA Palo Alto Health Care System, Surgical Services, Palo Alto, California
- Dana T Lin: Department of Surgery, Stanford University School of Medicine, Stanford, California
3. Gold JM, Yemane L, Keppler H, Balasubramanian V, Rassbach CE. Words Matter: Examining Gender Differences in the Language Used to Evaluate Pediatrics Residents. Acad Pediatr 2022;22:698-704. [PMID: 35158087] [DOI: 10.1016/j.acap.2022.02.004]
Abstract
BACKGROUND Gender disparities in academic medicine continue to be pervasive. Written evaluations of residents may provide insight into how faculty perceive residents, which may influence letters of recommendation for positions beyond residency and reinforce the perceived stereotype threat experienced by trainees. OBJECTIVE To examine the language used in faculty evaluations of pediatrics residents and determine whether it differs by resident gender. DESIGN/METHODS All faculty evaluations of residents in 3 consecutive intern classes from 2016 to 2018 were collected and redacted for name and gender identifiers. We performed a qualitative analysis of written comments in 2 mandatory free-text sections. The study team initially coded text collectively, generating a code book, then individually to apply the coding scheme. Next, evaluations were unblinded to gender. Code applications were aggregated by resident, and frequencies of code application by resident were compared using standardized mean differences to detect imbalances between genders. RESULTS A total of 448 evaluations were analyzed: 88 evaluations of 17 male residents and 360 evaluations of 70 female residents. Codes more frequently applied to women included "enthusiasm" and "caring," while codes more frequently applied to men included "intelligence" and "prepared." A conceptual model was created to reflect potential impacts of these differences through the lens of social role theory. CONCLUSIONS We identified differences in the way male and female residents are evaluated by faculty, which may have negative downstream effects on female residents, who may experience negative self-perception, differential development of clinical skills, and divergent career opportunities as a result.
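The standardized mean differences used above to compare code-application frequencies follow the Cohen's-d pattern: the difference in group means divided by the pooled standard deviation. A minimal sketch; the groups and counts below are hypothetical, not the study's data:

```python
import math

def standardized_mean_difference(group_a, group_b):
    """Cohen's d style SMD: difference in means over the pooled standard deviation."""
    na, nb = len(group_a), len(group_b)
    ma = sum(group_a) / na
    mb = sum(group_b) / nb
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)  # sample variance, group A
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)  # sample variance, group B
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled_sd

# Hypothetical per-resident frequencies of one code (e.g. "enthusiasm")
women = [3, 4, 2, 5, 3, 4]
men = [1, 2, 2, 1]
print(round(standardized_mean_difference(women, men), 2))  # ≈ 2.22
```

An SMD around 0.1 or larger is a common screening threshold for meaningful imbalance between groups.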
Affiliation(s)
- Jessica M Gold: Department of Pediatrics (JM Gold, L Yemane, and CE Rassbach), Stanford University School of Medicine, Palo Alto, Calif
- Lahia Yemane: Department of Pediatrics (JM Gold, L Yemane, and CE Rassbach), Stanford University School of Medicine, Palo Alto, Calif
- Hannah Keppler: Department of Pediatrics (H Keppler), Albert Einstein College of Medicine, Bronx, NY
- Caroline E Rassbach: Department of Pediatrics (JM Gold, L Yemane, and CE Rassbach), Stanford University School of Medicine, Palo Alto, Calif
4. Schmiederer IS, Kearse LE, Korndorffer JR, Lee E, Sgroi MD, Lee JT. Validity Evidence for Vascular Skills Assessment: The Feasibility of Fundamentals of Vascular Surgery in General Surgery Residency. J Surg Educ 2021;78:e201-e209. [PMID: 34446383] [DOI: 10.1016/j.jsurg.2021.07.009]
Abstract
OBJECTIVE As the Fundamentals of Laparoscopic Surgery (FLS) and Fundamentals of Endoscopic Surgery (FES) have been used for general surgery assessment, the Fundamentals of Vascular Surgery (FVS) has recently been developed to evaluate core operative skills for vascular trainees. This study examines the 3-year implementation of FVS for general surgery residents and gathers validity evidence using Messick's framework. We hypothesized that the curriculum and assessment tool enhance general surgery resident training and assessment. DESIGN This is a retrospective review of FVS assessments of residents using descriptive and multivariate analyses. SETTING This study was conducted at an academic institution, where simulation-based teaching sessions occur in coordination between the general surgery and integrated vascular surgery residency programs. PARTICIPANTS Seventeen general surgery residents were assessed in FVS skills by an expert rater from 2018 to 2020. RESULTS Overall, 86 assessments were completed. Content: The assessment focuses on 3 open vascular skills (end-to-side anastomosis, patch angioplasty, and clockface suturing). Response process: 7 items comprise a graded rating for a skills score; a global summary score is also designated. Internal structure: The assessment tool has a Cronbach's alpha of 0.87, demonstrating good internal consistency. Addition of a second rater yielded a Cohen's kappa of -0.69 (p < 0.001), indicating poor interrater reliability. Relationships to other variables: The most significant improvement occurred in total scores between PGY2s (17.4 ± 2.37) and PGY4s (23.2 ± 3.00), p < 0.001, indicating adequate discrimination between training levels. CONCLUSIONS The validity evidence for the FVS assessment in this study supports its use in general surgery residency at a time when opportunities for open vascular skills assessment may be decreasing due to case availability and shifting paradigms. Further study into rater training is needed to optimize national implementation of FVS and ensure consistency in grading.
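Cronbach's alpha, reported above as 0.87, measures internal consistency across a tool's graded items: how strongly the items move together across assessments. A minimal sketch with a hypothetical, smaller score matrix (3 items rather than the tool's 7):

```python
import numpy as np

def cronbachs_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_assessments, k_items) matrix of item scores."""
    n, k = scores.shape
    item_vars = scores.var(axis=0, ddof=1)       # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical ratings: 4 assessments, each scored on 3 items (1-5 scale)
ratings = np.array([
    [3, 2, 4],
    [4, 4, 3],
    [5, 4, 5],
    [2, 3, 2],
])
alpha = cronbachs_alpha(ratings)
print(round(alpha, 2))  # ≈ 0.79
```

Values of 0.8 or higher are conventionally read as good internal consistency, which is why the reported 0.87 supports the tool.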
Affiliation(s)
- Ingrid S Schmiederer: Stanford University Medical Center, Department of Surgery, Stanford, California
- LaDonna E Kearse: Stanford University Medical Center, Department of Surgery, Stanford, California
- James R Korndorffer: Stanford University Medical Center, Department of Surgery, Stanford, California
- Edmund Lee: Inova Fairfax Hospital, Department of Surgery, Falls Church, Annandale, Virginia
- Michael D Sgroi: Stanford University Medical Center, Department of Surgery, Stanford, California
- Jason T Lee: Stanford University Medical Center, Department of Surgery, Stanford, California
5. Palis AG, Barrio-Barrio J, Mayorga EP, Mili-Boussen I, Noche CD, Swaminathan M, Golnik KC. The International Council of Ophthalmology Ophthalmic Clinical Evaluation Exercise. Indian J Ophthalmol 2021;69:43-47. [PMID: 33323570] [PMCID: PMC7926108] [DOI: 10.4103/ijo.ijo_154_20]
Abstract
Purpose: Fifteen years after the publication of the Ophthalmic Clinical Evaluation Exercise (OCEX), it was deemed necessary to review and revise it and to validate it for an international audience of ophthalmologists. This study aimed to revise the OCEX and validate it for international use. Methods: The OCEX rubric was changed to a modified Dreyfus scale, and a behavioral descriptor was created for each category. An international panel of ophthalmic educators reviewed the international applicability and appropriateness of the tool. Results: A tool for assessing and giving feedback on four aspects of clinical competence during the ophthalmic consultation (interview skills, examination, interpersonal and communication skills, and case presentation) was revised. The original scoring tool was improved to a new behavioral one, and relevant comments and suggestions from international reviewers were incorporated. The new tool has face and content validity for an international audience. Conclusion: The OCEX is the only tool for workplace assessment and feedback designed specifically for ophthalmology residents and the ophthalmic consultation. This improved and simplified version will facilitate its use and implementation in diverse programs around the world.
Affiliation(s)
- Ana G Palis: Department of Ophthalmology, Hospital Italiano de Buenos Aires, Buenos Aires, Argentina
- Jesús Barrio-Barrio: Department of Ophthalmology, Clínica Universidad de Navarra, Navarra Institute for Health Research (IdiSNA), Pamplona, Spain
- Eduardo P Mayorga: Department of Ophthalmology, Hospital Italiano de Buenos Aires, Buenos Aires, Argentina
- Ilhem Mili-Boussen: Department of Ophthalmology, Charles Nicolle University Hospital, University of Tunis El Manar, Tunis, Tunisia
- Christelle D Noche: Higher Institute of Health Sciences, Université des Montagnes, Bangangte, Cameroon
- Karl C Golnik: Cincinnati Eye Institute and the University of Cincinnati, United States of America
6. Weber RA, Wong S, Allen SJ, Fornfeist DS. Assessing the Correlation Between a Surgeon's Ability to Draw a Procedure and Ability to Perform the Procedure. J Surg Educ 2020;77:635-642. [PMID: 31954663] [DOI: 10.1016/j.jsurg.2019.12.010]
Abstract
OBJECTIVES The ability to assess a trainee's technical skill in a manner that maintains patient safety is critical to resident education. To do so, senior plastic surgery educators frequently ask residents to draw their proposed operation, presuming that a surgeon's ability to perform a surgery is reflected in his or her ability to diagram the procedure, independent of artistic ability. The purpose of this study was to delineate the relationship between the ability to draw a surgical procedure and the ability to execute it in a simulated model, and to determine whether the ability to draw a procedure depends on artistic ability. DESIGN Participants at varying levels of knowledge and surgical skill were asked to draw a 4-strand cruciate tendon repair and subsequently perform the procedure on a validated, simulated model. The participants were graded according to Objective Structured Assessment of Technical Skills scales by 2 blinded hand surgeon examiners. Statistical analysis was performed in SAS 9.4 with Spearman's rank correlation coefficient. SETTING The study was performed at Baylor Scott and White Health in Temple, TX, in an office-based laboratory setting. PARTICIPANTS Forty participants comprising senior medical students, plastic/orthopedic surgery residents, and plastic/hand surgery attendings. All 40 participants entered and completed the study. RESULTS A statistically significant, strongly positive correlation was found between the overall assessment of drawing and the overall assessment of performing the surgical procedure (p = 0.004). At the same time, the assessed ability to draw the procedure was not associated with general drawing ability or previous art training (p = 0.28). CONCLUSIONS Our findings support the use of drawing a specific procedure as an assessment tool to evaluate a surgeon's ability to perform that procedure.
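Spearman's rank correlation, used above to relate drawing and operative scores, is simply the Pearson correlation of the two variables' ranks (with ties given their average rank). A minimal from-scratch sketch; the scores below are hypothetical illustrations, not the study's data:

```python
import numpy as np

def rank_with_ties(x):
    """Average ranks (1-based); tied values share the mean of their positions."""
    x = np.asarray(x, dtype=float)
    order = np.argsort(x)
    sorted_x = x[order]
    ranks = np.empty(len(x))
    i = 0
    while i < len(x):
        j = i
        while j + 1 < len(x) and sorted_x[j + 1] == sorted_x[i]:
            j += 1
        ranks[order[i:j + 1]] = (i + j) / 2 + 1  # mean of positions i..j, 1-based
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson correlation of the ranks."""
    rx, ry = rank_with_ties(x), rank_with_ties(y)
    return np.corrcoef(rx, ry)[0, 1]

# Hypothetical OSATS-style global ratings (1-5) for 8 participants
drawing_scores = [2, 3, 3, 4, 4, 5, 2, 5]
surgical_scores = [2, 2, 3, 4, 5, 5, 3, 4]
print(spearman_rho(drawing_scores, surgical_scores))  # ≈ 0.8
```

Rank-based correlation is the natural choice for ordinal rating scales like OSATS, since it does not assume the scale's steps are equally spaced.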
Affiliation(s)
- Robert A Weber: Department of Surgery, Division of Plastic Surgery, Baylor Scott & White Medical Center, Temple, Texas
- Stacy Wong: Department of Surgery, Division of Plastic Surgery, Baylor Scott & White Medical Center, Temple, Texas
- Samantha J Allen: Department of Obstetrics & Gynecology, University of Colorado Denver, Denver, Colorado
- Douglas S Fornfeist: Department of Surgery, Division of Orthopedic Surgery, Baylor Scott & White Medical Center, Temple, Texas
7. Krueger CA, Rivera JC, Bhullar PS, Osborn PM. Developing a Novel Scoring System to Objectively Track Orthopaedic Resident Educational Performance and Progression. J Surg Educ 2020;77:454-460. [PMID: 31889688] [DOI: 10.1016/j.jsurg.2019.09.009]
Abstract
OBJECTIVE Objectively determining orthopedic resident competence remains difficult and lacks standardization across residency programs. We sought to develop a scoring system to measure resident educational activity, stratify participation and performance in particular aspects of training, and assess the effect of these measures on board certification. DESIGN A weighted scoring system (Average Resident Score, ARS) was developed using the number of logged cases, clinic notes dictated, OITE PGY percentile, case minimums met, and scholarly activity completed each academic year (AY), with clinical activity weighted more heavily. The Resident Effectiveness Score (RES), a z-score showing the number of standard deviations from the mean, was determined from the ARS. The effect of the RES on Accreditation Council for Graduate Medical Education (ACGME) Milestones and the American Board of Orthopaedic Surgery (ABOS) Part 1 percentile score was determined using a Spearman correlation. SETTING Large academic orthopedic residency. PARTICIPANTS Thirty-one orthopedic residents graduating between 2011 and 2016 were included. RESULTS The RES did not differ between classes in the same AY, nor did it change significantly for individual residents during their training. Milestone z-scores increased as residents progressed in their education. The RES correlated with each Milestone competency subscore. The PGY5 OITE score and achieving ACGME minimums correlated with passing ABOS Part 1 (28/31 first-time pass), but the RES did not predict passing the board examination. CONCLUSIONS This study demonstrates a scoring system encompassing multiple facets of resident education to track resident activity and progress. The RES can be tailored to an individual program's goals and can help program directors identify residents not maximizing educational opportunities compared with their peers. Monitoring this score may allow tailoring of educational efforts to individual resident needs. The RES may also allow residents to measure their performance and educational accomplishments and adjust their focus to obtain competence and board certification.
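The RES described above is each resident's ARS expressed as a z-score against the cohort: standard deviations from the cohort mean. A sketch with hypothetical ARS values (the weighting that produces the ARS itself is program-specific and not reproduced here):

```python
import numpy as np

# Hypothetical Average Resident Scores (ARS) for one academic year's cohort
ars = np.array([72.0, 65.5, 80.0, 58.0, 74.5, 69.0])

# Resident Effectiveness Score: each ARS as standard deviations from the
# cohort mean, i.e. a z-score (using the sample standard deviation)
res = (ars - ars.mean()) / ars.std(ddof=1)
print(np.round(res, 2))
```

By construction the cohort's RES values have mean 0 and standard deviation 1, so a resident's RES is directly comparable across years even if the raw ARS scale drifts.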
Affiliation(s)
- Chad A Krueger: San Antonio Uniformed Services Healthcare Education Consortium, Fort Sam Houston, Texas
- Jessica C Rivera: San Antonio Uniformed Services Healthcare Education Consortium, Fort Sam Houston, Texas
- Preetinder S Bhullar: San Antonio Uniformed Services Healthcare Education Consortium, Fort Sam Houston, Texas
- Patrick M Osborn: San Antonio Uniformed Services Healthcare Education Consortium, Fort Sam Houston, Texas
8. Schumacher DJ, Poynter S, Burman N, Elliott SP, Barnes M, Gellin C, Gonzalez Del Rey J, Sklansky D, Thoreson L, King B, Schwartz A; APPD LEARN CCC Study Group. Justifications for Discrepancies Between Competency Committee and Program Director Recommended Resident Supervisory Roles. Acad Pediatr 2019;19:561-565. [PMID: 30572027] [DOI: 10.1016/j.acap.2018.12.003]
Abstract
OBJECTIVE To explore justifications for differences between summative entrustment decisions made about pediatric residents by individuals who are charged with the review of residents (clinical competency committee, or CCC, members) and those who ultimately make final summative decisions about resident performance (program directors, or PDs). METHODS Individual CCC member and PD supervisory role categorizations were made in the 2015 to 2016 academic year at 14 pediatric residency programs, placing residents into 1 of 5 progressive supervisory roles. When PD recommendations differed from CCC members, a free-text justification was requested. Free-text responses were analyzed using manifest content analysis. RESULTS In total, 801 supervisory role categorizations were made by both CCC members and PDs, with the same recommendations made in 685 cases. In the 116 instances of discrepancy, PDs assigned a lower level of supervisory responsibility (n = 73) more often than a greater one (n = 43). When moving residents to a greater supervisory role category, PDs had more justifications anchored in resident performance than experience. When moving residents to a lower supervisory role categorization, PDs conversely noted experience more than performance. CONCLUSIONS PDs provide more justifications anchored in resident performance when moving residents to a greater supervisory role category compared with CCC members. However, when moving residents to a lower supervisory role categorization, they note experience more than performance. These patterns may or may not be entirely consistent with a competency-based approach and should be explored further.
9. Schumacher DJ, Martini A, Holmboe E, Varadarajan K, Busari J, van der Vleuten C, Carraccio C. Developing Resident-Sensitive Quality Measures: Engaging Stakeholders to Inform Next Steps. Acad Pediatr 2019;19:177-185. [PMID: 30268426] [DOI: 10.1016/j.acap.2018.09.013]
Abstract
OBJECTIVE Despite the need for quality measures relevant to the work residents complete, few attempts have been made to address this gap. Resident-sensitive quality measures (RSQMs) can help fill this void. This study engaged resident and supervisor stakeholders to develop and inform next steps in creating such measures. METHODS Two separate nominal group techniques (NGTs), one with residents and one with faculty and fellow supervisors, were used to generate RSQMs for 3 specific illnesses (asthma, bronchiolitis, and closed head injury) as well as general care for the pediatric emergency department. Two separate Delphi processes were then used to prioritize identified RSQMs. The measures produced by each group were compared side by side, illuminating similarities and differences that were explored through focus groups with residents and supervisors. These focus groups also probed future settings in which to develop RSQMs. RESULTS In the NGT and Delphi groups, residents and supervisors placed considerable focus on measures in 3 areas across the illnesses of interest: 1) appropriate medication dosing, 2) documentation, and 3) information provided at patient discharge. Focus groups highlighted hospital medicine and general pediatrics as priority areas for developing future RSQMs but also noted contextual variables that influence the application of similar measures in different settings. Residents and supervisors had both similar as well as unique insights into developing RSQMs. CONCLUSIONS This study continues to pave the path forward in developing future RSQMs by exploring specific settings, measures, and stakeholders to consider when undertaking this work.
Affiliation(s)
- Daniel J Schumacher: Division of Emergency Medicine, Cincinnati Children's Hospital Medical Center (DJ Schumacher and A Martini); Office of the Chief Medical Officer, UCHealth (K Varadarajan), Cincinnati, Ohio
- Abigail Martini: Division of Emergency Medicine, Cincinnati Children's Hospital Medical Center (DJ Schumacher and A Martini); Office of the Chief Medical Officer, UCHealth (K Varadarajan), Cincinnati, Ohio
- Eric Holmboe: Accreditation Council for Graduate Medical Education (E Holmboe), Chicago, Ill
- Kartik Varadarajan: Division of Emergency Medicine, Cincinnati Children's Hospital Medical Center (DJ Schumacher and A Martini); Office of the Chief Medical Officer, UCHealth (K Varadarajan), Cincinnati, Ohio
- Jamiu Busari: School of Health Professions Education (J Busari), Maastricht University, Maastricht, The Netherlands
- Cees van der Vleuten: Department of Educational Development and Research in the Faculty of Health, Medicine, and Life Sciences and School of Health Professions Education (SHE) (C van der Vleuten), Maastricht University, Maastricht, The Netherlands
10. Goldman RH, Tuomala RE, Bengtson JM, Stagg AR. How Effective Are New Milestones Assessments at Demonstrating Resident Growth? 1 Year of Data. J Surg Educ 2017;74:68-73. [PMID: 27395399] [DOI: 10.1016/j.jsurg.2016.06.009]
Abstract
OBJECTIVE Assessment tools that accrue data for the Accreditation Council for Graduate Medical Education Milestones must evaluate residents across multiple dimensions, including medical knowledge, procedural skills, teaching, and professionalism. Our objectives were to: (1) develop an assessment tool to evaluate resident performance in accordance with the Milestones and (2) review trends in resident achievements during the inaugural year of Milestone implementation. DESIGN A novel venue- and postgraduate year (PGY)-specific assessment tool was built, tested, and implemented for both operating room and labor and delivery venues. Resident development of competence and independence was captured over time. To account for variable rotation schedules, the year was divided into thirds and compared using a two-tailed Fisher's exact test. SETTING Brigham and Women's Hospital and Massachusetts General Hospital, Boston, MA. PARTICIPANTS Faculty evaluators and obstetrics and gynecology residents. RESULTS A total of 822 assessments of 44 residents were completed between 9/2014 and 6/2015. The percentage of labor and delivery tasks completed "independently" at the start of the year increased monotonically with PGY level: 8.4% for PGY-1, 60.3% for PGY-2, 73.7% for PGY-3, and 87.5% for PGY-4. Assessments of PGY-1 residents demonstrated a significant shift toward "with minimal supervision" and "independent" for the management of normal labor (p = 0.03). PGY-3 residents demonstrated an increase in "able to be primary surgeon" in the operating room, from 36% of the time in the first 2/3 of the year to 62.3% in the last 1/3 (p < 0.01). CONCLUSION Assessment tools developed to assist with Milestone assignments capture the growth of residents over time and demonstrate quantifiable differences in achievements between PGY classes. These tools will allow for targeted teaching opportunities for both individual residents and residency programs.
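The two-tailed Fisher's exact test used above compares a proportion between two parts of the year on a 2x2 contingency table. A from-scratch sketch using the hypergeometric distribution; the counts below are hypothetical, chosen only to mirror the reported 36% vs 62.3% pattern:

```python
from math import comb

def fisher_exact_two_tailed(table):
    """Two-tailed Fisher's exact test for a 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of all tables with the same
    margins whose probability does not exceed that of the observed table.
    """
    (a, b), (c, d) = table
    row1, row2, col1, n = a + b, c + d, a + c, a + b + c + d
    denom = comb(n, col1)

    def p_of(x):  # probability of the table whose top-left cell is x
        return comb(row1, x) * comb(row2, col1 - x) / denom

    p_obs = p_of(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    # small tolerance guards against float round-off at the boundary
    return sum(p_of(x) for x in range(lo, hi + 1) if p_of(x) <= p_obs * (1 + 1e-12))

# Hypothetical "able to be primary surgeon" yes/no counts,
# first 2/3 of the year vs last 1/3
early = [18, 32]   # 36% of 50 assessments
late = [33, 20]    # ~62% of 53 assessments
p = fisher_exact_two_tailed([early, late])
print(round(p, 4))
```

Fisher's exact test is preferred over chi-square here because it remains valid for the small per-cell counts that arise when a single rotation's assessments are split into thirds.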
Affiliation(s)
- Randi H Goldman
- Department of Obstetrics, Gynecology and Reproductive Biology, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts.
- Ruth E Tuomala
- Department of Obstetrics, Gynecology and Reproductive Biology, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts; Vincent Department of Obstetrics and Gynecology, Massachusetts General Hospital, Harvard Medical School, Boston, Massachusetts.
- Joan M Bengtson
- Department of Obstetrics, Gynecology and Reproductive Biology, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts.
- Amy R Stagg
- Vincent Department of Obstetrics and Gynecology, Massachusetts General Hospital, Harvard Medical School, Boston, Massachusetts.
11
Burkhardt JC, Stansfield RB, Vohra T, Losman E, Turner-Lawrence D, Hopson LR. Prognostic value of the Multiple Mini-Interview for emergency medicine residency performance. J Emerg Med 2015; 49:196-202. [PMID: 25937476 DOI: 10.1016/j.jemermed.2015.02.008] [Citation(s) in RCA: 15] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [What about the content of this article? (0)] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/07/2014] [Revised: 01/04/2015] [Accepted: 02/17/2015] [Indexed: 11/22/2022]
Abstract
BACKGROUND The Multiple Mini-Interview (MMI) uses short, structured contacts and is known to predict medical school success better than traditional interviews and application materials. Its utility in Emergency Medicine residency selection is untested. OBJECTIVES We investigated whether it provides additional information regarding future first-year resident performance that can be useful in resident selection. METHODS From three Emergency Medicine residency programs, 71 interns in their first month completed an MMI developed to focus on desirable resident characteristics. Application data were reviewed. First-year resident performance assessments covering the Accreditation Council for Graduate Medical Education (ACGME) core competencies, along with professionalism and performance concerns, were obtained. Multiple logistic regressions were employed, and MMI correlations were compared with program rank lists and typical selection factors. RESULTS An individual's score on the MMI correlated with overall performance (p < 0.05) in single logistic regression. MMI scores correlated with the individual ACGME competencies of patient care and procedural skills at a less robust level (p < 0.1), but not with any other outcomes. Rank list position correlated with the diagnostic skill competency (p < 0.05), but no others. Traditional selection factors correlated with overall performance, disciplinary action, patient care, medical knowledge, and diagnostic skills (p < 0.05). MMI was not correlated significantly with the outcomes when included in multiple ordinal logistic regression with other selection factors. CONCLUSIONS MMI scores correlated with overall performance but were not statistically significant once traditional selection factors were considered. The MMI process seems potentially superior to program rank list at correlating with first-year performance. The MMI may provide additional benefit when examined using a larger and more diverse sample.
12
Tiyyagura G, Balmer D, Chaudoin L, Kessler D, Khanna K, Srivastava G, Chang TP, Auerbach M. The greater good: how supervising physicians make entrustment decisions in the pediatric emergency department. Acad Pediatr 2014; 14:597-602. [PMID: 25439158 DOI: 10.1016/j.acap.2014.06.001] [Citation(s) in RCA: 27] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 01/27/2014] [Revised: 05/20/2014] [Accepted: 06/05/2014] [Indexed: 11/17/2022]
Abstract
BACKGROUND Graduate medical education is transitioning to the use of entrustable professional activities to contextualize educational competencies. Factors influencing entrustment decisions have been reported in adult medicine. Knowing how such decisions are made in pediatrics is critical to this transition. PURPOSE To understand how supervisors determine the level of procedural supervision to provide a resident, taking into consideration simulation performance; to understand factors that affect supervisors' transparency to parents about residents' procedural experience. METHODS We conducted 18 one-on-one interviews with supervisors in a tertiary care pediatric emergency department, iteratively revising interview questions as patterns in the data were elucidated. Two researchers independently coded transcripts and then met with the investigative team to refine codes and create themes. RESULTS Five factors influenced supervisors' entrustment decisions: 1) resident characteristics, including self-reported confidence, seniority, and prior interactions with the resident; 2) supervisor style; 3) nature of the procedure/characteristics of the patient; 4) environmental factors; and 5) parental preferences. Supervisors thought that task-based simulators provided practice opportunities but that simulated performance did not provide evidence for entrustment. Supervisors reported selectively omitting details about a resident's experience level when speaking with families, to optimize experiential learning for residents they entrusted to perform a procedure. CONCLUSIONS In pediatrics, supervisors consider various factors when making decisions regarding resident procedural readiness, including parental preferences. An educational system using entrustable professional activities may facilitate holistic assessment and foster expertise-informed decisions about residents' progression toward entrustment; such a system may also lessen supervisors' need to withhold information from parents about residents' procedural readiness.
Affiliation(s)
- Gunjan Tiyyagura
- Department of Pediatrics, Yale University School of Medicine, New Haven, Conn.
- Dorene Balmer
- Department of Pediatrics, Baylor College of Medicine, Houston, Tex.
- Lindsey Chaudoin
- Department of Pediatrics and Emergency Medicine, Mt Sinai Hospital, New York, NY.
- David Kessler
- Department of Pediatrics, Columbia University, New York, NY.
- Kajal Khanna
- Department of Emergency Medicine, Stanford University, Stanford, Calif.
- Todd P Chang
- Department of Pediatrics, Children's Hospital Los Angeles, Los Angeles, Calif.
- Marc Auerbach
- Department of Pediatrics, Yale University School of Medicine, New Haven, Conn.
13
Warm EJ, Mathis BR, Held JD, Pai S, Tolentino J, Ashbrook L, Lee CK, Lee D, Wood S, Fichtenbaum CJ, Schauer D, Munyon R, Mueller C. Entrustment and mapping of observable practice activities for resident assessment. J Gen Intern Med 2014; 29:1177-82. [PMID: 24557518 PMCID: PMC4099463 DOI: 10.1007/s11606-014-2801-5] [Citation(s) in RCA: 85] [Impact Index Per Article: 8.5] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 04/25/2013] [Revised: 09/17/2013] [Accepted: 01/22/2014] [Indexed: 11/30/2022]
Abstract
Entrustable Professional Activities (EPAs) and the Next Accreditation System reporting milestones reduce general competencies into smaller evaluable parts. However, some EPAs and reporting milestones may be too broad to use as direct assessment tools. We describe our internal medicine residency curriculum and assessment system, which uses entrustment and mapping of observable practice activities (OPAs) for resident assessment. We created discrete OPAs for each resident rotation and learning experience. In combination, these serve as curricular foundation and tools for assessment. OPA performance is measured via a 5-point entrustment scale, and mapped to milestones and EPAs. Entrustment ratings of OPAs provide an opportunity for immediate structured feedback of specific clinical skills, and mapping OPAs to milestones and EPAs can be used for longitudinal assessment, promotion decisions, and reporting. Direct assessment and demonstration of progressive entrustment of trainee skill over time are important goals for all training programs. Systems that use OPAs mapped to milestones and EPAs provide the opportunity for achieving both, but require validation.
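The structure the authors describe, OPA-level entrustment ratings on a 5-point scale that are mapped to milestones and EPAs and rolled up for longitudinal reporting, can be sketched as a plain data-structure exercise. The OPA names, milestone codes, and ratings below are invented for illustration, not taken from the curriculum:

```python
from collections import defaultdict
from statistics import mean

# Each record: (OPA, milestones it maps to, 5-point entrustment rating).
# All names and codes here are hypothetical.
ratings = [
    ("admit patient to ward", ["PC-1", "SBP-2"], 4),
    ("reconcile medications", ["PC-1"], 3),
    ("lead family meeting",   ["ICS-1"], 5),
]

# Invert the mapping: collect every entrustment rating per milestone.
by_milestone = defaultdict(list)
for _opa, milestones, entrustment in ratings:
    for m in milestones:
        by_milestone[m].append(entrustment)

# Longitudinal rollup: mean entrustment per milestone, e.g. for
# promotion decisions and reporting.
rollup = {m: mean(v) for m, v in sorted(by_milestone.items())}
print(rollup)
```

The point of the inversion is that one rotation-specific OPA can feed several reporting milestones at once, which is what lets rotation-level feedback double as longitudinal assessment data.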
Affiliation(s)
- Eric J Warm
- University of Cincinnati Academic Health Center, 231 Albert Sabin Way ML 0557, Cincinnati, OH, 45267-0557, USA.
14
Borman KR, Augustine R, Leibrandt T, Pezzi CM, Kukora JS. Initial performance of a modified milestones global evaluation tool for semiannual evaluation of residents by faculty. J Surg Educ 2013; 70:739-749. [PMID: 24209650 DOI: 10.1016/j.jsurg.2013.08.004] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 05/19/2013] [Revised: 08/03/2013] [Accepted: 08/22/2013] [Indexed: 06/02/2023]
Abstract
OBJECTIVES To determine whether faculty could successfully evaluate residents using a competency-based modified Milestones global evaluation tool. DESIGN A program's leadership team modified a draft Surgery Milestones Working Group summative global assessment instrument into a modified Milestones tool (MMT) for local use during faculty meetings devoted to semiannual resident review. Residents were scored on 15 items spanning all competencies using an 8-point graphic response scale; unstructured comments were also solicited. Arithmetic means were computed at the resident and postgraduate year cohort levels for items and competency item sets. Score ranges (highest minus lowest score) were calculated; variability was termed "low" (range <2.0 points), "moderate" (range = 2.0), or "high" (range >2.0). A subset of "low" was designated "small" (range 1.0-1.9). Trends were sought among item, competency, and total Milestones scores. MMT correlations with examination scores and multisource (360°) assessments were explored. The success of implementing the MMT was judged using published criteria for educational assessment methods. SETTING Fully accredited, independently sponsored residency. PARTICIPANTS Program leaders and 22 faculty members (71% voluntary; mean 12 years of experience). RESULTS Twenty-six residents were assessed, yielding 7 to 13 MMT evaluations per categorical resident and 3 to 6 per preliminary trainee. Scores spanned the entire response scale. All MMT evaluations included narrative comments. Individual resident score variability was low (96% within competencies and 92% across competencies). Subset analysis showed that small variations were common (35% within competencies and 54% across competencies). Postgraduate year cohort variability was higher (61% moderate or high within competencies and 50% across competencies). Cohort scores at the item, competency, and total score levels exhibited rising trajectories, suggesting MMT construct validity. MMT scores did not demonstrate concurrent validity, correlating poorly with other metrics. The MMT met multiple criteria for good assessment. CONCLUSIONS A modified Milestones global evaluation tool can be successfully adopted for semiannual assessment of resident performance by volunteer faculty members.
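The variability bands this abstract defines (range <2.0 "low", with 1.0-1.9 as the "small" subset; range = 2.0 "moderate"; range >2.0 "high") amount to a simple classification of score spread. A minimal sketch, with the per-item scores invented for illustration:

```python
def variability(scores):
    """Classify the spread of 8-point-scale item scores using the bands
    from the abstract: range < 2.0 is "low" (1.0-1.9 is the "small"
    subset), range == 2.0 is "moderate", and range > 2.0 is "high"."""
    rng = max(scores) - min(scores)
    if rng < 2.0:
        return "low (small)" if rng >= 1.0 else "low"
    return "moderate" if rng == 2.0 else "high"

# Hypothetical per-competency scores for one resident:
print(variability([5.0, 5.5, 5.5]))  # low
print(variability([4.0, 5.5]))       # low (small)
print(variability([4.0, 6.0]))       # moderate
print(variability([3.0, 6.5]))       # high
```

Note that the band boundaries treat an exact range of 2.0 as its own "moderate" category, so the comparison order (strict less-than first) matters.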
Affiliation(s)
- Karen R Borman
- Department of Surgery, Abington Memorial Hospital, Abington, Pennsylvania.
15
Friedman Z, Siddiqui N, Mahmoud S, Davies S. Video-assisted structured teaching to improve aseptic technique during neuraxial block. Br J Anaesth 2013; 111:483-7. [PMID: 23562931 DOI: 10.1093/bja/aet062] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/13/2022] Open
Abstract
BACKGROUND Teaching epidural catheter insertion tends to focus on developing manual dexterity rather than improving aseptic technique, which usually remains poor despite increasing experience. The aim of this study was to compare epidural aseptic technique performance by novice operators after a targeted teaching intervention with that of operators taught aseptic technique before the intervention was initiated. METHODS Starting July 2008, two groups of second-year anaesthesia residents (pre- and post-teaching intervention) performing their 4-month obstetric anaesthesia rotation in a university-affiliated centre were videotaped three to four times while performing epidural procedures. Trained, blinded, independent examiners reviewed the procedures. The primary outcome was a comparison of aseptic technique performance scores (0-30 points) graded on a task-specific checklist. RESULTS A total of 86 sessions by 29 residents were included in the study analysis. The intraclass correlation coefficient for inter-rater reliability for the aseptic technique was 0.90. The median aseptic technique scores for the rotation period were significantly higher in the post-intervention group [27.58, inter-quartile range (IQR) 22.33-29.50, vs 16.56, IQR 13.33-22.00]. Similar results were demonstrated when scores were analysed for low, moderate, and high levels of experience throughout the rotation. CONCLUSIONS Procedure-specific aseptic technique teaching, aided by video assessment and video demonstration, significantly improved aseptic practice by novice trainees. Future studies should consider looking at retention over longer periods of time in more senior residents.
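The median and inter-quartile range used above to summarize the 0-30 checklist scores can be computed with the standard library. Note that `statistics.quantiles` defaults to the "exclusive" method, which can differ slightly from the quartile conventions of other statistics packages; the scores below are invented for illustration:

```python
import statistics

def median_iqr(scores):
    """Median and (Q1, Q3) of checklist scores, using the stdlib's
    default 'exclusive' quantile method (n=4 gives the three quartile
    cut points)."""
    q1, med, q3 = statistics.quantiles(scores, n=4)
    return med, (q1, q3)

# Hypothetical aseptic-technique checklist scores (0-30 scale):
med, (q1, q3) = median_iqr([16, 22, 27, 28, 29])
print(f"median {med}, IQR {q1}-{q3}")
```

Reporting medians with IQRs, as this study does, is the usual choice when checklist scores are bounded and skewed rather than normally distributed.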
Affiliation(s)
- Z Friedman
- Department of Anesthesia and Pain Management, Mount Sinai Hospital, University of Toronto, 600 University Avenue, Toronto, ON, M5G1X5, Canada.