1
Carter B, Sidrak J, Wagner B, Travis C, Nehler M, Christian N. Preliminary Development of a Program ABSITE Dashboard (PAD) to Guide Curriculum Innovation. Journal of Surgical Education 2024; 81:226-242. [PMID: 38195275] [DOI: 10.1016/j.jsurg.2023.10.014]
Abstract
PURPOSE Medical knowledge of general surgery residents is assessed by the American Board of Surgery In-Training Examination (ABSITE). ABSITE score reports contain many metrics residency directors can use to assess resident progress and perform program evaluation. The purpose of this study was to develop a framework for evaluating program effectiveness in teaching specific subtest and subtopic areas of the ABSITE, using ABSITE score reports as an indicator. The aim is to demonstrate the identification of topic areas of weakness in program-wide ABSITE performance to guide proposed modifications of the general surgery residency curriculum, and to initiate development of a data-visualization dashboard to communicate these metrics. METHODS A single-institution retrospective study was performed using ABSITE score reports from general surgery residents at a large academic training program from 2017 to 2020. ABSITE performance metrics from 320 unique records were entered into a database; statistical analyses for linear trends and variance were conducted for standard scores, subtest standard scores, and incorrect subtest topics. Deviations from national average scores were calculated by subtracting the national average score from each subtest score for each trainee. Data are presented as medians or proportions and displayed to optimize visualization as a proof of concept for the development of a program dashboard. RESULTS Trends and variance in general surgery program and cohort performance on various elements of the ABSITE were visualized using figures and tables that represent a prototype for a program dashboard. Figure A1 demonstrates one example, in which a heatmap displays the median deviation from national average scores for each subtest by program year. Boxplots show the distribution of the deviation from national average, the range of national average scores, and the recorded scores for each subtest by program year. Trends in median deviation from national average scores are displayed for each program year paneled by subtest, or for each exam year paneled by cohort. Median change in overall test scores from one program year to the next within a cohort is visualized as a table. Bar graphs show the most often missed topics across all program years, and heatmaps show the proportion of times each topic was missed for each subtest and exam year. CONCLUSIONS We demonstrate the use of ABSITE reports to identify specific thematic areas of opportunity for curriculum modification and innovation as an element of program evaluation. Through data analysis and visualization, this study demonstrates the feasibility of creating a Program ABSITE Dashboard (PAD) that enhances the use of ABSITE reports for formative program evaluation and can guide modifications to surgery program curricula and educational practices.
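The core dashboard computation described in this abstract, subtracting the national average from each trainee's subtest score and aggregating the median by program year and subtest, can be sketched as follows. All column names and values here are illustrative placeholders, not the actual ABSITE score-report fields:

```python
import pandas as pd

# Hypothetical ABSITE score records; the field names and numbers are
# illustrative only, not the real score-report schema.
records = pd.DataFrame({
    "program_year": ["PGY1", "PGY1", "PGY2", "PGY2"],
    "subtest":      ["Trauma", "Oncology", "Trauma", "Oncology"],
    "score":        [480, 510, 530, 495],
    "national_avg": [500, 500, 515, 505],
})

# Deviation from national average: each trainee's subtest score minus
# the national average for that subtest, as described in the abstract.
records["deviation"] = records["score"] - records["national_avg"]

# Median deviation per program year and subtest: the matrix behind a
# dashboard heatmap (negative cells flag topics of program weakness).
heatmap = records.pivot_table(index="program_year", columns="subtest",
                              values="deviation", aggfunc="median")
print(heatmap)
```

Rendering `heatmap` with any plotting library then reproduces the kind of panel the abstract describes.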
Affiliation(s)
- Brian Carter
- University of Colorado School of Medicine, Aurora, Colorado
- Jason Sidrak
- University of Colorado School of Medicine, Aurora, Colorado
- Brandie Wagner
- Department of Biostatistics and Informatics, Colorado School of Public Health, Aurora, Colorado
- Claire Travis
- Department of Surgery, University of Colorado Anschutz Medical Center, Aurora, Colorado
- Mark Nehler
- Department of Surgery, University of Colorado Anschutz Medical Center, Aurora, Colorado
- Nicole Christian
- Department of Surgery, University of Colorado Anschutz Medical Center, Aurora, Colorado
2
Velez DR. Prospective Factors that Predict American Board of Surgery In-Training Examination Performance: A Systematic Review. Am Surg 2021; 87:1867-1878. [PMID: 34763542] [DOI: 10.1177/00031348211058626]
Abstract
INTRODUCTION American Board of Surgery In-Training Examination (ABSITE) performance has become an important factor in monitoring resident progress. Understanding which prospective factors predict performance can help identify residents at risk. METHODS A literature search of PubMed, EMBASE, and JAMA Network was conducted for June 2011 to June 2021, in accordance with PRISMA guidelines, using the terms "ABSITE" and "American Board of Surgery In-Training Examination." Prospective factors such as prior examination performance, clinical evaluations, and demographics were evaluated. RESULTS A final 35 studies were included. The prospective factor most consistently found to predict ABSITE performance is performance on prior knowledge-based examinations such as the USMLE Step exams. The ACGME Medical Knowledge 1 milestone evaluation also appears to correlate with ABSITE performance, although clinical evaluations in general do not. Demographics have no significant correlation with ABSITE performance. DISCUSSION Using performance on prior knowledge-based examinations, programs may be able to identify residents at risk of failing the ABSITE, making it possible to initiate early intervention rather than only remediation after poor performance.
Affiliation(s)
- David R Velez
- Department of Surgery, University of North Dakota School of Medicine & Health Sciences, Grand Forks, ND, USA
3
Pirie J, St. Amant L, Glover Takahashi S. Managing residents in difficulty within CBME residency educational systems: a scoping review. BMC Medical Education 2020; 20:235. [PMID: 32703231] [PMCID: PMC7376876] [DOI: 10.1186/s12909-020-02150-0]
Abstract
BACKGROUND Best practices in managing residents in difficulty (RID) in the era of competency-based medical education (CBME) are not well described. This scoping review aimed to inventory the current literature and identify major themes in articles that address or employ CBME as part of the identification and remediation of residents in difficulty. METHODS Articles published between 2011 and 2017 were included if they were about postgraduate medical education, RID, and offered information to inform the structure and/or processes of CBME. All three reviewers performed a primary screening, followed by a secondary screening of abstracts of the chosen articles, and then a final comprehensive sub-analysis of the 11 articles identified as using a CBME framework. RESULTS Of 165 articles initially identified, 92 qualified for secondary screening; the 63 remaining articles underwent full-text abstracting. Ten themes were identified from the content analysis, with "identification of RID" (41%) and "defining and classifying deficiencies" (30%) being the most frequent. In the CBME article sub-analysis, the most frequent themes were the need to identify RID (64%), improving assessment tools (45%), and the roles and responsibilities of players involved in remediation (27%). Almost half of the CBME articles were published in 2016-2017. CONCLUSIONS Although CBME programs have been implemented for many years, articles have only recently begun specifically addressing RID within a competency framework. Much work is needed to describe the sequenced progression, tailored learning experiences, and competency-focused instruction. Finally, future research should focus on the outcomes of remediation in CBME programs.
Affiliation(s)
- Jonathan Pirie
- Department of Pediatrics, Faculty of Medicine, University of Toronto, Toronto, Canada
- Paediatric Emergency Medicine, The Hospital for Sick Children, Toronto, Canada
- Lisa St. Amant
- Postgraduate Medical Education, Faculty of Medicine, University of Toronto, Toronto, Canada
- Susan Glover Takahashi
- Department of Family and Community Medicine, Faculty of Medicine, Integrated Senior Scholar – Centre for Faculty Development and Postgraduate Medical Education, University of Toronto, Toronto, Canada
4
Flentje AO, Caturegli I, Kavic SM. Practice Makes Perfect: Introducing a Question Bank for ABSITE Preparation Improves Program Performance. Journal of Surgical Education 2020; 77:54-60. [PMID: 31526642] [DOI: 10.1016/j.jsurg.2019.09.005]
Abstract
BACKGROUND The American Board of Surgery In-Training Examination (ABSITE) is an important predictor of passing the Qualifying Examination and a determinant of fellowship competitiveness. OBJECTIVE To study the impact of providing program-wide access to a commercially available question bank for ABSITE preparation. STUDY DESIGN The surgery residency program purchased access to the TrueLearn question bank in 2018. A paired-sample t test compared the 2018 ABSITE percentage and percentile scores, obtained before question bank access, with the 2019 ABSITE scores. A simple linear regression analysis was calculated to predict improvement in percentage scores from 2018 to 2019 based on the total number of practice questions completed as well as the number of correct practice questions. Data were analyzed using SPSS. RESULTS Among residents utilizing practice questions with serial exam scores, the individual resident ABSITE percentage of correct questions showed a statistically significant improvement after introduction of the question bank, from 2018 (mean = 68.7, standard deviation = 7.3) to 2019 (mean = 72.2, standard deviation = 7.2; t(35) = -4.529, p < 0.001). A statistically significant regression equation, both linear (F(1,33) = 6.274, p = 0.017) and logarithmic (F(1,33) = 7.405, p = 0.01), was found, with an R2 of 0.160 and 0.183, respectively, for the total number of practice questions completed, signifying that more completed practice questions correlated with a greater improvement in ABSITE percentage score. Residents' ABSITE percentage scores increased by 3 ± 1 percentage points for each 100 practice questions completed from 2018 to 2019 (Figure 1). A significant regression equation was also found for improvement in percentage score among all residents (F(1,33) = 8.211, p = 0.007), with an R2 of 0.199, for the number of correct practice questions completed. CONCLUSION Use of a commercial question bank improved overall ABSITE scores. More questions answered translated into improved performance. Percent correct on the practice questions also correlated strongly with performance. Programs seeking to improve scores may wish to provide access to a question bank.
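The analysis described here, a paired-sample t test on serial scores plus a linear regression of score improvement on practice-question volume, can be sketched in a few lines. The data below are invented for illustration only; they are not the study's values:

```python
import numpy as np
from scipy import stats

# Hypothetical paired ABSITE percent-correct scores for the same six
# residents before (2018) and after (2019) question-bank access.
scores_2018 = np.array([62.0, 66.0, 68.5, 71.0, 70.5, 73.0])
scores_2019 = np.array([63.0, 68.0, 71.5, 74.5, 74.5, 77.5])

# Paired-sample t test on the serial scores.
t_stat, p_value = stats.ttest_rel(scores_2019, scores_2018)

# Simple linear regression: improvement as a function of the number
# of practice questions each resident completed (also hypothetical).
questions = np.array([250, 400, 600, 700, 800, 900])
improvement = scores_2019 - scores_2018
res = stats.linregress(questions, improvement)

print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
print(f"slope = {res.slope:.5f} points/question, R^2 = {res.rvalue**2:.3f}")
```

A positive slope here corresponds to the study's finding that more completed questions correlated with greater score improvement.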
Affiliation(s)
- Stephen M Kavic
- University of Maryland School of Medicine, Baltimore, Maryland
5
McCall N, Umuhoza C, O’Callahan C, Rogo T, Stafford D, Kanyamuhunga A, Cartledge PT. Measuring change in knowledge acquisition of Rwandan residents: using the American Board of Pediatrics International In-Training Examination (I-ITE) as an independent tool to monitor individual and departmental improvements during the Human Resources for Health program: an observational study. BMC Medical Education 2019; 19:217. [PMID: 31208418] [PMCID: PMC6580544] [DOI: 10.1186/s12909-019-1617-8]
Abstract
BACKGROUND Rwanda is the only African country to use the pediatric International In-Training Examination (I-ITE). The objectives of this study were to use I-ITE scores to outline the baseline level of knowledge of Rwandan residents entering the pediatric residency and the trends in knowledge acquisition from 2012 to 2018, during the Human Resources for Health (HRH) Program, an education partnership between the Rwanda Ministry of Health and a consortium of US universities. METHODS A retrospective descriptive analysis was performed of I-ITE exam scores taken by all Rwandan pediatric residents for five of the six academic years of the study period. Individual resident scores were weighted using the non-Rwandan I-ITE sites to minimise confounding from annual variations in exam difficulty. Statistical analysis included descriptive statistics with ANOVA to compare variation in annual mean scores. RESULTS Eighty-four residents completed 213 I-ITE exam sittings over the five exam cycles. The mean weighted I-ITE score of all residents increased from 34% in 2013 to 49% in 2018 (p < 0.001). The 32-point gap between the mean US-ITE and Rwandan I-ITE scores in 2012-2013 was reduced to a 16-point gap in 2017-2018. First-year resident (PG1) scores, which likely reflect the knowledge level of undergraduate medical students entering the residency program, increased from 34.8% to 44.3% (p = 0.002) between 2013 and 2018. CONCLUSIONS The I-ITE is an independent, robust tool that measures both learners and the institutional factors supporting residents. This is the first study to demonstrate that the I-ITE can be used to monitor resident knowledge acquisition in resource-limited settings, where assessment of resident knowledge can be a major challenge facing the academic medicine community. The significant increase in I-ITE scores between 2012 and 2018 reflects the substantial curricular reorganisation accomplished through collaboration between Rwandan and US embedded faculty, and supports the theory that programs such as HRH are highly effective at improving the quality of residency programs and undergraduate medical education.
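The abstract's weighting step, rescaling each year's scores against the non-Rwandan reference sites to remove annual variation in exam difficulty, could plausibly look like the sketch below. The formula, reference means, and baseline are all assumptions for illustration; the paper does not spell out its exact weighting method here:

```python
# One plausible difficulty adjustment in the spirit of the abstract:
# scale each year's raw percent score by that year's reference-cohort
# mean so an unusually hard or easy exam does not distort the trend.
# The reference means and baseline below are hypothetical.
reference_mean = {2013: 58.0, 2018: 52.0}   # non-Rwandan I-ITE means, assumed
baseline = 55.0                             # common scale to anchor all years

def weight_score(raw, year):
    """Rescale a raw percent score against the reference cohort mean."""
    return raw * baseline / reference_mean[year]

print(weight_score(34.0, 2013))  # a 2013 score, deflated (easier exam year)
print(weight_score(49.0, 2018))  # a 2018 score, inflated (harder exam year)
```

The point of any such adjustment is that year-over-year comparisons then reflect resident knowledge rather than exam difficulty.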
Affiliation(s)
- Natalie McCall
- Yale University Rwanda Human Resources for Health Program, Department of Paediatrics, Centre Hospitalier Universitaire de Kigali (CHUK), PO Box 655, Kigali, Rwanda
- Christian Umuhoza
- Department of Paediatrics, College of Medicine and Health Sciences, University of Rwanda, Kigali, Rwanda
- Tanya Rogo
- Icahn School of Medicine at Mount Sinai, One Gustave L. Levy Place, New York, NY 10029-6574 USA
- Diane Stafford
- Lucile Packard Children’s Hospital, Stanford University, 291 Campus Drive, Li Ka Shing Building, Stanford, CA 94305-5101 USA
- Aimable Kanyamuhunga
- Department of Paediatrics, College of Medicine and Health Sciences, University of Rwanda, Kigali, Rwanda
- Peter T. Cartledge
- Yale University Rwanda Human Resources for Health Program, Department of Paediatrics, Centre Hospitalier Universitaire de Kigali (CHUK), PO Box 655, Kigali, Rwanda
6
Kantar RS, Wise E, Morales D, Harris DG, Kidd-Romero S, Kavic S. The American Board Style Practice In-Training Examination as a Predictor of Performance on the American Board of Surgery In-Training Examination. Journal of Surgical Education 2018; 75:895-900. [PMID: 29396273] [DOI: 10.1016/j.jsurg.2017.12.010]
Abstract
BACKGROUND The American Board of Surgery In-Training Examination (ABSITE) is an annual 250-question, multiple-choice test that assesses residents' surgical knowledge in preparation for board examinations. At our program, we developed a Surgical Council on Resident Education-based American Board Style Practice In-Training Examination: the ABSPITE. The 40-question examination was designed to help with test preparation. The purpose of this study was to evaluate the ABSPITE's predictive value for ABSITE performance. METHODS From 2013 to 2016, the ABSPITE was administered to residents at our program. Performances (N = 134) were graded on a standardized scale to determine resident percent and percentile performance, then compared to average ABSITE performance. RESULTS Combined analysis showed a statistically significant positive correlation between average ABSITE and ABSPITE percentages and percentiles. This held true when categorical and preliminary residents were compared. When stratified by resident PGY level, the same results were seen for PGY 1 and PGY 2 residents, but correlations failed to reach statistical significance for higher training levels. CONCLUSIONS The practice ABSPITE examination strongly correlates with ABSITE performance among junior residents at our program and may be a valuable tool to predict ABSITE performance and guide review efforts.
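The correlation analysis at the heart of this study, pairing each resident's practice-exam score with the subsequent ABSITE score, amounts to a simple Pearson correlation. The percentile values below are invented for illustration, not the study's data:

```python
import numpy as np
from scipy import stats

# Hypothetical paired percentile scores: a 40-question practice exam
# (ABSPITE) versus the subsequent ABSITE, one pair per resident.
abspite = np.array([25, 40, 55, 60, 72, 85, 90])
absite  = np.array([30, 38, 50, 65, 70, 80, 92])

# Pearson correlation: a positive, significant r would support using
# the practice exam to flag residents at risk on the real exam.
r, p = stats.pearsonr(abspite, absite)
print(f"r = {r:.2f}, p = {p:.4f}")
```

In practice one would also stratify by PGY level, as the study did, since the correlation weakened for senior residents.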
Affiliation(s)
- Rami S Kantar
- Department of Surgery, The University of Maryland Medical System, Baltimore, Maryland
- Eric Wise
- Department of Surgery, The University of Maryland Medical System, Baltimore, Maryland
- David Morales
- Department of Surgery, The University of Maryland Medical System, Baltimore, Maryland
- Donald G Harris
- Department of Surgery, The University of Maryland Medical System, Baltimore, Maryland
- Sarah Kidd-Romero
- Department of Surgery, The University of Maryland Medical System, Baltimore, Maryland
- Stephen Kavic
- The University of Maryland School of Medicine, Baltimore, Maryland
7
Performance on a Surgical In-Training Examination Varies by Training Year and Pathway. Plast Reconstr Surg 2017; 138:358e-364e. [PMID: 27465196] [DOI: 10.1097/prs.0000000000002397]
Abstract
BACKGROUND Few studies in surgery have addressed medical knowledge competency training as defined by the Accreditation Council for Graduate Medical Education. As in-training examinations are ubiquitous educational tools for surgical residents in the United States, insights into examination performance may help fill this void. The purpose of this study was to determine the relationship between In-Service Examination performance and training characteristics in plastic surgery. METHODS This retrospective cohort study reviewed performance data for the Plastic Surgery In-Service Training Examination for the years 2012 to 2015. Comparisons were made both within and between training pathways by means of Kruskal-Wallis and Mann-Whitney U tests. RESULTS Data were available for 1367 independent (37.9 percent) and 2240 integrated residents (62.1 percent). Among integrated residents, performance increased with additional years of training (p < 0.001), but no difference existed between postgraduate year-5 and postgraduate year-6 residents (p > 0.05). Similarly, independent resident examination performance increased by year of training (p < 0.001), with no difference between postgraduate year-2 and postgraduate year-3 residents (p > 0.05). At each level of training (postgraduate years 4 to 6), integrated residents outperformed their independent resident colleagues (postgraduate years 1 to 3) (p < 0.001). CONCLUSIONS Performance on the Plastic Surgery In-Service Training Examination increases during residency, with integrated residents outperforming independent residents. These findings may have implications for medical knowledge competency training as defined by the Accreditation Council for Graduate Medical Education.
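The abstract's comparisons, Kruskal-Wallis across training years and Mann-Whitney U between two groups, can be sketched as below. The score values and group sizes are invented for illustration; the study's real cohorts were far larger:

```python
from scipy import stats

# Hypothetical in-training exam scores grouped by postgraduate year;
# the numbers are illustrative only.
pgy4 = [310, 320, 330, 340]
pgy5 = [350, 360, 370, 380]
pgy6 = [355, 365, 375, 385]

# Kruskal-Wallis: does performance differ across the training years?
h, p_kw = stats.kruskal(pgy4, pgy5, pgy6)

# Mann-Whitney U: pairwise comparison between two adjacent years,
# mirroring the study's finding of no PGY5-vs-PGY6 difference.
u, p_mw = stats.mannwhitneyu(pgy5, pgy6, alternative="two-sided")
print(f"Kruskal-Wallis H = {h:.2f} (p = {p_kw:.3f})")
print(f"Mann-Whitney U = {u:.1f} (p = {p_mw:.3f})")
```

These nonparametric tests suit examination scores because they make no normality assumption about the score distributions.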
8
Silvestre J, Chang B, Serletti JM. Relevancy of an In-Service Examination for Core Knowledge Training in a Surgical Subspecialty. Journal of Surgical Education 2016; 73:305-310. [PMID: 26868315] [DOI: 10.1016/j.jsurg.2015.09.013]
Abstract
OBJECTIVE To facilitate knowledge acquisition during plastic surgery residency, we analyzed the breast curriculum on the Plastic Surgery In-Service Training Exam (PSITE). DESIGN Breast-related questions on 6 consecutive PSITEs were analyzed (2008-2013). Topics were categorized by the content outline for the American Board of Plastic Surgery written board examination. Question vignettes were classified by taxonomy and clinical setting. References for correct answer choices were categorized by source and publication lag. RESULTS A total of 136 breast-related questions were analyzed (136/1174, 12%). Questions tended to appear more in the Breast and Cosmetic (75%) section than the Comprehensive (25%) section (p < 0.001). Most question vignettes were written in a clinical setting (64%, p < 0.001). Question taxonomy was evenly distributed among recall (34%), interpretation (28%), and decision-making (37%, p > 0.05). Only 6% of questions required photographic evaluation. Breast-related topics focused on esthetic problems (35%), traumatic deformities (22%), and tumors (21%). Answer references comprised 293 citations to 63 unique journals published a median of 6 years before PSITE administration. Plastic and Reconstructive Surgery (57%) was the most cited journal (p < 0.001) and Surgery of the Breast: Principles and Art by Spear was the most referenced textbook (22%). CONCLUSIONS The PSITE affords a curriculum that reflects breast-related topics on the American Board of Plastic Surgery written board examination. These data may optimize knowledge acquisition in esthetic and reconstructive breast surgery.
Affiliation(s)
- Jason Silvestre
- The Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania
- Benjamin Chang
- The Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania
- Joseph M Serletti
- The Perelman School of Medicine, University of Pennsylvania, Philadelphia, Pennsylvania
9
Buckley EJ, Markwell S, Farr D, Sanfey H, Mellinger J. Improving resident performance on standardized assessments of medical knowledge: a retrospective analysis of interventions correlated to American Board of Surgery In-Service Training Examination performance. Am J Surg 2015. [DOI: 10.1016/j.amjsurg.2015.06.004]