1. Schafer KR, Sood L, King CJ, Alexandraki I, Aronowitz P, Cohen M, Chretien K, Pahwa A, Shen E, Williams D, Hauer KE. The Grade Debate: Evidence, Knowledge Gaps, and Perspectives on Clerkship Assessment Across the UME to GME Continuum. Am J Med 2023;136:394-398. PMID: 36632923. DOI: 10.1016/j.amjmed.2023.01.001.
Affiliation(s)
- Katherine R Schafer
- Department of Internal Medicine, Wake Forest University School of Medicine, Winston-Salem, NC
- Lonika Sood
- Elson S. Floyd College of Medicine, Washington State University, Spokane, WA
- Christopher J King
- Division of Hospital Medicine, Department of Medicine, University of Colorado School of Medicine, Aurora, CO
- Margot Cohen
- Department of Medicine, Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA
- Amit Pahwa
- Johns Hopkins University School of Medicine, Baltimore, MD
- E Shen
- Department of Internal Medicine, Wake Forest University School of Medicine, Winston-Salem, NC
- Donna Williams
- Department of Internal Medicine, Wake Forest University School of Medicine, Winston-Salem, NC
2. Shirkhodaie C, Avila S, Seidel H, Gibbons RD, Arora VM, Farnan JM. The Association Between USMLE Step 2 Clinical Knowledge Scores and Residency Performance: A Systematic Review and Meta-Analysis. Acad Med 2023;98:264-273. PMID: 36512984. DOI: 10.1097/acm.0000000000005061.
Abstract
PURPOSE: With the change in Step 1 score reporting, Step 2 Clinical Knowledge (CK) may become a pivotal factor in resident selection. This systematic review and meta-analysis synthesizes existing observational studies that assess the relationship between Step 2 CK scores and measures of resident performance.
METHOD: In 2021, the authors searched the MEDLINE, Web of Science, and Scopus databases using terms related to Step 2 CK. Two researchers identified studies investigating the association between Step 2 CK and measures of resident performance, including studies that contained a bivariate analysis examining the association of Step 2 CK scores with an outcome of interest: in-training examination (ITE) scores, board certification examination scores, select Accreditation Council for Graduate Medical Education core competency assessments, overall resident performance evaluations, or other subjective measures of performance. For outcomes investigated by 3 or more studies, pooled effect sizes were estimated with random-effects models.
RESULTS: Among 1,355 potential studies, 68 met inclusion criteria and 43 could be pooled. There was a moderate positive correlation between Step 2 CK and ITE scores (0.52; 95% CI, 0.45-0.59; P < .01), both for nonsurgical specialties (0.59; 95% CI, 0.51-0.66; P < .01) and for surgical specialties (0.41; 95% CI, 0.33-0.48; P < .01). There was a very weak positive correlation between Step 2 CK scores and subjective measures of resident performance (0.19; 95% CI, 0.13-0.25; P < .01).
CONCLUSIONS: This study found that Step 2 CK scores have a statistically significant, moderate positive association with future examination scores and a statistically significant but weak positive correlation with subjective measures of resident performance. These findings are increasingly relevant as Step 2 CK scores will likely become more important in resident selection.
Affiliation(s)
- Camron Shirkhodaie
- C. Shirkhodaie is a medical student, Pritzker School of Medicine, University of Chicago, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-4279-3251
- Santiago Avila
- S. Avila is a medical student, Pritzker School of Medicine, University of Chicago, Chicago, Illinois; ORCID: https://orcid.org/0000-0003-3633-4304
- Henry Seidel
- H. Seidel is a medical student, Pritzker School of Medicine, University of Chicago, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-7364-1365
- Robert D Gibbons
- R.D. Gibbons is professor, Center for Health Statistics and Departments of Medicine and Public Health Sciences, University of Chicago, Chicago, Illinois
- Vineet M Arora
- V.M. Arora is professor, Department of Medicine, University of Chicago Medicine, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-4745-7599
- Jeanne M Farnan
- J.M. Farnan is professor, Department of Medicine, University of Chicago Medicine, Chicago, Illinois; ORCID: https://orcid.org/0000-0002-1138-9416
3. Jacobparayil A, Ali H, Pomeroy B, Baronia R, Chavez M, Ibrahim Y. Predictors of Performance on the United States Medical Licensing Examination Step 2 Clinical Knowledge: A Systematic Literature Review. Cureus 2022;14:e22280. PMID: 35350504. PMCID: PMC8933259. DOI: 10.7759/cureus.22280.
4. Dougherty PJ. CORR® Curriculum-Orthopaedic Education: Changing USMLE Step 1 Scores to Pass/Fail Removes an Objective Measure of Medical Knowledge. Clin Orthop Relat Res 2021;479:1194-1196. PMID: 33944805. PMCID: PMC8133170. DOI: 10.1097/corr.0000000000001765.
Affiliation(s)
- Paul J Dougherty
- P. J. Dougherty, Professor and Chairman, Department of Orthopaedic Surgery, University of Florida, Jacksonville, FL, USA
5. Conner B. Drug Calculations in Veterinary Medical Education-Where Are We? J Vet Med Educ 2021;48:252-255. PMID: 32412369. DOI: 10.3138/jvme.2019-0118.
Abstract
Veterinary studies tracking medical errors and their underlying causes are lacking. In human health care, the importance of drug calculation errors in patient safety is well documented. As many as 25% of all medical errors in people are reportedly drug errors, and as many as 14% of those can be attributed to poor drug calculation skills among doctors and nurses. Assessment of the math and analytical skills needed to perform drug calculations accurately is not standardized in veterinary medical education, and there is potential for a significant deficit. The purposes of this "Challenges and Issues" article are to briefly discuss the potential impact of poor drug calculation skills on veterinary patients; share one instructor's experience incorporating drug calculations into a veterinary curriculum; and promote further discussion and research that might yield more insight into the assessment and delivery of drug calculation education in veterinary medicine.
6. Ingram MA, Pearman JL, Estrada CA, Zinski A, Williams WL. Are We Measuring What Matters? How Student and Clerkship Characteristics Influence Clinical Grading. Acad Med 2021;96:241-248. PMID: 32701555. DOI: 10.1097/acm.0000000000003616.
Abstract
PURPOSE: Given the growing emphasis placed on clerkship performance in residency selection, clinical evaluation and its grading implications are critically important. The authors therefore conducted this study to determine which evaluation components best predict a clinical honors recommendation across 3 core clerkships.
METHOD: Student evaluation data were collected during academic years 2015-2017 from the third-year internal medicine (IM), pediatrics, and surgery clerkships at the University of Alabama at Birmingham School of Medicine. The authors used factor analysis to examine 12 evaluation components (12 items) and applied multilevel logistic regression to correlate evaluation components with a clinical honors recommendation.
RESULTS: Of 3,947 completed evaluations, 1,508 (38%) recommended clinical honors. The top item predicting a clinical honors recommendation was clinical reasoning skills for IM (odds ratio [OR] 2.8; 95% confidence interval [CI], 1.9 to 4.2; P < .001), presentation skills for surgery (OR 2.6; 95% CI, 1.6 to 4.2; P < .001), and knowledge application for pediatrics (OR 4.8; 95% CI, 2.8 to 8.2; P < .001). Students who spent more time with their evaluators were more likely to receive clinical honors (P < .001), and residents were more likely than faculty to recommend clinical honors (P < .001). Of the top 5 evaluation items associated with clinical honors, 4 composed a single factor across all clerkships: clinical reasoning, knowledge application, record keeping, and presentation skills.
CONCLUSIONS: The 4 characteristics that best predicted a clinical honors recommendation in all disciplines (clinical reasoning, knowledge application, record keeping, and presentation skills) correspond with traditional definitions of clinical competence. Structural components, such as contact time with evaluators, also correlated with a clinical honors recommendation. These findings provide empiric insight into the determination of clinical honors and highlight the need for heightened attention to structural components of clerkships and increased scrutiny of evaluation rubrics.
Affiliation(s)
- Mary A Ingram
- M.A. Ingram is pediatrics intern, Children's of Alabama, University of Alabama at Birmingham, Birmingham, Alabama
- Joseph L Pearman
- J.L. Pearman is internal medicine intern, University of California, Davis, Sacramento, California; ORCID: http://orcid.org/0000-0001-5780-3689
- Carlos A Estrada
- C.A. Estrada is staff physician, Birmingham Veterans Affairs Medical Center, and professor of medicine, Department of Medicine, University of Alabama at Birmingham, Birmingham, Alabama; ORCID: http://orcid.org/0000-0001-6262-7421
- Anne Zinski
- A. Zinski is assistant professor, Department of Medical Education, School of Medicine, University of Alabama at Birmingham, Birmingham, Alabama; ORCID: http://orcid.org/0000-0003-0414-248X
- Winter L Williams
- W.L. Williams is clerkship codirector and assistant professor of medicine, Department of Medicine, University of Alabama at Birmingham, and staff physician at the Birmingham Veterans Affairs Medical Center, Birmingham, Alabama; ORCID: http://orcid.org/0000-0002-4015-9409
7. Peterson LE, Boulet JR, Clauser B. Associations Between Medical Education Assessments and American Board of Family Medicine Certification Examination Score and Failure to Obtain Certification. Acad Med 2020;95:1396-1403. PMID: 32271228. DOI: 10.1097/acm.0000000000003344.
Abstract
PURPOSE: Family medicine residency programs can be cited for low pass or take rates on the American Board of Family Medicine (ABFM) certification examination, yet the relationships among standardized medical education assessments, performance on board certification examinations, and eventual board certification have not been comprehensively studied. The objective of this study was to evaluate the associations of all required standardized examinations in medical education with ABFM certification examination scores and eventual ABFM certification.
METHOD: All graduates of U.S. MD-granting family medicine residency programs from 2008 to 2012 were included. Data on ABFM certification examination score, ABFM certification status (as of December 31, 2014), Medical College Admission Test (MCAT) section scores, undergraduate grade point average, all United States Medical Licensing Examination (USMLE) Step scores, and all ABFM in-training examination scores were linked. Nested logistic and linear regression models, controlling for clustering by residency program, determined associations between assessments and both certification examination scores and board certification status. Because many international medical graduates (IMGs) do not take the MCAT, separate models were run for U.S. medical graduates (USMGs) and IMGs.
RESULTS: The study sample comprised 15,902 family medicine graduates, of whom 92.1% (14,648/15,902) obtained board certification. In models for both IMGs and USMGs, the addition of more recent assessments weakened the associations of earlier assessments. USMLE Step 2 Clinical Knowledge was predictive of certification examination scores and certification status in all models in which it was included.
CONCLUSIONS: For family medicine residents, more recent assessments generally have stronger associations with board certification score and status than earlier assessments. Relying solely on medical school admissions measures (grade point average and MCAT) and licensure (USMLE) scores for resident selection may not adequately predict eventual board certification.
Affiliation(s)
- Lars E Peterson
- L.E. Peterson is vice president of research, American Board of Family Medicine, and associate professor, Department of Family and Community Medicine, University of Kentucky, Lexington, Kentucky; ORCID: http://orcid.org/0000-0003-4853-3108
- John R Boulet
- J.R. Boulet is vice president, Research and Data Resources, Foundation for Advancement of International Medical Education and Research, Philadelphia, Pennsylvania
- Brian Clauser
- B. Clauser is vice president, Center for Advanced Assessment, National Board of Medical Examiners, Philadelphia, Pennsylvania
8. Wilson MA, Odem MA, Walters T, DePass AL, Bean AJ. A Model for Holistic Review in Graduate Admissions That Decouples the GRE from Race, Ethnicity, and Gender. CBE Life Sci Educ 2019;18:ar7. PMID: 30735085. PMCID: PMC6757224. DOI: 10.1187/cbe.18-06-0103.
Abstract
Graduate schools around the United States are working to improve access to science, technology, engineering, and mathematics (STEM) in a manner that reflects local and national demographics. The admissions process has been a focus of examination, as it is a potential bottleneck for entry into STEM. Standardized tests are widely used in the decision-making process; thus, we examined the Graduate Record Examination (GRE) under two models of applicant review, metrics-based and holistic, to understand whether it affected applicant demographics at The University of Texas MD Anderson Cancer Center UTHealth Graduate School of Biomedical Sciences. We measured the relationship between the GRE scores of doctoral applicants and admissions committee scores. Metrics-based review excluded twice as many applicants who identified as a historically underrepresented minority compared with their peers. Efforts to implement holistic applicant review yielded an unexpected result: the GRE could be used as a tool in a manner that did not reflect its reported bias. Applicant assessments in our holistic review process were independent of gender, race, and citizenship status. Importantly, our recommendations provide a blueprint for institutions that want to implement a data-driven approach to assessing applicants in a manner that uses the GRE as part of the review process.
Affiliation(s)
- Marenda A. Wilson
- Deans’ Office, The University of Texas MD Anderson Cancer Center UTHealth Graduate School of Biomedical Sciences, Houston, TX 77030
- Graduate College, Rush University, Chicago, IL 60612
- Max A. Odem
- Deans’ Office, The University of Texas MD Anderson Cancer Center UTHealth Graduate School of Biomedical Sciences, Houston, TX 77030
- Taylor Walters
- College of Arts and Sciences, Oberlin College and Conservatory, Oberlin, OH 44074
- Andrew J. Bean
- Deans’ Office, The University of Texas MD Anderson Cancer Center UTHealth Graduate School of Biomedical Sciences, Houston, TX 77030
- Graduate College, Rush University, Chicago, IL 60612
- Department of Neurobiology and Anatomy, Cell Biology and Biochemistry, McGovern Medical School at The University of Texas Health Science Center at Houston, Houston, TX 77030
- Department of Pediatrics, The University of Texas MD Anderson Cancer Center, Houston, TX 77030
9. Predicting American Board of Emergency Medicine Qualifying Examination Passage Using United States Medical Licensing Examination Step Scores. Ochsner J 2018;18:204-208. PMID: 30275782. DOI: 10.31486/toj.17.0101.
Abstract
BACKGROUND: The objective of the current study was to determine whether emergency medicine residents' United States Medical Licensing Examination (USMLE) scores are significantly associated with first-attempt passage of the American Board of Emergency Medicine (ABEM) qualifying (written) examination. We hypothesized that USMLE Step 2 Clinical Knowledge (CK) scores would be useful in predicting which residents pass the ABEM qualifying examination on their first attempt.
METHODS: For this retrospective cohort study, we examined the data of residents who successfully completed training at two emergency medicine residency programs between 2002 and 2013. Because scores on the USMLE Step examinations varied greatly across years, we obtained means and standard deviations from the National Board of Medical Examiners and subtracted the mean score for the year each resident took the examination from the resident's examination score, creating centered Step 1 and centered Step 2 CK scores.
RESULTS: A multivariate logistic regression analysis indicated that centered Step 2 CK scores predicted the odds of passing the ABEM qualifying examination (odds ratio = 1.05; 95% confidence interval, 1.02 to 1.08; P < 0.001). A Step 2 CK score cutoff of 7 points below the mean yielded 64% sensitivity and 81% specificity for predicting first-attempt passage of the ABEM written examination.
CONCLUSION: Program directors and selection committees may wish to consider whether applicants' Step 2 CK scores are near the national average when making ranking decisions, as this variable is highly predictive of passing the ABEM qualifying examination on the initial attempt.
10. Dong T, Gilliland WR, Cruess D, Hutchinson J, Morres L, Curtis J, Hewitt-Clarke GS, Durning SJ. A Longitudinal Study of Commonly Used Admissions Measures and Disenrollment from Medical School and Graduate Medical Education Probation or Termination from Training. Mil Med 2018;183:e680-e684. PMID: 29718290. DOI: 10.1093/milmed/usy069.
Affiliation(s)
- Ting Dong
- Department of Medicine, Uniformed Services University of the Health Sciences, 4301 Jones Bridge Rd., Bethesda, MD
- William R Gilliland
- Department of Medicine, Uniformed Services University of the Health Sciences, 4301 Jones Bridge Rd., Bethesda, MD
- David Cruess
- Department of Preventive Medicine and Biostatistics, Uniformed Services University of the Health Sciences, 4301 Jones Bridge Rd., Bethesda, MD
- Jeffrey Hutchinson
- Department of Pediatrics, Uniformed Services University of the Health Sciences, 4301 Jones Bridge Rd., Bethesda, MD
- Lisa Morres
- Department of Medicine, Uniformed Services University of the Health Sciences, 4301 Jones Bridge Rd., Bethesda, MD
- Jerri Curtis
- Department of Medicine, Uniformed Services University of the Health Sciences, 4301 Jones Bridge Rd., Bethesda, MD
- Gail-Selina Hewitt-Clarke
- Department of Medicine, Uniformed Services University of the Health Sciences, 4301 Jones Bridge Rd., Bethesda, MD
- Steven J Durning
- Department of Medicine, Uniformed Services University of the Health Sciences, 4301 Jones Bridge Rd., Bethesda, MD
11. Surry LT, Torre D, Durning SJ. Exploring examinee behaviours as validity evidence for multiple-choice question examinations. Med Educ 2017;51:1075-1085. PMID: 28758233. DOI: 10.1111/medu.13367.
Abstract
CONTEXT: Clinical-vignette multiple-choice question (MCQ) examinations are used widely in medical education. Standardised MCQ examinations are used by licensure and certification bodies to award credentials that are meant to assure stakeholders of the quality of physicians. Such uses are based on the interpretation of MCQ examination performance as giving meaningful information about the quality of clinical reasoning. Several assumptions are foundational to these interpretations and uses of standardised MCQ examinations. This study explores the implicit assumption that the cognitive processes elicited by clinical-vignette MCQ items resemble the processes thought to occur in 'real-world' clinical reasoning, as theorised by dual-process theory.
METHODS: Fourteen participants (three medical students, five residents and six staff physicians) completed three sets of five timed MCQ items (15 in total) from the Medical Knowledge Self-Assessment Program (MKSAP). Upon answering a set of MCQs, each participant completed a retrospective think-aloud (TA) protocol. Using constant comparative analysis (CCA) methods sensitised by dual-process theory, we performed a qualitative thematic analysis.
RESULTS: Examinee behaviours fell into three categories: clinical reasoning behaviours, test-taking behaviours and reactions to the MCQ. Consistent with dual-process theory, statements about clinical reasoning behaviours were divided into two sub-categories: analytical reasoning and non-analytical reasoning. Each of these categories included several themes.
CONCLUSIONS: Our study provides some validity evidence that test-takers' descriptions of their cognitive processes during completion of high-quality clinical-vignette MCQs align with the processes expected in real-world clinical reasoning. This supports one of the assumptions important for interpreting MCQ examination scores as meaningful measures of clinical reasoning. Our observations also suggest that MCQs elicit other cognitive processes, including certain test-taking behaviours, that seem 'inauthentic' to real-world clinical reasoning. Further research is needed to explore whether similar themes arise in other contexts (e.g. simulated patient encounters) and how observed behaviours relate to performance on MCQ-based assessments.
Affiliation(s)
- Luke T Surry
- Department of Medicine, Uniformed Services University of the Health Sciences, Bethesda, Maryland, USA
- Dario Torre
- Department of Medicine, Uniformed Services University of the Health Sciences, Bethesda, Maryland, USA
- Steven J Durning
- Department of Medicine, Uniformed Services University of the Health Sciences, Bethesda, Maryland, USA
12. Kondrashov P, McDaniel DJ, Jordan RM. Premedical anatomy experience and student performance in medical gross anatomy. Clin Anat 2017;30:303-311. DOI: 10.1002/ca.22846.
Affiliation(s)
- Peter Kondrashov
- Department of Anatomy, Kirksville College of Osteopathic Medicine, A.T. Still University, Kirksville, Missouri
- Dalton J. McDaniel
- Department of Anatomy, Kirksville College of Osteopathic Medicine, A.T. Still University, Kirksville, Missouri
13. Durning SJ, Dong T, LaRochelle JL, Artino AR, Gilliland WR, DeZee KJ, Saguil A, Cruess DF, Picho K, McManigle JE. The Long-Term Career Outcome Study: Lessons Learned and Implications for Educational Practice. Mil Med 2015;180:164-170. PMID: 25850148. DOI: 10.7205/milmed-d-14-00574.
Abstract
The work of the Long-Term Career Outcome Study has been a program of scholarship spanning 10 years. Borrowing from the established quality assurance literature, the Long-Term Career Outcome Study team has organized its scholarship into three phases: before medical school, during medical school, and after medical school. The purpose of this commentary is to address two fundamental questions: (1) what has been learned? and (2) how does this knowledge translate to educational practice and policy now and into the future? We believe that answers to these questions are relevant not only to our institution but also to other educational institutions seeking to provide high-quality health professions education.
Affiliation(s)
- Steven J. Durning
- Department of Medicine, Uniformed Services University of the Health Sciences, 4301 Jones Bridge Road, Bethesda, MD 20814
- Ting Dong
- Department of Medicine, Uniformed Services University of the Health Sciences, 4301 Jones Bridge Road, Bethesda, MD 20814
- Jeffrey L. LaRochelle
- Department of Medicine, Uniformed Services University of the Health Sciences, 4301 Jones Bridge Road, Bethesda, MD 20814
- Anthony R. Artino
- Department of Medicine, Uniformed Services University of the Health Sciences, 4301 Jones Bridge Road, Bethesda, MD 20814
- William R. Gilliland
- Department of Medicine, Uniformed Services University of the Health Sciences, 4301 Jones Bridge Road, Bethesda, MD 20814
- Kent J. DeZee
- Department of Medicine, Uniformed Services University of the Health Sciences, 4301 Jones Bridge Road, Bethesda, MD 20814
- Aaron Saguil
- Department of Medicine, Uniformed Services University of the Health Sciences, 4301 Jones Bridge Road, Bethesda, MD 20814
- David F. Cruess
- Department of Preventive Medicine and Biometrics, Uniformed Services University of the Health Sciences, 4301 Jones Bridge Road, Bethesda, MD 20814
- Katherine Picho
- Department of Medicine, Uniformed Services University of the Health Sciences, 4301 Jones Bridge Road, Bethesda, MD 20814
- John E. McManigle
- Department of Military and Emergency Medicine, Uniformed Services University of the Health Sciences, 4301 Jones Bridge Road, Bethesda, MD 20814