1.
Soppe AN, Hauser JM, Jacobson AR, McElrath AD. Implementation of an Un-Pairing Passport to Improve the Transition From Intern to Resident During a Critical Period of Anesthesiology Residency Training. J Educ Perioper Med 2023; 25:E719. [PMID: 38162707] [PMCID: PMC10753154] [DOI: 10.46374/volxxv_issue4_soppe]
Abstract
Background The transition from intern year to the first year of clinical anesthesiology residency (CA-1) is a challenging period for residents and their supervisors. Orientation methods and instructional material targeting this transition vary across U.S. residency programs. An un-pairing passport was implemented during the 2021-2022 transition to guide and provide expectations for interns, senior residents, and staff. The objective of this quality improvement project was to assess the effectiveness of the passport in improving the transition period and the overall preparedness of the new CA-1s. Methods We surveyed 3 groups (CA-1s, CA-2s/CA-3s, and staff anesthesiologists) 6 months after the completion of passport implementation to retrospectively assess the 2021-2022 CA-1 class's preparedness across 7 domains compared with those who transitioned before passport implementation. Mann-Whitney U statistics and median effect sizes were used to compare pre- and postintervention groups. Results Self-reported preparedness scores of the CA-1s were higher across all domains compared with the senior resident group (r = 0.328-0.548). Overall level of comfort and preparedness for the start of the CA-1 year was higher in the postintervention group (r = 0.162-0.514). Staff anesthesiologists' perceived preparedness of the residents was also higher across all domains for the postintervention group (r = 0.197-0.387). Conclusion The un-pairing passport improved residents' and staff anesthesiologists' subjective assessments of the readiness of new CA-1 residents after a critical transition in their training. Similar tools could be applied more broadly to other anesthesiology residency and fellowship programs as well as to subspecialty rotations within those programs.
Affiliation(s)
- Ashley N. Soppe, Joshua M. Hauser, Andrew R. Jacobson, Angela D. McElrath
- Ashley N. Soppe is a Staff Anesthesiologist in the Department of Anesthesiology at Tripler Army Medical Center, Honolulu, HI, and an Assistant Professor of Anesthesiology at the Uniformed Services University of the Health Sciences, Honolulu, HI. Joshua M. Hauser is a Senior Associate Consultant in the Department of Anesthesiology and Perioperative Medicine at the Mayo Clinic, Rochester, MN, and Assistant Professor of Anesthesiology at the Mayo Clinic College of Medicine and Science, Rochester, MN. Andrew R. Jacobson is a Staff Anesthesiologist in the Department of Anesthesiology at Brooke Army Medical Center, Fort Sam Houston, TX. Angela D. McElrath is a Pediatric and Adult Anesthesiologist in the Department of Anesthesiology at Brooke Army Medical Center, Fort Sam Houston, TX, and Assistant Professor of Anesthesiology at the Uniformed Services University of the Health Sciences, Fort Sam Houston, TX.
2.
Murphy EA, White K, Meltzer D, Martin SK. Developing hospitalist educators when teaching time is scarce: The Passport model as a professional development approach. J Hosp Med 2023; 18:860-864. [PMID: 36635876] [DOI: 10.1002/jhm.13042]
Affiliation(s)
- Elizabeth A Murphy, Kara White, David Meltzer, Shannon K Martin
- Department of Medicine, Section of Hospital Medicine, University of Chicago Medicine and Biological Sciences, Chicago, Illinois, USA
3.
Reeves PT, Min SB, Kolasinski NT. Development and Implementation of DIGEST: The Digital Interactive Gastroenterology Education Suite for Trainees. Mil Med 2021; 188:e963-e968. [PMID: 34791344] [DOI: 10.1093/milmed/usab446]
Abstract
INTRODUCTION Clinical clerkship curricula should provide rotating learners on subspecialty rotations with consistent exposure to specific topics geared toward the discipline of interest, such as pediatric gastroenterology (GI). We aim to describe our experience developing and implementing DIGEST, the Digital Interactive Gastroenterology Education Suite for Trainees: a novel online GI curriculum delivered to virtual, rotating learners during the coronavirus (COVID-19) pandemic stay-at-home order. MATERIALS AND METHODS A general needs assessment in 2019 identified a lack of standardized educational experience amongst the rotating learners on the pediatric GI service. The COVID-19 pandemic compelled us to transition our curriculum from our institution's secure share drive to Google Classroom. A program evaluation was undertaken and included learner responses to content and confidence questionnaires and a health care professions education (HPE) expert's response to a course quality assessment rubric. RESULTS Feasibility: the final DIGEST product was free of charge to create but incurred direct and indirect costs of time and training on behalf of the authors. Acceptance: all 7 eligible learners participated and responded to the questionnaires (100% response rate). Learners reported a superior learning experience and increased confidence with DIGEST. An HPE expert reported that the course design of DIGEST met or exceeded expectations in all categories. CONCLUSIONS DIGEST is a novel pediatric GI curriculum for rotating learners that could be rapidly deployed, or adapted, for a wide range of clinical disciplines within the Military Health System.
Affiliation(s)
- Patrick T Reeves, Steve B Min, Nathan T Kolasinski
- Department of Pediatrics, Walter Reed National Military Medical Center, Bethesda, MD 20814, USA; Department of Pediatrics, Uniformed Services University of the Health Sciences, Bethesda, MD, USA
4.
Zurca AD, Krawiec C, McKeone D, Solaiman AZ, Smith BM, Ceneviva GD. PICU Passport: Pilot study of a handheld resident curriculum. BMC Med Educ 2021; 21:281. [PMID: 34001109] [PMCID: PMC8130359] [DOI: 10.1186/s12909-021-02705-9]
Abstract
BACKGROUND To explore the impact of an educational tool designed to streamline resident learning during pediatric intensive care unit (PICU) rotations. METHODS Topics and procedures were chosen for inclusion based on national requirements for pediatric residents. Residents received a PICU Passport at the beginning of their rotations. PICU faculty were provided learning objectives for each topic. Residents and faculty were surveyed before and after starting use of the Passport. RESULTS Twenty-two residents pre-Passport and 38 residents post-Passport were compared. Residents were more satisfied with their educational experiences (27% vs. 79%; P < 0.001), were more likely to report that faculty targeted teaching towards knowledge gaps (5% vs. 63%; P < 0.001), and felt more empowered to ask faculty to discuss specific topics (27% vs. 76%; P = 0.002). The median number of teaching sessions increased from 3 to 10 (Z = 4.2; P < 0.001). Most residents (73%) felt the Passport helped them keep track of their learning and identify gaps in their knowledge. CONCLUSIONS The PICU Passport helps residents keep track of their learning and identify gaps in their knowledge. Passport use increases resident satisfaction with education during the PICU rotation and empowers residents to ask PICU faculty to address specific knowledge gaps.
Affiliation(s)
- Adrian D Zurca, Conrad Krawiec, Daniel McKeone, Adil Z Solaiman, Brandon M Smith, Gary D Ceneviva
- Department of Pediatrics, Penn State Hershey Children's Hospital, P.O. Box 850, 500 University Drive, Mail Code H085, Hershey, PA 17033, USA
- Adil Z Solaiman also: Division of General Academic Pediatrics, Nemours/Alfred I. duPont Hospital for Children, Hershey, USA
5.
Arnstead N. Feedback Frequency in Competence by Design: A Quality Improvement Initiative. J Grad Med Educ 2020; 12:46-50. [PMID: 32089793] [PMCID: PMC7012523] [DOI: 10.4300/jgme-d-19-00358.1]
Abstract
BACKGROUND Otolaryngology-head and neck surgery is in the first wave of residency training programs in Canada to adopt Competence by Design (CBD), a model of competency-based medical education. CBD is built on frequent, low-stakes assessments and requires an increase in the number of feedback interactions. The University of Toronto otolaryngology-head and neck surgery residents piloted the CBD model but were completing only 1 assessment every 4 weeks, which was insufficient to support CBD. OBJECTIVE This project aimed to increase assessment completion to once per resident per week using quality improvement methodology. METHODS Through stakeholder engagement activities, residents and faculty characterized barriers to assessment completion. Brief electronic assessment forms were completed by faculty on residents' personal mobile devices in face-to-face encounters, and the number completed per resident was tracked for 10 months during the 2016-2017 pilot year. Response to the intervention was analyzed using statistical process control charts. RESULTS The first bundled intervention, a rule set dictating which clinical instance should be assessed combined with a weekly reminder implemented for 10 weeks, was unsuccessful in increasing the frequency of assessments. The second intervention was a leaderboard, designed on an audit-and-feedback system, which sent weekly comparison e-mails of each resident's completion rate to all residents and the program director. The leaderboard demonstrated significant improvement from baseline over 10 weeks, increasing the assessment completion rate from 0.22 to 2.87 assessments per resident per week. CONCLUSIONS A resident-designed audit-and-feedback leaderboard system improved the frequency of CBD assessment completion.
6.
Mamtani M, Shofer FS, Sackeim A, Conlon L, Scott K, Mills AM. Feedback With Performance Metric Scorecards Improves Resident Satisfaction but Does Not Impact Clinical Performance. AEM Educ Train 2019; 3:323-330. [PMID: 31637349] [PMCID: PMC6795364] [DOI: 10.1002/aet2.10348]
Abstract
OBJECTIVES The Emergency Medicine Milestone Project, a framework for assessing competencies, has been used as a method of providing focused resident feedback. However, the emergency medicine milestones do not include specific objective data about resident clinical efficiency and productivity, and studies have shown that milestone-based feedback does not improve resident satisfaction with the feedback process. We examined whether providing performance metric reports to resident physicians improves their satisfaction with the feedback process and their clinical performance. METHODS We conducted a three-phase stepped-wedge randomized pilot study of emergency medicine residents at a single, urban academic site. In phase 1, all residents received traditional feedback; in phase 2, residents were randomized to receive traditional feedback (control group) or traditional feedback with performance metric reports (intervention group); and in phase 3, all residents received monthly performance metric reports and traditional feedback. To assess resident satisfaction with the feedback process, surveys using 6-point Likert scales were administered at each study phase and analyzed using two-sample t-tests. Analysis of variance in repeated measures was performed to compare the impact of feedback on resident clinical performance, specifically patient treatment time (PTT) and patient visits per hour. RESULTS Forty-one residents participated in the trial, of which 21 were randomized to the intervention group and 20 to the control group. Ninety percent of residents liked receiving the report, and 74% believed that it better prepared them for the expectations of becoming an attending physician. Additionally, residents randomized to the intervention group reported higher satisfaction (p = 0.01) with the quality of the feedback compared to residents in the control group. However, receiving performance metric reports, regardless of study phase or postgraduate year status, did not affect clinical performance, specifically PTT (183 minutes vs. 177 minutes, p = 0.34) or patient visits per hour (0.99 vs. 1.04, p = 0.46). CONCLUSIONS While feedback with performance metric reports did not improve resident clinical performance, resident physicians were more satisfied with the feedback process, and a majority of residents liked the reports and felt that they better prepared them to become attending physicians. Residency training programs could consider augmenting feedback with performance metric reports to aid in the transition from resident to attending physician.
Affiliation(s)
- Mira Mamtani, Frances S. Shofer, Alexander Sackeim, Lauren Conlon, Kevin Scott
- Department of Emergency Medicine, Hospital of the University of Pennsylvania, Philadelphia, PA
- Angela M. Mills
- Department of Emergency Medicine, Columbia University College of Physicians and Surgeons, New York, NY
7.
Connolly A, Goepfert A, Blanchard A, Buys E, Donnellan N, Amundsen CL, Galvin SL, Kenton K. myTIPreport and Training for Independent Practice: A Tool for Real-Time Workplace Feedback for Milestones and Procedural Skills. J Grad Med Educ 2018; 10:70-77. [PMID: 29467977] [PMCID: PMC5821011] [DOI: 10.4300/jgme-d-17-00137.1]
Abstract
BACKGROUND Few tools currently exist for effective, accessible delivery of real-time, workplace feedback in the clinical setting. OBJECTIVE We developed and implemented a real-time, web-based tool for performance-based feedback in the clinical environment. METHODS The tool (myTIPreport) was designed for performance-based feedback to learners on the Accreditation Council for Graduate Medical Education (ACGME) Milestones and procedural skills. "TIP" stands for "Training for Independent Practice." We implemented myTIPreport in obstetrics and gynecology (Ob-Gyn) and female pelvic medicine and reconstructive surgery (FPMRS) programs between November 2014 and May 2015. Residents, fellows, teachers, and program directors completed preimplementation and postimplementation surveys on their perceptions of feedback. RESULTS Preimplementation surveys were completed by 656 participants of a total of 980 learners and teachers in 19 programs (12 Ob-Gyn and 7 FPMRS). This represented 72% (273 of 378) of learners and 64% (383 of 602) of teachers. Seventy percent of participants (381 of 546) reported having their own individual processes for real-time feedback; the majority (79%, 340 of 430) described these processes as informal discussions. Over 6 months, one-third of teachers and two-thirds of learners used the myTIPreport tool a total of 4311 times. Milestone feedback was recorded 944 times, and procedural feedback was recorded 3367 times. Feedback addressed all ACGME Milestones and procedures programmed into myTIPreport. Most program directors reported that tool implementation was successful. CONCLUSIONS The majority of learners successfully received workplace feedback using myTIPreport. This web-based tool, incorporating procedures and ACGME Milestones, may be an important transition from other feedback formats.
8.
Gundle KR, Mickelson DT, Cherones A, Black J, Hanel DP. Rapid Web-Based Platform for Assessment of Orthopedic Surgery Patient Care Milestones: A 2-Year Validation. J Surg Educ 2017; 74:1116-1123. [PMID: 28529195] [DOI: 10.1016/j.jsurg.2017.05.001]
Abstract
OBJECTIVE To determine the validity, feasibility, and responsiveness of a new web-based platform for rapid milestone-based evaluations of orthopedic surgery residents. SETTING Single academic medical center, including a trauma center and pediatrics tertiary hospital. PARTICIPANTS Forty residents (PGY1-5) in an orthopedic residency program and their faculty evaluators. METHODS Residents and faculty were trained and supported in the use of a novel trainee-initiated web-based evaluation system. Residents were encouraged to use the system to track progress on patient care subcompetencies. Two years of prospectively collected data were reviewed from residents at an academic program. The primary outcome was Spearman's rank correlation between postgraduate year (PGY) and competency level achieved as a measure of validity. Secondary outcomes assessed feasibility, resident self-evaluation versus faculty evaluation, the distributions among subcompetencies, and responsiveness over time. RESULTS Between February 2014 and February 2016, 856 orthopedic surgery patient care subcompetency evaluations were completed (1.2 evaluations per day). Residents promptly requested feedback after a procedure (median = 0 days, interquartile range: 0-2), and faculty responded within 2 days in 51% of cases (median = 2 days, interquartile range: 0-13). The primary outcome showed a correlation between PGY and competency level (r = 0.78, p < 0.001), with significant differences in competency among PGYs (p < 0.001 by Kruskal-Wallis rank sum test). Self-evaluations by residents substantially agreed with faculty-assigned competency level (weighted Cohen's κ = 0.72, p < 0.001). Resident classes beginning the study as PGY1, 2, and 3 separately demonstrated gains in competency over time (Spearman's rank correlation 0.39, 0.60, 0.59, respectively, each p < 0.001). There was significant variance in the number of evaluations submitted per subcompetency (median = 43, range: 6-113) and competency level assigned (p < 0.01). CONCLUSIONS Rapid tracking of trainee competency with milestone-based evaluations in a learner-centered mobile platform demonstrated validity, feasibility, and responsiveness. Next Accreditation System-mandated data may be efficiently collected and used for trainee and program self-study.
Affiliation(s)
- Kenneth R Gundle
- Department of Orthopaedics & Rehabilitation, Oregon Health & Science University; Operative Care Division, Portland VA Medical Center, Portland, Oregon
- Dayne T Mickelson
- Department of Orthopaedic Surgery, Duke University, Durham, North Carolina
- Arien Cherones, Jason Black, Doug P Hanel
- Department of Orthopaedics and Sports Medicine, University of Washington, Seattle, Washington
9.
Robertson AC, Fowler LC. Medical Student Perceptions of Learner-Initiated Feedback Using a Mobile Web Application. J Med Educ Curric Dev 2017; 4:2382120517746384. [PMID: 29349345] [PMCID: PMC5736051] [DOI: 10.1177/2382120517746384]
Abstract
Feedback, especially timely, specific, and actionable feedback, frequently does not occur. Efforts to better understand methods to improve the effectiveness of feedback are an important area of educational research. This study represents preliminary work as part of a plan to investigate the perceptions of a student-driven system to request feedback from faculty using a mobile device and Web-based application. We hypothesize that medical students will perceive learner-initiated, timely feedback to be an essential component of clinical education. Furthermore, we predict that students will recognize the use of a mobile device and Web application to be an advantageous and effective method when requesting feedback from supervising physicians. Focus group data from 18 students enrolled in a 4-week anesthesia clerkship revealed the following themes: (1) students often have to solicit feedback, (2) timely feedback is perceived as being advantageous, (3) feedback from faculty is perceived to be more effective, (4) requesting feedback from faculty physicians poses challenges, (5) the decision to request feedback may be influenced by the student's clinical performance, and (6) using a mobile device and Web application may not guarantee timely feedback. Students perceived using a mobile Web-based application to initiate feedback from supervising physicians to be a valuable method of assessment. However, challenges and barriers were identified.
Affiliation(s)
- Amy C Robertson
- Department of Anesthesiology, Vanderbilt University Medical Center, Nashville, TN, USA
- Leslie C Fowler
- Department of Anesthesiology, Vanderbilt University School of Medicine, Nashville, TN, USA
10.
Bentley S, Hu K, Messman A, Moadel T, Khandelwal S, Streich H, Noelker J. Are All Competencies Equal in the Eyes of Residents? A Multicenter Study of Emergency Medicine Residents' Interest in Feedback. West J Emerg Med 2016; 18:76-81. [PMID: 28116012] [PMCID: PMC5226767] [DOI: 10.5811/westjem.2016.11.32626]
Abstract
Introduction Feedback, particularly real-time feedback, is critical to resident education. The emergency medicine (EM) milestones were developed in 2012 to enhance resident assessment, and many programs use them to provide focused resident feedback. The purpose of this study was to evaluate EM residents' level of interest in receiving real-time feedback on each of the 23 competencies/sub-competencies. Methods This was a multicenter cross-sectional study of EM residents. We surveyed participants on their level of interest in receiving real-time on-shift feedback on each of the 23 competencies/sub-competencies. Anonymous paper or computerized surveys were distributed to residents at three four-year training programs and three three-year training programs, with a total of 223 resident respondents. Residents rated their level of interest in each milestone on a six-point Likert-type response scale. We calculated the average level of interest for each of the 23 sub-competencies, for all 223 respondents and separately by postgraduate year (PGY) level of training. One-way analyses of variance were performed to determine if there were differences in ratings by level of training. Results The overall survey response rate across all institutions was 82%. Emergency stabilization had the highest mean rating (5.47/6), while technology had the lowest rating (3.24/6). However, we observed no differences between levels of training on any of the 23 competencies/sub-competencies. Conclusion Residents seem to ascribe much more value to receiving feedback on domains involving high-risk, challenging procedural skills than on low-risk technical and communication skills. Further studies are necessary to determine whether residents' perceived importance of competencies/sub-competencies needs to be considered when developing an assessment or feedback program based on these 23 EM competencies/sub-competencies.
Affiliation(s)
- Suzanne Bentley
- Icahn School of Medicine at Mount Sinai, Elmhurst Hospital Center, Department of Emergency Medicine, Department of Medical Education, New York, New York
- Kevin Hu
- Icahn School of Medicine at Mount Sinai, Department of Emergency Medicine, New York, New York
- Anne Messman
- Wayne State University School of Medicine, Department of Emergency Medicine, Detroit, Michigan
- Tiffany Moadel
- Yale School of Medicine, Department of Emergency Medicine, New Haven, Connecticut
- Sorabh Khandelwal
- The Ohio State University, Department of Emergency Medicine, Columbus, Ohio
- Heather Streich
- University of Virginia, Department of Emergency Medicine, Charlottesville, Virginia
- Joan Noelker
- Washington University in St. Louis, Department of Medicine, Division of Emergency Medicine, St. Louis, Missouri
11.
Nadir NA, Bentley S, Papanagnou D, Bajaj K, Rinnert S, Sinert R. Characteristics of Real-Time, Non-Critical Incident Debriefing Practices in the Emergency Department. West J Emerg Med 2016; 18:146-151. [PMID: 28116028] [PMCID: PMC5226751] [DOI: 10.5811/westjem.2016.10.31467]
Abstract
INTRODUCTION Benefits of post-simulation debriefings as an educational and feedback tool have been widely accepted for nearly a decade. Real-time, non-critical incident debriefing is similar to post-simulation debriefing; however, data on its practice in academic emergency departments (ED), is limited. Although tools such as TeamSTEPPS® (Team Strategies and Tools to Enhance Performance and Patient Safety) suggest debriefing after complicated medical situations, they do not teach debriefing skills suited to this purpose. Anecdotal evidence suggests that real-time debriefings (or non-critical incident debriefings) do in fact occur in academic EDs;, however, limited research has been performed on this subject. The objective of this study was to characterize real-time, non-critical incident debriefing practices in emergency medicine (EM). METHODS We conducted this multicenter cross-sectional study of EM attendings and residents at four large, high-volume, academic EM residency programs in New York City. Questionnaire design was based on a Delphi panel and pilot testing with expert panel. We sought a convenience sample from a potential pool of approximately 300 physicians across the four sites with the goal of obtaining >100 responses. The survey was sent electronically to the four residency list-serves with a total of six monthly completion reminder emails. We collected all data electronically and anonymously using SurveyMonkey.com; the data were then entered into and analyzed with Microsoft Excel. RESULTS The data elucidate various characteristics of current real-time debriefing trends in EM, including its definition, perceived benefits and barriers, as well as the variety of formats of debriefings currently being conducted. 
CONCLUSION This survey on the practice of real-time, non-critical incident debriefing in four major academic EM programs in New York City sheds light on three pertinent points: 1) real-time, non-critical incident debriefing does occur in academic emergency practice; 2) in general, real-time debriefing is perceived to be of some value for education, systems improvement, and performance improvement; and 3) although clinicians practice it, most report no formal training in debriefing techniques. Further study is needed to clarify the actual benefits of real-time, non-critical incident debriefing, the potential pitfalls of the practice, and recommendations for best practices.
Affiliation(s)
- Nur-Ain Nadir
- OSF St. Francis Medical Center, University of Illinois College of Medicine at Peoria, Department of Emergency Medicine, Peoria, Illinois; Kings County Hospital and SUNY Downstate Medical Center, Department of Emergency Medicine, New York, New York
- Suzanne Bentley
- Elmhurst Hospital Center, Icahn School of Medicine at Mount Sinai, Department of Emergency Medicine and Department of Medical Education, Elmhurst, New York
- Dimitrios Papanagnou
- Thomas Jefferson University Hospital, Department of Emergency Medicine, Philadelphia, Pennsylvania
- Komal Bajaj
- Jacobi Medical Center, Department of Obstetrics and Gynecology, New York, New York
- Stephan Rinnert
- Kings County Hospital and SUNY Downstate Medical Center, Department of Emergency Medicine, New York, New York
- Richard Sinert
- Kings County Hospital and SUNY Downstate Medical Center, Department of Emergency Medicine, New York, New York
12
Page CP, Reid A, Coe CL, Carlough M, Rosenbaum D, Beste J, Fagan B, Steinbacher E, Jones G, Newton WP. Learnings From the Pilot Implementation of Mobile Medical Milestones Application. J Grad Med Educ 2016; 8:569-575. [PMID: 27777669] [PMCID: PMC5058591] [DOI: 10.4300/jgme-d-15-00550.1]
Abstract
BACKGROUND Implementation of the educational milestones benefits from mobile technology that facilitates ready assessment in the clinical environment. We developed a point-of-care resident evaluation tool, the Mobile Medical Milestones Application (M3App), and piloted it in 8 North Carolina family medicine residency programs. OBJECTIVE We examined variations in use of the tool across programs and explored the experiences of program directors, faculty, and residents to better understand the perceived benefits and challenges of implementing the new tool. METHODS Residents and faculty completed presurveys and postsurveys about the tool and the evaluation process in their program. Program directors were interviewed individually. Interviews and open-ended survey responses were analyzed and coded using the constant comparative method, and responses were tabulated under themes. RESULTS Common perceptions included increased data collection, enhanced efficiency, and increased perceived quality of the information gathered with the M3App. Residents appreciated the timely, high-quality feedback they received. Faculty reported becoming more comfortable with the tool over time, and a more favorable evaluation of the tool was associated with higher utilization. Program directors reported improvements in faculty knowledge of the milestones and resident satisfaction with feedback. CONCLUSIONS Faculty and residents credited the M3App with improving the quality and efficiency of resident feedback. Residents appreciated the frequency, proximity, and specificity of feedback, and faculty reported the app improved their familiarity with the milestones. Implementation challenges included lack of a physician champion and competing demands on faculty time.
Affiliation(s)
- Cristen P. Page
- Corresponding author: Cristen P. Page, MD, MPH, University of North Carolina at Chapel Hill, Department of Family Medicine, CB7595, 590 Manning Drive, Chapel Hill, NC 27599-7595,
13
Clay AS, Andolsek K, Grochowski CO, Engle DL, Chudgar SM. Using Transitional Year Milestones to Assess Graduating Medical Students' Skills During a Capstone Course. J Grad Med Educ 2015; 7:658-62. [PMID: 26692982] [PMCID: PMC4675425] [DOI: 10.4300/jgme-d-14-00569.1] [Received: 10/18/2014] [Revised: 02/05/2015] [Accepted: 05/19/2015]
Abstract
BACKGROUND Undergraduate medical education (UME) follows the lead of graduate medical education (GME) in moving to competency-based assessment. The means for and timing of competency-based assessments in UME are unclear. OBJECTIVE We explored the feasibility of using the Accreditation Council for Graduate Medical Education Transitional Year (TY) Milestones to assess student performance during a mandatory, fourth-year capstone course. METHODS Our single-institution observational study involved 99 medical students who completed the course in the spring of 2014. Students' skills were assessed by self, peer, and faculty assessment for 6 existing course activities using the TY Milestones. Evaluation completion rates and mean scores were calculated. RESULTS Students' mean milestone levels ranged between 2.2 and 3.6 (on a 5-level scoring rubric). Level 3 is the performance expected at the completion of a TY. Students performed highest in breaking bad news and developing a quality improvement project, and lowest in developing a learning plan, working in interdisciplinary teams, and stabilizing acutely ill patients. Evaluation completion rates were low for some evaluations, which precluded use of the data for assessing student performance in the capstone course. Students were less likely to complete separate online evaluations. Faculty were less likely to complete evaluations when activities did not include dedicated time for evaluations. CONCLUSIONS Assessment of student competence on 9 TY Milestones during a capstone course was useful, but achieving acceptable evaluation completion rates was challenging. Modifications are necessary if milestone scores from a capstone are intended to be used as a handoff between UME and GME.
Affiliation(s)
- Saumil M. Chudgar
- Corresponding author: Saumil M. Chudgar, MD, MS, Duke University School of Medicine, DUMC 3534, Durham, NC 27710,
14
Guy C. Genetic Counseling Milestones: A Framework for Student Competency Evaluation. J Genet Couns 2015; 25:635-43. [PMID: 26462934] [DOI: 10.1007/s10897-015-9895-8] [Received: 01/30/2015] [Accepted: 09/23/2015]
Abstract
Graduate medical education has recently increased its focus on the development of medical specialty competency milestones as a targeted tool for medical resident evaluation. Milestones provide developmental assessment of the attainment of competencies over the course of an educational program. An educational framework is described for developing Genetic Counseling Milestones to evaluate genetic counseling students' attainment of genetic counseling competencies. Genetic Counseling Milestones may provide a valuable tool for assessing genetic counseling students across all program activities. Historical educational context, current practices, and potential benefits and challenges in the development of Genetic Counseling Milestones are discussed.
Affiliation(s)
- Carrie Guy
- Department of Pediatrics, Division of Genetics, Masters in Genetic Counseling Program, University of Oklahoma Health Science Center, 1200 Children's Ave, Ste 12100, Oklahoma City, OK, 73104, USA.
15
Holmboe ES, Yamazaki K, Edgar L, Conforti L, Yaghmour N, Miller RS, Hamstra SJ. Reflections on the First 2 Years of Milestone Implementation. J Grad Med Educ 2015; 7:506-11. [PMID: 26457171] [PMCID: PMC4597976] [DOI: 10.4300/jgme-07-03-43]