1. Ali AA, Crimmins A, Chen H, Khoujah D. Simulation-based assessment for the emergency medicine milestones: a national survey of simulation experts and program directors. World J Emerg Med 2024;15:301-305. PMID: 39050213. PMCID: PMC11265633. DOI: 10.5847/wjem.j.1920-8642.2024.055.
Affiliation(s)
- Afrah A Ali
- Department of Emergency Medicine, University of Maryland School of Medicine, Baltimore 21201, USA
- Ashley Crimmins
- Department of Emergency Medicine, University of Maryland School of Medicine, Baltimore 21201, USA
- Hegang Chen
- Department of Epidemiology and Public Health, University of Maryland School of Medicine, Baltimore 21201, USA
- Danya Khoujah
- Department of Emergency Medicine, University of Maryland School of Medicine, Baltimore 21201, USA; Department of Emergency Medicine, AdventHealth Tampa, Tampa 33606, USA
2. Frey-Vogel AS, Ching K, Dzara K, Mallory L. The Acceptability of Avatar Patients for Teaching and Assessing Pediatric Residents in Communicating Medical Ambiguity. J Grad Med Educ 2022;14:696-703. PMID: 36591423. PMCID: PMC9765906. DOI: 10.4300/jgme-d-22-00088.1.
Abstract
Background Simulation offers a means to assess resident competence in communication, but pediatric standardized patient simulation has limitations. A novel educational technology, avatar patients (APs), holds promise, but its acceptability to residents, educational relevance, and perceived realism have not been determined. Objective To determine whether APs are acceptable, provide a relevant educational experience, and are realistic for teaching and assessment of a complex communication topic. Methods Pediatric residents at one academic institution participated in an AP experience from 2019 to 2021 consisting of 2 scenarios representing issues of medical ambiguity. After the experience, residents completed a survey on the emotional relevance, realism, and acceptability of the technology for assessment of their communication competence. Results AP actor training required approximately 3 hours. Software and training were provided free of charge. Actors were paid $30/hour; the total estimated curricular cost was $50,000. Sixty-five of 89 (73%) pediatric residents participated in the AP experience; 61 (93.8%) completed the survey. Forty-eight (78.7%) were emotionally invested in the scenarios. The most frequently cited emotions evoked were anxiety, uncertainty, concern, and empathy. The conversations were rated as realistic by 49 (80.3%). APs were rated as beneficial for learning to communicate about medical ambiguity by 40 (65.5%), and 41 (66.7%) felt comfortable having APs used to assess their competence in this area. Conclusions Pediatric residents were emotionally invested in the AP experience and found it realistic. The experience was rated as beneficial for learning and acceptable for assessment of how to communicate medical ambiguity.
Affiliation(s)
- Ariel S. Frey-Vogel
- Ariel S. Frey-Vogel, MD, MAT, is Director, Pediatric Education, Innovation and Research Center, and Associate Program Director, Pediatric Residency Program, Harvard Medical School and Mass General for Children
- Kevin Ching
- Kevin Ching, MD, is Medical Director, Weill Cornell Medicine New York Presbyterian Simulation Center
- Kristina Dzara
- Kristina Dzara, PhD, MMSc, is Assistant Dean for Educator Development and Director, Center for Leadership and Innovation in Medical Education, Department of Biomedical Informatics and Medical Education, University of Washington School of Medicine
- Leah Mallory
- Leah Mallory, MD, is Medical Director, The Hannaford Center for Safety, Innovation, and Simulation, The Barbara Bush Children's Hospital at Maine Medical Center
3. Mallory LA, Doughty CB, Davis KI, Cheng A, Calhoun AW, Auerbach MA, Duff JP, Kessler DO. A Decade Later-Progress and Next Steps for Pediatric Simulation Research. Simul Healthc 2022;17:366-376. PMID: 34570084. DOI: 10.1097/sih.0000000000000611.
Abstract
SUMMARY STATEMENT A decade ago, at the time of formation of the International Network for Pediatric Simulation-based Innovation, Research, and Education, the group embarked on a consensus-building exercise. The goal was to forecast the facilitators of and barriers to growth and maturity of science in the field of pediatric simulation-based research. This exercise produced 6 domains critical to progress in the field: (1) prioritization, (2) research methodology and outcomes, (3) academic collaboration, (4) integration/implementation/sustainability, (5) technology, and (6) resources/support/advocacy. This article reflects on and summarizes a decade of progress in the field of pediatric simulation research and suggests next steps in each domain as we look forward, including lessons learned by our collaborative grassroots network that can be used to accelerate research efforts in other domains within healthcare simulation science.
Affiliation(s)
- Leah A Mallory
- From the Tufts University School of Medicine (L.A.M.), Boston, MA; Department of Medical Education (L.A.M.), The Hannaford Center for Simulation, Innovation and Education; Section of Hospital Medicine (L.A.M.), Department of Pediatrics, The Barbara Bush Children's Hospital at Maine Medical Center, Portland, ME; Section of Emergency Medicine (C.B.D.), Department of Pediatrics, Baylor College of Medicine; Simulation Center (C.B.D.), Texas Children's Hospital, Pediatric Emergency Medicine, Baylor College of Medicine; Section of Critical Care Medicine (K.I.D.), Department of Pediatrics, Baylor College of Medicine, Texas Children's Hospital, Houston, TX; Departments of Pediatrics and Emergency Medicine (A.C.), University of Calgary, Calgary, Canada; Division of Pediatric Critical Care (A.W.C.), University of Louisville School of Medicine and Norton Children's Hospital, Louisville, KY; Section of Emergency Medicine (M.A.A.), Yale University School of Medicine, New Haven, CT; Division of Critical Care (J.P.D.), University of Alberta, Alberta, Canada; and Columbia University Vagelos College of Physicians and Surgeons (D.O.K.), New York, NY
4. Howell HB, Desai PV, Altshuler L, McGrath M, Ramsey R, Vrablik L, Levy FH, Zabar S. Teaching and Assessing Communication Skills in Pediatric Residents: How Do Parents Think We Are Doing? Acad Pediatr 2022;22:179-183. PMID: 34186252. DOI: 10.1016/j.acap.2021.06.011.
Abstract
OBJECTIVE Curricula designed to teach and assess the communication skills of pediatric residents variably integrate the parent perspective. We compared pediatric residents' communication skills in an objective structured clinical exam (OSCE) case as assessed by Family Faculty (FF), parents of pediatric patients, versus standardized patients (SP). METHODS Residents participated in an OSCE case with an SP acting as a patient's parent. We compared resident performance as assessed by FF and SP with a behaviorally anchored checklist. Items were rated as not done, partly done, or well done, with "well done" indicating mastery. The residents evaluated the experience. RESULTS Forty-two residents consented to study participation. FF assessed a lower percentage of residents as demonstrating skill mastery than SP did in 19 of the 23 behaviors. There was a significant difference between FF and SP for Total Mastery Score and for Mastery of the Competency Scores in three domains (Respect and Value, Information Sharing, and Participation in Care and Decision Making). The majority of residents evaluated the experience favorably. CONCLUSION Involving parents of pediatric patients in the instructive and assessment components of a communication curriculum for pediatric residents adds a unique perspective and integrates the true stakeholders in parent-physician communication.
Affiliation(s)
- Heather B Howell
- Department of Pediatrics, New York University Grossman School of Medicine (HB Howell, PV Desai, L Vrablik, and FH Levy), New York, NY
- Purnahamsi V Desai
- Department of Pediatrics, New York University Grossman School of Medicine (HB Howell, PV Desai, L Vrablik, and FH Levy), New York, NY
- Lisa Altshuler
- Department of Medicine, New York University Grossman School of Medicine (L Altshuler and S Zabar), New York, NY
- Meaghan McGrath
- Sala Institute for Child and Family Centered Care, Hassenfeld Children's Hospital (M McGrath, R Ramsey and FH Levy), New York, NY
- Rachel Ramsey
- Sala Institute for Child and Family Centered Care, Hassenfeld Children's Hospital (M McGrath, R Ramsey and FH Levy), New York, NY
- Lauren Vrablik
- Department of Pediatrics, New York University Grossman School of Medicine (HB Howell, PV Desai, L Vrablik, and FH Levy), New York, NY
- Fiona H Levy
- Department of Pediatrics, New York University Grossman School of Medicine (HB Howell, PV Desai, L Vrablik, and FH Levy), New York, NY; Sala Institute for Child and Family Centered Care, Hassenfeld Children's Hospital (M McGrath, R Ramsey and FH Levy), New York, NY
- Sondra Zabar
- Department of Medicine, New York University Grossman School of Medicine (L Altshuler and S Zabar), New York, NY
5. Thiel embalming in neonates: methodology and benefits in medical training. Anat Sci Int 2022;97:290-296. PMID: 35137346. PMCID: PMC9167811. DOI: 10.1007/s12565-022-00650-1.
Abstract
Current teaching and training methods for surgical techniques in the pediatric population involve artificial models (manikins), animals, or adult human cadavers embalmed using various techniques. We found no references in the literature concerning the use of the Thiel method in the pediatric population. The aim of this study, therefore, was to assess the viability of using pediatric human cadavers embalmed with Thiel's technique and to compare them with standard pediatric manikins. After donation of a 24-week stillborn, Thiel fixation was carried out following the usual protocol. An examination and surgical techniques were then performed and video-recorded with eye-tracking glasses. The same procedures were conducted on a pediatric manikin. Medical students, medical residents, and physicians were asked to respond to an online survey after being shown the video. A total of 92 responses were obtained. The Thiel-embalmed stillborn was assessed as superior to the manikin on all items. Our study confirmed that this technique is feasible even with extremely small donors. The value of this form of preservation for medical training is not widely known, though it is receiving increasing interest. Our results show that Thiel fixation in pediatrics was rated clearly more highly than a manikin and offers great potential. This innovative application of the Thiel method in the pediatric population is technically possible, poses no additional difficulties, and was very positively assessed for undergraduate and postgraduate teaching.
6. Frey-Vogel A, Rogers A, Sparger K, Mehta R, Mirchandani-Shah D, Mangold K, Mitchell D, Wood A. Taking the Pulse on Pediatric Simulation: A National Survey of Pediatric Residency Programs' Simulation Practices and Challenges. Pediatr Emerg Care 2021;37:e1303-e1307. PMID: 31977771. DOI: 10.1097/pec.0000000000002013.
Abstract
OBJECTIVES There is abundant literature on simulation use in individual pediatric residency programs but limited overall data on simulation in US pediatric residency programs. This study sought to determine how US pediatric residency programs use simulation for teaching and assessment and the challenges programs face in their use of simulation. METHODS Members of the Association of Pediatric Program Directors' Healthcare Simulation in Pediatrics Learning Community developed a survey of 15 multipart questions on the use of simulation in US pediatric residency programs, using best practices in survey design. The survey was distributed electronically to US pediatric residency program directors. Qualitative questions were analyzed by content analysis and quantitative questions by descriptive statistics. RESULTS The survey response rate was 21%; respondents were disproportionately from large academic medical centers. Qualitative analysis found that respondents use simulation to teach pediatric residents in the areas of urgent/emergent situations, procedures, and communication, and that common challenges to simulation implementation are time, physical resources, expertise, competing priorities, logistics, and buy-in. Quantitative analysis demonstrated that, although respondents are largely confident that their simulation programs improve resident preparedness and competence, few objectively evaluate their simulation programs. CONCLUSIONS Pediatric residency programs use simulation for similar purposes and face similar challenges. By collaborating, the resources of the national pediatric simulation community can be leveraged to collect evidence for best practices for simulation use in pediatric residency training.
Affiliation(s)
- Ariel Frey-Vogel
- Department of Pediatrics, MassGeneral Hospital for Children, Boston, MA
- Amanda Rogers
- Department of Pediatrics, Medical College of Wisconsin, Milwaukee, WI
- Katherine Sparger
- Department of Pediatrics, MassGeneral Hospital for Children, Boston, MA
- Renuka Mehta
- Department of Pediatrics, Medical College of Georgia at Augusta University, Augusta, GA
- Karen Mangold
- Departments of Pediatrics and Medical Education, Ann & Robert H. Lurie Children's Hospital of Chicago
- Diana Mitchell
- Department of Pediatrics, The University of Chicago Comer Children's Hospital, Chicago, IL
- Amy Wood
- Department of Pediatrics, Our Lady of the Lake Children's Hospital, Baton Rouge, LA
7. Nguyen MC, Elliott NC, Begany DP, Best KM, Cook MD, Jong MR, Matuzsan ZM, Morolla LA, Partington SS, Kane BG. Assessment of Emergency Medicine Resident Performance in a Pediatric In Situ Simulation Using Multi-Source Feedback. Cureus 2021;13:e16812. PMID: 34522472. PMCID: PMC8425063. DOI: 10.7759/cureus.16812.
Abstract
Introduction Multi-source feedback (MSF) is an evaluation method mandated by the Accreditation Council for Graduate Medical Education (ACGME). The Queen's Simulation Assessment Tool (QSAT) has been validated as able to distinguish between resident performances in a simulation setting, and it has demonstrated excellent MSF agreement when used in an adult simulation performed in a simulation lab. Using the QSAT, this study sought to determine the degree of MSF agreement for a single pediatric (Peds) simulation case conducted in situ in a Peds emergency department (ED). Methods This Institutional Review Board-approved study was conducted in a four-year emergency medicine residency. A Peds resuscitation case was developed with specific behavioral anchors on the QSAT, which uses a 1-5 scale in each of five categories: Primary Assessment, Diagnostic Actions, Therapeutic Actions, Communication, and Overall Assessment. Data were gathered from six participants for each simulation. The lead resident self-evaluated and received MSF from a junior peer resident, a fixed Peds ED nurse, a random ED nurse, and two faculty (one fixed, the other from a dyad). Agreement was calculated with intraclass correlation coefficients (ICC). Results The simulation was performed on 35 separate days over two academic years. A total of 106 MSF participants were enrolled, including three faculty members, 35 team leaders, 34 peers, 33 ED registered nurses (RN), and one Peds RN; 50% of the enrollees were female (n=53). Mean QSAT scores ranged from 20.7 to 23.4. Fair agreement was demonstrated via ICC, with no statistically significant difference between sources of MSF. Removing self-evaluation led to the highest ICC; the ICC for any single or grouped non-faculty source of MSF was poor. Conclusion Using the QSAT, the findings from this single-site cohort suggest that faculty must be included in MSF. Self-evaluation appears to be of limited value in MSF with the QSAT. The degree of MSF agreement gathered by the QSAT was lower in this cohort than previously reported for adult simulation cases performed in the simulation lab. This may be due to the pediatric nature of the case, the location of the simulation, or both.
Affiliation(s)
- Michael C Nguyen
- Department of Emergency and Hospital Medicine, Lehigh Valley Hospital and Health Network/University of South Florida Morsani College of Medicine, Allentown, USA
- Nicole C Elliott
- Department of Emergency and Hospital Medicine, Lehigh Valley Hospital and Health Network/University of South Florida Morsani College of Medicine, Allentown, USA
- Diane P Begany
- Department of Pediatrics, Lehigh Valley Hospital and Health Network/University of South Florida Morsani College of Medicine, Allentown, USA
- Katie M Best
- Department of Emergency and Hospital Medicine, Lehigh Valley Hospital and Health Network/University of South Florida Morsani College of Medicine, Allentown, USA
- Matthew D Cook
- Department of Emergency and Hospital Medicine, Lehigh Valley Hospital and Health Network/University of South Florida Morsani College of Medicine, Allentown, USA
- Michael R Jong
- Department of Emergency and Hospital Medicine, Lehigh Valley Hospital and Health Network/University of South Florida Morsani College of Medicine, Allentown, USA
- Zachary M Matuzsan
- Department of Emergency and Hospital Medicine, Lehigh Valley Hospital and Health Network/University of South Florida Morsani College of Medicine, Allentown, USA
- Louis A Morolla
- Department of Emergency and Hospital Medicine, Lehigh Valley Hospital and Health Network/University of South Florida Morsani College of Medicine, Allentown, USA
- Suzanne S Partington
- Department of Emergency and Hospital Medicine, Lehigh Valley Hospital and Health Network/University of South Florida Morsani College of Medicine, Allentown, USA
- Bryan G Kane
- Department of Emergency and Hospital Medicine, Lehigh Valley Hospital and Health Network/University of South Florida Morsani College of Medicine, Allentown, USA
8. "Changing the focus" for simulation-based education assessment… not simply "changing the view" with videolaryngoscopy. J Pediatr (Rio J) 2021;97:4-6. PMID: 32619410. PMCID: PMC9432164. DOI: 10.1016/j.jped.2020.06.003.
9. Mallory L, Floyed R, Doughty C, Thompson T, Lopreiato J, Chang TP. Validation of a Modified Jefferson Scale of Empathy for Observers to Assess Trainees. Acad Pediatr 2021;21:165-169. PMID: 32540426. DOI: 10.1016/j.acap.2020.06.005.
Abstract
OBJECTIVE "Demonstrate insight and understanding into emotion" is a competency amenable to simulation-based assessment. The Jefferson Scale of Patient Perceptions of Physician Empathy (JSPPPE) has validity evidence for patients to assess provider empathy, but a version adapted for third-party observers does not exist. Our aim was to modify the JSPPPE and use recorded standardized encounters to obtain validity evidence. METHODS This cross-sectional study used video and data collected from 2 pediatric residencies. In 2018, 4 raters reviewed 24 videos of 12 interns communicating with standardized patients (SP) in 2 encounters and completed a modified JSPPPE for observers (JSEO). Reliability between raters was established using intraclass correlation coefficients (ICC). JSEO mean scores were correlated with Essential Elements of Communication (EEC), JSPPPE, and faculty composite interpersonal communication (IC) scores using Spearman rank correlation. RESULTS The mean ICC for all 4 raters was 0.573 (0.376-0.755). When ICCs were calculated for pairs of raters, Rater 1 was an outlier; ICCs for mean scores for pairs among the 3 remaining raters were 0.81 to 0.84. Mean JSEO scores from the 4 raters correlated with the JSPPPE (rho = 0.45, P = .03) and IC (rho = 0.68, P < .001), but not the EEC (rho = 0.345, P = .1). CONCLUSIONS We found validity evidence for the use of a modified JSPPPE for an observer to assess empathy in a recorded encounter with an SP. This may be useful as medical educators shift toward competency-based tracking. The brevity of this tool and the potential for assessment using video are also appealing.
Affiliation(s)
- Leah Mallory
- Tufts University School of Medicine (L Mallory), The Hannaford Simulation Center at Maine Medical Center, Maine
- Rebecca Floyed
- Dell Medical School (R Floyed), University of Texas at Austin, Tex
- Cara Doughty
- Baylor College of Medicine (C Doughty), Texas Children's Hospital Simulation Center, Houston, Tex
- Tonya Thompson
- University of Arkansas for Medical Sciences (T Thompson), Little Rock, Ark
- Joseph Lopreiato
- Uniformed Services University of the Health Sciences (J Lopreiato), The Val G. Hemming Simulation Center, Silver Spring, Md
- Todd P Chang
- Keck School of Medicine of USC (TP Chang), Children's Hospital Los Angeles, Los Angeles, Calif
10. Emerging Prevalence of Simulation-Based Education in Pediatric Critical Care Medicine Fellowship Training: We Have Come a Long Way, (Sim)baby! Pediatr Crit Care Med 2020;21:909-910. PMID: 33009305. DOI: 10.1097/pcc.0000000000002516.
11. Reed S, Frey-Vogel A, Frost M. Look Who's Talking: A Survey of Pediatric Program Directors on Communication Skills Education in Pediatric Residency Programs. Acad Pediatr 2020;20:9-13. PMID: 31103882. DOI: 10.1016/j.acap.2019.05.005.
Abstract
OBJECTIVE To determine current practices for communication skills curricula and assessment in pediatric residency programs and to identify programs' greatest needs regarding communication curricula and assessment. METHODS We surveyed pediatric residency program directors about their programs' approach to teaching and assessing residents' communication skills and how satisfied they were with their curricula and assessment of competence. Respondents were asked about their programs' greatest needs for teaching and assessing communication skills. RESULTS The response rate was 41% (82/202). Most programs did teach communication skills to residents; only 14% provided no formal training. Programs identified various 1) educational formats for teaching communication skills, 2) curricular content, and 3) assessment methods for determining competence. Many programs were less than satisfied with their curriculum and the accuracy of their assessments. The greatest programmatic need regarding curricula was time, while the greatest need for assessment was a tool. CONCLUSIONS While teaching and assessment of communication skills are common in pediatric residency programs, they are inconsistent and variable, and many programs are not satisfied with their current communication training. There is a need for development of and access to appropriate and useful curricula, as well as a practical assessment tool that has been evaluated for validity evidence.
Affiliation(s)
- Suzanne Reed, Ariel Frey-Vogel, and Mackenzie Frost
- Department of Pediatrics, The Ohio State University College of Medicine, Nationwide Children's Hospital, Columbus (S Reed); Department of Pediatrics, Harvard Medical School, MassGeneral Hospital for Children, Boston, Mass (A Frey-Vogel); Department of Pediatrics, University of Texas Southwestern Medical Center, Dallas (M Frost)
12. Binotti M, Genoni G, Rizzollo S, De Luca M, Carenzo L, Monzani A, Ingrassia PL. Simulation-based medical training for paediatric residents in Italy: a nationwide survey. BMC Med Educ 2019;19:161. PMID: 31113417. PMCID: PMC6529987. DOI: 10.1186/s12909-019-1581-3.
Abstract
BACKGROUND A prompt start to appropriate neonatal and paediatric resuscitation is critical to reduce mortality and morbidity. However, residents are rarely exposed to real emergency situations. Simulation-based medical training (SBMT) offers the opportunity to improve medical and non-technical skills in a controlled setting. This survey describes the availability and current use of SBMT by paediatric residents in Italy, with the purpose of understanding residents' expectations regarding neonatal and paediatric emergency training and identifying gaps and potential areas for future implementation. METHODS A survey was developed and distributed to Italian residents. SBMT was defined as any kind of training with a mannequin in a contextualised, clinically realistic scenario. RESULTS The response rate was 14.4%, covering 71% of Italian paediatric residency programmes. Of the 274 responding residents, 88% stated that they received less than 5 h of SBMT during the past training year, with 66% not participating in any kind of simulation activity. In 62% of the programmes no simulation training facility was available to residents. Among those who received SBMT, 46% used it for procedures and skills and 30% for clinical scenarios, but only 24% reported regular use of debriefing. Of the overall respondents, 93% were interested in receiving SBMT to improve decision-making abilities in complex medical situations, to improve technical/procedural skills, and to improve overall competency in neonatal and paediatric emergencies, including non-technical skills. The main barriers to the implementation of SBMT programmes in Italian paediatric residencies were the lack of experts (57%), the lack of support from the school director (56%), the lack of organisation in planning simulation centre courses (42%), and the lack of teaching materials (42%). CONCLUSIONS This survey shows the scarce use of SBMT during paediatric training programmes in Italy and points out the main limitations to its diffusion. This is a call to action to develop organised SBMT during paediatric residency programmes, to train qualified personnel, and to improve the quality of education and care in this field.
Affiliation(s)
- Marco Binotti
- Neonatal and Paediatric Intensive Care Unit, Maggiore della Carità Hospital, Novara, Italy; SIMNOVA, Interdepartmental Centre for Innovative Didactics and Simulation in Medicine and Health Professions, University of Piemonte Orientale, Novara, Italy
- Giulia Genoni
- Division of Paediatrics, Department of Health Sciences, University of Piemonte Orientale, Via Solaroli 17, 28100 Novara, Italy
- Stefano Rizzollo
- Division of Paediatrics, Department of Health Sciences, University of Piemonte Orientale, Via Solaroli 17, 28100 Novara, Italy
- Marco De Luca
- SIMMeyer, Anna Meyer Children's University Hospital, Florence, Italy
- Luca Carenzo
- SIMNOVA, Interdepartmental Centre for Innovative Didactics and Simulation in Medicine and Health Professions, University of Piemonte Orientale, Novara, Italy
- Alice Monzani
- Division of Paediatrics, Department of Health Sciences, University of Piemonte Orientale, Via Solaroli 17, 28100 Novara, Italy
- Pier Luigi Ingrassia
- SIMNOVA, Interdepartmental Centre for Innovative Didactics and Simulation in Medicine and Health Professions, University of Piemonte Orientale, Novara, Italy
13. Bismilla Z, Boyle T, Mangold K, Van Ittersum W, White ML, Zaveri P, Mallory L. Development of a Simulation-Based Interprofessional Teamwork Assessment Tool. J Grad Med Educ 2019;11:168-176. PMID: 31024648. PMCID: PMC6476092. DOI: 10.4300/jgme-d-18-00729.1.
Abstract
BACKGROUND The Accreditation Council for Graduate Medical Education (ACGME) Milestone projects required each specialty to identify essential skills and develop means of assessment with supporting validity evidence for trainees. Several specialties rate trainees on a milestone subcompetency related to working in interprofessional teams. A tool to assess trainee competence in any role on an interprofessional team in a variety of scenarios would be valuable and suitable for simulation-based assessment. OBJECTIVE We developed a tool for simulation settings that assesses interprofessional teamwork in trainees. METHODS In 2015, existing tools that assess teamwork or interprofessionalism using direct observation were systematically reviewed for appropriateness, generalizability, adaptability, ease of use, and resources required. Items from these tools were included in a Delphi method with multidisciplinary pediatrics experts using an iterative process from June 2016 to January 2017 to develop an assessment tool. RESULTS Thirty-one unique tools were identified. A 2-stage review narrowed this list to 5 tools, and 81 items were extracted. Twenty-two pediatrics experts participated in 4 rounds of Delphi surveys, with response rates ranging from 82% to 100%. Sixteen items reached consensus for inclusion in the final tool. A global 4-point rating scale from novice to proficient was developed. CONCLUSIONS A novel tool to assess interprofessional teamwork for individual trainees in a simulated setting was developed using a systematic review and Delphi methodology. This is the first step to establish the validity evidence necessary to use this tool for competency-based assessment.
Collapse
|
14
|
A Multicenter Collaboration for Simulation-Based Assessment of ACGME Milestones in Emergency Medicine. Simul Healthc 2019; 13:348-355. [PMID: 29620703 DOI: 10.1097/sih.0000000000000291] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/08/2023]
Abstract
STATEMENT In 2014, the six allopathic emergency medicine (EM) residency programs in Chicago established an annual, citywide, simulation-based assessment of all postgraduate year 2 EM residents. The cases and corresponding assessment tools were designed by the simulation directors from each of the participating sites. All assessment tools include critical actions that map directly to numerous EM milestones in 11 different subcompetencies. The 2-hour assessments provide opportunities for residents to lead resuscitations of critically ill patients and demonstrate procedural skills, using mannequins and task trainers, respectively. More than 80 residents participate annually, and their assessment experiences are essentially identical across testing sites. The assessments are completed electronically, and comparative performance data are immediately available to program directors.
Collapse
|
15
|
Validity Evidence for a Serious Game to Assess Performance on Critical Pediatric Emergency Medicine Scenarios. Simul Healthc 2018; 13:168-180. [PMID: 29377865 DOI: 10.1097/sih.0000000000000283] [Citation(s) in RCA: 12] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 01/18/2023]
Abstract
INTRODUCTION We developed a first-person serious game, PediatricSim, to teach and assess performance on seven critical pediatric scenarios (anaphylaxis, bronchiolitis, diabetic ketoacidosis, respiratory failure, seizure, septic shock, and supraventricular tachycardia). In the game, players take the role of a code leader and direct patient management by selecting from various assessment and treatment options. The objective of this study was to obtain supportive validity evidence for the PediatricSim game scores. METHODS Game content was developed by 11 subject matter experts and followed the American Heart Association's 2011 Pediatric Advanced Life Support Provider Manual and other authoritative references. Sixty subjects with three different levels of experience were enrolled to play the game. Before game play, subjects completed a 40-item written pretest of knowledge. Game scores were compared between subject groups using scoring rubrics developed for the scenarios. Validity evidence was established and interpreted according to Messick's framework. RESULTS Content validity was supported by a game development process that involved expert experience, focused literature review, and pilot testing. Subjects rated the game favorably for engagement, realism, and educational value. Interrater agreement on game scoring was excellent (intraclass correlation coefficient = 0.91, 95% confidence interval = 0.89-0.9). Game scores were higher for attendings, followed by residents, then medical students (P < 0.01), with large effect sizes (1.6-4.4) for each comparison. There was a very strong, positive correlation between game and written test scores (r = 0.84, P < 0.01). CONCLUSIONS These findings contribute validity evidence for PediatricSim game scores to assess knowledge of pediatric emergency medicine resuscitation.
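The correlation between game and written-test scores reported above is a standard Pearson r; a minimal sketch follows, using made-up paired scores (not the study's data).

```python
# Minimal Pearson correlation sketch. The paired game/written-test scores
# below are hypothetical, for illustration only.
import math

def pearson_r(x, y):
    """Pearson product-moment correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

game    = [55, 62, 70, 48, 80, 90]   # hypothetical game scores
written = [22, 25, 30, 20, 33, 38]   # hypothetical written-test scores
print(round(pearson_r(game, written), 3))  # → 0.996
```

Values near +1 indicate that subjects who scored well on the written test also scored well in the game, the pattern the study reports at r = 0.84.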
Collapse
|
16
|
Hart D, Bond W, Siegelman JN, Miller D, Cassara M, Barker L, Anders S, Ahn J, Huang H, Strother C, Hui J. Simulation for Assessment of Milestones in Emergency Medicine Residents. Acad Emerg Med 2018; 25:205-220. [PMID: 28833892 DOI: 10.1111/acem.13296] [Citation(s) in RCA: 21] [Impact Index Per Article: 3.5] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/09/2017] [Revised: 08/01/2017] [Accepted: 08/16/2017] [Indexed: 11/29/2022]
Abstract
OBJECTIVES All residency programs in the United States are required to report their residents' progress on the milestones to the Accreditation Council for Graduate Medical Education (ACGME) biannually. Since the development and institution of this competency-based assessment framework, residency programs have been attempting to ascertain the best ways to assess resident performance on these metrics. Simulation was recommended by the ACGME as one method of assessment for many of the milestone subcompetencies. We developed three simulation scenarios with scenario-specific milestone-based assessment tools and aimed to gather validity evidence for these tools. METHODS We conducted a prospective observational study to investigate the validity evidence for three mannequin-based simulation scenarios for assessing individual residents on emergency medicine (EM) milestones. The subcompetencies included (e.g., patient care [PC]1, PC2, PC3) were identified via a modified Delphi technique using a group of experienced EM simulationists. The scenario-specific checklist (CL) items were designed from the individual milestone items within each EM subcompetency chosen for assessment and were reviewed by experienced EM simulationists. Two independent live raters, EM faculty at the respective study sites, scored each scenario after brief rater training. The inter-rater reliability (IRR) of the assessment tool was determined by measuring the intraclass correlation coefficient (ICC) for the sum of the CL items as well as for the global rating scales (GRSs) for each scenario. GRS and CL scores were compared across postgraduate year (PGY) levels with analysis of variance. RESULTS Eight subcompetencies were chosen for assessment with three simulation cases, using 118 subjects. Evidence of test content, internal structure, response process, and relations with other variables was found. The ICCs for the sum of the CL items and the GRSs were >0.8 for all cases, with one exception (clinical management GRS = 0.74 in the sepsis case). The sum of CL items and the GRSs (p < 0.05) discriminated between PGY levels on all cases. However, when the specific CL items were mapped back to milestones at various proficiency levels, the milestones at the higher proficiency levels (level 3 [L3] and 4 [L4]) often did not discriminate between PGY levels: L3 milestone items discriminated between PGY levels on five of the 12 occasions they were assessed, and L4 items on only two of 12. CONCLUSION Three simulation cases with scenario-specific assessment tools allowed evaluation of EM residents on proficiency levels L1 to L4 within eight of the EM milestone subcompetencies. Evidence of test content, internal structure, response process, and relations with other variables was found, with good to excellent IRR and the ability to discriminate between PGY levels for both the sum of CL items and the GRSs. However, there was no positive relationship between advancing PGY level and completion of the higher-level milestone items (L3 and L4).
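The inter-rater reliability statistic these abstracts report can be sketched directly from its ANOVA definition. A minimal two-way random-effects, absolute-agreement, single-rater ICC(2,1) is shown below; the two-rater checklist-sum scores are hypothetical, not the study's data.

```python
# Minimal ICC(2,1) sketch (two-way random effects, absolute agreement,
# single rater), the kind of inter-rater reliability statistic reported
# for checklist sums and global rating scales. Scores are hypothetical.

def icc_2_1(ratings):
    """ratings: one row per subject, one column per rater."""
    n = len(ratings)      # subjects
    k = len(ratings[0])   # raters
    grand = sum(sum(r) for r in ratings) / (n * k)
    row_means = [sum(r) / k for r in ratings]
    col_means = [sum(r[j] for r in ratings) / n for j in range(k)]

    ss_rows = k * sum((m - grand) ** 2 for m in row_means)   # subjects
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)   # raters
    ss_total = sum((x - grand) ** 2 for r in ratings for x in r)
    ss_err = ss_total - ss_rows - ss_cols

    msr = ss_rows / (n - 1)                # between-subjects mean square
    msc = ss_cols / (k - 1)                # between-raters mean square
    mse = ss_err / ((n - 1) * (k - 1))     # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

scores = [[8, 9], [5, 6], [7, 7], [3, 4], [9, 9], [6, 5]]  # 6 subjects, 2 raters
print(round(icc_2_1(scores), 2))  # → 0.93
```

Values above roughly 0.8, as in the studies above, are conventionally read as good-to-excellent agreement between raters.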
Collapse
Affiliation(s)
- Danielle Hart
- Emergency Medicine; Hennepin County Medical Center; University of Minnesota Medical School; Minneapolis MN
| | - William Bond
- Department of Emergency Medicine; Lehigh Valley Health Network; Allentown PA
| | | | - Daniel Miller
- Department of Emergency Medicine; University of Iowa; Iowa City IA
| | - Michael Cassara
- Department of Emergency Medicine; Hofstra University North Shore Long Island Jewish SOM; Northwell Health Center; Lake Success NY
| | - Lisa Barker
- Department of Emergency Medicine; University of Illinois College of Medicine at Peoria; Peoria IL
| | - Shilo Anders
- Department of Anesthesiology; Vanderbilt University; Nashville TN
| | - James Ahn
- Department of Emergency Medicine; University of Chicago; Chicago IL
| | - Hubert Huang
- Division of Education; Lehigh Valley Health Network; Allentown PA
| | | | - Joshua Hui
- Department of Emergency Medicine; Kaiser Permanente; Los Angeles Medical Center; Los Angeles CA
| |
Collapse
|