1
Olin K, Göras C, Nilsson U, Unbeck M, Ehrenberg A, Pukk-Härenstam K, Ekstedt M. Mapping registered nurse anaesthetists' intraoperative work: tasks, multitasking, interruptions and their causes, and interactions: a prospective observational study. BMJ Open 2022; 12:e052283. [PMID: 35045998] [PMCID: PMC8772415] [DOI: 10.1136/bmjopen-2021-052283]
Abstract
INTRODUCTION Safe anaesthesia care is a fundamental part of healthcare. In a previous study, registered nurse anaesthetists (RNAs) had the highest task frequency, with the largest amount of multitasking and interruptions, among all professionals working in a surgical team. There is a lack of knowledge on how these factors are distributed during the intraoperative anaesthesia care process and what implications they might have for safety and quality of care. OBJECTIVE To map RNAs' work as done in practice, including tasks, multitasking, interruptions and their causes, and interactions, during all phases of the intraoperative anaesthesia work process. METHODS Structured observations of RNAs (n=8) were conducted during 30 procedures, lasting a total of 73 hours, in an operating department at a county hospital in Sweden, using the Work Observation Method By Activity Timing tool. RESULTS High task intensity and multitasking were revealed during preparation for anaesthesia induction (79 tasks/hour; 61.9% of task time spent multitasking), anaesthesia induction (98 tasks/hour; 50.7%) and preparation for anaesthesia maintenance (86 tasks/hour; 80.2%). Interruptions were frequent during preoperative preparation (4.7/hour), anaesthesia induction (6.2/hour) and preparation for anaesthesia maintenance (4.3/hour), and were most often related to medication care (n=54, 19.8%), equipment issues (n=40, 14.7%) or the procedure itself (n=39, 14.3%). RNAs worked mostly independently (58.4%) but interacted with multiple professionals in and outside the operating room during anaesthesia. CONCLUSION The tasks, multitasking, interruptions and their causes, and interactions during the different phases illustrate the RNAs' work as done, as part of a complex adaptive system. Management of safety in the most intense phases (preparing for anaesthesia induction, induction itself and preparing for anaesthesia maintenance) should be investigated further. The complexity and adaptivity of the nature of RNAs' work should be taken into consideration in future management, development, research and education.
Affiliation(s)
- Karolina Olin
- Medical Management Centre, Department of Learning, Informatics, Management and Ethics, Karolinska Institute, Stockholm, Sweden
- Administration Centre, Tyks and Hospital District of Southwest Finland, Turku, Finland
- Camilla Göras
- Faculty of Medicine, School of Education, Health and Social Studies, Örebro University, Örebro, Sweden
- Center for Clinical Research Dalarna, Falun, Sweden
- Department of Anaesthesia and Intensive Care Unit, Falu Hospital, Falun, Sweden
- Ulrica Nilsson
- Department of Neurobiology, Care Sciences and Society, Karolinska Institutet, Stockholm, Sweden
- Maria Unbeck
- Department of Neurobiology, Care Sciences and Society, Karolinska Institutet, Stockholm, Sweden
- School of Health and Welfare, Dalarna University, Falun, Sweden
- Anna Ehrenberg
- School of Health and Welfare, Dalarna University, Falun, Sweden
- Karin Pukk-Härenstam
- Medical Management Centre, Department of Learning, Informatics, Management and Ethics, Karolinska Institute, Stockholm, Sweden
- Astrid Lindgren's Children's Hospital, Karolinska Universitetssjukhuset, Stockholm, Sweden
- Mirjam Ekstedt
- Medical Management Centre, Department of Learning, Informatics, Management and Ethics, Karolinska Institute, Stockholm, Sweden
- Department of Health and Caring Sciences, Faculty of Health and Life Sciences, Linnaeus University, Kalmar, Sweden
2
Jeon Y, Meretoja R, Vahlberg T, Leino-Kilpi H. Developing and psychometric testing of the anaesthesia nursing competence scale. J Eval Clin Pract 2020; 26:866-878. [PMID: 31264335] [DOI: 10.1111/jep.13215]
Abstract
RATIONALE, AIMS, AND OBJECTIVES The competence of nurses in anaesthesia care is important for the quality of anaesthesia nursing care and patient safety. However, there is a lack of psychometrically tested instruments to measure this competence. This study therefore aimed to develop and test the psychometric properties of an anaesthesia nursing competence scale (AnestComp) assessing nurses' competence in anaesthesia care. METHOD The scale development and psychometric testing had three phases: (1) competence areas were identified and items created, based on literature reviews and descriptions by experts; (2) the content validity of the scale was assessed by a content expert group, and the scale was pilot tested; and (3) the psychometric properties of the scale were tested using self-assessments by anaesthesia nurses (n = 222) and nursing students (n = 205). Reliability was assessed using Cronbach's α, and construct validity using factor analyses (confirmatory and exploratory) and the known-groups technique. Nursing students were included for the purpose of construct validity testing. RESULTS The AnestComp has 39 items and consists of seven competence areas: (a) ethics of anaesthesia care, (b) patient's risk care, (c) patient engagement with technology, (d) collaboration within patient care, (e) anaesthesia patient care with medication, (f) peri-anaesthesia nursing intervention, and (g) knowledge of anaesthesia patient care. Cronbach's α values were high in all categories (0.83-0.95), and the factor analyses and known-groups technique supported a seven-factor model. CONCLUSION The initial results supported the reliability and construct validity of the AnestComp. The scale is a promising instrument for measuring anaesthesia nursing competence among anaesthesia nurses. Further research with larger and more diverse samples is suggested to refine the current psychometric evaluation.
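The reliability testing described above rests on Cronbach's α. As a generic illustration of how that statistic is computed (this is not code from the study; the toy item matrices below are hypothetical, not AnestComp data), a minimal sketch:

```python
# Cronbach's alpha: internal-consistency reliability of a k-item scale.
# alpha = k/(k-1) * (1 - sum(item variances) / variance of the total scores)
def cronbach_alpha(scores):
    """scores: one row per respondent, each row a list of k item scores."""
    k = len(scores[0])

    def var(xs):  # sample variance (ddof=1)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [var([row[i] for row in scores]) for i in range(k)]
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)
```

Values of 0.83-0.95, as reported for the AnestComp competence areas, indicate that responses to items within each area move closely together.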
Affiliation(s)
- Yunsuk Jeon
- Department of Nursing Science, University of Turku, Turku, Finland; Group Administration, Helsinki University Hospital, Helsinki, Finland
- Riitta Meretoja
- Department of Nursing Science, University of Turku, Turku, Finland; Group Administration, Helsinki University Hospital, Helsinki, Finland
- Tero Vahlberg
- Department of Biostatistics, University of Turku, Turku, Finland
- Helena Leino-Kilpi
- Department of Nursing Science, University of Turku, Turku, Finland; Turku University Hospital, Turku, Finland
3

4
King CR, Abraham J, Kannampallil TG, Fritz BA, Ben Abdallah A, Chen Y, Henrichs B, Politi M, Torres BA, Mickle A, Budelier TP, McKinnon S, Gregory S, Kheterpal S, Wildes T, Avidan MS. Protocol for the Effectiveness of an Anesthesiology Control Tower System in Improving Perioperative Quality Metrics and Clinical Outcomes: the TECTONICS randomized, pragmatic trial. F1000Res 2019; 8:2032. [PMID: 32201572] [PMCID: PMC7076336] [DOI: 10.12688/f1000research.21016.1]
Abstract
Introduction: Perioperative morbidity is a public health priority, and surgical volume is increasing rapidly. With advances in technology, there is an opportunity to study the utility of a telemedicine-based control center for anesthesia clinicians that assesses risk, diagnoses negative patient trajectories, and implements evidence-based practices. Objectives: The primary objective of this trial is to determine whether an anesthesiology control tower (ACT) prevents clinically relevant adverse postoperative outcomes, including 30-day mortality, delirium, respiratory failure, and acute kidney injury. Secondary objectives are to determine whether the ACT improves perioperative quality-of-care metrics, including management of temperature, mean arterial pressure, mean airway pressure with mechanical ventilation, blood glucose, anesthetic concentration, antibiotic redosing, and efficient fresh gas flow. Methods and analysis: We are conducting a single-center, randomized, controlled, phase 3 pragmatic clinical trial. A total of 58 operating rooms are randomized daily to receive support from the ACT or not. All adults (eighteen years and older) undergoing surgical procedures in these operating rooms are included and followed until 30 days after their surgery. Clinicians in operating rooms randomized to ACT support receive decision support from clinicians in the ACT; in operating rooms randomized to no intervention, the current standard of anesthesia care is delivered. The intention-to-treat principle will be followed for all analyses. Differences between groups will be presented with 99% confidence intervals; p-values <0.005 will be reported as providing compelling evidence, and p-values between 0.05 and 0.005 as providing suggestive evidence. Registration: TECTONICS is registered on ClinicalTrials.gov, NCT03923699; registered on 23 April 2019.
Affiliation(s)
- Christopher R. King
- Department of Anesthesiology, Washington University in St Louis, St Louis, MO, 63110, USA
- Joanna Abraham
- Department of Anesthesiology, Washington University in St Louis, St Louis, MO, 63110, USA
- Institute for Informatics, Washington University in St Louis, St Louis, MO, 63110, USA
- Thomas G. Kannampallil
- Department of Anesthesiology, Washington University in St Louis, St Louis, MO, 63110, USA
- Institute for Informatics, Washington University in St Louis, St Louis, MO, 63110, USA
- Bradley A. Fritz
- Department of Anesthesiology, Washington University in St Louis, St Louis, MO, 63110, USA
- Arbi Ben Abdallah
- Department of Anesthesiology, Washington University in St Louis, St Louis, MO, 63110, USA
- Yixin Chen
- Department of Computer Science and Engineering, Washington University in St Louis, St Louis, MO, 63110, USA
- Bernadette Henrichs
- Department of Anesthesiology, Washington University in St Louis, St Louis, MO, 63110, USA
- Mary Politi
- Department of Surgery, Washington University in St Louis, St Louis, MO, 63110, USA
- Brian A. Torres
- Department of Anesthesiology, Washington University in St Louis, St Louis, MO, 63110, USA
- Angela Mickle
- Department of Anesthesiology, Washington University in St Louis, St Louis, MO, 63110, USA
- Thaddeus P. Budelier
- Department of Anesthesiology, Washington University in St Louis, St Louis, MO, 63110, USA
- Sherry McKinnon
- Department of Anesthesiology, Washington University in St Louis, St Louis, MO, 63110, USA
- Stephen Gregory
- Department of Anesthesiology, Washington University in St Louis, St Louis, MO, 63110, USA
- Sachin Kheterpal
- Department of Anesthesiology, University of Michigan, Ann Arbor, MI, 48109, USA
- Troy Wildes
- Department of Anesthesiology, Washington University in St Louis, St Louis, MO, 63110, USA
- Michael S. Avidan
- Department of Anesthesiology, Washington University in St Louis, St Louis, MO, 63110, USA
- TECTONICS Research Group
- Department of Anesthesiology, Washington University in St Louis, St Louis, MO, 63110, USA
- Institute for Informatics, Washington University in St Louis, St Louis, MO, 63110, USA
- Department of Computer Science and Engineering, Washington University in St Louis, St Louis, MO, 63110, USA
- Department of Surgery, Washington University in St Louis, St Louis, MO, 63110, USA
- Department of Anesthesiology, University of Michigan, Ann Arbor, MI, 48109, USA
5
Parsons SM, Kuszajewski ML, Merritt DR, Muckler VC. High-Fidelity Simulation Training for Nurse Anesthetists Managing Malignant Hyperthermia: A Quality Improvement Project. Clin Simul Nurs 2019. [DOI: 10.1016/j.ecns.2018.10.003]
6
Henrichs B, Thorn S, Thompson JA. Teaching Student Nurse Anesthetists to Respond to Simulated Anesthetic Emergencies. Clin Simul Nurs 2018. [DOI: 10.1016/j.ecns.2017.10.007]
7
McEvoy MD, Thies KC, Einav S, Ruetzler K, Moitra VK, Nunnally ME, Banerjee A, Weinberg G, Gabrielli A, Maccioli GA, Dobson G, O'Connor MF. Cardiac Arrest in the Operating Room. Anesth Analg 2018; 126:889-903. [DOI: 10.1213/ane.0000000000002595]
8
Jeon Y, Lakanmaa RL, Meretoja R, Leino-Kilpi H. Competence Assessment Instruments in Perianesthesia Nursing Care: A Scoping Review of the Literature. J Perianesth Nurs 2017; 32:542-556. [DOI: 10.1016/j.jopan.2016.09.008]
9
Simulation-based Assessment of the Management of Critical Events by Board-certified Anesthesiologists. Anesthesiology 2017; 127:475-489. [DOI: 10.1097/aln.0000000000001739]
Abstract
Background
We sought to determine whether mannequin-based simulation can reliably characterize how board-certified anesthesiologists manage simulated medical emergencies. Our primary focus was to identify gaps in performance and to establish psychometric properties of the assessment methods.
Methods
A total of 263 consenting board-certified anesthesiologists participating in existing simulation-based maintenance of certification courses at one of eight simulation centers were video recorded performing simulated emergency scenarios. Each participated in two 20-min, standardized, high-fidelity simulated medical crisis scenarios, once each as primary anesthesiologist and first responder. Via a Delphi technique, an independent panel of expert anesthesiologists identified critical performance elements for each scenario. Trained, blinded anesthesiologists rated video recordings using standardized rating tools. Measures included the percentage of critical performance elements observed and holistic (one to nine ordinal scale) ratings of participants' technical and nontechnical performance. Raters also judged whether the performance was at a level expected of a board-certified anesthesiologist.
Results
Rater reliability for most measures was good. In 284 simulated emergencies, participants were rated as successfully completing 81% (interquartile range, 75 to 90%) of the critical performance elements. The median rating of both technical and nontechnical holistic performance was five, distributed across the nine-point scale. Approximately one-quarter of participants received low holistic ratings (i.e., three or less). Higher-rated performances were associated with younger age but not with previous simulation experience or other individual characteristics. Calling for help was associated with better individual and team performance.
Conclusions
Standardized simulation-based assessment identified performance gaps informing opportunities for improvement. If a substantial proportion of experienced anesthesiologists struggle with managing medical emergencies, continuing medical education activities should be reevaluated.
10
DeMaria S, Levine A, Petrou P, Feldman D, Kischak P, Burden A, Goldberg A. Performance gaps and improvement plans from a 5-hospital simulation programme for anaesthesiology providers: a retrospective study. BMJ Simulation & Technology Enhanced Learning 2017; 3:37-42. [DOI: 10.1136/bmjstel-2016-000163]
Abstract
Background
Simulation is increasingly employed in healthcare provider education, but its use as a means of identifying system-wide practitioner gaps has been limited. We sought to determine whether practice gaps could be identified, and whether meaningful improvement plans could result, from a simulation course for anaesthesiology providers.
Methods
Over a 2-year cycle, 288 anaesthesiologists and 67 certified registered nurse anaesthetists (CRNAs) participated in a 3.5-hour, malpractice insurer-mandated simulation course, encountering four scenarios. Five anaesthesiology departments within three urban academic healthcare systems were represented. A real-time rater scored each individual on 12 critical performance items (CPIs) representing learning objectives for a given scenario. Participants completed a course satisfaction survey, a 1-month post-course practice improvement plan (PIP) and a 6-month follow-up survey.
Results
All recorded course data were retrospectively reviewed. Course satisfaction was generally positive (88-97% positive rating by item). A total of 4231 individual CPIs were recorded (of a possible 4260 rateable), with a majority of participants demonstrating remediable gaps in medical/technical and non-technical skills (97% of groups had at least one instance of a remediable gap in communication/non-technical skills during at least one of the scenarios). Six months following the course, 91% of respondents reported successfully implementing one or more of their PIPs. Improvements in equipment/environmental resources or personal knowledge domains were most often successful, and several individual reports demonstrated a positive impact on actual practice.
Conclusions
This professional liability insurer-initiated simulation course for five anaesthesiology departments was feasible to deliver and well received. Practice gaps were identified during the course, and remediation of gaps and/or application of new knowledge, skills and resources was reported by participants.
11
Aiming for excellence – A simulation-based study on adapting and testing an instrument for developing non-technical skills in Norwegian student nurse anaesthetists. Nurse Educ Pract 2017; 22:37-46. [DOI: 10.1016/j.nepr.2016.11.008]
12
Ohsfeldt RL, Miller TR, Schneider JE, Scheibling CM. Cost impact of unexpected disposition after orthopedic ambulatory surgery associated with category of anesthesia provider. J Clin Anesth 2016; 35:157-162. [PMID: 27871514] [DOI: 10.1016/j.jclinane.2016.06.012]
Abstract
STUDY OBJECTIVE To provide estimates of the cost and health-outcome implications of the excess risk of unexpected disposition for nurse anesthetist (NA) procedures. DESIGN A projection model was used to apply estimates of costs and health outcomes associated with the excess risk of unexpected disposition for NAs reported in a recent study. SETTING Ambulatory and inpatient surgery. PATIENTS Base-case model parameters were based on estimates taken from peer-reviewed publications when available, or from other sources, including data for all hospital stays in the United States in 2013 from the Healthcare Cost and Utilization Project website. The impact of parameter uncertainty was assessed using one-way and two-way sensitivity analyses. INTERVENTIONS Not applicable. MEASUREMENTS Major complication rates, relative risks of complications, anesthesia costs, costs of complications, and cost-effectiveness ratios. MAIN RESULTS In the base-case model, there were on average 2.3 fewer unexpected dispositions for physician anesthesiologists compared with NAs. Overall, anesthesia-related costs (including the cost of managing unexpected dispositions) were estimated to be about $31 higher per procedure for physician anesthesiologists than for NAs. Alternative model scenarios in the sensitivity analysis produced smaller estimates of the additional costs associated with physician anesthesia administration, to the point of cost savings in some scenarios. CONCLUSIONS Provision of anesthesia for ambulatory knee and shoulder procedures by physician anesthesiologists results in better health outcomes, at a reasonable additional cost, compared with procedures with NA-administered anesthesia, at least when using updated cost-effectiveness willingness-to-pay benchmarks.
Affiliation(s)
- Robert L Ohsfeldt
- Avalon Health Economics LLC, 26 Washington St., Floor 3, Morristown, NJ 07960, USA; Texas A&M University, 212 Adriance Lab Rd 1266 TAMU, College Station, TX 77843, USA
- Thomas R Miller
- American Society of Anesthesiologists, 1061 American Lane, Schaumburg, IL 60173, USA
- John E Schneider
- Avalon Health Economics LLC, 26 Washington St., Floor 3, Morristown, NJ 07960, USA
- Cara M Scheibling
- Avalon Health Economics LLC, 26 Washington St., Floor 3, Morristown, NJ 07960, USA
13
Ott T, Schmidtmann I, Limbach T, Gottschling PF, Buggenhagen H, Kurz S, Pestel G. [Simulation-based training and OR apprenticeship for medical students: a prospective, randomized, single-blind study of clinical skills]. Anaesthesist 2016; 65:822-831. [PMID: 27678137] [DOI: 10.1007/s00101-016-0221-0]
Abstract
BACKGROUND Simulation-based training (SBT) has developed into an established method of medical training. Studies focusing on the education of medical students have used simulation as an evaluation tool for defined skills. A small number of studies provide evidence that SBT improves medical students' skills in the clinical setting; moreover, these were strictly limited to a few areas, such as the diagnosis of heart murmurs or the correct application of cricoid pressure. Other studies could not demonstrate adequate transferability of the skills gained in SBT to the patient site. Whether SBT has an effect on medical students' skills in anaesthesiology in the clinical setting is therefore controversial. To explore this issue, we designed a prospective, randomized, single-blind trial that was integrated into the undergraduate anaesthesiology curriculum of our department during the second year of the clinical phase of medical school. OBJECTIVES This study intended to explore the effect of SBT, within the mandatory undergraduate anaesthesiology curriculum of our department, on medical students' basic anaesthesiology skills in the operating room. MATERIALS AND METHODS After ethical approval was obtained, the participating students of the third clinical semester were randomized into two groups: the SIM-OR group received a 225-minute SBT in basic anaesthesiology skills before attending the operating room (OR) apprenticeship; the OR-SIM group received the SBT after the OR apprenticeship. During SBT the students were trained in the five clinical skills listed below, and two clinical scenarios were simulated using a full-scale simulator: the students had to prepare the patient and perform induction of anaesthesia, including bag-mask ventilation after induction in scenario 1 and rapid sequence induction in scenario 2. Using a five-point Likert scale, five defined skills were evaluated at defined time points during the study period: 1) application of the safety checklist, 2) application of basic patient monitoring, 3) establishment of intravenous access, 4) bag-and-mask ventilation, and 5) adjustment of ventilatory parameters after the patient's airway was secured. A cumulative score of 5 points was defined as the best and a cumulative score of 25 as the worst rating for a given time point. The primary endpoint was the cumulative score after day 1 of the operating room apprenticeship and the difference in cumulative scores from day 1 to day 4. Our hypothesis was that the SIM-OR group would achieve a better score after day 1 of the operating room apprenticeship and a larger increase in score from day 1 to day 4 than the OR-SIM group. RESULTS A total of 73 students were allocated to the OR-SIM group and 70 students to the SIM-OR group. There was no significant difference between the two groups after day 1 of the operating room apprenticeship and no difference in the increase of the cumulative score from day 1 to day 4 (median cumulative score on day 1: SIM-OR 11.2 points vs. OR-SIM 14.6 points, p = 0.067; median difference from day 1 to day 4: SIM-OR -3.7 vs. OR-SIM -6.4, p = 0.110). CONCLUSION With the methods applied, this study could not demonstrate that 225 minutes of SBT before the operating room apprenticeship increased the medical students' clinical skills as evaluated in the operating room. Secondary endpoints indicate that medical students have better clinical skills at the end of the entire curriculum when they have been trained through SBT before the operating room apprenticeship. However, the authors believe that simulator training has a positive impact on students' acquisition of procedural and patient safety skills, even if the methods applied in this study may not reflect this aspect sufficiently.
Affiliation(s)
- T Ott
- Klinik für Anästhesiologie, Universitätsmedizin Mainz, Johannes Gutenberg-Universität Mainz, Langenbeckstr. 1, 55131, Mainz, Germany
- I Schmidtmann
- Institut für Medizinische Biometrie, Epidemiologie und Informatik, Universitätsmedizin Mainz, Johannes Gutenberg-Universität Mainz, Mainz, Germany
- T Limbach
- Klinik für Anästhesiologie, Universitätsmedizin Mainz, Johannes Gutenberg-Universität Mainz, Langenbeckstr. 1, 55131, Mainz, Germany
- P F Gottschling
- Klinik für Anästhesiologie, Universitätsmedizin Mainz, Johannes Gutenberg-Universität Mainz, Langenbeckstr. 1, 55131, Mainz, Germany
- H Buggenhagen
- Klinik für Anästhesiologie, Universitätsmedizin Mainz, Johannes Gutenberg-Universität Mainz, Langenbeckstr. 1, 55131, Mainz, Germany
- S Kurz
- Klinik für Anästhesiologie, Universitätsmedizin Mainz, Johannes Gutenberg-Universität Mainz, Langenbeckstr. 1, 55131, Mainz, Germany
- G Pestel
- Klinik für Anästhesiologie, Universitätsmedizin Mainz, Johannes Gutenberg-Universität Mainz, Langenbeckstr. 1, 55131, Mainz, Germany
14
A Smartphone-based Decision Support Tool Improves Test Performance Concerning Application of the Guidelines for Managing Regional Anesthesia in the Patient Receiving Antithrombotic or Thrombolytic Therapy. Anesthesiology 2016; 124:186-98. [PMID: 26513023] [DOI: 10.1097/aln.0000000000000885]
Abstract
BACKGROUND The American Society of Regional Anesthesia and Pain Medicine (ASRA) consensus statement on regional anesthesia in the patient receiving antithrombotic or thrombolytic therapy is the standard for evaluation and management of these patients. The authors hypothesized that an electronic decision support tool (eDST) would improve test performance compared with native physician behavior concerning the application of this guideline. METHODS Anesthesiology trainees and faculty at 8 institutions participated in a prospective, randomized trial in which they completed a 20-question test involving clinical scenarios related to the ASRA guidelines. The eDST group completed the test using an iOS app programmed to contain the decision logic and content of the ASRA guidelines. The control group completed the test using any resource other than the app. A generalized linear mixed-effects model was used to examine the effect of the intervention. RESULTS After institutional review board approval and informed consent, 259 participants were enrolled and randomized (eDST = 122; control = 137). The mean score was 92.4 ± 6.6% in the eDST group and 68.0 ± 15.8% in the control group (P < 0.001). eDST use increased the odds of selecting correct answers (odds ratio, 7.8; 95% CI, 5.7 to 10.7). Most control group participants (63%) used some cognitive aid during the test, and they scored higher than those who tested from memory alone (76 ± 15% vs. 57 ± 18%, P < 0.001). There was no difference in time to completion of the test (P = 0.15) and no effect of training level (P = 0.56). CONCLUSIONS eDST use improved application of the ASRA guidelines compared with native clinician behavior in a testing environment.
15
Affiliation(s)
- Michael J Murray
- From Grand Canyon Anesthesiology Consultants, Phoenix, Arizona; Department of Anesthesiology, Washington University School of Medicine, St. Louis, Missouri; and Department of Anesthesiology, University of California-San Francisco, San Francisco, California
16
Park K, Ahn Y, Kang N, Sohn M. Development of a simulation-based assessment to evaluate the clinical competencies of Korean nursing students. Nurse Education Today 2016; 36:337-41. [PMID: 26362069] [DOI: 10.1016/j.nedt.2015.08.020]
Abstract
OBJECTIVES To describe a simulation-based assessment (SBA) to evaluate the clinical competencies of nursing students in children's health, and to compare its results with grade point average (GPA), self-efficacy, topic-specific knowledge, and self-reported clinical competency using the Six-D Scale. METHODS This cross-sectional, descriptive study recruited nursing students from a children's health clinical practicum. Students were assigned to either an asthma (n=55) or a type 1 diabetes (n=48) care scenario conducted on a high-fidelity simulator. Clinical competencies were assessed using a global rating scale (GRS) and a checklist. RESULTS Data on 103 students were analyzed. By the SBA-GRS, 64.6%-87.3% of students passed. The SBA-GRS showed a statistically significant positive association with the SBA checklist in both the asthma (rho=.763, p<.001) and the type 1 diabetes (rho=.475, p=.001) groups. In the asthma group, the SBA-GRS and checklist showed statistically significant associations with GPA (rho=.413, p=.002 and r=.508, p<.001, respectively) and the Six-D Scale (rho=.266, p=.049 and r=.352, p=.008); in the diabetes group, only the SBA checklist showed statistically significant associations with self-efficacy (r=.339, p=.018) and the Six-D Scale (r=.373, p=.009). The four SBA-GRS rating groups differed significantly in SBA checklist scores in both scenario groups (F=25.757, p<.001 in the asthma group; F=4.790, p=.006 in the diabetes group) and in GPA in the asthma group only (F=6.095, p<.001). CONCLUSION The SBA was found to be feasible for nursing students. The GRS and checklist were reasonably correlated with other methods of evaluating student competency, but the correlations were better with easier scenarios.
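The rho values reported above are Spearman rank correlations between assessment methods. As a generic illustration of that statistic (not the study's own analysis code; the toy inputs below are hypothetical), a minimal sketch using average ranks for ties:

```python
# Spearman's rho: Pearson correlation computed on the ranks of the data,
# with tied values receiving the average of the ranks they span.
def spearman_rho(xs, ys):
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(v):
            j = i
            while j + 1 < len(v) and v[order[j + 1]] == v[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1  # average rank for the tied run
            for t in range(i, j + 1):
                r[order[t]] = avg
            i = j + 1
        return r

    rx, ry = ranks(xs), ranks(ys)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)  # undefined if either input is constant
```

A rho of .763 between the GRS and checklist, as in the asthma group, means students ranked highly by one tool tended to be ranked highly by the other, regardless of the two tools' scoring scales.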
Affiliation(s)
- Kyongok Park, Department of Nursing, Far East University, Eumseong, Republic of Korea
- Youngmee Ahn, Department of Nursing, Inha University, Incheon, Republic of Korea
- Narae Kang, Department of Nursing, Inha University, Incheon, Republic of Korea
- Min Sohn, Department of Nursing, Inha University, Incheon, Republic of Korea
17
Abstract
In early 2015, the Medical Board of Australia commissioned research into international revalidation models and what might be applicable for Australia. This review examines the implications for Australian anaesthetists. What problem is revalidation seeking to address? What is happening in similar countries? Is there an issue with Australian anaesthetists' performance? Isn't continuing professional development enough? Could the Medical Board target known high-risk doctors? What is the evidence for the benefit of revalidation? How is and how should the profession be involved? Revalidation has been introduced in other developed countries. It commonly involves continuing professional development, feedback from colleagues, co-workers and patients, clinical audit and peer review. Although its evidence base is limited, the General Medical Council in the United Kingdom is evaluating its revalidation system, which should provide useful guidance for other countries. Australian anaesthetists and their professional organisations must remain informed about, and engaged in, the national debate about revalidation, to ensure that any new process is workable for Australian anaesthesia practice.
Affiliation(s)
- L J Roberts, Specialist Anaesthetist and Pain Medicine Physician, Departments of Anaesthesia and Pain Management, Sir Charles Gairdner Hospital, Nedlands, Western Australia
18
The role of simulation in continuing medical education for acute care physicians: a systematic review. Crit Care Med 2015; 43:186-93. [PMID: 25343571] [DOI: 10.1097/ccm.0000000000000672]
Abstract
OBJECTIVES We systematically reviewed the effectiveness of simulation-based education targeting independently practicing qualified physicians in acute care specialties. We also describe how simulation is used for performance assessment in this population. DATA SOURCES MEDLINE, Embase, the Cochrane Database of Systematic Reviews, the Cochrane CENTRAL Database of Controlled Trials, and the National Health Service Economic Evaluation Database. The last date of search was January 31, 2013. STUDY SELECTION All original research describing simulation-based education for independently practicing physicians in anesthesiology, critical care, and emergency medicine was reviewed. DATA EXTRACTION Data analysis was performed in duplicate, with further review by a third author in cases of disagreement until consensus was reached. Data extraction focused on effectiveness according to Kirkpatrick's model. For simulation-based performance assessment, tool characteristics and sources of validity evidence were also collated. DATA SYNTHESIS Of 39 studies identified, 30 focused on the effectiveness of simulation-based education and nine evaluated the validity of simulation-based assessment. Thirteen studies (30%) targeted the lower levels of Kirkpatrick's hierarchy, with reliance on self-reporting. Simulation was unanimously described as a positive learning experience with perceived impact on clinical practice. Of the 17 remaining studies, 10 used a single-group or "no intervention comparison group" design. The majority (n = 17; 44%) were able to demonstrate both immediate and sustained improvements in educational outcomes. Nine studies reported the psychometric properties of simulation-based performance assessment as their sole objective. These predominantly recruited independent practitioners as a convenience sample to establish whether the tool could discriminate between experienced and inexperienced operators, and concentrated on a single aspect of validity evidence. CONCLUSIONS Simulation is perceived as a positive learning experience, with limited evidence to support improved learning. Future research should focus on the optimal modality and frequency of exposure, the quality of assessment tools, and the impact of simulation-based education beyond the individual toward improved patient care.
19
Adherence to guidelines for the management of local anesthetic systemic toxicity is improved by an electronic decision support tool and designated "Reader". Reg Anesth Pain Med 2015; 39:299-305. [PMID: 24956454] [DOI: 10.1097/aap.0000000000000097]
Abstract
BACKGROUND AND OBJECTIVES A hardcopy or paper cognitive aid has been shown to improve performance during the management of simulated local anesthetic systemic toxicity (LAST) when given to the team leader. However, there remains room for improvement to ensure a system that can achieve perfect adherence to the published guidelines for LAST management. Recent research has shown that implementing a checklist via a designated reader may be of benefit. Accordingly, we sought to investigate the effect of an electronic decision support tool (DST) and designated "Reader" role on team performance during an in situ simulation of LAST. METHODS Participants were randomized to Reader + DST (n = 16, rDST) and Control (n = 15, memory alone). The rDST group received the assistance of a dedicated Reader on the response team who was equipped with an electronic DST. The primary outcome measure was adherence to guidelines. RESULTS For overall and critical percent correct scores, the rDST group scored higher than Control (99.3% vs 72.2%, P < 0.0001; 99.5% vs 70%, P < 0.0001, respectively). In the LAST scenario, 0 (0%) of 15 in the control group performed 100% of critical management steps, whereas 15 (93.8%) of 16 in the rDST group did so (P < 0.0001). CONCLUSIONS In a prospective, randomized single-blinded study, a designated Reader with an electronic DST improved adherence to guidelines in the management of an in situ simulation of LAST. Such tools are promising in the future of medicine, but further research is needed to ensure the best methods for implementing them in the clinical arena.
20
Hastings RH, Rickard TC. Deliberate Practice for Achieving and Maintaining Expertise in Anesthesiology. Anesth Analg 2015; 120:449-59. [DOI: 10.1213/ane.0000000000000526]
21
Abstract
INTRODUCTION The authors developed a Standardized Assessment for Evaluation of Team Skills (SAFE-TeamS) in which actors portray health care team members in simulated challenging teamwork scenarios. Participants are scored on scenario-specific ideal behaviors associated with assistance, conflict resolution, communication, assertion, and situation assessment. This research sought to provide evidence of the validity and feasibility of SAFE-TeamS as a tool to support the advancement of science related to team skills training. METHODS Thirty-eight medical and nursing students were assessed using SAFE-TeamS before and after team skills training. The SAFE-TeamS pretraining and posttraining scores were compared, and participants were surveyed. Generalizability analysis was used to estimate the variance in scores associated with the following: examinee, scenario, rater, pretraining/posttraining, examinee type, rater type (actor-live vs. external rater-videotape), actor team, and scenario order. RESULTS The SAFE-TeamS scores reflected improvement after training and were sensitive to individual differences. Score variance due to rater was low; variance due to scenario was moderate. Estimates of relative reliability for 2 raters and 8 scenarios ranged from 0.6 to 0.7. With scenarios and raters treated as fixed, reliability with 2 raters and 2 scenarios exceeded 0.8. Raters believed SAFE-TeamS assessed relevant team skills; examinees' responses were mixed. CONCLUSIONS The SAFE-TeamS was sensitive to individual differences and team skills training, providing evidence for validity. It is not clear whether different scenarios measure different skills, or whether the scenarios cover the necessary breadth of skills. Use of multiple scenarios will support assessment across a broader range of skills. Future research is required to determine whether assessments using SAFE-TeamS will translate to performance in clinical practice.
22
Cook DA, Zendejas B, Hamstra SJ, Hatala R, Brydges R. What counts as validity evidence? Examples and prevalence in a systematic review of simulation-based assessment. Adv Health Sci Educ Theory Pract 2014; 19:233-50. [PMID: 23636643] [DOI: 10.1007/s10459-013-9458-4]
Abstract
Ongoing transformations in health professions education underscore the need for valid and reliable assessment. The current standard for assessment validation requires evidence from five sources: content, response process, internal structure, relations with other variables, and consequences. However, researchers remain uncertain regarding the types of data that contribute to each evidence source. We sought to enumerate the validity evidence sources and supporting data elements for assessments using technology-enhanced simulation. We conducted a systematic literature search including MEDLINE, ERIC, and Scopus through May 2011. We included original research that evaluated the validity of simulation-based assessment scores using two or more evidence sources. Working in duplicate, we abstracted information on the prevalence of each evidence source and the underlying data elements. Among 217 eligible studies, only six (3%) referenced the five-source framework, and 51 (24%) made no reference to any validity framework. The most common evidence sources and data elements were: relations with other variables (94% of studies; reported most often as variation in simulator scores across training levels), internal structure (76%; supported by reliability data or item analysis), and content (63%; reported as expert panels or modification of existing instruments). Evidence of response process and consequences were each present in <10% of studies. We conclude that relations with training level appear to be overrepresented in this field, while evidence of consequences and response process are infrequently reported. Validation science will be improved as educators use established frameworks to collect and interpret evidence from the full spectrum of possible sources and elements.
Affiliation(s)
- David A Cook, Office of Education Research, Mayo Medical School, Rochester, MN, USA
23
Simulation-based assessment to identify critical gaps in safe anesthesia resident performance. Anesthesiology 2014; 120:129-41. [PMID: 24398731] [DOI: 10.1097/aln.0000000000000055]
Abstract
BACKGROUND Valid methods are needed to identify anesthesia resident performance gaps early in training. However, many assessment tools in medicine have not been properly validated. The authors designed and tested use of a behaviorally anchored scale, as part of a multiscenario simulation-based assessment system, to identify high- and low-performing residents with regard to domains of greatest concern to expert anesthesiology faculty. METHODS An expert faculty panel used a Delphi process to derive five key behavioral domains of interest: (1) synthesizes information to formulate a clear anesthetic plan; (2) implements a plan based on changing conditions; (3) demonstrates effective interpersonal and communication skills with patients and staff; (4) identifies ways to improve performance; and (5) recognizes own limits. Seven simulation scenarios spanning pre- to postoperative encounters were used to assess the performances of 22 first-year residents and 8 fellows from two institutions. Two of 10 trained faculty raters, blinded to trainee program and training level, scored each performance independently by using a behaviorally anchored rating scale. Residents, fellows, facilitators, and raters completed surveys. RESULTS Evidence supporting the reliability and validity of the assessment scores was obtained, including a high generalizability coefficient (ρ = 0.81) and expected performance differences between first-year resident and fellow participants. A majority of trainees, facilitators, and raters judged the assessment to be useful, realistic, and representative of critical skills required for safe practice. CONCLUSION The study provides initial evidence to support the validity of a simulation-based performance assessment system for identifying critical gaps in safe anesthesia resident performance early in training.
24
Validation of simulated difficult bag-mask ventilation as a training and evaluation method for first-year internal medicine house staff. Simul Healthc 2013; 8:20-4. [PMID: 22902607] [DOI: 10.1097/sih.0b013e318263341f]
Abstract
INTRODUCTION The past decade has witnessed the increased use of patient simulation in medical training as a method to teach complex bedside skills. Although effective bag-mask ventilation (BMV) is a critical part of airway management, the quality of training in this skill has been questioned. METHODS First-year internal medicine house staff (novices) were used to evaluate a computerized patient simulator as a tool to teach difficult BMV. A novice group and an expert group (certified registered nurse anesthetists and anesthesiologists) were tested to validate the simulator's ability to distinguish between these 2 skill levels. RESULTS The difference between the novice and expert groups in the ability to perform difficult BMV was statistically significant (P < 0.0001). Brief training for novices led to a 100% pass rate and competence as measured by the simulator. Simulation training was effective in increasing the ability to ventilate a simulated difficult-to-ventilate patient (P < 0.0001). CONCLUSIONS This study suggests that this computerized patient simulator is a valid model for teaching difficult BMV and for differentiating skill levels in BMV. With the simulator and brief training on difficult BMV, new internal medicine house staff were able to successfully ventilate a simulated difficult-to-ventilate patient.
25
DeMaria S, Samuelson ST, Schwartz AD, Sim AJ, Levine AI. Simulation-based Assessment and Retraining for the Anesthesiologist Seeking Reentry to Clinical Practice. Anesthesiology 2013; 119:206-17. [DOI: 10.1097/aln.0b013e31829761c8]
Abstract
Background:
Established models for assessment and maintenance of competency in anesthesiology may not be adequate for anesthesiologists wishing to reenter practice. The authors describe a program developed in their institution incorporating simulator-based education, to help determine competency in licensed and previously licensed anesthesiologists before return to practice.
Methods:
The authors have used simulation for assessment and retraining at their institution since 2002. Physicians evaluated by the authors’ center undergo an adaptable 2-day simulation-based assessment conducted by two board-certified anesthesiologists. A minimum of three cases are presented on each day, with specific core competencies assessed, and participants complete a standard Clinical Anesthesia Year 3 level anesthesia knowledge test. Participants are debriefed extensively and retraining regimens are designed, where indicated, consisting of a combination of simulation and operating-room observership.
Results:
Twenty anesthesiologists were referred to the authors' institution between 2002 and 2012. Fourteen participants (70%) were in active clinical practice 1 yr after participation in the authors' program: five (25%) were in supervised positions and nine (45%) had resumed independent clinical practice. Among participants not in practice, the reasons were personal (one participant) and medico-legal (three participants); two participants were lost to follow-up. Two of the 14 physicians who were formally assessed in the authors' program were deemed likely unfit for safe return to practice, irrespective of further training. These physicians were unavailable for contact 1 yr after assessment.
Conclusion:
Anesthesiologists seeking to return to active clinical status are a heterogeneous group. The simulated environment provides an effective means by which to assess baseline competency and also a way to retrain physicians.
Affiliation(s)
- Adam I. Levine, Professor, Department of Anesthesiology, Icahn School of Medicine at Mount Sinai, New York, New York
26
Stepaniak PS, Dexter F. Monitoring Anesthesiologists' and Anesthesiology Departments' Managerial Performance. Anesth Analg 2013; 116:1198-200. [DOI: 10.1213/ane.0b013e3182900466]
27
Dexter F, Logvinov II, Brull SJ. Anesthesiology Residents' and Nurse Anesthetists' Perceptions of Effective Clinical Faculty Supervision by Anesthesiologists. Anesth Analg 2013; 116:1352-5. [DOI: 10.1213/ane.0b013e318286dc01]
28
Hirshey Dirksen SJ, Van Wicklin SA, Mashman DL, Neiderer P, Merritt DR. Developing Effective Drills in Preparation for a Malignant Hyperthermia Crisis. AORN J 2013; 97:329-53. [DOI: 10.1016/j.aorn.2012.12.009]
29
Endacott R, Scholes J, Cooper S, McConnell-Henry T, Porter J, Missen K, Kinsman L, Champion R. Identifying patient deterioration: using simulation and reflective interviewing to examine decision making skills in a rural hospital. Int J Nurs Stud 2012; 49:710-7. [DOI: 10.1016/j.ijnurstu.2011.11.018]
30
External Validation of Simulation-Based Assessments With Other Performance Measures of Third-Year Anesthesiology Residents. Simul Healthc 2012; 7:73-80. [DOI: 10.1097/sih.0b013e31823d018a]
31
Beydon L, Dureuil B, Nathan N, Piriou V, Steib A. [High fidelity simulation in anesthesia and intensive care: context and opinion of performing centres - a survey by the French College of Anaesthesiologists and Intensivists]. Ann Fr Anesth Reanim 2010; 29:782-6. [PMID: 20934299] [DOI: 10.1016/j.annfar.2010.08.013]
Abstract
OBJECTIVES High fidelity simulation is rapidly expanding in France. How to accredit this new mode of continuing medical education, and how far centres are available for this purpose, remains an open question. The purpose of this survey was to document how active centres operate and which criteria they would prefer for accrediting this form of continuing medical education. STUDY DESIGN National survey. METHODS A questionnaire was sent to all centres using high fidelity simulation in France (December 2009). RESULTS Eighteen of 21 centres answered (86%; all university hospitals). These centres are equipped with adult high fidelity simulators and procedural heads for intubation. Funding comes from multiple sources, and one third of centres benefit from manufacturers' lending. Centres are mostly located within university premises (70%). One or more staff practitioners are involved in 78% of centres, and the majority of centres are operated by more than three. Nurse anaesthetists are not involved in most centres. Operating procedures are similar, and high fidelity simulation is mostly used for on-site resident training. At present, centres are only marginally able to train non-resident senior anaesthesiologists. Sessions extend over one day (72%). The majority of centres are willing to share scenarios (75%) and pedagogic aids (93%). Basic scenarios (e.g., cardiopulmonary resuscitation) are mainstream objectives for 85% of centres. CONCLUSION High fidelity simulation is rapidly expanding in France, but its ability to contribute to continuing medical education is still limited to date.
Affiliation(s)
- L Beydon, Commission formation initiale du Cfar, pôle d'anesthésie réanimation, CHU d'Angers, 4, rue Larrey, 49933 Angers cedex 09, France
32
Kleinpell R. Evidence-based review and discussion points. Certification and patient safety. Am J Crit Care 2009; 18:115-6. [PMID: 19350695] [DOI: 10.4037/ajcc2009271]
Affiliation(s)
- Ruth Kleinpell is contributing editor of the Evidence-Based Review section. She is a professor in the Rush University College of Nursing, a teacher-practitioner at Rush University Medical Center, and a nurse practitioner with Our Lady of the Resurrection Medical Center, Chicago, Illinois.
33
McIntosh CA. Lake Wobegon for anesthesia...where everyone is above average except those who aren't: variability in the management of simulated intraoperative critical incidents. Anesth Analg 2009; 108:6-9. [PMID: 19095823] [DOI: 10.1213/ane.0b013e31818e5f91]