1.
Sukotjo C, Koseoglu M, Suwannasin P, Yuan JCC, Park YS, Johnson BR, Thammasitboon K, Tekian A. Assessing methodological quality in dental education research using MERSQI: Analysis of publications from two journals. J Dent Educ 2024;88:786-797. [PMID: 38343340; DOI: 10.1002/jdd.13477]
Abstract
PURPOSE The Medical Education Research Study Quality Instrument (MERSQI) has frequently been used to assess the methodological quality of medical education research, but not of dental education research. The present study aimed to assess, using MERSQI scores, the methodological quality of articles published in the Journal of Dental Education (JDE) and the European Journal of Dental Education (EJDE). METHODS A cross-sectional assessment of the quality of manuscripts published in JDE and EJDE in 2012, 2017, and 2022 was conducted. MERSQI data, number of authors, first and corresponding author degrees, geographic origins, and funding information were extracted for each included study. Descriptive and analytical statistics were computed, with the significance level set at α = 0.05. RESULTS Four hundred ninety-five articles met the inclusion criteria. Across all studied years, the most common study design was a single-group cross-sectional or single-group posttest study conducted at a single institution. In both journals and all years, outcomes were assessed mainly through participant self-report. The most common study outcomes were satisfaction, attitudes, perceptions, opinions, and general facts. The total mean MERSQI score varied by journal and year. Publication year and geographic origin significantly affected the total MERSQI score: papers originating from Asia had the highest scores, followed by South America, Europe, North America, Oceania, and Africa. CONCLUSION The MERSQI is applicable to assessing the methodological quality of dental education research. MERSQI scores for most domains were similar across the two journals and were affected by publication year and geographic origin.
Affiliation(s)
- Cortino Sukotjo: Department of Restorative Dentistry, College of Dentistry, University of Illinois Chicago, Chicago, Illinois, USA
- Merve Koseoglu: Department of Prosthodontics, Faculty of Dentistry, University of Sakarya, Sakarya, Turkey
- Pitcha Suwannasin: Department of Conservative Dentistry, Faculty of Dentistry, Prince of Songkla University, Hat Yai, Thailand
- Judy Chia-Chun Yuan: Department of Restorative Dentistry, College of Dentistry, University of Illinois Chicago, Chicago, Illinois, USA
- Yoon Soo Park: Department of Medical Education, College of Medicine, University of Illinois Chicago, Chicago, Illinois, USA
- Bradford Ray Johnson: Department of Endodontics, College of Dentistry, University of Illinois Chicago, Chicago, Illinois, USA
- Kewalin Thammasitboon: Department of Conservative Dentistry, Research Center of Excellence for Oral Health, Faculty of Dentistry, Prince of Songkla University, Hat Yai, Thailand
- Ara Tekian: Department of Medical Education, College of Medicine, University of Illinois Chicago, Chicago, Illinois, USA
2.
Thelen AE, George BC, Burkhardt JC, Khamees D, Haas MRC, Weinstein D. Improving Graduate Medical Education by Aggregating Data Across the Medical Education Continuum. Acad Med 2024;99:139-145. [PMID: 37406284; DOI: 10.1097/acm.0000000000005313]
Abstract
ABSTRACT Meaningful improvements to graduate medical education (GME) have been achieved in recent decades, yet many GME improvement pilots have been small trials without rigorous outcome measures and with limited generalizability. Lack of access to large-scale data is thus a key barrier to generating empirical evidence to improve GME. In this article, the authors examine the potential of a national GME data infrastructure to improve GME, review the output of 2 national workshops on this topic, and propose a path toward achieving this goal. The authors envision a future where medical education is shaped by evidence from rigorous research powered by comprehensive, multi-institutional data. To achieve this goal, premedical education, undergraduate medical education, GME, and practicing physician data must be collected using a common data dictionary and standards and longitudinally linked using unique individual identifiers. The envisioned data infrastructure could provide a foundation for evidence-based decisions across all aspects of GME and help optimize the education of individual residents. Two workshops hosted by the National Academies of Sciences, Engineering, and Medicine Board on Health Care Services explored the prospect of better using GME data to improve education and its outcomes. There was broad consensus about the potential value of a longitudinal data infrastructure to improve GME; significant obstacles were also noted. Suggested next steps outlined by the authors include producing a more complete inventory of data already being collected and managed by key medical education leadership organizations, pursuing a grassroots data-sharing pilot among GME-sponsoring institutions, and formulating the technical and governance frameworks needed to aggregate data across organizations. The power and potential of big data is evident across many disciplines, and the authors believe that harnessing it in GME is the best next step toward advancing evidence-based physician education.
3.
Li H, Upreti T, Do V, Dance E, Lewis M, Jacobson R, Goldberg A. Measuring wellbeing: A scoping review of metrics and studies measuring medical student wellbeing across multiple timepoints. Med Teach 2024;46:82-101. [PMID: 37405740; DOI: 10.1080/0142159x.2023.2231625]
Abstract
PURPOSE Studies have demonstrated poor mental health in medical students, but wide variation in study design and metric use impairs comparability. The authors aimed to examine the metrics and methods used to measure medical student wellbeing across multiple timepoints and to identify where guidance is needed. METHODS Five databases were searched between May and June 2021 for studies using survey-based metrics among medical students at multiple timepoints. Screening and data extraction were done independently by two reviewers. Data regarding the manuscripts, methodology, and metrics were analyzed. RESULTS 221 studies were included: 109 observational and 112 interventional. Few studies (15.4%) focused on clinical students. Stress management interventions were the most common (40.2%). Few interventional studies (3.57%) followed participants longer than 12 months, and 38.4% had no control group. There were 140 unique metrics measuring 13 constructs; 52.1% of metrics were used only once. CONCLUSIONS Guidance is needed to address gaps in study design and the particular challenges of surveying medical student wellbeing. Metric use is highly variable, and future research should identify metrics validated specifically in medical student samples that reflect the diversity of today's students.
Affiliation(s)
- Henry Li: Department of Emergency Medicine, Faculty of Medicine & Dentistry, University of Alberta, Edmonton, Canada
- Tushar Upreti: Max Rady College of Medicine, Rady Faculty of Health Sciences, University of Manitoba, Winnipeg, Canada
- Victor Do: Department of Pediatrics, Temerty Faculty of Medicine, University of Toronto, Toronto, Canada
- Erica Dance: Department of Emergency Medicine, Faculty of Medicine & Dentistry, University of Alberta, Edmonton, Canada
- Melanie Lewis: Department of Pediatrics, Faculty of Medicine and Dentistry, University of Alberta, Edmonton, Canada
- Ryan Jacobson: Office of Advocacy and Wellbeing, Faculty of Medicine and Dentistry, University of Alberta, Edmonton, Canada
- Aviva Goldberg: Department of Pediatrics and Child Health, Rady Faculty of Health Sciences, University of Manitoba, Winnipeg, Canada
4.
Pietersen PI, Hertz P, Olsen RG, Møller LB, Konge L, Bjerrum F. Transfer of skills between laparoscopic and robot-assisted surgery: a systematic review. Surg Endosc 2023;37:9030-9042. [PMID: 37875694; DOI: 10.1007/s00464-023-10472-5]
Abstract
BACKGROUND Robot-assisted surgery is now well implemented in many surgical specialties but requires a different skill set than laparoscopy. Robot-assisted surgery is most often considered an add-on to laparoscopic skills, yet very little is known about the transfer of skills between the two. The aim of this study was to examine to what extent surgical skills are transferable between laparoscopic and robot-assisted surgery. METHODS A systematic search was conducted in three databases (Ovid Medline, Embase, and Web of Science). Studies investigating transfer of skills between laparoscopy and robot-assisted surgery in a phantom-based, simulation-based, animal-model, or clinical setting were eligible for inclusion. Quality assessment was performed using the Medical Education Research Study Quality Instrument (MERSQI) and the education version of the Newcastle-Ottawa Scale. RESULTS Of 15,610 studies identified, 89 proceeded to full-text review and 37 were included. Four studies were found non-comparable and were excluded from the primary outcome analysis. All 33 remaining studies explored transfer from laparoscopy to robot-assisted surgery: 17 found a positive transfer whereas 15 did not. Only 11 studies explored transfer from robot-assisted surgery to laparoscopy, of which only three found a positive transfer. CONCLUSION An almost equal number of publications found a positive transfer and no transfer from laparoscopic to robot-assisted surgery. Fewer studies explored transfer from robot-assisted surgery to laparoscopy, and very little evidence supports that surgeons trained solely in robot-assisted surgery can perform laparoscopy. This must be considered in future training programs, as robot-assisted surgery is expected to become the first-line modality for many future surgeons.
Affiliation(s)
- Pia Iben Pietersen: Department of Radiology, Odense University Hospital, Kløvervænget 10, Entrance 112, 2nd floor, 5000 Odense C, Denmark; Simulation Center (SimC), Odense University Hospital, Odense, Denmark
- Peter Hertz: Department of Surgery, Hospital Lillebaelt, University of Southern Denmark, Kolding, Denmark
- Rikke Groth Olsen: Copenhagen Prostate Cancer Center, Rigshospitalet, Copenhagen, Denmark; Center for HR & Education, The Capital Region of Denmark, Copenhagen Academy for Medical Education and Simulation (CAMES), Copenhagen, Denmark
- Louise Birch Møller: Center for HR & Education, The Capital Region of Denmark, Copenhagen Academy for Medical Education and Simulation (CAMES), Copenhagen, Denmark
- Lars Konge: Center for HR & Education, The Capital Region of Denmark, Copenhagen Academy for Medical Education and Simulation (CAMES), Copenhagen, Denmark; University of Copenhagen, Copenhagen, Denmark
- Flemming Bjerrum: Center for HR & Education, The Capital Region of Denmark, Copenhagen Academy for Medical Education and Simulation (CAMES), Copenhagen, Denmark; Department of Gastrointestinal and Hepatic Diseases, Copenhagen University Hospital - Herlev and Gentofte, Herlev, Denmark
5.
Han H, Youm J, Tucker C, Teal CR, Rougas S, Park YS, Mooney CJ, Hanson JL, Berry A. Research Methodologies in Health Professions Education Publications: Breadth and Rigor. Acad Med 2022;97:S54-S62. [PMID: 35947465; DOI: 10.1097/acm.0000000000004911]
Abstract
PURPOSE Research methodologies represent assumptions about knowledge and ways of knowing. Diverse research methodologies and methodological standards for rigor are essential in shaping the collective body of knowledge in health professions education (HPE). Given this relationship between methodologies and knowledge, it is important to understand the breadth of research methodologies and their rigor in HPE research publications, yet few studies have examined these questions. This study synthesized current trends in methodology and rigor in HPE papers to inform how evidence is gathered and collectively shapes knowledge in HPE. METHOD This descriptive quantitative study used stepwise stratified cluster random sampling to analyze 90 papers from 15 HPE journals published in 2018 and 2019. Using a research design codebook, the authors conducted group coding processes for fidelity, response process validity, and rater agreement; an index quantifying methodological rigor was developed and applied to each paper. RESULTS Over half of the papers used quantitative methodologies (51%), followed by qualitative (28%) and mixed methods (20%). No quantitative or mixed methods papers reported an epistemological approach. All qualitative papers that reported an epistemological approach (48%) used social constructivism. Most papers included participants from North America (49%) and Europe (20%). The majority of papers did not specify a participant sampling strategy (56%) or a rationale for sample size (80%). Among those reporting a data collection period, most studies (81%) collected data within 1 year. The average rigor score was 56% (SD = 17). Rigor scores varied by journal category and research methodology: scores differed between general HPE journals and discipline-specific journals, and qualitative papers had significantly higher rigor scores than quantitative and mixed methods papers. CONCLUSIONS This review of methodological breadth and rigor in HPE papers raises awareness of methodological gaps and calls for future research on how authors shape the nature of knowledge in HPE.
Affiliation(s)
- Heeyoung Han: associate professor and director of postdoctoral programs, Department of Medical Education, Southern Illinois University School of Medicine, Springfield, Illinois (ORCID: https://orcid.org/0000-0002-7286-2473)
- Julie Youm: associate dean of education compliance and quality, University of California, Irvine School of Medicine, Irvine, California
- Constance Tucker: associate professor and Vice Provost of Educational Improvement and Innovation, Academic Affairs, Oregon Health & Science University, Portland, Oregon (ORCID: https://orcid.org/0000-0002-6507-8832)
- Cayla R Teal: associate professor and associate dean of assessment and evaluation, Office of Medical Education, University of Kansas School of Medicine, Kansas City, Kansas (ORCID: https://orcid.org/0000-0002-2138-4926)
- Steven Rougas: associate professor of emergency medicine and medical science and director of the doctoring program, The Warren Alpert Medical School of Brown University, Providence, Rhode Island (ORCID: https://orcid.org/0000-0003-2225-9657)
- Yoon Soo Park: associate professor, Harvard Medical School, and director of health professions education research, Massachusetts General Hospital, Boston, Massachusetts (ORCID: http://orcid.org/0000-0001-8583-4335)
- Christopher J Mooney: assistant professor of medicine and director of assessment, University of Rochester School of Medicine and Dentistry, Rochester, New York (ORCID: https://orcid.org/0000-0003-2881-2169)
- Janice L Hanson: professor of medicine, Department of Medicine and Office of Education, Washington University, St. Louis, Missouri (ORCID: https://orcid.org/0000-0001-7051-8225)
- Andrea Berry: executive director of faculty life, University of Central Florida College of Medicine, Orlando, Florida
6.
Anderson TN, Kearse LE, Shi R, Kaba A, Schmiederer IS, Huffman EM, Ritter EM, Korndorffer JR. Surgical endoscopy education research: how are we doing? Surg Endosc 2022;36:8403-8407. [PMID: 35194666; DOI: 10.1007/s00464-022-09104-1]
Abstract
BACKGROUND Surgical Endoscopy (SE), the official journal of the Society of American Gastrointestinal and Endoscopic Surgeons and the European Association for Endoscopic Surgery, is an important source of new evidence on surgical education in the field. However, qualitative deficiencies in medical education research have prompted medical education leaders to advocate for increased methodological rigor. The purpose of this study was to review the quality of education-focused research published in SE. METHODS A PubMed search was conducted for all SE articles categorized as education-related research from 2010 to 2019; studies not meeting inclusion criteria were excluded. The remaining publications were independently reviewed, classified, and scored by 7 raters using the Medical Education Research Study Quality Instrument (MERSQI). Intraclass correlation was calculated, and data were examined with descriptive statistics. RESULTS A total of 227 studies met inclusion criteria. There was no significant difference in the number of publications by year (average 25.88 [SD 5.6]); 60% were conducted outside the United States, and 47% (n = 106) were funded. The average MERSQI score was 12.5 (SD 2). Most studies used two-group non-randomized (42%, n = 96) or posttest/cross-sectional designs (29%, n = 65); 36 (16%) were randomized controlled trials. Multi-institutional studies comprised 24% (n = 54). Of the manuscripts, 96% (n = 217) reported at least one measure of validity evidence and 28% (n = 67) described three levels of validity evidence. Studies primarily reported changes in skills or knowledge (45%, n = 103) or satisfaction or general facts (44%, n = 99), while patient-related outcomes accounted for 3% (n = 6) of studies. The ICC between raters was 0.93 (CI 0.90-0.93, p < 0.001). CONCLUSIONS Based on publications to date, this journal's peer review process appears to facilitate the dissemination of education-related studies of moderate to good quality. However, deficits were uncovered, ranging from validity evidence to study designs and levels of outcomes. The journal's breadth of viewership offers a venue to advance education-related research.
Affiliation(s)
- Tiffany N Anderson: Department of Surgery, University of Florida College of Medicine, 1600 SW Archer Road, P.O. Box 100286, Gainesville, FL 32610, USA
- LaDonna E Kearse: Department of Surgery, Stanford University School of Medicine, Stanford, CA, USA
- Robert Shi: Department of Surgery, Stanford University School of Medicine, Stanford, CA, USA
- Aboubacar Kaba: Department of Urology, David Geffen School of Medicine at the University of California, Los Angeles (UCLA), Los Angeles, CA, USA
- Elizabeth M Huffman: Department of Surgery, Indiana University School of Medicine, Indianapolis, IN, USA
- E M Ritter: Department of Surgery, Indiana University School of Medicine, Indianapolis, IN, USA
- James R Korndorffer: Department of Surgery, Stanford University School of Medicine, Stanford, CA, USA
7.
McAfee NW, Schumacher JA, Madson MB, Villarosa-Hurlocker MC, Williams DC. The Status of SBIRT Training in Health Professions Education: A Cross-Discipline Review and Evaluation of SBIRT Curricula and Educational Research. Acad Med 2022;97:1236-1246. [PMID: 35320126; DOI: 10.1097/acm.0000000000004674]
Abstract
PURPOSE To assess the quality of curricular research on the Screening, Brief Intervention, and Referral to Treatment (SBIRT) approach and to determine the presence of useful training modalities, particularly motivational interviewing (MI) training, across health care training curricula. METHOD The authors conducted a systematic review of published, peer-reviewed studies in the PubMed, ERIC, CINAHL, Ovid HealthSTAR, and PsycINFO databases through March 2021 for English-language studies describing SBIRT, a curriculum for health care trainees, and curricular intervention outcomes. After the records were independently assessed, data were extracted, and 20% of the studies were double-coded for interrater reliability. RESULTS Of 1,856 studies, 95 were included in the review; 22 had overlapping samples and were consolidated into 10 nested studies, leaving 83 in total. Interrater reliability ranged from moderate (κ = .74, P < .001) to strong (κ = .91, P < .001) agreement. SBIRT training was delivered to trainees across many professions, including nursing (n = 34, 41%), medical residency (n = 28, 34%), and social work (n = 24, 29%). Nearly every study described its SBIRT training methods (n = 80, 96%), and most reported training in MI (n = 54, 65%). On average, studies reported 4.06 (SD = 1.64) distinct SBIRT training methods and 3.31 (SD = 1.59) MI training methods. The mean design score was 1.92 (SD = 0.84) and the mean measurement score was 1.89 (SD = 1.05). A minority of studies measured SBIRT/MI skill (n = 23, 28%), and only 4 studies (5%) set a priori benchmarks for their curricula. CONCLUSIONS SBIRT training has been delivered to a wide range of health care trainees and often includes MI. Rigor scores were generally low owing to limited research designs and infrequent use of objective skill measurement. Future work should include predefined training benchmarks and validated skills measurement.
Affiliation(s)
- Nicholas W McAfee: assistant professor, Department of Psychiatry and Human Behavior, University of Mississippi Medical Center, Jackson, Mississippi (ORCID: 0000-0002-7992-9124)
- Julie A Schumacher: professor, Department of Psychiatry and Human Behavior, University of Mississippi Medical Center, Jackson, Mississippi
- Michael B Madson: professor, School of Psychology, University of Southern Mississippi, Hattiesburg, Mississippi (ORCID: 0000-0002-2025-8856)
- Margo C Villarosa-Hurlocker: assistant professor, Department of Psychology, University of New Mexico, Albuquerque, New Mexico (ORCID: 0000-0002-9744-8551)
- Daniel C Williams: associate professor, Department of Family and Community Medicine, University of New Mexico, Albuquerque, New Mexico
8.
Cook DA, Wilkinson JM, Foo J. Quality of cost evaluations of physician continuous professional development: Systematic review of reporting and methods. Perspect Med Educ 2022;11:156-164. [PMID: 35357652; PMCID: PMC9240125; DOI: 10.1007/s40037-022-00705-z]
Abstract
INTRODUCTION We sought to evaluate the reporting and methodological quality of cost evaluations of physician continuing professional development (CPD). METHODS We conducted a systematic review, searching MEDLINE, Embase, PsycInfo, and the Cochrane Database for studies comparing the cost of physician CPD (last update 23 April 2020). Two reviewers, working independently, screened all articles for inclusion. Two reviewers extracted information on reporting quality using the Consolidated Health Economic Evaluation Reporting Standards (CHEERS), and on methodological quality using the Medical Education Research Study Quality Instrument (MERSQI) and a published reference case. RESULTS Of 3338 potentially eligible studies, 62 were included. Operational definitions of methodological and reporting quality elements were iteratively revised. Articles reported a mean (SD) of 43% (20%) of CHEERS elements for the Title/Abstract, 56% (34%) for the Introduction, 66% (19%) for the Methods, 61% (17%) for the Results, and 66% (30%) for the Discussion, for an overall reporting index of 292 (83) (maximum 500). Valuation methods were reported infrequently (resource selection 10 of 62 [16%], resource quantitation 10 [16%], pricing 26 [42%]), as were descriptions or discussion of the physicians trained (42 [68%]), training setting (42 [68%]), training intervention (40 [65%]), sensitivity analyses of uncertainty (9 [15%]), and generalizability (30 [48%]). MERSQI scores ranged from 6.0 to 16.0 (mean 11.2 [2.4]). Changes over time in the reporting index (initial 241 [105], final 321 [52]) and MERSQI scores (initial 9.8 [2.7], final 11.9 [1.9]) were not statistically significant (p ≥ 0.08). DISCUSSION Methods and reporting of physician CPD cost evaluations fall short of current standards. Gaps exist in the valuation, analysis, and contextualization of cost outcomes.
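The section percentages and the overall index reported in this abstract are internally consistent: the overall reporting index appears to be the sum of the five mean CHEERS section percentages (each out of 100, hence the stated maximum of 500). A minimal arithmetic check in Python (illustrative only; variable names are ours, not from the paper):

```python
# Mean CHEERS reporting percentages by section, as given in the abstract.
section_pct = {
    "Title/Abstract": 43,
    "Introduction": 56,
    "Methods": 66,
    "Results": 61,
    "Discussion": 66,
}

# Overall reporting index: sum of the five section percentages (max 500).
reporting_index = sum(section_pct.values())
print(reporting_index)  # 292, matching the abstract's overall index
```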
Affiliation(s)
- David A Cook: School of Continuous Professional Development, Mayo Clinic College of Medicine and Science, Rochester, MN, USA; Division of General Internal Medicine, Mayo Clinic, Rochester, MN, USA
- Jonathan Foo: School of Primary and Allied Health Care, Monash University, Victoria, Australia
9.
Datasets for Automated Affect and Emotion Recognition from Cardiovascular Signals Using Artificial Intelligence: A Systematic Review. Sensors 2022;22:2538. [PMID: 35408149; PMCID: PMC9002643; DOI: 10.3390/s22072538]
Abstract
Simple Summary: We reviewed the literature on publicly available datasets used to automatically recognise emotion and affect with artificial intelligence (AI) techniques, with particular interest in databases containing cardiovascular (CV) data, and we assessed the quality of the included papers. We searched the sources up to 31 August 2020. Each identification step was carried out independently by two reviewers to maintain the credibility of the review, with disagreements resolved by discussion. Each action was first planned and described in a protocol posted on the Open Science Framework (OSF) platform. We selected 18 works providing datasets of CV signals for automated affect and emotion recognition. In total, data for 812 participants aged 17 to 47 were analysed. The most frequently recorded signal was electrocardiography, and the authors most often used video stimulation. Notably, much necessary information was missing from many of the works, resulting in mainly low quality among the included papers. Researchers in this field should focus more on how they carry out and report their experiments.
Abstract: Our review aimed to assess the current state and quality of publicly available datasets used for automated affect and emotion recognition (AAER) with artificial intelligence (AI), emphasising cardiovascular (CV) signals. The quality of such datasets is essential for future work to build replicable systems. We investigated nine sources up to 31 August 2020, using a developed search strategy, including studies considering the use of AI in AAER based on CV signals. Two independent reviewers performed the screening of identified records, full-text assessment, data extraction, and credibility appraisal; all discrepancies were resolved by discussion. We descriptively synthesised the results and assessed their credibility. The protocol was registered on the Open Science Framework (OSF) platform. Of 4649 records identified, 195 proceeded to full-text assessment and 18 were selected, focusing on datasets containing CV signals for AAER. The included papers analysed and shared data of 812 participants aged 17 to 47. Electrocardiography was the most explored signal (83.33% of datasets), and video stimulation was the most frequently used (52.38% of experiments). Despite these results, much information was not reported by the researchers, and the quality of the analysed papers was mainly low. Researchers in the field should concentrate more on methodology.
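The dataset count behind the electrocardiography percentage can be checked directly. The count of 15 ECG datasets is our inference from 83.33% of the 18 included datasets, not a figure stated in the paper:

```python
total_datasets = 18   # datasets selected in the review
ecg_datasets = 15     # inferred: 83.33% of 18 rounds to exactly this count

pct = round(100 * ecg_datasets / total_datasets, 2)
print(pct)  # 83.33, matching the abstract
```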
10.
Stojan J, Haas M, Thammasitboon S, Lander L, Evans S, Pawlik C, Pawlikowska T, Lew M, Khamees D, Peterson W, Hider A, Grafton-Clarke C, Uraiby H, Gordon M, Daniel M. Online learning developments in undergraduate medical education in response to the COVID-19 pandemic: A BEME systematic review: BEME Guide No. 69. Med Teach 2022;44:109-129. [PMID: 34709949; DOI: 10.1080/0142159x.2021.1992373]
Abstract
BACKGROUND The COVID-19 pandemic spurred an abrupt transition away from in-person educational activities. This systematic review investigated the pivot to online learning for nonclinical undergraduate medical education (UGME) activities and explored descriptions of the educational offerings deployed, their impact, and lessons learned. METHODS The authors systematically searched four online databases and conducted a manual electronic search of MedEdPublish up to December 21, 2020. Two authors independently screened titles, abstracts, and full texts, performed data extraction, and assessed risk of bias; a third author resolved discrepancies. Findings were reported in accordance with the STORIES (STructured apprOach to the Reporting in healthcare education of Evidence Synthesis) statement and BEME guidance. RESULTS Fifty-six articles were included. The majority (n = 41) described the rapid transition of existing offerings to online formats, whereas fewer (n = 15) described novel activities. The majority (n = 27) included a combination of synchronous and asynchronous components. Didactics (n = 40) and small groups (n = 26) were the most common instructional methods. Teachers largely integrated technology to replace and amplify rather than transform learning, though learner engagement was often interactive. Thematic analysis revealed unique challenges of online learning as well as exemplary practices. The quality of study designs and reporting was modest, with underpinning theory at highest risk of bias. Virtually all studies (n = 54) assessed reaction/satisfaction, fewer than half (n = 23) assessed changes in attitudes, knowledge, or skills, and none assessed behavioral, organizational, or patient outcomes. CONCLUSIONS UGME educators successfully transitioned face-to-face instructional methods online and implemented novel solutions during the COVID-19 pandemic. Although technology's potential to transform teaching is not yet fully realized, the use of synchronous and asynchronous formats encouraged virtual engagement while offering flexible, self-directed learning. As education moves from emergency remote learning to a post-pandemic world, educators must underpin new developments with theory, report additional outcomes, and provide details that support replication.
Affiliation(s)
- Jennifer Stojan
- Internal Medicine and Pediatrics, University of Michigan Medical School, Ann Arbor, MI, USA
- Mary Haas
- Internal Medicine and Pediatrics, University of Michigan Medical School, Ann Arbor, MI, USA
- Satid Thammasitboon
- Department of Pediatrics, Texas Children's Hospital and Baylor College of Medicine, Houston, TX, USA
- Lina Lander
- Family Medicine and Public Health, University of California San Diego School of Medicine, La Jolla, CA, USA
- Sean Evans
- Family Medicine and Public Health, University of California San Diego School of Medicine, La Jolla, CA, USA
- Cameron Pawlik
- Internal Medicine and Pediatrics, University of Michigan Medical School, Ann Arbor, MI, USA
- Madelyn Lew
- Internal Medicine and Pediatrics, University of Michigan Medical School, Ann Arbor, MI, USA
- Deena Khamees
- McGovern Medical School, University of Texas Health Science Center, Houston, TX, USA
- William Peterson
- Internal Medicine and Pediatrics, University of Michigan Medical School, Ann Arbor, MI, USA
- Ahmad Hider
- Internal Medicine and Pediatrics, University of Michigan Medical School, Ann Arbor, MI, USA
- Hussein Uraiby
- School of Medicine, University of Leicester, Leicester, UK
- Morris Gordon
- Blackpool Victoria Hospital, Blackpool, UK
- School of Medicine, University of Central Lancashire, Preston, UK
- Michelle Daniel
- Family Medicine and Public Health, University of California San Diego School of Medicine, La Jolla, CA, USA
11
Doja A, Lavin Venegas C, Cowley L, Wiesenfeld L, Writer H, Clarkin C. Barriers and facilitators to program directors' use of the medical education literature: a qualitative study. BMC MEDICAL EDUCATION 2022; 22:45. [PMID: 35045845 PMCID: PMC8772128 DOI: 10.1186/s12909-022-03104-4] [Citation(s) in RCA: 2] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Figures] [Subscribe] [Scholar Register] [Received: 04/08/2021] [Accepted: 12/10/2021] [Indexed: 06/14/2023]
Abstract
BACKGROUND It is unclear how often frontline clinical teachers use the medical education literature and its evidence base in teaching and assessment. Our study purpose was to examine postgraduate program director perspectives on the utilization and integration of evidence-based medical education literature in their teaching and assessment practices. METHODS The authors conducted semi-structured telephone interviews with a convenience sample of current and former program directors from across Canada. Interviews were transcribed and analyzed inductively to distil pertinent themes. RESULTS In 2017, 11 former and current program directors participated in interviews. Major themes uncovered included the desire for time-efficient and easily adaptable teaching and assessment tools. Participants reported insufficient time to examine the medical education literature and preferred that it be 'synthesized for them' (i.e., best-evidence guidelines). Participants recognised continuing professional development and peer-to-peer sharing as useful means of education about evidence-based tools. Barriers to the integration of the literature in practice included inadequate time, lack of financial compensation for teaching and assessment, and the perception that teaching and assessment of trainees were not valued in academic promotion. DISCUSSION Faculty development offices should consider the time constraints of clinical teachers when planning programming on teaching and assessment. To enhance uptake, medical education publications need to consider approaches that best meet the needs of targeted audiences, including frontline clinical teachers. This may involve novel methods and formats that render evidence and findings from their studies more easily 'digestible' by clinical teachers, narrowing the knowledge-to-practice gap.
Affiliation(s)
- Asif Doja
- Department of Pediatrics, Children's Hospital of Eastern Ontario, 401 Smyth Rd, Ottawa, ON, K1H 8L1, Canada
- Carolina Lavin Venegas
- Children's Hospital of Eastern Ontario Research Institute, 401 Smyth Rd, Ottawa, ON, K1H 8L1, Canada
- Lindsay Cowley
- Department for Innovation in Medical Education Research Support Unit, University of Ottawa, 451 Smyth Rd, Ottawa, ON, K1H 8M5, Canada
- Hilary Writer
- Department of Pediatrics, Children's Hospital of Eastern Ontario, 401 Smyth Rd, Ottawa, ON, K1H 8L1, Canada
- Chantalle Clarkin
- Department of Virtual Mental Health and Outreach, Centre for Addiction and Mental Health, 1001 Queen St West, Toronto, Ontario, M6J 1H4, Canada
12
Hudson E, Clavel N, Kilpatrick K, Lavoie-Tremblay M. Effective online learning strategies for leadership and policy undergraduate courses for nursing students: a rapid review. J Prof Nurs 2021; 37:1079-1085. [PMID: 34887026 DOI: 10.1016/j.profnurs.2021.08.012] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/15/2021] [Indexed: 10/20/2022]
Abstract
OBJECTIVES Due to the importance of developing leadership competencies during nursing education, it is critical to make evidence-based decisions regarding the transformation from face-to-face to online delivery of leadership and policy courses for nursing students in the wake of the COVID-19 pandemic. This rapid review aims to identify effective learning strategies for creating online leadership and policy courses for undergraduate nursing students. DATA SOURCES AND REVIEW METHODS A rapid review methodology was used. Searches in CINAHL and ERIC yielded 4112 records. After screening, seven articles were included. The Criteria for Describing and Evaluating Training Interventions in Healthcare Professions (CRe-DEPTH) tool was used for quality appraisal and data extraction. A narrative synthesis approach was used to summarize the data. RESULTS The learning activities were heterogeneous in terms of content and format. Articles described the use of discussion forums, case studies, virtual clinical learning experiences, microblogging, and video clips. The methods of evaluation for these learning activities also varied greatly. CONCLUSION The findings will act as a steppingstone to help develop an online undergraduate leadership and policy nursing course. This review also demonstrated the need for rigorous evaluation of learning activities. The use of a tool such as the CRe-DEPTH can help instructors plan and report on their learning interventions or courses.
Affiliation(s)
- Emilie Hudson
- Ingram School of Nursing, McGill University, 680 Sherbrooke St. West, Montreal, QC H3A 2M7, Canada
- Nathalie Clavel
- Ingram School of Nursing, McGill University, 680 Sherbrooke St. West, Montreal, QC H3A 2M7, Canada
- Kelley Kilpatrick
- Ingram School of Nursing, McGill University, 680 Sherbrooke St. West, Montreal, QC H3A 2M7, Canada
- Mélanie Lavoie-Tremblay
- Ingram School of Nursing, McGill University, 680 Sherbrooke St. West, Montreal, QC H3A 2M7, Canada
13
Sztramko R, Levinson AJ, Wurster AE, Jezrawi R, Sivapathasundaram B, Papaioannou A, Cowan D, St Onge J, Marr S, Patterson C, Woo T, Mosca L, Lokker C. Online Educational Tools for Caregivers of People with Dementia: A Scoping Literature Review. Can Geriatr J 2021; 24:351-366. [PMID: 34912490 PMCID: PMC8629496 DOI: 10.5770/cgj.24.506] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/22/2022] Open
Abstract
BACKGROUND Informal caregivers provide the majority of health-based care to people with dementia. Providing this care requires knowledge and access to resources, which caregivers often do not receive. We set out to evaluate the effect of online educational tools on informal caregiver self-efficacy, quality of life, burden/stress, depression, and anxiety, and to identify effective processes for online educational tool development. METHODS We conducted a scoping review of articles on online educational interventions for informal caregivers of people with dementia, searching CINAHL, MEDLINE, EMBASE, and PubMed from 1990 to March 2018, with an updated search conducted in 2020. The identified articles were screened and the data were charted. RESULTS 33 articles that reported on 24 interventions were included. There is some evidence that online interventions improve caregiver-related outcomes such as self-efficacy, depression, dementia knowledge, and quality of life, and decrease caregiver burden. Common findings across the studies included the need for tailored, stage-specific information applicable to the caregiver's situation and the use of psychosocial techniques to develop the knowledge components of the interventions. CONCLUSION We demonstrate the importance of having caregivers and health-care professionals involved at all stages of tool conceptualization and development. Online tools should be evaluated with robust trials that focus on how increased knowledge and development approaches affect caregiver-related outcomes.
Affiliation(s)
- Richard Sztramko
- Division of Geriatric Medicine, Department of Medicine, McMaster University, Hamilton
- GERAS Centre, St. Peter’s Hospital, Hamilton
- Anthony J. Levinson
- Division of e-Learning Innovation, Faculty of Health Sciences, McMaster University, Hamilton
- GERAS Centre, St. Peter’s Hospital, Hamilton
- Andrea E. Wurster
- GERAS Centre, St. Peter’s Hospital, Hamilton
- Department of Health Research Methods, Evidence, and Impact, McMaster University, Hamilton, ON
- Rita Jezrawi
- Department of Health Research Methods, Evidence, and Impact, McMaster University, Hamilton, ON
- Alexandra Papaioannou
- Division of Geriatric Medicine, Department of Medicine, McMaster University, Hamilton
- GERAS Centre, St. Peter’s Hospital, Hamilton
- Department of Health Research Methods, Evidence, and Impact, McMaster University, Hamilton, ON
- David Cowan
- Division of Geriatric Medicine, Department of Medicine, McMaster University, Hamilton
- GERAS Centre, St. Peter’s Hospital, Hamilton
- Joye St Onge
- Division of Geriatric Medicine, Department of Medicine, McMaster University, Hamilton
- GERAS Centre, St. Peter’s Hospital, Hamilton
- Sharon Marr
- Division of Geriatric Medicine, Department of Medicine, McMaster University, Hamilton
- GERAS Centre, St. Peter’s Hospital, Hamilton
- Christopher Patterson
- Division of Geriatric Medicine, Department of Medicine, McMaster University, Hamilton
- GERAS Centre, St. Peter’s Hospital, Hamilton
- Tricia Woo
- Division of Geriatric Medicine, Department of Medicine, McMaster University, Hamilton
- GERAS Centre, St. Peter’s Hospital, Hamilton
- Lori Mosca
- Division of e-Learning Innovation, Faculty of Health Sciences, McMaster University, Hamilton
- Cynthia Lokker
- Department of Health Research Methods, Evidence, and Impact, McMaster University, Hamilton, ON
14
Hill E, Gurbutt D, Makuloluwa T, Gordon M, Georgiou R, Roddam H, Seneviratne S, Byrom A, Pollard K, Abhayasinghe K, Chance-Larsen K. Collaborative healthcare education programmes for continuing professional education in low and middle-income countries: A Best Evidence Medical Education (BEME) systematic review. BEME Guide No. 65. MEDICAL TEACHER 2021; 43:1228-1241. [PMID: 34499841 DOI: 10.1080/0142159x.2021.1962832] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/13/2023]
Abstract
BACKGROUND Large discrepancies exist between standards of healthcare provision in high-income (HICs) and low and middle-income countries (LMICs). The root cause is often financial, resulting in poor infrastructure and under-resourced education and healthcare systems. Continuing professional education (CPE) programmes improve staff knowledge, skills, retention, and practice, but remain costly and rare in low-resource settings. One potential solution involves healthcare education collaborations between institutions in HICs and LMICs to provide culturally appropriate CPE in LMICs. To be effective, educational partnerships must address the challenges arising from differences in cultural norms, language, available technology and organisational structures within collaborating countries. METHODS Seven databases and other sources were systematically searched on 7 July 2020 for relevant studies. Citations, abstracts, and studies were screened and consensus was reached on which to include within the review. 54 studies were assessed regarding the type of educational programme involved, the nature of HIC/LMIC collaboration and quality of the study design. RESULTS Studies varied greatly regarding the types and numbers of healthcare professionals involved, pedagogical and delivery methods, and the ways in which collaboration was undertaken. Barriers and enablers of collaboration were identified and discussed. The key findings were: 1. The methodological quality of reporting in the studies was generally poor. 2. The way in which HIC/LMIC healthcare education collaboration is undertaken varies according to many factors, including what is to be delivered, the learner group, the context, and the resources available. 3. Western bias was a major barrier. 4. The key to developing successful collaborations was the quality, nature, and duration of the relationships between those involved. 
CONCLUSION This review provides insights into factors that underpin successful HIC/LMIC healthcare CPE collaborations and outlines inequities and quality issues in reporting.
Affiliation(s)
- Elaine Hill
- School of Sport and Health Sciences, UCLan, Preston, UK
- Dawne Gurbutt
- Centre for Collaborative Learning, UCLan, Preston, UK
- Thamasi Makuloluwa
- Faculty of Medicine, General Sir John Kotelawala Defence University, Ratmalana, Sri Lanka
- Hazel Roddam
- School of Sport and Health Sciences, UCLan, Preston, UK
- Sujatha Seneviratne
- Department of Nursing and Midwifery, University of Sri Jayewardenepura, Nugegoda, Sri Lanka
- Anna Byrom
- School of Community Health and Midwifery, UCLan, Preston, UK
- Kerry Pollard
- School of Community Health and Midwifery, UCLan, Preston, UK
- Kalpani Abhayasinghe
- Department of Nursing and Midwifery, General Sir John Kotelawala Defence University, Ratmalana, Sri Lanka
15
Guckian J, Utukuri M, Asif A, Burton O, Adeyoju J, Oumeziane A, Chu T, Rees EL. Social media in undergraduate medical education: A systematic review. MEDICAL EDUCATION 2021; 55:1227-1241. [PMID: 33988867 DOI: 10.1111/medu.14567] [Citation(s) in RCA: 33] [Impact Index Per Article: 11.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/26/2020] [Revised: 05/06/2021] [Accepted: 05/08/2021] [Indexed: 06/12/2023]
Abstract
INTRODUCTION There are over 3.81 billion active social media (SoMe) users worldwide. SoMe are ubiquitous in medical education, with roles across undergraduate programmes, including professionalism, blended learning, well-being and mentoring. Previous systematic reviews took place before recent explosions in SoMe popularity and revealed a paucity of high-quality empirical studies assessing its effectiveness in medical education. This review aimed to synthesise evidence regarding SoMe interventions in undergraduate medical education, to identify features associated with positive and negative outcomes. METHODS Authors searched 31 key terms through seven databases, in addition to reference, citation and hand searching, between 16 June and 16 July 2020. Studies describing SoMe interventions and research on exposure to existing SoMe were included. Title, abstract and full paper screening were undertaken independently by two reviewers. Included papers were assessed for methodological quality using the Medical Education Research Study Quality Instrument (MERSQI) and/or the Standards for Reporting Qualitative Research (SRQR) instrument. Extracted data were synthesised using narrative synthesis. RESULTS 112 studies from 26 countries met inclusion criteria. Methodological quality of included studies had not significantly improved since 2013. Engagement and satisfaction with SoMe platforms in medical education are described. Students felt SoMe flattened hierarchies and improved communication with educators. SoMe use was associated with improvement in objective knowledge assessment scores and self-reported clinical and professional performance; however, evidence for long-term knowledge retention was limited. SoMe use was occasionally linked to adverse impacts upon mental and physical health. Professionalism was heavily investigated and considered important, though generally negative correlations between SoMe use and medical professionalism may exist.
CONCLUSIONS Social media is enjoyable for students, may improve short-term knowledge retention, and can aid communication between learners and educators. However, higher-quality study is required to identify longer-term impact upon knowledge and skills, provide clarification on professionalism standards and protect against harms.
Affiliation(s)
- Jonathan Guckian
- Dermatology Department, Leeds Teaching Hospitals NHS Trust, Yorkshire, UK
- School of Medical Education, Newcastle University, Newcastle Upon Tyne, UK
- Mrudula Utukuri
- School of Clinical Medicine, University of Cambridge, Cambridge, UK
- Aqua Asif
- Leicester Medical School, University of Leicester, Leicester, UK
- Oliver Burton
- Warwick Medical School, University of Warwick, Coventry, UK
- Joshua Adeyoju
- Faculty of Medicine, University of Southampton, Southampton, UK
- Adam Oumeziane
- School of Medicine, Anglia Ruskin University, Chelmsford, UK
- Timothy Chu
- School of Medical Education, Newcastle University, Newcastle Upon Tyne, UK
- Eliot L Rees
- School of Medicine, Keele University, Newcastle-under-Lyme, UK
- Research Department of Primary Care and Population Health, University College London, London, UK
16
Jujo S, Nakahira A, Kataoka Y, Banno M, Tsujimoto Y, Tsujimoto H, Oikawa S, Matsui H, Berg BW. Transesophageal Echocardiography Simulator Training: A Systematic Review and Meta-analysis of Randomized Controlled Trials. Simul Healthc 2021; 16:341-352. [PMID: 33428355 DOI: 10.1097/sih.0000000000000537] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/26/2022]
Abstract
SUMMARY STATEMENT We aimed to assess the learning effects of novice transesophageal echocardiography (TEE) simulator training and to identify gaps in existing studies. We performed a systematic review and meta-analysis of randomized controlled trials (RCTs) comparing the learning effects of novice TEE training with versus without simulators, searching published articles and proceedings in 6 major databases in June 2019. We included 9 RCTs (268 participants). Compared with nonsimulator training, TEE simulator training resulted in higher skill and knowledge posttraining test scores with large effect sizes (standardized mean difference = 0.81 for skill, 1.61 for knowledge; low-certainty evidence) and higher training satisfaction with a small effect size (standardized mean difference = 0.36; very low-certainty evidence). No RCTs reported training budget or patient outcomes. Additional well-designed studies with low risk of bias and large sample sizes are needed to provide reliable and robust findings and develop more effective TEE simulation-based training curricula.
Affiliation(s)
- Satoshi Jujo
- From the SimTiki Simulation Center (S.J., A.N., B.W.B.), John A. Burns School of Medicine, University of Hawaii at Manoa, Honolulu, HI; Department of Anesthesiology (S.J., H.M.), Kameda General Hospital, Chiba; Department of Critical Care Medicine (A.N.), Nara Prefecture General Medical Center, Nara; Department of Respiratory Medicine (Y.K.) and Hospital Care Research Unit (Y.K., H.T.), Hyogo Prefectural Amagasaki General Medical Center, Hyogo; Department of Psychiatry (M.B.), Seichiryo Hospital; Department of Psychiatry (M.B.), Nagoya University Graduate School of Medicine, Aichi; Department of Nephrology and Dialysis (Y.T.), Kyoritsu Hospital, Hyogo; and Department of Healthcare Epidemiology (Y.T.), School of Public Health in the Graduate School of Medicine, and Medical Education Center (S.O.), Graduate School of Medicine, Kyoto University, Kyoto, Japan
17
Hope D, Dewar A, Hay C. Is There a Replication Crisis in Medical Education Research? ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2021; 96:958-963. [PMID: 33735127 DOI: 10.1097/acm.0000000000004063] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/12/2023]
Abstract
Scholars are increasingly aware that studies, across many disciplines, cannot be replicated by independent researchers. Here, the authors describe how medical education research may be vulnerable to this "replication crisis," explain how researchers can act together to reduce risks, and discuss the positive steps that can increase confidence in research findings. Medical education research contributes to policy and influences practitioner behavior. Findings that cannot be replicated suggest that the original research was not credible. This risk raises the possibility that unhelpful or even harmful changes to medical education have been implemented as a result of research that appeared defensible but was not. By considering these risk factors, researchers can increase the likelihood that studies are generating credible results. The authors discuss and provide examples of 6 factors that may endanger the replicability of medical education research: (1) small sample sizes, (2) small effect sizes, (3) exploratory designs, (4) flexibility in design choices, analysis strategy, and outcome measures, (5) conflicts of interest, and (6) very active fields with many competing research teams. Importantly, medical education researchers can adopt techniques used successfully elsewhere to improve the rigor of their investigations. Researchers can improve their work through better planning in the development stage, carefully considering design choices, and using sensible data analysis. The wider medical education community can help by encouraging higher levels of collaboration among medical educators, by routinely evaluating existing educational innovations, and by raising the prestige of replication and collaborative medical education research. Medical education journals should adopt new approaches to publishing. As medical education research improves, so too will the quality of medical education and patient care.
Affiliation(s)
- David Hope
- D. Hope is a senior lecturer in medical education, Medical Education Unit, University of Edinburgh, Edinburgh, Scotland, United Kingdom; ORCID: https://orcid.org/0000-0001-6623-2857
- Avril Dewar
- A. Dewar is a fellow in medical education, Medical Education Unit, University of Edinburgh, Edinburgh, Scotland, United Kingdom; ORCID: https://orcid.org/0000-0003-1992-6148
- Christopher Hay
- C. Hay is an interventional radiologist, Royal Infirmary of Edinburgh, Edinburgh, Scotland, United Kingdom
18
Jordan J, Coates WC, Gottlieb M, Soares WE, Shah KH, Love JN. The Impact of a Medical Education Research Faculty Development Program on Career Development, Through the Lens of Social Cognitive Career Theory. AEM EDUCATION AND TRAINING 2021; 5:e10565. [PMID: 34124511 PMCID: PMC8171782 DOI: 10.1002/aet2.10565] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/02/2020] [Revised: 11/23/2020] [Accepted: 11/24/2020] [Indexed: 06/12/2023]
Abstract
OBJECTIVES The Medical Education Research Certificate at the Council of Residency Directors in Emergency Medicine (MERC at CORD), a specialized adaptation of the Association of American Medical Colleges MERC program, provides faculty development in education research in emergency medicine. However, its long-term influence on career development remains unknown. Our study explored the impact of MERC at CORD on career development through the lens of social cognitive career (SCC) theory. METHODS This was a prospective qualitative study using a constructivist/interpretivist paradigm to assess long-term career development outcomes. A purposeful randomized stratified sampling strategy of MERC at CORD graduates (2011-2014) ensured diversity of representation (sex, region, number of research publications, and project group leadership). Subjects were invited by e-mail to participate in semistructured phone interviews. Thematic analysis by two independent reviewers followed an iterative process until saturation was reached. RESULTS Twelve graduates were interviewed. All engaged with MERC at CORD early in their careers with minimal previous education research experience. Currently, all hold medical education leadership positions. Graduates had a mean of 19.3 publications (range = 9-43). Themes explaining reasons for participating in MERC at CORD include: desire for education research skills, recommendation of mentors/colleagues, and accessibility. Themes citing the program's value to career development include networking/collaboration, mentorship, informational framework to build upon, and the application of theoretical knowledge through experiential learning. MERC at CORD impacted career development aligning with the core domains of SCC theory including self-efficacy, outcome expectations, and goals. CONCLUSION MERC at CORD enhanced the long-term career development of participants by providing a core knowledge framework in a mentored, experiential learning environment. 
Participants identified themes aligned with SCC theory as influential in their long-term career advancement in medical education including the development of education research skills, successful completion of education research, career acceleration, promotion, niche development, and formulation of professional goals.
Affiliation(s)
- Jaime Jordan
- Department of Emergency Medicine, David Geffen School of Medicine at UCLA, Los Angeles, CA, USA
- Wendy C. Coates
- Department of Emergency Medicine, David Geffen School of Medicine at UCLA, Los Angeles, CA, USA
- Michael Gottlieb
- Department of Emergency Medicine, Rush University Medical Center, Chicago, IL, USA
- William E. Soares
- Department of Emergency Medicine, Institute of Healthcare Delivery and Population Science, University of Massachusetts Medical School-Baystate, Springfield, MA, USA
- Kaushal H. Shah
- Department of Emergency Medicine, Weill Cornell Medical School, New York, NY, USA
- Jeffrey N. Love
- Department of Emergency Medicine, George Washington University and Georgetown University, Washington, DC, USA
19
Effectiveness of blended learning in pharmacy education: A systematic review and meta-analysis. PLoS One 2021; 16:e0252461. [PMID: 34138880 PMCID: PMC8211173 DOI: 10.1371/journal.pone.0252461] [Citation(s) in RCA: 9] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 01/25/2021] [Accepted: 05/15/2021] [Indexed: 12/02/2022] Open
Abstract
Background & objective Though blended learning (BL) is widely adopted in higher education, evaluating the effectiveness of BL is difficult because its components can be extremely heterogeneous. The purpose of this study was to evaluate the effectiveness of BL in improving knowledge and skill in pharmacy education. Methods PubMed/MEDLINE, Scopus and the Cochrane Library were searched to identify published literature. Retrieved studies were screened by title and abstract, followed by full text, in accordance with pre-defined inclusion and exclusion criteria. Methodological quality was appraised with a modified Ottawa scale, and a random-effects model was used for statistical modelling. Key findings A total of 26 studies were included in the systematic review, of which 20 studies with 4525 participants that employed traditional teaching in the control group entered the meta-analysis. Results showed a statistically significant positive effect size on knowledge (standardized mean difference [SMD]: 1.35, 95% confidence interval [CI]: 0.91 to 1.78, p<0.00001) and skill (SMD: 0.68; 95% CI: 0.19 to 1.16; p = 0.006) using a random-effects model. Subgroup analysis of cohort studies showed that studies from developed countries had a larger effect size (SMD: 1.54, 95% CI: 1.01 to 2.06) than studies from developing countries (SMD: 0.44, 95% CI: 0.23 to 0.65); studies with MCQ-based outcome assessment had a larger effect size (SMD: 2.81, 95% CI: 1.76 to 3.85) than non-MCQ studies (SMD: 0.53, 95% CI: 0.33 to 0.74); and BL with case studies (SMD: 2.72, 95% CI: 1.86 to 3.59) showed a better effect size than non-case-based studies (SMD: 0.22, CI: 0.02 to 0.41). Conclusion BL is associated with better academic performance and achievement than didactic teaching in pharmacy education.
20
Ting DK, Boreskie P, Luckett-Gatopoulos S, Gysel L, Lanktree MB, Chan TM. Quality Appraisal and Assurance Techniques for Free Open Access Medical Education (FOAM) Resources: A Rapid Review. Semin Nephrol 2021; 40:309-319. [PMID: 32560781 DOI: 10.1016/j.semnephrol.2020.04.011] [Citation(s) in RCA: 17] [Impact Index Per Article: 5.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/22/2022]
Abstract
Free open access medical education (FOAM) has disrupted traditional modes of knowledge translation and dissemination. These are popular resources with a wide educational reach. Nephrology has been a leader in FOAM, but many skeptics still question the accuracy and reliability of this content. Recently, quality-assurance techniques have been developed to address these concerns. These techniques may be helpful for readers to appraise the online literature and for institutions to reward the production of high-quality open educational resources. We performed a rapid review of the literature. A medical librarian conducted a systematic search of the Medline and Cumulative Index of Nursing and Allied Health Literature databases. Two independent assessors screened and selected articles, performed a hand-search of reference lists, and scored articles on their quality using the Medical Education Research Study Quality Instrument. Thirteen reports were included for the final descriptive analysis. We identified 10 quality-assessment techniques, 4 of which have been validated. The quality of the reports was fairly high, with an average Medical Education Research Study Quality Instrument score of 11.5 of 18 (SD, 2.3; range, 7.25-14.25). The calculated Cronbach α was 0.85. There is a burgeoning literature on the topic of critical appraisal of open educational resources and, more specifically, FOAM resources. Many of the techniques used are of varying quality and were developed with different intended uses and audiences. By continuing to refine these tools, we can not only support and legitimize the FOAM movement, but also foster individual critical appraisal skills that are increasingly necessary in this age of information.
Affiliation(s)
- Daniel K Ting
- Department of Emergency Medicine, University of British Columbia, Vancouver, BC, Canada
- Patrick Boreskie
- Department of Emergency Medicine, Max Rady College of Medicine, University of Manitoba, Winnipeg, MB, Canada
- S Luckett-Gatopoulos
- Division of Emergency Medicine, Department of Medicine, Western University, London, ON, Canada; Division of Paediatric Emergency Medicine, Department of Paediatrics, McMaster University, Hamilton, ON, Canada
- Lisa Gysel
- Interior Health, Royal Inland Hospital Library, Kamloops, BC, Canada
- Matthew B Lanktree
- Division of Nephrology, Department of Medicine, McMaster University, Hamilton, ON, Canada
- Teresa M Chan
- Division of Emergency Medicine, Department of Medicine, McMaster University, Hamilton, ON, Canada; Department of Emergency Medicine, Hamilton General Hospital, Hamilton Health Sciences Centre, Hamilton, ON, Canada; Program for Faculty Development, Faculty of Health Sciences, McMaster University, Hamilton, ON, Canada; McMaster Education Research, Innovation, and Theory Program (MERIT), McMaster University, Hamilton, ON, Canada.
21
Shaw LK, Kiegaldie D, Jones C, Morris ME. Improving hospital falls screening and mitigation using a health professional education framework. NURSE EDUCATION TODAY 2021; 98:104695. [PMID: 33517181 DOI: 10.1016/j.nedt.2020.104695] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 09/11/2020] [Revised: 10/13/2020] [Accepted: 11/30/2020] [Indexed: 06/12/2023]
Abstract
OBJECTIVE Although health professional education has the potential to mitigate hospital falls risk, the best methods to develop, deliver and evaluate health professional education remain unclear. This study applied evidence-based approaches to education design to improve falls risk mitigation. DESIGN Mixed methods using questionnaires to evaluate health professionals' knowledge of evidence-based falls risk assessment and mitigation, followed by semi-structured interviews with individual health professionals. SETTING Five large Australian hospitals. PARTICIPANTS For each hospital, 10 clinical leaders from nursing and allied health professions were invited to participate in falls workshops. METHODS 46 participants received a three-hour education program on the latest evidence in hospital falls risk assessment and how to implement evidence-based falls screening and management. This was based on the "4P" education model (Presage, Planning, Process and Product). They were taught practical skills to enable them to educate other health professionals. RESULTS The education workshop significantly changed participants' views about best practice guidelines for falls screening and prevention. Participants felt more confident in assessing falls risk and judging and implementing the best mitigation strategies. They were prepared and motivated to educate others about falls prevention and satisfied with the skills gained. CONCLUSIONS A high-quality education program grounded in a rigorous quality framework improved health professionals' knowledge regarding evidence-based falls prevention. Use of evidence-based rationales for behaviour change promotes effective learning.
Affiliation(s)
- Louise K Shaw
- Faculty of Health Science, Youth and Community Studies, Holmesglen Institute, 488 South Road, Moorabbin, Vic 3189, Australia.
- Debra Kiegaldie
- Faculty of Health Science, Youth and Community Studies, Holmesglen Institute, 488 South Road, Moorabbin, Vic 3189, Australia; Monash University, Australia; Healthscope ARCH, Victorian Rehabilitation Centre, Glen Waverley 3150, Australia.
- Cathy Jones
- Healthscope, Level 1, 312 St Kilda Rd, Melbourne, 3004, Australia.
- Meg E Morris
- School of Allied Health, La Trobe Centre for Sport and Exercise Medicine Research, La Trobe University, Victoria 3086, Australia; Healthscope ARCH, Victorian Rehabilitation Centre, Glen Waverley 3150, Australia.
22
Janke KK, Hager KD, Sharma A. Unpacking student learning from an early experience with the Pharmacists' Patient Care Process. CURRENTS IN PHARMACY TEACHING & LEARNING 2020; 12:1447-1460. [PMID: 33092775 DOI: 10.1016/j.cptl.2020.07.003] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 06/09/2019] [Revised: 07/08/2020] [Accepted: 07/14/2020] [Indexed: 06/11/2023]
Abstract
BACKGROUND To illuminate learning, a case study approach was used to examine early, authentic experiences within Pharmacists' Patient Care Process (PPCP)-focused practices. EDUCATIONAL ACTIVITY Six students were matched with five practitioners and spent five half-days in a primary care clinic in a PPCP-committed health system. Students practiced interviewing, determining the patient's medication experience, and formulating the beginnings of the assessment, as well as observing and debriefing on the completion of the process by a practitioner mentor. The Five R Model was used to prompt student learning reflection. In addition, instructors examined students' work for evidence of transformative learning and observations were captured using forms of reflective practice and collaborative debriefing. CRITICAL ANALYSIS OF THE EDUCATIONAL ACTIVITY Reflection performance ratings varied; however, there was strong evidence of transformative learning for all students. Specifically, most student reflections demonstrated a focus on elaborating on existing frames of reference. The most prevalent indicator of transformative learning was exploration of options for new roles, relationships, and actions. The codes from instructors' observations revealed five categories of learning evidence, with the strongest in the patient centeredness category. The process of reviewing student work products, documenting instructor observations, and collaborative debriefing resulted in insights for curricular improvement and explanations for learning difficulties. Further work is needed in understanding student experiential learning intentions and their influence on learning and reflection. Additionally, further research should explore the value of longitudinal assessment of reflection and the value of assessing student work products using criteria beyond traditional reflection criteria.
Affiliation(s)
- Kristin K Janke
- University of Minnesota College of Pharmacy-Twin Cities, 7-159 Weaver Densford Hall, 308 Harvard St SE, Minneapolis, MN 55455, United States.
- Keri D Hager
- University of Minnesota College of Pharmacy, Duluth, 211 Life Science, 1110 Kirby Dr, Duluth, MN 55812, United States.
- Anita Sharma
- Blue Cross Blue Shield, 3535 Blue Cross Road, Eagan, MN 55122, United States.
23
Shaw L, Kiegaldie D, Farlie MK. Education interventions for health professionals on falls prevention in health care settings: a 10-year scoping review. BMC Geriatr 2020; 20:460. [PMID: 33167884 PMCID: PMC7653707 DOI: 10.1186/s12877-020-01819-x] [Citation(s) in RCA: 23] [Impact Index Per Article: 5.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/03/2020] [Accepted: 10/05/2020] [Indexed: 02/06/2023] Open
Abstract
Background Falls in hospitals are a major risk to patient safety. Health professional education has the potential to be an important aspect of falls prevention interventions. This scoping review was designed to investigate the extent of falls prevention education interventions available for health professionals, and to determine the quality of reporting. Method A five stage scoping review process was followed based on Arksey and O’Malley’s framework and refined by the Joanna Briggs Institute Methodology for JBI Scoping Reviews. Searches of five online databases identified papers published from January 2008 until May 2019. Papers were independently screened by two reviewers, and data were extracted and analysed using a quality reporting framework. Results Thirty-nine publications were included. Interventions included formal methods of educational delivery (for example, didactic lectures, video presentations), interactive learning activities, experiential learning, supported learning such as coaching, and written learning material. Few studies employed comprehensive education design principles. None used a reporting framework to plan, evaluate, and document the outcomes of educational interventions. Conclusions Although health professional education is recognised as important for falls prevention, no uniform education design principles have been utilised in research published to date, despite commonly reported program objectives. Standardised reporting of education programs has the potential to improve the quality of clinical practice and allow studies to be compared and evaluated for effectiveness across healthcare settings.
Affiliation(s)
- L Shaw
- Faculty of Health Science, Youth and Community Studies, Holmesglen Institute, 488 South Road, Moorabbin, VIC, 3189, Australia; School of Allied Health, Human Services and Sport, La Trobe University, Bundoora, Victoria, 3086, Australia.
- D Kiegaldie
- Faculty of Health Science, Youth and Community Studies and Healthscope Hospitals, Holmesglen Institute, 488 South Road, Moorabbin, VIC, 3189, Australia; Eastern Clinical School, Faculty of Medicine, Nursing & Health Sciences, Monash University, Melbourne, Australia
- M K Farlie
- Department of Physiotherapy, School of Primary and Allied Health Care, Faculty of Medicine, Nursing and Health Sciences, Monash University, Moorooduc Highway, Frankston, VIC, 3199, Australia
24
Simulation Research Rubric: Further Analysis of Published Simulation Studies and Future Implications. Clin Simul Nurs 2020. [DOI: 10.1016/j.ecns.2020.08.013] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/17/2022]
25
Sattelmayer KM, Jagadamma KC, Sattelmayer F, Hilfiker R, Baer G. The assessment of procedural skills in physiotherapy education: a measurement study using the Rasch model. Arch Physiother 2020; 10:9. [PMID: 32509329 PMCID: PMC7249622 DOI: 10.1186/s40945-020-00080-0] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/12/2019] [Accepted: 05/07/2020] [Indexed: 11/16/2022] Open
Abstract
Background Procedural skills are a key element in the training of future physiotherapists. Procedural skills relate to the acquisition of appropriate motor skills, which allow the safe application of clinical procedures to patients. In order to evaluate procedural skills in physiotherapy education, validated assessment instruments are required. Recently the assessment of procedural skills in physiotherapy education (APSPT) tool was developed. The overall aim of this study was to establish the structural validity of the APSPT. In order to do this the following objectives were examined: i) the fit of the items of APSPT to the Rasch model, ii) the fit of the overall score to the Rasch model, iii) the difficulty of each test item and iv) whether the difficulty levels of the individual test items cover the whole capacity spectrum of students in pre-registration physiotherapy education. Methods For this observational cross-sectional measurement properties study a convenience sample of 69 undergraduate pre-registration physiotherapy students of the HES-SO Valais-Wallis was recruited. Participants were instructed to perform a task procedure on a simulated patient. The performance was evaluated with the APSPT. A conditional maximum likelihood approach was used to estimate the parameters of a partial credit model for polytomous item responses. Item fit, ordering of thresholds, targeting and goodness of fit to the Rasch model were assessed. Results Item fit statistics showed that 25 items of the APSPT showed adequate fit to the Rasch model. Disordering of item thresholds did not occur and the targeting of the APSPT was adequate to measure the abilities of the included participants. Unidimensionality and subgroup homogeneity were confirmed. Conclusion This study presented evidence for the structural validity of the APSPT. Unidimensionality of the APSPT was confirmed, providing evidence that the latent dimension of procedural skills in physiotherapy education consists of several subcategories. However, the results should be interpreted with caution given the small sample size.
Affiliation(s)
- Karl Martin Sattelmayer
- School of Health Sciences, Physiotherapy, Queen Margaret University, Edinburgh, Scotland; School of Health Sciences, University of Applied Sciences and Arts Western Switzerland Valais (HES-SO Valais-Wallis), Leukerbad, Switzerland
- Kavi C Jagadamma
- School of Health Sciences, Physiotherapy, Queen Margaret University, Edinburgh, Scotland
- Roger Hilfiker
- School of Health Sciences, University of Applied Sciences and Arts Western Switzerland Valais (HES-SO Valais-Wallis), Leukerbad, Switzerland
- Gillian Baer
- School of Health Sciences, Physiotherapy, Queen Margaret University, Edinburgh, Scotland
26
Kumar B, Swee ML, Suneja M. Leadership training programs in graduate medical education: a systematic review. BMC MEDICAL EDUCATION 2020; 20:175. [PMID: 32487056 PMCID: PMC7268469 DOI: 10.1186/s12909-020-02089-2] [Citation(s) in RCA: 13] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 08/30/2019] [Accepted: 05/24/2020] [Indexed: 06/09/2023]
Abstract
BACKGROUND With the increasing recognition that leadership skills can be acquired, there is a heightened focus on incorporating leadership training as a part of graduate medical education. However, there is a considerable lack of agreement regarding how to facilitate the acquisition of these skills by resident, chief resident, and fellow physicians. METHODS Articles were identified through a search of Ovid MEDLINE, EMBASE, CINAHL, ERIC, PsycNet, Cochrane Systematic Reviews, and the Cochrane Central Register of Controlled Trials from 1948 to 2019. Additional sources were identified through contacting authors and scanning references. We included articles that described and evaluated leadership training programs in the United States and Canada. Methodological quality was assessed via the MERSQI (Medical Education Research Study Quality Instrument). RESULTS Fifteen studies, which collectively included 639 residents, chief residents, and fellows, met the eligibility criteria. The format, content, and duration of these programs varied considerably. The majority focused on conflict management, interpersonal skills, and stress management. Twelve were prospective case series and three were retrospective. Seven used pre- and post-test surveys, while seven used course evaluations. Only three had follow-up evaluations after 6 months to 1 year. MERSQI scores ranged from 6 to 9. CONCLUSIONS Despite interest in incorporating structured leadership training into graduate medical education curricula, there is a lack of methodologically rigorous studies evaluating its effectiveness. High-quality well-designed studies, focusing particularly on the validity of content, internal structure, and relationship to other variables, are required in order to determine if these programs have a lasting effect on the acquisition of leadership skills.
Affiliation(s)
- Bharat Kumar
- Internal Medicine in the Division of Immunology at the University of Iowa Hospitals and Clinics, 200 Hawkins Drive, Iowa City, IA, 52245, USA.
- Melissa L Swee
- Internal Medicine in the Division of Nephrology at the University of Iowa Hospitals and Clinics, Iowa City, IA, USA
- Manish Suneja
- Medicine Residency Program Director in the Department of Internal Medicine, University of Iowa Hospitals and Clinics, Iowa City, IA, USA
27
Fragkos KC, Crampton PES. The Effectiveness of Teaching Clinical Empathy to Medical Students: A Systematic Review and Meta-Analysis of Randomized Controlled Trials. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2020; 95:947-957. [PMID: 31688037 DOI: 10.1097/acm.0000000000003058] [Citation(s) in RCA: 21] [Impact Index Per Article: 5.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/24/2023]
Abstract
PURPOSE Clinical empathy is a necessary trait to provide effective patient care, despite differences in how it is defined and constructed. The aim of this study was to examine whether empathy interventions in medical students are effective and how confounding factors potentially moderate this effect. METHOD The authors performed a systematic review and meta-analysis. They searched the literature published between 1948 and 2018 for randomized controlled trials that examined empathy interventions in medical students. The search (database searching, citation tracking, hand-searching relevant journals) yielded 380 studies, which they culled to 16 that met the inclusion criteria. For the meta-analysis, they used a random effects model to produce a pooled estimate of the standardized mean difference (SMD), then completed subgroup analyses. RESULTS The authors found evidence of the possibility of response and reporting bias. The pooled SMD was 0.68 (95% confidence interval 0.43, 0.93), indicating a moderately positive effect of students developing empathy after an intervention compared with those in the control groups. There was no evidence of publication bias, but heterogeneity was significantly high (I² = 88.5%, P < .01). Subgroup analyses indicated that significant moderating factors for developing empathy were age, country, scope of empathy measurement, type of empathy intervention, and presence of rehearsal. Moderating factors with limited evidence were sex, study quality, journal impact factor, and intervention characteristics. CONCLUSIONS Despite heterogeneity and biases, empathy interventions in medical students are effective. These findings reinforce arguments in the literature and add considerable rigor from the meta-analysis. The authors propose a conceptual model for educators to follow when designing empathy interventions in medical students.
Affiliation(s)
- Konstantinos C Fragkos
- K.C. Fragkos is clinical fellow in gastroenterology, University College London Hospitals, National Health Service Foundation Trust, London, United Kingdom; ORCID: https://orcid.org/0000-0002-7677-7989. P.E.S. Crampton is lecturer, Health Professions Education Unit, Hull York Medical School, York, United Kingdom, adjunct research fellow, University College London Medical School, London, United Kingdom, and adjunct research fellow, Monash Centre for Scholarship in Health Education, Monash University, Victoria, Australia; ORCID: https://orcid.org/0000-0001-8744-930X
28
Danielson AR, Venugopal S, Mefford JM, Clarke SO. How do novices learn physical examination skills? A systematic review of the literature. MEDICAL EDUCATION ONLINE 2019; 24:1608142. [PMID: 31032719 PMCID: PMC6495115 DOI: 10.1080/10872981.2019.1608142] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 03/01/2019] [Revised: 04/10/2019] [Accepted: 04/11/2019] [Indexed: 05/23/2023]
Abstract
BACKGROUND Physical examination (PE) skills are vital for patient care, and many medical students receive their first introduction to them in their pre-clinical years. A substantial amount of curriculum time is devoted to teaching these skills in most schools. Little is known about the best way to introduce PE skills to novice learners. OBJECTIVE Our objective was to conduct a systematic review of how medical students are first taught PE skills and the evidence supporting these strategies. DESIGN We searched ERIC, SCOPUS, MEDLINE, PubMed and EMBASE for descriptions of complete PE curricula for novice learners. Inclusion criteria were: (1) English language; (2) subjects were enrolled in medical school and were in the preclinical portion of their training; (3) description of a method to teach physical examination skills for the first time; (4) description of the study population; (5) description of a complete PE curriculum. We used the Medical Education Research Study Quality Instrument (MERSQI) score to evaluate the quality of evidence provided. RESULTS Our search returned 5,418 articles; 32 articles met our inclusion criteria. Two main types of curricula were reported: comprehensive 'head-to-toe' PE curricula (18%) and organ system-based curricula (41%). No studies compared these directly, and only two evaluated trainees' clinical performance. The rest of the articles described interventions used across curricula (41%). Median MERSQI score was 10.1 (interquartile range 8.1-12.4). We found evidence for the use of non-faculty teaching associates, technology-enhanced PE education, and the addition of clinical exposure to formal PE teaching. CONCLUSIONS The current literature on teaching PE is focused on describing innovations to head-to-toe and organ system-based curricula rather than their relative effectiveness, and is further limited by its reliance on short-term outcomes. The optimal strategy for novice PE instruction remains unknown.
Affiliation(s)
- Aaron R. Danielson
- Department of Emergency Medicine, University of California at Davis, Sacramento, CA, USA
- Sandhya Venugopal
- Division of Cardiovascular Medicine, University of California at Davis, Sacramento, CA, USA
- Jason M. Mefford
- Department of Emergency Medicine, Kaiser Permanente, Santa Clara, CA, USA
- Samuel O. Clarke
- Department of Emergency Medicine, University of California at Davis, Sacramento, CA, USA
29
Foo J, Cook DA, Walsh K, Golub R, Abdalla ME, Ilic D, Maloney S. Cost evaluations in health professions education: a systematic review of methods and reporting quality. MEDICAL EDUCATION 2019; 53:1196-1208. [PMID: 31402515 DOI: 10.1111/medu.13936] [Citation(s) in RCA: 26] [Impact Index Per Article: 5.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 02/06/2019] [Revised: 03/27/2019] [Accepted: 06/20/2019] [Indexed: 06/10/2023]
Abstract
CONTEXT High-quality research into education costs can inform better decision making. Improvements to cost research can be guided by information about the research questions, methods and reporting of studies evaluating costs in health professions education (HPE). Our objective was to appraise the overall state of the field and evaluate temporal trends in the methods and reporting quality of cost evaluations in HPE research. METHODS We searched the MEDLINE, CINAHL (Cumulative Index to Nursing and Allied Health Literature), EMBASE, Business Source Complete and ERIC (Education Resources Information Centre) databases on 31 July 2017. To evaluate trends over time, we sampled research reports at 5-year intervals (2001, 2006, 2011 and 2016). All original research studies in HPE that reported a cost outcome were included. The Medical Education Research Study Quality Instrument (MERSQI) and the BMJ economic checklist were used to appraise methodological and reporting quality, respectively. Trends in quality over time were analysed. RESULTS A total of 78 studies were included, of which 16 were published in 2001, 15 in 2006, 20 in 2011 and 27 in 2016. The region most commonly represented was the USA (n = 43). The profession most commonly referred to was that of the physician (n = 46). The mean ± standard deviation (SD) MERSQI score was 10.9 ± 2.6 out of 18, with no significant change over time (p = 0.55). The mean ± SD BMJ score was 13.5 ± 7.1 out of 35, with no significant change over time (p = 0.39). A total of 49 (63%) studies stated a cost-related research question, 23 (29%) stated the type of cost evaluation used, and 31 (40%) described the method of estimating resource quantities and unit costs. A total of 16 studies compared two or more interventions and reported both cost and learning outcomes. CONCLUSIONS The absolute number of cost evaluations in HPE is increasing. 
However, there are shortcomings in the quality of methodology and reporting, and these are not improving over time.
Affiliation(s)
- Jonathan Foo
- Department of Physiotherapy, Faculty of Medicine, Nursing and Health Sciences, Monash University, Frankston, Victoria, Australia
- David A Cook
- Division of General Internal Medicine, Mayo Clinic College of Medicine, Rochester, Minnesota, USA
- Robert Golub
- Northwestern University Feinberg School of Medicine, Chicago, Illinois, USA; JAMA Editorial Office, Chicago, Illinois, USA
- Dragan Ilic
- School of Public Health and Preventive Medicine, Monash University, Frankston, Victoria, Australia; Monash Centre for Scholarship in Health Education, Monash University, Frankston, Victoria, Australia
- Stephen Maloney
- Department of Physiotherapy, Faculty of Medicine, Nursing and Health Sciences, Monash University, Frankston, Victoria, Australia; Monash Centre for Scholarship in Health Education, Monash University, Frankston, Victoria, Australia
30
Jordan J, Shah K, Phillips AW, Hartman N, Love J, Gottlieb M. Use of the "Step-back" Method for Education Research Consultation at the National Level: A Pilot Study. AEM EDUCATION AND TRAINING 2019; 3:347-352. [PMID: 31637352 PMCID: PMC6795354 DOI: 10.1002/aet2.10349] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/07/2018] [Revised: 03/21/2019] [Accepted: 03/23/2019] [Indexed: 06/10/2023]
Abstract
BACKGROUND There are a limited number of emergency medicine (EM) physicians with expertise in education research. The Harvard Macy "step-back" method is an emerging model utilized to gather group feedback. Despite its use in multiple educational settings, there are few published data demonstrating its effectiveness. OBJECTIVES Our objective was to create and evaluate a national faculty development session providing consultation in education research utilizing the step-back method. METHODS This was a pilot study. EM experts in education research from across the country served as facilitators for a faculty development session held at the 2018 Council of Emergency Medicine Residency Directors Academic Assembly. Small groups consisting of two or three facilitators and one or two participants were formed and each participant underwent a step-back consultation for their education research study. Participants wrote their study question before and after the session. After the session, facilitators and participants completed an evaluative survey consisting of multiple-choice, Likert-type, and free-response items. Descriptive statistics were reported. Qualitative analysis using a thematic approach was performed on free-response data. Participant study questions were assessed by the PICO (population, intervention, comparison, outcome) and FINER (feasible, interesting, novel, ethical, relevant) criteria. Both scales were evaluated using a two-way random-consistency intraclass correlation. Before and after scores were evaluated with a paired t-test. RESULTS Twenty-four facilitators and 13 participants completed the step-back session. Evaluations from 20 facilitators and nine participants were submitted and analyzed. Sixteen of 20 facilitators felt that the step-back method "greatly facilitated" their ability to share their education research expertise. All facilitators and participants recommended that the session be provided at a future academic assembly.
Regarding suggestions for improvement, qualitative analysis revealed three major themes: praise for the session, desire for additional time, and a room set up more conducive to small group work. Seven of nine responding participants felt that the session was "very valuable" for improving the strength of their study methods. Qualitative analysis regarding change in study as a result of the step-back session yielded four major themes: refinement of study question, more specific outcomes and measurements, improvement in study design, and greater understanding of study limitations. Both FINER and PICO scale comparisons showed improvement pre- and postintervention (PICO 60% relative increase; FINER 16% relative increase). Neither achieved statistical significance (PICO t(5) = -1.835, p = 0.126; and FINER t(5) = -1.305, p = 0.249). CONCLUSION A national-level education research consultation utilizing the step-back method was feasible to implement and highly valued by facilitators and participants. Potential positive outcomes include refinement of study question, more specific outcomes and measurements, improvement in study design, and greater understanding of limitations. These results may inform others who want to utilize this method.
Affiliation(s)
- Jaime Jordan
- UCLA Department of Emergency Medicine, Ronald Reagan UCLA Medical Center, Los Angeles, CA
- David Geffen School of Medicine at University of California Los Angeles, Los Angeles, CA
- Kaushal Shah
- Department of Emergency Medicine, Icahn School of Medicine at Mount Sinai Medical Center, New York, NY
- Andrew W Phillips
- Department of Emergency Medicine, University of North Carolina, Chapel Hill, NC
- Nicholas Hartman
- Department of Emergency Medicine, Wake Forest School of Medicine, Winston-Salem, NC
- Jeffrey Love
- Department of Emergency Medicine, George Washington University, Washington, DC
- Michael Gottlieb
- Department of Emergency Medicine, Rush University Medical Center, Chicago, IL
31
Abstract
There has been a dramatic growth of scholarly articles in medical education in recent years. Evaluating medical education research requires specific orientation to issues related to format and content. Our goal is to review the quantitative aspects of research in medical education so that clinicians may understand these articles with respect to framing the study, recognizing methodologic issues, and utilizing instruments for evaluating the quality of medical education research. This review can be used both as a tool when appraising medical education research articles and as a primer for clinicians interested in pursuing scholarship in medical education.
32
Dubosh NM, Jordan J, Yarris LM, Ullman E, Kornegay J, Runde D, Juve AM, Fisher J. Critical Appraisal of Emergency Medicine Educational Research: The Best Publications of 2016. AEM EDUCATION AND TRAINING 2019; 3:58-73. [PMID: 30680348 PMCID: PMC6339548 DOI: 10.1002/aet2.10203] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/15/2018] [Revised: 09/27/2018] [Accepted: 10/02/2018] [Indexed: 05/05/2023]
Abstract
OBJECTIVES The objectives were to critically appraise the emergency medicine (EM) medical education literature published in 2016 and review the highest-quality quantitative and qualitative studies. METHODS A search of the English language literature in 2016 querying MEDLINE, Scopus, Education Resources Information Center (ERIC), and PsycINFO identified 510 papers related to medical education in EM. Two reviewers independently screened all of the publications using previously established exclusion criteria. The 25 top-scoring quantitative studies based on methodology and all six qualitative studies were scored by all reviewers using selected scoring criteria that have been adapted from previous installments. The top-scoring articles were highlighted and trends in medical education research were described. RESULTS Seventy-five manuscripts met inclusion criteria and were scored. Eleven quantitative papers and one qualitative paper were the highest scoring and are summarized in this article. CONCLUSION This annual critical appraisal series highlights the best EM education research articles published in 2016.
Affiliation(s)
- Nicole M. Dubosh
- Beth Israel Deaconess Medical Center and Harvard Medical School, Boston, MA
- Jaime Jordan
- University of California Los Angeles School of Medicine, Torrance, CA
- Edward Ullman
- Beth Israel Deaconess Medical Center and Harvard Medical School, Boston, MA
- Jonathan Fisher
- University of Arizona College of Medicine Phoenix, Maricopa Medical Center, Phoenix, AZ
|
33
|
Research Pioneers in Emergency Medicine-Reflections on Their Paths to Success and Advice to Aspiring Researchers: A Qualitative Study. Ann Emerg Med 2018; 73:555-564. [PMID: 30529113 DOI: 10.1016/j.annemergmed.2018.10.033] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/29/2018] [Revised: 10/24/2018] [Accepted: 10/29/2018] [Indexed: 11/24/2022]
Abstract
STUDY OBJECTIVE Research in basic, translational, and clinical emergency medicine has made great strides since the formalization of emergency medicine as a specialty. Our objective is to identify and analyze strategies used by emergency medicine research pioneers to inform further advancement of research in emergency medicine, particularly for aspiring researchers and those in emerging areas, using medical education in emergency medicine as one example. METHODS This was a prospective, grounded-theory, qualitative study, using a constructivist/interpretivist paradigm. Leading basic science, translational, and clinical emergency medicine researchers who completed residency before 1995 were eligible for structured interviews. Thematic coding followed an iterative process until saturation was reached. A theoretic model was developed and analyzed. RESULTS Research pioneers valued advanced methodological training and mentorship. Barriers to funding were lack of recognition of emergency medicine as a specialty, absence of a research history, and lack of training and funding resources. Deliberate interventions to improve emergency medicine research included educational sessions at national meetings, external (to emergency medicine) mentor pairings, targeted funding by emergency medicine organizations, and involvement with funding agencies. Pioneers facilitate research excellence by serving as mentors and allocating funds or protected time to develop researchers. To advance emerging subfields of research in emergency medicine, pioneers recommend advanced methodological training that is specific to the area, deliberate mentorship, and the formation of research consortia to conduct generalizable outcomes-based studies.
CONCLUSION Research pioneers in emergency medicine cite mentorship, advanced skills obtained through fellowship or graduate degrees, deliberate collaboration with experienced researchers, support from emergency medicine organizations, and forming networks as the cornerstones of success.
|
34
|
Jin Y, Sanger N, Shams I, Luo C, Shahid H, Li G, Bhatt M, Zielinski L, Bantoto B, Wang M, Abbade LP, Nwosu I, Leenus A, Mbuagbaw L, Maaz M, Chang Y, Sun G, Levine MA, Adachi JD, Thabane L, Samaan Z. Does the medical literature remain inadequately described despite having reporting guidelines for 21 years? - A systematic review of reviews: an update. J Multidiscip Healthc 2018; 11:495-510. [PMID: 30310289 PMCID: PMC6166749 DOI: 10.2147/jmdh.s155103] [Citation(s) in RCA: 66] [Impact Index Per Article: 11.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/20/2022] Open
Abstract
PURPOSE Reporting guidelines (eg, the Consolidated Standards of Reporting Trials [CONSORT] statement) are intended to improve reporting standards and enhance the transparency and reproducibility of research findings. Despite the accessibility of such guidelines, researchers are not required to adhere to them. Our goal was to determine the current status of reporting quality in the medical literature and examine whether adherence to reporting guidelines has improved since their inception. MATERIALS AND METHODS Eight reporting guidelines were examined: CONSORT, Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA), STrengthening the Reporting of OBservational studies in Epidemiology (STROBE), Quality of Reporting of Meta-analysis (QUOROM), STAndards for Reporting of Diagnostic accuracy (STARD), Animal Research: Reporting In Vivo Experiments (ARRIVE), Consolidated Health Economic Evaluation Reporting Standards (CHEERS), and Meta-analysis of Observational Studies in Epidemiology (MOOSE). We included English-language reviews published between January 1996 and September 2016 that investigated adherence to reporting guidelines in the literature on clinical trials, systematic reviews, observational studies, meta-analyses, diagnostic accuracy, economic evaluations, and preclinical animal studies. All reviews were found on Web of Science, Excerpta Medica Database (EMBASE), MEDLINE, and the Cumulative Index to Nursing and Allied Health Literature (CINAHL). RESULTS Of the 26,819 records identified by the search, 124 studies were included after screening. We found that 87.9% of the included studies reported suboptimal adherence to reporting guidelines. Factors associated with poor adherence included non-pharmacological interventions, year of publication, and trials concluding with significant results. Improved adherence was associated with study design features such as allocation concealment, random sequence generation, large sample sizes, adequately powered studies, multiple authorships, and publication in journals endorsing guidelines. CONCLUSION We conclude that the level of adherence to reporting guidelines remains suboptimal. Endorsement of reporting guidelines by journals is important and recommended.
Affiliation(s)
- Yanling Jin
- Department of Health Research Methods, Evidence and Impact, McMaster University, Hamilton, ON, Canada
- Nitika Sanger
- Department of Medical Science, Medical Sciences Graduate Program, McMaster University, Hamilton, ON, Canada
- Ieta Shams
- Department of Psychology, Neuroscience and Behaviour, McMaster University, Hamilton, ON, Canada
- Candice Luo
- Faculty of Health Sciences, Bachelors of Health Sciences, McMaster University, Hamilton, ON, Canada
- Hamnah Shahid
- Department of Arts and Science, McMaster University, Hamilton, ON, Canada
- Guowei Li
- Department of Health Research Methods, Evidence and Impact, McMaster University, Hamilton, ON, Canada
- Meha Bhatt
- Department of Health Research Methods, Evidence and Impact, McMaster University, Hamilton, ON, Canada
- Laura Zielinski
- Department of Neuroscience, McMaster Integrative Neuroscience Discovery and Study, McMaster University, Hamilton, ON, Canada
- Bianca Bantoto
- Department of Science, Honours Integrated Sciences Program, McMaster University, Hamilton, ON, Canada
- Mei Wang
- Department of Health Research Methods, Evidence and Impact, McMaster University, Hamilton, ON, Canada
- Luciana Pf Abbade
- Department of Dermatology and Radiotherapy, Botucatu Medical School, Universidade Estadual Paulista, UNESP, São Paulo, Brazil
- Ikunna Nwosu
- Faculty of Health Sciences, Bachelors of Health Sciences, McMaster University, Hamilton, ON, Canada
- Alvin Leenus
- Department of Health Research Methods, Evidence and Impact, McMaster University, Hamilton, ON, Canada
- Lawrence Mbuagbaw
- Department of Health Research Methods, Evidence and Impact, McMaster University, Hamilton, ON, Canada
- Muhammad Maaz
- Department of Health Research Methods, Evidence and Impact, McMaster University, Hamilton, ON, Canada
- Yaping Chang
- Department of Health Research Methods, Evidence and Impact, McMaster University, Hamilton, ON, Canada
- Guangwen Sun
- Department of Health Research Methods, Evidence and Impact, McMaster University, Hamilton, ON, Canada
- Mitchell Ah Levine
- Department of Health Research Methods, Evidence and Impact, McMaster University, Hamilton, ON, Canada
- St. Joseph's Healthcare Hamilton, Hamilton, ON, Canada
- Jonathan D Adachi
- Department of Health Research Methods, Evidence and Impact, McMaster University, Hamilton, ON, Canada
- St. Joseph's Healthcare Hamilton, Hamilton, ON, Canada
- Lehana Thabane
- Department of Health Research Methods, Evidence and Impact, McMaster University, Hamilton, ON, Canada
- St. Joseph's Healthcare Hamilton, Hamilton, ON, Canada
- Zainab Samaan
- Department of Health Research Methods, Evidence and Impact, McMaster University, Hamilton, ON, Canada
- Department of Psychiatry and Behavioural Neurosciences, McMaster University, Hamilton, ON, Canada
|
35
|
Mariani B, Fey MK, Gloe D. The Simulation Research Rubric: A Pilot Study Evaluating Published Simulation Studies. Clin Simul Nurs 2018. [DOI: 10.1016/j.ecns.2018.06.003] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/29/2022]
|
36
|
CAEP 2016 Academic Symposium: A Writer's Guide to Key Steps in Producing Quality Medical Education Scholarship. CAN J EMERG MED 2018; 19:S9-S15. [PMID: 28508740 DOI: 10.1017/cem.2017.30] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
Abstract
A key skill for successful clinician educators is the effective dissemination of scholarly innovations and research. Although there are many ways to disseminate scholarship, the most accepted and rewarded form of educational scholarship is publication in peer-reviewed journals. This paper provides direction for emergency medicine (EM) educators interested in publishing their scholarship via traditional peer-reviewed avenues. It builds upon four literature reviews that aggregated recommendations for writing and publishing high-quality quantitative and qualitative research, innovations, and reviews. Based on the findings from these literature reviews, the recommendations were prioritized for importance and relevance to novice clinician educators by a broad community of medical educators. The top items from the expert vetting process were presented to the 2016 Canadian Association of Emergency Physicians (CAEP) Academic Symposium Consensus Conference on Education Scholarship. This community of EM educators identified the highest yield recommendations for junior medical education scholars. This manuscript elaborates upon the top recommendations identified through this consensus-building process.
|
37
|
Chauvin A, Truchot J, Bafeta A, Pateron D, Plaisance P, Yordanov Y. Randomized controlled trials of simulation-based interventions in Emergency Medicine: a methodological review. Intern Emerg Med 2018; 13:433-444. [PMID: 29147942 DOI: 10.1007/s11739-017-1770-1] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Submit a Manuscript] [Subscribe] [Scholar Register] [Received: 08/31/2017] [Accepted: 11/10/2017] [Indexed: 11/27/2022]
Abstract
The number of trials assessing Simulation-Based Medical Education (SBME) interventions has rapidly expanded. Many studies show that potential flaws in the design, conduct, and reporting of randomized controlled trials (RCTs) can bias their results. We conducted a methodological review of RCTs assessing SBME interventions in Emergency Medicine (EM) and examined their methodological characteristics. We searched MEDLINE via PubMed for RCTs that assessed a simulation intervention in EM, published in 6 general and internal medicine journals and in the top 10 EM journals. The Cochrane Collaboration Risk of Bias tool was used to assess risk of bias, intervention reporting was evaluated with the "template for intervention description and replication" (TIDieR) checklist, and methodological quality was evaluated with the Medical Education Research Study Quality Instrument (MERSQI). Report selection and data extraction were done by 2 independent researchers. From 1394 RCTs screened, 68 trials assessed an SBME intervention; these represent one quarter of our sample. Cardiopulmonary resuscitation (CPR) was the most frequent topic (81%). Random sequence generation and allocation concealment were performed correctly in 66% and 49% of trials, respectively. Blinding of participants and assessors was performed correctly in 19% and 68%, respectively. Risk of attrition bias was low in three-quarters of the studies (n = 51). Risk of selective reporting bias was unclear in nearly all studies. The mean MERSQI score was 13.4 of 18, and only 4% of the reports provided a description that would allow replication of the intervention. Trials assessing simulation represent one quarter of RCTs in EM. Their quality remains unclear, and reproducing the interventions appears challenging because of reporting issues.
Affiliation(s)
- Anthony Chauvin
- Service d'Accueil des Urgences, Emergency Department, Hôpital Lariboisière, Assistance Publique-Hôpitaux de Paris, 2 Rue Ambroise Paré, 75010, Paris, France
- Faculté de Médecine, Université Diderot, Paris, France
- INSERM U1153, Statistic and Epidemiologic Research Center Sorbonne Paris Cité (CRESS), METHODS Team, Hotel-Dieu Hospital, Paris, France
- Jennifer Truchot
- Service d'Accueil des Urgences, Emergency Department, Hôpital Lariboisière, Assistance Publique-Hôpitaux de Paris, 2 Rue Ambroise Paré, 75010, Paris, France
- Faculté de Médecine, Université Diderot, Paris, France
- Ilumens Simulation Department, Paris Descartes University, 45 rue des Saint Pères, 75006, Paris, France
- Aida Bafeta
- INSERM U1153, Statistic and Epidemiologic Research Center Sorbonne Paris Cité (CRESS), METHODS Team, Hotel-Dieu Hospital, Paris, France
- Dominique Pateron
- Sorbonne Universités, UPMC Paris Univ-06, Paris, France
- Service des Urgences, Hôpital Saint Antoine, Assistance Publique-Hôpitaux de Paris (APHP), Paris, France
- Patrick Plaisance
- Service d'Accueil des Urgences, Emergency Department, Hôpital Lariboisière, Assistance Publique-Hôpitaux de Paris, 2 Rue Ambroise Paré, 75010, Paris, France
- Faculté de Médecine, Université Diderot, Paris, France
- Youri Yordanov
- INSERM U1153, Statistic and Epidemiologic Research Center Sorbonne Paris Cité (CRESS), METHODS Team, Hotel-Dieu Hospital, Paris, France
- Sorbonne Universités, UPMC Paris Univ-06, Paris, France
- Service des Urgences, Hôpital Saint Antoine, Assistance Publique-Hôpitaux de Paris (APHP), Paris, France
|
38
|
Jordan J, Coates WC, Clarke S, Runde D, Fowlkes E, Kurth J, Yarris L. The Uphill Battle of Performing Education Scholarship: Barriers Educators and Education Researchers Face. West J Emerg Med 2018; 19:619-629. [PMID: 29760865 PMCID: PMC5942034 DOI: 10.5811/westjem.2018.1.36752] [Citation(s) in RCA: 13] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/13/2017] [Revised: 01/12/2018] [Accepted: 01/11/2018] [Indexed: 11/30/2022] Open
Abstract
Introduction Educators and education researchers report that their scholarship is limited by lack of time, funding, mentorship, expertise, and reward. This study aims to evaluate these groups' perceptions regarding barriers to scholarship and potential strategies for success. Methods Core emergency medicine (EM) educators and education researchers completed an online survey consisting of multiple-choice, 10-point Likert scale, and free-response items in 2015. Descriptive statistics were reported. We applied a thematic qualitative analysis to the free-response items. Results A total of 204 educators and 42 education researchers participated. Education researchers were highly productive: 19/42 reported more than 20 peer-reviewed education scholarship publications on their curricula vitae. In contrast, 68/197 educators reported no education publications within five years. Only a minority of educators (61/197) had formal research training, compared with 25/42 education researchers. Barriers to performing research for both groups were lack of time, competing demands, lack of support, lack of funding, and challenges achieving scientifically rigorous methods and publication. The most common motivators identified were dissemination of knowledge, support of evidence-based practices, and promotion. Respondents advised those who seek greater education research involvement to pursue mentorship, formal research training, collaboration, and rigorous methodological standards. Conclusion The most commonly cited barriers were lack of time and competing demands. Stakeholders were motivated by the desire to disseminate knowledge, support evidence-based practices, and achieve promotion. Suggested strategies for success included formal training, mentorship, and collaboration. This information may inform interventions to support educators in their scholarly pursuits and improve the overall quality of education research in EM.
Affiliation(s)
- Jaime Jordan
- Harbor-UCLA Medical Center, Department of Emergency Medicine, Torrance, California
- Wendy C Coates
- Harbor-UCLA Medical Center, Department of Emergency Medicine, Torrance, California
- Samuel Clarke
- UC Davis Medical Center, Department of Emergency Medicine, Sacramento, California
- Daniel Runde
- University of Iowa, Department of Emergency Medicine, Iowa City, Iowa
- Emilie Fowlkes
- University of Iowa, Department of Emergency Medicine, Iowa City, Iowa
- Jaqueline Kurth
- UCLA Ronald Reagan/Olive View, Department of Emergency Medicine, Los Angeles, California
- Lalena Yarris
- Oregon Health and Science University, Department of Emergency Medicine, Portland, Oregon
|
39
|
Hunter CL, Silvestri S, Ralls G, Stone A, Walker A, Mangalat N, Papa L. Comparing Quick Sequential Organ Failure Assessment Scores to End-tidal Carbon Dioxide as Mortality Predictors in Prehospital Patients with Suspected Sepsis. West J Emerg Med 2018; 19:446-451. [PMID: 29760838 PMCID: PMC5942006 DOI: 10.5811/westjem.2018.1.35607] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/06/2017] [Revised: 12/13/2017] [Accepted: 01/22/2018] [Indexed: 01/20/2023] Open
Abstract
Introduction Early identification of sepsis significantly improves outcomes, suggesting a role for prehospital screening. An end-tidal carbon dioxide (ETCO2) value ≤ 25 mmHg predicts mortality and severe sepsis when used as part of a prehospital screening tool. Recently, the Quick Sequential Organ Failure Assessment (qSOFA) score was also derived as a tool for predicting poor outcomes in potentially septic patients. Methods We conducted a retrospective cohort study among patients transported by emergency medical services to compare the use of ETCO2 ≤ 25 mmHg with a qSOFA score ≥ 2 as a predictor of mortality or a diagnosis of severe sepsis in prehospital patients with suspected sepsis. Results By comparison of receiver operating characteristic curves, ETCO2 had higher discriminatory power to predict mortality, sepsis, and severe sepsis than qSOFA. Conclusion Both non-invasive measures were easily obtainable by prehospital personnel, with ETCO2 performing slightly better as an outcome predictor.
Affiliation(s)
- Christopher L Hunter
- Orlando Regional Medical Center, Department of Emergency Medicine, Orlando, Florida
- University of Central Florida College of Medicine, Department of Emergency Medicine, Orlando, Florida
- Salvatore Silvestri
- Orlando Regional Medical Center, Department of Emergency Medicine, Orlando, Florida
- University of Central Florida College of Medicine, Department of Emergency Medicine, Orlando, Florida
- George Ralls
- Orlando Regional Medical Center, Department of Emergency Medicine, Orlando, Florida
- Amanda Stone
- Orlando Regional Medical Center, Department of Emergency Medicine, Orlando, Florida
- Ayanna Walker
- Orlando Regional Medical Center, Department of Emergency Medicine, Orlando, Florida
- University of Central Florida College of Medicine, Department of Emergency Medicine, Orlando, Florida
- Neal Mangalat
- St Mary's Hospital, Department of Emergency Medicine, St. Louis, Missouri
- Linda Papa
- Orlando Regional Medical Center, Department of Emergency Medicine, Orlando, Florida
- University of Central Florida College of Medicine, Department of Emergency Medicine, Orlando, Florida
|
40
|
Spurlock DR. The Single-Group, Pre- and Posttest Design in Nursing Education Research: It's Time to Move on. J Nurs Educ 2018; 57:69-71. [DOI: 10.3928/01484834-20180123-02] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/20/2022]
|
41
|
Albarqouni L, Glasziou P, Hoffmann T. Completeness of the reporting of evidence-based practice educational interventions: a review. MEDICAL EDUCATION 2018; 52:161-170. [PMID: 29098706 DOI: 10.1111/medu.13410] [Citation(s) in RCA: 26] [Impact Index Per Article: 4.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/30/2017] [Revised: 04/07/2017] [Accepted: 07/04/2017] [Indexed: 05/25/2023]
Abstract
CONTEXT Complete reporting of intervention details in trials of evidence-based practice (EBP) educational interventions is essential to enable clinical educators to translate research evidence about interventions that have been shown to be effective into practice. In turn, this will improve the quality of EBP education. OBJECTIVES This study was designed to examine the completeness of reporting of EBP educational interventions in published studies and to assess whether missing details of educational interventions could be retrieved by searching additional sources and contacting study authors. METHODS A systematic review of controlled trials that had evaluated EBP educational interventions was conducted using a citation analysis technique. Forward and backward citations of the index articles were tracked until March 2016. The TIDieR (template for intervention description and replication) checklist was used to assess the completeness of intervention reporting. Missing details were sought from: (i) the original publication; (ii) additional publicly available sources, and (iii) the study authors. RESULTS Eighty-three articles were included; 45 (54%) were randomised controlled trials (RCTs) and 38 (46%) were non-RCTs. The majority of trials (n = 62, 75%) involved medical professionals. None of the studies completely reported all of the main items of the educational intervention within the original publication or in additional sources. However, details became complete for 17 (20%) interventions after contact with the respective authors. The item most frequently missing was 'intervention materials', which was missing in 80 (96%) of the original publications, in additional sources for 77 (93%) interventions, and in 59 (71%) studies after contact with the authors. Authors of 69 studies were contacted; 33 provided the details requested. 
CONCLUSIONS The reporting of EBP educational interventions is incomplete and remained so for the majority of studies, even after study authors had been contacted for missing information. Collaborative efforts involving authors and editors are required to improve the completeness of reporting of EBP educational interventions.
Affiliation(s)
- Loai Albarqouni
- Centre for Research in Evidence-Based Practice (CREBP), Faculty of Health Sciences and Medicine, Bond University, Gold Coast, Queensland, Australia
- Paul Glasziou
- Centre for Research in Evidence-Based Practice (CREBP), Faculty of Health Sciences and Medicine, Bond University, Gold Coast, Queensland, Australia
- Tammy Hoffmann
- Centre for Research in Evidence-Based Practice (CREBP), Faculty of Health Sciences and Medicine, Bond University, Gold Coast, Queensland, Australia
|
42
|
Clarke SO, Jordan J, Yarris LM, Fowlkes E, Kurth J, Runde D, Coates WC. The View From the Top: Academic Emergency Department Chairs' Perspectives on Education Scholarship. AEM EDUCATION AND TRAINING 2018; 2:26-32. [PMID: 30051062 PMCID: PMC6001505 DOI: 10.1002/aet2.10070] [Citation(s) in RCA: 12] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/10/2017] [Revised: 09/07/2017] [Accepted: 09/23/2017] [Indexed: 05/05/2023]
Abstract
UNLABELLED Education scholarship continues to grow within emergency medicine (EM) and in academic medicine in general. Despite a growing interest, would-be education scholars often struggle to find adequate mentorship, research training, funding, and protected time to produce rigorous scholarship. The ways in which individual academic EM departments can support this mission remains an area in need of description. OBJECTIVES We sought to describe academic EM department chairs' perceptions of education scholarship and the facilitators and barriers to producing high-quality education scholarship. METHODS We conducted a qualitative study using a grounded theory-derived approach. Participants were solicited directly, and semistructured interviews were conducted via telephone. Interviews were transcribed verbatim and were analyzed by three study investigators using a coding matrix. Discrepancies in coding were resolved via in-depth discussion. RESULTS We interviewed seven EM chairs from academic departments throughout North America (six in geographically diverse regions of the United States and one in western Canada). Chairs described education scholarship as lacking clearly defined and measurable outcomes, as well as methodologic rigor. They identified that education faculty within their departments need training and incentives to pursue scholarly work in a system that primarily expects teaching from educators. Chairs acknowledged a lack of access to education research expertise and mentorship within their own departments, but identified potential resources within their local medical schools and universities. They also voiced willingness to support career development opportunities and scholarly work among faculty seeking to perform education research. CONCLUSIONS Academic EM chairs endorse a need for methodologic training, mentorship, and access to expertise specific to education scholarship.
While such resources are often rare within academic EM departments, they may exist within local universities and schools of medicine. Academic EM chairs described themselves as willing and able to support faculty who wish to pursue this type of work.
Affiliation(s)
- Jaime Jordan
- Department of Emergency Medicine, Harbor-UCLA, Torrance, CA
- University of California, Los Angeles, David Geffen School of Medicine, Los Angeles, CA
- Lalena M. Yarris
- Department of Emergency Medicine, Oregon Health & Science University, Portland, OR
- Emilie Fowlkes
- Department of Emergency Medicine, University of Iowa Hospitals and Clinics, Iowa City, IA
- Jaqueline Kurth
- Department of Emergency Medicine, UCLA Ronald Reagan/Olive View, Los Angeles, CA
- Daniel Runde
- Department of Emergency Medicine, University of Iowa Hospitals and Clinics, Iowa City, IA
- Wendy C. Coates
- Department of Emergency Medicine, Harbor-UCLA, Torrance, CA
- University of California, Los Angeles, David Geffen School of Medicine, Los Angeles, CA
|
43
|
Richmond H, Copsey B, Hall AM, Davies D, Lamb SE. A systematic review and meta-analysis of online versus alternative methods for training licensed health care professionals to deliver clinical interventions. BMC MEDICAL EDUCATION 2017; 17:227. [PMID: 29169393 PMCID: PMC5701457 DOI: 10.1186/s12909-017-1047-4] [Citation(s) in RCA: 60] [Impact Index Per Article: 8.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Grants] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/02/2016] [Accepted: 11/02/2017] [Indexed: 05/07/2023]
Abstract
BACKGROUND Online training is growing in popularity, and yet its effectiveness for training licensed Health Care Professionals (HCPs) in clinical interventions is not clear. We aimed to systematically review the literature on the effectiveness of online versus alternative training methods in clinical interventions for licensed HCPs on outcomes of knowledge acquisition, practical skills, clinical behaviour, self-efficacy, and satisfaction. METHODS Seven databases were searched for randomised controlled trials (RCTs) from January 2000 to June 2015. Two independent reviewers rated trial quality and extracted trial data. Comparative effects were summarised as standardised mean differences (SMD) and 95% confidence intervals. Pooled effect sizes were calculated using a random-effects model for three contrasts of online versus (i) interactive workshops, (ii) taught lectures, and (iii) written/electronic manuals. RESULTS We included 14 studies with a total of 1089 participants. Most trials studied medical professionals, used a workshop or lecture comparison, were at high risk of bias, and had small sample sizes (range 21-183). Using the GRADE approach, we found low quality evidence that there was no difference between online training and an interactive workshop for clinical behaviour (SMD 0.12, 95% CI -0.13 to 0.37). We found very low quality evidence of no difference between online methods and both a workshop and a lecture for knowledge (workshop: SMD 0.04, 95% CI -0.28 to 0.36; lecture: SMD 0.22, 95% CI -0.08 to 0.51). Lastly, compared to a manual (n = 3/14), we found very low quality evidence that online methods were superior for knowledge (SMD 0.99, 95% CI 0.02 to 1.96). There were too few studies to draw any conclusions on the effects of online training for practical skills, self-efficacy, and satisfaction across all contrasts.
CONCLUSIONS Online methods may be as effective as alternative methods for training HCPs in clinical interventions on the outcomes of knowledge and clinical behaviour. However, the low quality of the evidence precludes drawing firm conclusions on the relative effectiveness of these training methods. Moreover, the confidence intervals around our effect sizes were large and could encompass important differences in effectiveness. More robust, adequately powered RCTs are needed.
Affiliation(s)
- Helen Richmond
- Warwick Clinical Trials Unit, Division of Health Sciences, Warwick Medical School, University of Warwick, Coventry, UK
- Bethan Copsey
- Centre for Rehabilitation Research, Nuffield Department of Orthopaedics, Rheumatology, and Musculoskeletal Sciences, University of Oxford, Oxford, UK
- Amanda M. Hall
- The George Institute for Global Health, University of Oxford, Oxford, UK
- David Davies
- Warwick Medical School, University of Warwick, Coventry, UK
- Sarah E. Lamb
- Warwick Clinical Trials Unit, Division of Health Sciences, Warwick Medical School, University of Warwick, Coventry, UK
- Centre for Rehabilitation Research, Nuffield Department of Orthopaedics, Rheumatology, and Musculoskeletal Sciences, University of Oxford, Oxford, UK
|
44
|
Stephenson CR, Vaa BE, Wang AT, Schroeder DR, Beckman TJ, Reed DA, Sawatsky AP. Conference presentation to publication: a retrospective study evaluating quality of abstracts and journal articles in medical education research. BMC MEDICAL EDUCATION 2017; 17:193. [PMID: 29121891 PMCID: PMC5680828 DOI: 10.1186/s12909-017-1048-3] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 04/20/2017] [Accepted: 11/02/2017] [Indexed: 05/25/2023]
Abstract
BACKGROUND There is little evidence regarding the comparative quality of abstracts and articles in medical education research. The Medical Education Research Study Quality Instrument (MERSQI), which was developed to evaluate the quality of reporting in medical education, has strong validity evidence for content, internal structure, and relationships to other variables. We used the MERSQI to compare the quality of reporting for conference abstracts, journal abstracts, and published articles. METHODS This is a retrospective study of all 46 medical education research abstracts submitted to the Society of General Internal Medicine 2009 Annual Meeting that were subsequently published in a peer-reviewed journal. We compared MERSQI scores of the abstracts with scores for their corresponding published journal abstracts and articles. Comparisons were performed using the signed rank test. RESULTS Overall MERSQI scores increased significantly for published articles compared with conference abstracts (11.33 vs 9.67; P < .001) and journal abstracts (11.33 vs 9.96; P < .001). Regarding MERSQI subscales, published articles had higher MERSQI scores than conference abstracts in the domains of sampling (1.59 vs 1.34; P = .006), data analysis (3.00 vs 2.43; P < .001), and validity of evaluation instrument (1.04 vs 0.28; P < .001). Published articles also had higher MERSQI scores than journal abstracts in the domains of data analysis (3.00 vs 2.70; P = .004) and validity of evaluation instrument (1.04 vs 0.26; P < .001). CONCLUSIONS To our knowledge, this is the first study to compare the quality of medical education abstracts and journal articles using the MERSQI. Overall, the quality of articles was greater than that of abstracts. However, there were no significant differences between abstracts and articles for the domains of study design and outcomes, which indicates that these MERSQI elements may be applicable to abstracts. Findings also suggest that abstract quality is generally preserved from original presentation to publication.
Affiliation(s)
- Brianna E. Vaa: Division of General Internal Medicine, Mayo Clinic, 200 First Street SW, Rochester, MN 55905, USA
- Amy T. Wang: Division of General Internal Medicine, Mayo Clinic, 200 First Street SW, Rochester, MN 55905, USA; Harborview, University of California Los Angeles, Los Angeles, CA, USA
- Darrell R. Schroeder: Division of Biomedical Statistics and Informatics, Mayo Clinic, Rochester, MN, USA
- Thomas J. Beckman: Division of General Internal Medicine, Mayo Clinic, 200 First Street SW, Rochester, MN 55905, USA
- Darcy A. Reed: Division of Primary Care Internal Medicine, Mayo Clinic, Rochester, MN, USA
- Adam P. Sawatsky: Division of General Internal Medicine, Mayo Clinic, 200 First Street SW, Rochester, MN 55905, USA
45
Jones N, Milanes L, Banales V, Price I, Gomez I, Hughes S. Difficult Interpersonal Encounters with Medical Students and Residents: Two Objective Standardized Teaching Encounters. MedEdPORTAL: The Journal of Teaching and Learning Resources 2017; 13:10640. [PMID: 30800841 PMCID: PMC6338149 DOI: 10.15766/mep_2374-8265.10640] [Received: 06/09/2017] [Accepted: 09/21/2017] [Indexed: 06/09/2023]
Abstract
INTRODUCTION Objective standardized teaching exercises (OSTEs) are widely used to develop professional competencies, especially in the health care professions. An OSTE involves exposing different providers to the same time-limited scenario, which is concurrently observed and/or recorded for either formative or summative evaluation. As there are limited resources available for creating a resident-specific OSTE, especially one applicable to family and community medicine residents, we created and evaluated a resident OSTE (R-OSTE) for second- and third-year family and community medicine residents. METHODS This R-OSTE involved two cases. The first featured Taylor, a third-year medical student resistant to feedback. The second featured Kris, a first-year resident nervous about approaching the attending on duty. Our R-OSTE had residents teaching interpersonal skills to trained actors in a standardized learner role. RESULTS Residents in the teaching role were formatively evaluated by peer observers (fellow residents) and standardized learners on interpersonal domains such as communication and professionalism. Learners gave residents an average performance rating of 4.9 on a 1-to-6 scale (1 = very poor, 6 = excellent). Residents also evaluated the OSTE itself, rating their experience on multiple teaching-related statements. Eighty-six percent of residents agreed this exercise was an appropriate development activity for family medicine residents. Overall, our R-OSTE was rated highly by residents for relevance to teaching. DISCUSSION The residents were rated highly by both peer observers and standardized learners. However, there was little variability in peer observer scores, indicating the need for an alternative method of measurement.
Affiliation(s)
- Nicole Jones: Research Coordinator, Family and Community Medicine Residency Program, University of California, San Francisco-Fresno
- Liana Milanes: Assistant Clinical Professor, Family and Community Medicine Department, University of California, San Francisco-Fresno
- Vanessa Banales: Research Assistant, Joint Internship Program, California State University, Fresno, and University of California, San Francisco-Fresno
- Iris Price: Grants Manager and Research Associate, Family and Community Medicine Residency Program, University of California, San Francisco-Fresno
- Ivan Gomez: Clinical Professor, Family and Community Medicine Department, University of California, San Francisco-Fresno
- Susan Hughes: Research Director, Family and Community Medicine Residency Program, University of California, San Francisco-Fresno
46
47
Jordan J, Yarris LM, Santen SA, Guth TA, Rougas S, Runde DP, Coates WC. Creating a Cadre of Fellowship-Trained Medical Educators, Part II: A Formal Needs Assessment to Structure Postgraduate Fellowships in Medical Education Scholarship and Leadership. Academic Medicine 2017; 92:1181-1188. [PMID: 27805949 DOI: 10.1097/acm.0000000000001460] [Indexed: 05/28/2023]
Abstract
PURPOSE Education leaders at the 2012 Academic Emergency Medicine Consensus Conference on education research proposed that dedicated postgraduate education scholarship fellowships (ESFs) might provide an effective model for developing future faculty as scholars. A formal needs assessment was performed to understand the training gap and inform the development of ESFs. METHOD A mixed-methods needs assessment was conducted of four emergency medicine national stakeholder groups in 2013: department chairs; faculty education/research leaders; existing education fellowship directors; and current education fellows/graduates. Descriptive statistics were reported for quantitative data. Qualitative data from semistructured interviews and free-text responses were analyzed using a thematic approach. RESULTS Participants were 11/15 (73%) education fellowship directors, 13/20 (65%) fellows/graduates, 106/239 (44%) faculty education/research leaders, and a convenience sample of 26 department chairs. Department chairs expected new education faculty to design didactics (85%) and teach clinically (96%). Faculty education/research leaders thought new faculty were inadequately prepared for job tasks (83.7%) and that ESFs would improve the overall quality of education research (91.1%). Fellowship directors noted that ESFs provide skills, mentorship, and protected time for graduates to become productive academicians. Current fellows/graduates reported pursuing an ESF to develop skills in teaching and research methodology. CONCLUSIONS Stakeholder groups uniformly perceived a need for training in education theory, clinical teaching, and education research. These findings support dedicated, deliberate training in these areas. Establishment of a structure for scholarly pursuits prior to assuming a full-time position will effectively prepare new faculty. These findings may inform the development, implementation, and curricula of ESFs.
Affiliation(s)
- Jaime Jordan: assistant director, Residency Training Program, Department of Emergency Medicine, Harbor-UCLA Medical Center, and assistant professor of medicine and vice chair, Acute Care College, University of California, Los Angeles, David Geffen School of Medicine, Los Angeles, California
- L.M. Yarris: associate professor, Department of Emergency Medicine, Oregon Health & Science University, Portland, Oregon
- S.A. Santen: assistant dean for educational research and quality improvement, University of Michigan Medical School, and professor, Department of Emergency Medicine and Department of Learning Health Sciences, University of Michigan, Ann Arbor, Michigan
- T.A. Guth: emergency medicine clerkship codirector and associate director for clinical skills in the Foundations of Doctoring course, Department of Emergency Medicine, University of Colorado School of Medicine, Aurora, Colorado
- S. Rougas: assistant professor of emergency medicine, Alpert Medical School, Brown University, Providence, Rhode Island
- D.P. Runde: assistant program director and assistant professor of emergency medicine, Department of Emergency Medicine, University of Iowa Hospitals and Clinics, Iowa City, Iowa
- W.C. Coates: senior education specialist, Department of Emergency Medicine, Harbor-UCLA Medical Center, and professor of medicine, University of California, Los Angeles David Geffen School of Medicine, Los Angeles, California
48
Nichols DG. Maintenance of Certification and the Challenge of Professionalism. Pediatrics 2017; 139:peds.2016-4371. [PMID: 28557762 DOI: 10.1542/peds.2016-4371] [Accepted: 02/03/2017] [Indexed: 11/24/2022]
Abstract
Board certification has been part of the social contract in which physicians commit to maintaining up-to-date scientific knowledge and improving the quality of patient care. However, the maintenance of certification program has been controversial. This review summarizes the philosophical underpinnings, published literature, recent improvements, and future directions of the American Board of Pediatrics maintenance of certification program.
Affiliation(s)
- David G. Nichols: The American Board of Pediatrics, Chapel Hill, North Carolina
49
Reporting Guidelines for Health Care Simulation Research: Extensions to the CONSORT and STROBE Statements. Simul Healthc 2017; 11:238-48. [PMID: 27465839 DOI: 10.1097/sih.0000000000000150] [Indexed: 12/12/2022]
Abstract
INTRODUCTION Simulation-based research (SBR) is rapidly expanding but the quality of reporting needs improvement. For a reader to critically assess a study, the elements of the study need to be clearly reported. Our objective was to develop reporting guidelines for SBR by creating extensions to the Consolidated Standards of Reporting Trials (CONSORT) and Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) Statements. METHODS An iterative multistep consensus-building process was used on the basis of the recommended steps for developing reporting guidelines. The consensus process involved the following: (1) developing a steering committee, (2) defining the scope of the reporting guidelines, (3) identifying a consensus panel, (4) generating a list of items for discussion via online premeeting survey, (5) conducting a consensus meeting, and (6) drafting reporting guidelines with an explanation and elaboration document. RESULTS The following 11 extensions were recommended for CONSORT item 1 (title/abstract), item 2 (background), item 5 (interventions), item 6 (outcomes), item 11 (blinding), item 12 (statistical methods), item 15 (baseline data), item 17 (outcomes/estimation), item 20 (limitations), item 21 (generalizability), and item 25 (funding). The following 10 extensions were recommended for STROBE: item 1 (title/abstract), item 2 (background/rationale), item 7 (variables), item 8 (data sources/measurement), item 12 (statistical methods), item 14 (descriptive data), item 16 (main results), item 19 (limitations), item 21 (generalizability), and item 22 (funding). An elaboration document was created to provide examples and explanation for each extension. CONCLUSIONS We have developed extensions for the CONSORT and STROBE Statements that can help improve the quality of reporting for SBR.
50
Krishnan K, Thoma B, Trueger NS, Lin M, Chan TM. Gestalt assessment of online educational resources may not be sufficiently reliable and consistent. Perspectives on Medical Education 2017; 6:91-98. [PMID: 28243948 PMCID: PMC5383576 DOI: 10.1007/s40037-017-0343-3] [Indexed: 05/14/2023]
Abstract
PURPOSE Online open educational resources are increasingly used in medical education, particularly blogs and podcasts. However, it is unclear whether these resources can be adequately appraised by end-users. Our goal was to determine whether gestalt-based recommendations are sufficient for emergency medicine trainees and attending physicians to reliably recommend online educational resources to others. METHODS Raters (33 trainees and 21 attendings in emergency medicine from North America) were asked to rate 40 blog posts according to whether, based on their gestalt, they would recommend the resource to (1) a trainee or (2) an attending physician. The ratings' reliability was assessed using intraclass correlation coefficients (ICC). Associations between groups' mean scores were assessed using Pearson's r. A repeated measures analysis of variance (RM-ANOVA) was completed to determine the effect of the level of training on the gestalt recommendation scale (i.e., trainee vs. attending). RESULTS Trainees demonstrated poor reliability when recommending resources for other trainees (ICC = 0.21, 95% CI 0.13-0.39) and attendings (ICC = 0.16, 95% CI 0.09-0.30). Similarly, attendings had poor reliability when recommending resources for trainees (ICC = 0.27, 95% CI 0.18-0.41) and other attendings (ICC = 0.22, 95% CI 0.14-0.35). There were moderate correlations between the mean scores for each blog post when either trainees or attendings considered the same target audience. The RM-ANOVA also corroborated that there is a main effect of the proposed target audience on the ratings by both trainees and attendings. CONCLUSIONS A gestalt-based rating system is not sufficiently reliable when recommending online educational resources to trainees and attendings. Trainees' gestalt ratings for recommending resources for both groups were especially unreliable. Our findings suggest the need for structured rating systems to rate online educational resources.
Affiliation(s)
- Brent Thoma: Department of Emergency Medicine, University of Saskatchewan, Saskatoon, Saskatchewan, Canada
- N Seth Trueger: Department of Emergency Medicine, Northwestern University, Chicago, IL, USA
- Michelle Lin: Department of Emergency Medicine, University of California San Francisco, San Francisco, CA, USA
- Teresa M Chan: Division of Emergency Medicine, Department of Medicine, McMaster University, Hamilton, Ontario, Canada