1
Lee S, Kim HJ, Choi Y, Kim JY, Shin JS. Effectiveness of electrocardiogram interpretation education program using mixed learning methods and webpage. BMC Medical Education 2024; 24:1039. PMID: 39334173; PMCID: PMC11428852; DOI: 10.1186/s12909-024-05960-8.
Abstract
AIM This study was conducted to develop an electrocardiogram education program that incorporates an HTML webpage and blended learning methods to enhance electrocardiogram interpretation skills. Through continual and efficient education, the program aims to assist nurses in providing appropriate care and treatment to patients. DESIGN Pre-post design study. METHODS We developed an electrocardiogram interpretation HTML webpage based on an electrocardiogram interpretation algorithm and implemented an 18-week (May 15 to September 22, 2023) electrocardiogram education program, which included daily 5-minute training sessions. Twenty-seven ward nurses were provided with the URL ( https://ecgweb.github.io/ECGwebEN ) of the electrocardiogram interpretation HTML webpage and shared one electrocardiogram case daily for self-interpretation. Electrocardiogram interpretation performance and confidence were evaluated through questionnaires at three phases: before the program, after 6 weeks of basic electrocardiogram and arrhythmia education, and after 12 weeks of application of the electrocardiogram interpretation HTML webpage and case-based lecture education. The statistical tests used were repeated-measures ANOVA or the Wilcoxon signed-rank test. RESULTS The average score for electrocardiogram interpretation performance was 11.89 (SD = 3.50) before the electrocardiogram education program, 14.15 (SD = 3.68) after 6 weeks of basic electrocardiogram and arrhythmia education, and 15.56 (SD = 3.04) after 12 weeks of application of the electrocardiogram interpretation HTML webpage and case-based lecture education. This shows that electrocardiogram interpretation performance significantly improved over time (p < .001). Additionally, post-hoc analysis revealed significant differences in electrocardiogram interpretation performance at each stage, i.e., before, during, and after the application of the electrocardiogram education program.
Furthermore, the electrocardiogram interpretation confidence questionnaire score (pre: median 18, IQR = 5; post: median 23, IQR = 3) improved significantly after completion of the 18-week education program (p < .001). CONCLUSIONS Based on the results of this study, we believe that an electrocardiogram education program using an HTML webpage and a blended teaching method would be beneficial for maintaining and improving the electrocardiogram interpretation skills of clinical nurses. Such a program can help nurses interpret electrocardiograms more effectively and assist them in making important decisions in patient care.
Affiliation(s)
- Sunhee Lee
- Department of Nursing, Seoul St. Mary's Hospital, The Catholic University of Korea, Seoul, Republic of Korea.
- Hyo Jeong Kim
- Department of Nursing, Seoul St. Mary's Hospital, The Catholic University of Korea, Seoul, Republic of Korea
- Young Choi
- Division of Cardiology, Department of Internal Medicine, Seoul St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Seoul, Republic of Korea
- Ji Yeung Kim
- Department of Nursing, Seoul St. Mary's Hospital, The Catholic University of Korea, Seoul, Republic of Korea
- Ji Sun Shin
- Department of Nursing, Seoul St. Mary's Hospital, The Catholic University of Korea, Seoul, Republic of Korea
2
Olvet DM, Sadigh K. Comparing the effectiveness of asynchronous e-modules and didactic lectures to teach electrocardiogram interpretation to first year US medical students. BMC Medical Education 2023; 23:360. PMID: 37217893; DOI: 10.1186/s12909-023-04338-6.
Abstract
BACKGROUND Medical students are expected to be competent in interpreting electrocardiograms (ECGs) by the time they graduate, but many are unable to master this skill. Studies suggest that e-modules are an effective way to teach ECG interpretation; however, they are typically evaluated for use during clinical clerkships. We sought to determine if an e-module could replace a didactic lecture to teach ECG interpretation during a preclinical cardiology course. METHODS We developed an asynchronous, interactive e-module that consisted of narrated videos, pop-up questions and quizzes with feedback. Participants were first year medical students who were either taught ECG interpretation during a 2-hour didactic lecture (control group) or were given unlimited access to the e-module (e-module group). First-year internal medicine residents (PGY1 group) were included to benchmark where ECG interpretation skills should be at graduation. At three time-points (pre-course, post-course, and 1-year follow-up), participants were evaluated for ECG knowledge and confidence. A mixed-ANOVA was used to compare groups over time. Students were also asked to describe what additional resources they used to learn ECG interpretation throughout the study. RESULTS Data were available for 73 (54%) students in the control group, 112 (81%) in the e-module group and 47 (71%) in the PGY1 group. Pre-course scores did not differ between the control and e-module groups (39% vs. 38%, respectively). However, the e-module group performed significantly better than the control group on the post-course test (78% vs. 66%). In a subsample with 1-year follow-up data, the e-module group's performance decreased, and the control group's remained the same. The PGY1 group's knowledge scores were stable over time. Confidence in both medical student groups increased by the end of the course; however, only pre-course knowledge and confidence were significantly correlated.
Most students relied on textbooks and course materials to learn ECG interpretation, although online resources were also used. CONCLUSIONS An asynchronous, interactive e-module was more effective than a didactic lecture for teaching ECG interpretation; however, continued practice is needed regardless of how students learn to interpret ECGs. Various ECG resources are available to students to support their self-regulated learning.
Affiliation(s)
- Doreen M Olvet
- Department of Science Education, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, NY, 11549, USA.
- Kaveh Sadigh
- Department of Medicine, Renaissance School of Medicine, Stony Brook University, Stony Brook, New York, 11794, USA
3
Oh SY, Cook DA, Van Gerven PWM, Nicholson J, Fairbrother H, Smeenk FWJM, Pusic MV. Physician Training for Electrocardiogram Interpretation: A Systematic Review and Meta-Analysis. Academic Medicine 2022; 97:593-602. PMID: 35086115; DOI: 10.1097/acm.0000000000004607.
Abstract
PURPOSE Using electrocardiogram (ECG) interpretation as an example of a widely taught diagnostic skill, the authors conducted a systematic review and meta-analysis to demonstrate how research evidence on instruction in diagnosis can be synthesized to facilitate improvement of educational activities (instructional modalities, instructional methods, and interpretation approaches), guide the content and specificity of such activities, and provide direction for research. METHOD The authors searched PubMed/MEDLINE, Embase, Cochrane CENTRAL, PsycInfo, CINAHL, ERIC, and Web of Science databases through February 21, 2020, for empirical investigations of ECG interpretation training enrolling medical students, residents, or practicing physicians. They appraised study quality with the Medical Education Research Study Quality Instrument and pooled standardized mean differences (SMDs) using random effects meta-analysis. RESULTS Of 1,002 articles identified, 59 were included (enrolling 17,251 participants). Among 10 studies comparing instructional modalities, 8 compared computer-assisted and face-to-face instruction, with pooled SMD 0.23 (95% CI, 0.09, 0.36) indicating a small, statistically significant difference favoring computer-assisted instruction. Among 19 studies comparing instructional methods, 5 evaluated individual versus group training (pooled SMD -0.35 favoring group study [95% CI, -0.06, -0.63]), 4 evaluated peer-led versus faculty-led instruction (pooled SMD 0.38 favoring peer instruction [95% CI, 0.01, 0.74]), and 4 evaluated contrasting ECG features (e.g., QRS width) from 2 or more diagnostic categories versus routine examination of features within a single ECG or diagnosis (pooled SMD 0.23 not significantly favoring contrasting features [95% CI, -0.30, 0.76]). Eight studies compared ECG interpretation approaches, with pooled SMD 0.92 (95% CI, 0.48, 1.37) indicating a large, statistically significant effect favoring more systematic interpretation approaches. 
CONCLUSIONS Some instructional interventions appear to improve learning in ECG interpretation; however, many evidence-based instructional strategies are insufficiently investigated. The findings may have implications for future research and design of training to improve skills in ECG interpretation and other types of visual diagnosis.
Affiliation(s)
- So-Young Oh
- S.-Y. Oh is assistant director, Program for Digital Learning, Institute for Innovations in Medical Education, NYU Grossman School of Medicine, NYU Langone Health, New York, New York; ORCID: https://orcid.org/0000-0002-4640-3695
- David A Cook
- D.A. Cook is professor of medicine and medical education, director of education science, Office of Applied Scholarship and Education Science, research chair, Mayo Clinic Rochester Multidisciplinary Simulation Center, and consultant, Division of General Internal Medicine, Mayo Clinic College of Medicine and Science, Rochester, Minnesota; ORCID: https://orcid.org/0000-0003-2383-4633
- Pascal W M Van Gerven
- P.W.M. Van Gerven is associate professor, Department of Educational Development and Research, Faculty of Health, Medicine and Life Sciences, Maastricht University, Maastricht, The Netherlands; ORCID: https://orcid.org/0000-0002-8363-2534
- Joseph Nicholson
- J. Nicholson is director, NYU Health Sciences Library, NYU Grossman School of Medicine, NYU Langone Health, New York, New York
- Hilary Fairbrother
- H. Fairbrother is associate professor, Department of Emergency Medicine, Memorial Hermann-Texas Medical Center, Houston, Texas
- Frank W J M Smeenk
- F.W.J.M. Smeenk is professor, Department of Educational Development and Research, Maastricht University, Maastricht, and respiratory specialist, Catharina Hospital, Eindhoven, The Netherlands
- Martin V Pusic
- M.V. Pusic is associate professor of pediatrics and associate professor of emergency medicine, Harvard Medical School, Boston, Massachusetts; ORCID: https://orcid.org/0000-0001-5236-6598
4
Cook DA, Oh SY, Pusic MV. Assessments of Physicians' Electrocardiogram Interpretation Skill: A Systematic Review. Academic Medicine 2022; 97:603-615. PMID: 33913438; DOI: 10.1097/acm.0000000000004140.
Abstract
PURPOSE To identify features of instruments, test procedures, study design, and validity evidence in published studies of electrocardiogram (ECG) skill assessments. METHOD The authors conducted a systematic review, searching MEDLINE, Embase, Cochrane CENTRAL, PsycINFO, CINAHL, ERIC, and Web of Science databases in February 2020 for studies that assessed the ECG interpretation skill of physicians or medical students. Two authors independently screened articles for inclusion and extracted information on test features, study design, risk of bias, and validity evidence. RESULTS The authors found 85 eligible studies. Participants included medical students (42 studies), postgraduate physicians (48 studies), and practicing physicians (13 studies). ECG selection criteria were infrequently reported: 25 studies (29%) selected single-diagnosis or straightforward ECGs; 5 (6%) selected complex cases. ECGs were selected by generalists (15 studies [18%]), cardiologists (10 studies [12%]), or unspecified experts (4 studies [5%]). The median number of ECGs per test was 10. The scoring rubric was defined by 2 or more experts in 32 studies (38%), by 1 expert in 5 (6%), and using clinical data in 5 (6%). Scoring was performed by a human rater in 34 studies (40%) and by computer in 7 (8%). Study methods were appraised as low risk of selection bias in 16 studies (19%), participant flow bias in 59 (69%), instrument conduct and scoring bias in 20 (24%), and applicability problems in 56 (66%). Evidence of test score validity was reported infrequently, namely evidence of content (39 studies [46%]), internal structure (11 [13%]), relations with other variables (10 [12%]), response process (2 [2%]), and consequences (3 [4%]). 
CONCLUSIONS ECG interpretation skill assessments consist of idiosyncratic instruments that are too short, composed of items of obscure provenance, with incompletely specified answers, graded by individuals with underreported credentials, yielding scores with limited interpretability. The authors suggest several best practices.
Affiliation(s)
- David A Cook
- D.A. Cook is professor of medicine and medical education, director of education science, Office of Applied Scholarship and Education Science, research chair, Mayo Clinic Rochester Multidisciplinary Simulation Center, and consultant, Division of General Internal Medicine, Mayo Clinic College of Medicine and Science, Rochester, Minnesota; ORCID: https://orcid.org/0000-0003-2383-4633
- So-Young Oh
- S.-Y. Oh is assistant director, Program for Digital Learning, Institute for Innovations in Medical Education, NYU Grossman School of Medicine, NYU Langone Health, New York, New York; ORCID: https://orcid.org/0000-0002-4640-3695
- Martin V Pusic
- M.V. Pusic is associate professor of emergency medicine and pediatrics, Department of Emergency Medicine, NYU Grossman School of Medicine, New York, New York; ORCID: https://orcid.org/0000-0001-5236-6598
5
Ko Y, Issenberg SB, Roh YS. Effects of peer learning on nursing students' learning outcomes in electrocardiogram education. Nurse Education Today 2022; 108:105182. PMID: 34741917; DOI: 10.1016/j.nedt.2021.105182.
Abstract
BACKGROUND Nurses should have the ability to interpret electrocardiograms (ECGs) quickly and accurately, but their ECG interpretation skills may be suboptimal. The best evidence for effective teaching methods is lacking. OBJECTIVES This study aimed to compare the effects of peer and self-directed individual learning methods on nursing students' learning flow, interpretation skills, and self-confidence in web-based ECG education. DESIGN This study employed a nonequivalent control group with a pretest-posttest design. SETTINGS This study was conducted at two colleges of nursing in the Republic of Korea. PARTICIPANTS Nursing students were conveniently assigned to either a peer learning group (n = 45) or a self-directed individual learning group (n = 51). METHODS A self-administered questionnaire was used to measure the nursing students' learning flow and self-confidence in ECG rhythm interpretation. ECG interpretation skills were measured using a web-based interpretation skills test. Data were analyzed using a paired t-test and a two-sample t-test. RESULTS Nursing students in both groups showed improved learning flow, interpretation skills, and self-confidence after ECG education compared with before learning. However, there were no significant pretest-posttest differences in learning flow, interpretation skills, or self-confidence between the two groups. CONCLUSIONS Peer learning was as effective as self-directed individual learning in improving nursing students' learning flow, interpretation skills, and self-confidence in web-based education. Nurse educators should educate nursing students to have optimal ECG interpretation abilities, and both web-based peer learning and individual learning are effective education methods.
Affiliation(s)
- Youngmin Ko
- Graduate School of Nursing and Health Professions, Chung-Ang University, Seoul, Republic of Korea
- Young Sook Roh
- Red Cross College of Nursing, Chung-Ang University, 84 Heukseok-ro Dongjak-gu, Seoul 06974, Republic of Korea.
6
Utilizing Supplemental Online Modules for Physician Assistant Student Electrocardiogram Interpretation Training. J Physician Assist Educ 2021; 32:242-247. PMID: 34817428; DOI: 10.1097/jpa.0000000000000391.
Abstract
PURPOSE The literature suggests that graduating medical and physician assistant (PA) students lack competency in electrocardiogram (ECG) interpretation. This project aimed to determine whether use of perceptual adaptive learning modules (PALMs) would improve PA students' ECG interpretation, alter self-perceptions of their ECG education, or both. METHODS PALMs were incorporated into the PA curriculum after lecture-based ECG learning. Students' pretest, posttest, and delayed-posttest scores were then compared. Students' ability to correctly interpret ECGs (accuracy) and the percentage of ECGs accurately interpreted within 15 seconds or less (fluency) also were evaluated. Finally, students' perceptions of PALMs and overall ECG training were assessed. RESULTS PALM training improved ECG interpretation accuracy and fluency (p < .0001), as well as delayed-posttest accuracy and fluency (p < .0001). The majority of student respondents felt supplemental training enhanced their learning. CONCLUSION These perception results, combined with data on ECG interpretation improvement, support continued use of supplemental PALMs in ECG interpretation training.
7
Viljoen CA, Millar RS, Manning K, Burch VC. Effectiveness of blended learning versus lectures alone on ECG analysis and interpretation by medical students. BMC Medical Education 2020; 20:488. PMID: 33272253; PMCID: PMC7713171; DOI: 10.1186/s12909-020-02403-y.
Abstract
BACKGROUND Most medical students lack confidence and are unable to accurately interpret ECGs. Thus, better methods of ECG instruction are being sought. Current literature indicates that the use of e-learning for ECG analysis and interpretation skills (ECG competence) is not superior to lecture-based teaching. We aimed to assess whether blended learning (lectures supplemented with the use of a web application) resulted in better acquisition and retention of ECG competence in medical students, compared to conventional teaching (lectures alone). METHODS Two cohorts of fourth-year medical students were studied prospectively. The conventional teaching cohort (n = 67) attended 4 hours of interactive lectures, covering the basic principles of electrocardiography, waveform abnormalities and arrhythmias. In addition to attending the same lectures, the blended learning cohort (n = 64) used a web application that facilitated deliberate practice of systematic ECG analysis and interpretation, with immediate feedback. All participants completed three tests: pre-intervention (assessing baseline ECG competence at start of clinical clerkship), immediate post-intervention (assessing acquisition of ECG competence at end of six-week clinical clerkship) and delayed post-intervention (assessing retention of ECG competence 6 months after clinical clerkship, without any further ECG training). Diagnostic accuracy and uncertainty were assessed in each test. RESULTS The pre-intervention test scores were similar for the blended learning and conventional teaching cohorts (mean 31.02 ± 13.19% versus 31.23 ± 11.52% respectively, p = 0.917).
While all students demonstrated meaningful improvement in ECG competence after teaching, blended learning was associated with significantly better scores, compared to conventional teaching, in immediate (75.27 ± 16.22% vs 50.27 ± 17.10%, p < 0.001; Cohen's d = 1.58) and delayed post-intervention tests (57.70 ± 18.54% vs 37.63 ± 16.35%, p < 0.001; Cohen's d = 1.25). Although diagnostic uncertainty decreased after ECG training in both cohorts, blended learning was associated with better confidence in ECG analysis and interpretation. CONCLUSION Blended learning achieved significantly better levels of ECG competence and confidence amongst medical students than conventional ECG teaching did. Although medical students underwent significant attrition of ECG competence without ongoing training, blended learning also resulted in better retention of ECG competence than conventional teaching. Web applications encouraging a stepwise approach to ECG analysis and enabling deliberate practice with feedback may, therefore, be a useful adjunct to lectures for teaching electrocardiography.
Affiliation(s)
- Charle André Viljoen
- Division of Cardiology, Groote Schuur Hospital, Faculty of Health Sciences, University of Cape Town, Observatory, Cape Town, 7925, South Africa.
- Department of Medicine, Groote Schuur Hospital, Faculty of Health Sciences, University of Cape Town, Observatory, Cape Town, 7925, South Africa.
- Hatter Institute for Cardiovascular Research in Africa, Faculty of Health Sciences, University of Cape Town, Observatory, Cape Town, 7925, South Africa.
- Rob Scott Millar
- Division of Cardiology, Groote Schuur Hospital, Faculty of Health Sciences, University of Cape Town, Observatory, Cape Town, 7925, South Africa
- Department of Medicine, Groote Schuur Hospital, Faculty of Health Sciences, University of Cape Town, Observatory, Cape Town, 7925, South Africa
- Kathryn Manning
- Department of Medicine, Groote Schuur Hospital, Faculty of Health Sciences, University of Cape Town, Observatory, Cape Town, 7925, South Africa
- Vanessa Celeste Burch
- Department of Medicine, Groote Schuur Hospital, Faculty of Health Sciences, University of Cape Town, Observatory, Cape Town, 7925, South Africa
8
Cook DA, Oh SY, Pusic MV. Accuracy of Physicians' Electrocardiogram Interpretations: A Systematic Review and Meta-analysis. JAMA Intern Med 2020; 180:1461-1471. PMID: 32986084; PMCID: PMC7522782; DOI: 10.1001/jamainternmed.2020.3989.
Abstract
IMPORTANCE The electrocardiogram (ECG) is the most common cardiovascular diagnostic test. Physicians' skill in ECG interpretation is incompletely understood. OBJECTIVES To identify and summarize published research on the accuracy of physicians' ECG interpretations. DATA SOURCES A search of PubMed/MEDLINE, Embase, Cochrane CENTRAL (Central Register of Controlled Trials), PsycINFO, CINAHL (Cumulative Index to Nursing and Allied Health), ERIC (Education Resources Information Center), and Web of Science was conducted for articles published from database inception to February 21, 2020. STUDY SELECTION Of 1138 articles initially identified, 78 studies that assessed the accuracy of physicians' or medical students' ECG interpretations in a test setting were selected. DATA EXTRACTION AND SYNTHESIS Data on study purpose, participants, assessment features, and outcomes were abstracted, and methodological quality was appraised with the Medical Education Research Study Quality Instrument. Results were pooled using random-effects meta-analysis. MAIN OUTCOMES AND MEASURES Accuracy of ECG interpretation. RESULTS Of 1138 studies initially identified, 78 assessed the accuracy of ECG interpretation. Across all training levels, the median accuracy was 54% (interquartile range [IQR], 40%-66%; n = 62 studies) on pretraining assessments and 67% (IQR, 55%-77%; n = 47 studies) on posttraining assessments. Accuracy varied widely across studies. The pooled accuracy for pretraining assessments was 42.0% (95% CI, 34.3%-49.6%; n = 24 studies; I2 = 99%) for medical students, 55.8% (95% CI, 48.1%-63.6%; n = 37 studies; I2 = 96%) for residents, 68.5% (95% CI, 57.6%-79.5%; n = 10 studies; I2 = 86%) for practicing physicians, and 74.9% (95% CI, 63.2%-86.7%; n = 8 studies; I2 = 22%) for cardiologists. CONCLUSIONS AND RELEVANCE Physicians at all training levels had deficiencies in ECG interpretation, even after educational interventions. 
Improved education across the practice continuum appears warranted. Wide variation in outcomes could reflect real differences in training or skill or differences in assessment design.
Affiliation(s)
- David A Cook
- Office of Applied Scholarship and Education Science and Division of General Internal Medicine, Mayo Clinic College of Medicine and Science, Rochester, Minnesota
- So-Young Oh
- Institute for Innovations in Medical Education, NYU Grossman School of Medicine, NYU Langone Health, New York, New York
- Martin V Pusic
- Department of Emergency Medicine, NYU Grossman School of Medicine, NYU Langone Health, New York, New York
9
Mabuza LH, Mntla PS. Generalist practitioners' self-rating and competence in electrocardiogram interpretation in South Africa. Afr J Prim Health Care Fam Med 2020; 12:e1-e7. PMID: 32896150; PMCID: PMC7479388; DOI: 10.4102/phcfm.v12i1.2421.
Abstract
Background Electrocardiogram (ECG) is the only practical, non-invasive method of recording and analysing cardiac abnormalities. It enables a primary healthcare (PHC) clinician to detect cardiac and non-cardiac abnormalities, some potentially life-threatening. Their early detection could save a patient's life. Aim The aim of this study was to evaluate the competence of generalist practitioners in ECG interpretation. Setting This study was conducted at the Annual Refresher Course, Council for Scientific and Industrial Research (CSIR), Pretoria. Methods A cross-sectional study was conducted amongst 93 generalist practitioners, using a self-administered questionnaire containing 20 ECG tracings commonly encountered in PHC. The tracings were categorised into primary ECG parameters, ECG emergencies and common ECG abnormalities. Competence was determined by the generalist practitioner's number of correctly interpreted ECG tracings. Data associations were computed using Fisher's exact test. Statistical significance was set at p ≤ 0.05. Results Correct heart rate calculation was achieved by 14/83 (16.9%), ECG rhythm by 7/83 (8.4%), acute antero-septal myocardial infarction (MI) by 29/83 (34.9%), atrial fibrillation by 19/83 (22.9%) and acute inferior MI by 22/83 (26.5%) generalist practitioners. No correlation was found between the practitioners' number of years in practice and competence in ECG interpretation (p > 0.05). The total number of correct answers achieved by all practitioners was 274/1860 (14.7%). Conclusion The generalist practitioners had poor competency in ECG interpretation regardless of the number of years in practice. Their poor self-rating corresponded with the number of correct answers they provided. There is a need for continuous education in ECG interpretation.
Affiliation(s)
- Langalibalele H Mabuza
- Department of Family Medicine and Primary Health Care, Faculty of Health Sciences, Sefako Makgatho Health Sciences University, Pretoria.
10
Rabbitt L, Byrne D, O’Connor P, Gorecka M, Jacobsen A, Lydon S. A pragmatic randomised controlled trial of SAFMEDS to produce fluency in interpretation of electrocardiograms. BMC Medical Education 2020; 20:102. PMID: 32234041; PMCID: PMC7110657; DOI: 10.1186/s12909-020-02021-8.
Abstract
BACKGROUND SAFMEDS (Say-All-Fast-Minute-Every-Day-Shuffled) is a flashcard-type behavioural instructional methodology, involving one-minute learning trials that function both as practice and assessment, used to facilitate the development of fluency in a behaviour. The primary research question was whether SAFMEDS engenders improvement in performance beyond that conferred by usual teaching. A secondary research question was whether SAFMEDS is an effective method of producing fluency in electrocardiogram (ECG) interpretation. METHODS A pilot study was conducted to determine the sample size required to power the pragmatic randomised controlled trial (RCT). For the subsequent RCT, participants were randomly assigned to a "usual teaching" control group (n = 14) or the SAFMEDS intervention group (n = 13), with the recognition of 15 cardiac conditions on ECGs (e.g., atrial fibrillation, complete heart block) targeted. Intervention group participants' performance was tracked over eight weeks as they worked towards achieving the fluency criterion. Percentage accuracy in ECG interpretation was assessed at baseline and post-test for both groups. An ANCOVA was conducted to assess for differences in the performance of the intervention and control groups at post-test while controlling for the baseline performance of participants. At post-test, the number of participants achieving fluency within the intervention group was examined. RESULTS A large effect size of SAFMEDS (partial η2 = .67) was identified when controlling for the effects of baseline performance. At post-test, the intervention group (M = 61.5%; SD = 12.1%) significantly outperformed the control group (M = 31.6%; SD = 12.5%, p < .001). In total, 7 of 13 intervention group participants achieved fluency. Participants required an average of 51.9 one-minute trials (SD = 18.8) to achieve fluency. CONCLUSIONS SAFMEDS offers a useful adjunct to usual teaching within medical education.
Further research could assess whether learning is retained, remains stable, and transfers to clinical practice.
Affiliation(s)
- Louise Rabbitt
- School of Medicine, National University of Ireland Galway, 1 Distillery Road, Galway, Ireland
- Dara Byrne
- School of Medicine, National University of Ireland Galway, 1 Distillery Road, Galway, Ireland
- Irish Centre for Applied Patient Safety and Simulation, National University of Ireland Galway, Galway, Ireland
| | - Paul O’Connor
- Irish Centre for Applied Patient Safety and Simulation, National University of Ireland Galway, Galway, Ireland
- School of Medicine, Department of General Practice, National University of Ireland Galway, Galway, Ireland
| | - Miroslawa Gorecka
- School of Medicine, National University of Ireland Galway, 1 Distillery Road, Galway, Ireland
- St James’s Hospital, James’s St North, Ushers Co., Dublin, Ireland
| | - Alan Jacobsen
- Osler Medical Residency, Johns Hopkins Hospital, 1800 Orleans St, Baltimore, MD 21287 USA
| | - Sinéad Lydon
- School of Medicine, National University of Ireland Galway, 1 Distillery Road, Galway, Ireland
- Irish Centre for Applied Patient Safety and Simulation, National University of Ireland Galway, Galway, Ireland
| |
11
Viljoen CA, Scott Millar R, Engel ME, Shelton M, Burch V. Is computer-assisted instruction more effective than other educational methods in achieving ECG competence amongst medical students and residents? A systematic review and meta-analysis. BMJ Open 2019; 9:e028800. [PMID: 31740464] [PMCID: PMC6886915] [DOI: 10.1136/bmjopen-2018-028800]
Abstract
OBJECTIVES It remains unclear whether computer-assisted instruction (CAI) is more effective than other teaching methods for acquiring and retaining ECG competence among medical students and residents.
DESIGN This systematic review and meta-analysis followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines.
DATA SOURCES Electronic literature searches of PubMed, databases via EBSCOhost, Scopus, Web of Science, Google Scholar and grey literature were conducted on 28 November 2017. We subsequently reviewed the citation indexes of the articles identified by the search.
ELIGIBILITY CRITERIA Studies were included if a comparative research design was used to evaluate the efficacy of CAI versus other methods of ECG instruction, as determined by the acquisition and/or retention of ECG competence of medical students and/or residents.
DATA EXTRACTION AND SYNTHESIS Two reviewers independently extracted data from all eligible studies and assessed the risk of bias. After duplicates were removed, 559 papers were screened. Thirteen studies met the eligibility criteria; eight reported sufficient data for inclusion in the meta-analysis.
RESULTS In all studies, CAI was compared with face-to-face ECG instruction, across a wide range of computer-assisted and face-to-face teaching methods. Overall, the meta-analysis found no significant difference in acquired ECG competence between computer-assisted and face-to-face instruction. However, subanalyses showed that CAI in a blended learning context was better than face-to-face teaching alone, especially when trainees had unlimited access to teaching materials and/or deliberate practice with feedback. There was no conclusive evidence that CAI was better than face-to-face teaching for longer-term retention of ECG competence.
CONCLUSION CAI was not better than face-to-face ECG teaching, although this meta-analysis was constrained by significant heterogeneity amongst studies. Nevertheless, the finding that blended learning is more effective than face-to-face ECG teaching alone is important in an era of increasing implementation of e-learning. PROSPERO REGISTRATION NUMBER CRD42017067054.
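The pooling step underlying such a meta-analysis can be sketched with a minimal inverse-variance fixed-effect model. The effect sizes and variances below are invented for illustration; given the heterogeneity the review reports, a random-effects model would usually be preferred in practice.

```python
import math

def fixed_effect_pool(effects, variances):
    """Inverse-variance fixed-effect pooled estimate with a 95% CI.

    effects: per-study standardised mean differences.
    variances: the corresponding sampling variances.
    """
    weights = [1.0 / v for v in variances]  # precision weighting
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))      # SE of the pooled estimate
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Three hypothetical studies comparing CAI with face-to-face teaching;
# small, mixed-sign effects produce a pooled estimate near zero.
pooled, ci = fixed_effect_pool([0.10, -0.05, 0.20], [0.04, 0.09, 0.06])
```

A confidence interval straddling zero corresponds to the review's overall finding of no significant difference between the two instruction modes.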
Affiliation(s)
- Mark E Engel: Medicine, University of Cape Town, Cape Town, South Africa
- Mary Shelton: Health Sciences Library, University of Cape Town, Cape Town, South Africa
- Vanessa Burch: Medicine, University of Cape Town, Cape Town, South Africa
12
Hatala R, Gutman J, Lineberry M, Triola M, Pusic M. How well is each learner learning? Validity investigation of a learning curve-based assessment approach for ECG interpretation. Advances in Health Sciences Education: Theory and Practice 2019; 24:45-63. [PMID: 30171512] [DOI: 10.1007/s10459-018-9846-x]
Abstract
Learning curves can support a competency-based approach to assessment for learning. When interpreting repeated assessment data displayed as learning curves, a key assessment question is: "How well is each learner learning?" We outline the validity argument and investigation relevant to this question, for a computer-based repeated assessment of competence in electrocardiogram (ECG) interpretation. We developed an on-line ECG learning program based on 292 anonymized ECGs collected from an electronic patient database. After diagnosing each ECG, participants received feedback including the computer interpretation, cardiologist's annotation, and correct diagnosis. In 2015, participants from a single institution, across a range of ECG skill levels, diagnosed at least 60 ECGs. We planned, collected and evaluated validity evidence under each inference of Kane's validity framework. For Scoring, three cardiologists' kappa for agreement on correct diagnosis was 0.92. There was a range of ECG difficulty across and within each diagnostic category. For Generalization, appropriate sampling was reflected in the inclusion of a typical clinical base rate of 39% normal ECGs. Applying generalizability theory presented unique challenges. Under the Extrapolation inference, group learning curves demonstrated expert-novice differences, performance increased with practice and the incremental phase of the learning curve reflected ongoing, effortful learning. A minority of learners had atypical learning curves. We did not collect Implications evidence. Our results support a preliminary validity argument for a learning curve assessment approach for repeated ECG interpretation with deliberate and mixed practice. This approach holds promise for providing educators and researchers, in collaboration with their learners, with deeper insights into how well each learner is learning.
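The three-cardiologist agreement reported under the Scoring inference can be illustrated with Fleiss' kappa, one standard multi-rater agreement statistic. The paper does not specify which kappa variant was used, and the ratings below are invented, so this is a sketch of the general technique rather than the study's analysis.

```python
def fleiss_kappa(ratings):
    """Fleiss' kappa for agreement among a fixed number of raters.

    ratings[i][j] = number of raters assigning subject i to category j;
    every subject must be rated by the same number of raters.
    """
    N = len(ratings)     # subjects (e.g., ECGs)
    n = sum(ratings[0])  # raters per subject
    k = len(ratings[0])  # diagnostic categories
    # Observed agreement per subject, then averaged across subjects.
    P_i = [(sum(c * c for c in row) - n) / (n * (n - 1)) for row in ratings]
    P_bar = sum(P_i) / N
    # Chance agreement from the marginal category proportions.
    p_j = [sum(row[j] for row in ratings) / (N * n) for j in range(k)]
    P_e = sum(p * p for p in p_j)
    return (P_bar - P_e) / (1 - P_e)

# Three raters, two diagnostic categories, four ECGs, perfect agreement.
kappa = fleiss_kappa([[3, 0], [3, 0], [0, 3], [0, 3]])
```

Perfect agreement yields kappa = 1, while agreement no better than chance yields values at or below zero; the study's 0.92 indicates near-perfect agreement on the correct diagnosis.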
Affiliation(s)
- Rose Hatala: Department of Medicine, St. Paul's Hospital, University of British Columbia, Suite 5907, Burrard Bldg, 1081 Burrard St, Vancouver, BC, V6Z 1Y6, Canada
- Jacqueline Gutman: Institute for Innovations in Medical Education, New York University School of Medicine, New York, NY, USA
- Matthew Lineberry: Zamierowski Institute for Experiential Learning, University of Kansas Medical Center and Health System, Kansas City, KS, USA
- Marc Triola: Institute for Innovations in Medical Education, New York University School of Medicine, New York, NY, USA
- Martin Pusic: Institute for Innovations in Medical Education, New York University School of Medicine, New York, NY, USA; Ronald O. Perelman Department of Emergency Medicine, New York University School of Medicine, New York, NY, USA
13
Antiperovitch P, Zareba W, Steinberg JS, Bacharova L, Tereshchenko LG, Farre J, Nikus K, Ikeda T, Baranchuk A. Proposed In-Training Electrocardiogram Interpretation Competencies for Undergraduate and Postgraduate Trainees. J Hosp Med 2018; 13:185-193. [PMID: 29154379] [DOI: 10.12788/jhm.2876]
Abstract
Despite its importance in everyday clinical practice, the ability of physicians to interpret electrocardiograms (ECGs) is highly variable. ECG patterns are often misdiagnosed, and electrocardiographic emergencies are frequently missed, leading to adverse patient outcomes. Currently, many medical education programs lack an organized curriculum and competency assessment to ensure trainees master this essential skill. ECG patterns that were previously mentioned in literature were organized into groups from A to D based on their clinical importance and distributed among levels of training. Incremental versions of this organization were circulated among members of the International Society of Electrocardiology and the International Society of Holter and Noninvasive Electrocardiology until complete consensus was reached. We present reasonably attainable ECG interpretation competencies for undergraduate and postgraduate trainees. Previous literature suggests that methods of teaching ECG interpretation are less important and can be selected based on the available resources of each education program and student preference. The evidence clearly favors summative trainee evaluation methods, which would facilitate learning and ensure that appropriate competencies are acquired. Resources should be allocated to ensure that every trainee reaches their training milestones and should ensure that no electrocardiographic emergency (class A condition) is ever missed. We hope that these guidelines will inform medical education programs and encourage them to allocate sufficient resources and develop organized curricula. Assessments must be in place to ensure trainees acquire the level-appropriate ECG interpretation skills that are required for safe clinical practice.
Affiliation(s)
- Pavel Antiperovitch: Department of Medicine, Kingston General Hospital, Queen's University, Kingston, Ontario, Canada
- Wojciech Zareba: Department of Medicine, University of Rochester Medical Center, University of Rochester, Rochester, New York, USA
- Jonathan S Steinberg: Department of Medicine, University of Rochester Medical Center, University of Rochester, Rochester, New York, USA; Arrhythmia Center, Summit Medical Group, Short Hills, New Jersey, USA
- Larisa G Tereshchenko: Knight Cardiovascular Institute, Oregon Health and Science University, Portland, Oregon, USA
- Jeronimo Farre: Department of Cardiology, Fundación Jiménez Díaz University Hospital, Universidad Autónoma de Madrid, Madrid, Spain
- Kjell Nikus: Heart Center, Tampere University Hospital, and Faculty of Medicine and Life Sciences, University of Tampere, Teiskontie, Finland
- Takanori Ikeda: Department of Medicine, Toho University, Omorinishi, Ota, Tokyo, Japan
- Adrian Baranchuk: Department of Medicine, Kingston General Hospital, Queen's University, Kingston, Ontario, Canada
14
Kok EM, van Geel K, van Merriënboer JJG, Robben SGF. What We Do and Do Not Know about Teaching Medical Image Interpretation. Front Psychol 2017; 8:309. [PMID: 28316582] [PMCID: PMC5334326] [DOI: 10.3389/fpsyg.2017.00309]
Abstract
Educators in medical image interpretation have difficulty finding scientific evidence as to how they should design their instruction. We review and comment on 81 papers that investigated instructional design in medical image interpretation. We distinguish between studies that evaluated complete offline courses and curricula, studies that evaluated e-learning modules, and studies that evaluated specific educational interventions. Twenty-three percent of all studies evaluated the implementation of complete courses or curricula, and 44% of the studies evaluated the implementation of e-learning modules. We argue that these studies have encouraging results but provide little information for educators: too many differences exist between conditions to unambiguously attribute the learning effects to specific instructional techniques. Moreover, concepts are not uniformly defined and methodological weaknesses further limit the usefulness of evidence provided by these studies. Thirty-two percent of the studies evaluated a specific interventional technique. We discuss three theoretical frameworks that informed these studies: diagnostic reasoning, cognitive schemas and study strategies. Research on diagnostic reasoning suggests teaching students to start with non-analytic reasoning and subsequently applying analytic reasoning, but little is known on how to train non-analytic reasoning. Research on cognitive schemas investigated activities that help the development of appropriate cognitive schemas. Finally, research on study strategies supports the effectiveness of practice testing, but more study strategies could be applicable to learning medical image interpretation. Our commentary highlights the value of evaluating specific instructional techniques, but further evidence is required to optimally inform educators in medical image interpretation.
Affiliation(s)
- Ellen M Kok: Department of Educational Development and Research, School of Health Professions Education, Maastricht University, Maastricht, Netherlands
- Koos van Geel: Department of Educational Development and Research, School of Health Professions Education, Maastricht University, Maastricht, Netherlands
- Jeroen J G van Merriënboer: Department of Educational Development and Research, School of Health Professions Education, Maastricht University, Maastricht, Netherlands
- Simon G F Robben: Department of Radiology, Maastricht University Medical Centre, Maastricht, Netherlands