1
Cruz A, Minda JP. The spacing effect in remote information-integration category learning. Mem Cognit 2024. PMID: 38684557; DOI: 10.3758/s13421-024-01569-w.
Abstract
The present experiments examined whether the temporal distribution of procedural category learning experiences would impact learning outcomes. Participants completed the remote category learning experiments on a smartphone in one of two learning conditions: massed or distributed. Consistent with expectations, distributed learners in both experiments reached higher accuracy levels than massed learners. In Experiment 1 the effect disappeared after accounting for reaction time differences, suggesting that it was driven by attentional mechanisms. In Experiment 2, the spacing advantage was only present for previously studied items during a post-learning test, suggesting a role of consolidation. In both experiments, it seems likely that temporal spacing helped participants discover the optimal information-integration categorization strategy. These results suggest that adult category learning is facilitated by temporal spacing. Future work may further explore the effects of temporal and contextual distinctiveness of learning experiences on category learning outcomes.
Affiliation(s)
- Anthony Cruz
- Department of Psychology, Western University, Perth Drive, London, ON, N6G 1E1, Canada.
- John Paul Minda
- Department of Psychology, Western University, Perth Drive, London, ON, N6G 1E1, Canada
2
Kaye MG, Kwiatkowski AV, Khan HA, Yastynovich Y, Graham SP, Meka J. Designing an ECG curriculum for residents: Evidence-based approaches to improving resident ECG interpretation skills. J Electrocardiol 2024; 82:64-68. PMID: 38039698; DOI: 10.1016/j.jelectrocard.2023.10.012.
Abstract
Residents enter their training with variable comfort and competency in electrocardiogram (ECG) interpretation. Accurately interpreting an ECG is a fundamental skill in medicine and resident physicians would benefit from a longitudinal, dedicated ECG curriculum as part of their training to enhance interpretation skills and improve patient outcomes. Educators currently employ a wide array of methodologies to teach their trainees proper ECG interpretation skills, with no single modality established as the gold-standard for teaching this crucial skill. We present evidence-based guidance on how educators may develop and implement an effective ECG interpretation curriculum as part of residency training.
Affiliation(s)
- Matthew G Kaye
- Division of General Internal Medicine, Department of Medicine, State University of New York (SUNY) at Buffalo, Buffalo, NY, USA.
- Alysia V Kwiatkowski
- Jacobs School of Medicine and Biomedical Sciences, State University of New York (SUNY) at Buffalo, Buffalo, NY, USA
- Hassan A Khan
- Division of Cardiovascular Medicine, Department of Medicine, State University of New York (SUNY) at Buffalo, Buffalo, NY, USA
- Susan P Graham
- Division of Cardiovascular Medicine, Department of Medicine, State University of New York (SUNY) at Buffalo, Buffalo, NY, USA
- Jennifer Meka
- Jacobs School of Medicine and Biomedical Sciences, State University of New York (SUNY) at Buffalo, Buffalo, NY, USA
3
Ardekani A, Hider AM, Rastegar Kazerooni AA, Hosseini SA, Roshanshad A, Amini M, Kojuri J. Surfing the clinical trials of ECG teaching to medical students: A systematic review. J Educ Health Promot 2023; 12:107. PMID: 37288415; PMCID: PMC10243439; DOI: 10.4103/jehp.jehp_780_22.
Abstract
Interpreting an electrocardiogram (ECG) is crucial for every physician, and competency in ECG interpretation needs to be improved at every stage of medical education. The aim of the present study was to review the currently published clinical trials of ECG teaching to medical students and provide suggestions for future work. On May 1, 2022, PubMed, Scopus, Web of Science, Google Scholar, and ERIC were searched to retrieve relevant articles on clinical trials of ECG teaching to medical students. The quality of the included studies was assessed using the criteria of Buckley et al. Screening, data extraction, and quality appraisal were each performed independently in duplicate, with disagreements resolved by consulting a third author. In total, 861 citations were found in the databases. After screening abstracts and full texts, 23 studies were deemed eligible. The majority of the studies were of good quality. Peer teaching (7 studies), self-directed learning (6 studies), web-based learning (10 studies), and various assessment modalities (3 studies) comprised the key themes of the studies. Various methods of ECG teaching were encountered in the reviewed studies. Future studies of ECG training should focus on novel and creative teaching methods, the extent to which self-directed learning can be effective, the utility of peer teaching, and the implications of computer-assisted ECG interpretation (e.g., artificial intelligence) for medical students. Long-term knowledge-retention studies based on different approaches, integrated with clinical outcomes, could help determine the most efficient modalities.
Affiliation(s)
- Ali Ardekani
- School of Medicine, Shiraz University of Medical Sciences, Shiraz, Iran
- Student Research Committee, Shiraz University of Medical Sciences, Shiraz, Iran
- Ahmad M. Hider
- University of Michigan Medical School, Ann Arbor, MI, USA
- Mitra Amini
- Clinical Education Research Center, Shiraz University of Medical Sciences, Shiraz, Iran
- Javad Kojuri
- Clinical Education Research Center, Shiraz University of Medical Sciences, Shiraz, Iran
4
Krimmel-Morrison JD, Dhaliwal G. How to Keep Training-After Residency Training. J Gen Intern Med 2022; 37:1524-1528. PMID: 35226236; PMCID: PMC9086009; DOI: 10.1007/s11606-021-07240-3.
Abstract
Lifelong learning in medicine is an important skill and ethical obligation, but many residents do not feel prepared to be effective self-directed learners when training ends. The learning sciences offer evidence to guide self-directed learning, but these insights have not been integrated into a practical and actionable plan for residents to improve their clinical knowledge and reasoning. We encourage residents to establish a self-directed learning plan, just as an athlete employs a training plan in the pursuit of excellence. We highlight four evidence-based learning principles (spaced practice, mixed practice, retrieval practice, and feedback) and four training strategies comprising a weekly training plan: case tracking, simulated cases, quizzing, and new evidence integration. We provide tips for residents to implement and refine their approach and discuss how residency programs can foster these routines and habits. By optimizing their scarce self-directed learning time with a training plan, residents may enhance patient care and their career satisfaction through their pursuit of clinical mastery.
Affiliation(s)
- Jeffrey D Krimmel-Morrison
- Division of General Internal Medicine, Department of Medicine, University of Washington, Seattle, WA, 98195-6420, USA.
- Gurpreet Dhaliwal
- Department of Medicine, University of California, San Francisco and Medical Service, San Francisco VA Medical Center, San Francisco, CA, USA
5
Oh SY, Cook DA, Van Gerven PWM, Nicholson J, Fairbrother H, Smeenk FWJM, Pusic MV. Physician Training for Electrocardiogram Interpretation: A Systematic Review and Meta-Analysis. Acad Med 2022; 97:593-602. PMID: 35086115; DOI: 10.1097/acm.0000000000004607.
Abstract
PURPOSE: Using electrocardiogram (ECG) interpretation as an example of a widely taught diagnostic skill, the authors conducted a systematic review and meta-analysis to demonstrate how research evidence on instruction in diagnosis can be synthesized to facilitate improvement of educational activities (instructional modalities, instructional methods, and interpretation approaches), guide the content and specificity of such activities, and provide direction for research.
METHOD: The authors searched PubMed/MEDLINE, Embase, Cochrane CENTRAL, PsycInfo, CINAHL, ERIC, and Web of Science databases through February 21, 2020, for empirical investigations of ECG interpretation training enrolling medical students, residents, or practicing physicians. They appraised study quality with the Medical Education Research Study Quality Instrument and pooled standardized mean differences (SMDs) using random effects meta-analysis.
RESULTS: Of 1,002 articles identified, 59 were included (enrolling 17,251 participants). Among 10 studies comparing instructional modalities, 8 compared computer-assisted and face-to-face instruction, with pooled SMD 0.23 (95% CI, 0.09, 0.36) indicating a small, statistically significant difference favoring computer-assisted instruction. Among 19 studies comparing instructional methods, 5 evaluated individual versus group training (pooled SMD -0.35 favoring group study [95% CI, -0.63, -0.06]), 4 evaluated peer-led versus faculty-led instruction (pooled SMD 0.38 favoring peer instruction [95% CI, 0.01, 0.74]), and 4 evaluated contrasting ECG features (e.g., QRS width) from 2 or more diagnostic categories versus routine examination of features within a single ECG or diagnosis (pooled SMD 0.23 not significantly favoring contrasting features [95% CI, -0.30, 0.76]). Eight studies compared ECG interpretation approaches, with pooled SMD 0.92 (95% CI, 0.48, 1.37) indicating a large, statistically significant effect favoring more systematic interpretation approaches.
CONCLUSIONS: Some instructional interventions appear to improve learning in ECG interpretation; however, many evidence-based instructional strategies are insufficiently investigated. The findings may have implications for future research and design of training to improve skills in ECG interpretation and other types of visual diagnosis.
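Several of the reviews in this list report pooled SMDs with 95% CIs from random-effects meta-analysis. As a minimal sketch of how such pooling works (a DerSimonian-Laird estimator applied to hypothetical effect sizes, not data from any study listed here):

```python
import math

def pool_random_effects(effects, variances):
    """DerSimonian-Laird random-effects pooling of study effect sizes (e.g., SMDs)."""
    w = [1.0 / v for v in variances]                              # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)     # fixed-effect pooled estimate
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))   # Cochran's Q heterogeneity statistic
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                                 # between-study variance estimate
    w_star = [1.0 / (v + tau2) for v in variances]                # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)       # estimate and 95% CI

# Hypothetical SMDs and within-study variances from three studies
pooled, ci = pool_random_effects([0.20, 0.30, 0.15], [0.01, 0.02, 0.015])
```

When heterogeneity is low (Q below its degrees of freedom), the between-study variance is truncated at zero and the result collapses to the fixed-effect estimate, which is what happens with the illustrative numbers above.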
Affiliation(s)
- So-Young Oh
- S.-Y. Oh is assistant director, Program for Digital Learning, Institute for Innovations in Medical Education, NYU Grossman School of Medicine, NYU Langone Health, New York, New York; ORCID: https://orcid.org/0000-0002-4640-3695
- David A Cook
- D.A. Cook is professor of medicine and medical education, director of education science, Office of Applied Scholarship and Education Science, research chair, Mayo Clinic Rochester Multidisciplinary Simulation Center, and consultant, Division of General Internal Medicine, Mayo Clinic College of Medicine and Science, Rochester, Minnesota; ORCID: https://orcid.org/0000-0003-2383-4633
- Pascal W M Van Gerven
- P.W.M. Van Gerven is associate professor, Department of Educational Development and Research, Faculty of Health, Medicine and Life Sciences, Maastricht University, Maastricht, The Netherlands; ORCID: https://orcid.org/0000-0002-8363-2534
- Joseph Nicholson
- J. Nicholson is director, NYU Health Sciences Library, NYU Grossman School of Medicine, NYU Langone Health, New York, New York
- Hilary Fairbrother
- H. Fairbrother is associate professor, Department of Emergency Medicine, Memorial Hermann-Texas Medical Center, Houston, Texas
- Frank W J M Smeenk
- F.W.J.M. Smeenk is professor, Department of Educational Development and Research, Maastricht University, Maastricht, and respiratory specialist, Catharina Hospital, Eindhoven, The Netherlands
- Martin V Pusic
- M.V. Pusic is associate professor of pediatrics and associate professor of emergency medicine, Harvard Medical School, Boston, Massachusetts; ORCID: https://orcid.org/0000-0001-5236-6598
6
Cook DA, Oh SY, Pusic MV. Assessments of Physicians' Electrocardiogram Interpretation Skill: A Systematic Review. Acad Med 2022; 97:603-615. PMID: 33913438; DOI: 10.1097/acm.0000000000004140.
Abstract
PURPOSE: To identify features of instruments, test procedures, study design, and validity evidence in published studies of electrocardiogram (ECG) skill assessments.
METHOD: The authors conducted a systematic review, searching MEDLINE, Embase, Cochrane CENTRAL, PsycINFO, CINAHL, ERIC, and Web of Science databases in February 2020 for studies that assessed the ECG interpretation skill of physicians or medical students. Two authors independently screened articles for inclusion and extracted information on test features, study design, risk of bias, and validity evidence.
RESULTS: The authors found 85 eligible studies. Participants included medical students (42 studies), postgraduate physicians (48 studies), and practicing physicians (13 studies). ECG selection criteria were infrequently reported: 25 studies (29%) selected single-diagnosis or straightforward ECGs; 5 (6%) selected complex cases. ECGs were selected by generalists (15 studies [18%]), cardiologists (10 studies [12%]), or unspecified experts (4 studies [5%]). The median number of ECGs per test was 10. The scoring rubric was defined by 2 or more experts in 32 studies (38%), by 1 expert in 5 (6%), and using clinical data in 5 (6%). Scoring was performed by a human rater in 34 studies (40%) and by computer in 7 (8%). Study methods were appraised as low risk of selection bias in 16 studies (19%), participant flow bias in 59 (69%), instrument conduct and scoring bias in 20 (24%), and applicability problems in 56 (66%). Evidence of test score validity was reported infrequently, namely evidence of content (39 studies [46%]), internal structure (11 [13%]), relations with other variables (10 [12%]), response process (2 [2%]), and consequences (3 [4%]).
CONCLUSIONS: ECG interpretation skill assessments consist of idiosyncratic instruments that are too short, composed of items of obscure provenance, with incompletely specified answers, graded by individuals with underreported credentials, yielding scores with limited interpretability. The authors suggest several best practices.
Affiliation(s)
- David A Cook
- D.A. Cook is professor of medicine and medical education, director of education science, Office of Applied Scholarship and Education Science, research chair, Mayo Clinic Rochester Multidisciplinary Simulation Center, and consultant, Division of General Internal Medicine, Mayo Clinic College of Medicine and Science, Rochester, Minnesota; ORCID: https://orcid.org/0000-0003-2383-4633
- So-Young Oh
- S.-Y. Oh is assistant director, Program for Digital Learning, Institute for Innovations in Medical Education, NYU Grossman School of Medicine, NYU Langone Health, New York, New York; ORCID: https://orcid.org/0000-0002-4640-3695
- Martin V Pusic
- M.V. Pusic is associate professor of emergency medicine and pediatrics, Department of Emergency Medicine, NYU Grossman School of Medicine, New York, New York; ORCID: https://orcid.org/0000-0001-5236-6598
7
Viljoen CA, Millar RS, Hoevelmann J, Muller E, Hähnle L, Manning K, Naude J, Sliwa K, Burch VC. Utility of mobile learning in Electrocardiography. Eur Heart J Digit Health 2021; 2:202-214. PMID: 36712390; PMCID: PMC9707875; DOI: 10.1093/ehjdh/ztab027.
Abstract
Aims: Mobile learning refers to the acquisition of knowledge by accessing information on a mobile device. Although increasingly implemented in medical education, research on its utility in electrocardiography remains sparse. In this study, we explored the effect of mobile learning on the accuracy of electrocardiogram (ECG) analysis and interpretation.
Methods and results: The study comprised 181 participants (77 fourth- and 69 sixth-year medical students, and 35 residents). Participants were randomized to analyse ECGs with a mobile learning strategy [either searching the Internet or using an ECG reference application (app)] or without one. For each ECG, they provided their initial diagnosis, key supporting features, and final diagnosis consecutively. Two weeks later, they analysed the same ECGs without access to any mobile device. ECG interpretation was more accurate when participants used the ECG app (56%) than when they searched the Internet (50.3%) or used neither (43.5%, P = 0.001). Importantly, mobile learning supported participants in revising their initial incorrect ECG diagnosis (ECG app 18.7%, Internet search 13.6%, no mobile device 8.4%, P < 0.001). However, whilst this was true for students, there was no significant difference amongst residents. Internet searches were only useful if participants identified the correct ECG features. The app was beneficial when participants searched by ECG features, but not by diagnosis. Using the ECG reference app required less time than searching the Internet (7:44 ± 4:13 vs. 9:14 ± 4:34, P < 0.001). Mobile learning gains were not sustained after 2 weeks.
Conclusion: Whilst mobile learning contributed to increased ECG diagnostic accuracy, the benefits were not sustained over time.
Affiliation(s)
- Charle André Viljoen
- Division of Cardiology, New Main Building, Groote Schuur Hospital, University of Cape Town, Anzio Road, Observatory 7925, Cape Town, South Africa; Department of Medicine, Old Main Building, Groote Schuur Hospital, University of Cape Town, Anzio Road, Observatory 7925, Cape Town, South Africa; Hatter Institute for Cardiovascular Research in Africa and Cape Heart Institute, Chris Barnard Building, Faculty of Health Sciences, University of Cape Town, Observatory 7925, Cape Town, South Africa. Corresponding author. Tel: +27214046088
- Rob Scott Millar
- Division of Cardiology, New Main Building, Groote Schuur Hospital, University of Cape Town, Anzio Road, Observatory 7925, Cape Town, South Africa; Department of Medicine, Old Main Building, Groote Schuur Hospital, University of Cape Town, Anzio Road, Observatory 7925, Cape Town, South Africa
- Julian Hoevelmann
- Hatter Institute for Cardiovascular Research in Africa and Cape Heart Institute, Chris Barnard Building, Faculty of Health Sciences, University of Cape Town, Observatory 7925, Cape Town, South Africa; Klinik für Innere Medizin III, Kardiologie, Angiologie und Internistische Intensivmedizin, Universitätsklinikum des Saarlandes (Saarland University Hospital), Homburg/Saar, Germany
- Elani Muller
- Hatter Institute for Cardiovascular Research in Africa and Cape Heart Institute, Chris Barnard Building, Faculty of Health Sciences, University of Cape Town, Observatory 7925, Cape Town, South Africa
- Lina Hähnle
- Hatter Institute for Cardiovascular Research in Africa and Cape Heart Institute, Chris Barnard Building, Faculty of Health Sciences, University of Cape Town, Observatory 7925, Cape Town, South Africa
- Kathryn Manning
- Department of Medicine, Old Main Building, Groote Schuur Hospital, University of Cape Town, Anzio Road, Observatory 7925, Cape Town, South Africa
- Jonathan Naude
- Department of Medicine, Old Main Building, Groote Schuur Hospital, University of Cape Town, Anzio Road, Observatory 7925, Cape Town, South Africa
- Karen Sliwa
- Hatter Institute for Cardiovascular Research in Africa and Cape Heart Institute, Chris Barnard Building, Faculty of Health Sciences, University of Cape Town, Observatory 7925, Cape Town, South Africa
- Vanessa Celeste Burch
- Department of Medicine, Old Main Building, Groote Schuur Hospital, University of Cape Town, Anzio Road, Observatory 7925, Cape Town, South Africa
8
Visual category learning: Navigating the intersection of rules and similarity. Psychon Bull Rev 2021; 28:711-731. PMID: 33464550; DOI: 10.3758/s13423-020-01838-0.
Abstract
Visual categorization is fundamental to expertise in a wide variety of disparate domains, such as radiology, art history, and quality control. The pervasive need to master visual categories has served as the impetus for a vast body of research dedicated to exploring how to enhance the learning process. The literature is clear on one point: no category learning technique is always superior to another. In the present review, we discuss how two factors moderate the efficacy of learning techniques. The first, category similarity, refers to the degree of featural overlap of exemplars. The second moderator, category type, concerns whether the features that define category membership can be mastered through learning processes that are implicit/non-verbal (information-integration categories) or explicit/verbal (rule-based categories). The literature on each moderator has been conducted almost entirely in isolation, such that their potential interaction remains underexplored. We address this gap in the literature by reviewing empirical and theoretical evidence that these two moderators jointly influence the efficacy of learning techniques.
9
Cook DA, Oh SY, Pusic MV. Accuracy of Physicians' Electrocardiogram Interpretations: A Systematic Review and Meta-analysis. JAMA Intern Med 2020; 180:1461-1471. PMID: 32986084; PMCID: PMC7522782; DOI: 10.1001/jamainternmed.2020.3989.
Abstract
IMPORTANCE: The electrocardiogram (ECG) is the most common cardiovascular diagnostic test. Physicians' skill in ECG interpretation is incompletely understood.
OBJECTIVES: To identify and summarize published research on the accuracy of physicians' ECG interpretations.
DATA SOURCES: A search of PubMed/MEDLINE, Embase, Cochrane CENTRAL (Central Register of Controlled Trials), PsycINFO, CINAHL (Cumulative Index to Nursing and Allied Health), ERIC (Education Resources Information Center), and Web of Science was conducted for articles published from database inception to February 21, 2020.
STUDY SELECTION: Of 1138 articles initially identified, 78 studies that assessed the accuracy of physicians' or medical students' ECG interpretations in a test setting were selected.
DATA EXTRACTION AND SYNTHESIS: Data on study purpose, participants, assessment features, and outcomes were abstracted, and methodological quality was appraised with the Medical Education Research Study Quality Instrument. Results were pooled using random-effects meta-analysis.
MAIN OUTCOMES AND MEASURES: Accuracy of ECG interpretation.
RESULTS: Across all training levels, the median accuracy was 54% (interquartile range [IQR], 40%-66%; n = 62 studies) on pretraining assessments and 67% (IQR, 55%-77%; n = 47 studies) on posttraining assessments. Accuracy varied widely across studies. The pooled accuracy for pretraining assessments was 42.0% (95% CI, 34.3%-49.6%; n = 24 studies; I2 = 99%) for medical students, 55.8% (95% CI, 48.1%-63.6%; n = 37 studies; I2 = 96%) for residents, 68.5% (95% CI, 57.6%-79.5%; n = 10 studies; I2 = 86%) for practicing physicians, and 74.9% (95% CI, 63.2%-86.7%; n = 8 studies; I2 = 22%) for cardiologists.
CONCLUSIONS AND RELEVANCE: Physicians at all training levels had deficiencies in ECG interpretation, even after educational interventions. Improved education across the practice continuum appears warranted. Wide variation in outcomes could reflect real differences in training or skill or differences in assessment design.
Affiliation(s)
- David A Cook
- Office of Applied Scholarship and Education Science and Division of General Internal Medicine, Mayo Clinic College of Medicine and Science, Rochester, Minnesota
- So-Young Oh
- Institute for Innovations in Medical Education, NYU Grossman School of Medicine, NYU Langone Health, New York, New York
- Martin V Pusic
- Department of Emergency Medicine, NYU Grossman School of Medicine, NYU Langone Health, New York, New York
10
Viljoen CA, Scott Millar R, Engel ME, Shelton M, Burch V. Is computer-assisted instruction more effective than other educational methods in achieving ECG competence amongst medical students and residents? A systematic review and meta-analysis. BMJ Open 2019; 9:e028800. PMID: 31740464; PMCID: PMC6886915; DOI: 10.1136/bmjopen-2018-028800.
Abstract
OBJECTIVES: It remains unclear whether computer-assisted instruction (CAI) is more effective than other teaching methods in acquiring and retaining ECG competence among medical students and residents.
DESIGN: This systematic review and meta-analysis followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines.
DATA SOURCES: Electronic literature searches of PubMed, databases via EBSCOhost, Scopus, Web of Science, Google Scholar and grey literature were conducted on 28 November 2017. We subsequently reviewed the citation indexes for articles identified by the search.
ELIGIBILITY CRITERIA: Studies were included if a comparative research design was used to evaluate the efficacy of CAI versus other methods of ECG instruction, as determined by the acquisition and/or retention of ECG competence of medical students and/or residents.
DATA EXTRACTION AND SYNTHESIS: Two reviewers independently extracted data from all eligible studies and assessed the risk of bias. After duplicates were removed, 559 papers were screened. Thirteen studies met the eligibility criteria. Eight studies reported sufficient data to be included in the meta-analysis.
RESULTS: In all studies, CAI was compared with face-to-face ECG instruction. There was a wide range of computer-assisted and face-to-face teaching methods. Overall, the meta-analysis found no significant difference in acquired ECG competence between those who received computer-assisted or face-to-face instruction. However, subanalyses showed that CAI in a blended learning context was better than face-to-face teaching alone, especially if trainees had unlimited access to teaching materials and/or deliberate practice with feedback. There was no conclusive evidence that CAI was better than face-to-face teaching for longer-term retention of ECG competence.
CONCLUSION: CAI was not better than face-to-face ECG teaching. However, this meta-analysis was constrained by significant heterogeneity amongst studies. Nevertheless, the finding that blended learning is more effective than face-to-face ECG teaching is important in the era of increased implementation of e-learning.
PROSPERO REGISTRATION NUMBER: CRD42017067054.
Affiliation(s)
- Mark E Engel
- Medicine, University of Cape Town, Cape Town, South Africa
- Mary Shelton
- Health Sciences Library, University of Cape Town, Cape Town, South Africa
- Vanessa Burch
- Medicine, University of Cape Town, Cape Town, South Africa