301
Morgulis Y, Kumar RK, Lindeman R, Velan GM. Impact on learning of an e-learning module on leukaemia: a randomised controlled trial. BMC Medical Education 2012; 12:36. [PMID: 22640463] [PMCID: PMC3419126] [DOI: 10.1186/1472-6920-12-36] [Citation(s) in RCA: 25] [Impact Index Per Article: 2.1]
Abstract
BACKGROUND e-learning resources may be beneficial for complex or conceptually difficult topics. Leukaemia is one such topic, yet there are no reports on the efficacy of e-learning for leukaemia. This study compared the learning impact on senior medical students of a purpose-built e-learning module on leukaemia, compared with existing online resources. METHODS A randomised controlled trial was performed utilising volunteer senior medical students. Participants were randomly allocated to Study and Control groups. Following a pre-test on leukaemia administered to both groups, the Study group was provided with access to the new e-learning module, while the Control group was directed to existing online resources. A post-test and an evaluation questionnaire were administered to both groups at the end of the trial period. RESULTS Study and Control groups were equivalent in gender distribution, mean academic ability, pre-test performance and time studying leukaemia during the trial. The Study group performed significantly better than the Control group in the post-test, in which the group to which the students had been allocated was the only significant predictor of performance. The Study group's evaluation of the module was overwhelmingly positive. CONCLUSIONS A targeted e-learning module on leukaemia had a significant effect on learning in this cohort, compared with existing online resources. We believe that the interactivity, dialogic feedback and integration with the curriculum offered by the e-learning module contributed to its impact. This has implications for e-learning design in medicine and other disciplines.
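Editor's note: the analysis described above (group allocation as the only significant predictor of post-test performance, with baseline measures as candidate covariates) can be illustrated with a small regression sketch. This is not the authors' code; the column names (posttest, pretest, study_time, group) and all values are hypothetical placeholders.

```python
# Illustrative sketch only: a linear model of post-test score on group allocation
# plus baseline covariates, mirroring the kind of analysis the abstract describes.
# Column names and data are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

def fit_posttest_model(df: pd.DataFrame):
    """df must contain posttest, pretest, study_time (numeric) and group ('study'/'control')."""
    return smf.ols("posttest ~ C(group) + pretest + study_time", data=df).fit()

# Example with made-up data:
df = pd.DataFrame({
    "posttest":   [72, 65, 80, 58, 77, 61, 83, 66],
    "pretest":    [50, 48, 55, 45, 52, 47, 56, 49],
    "study_time": [3.0, 2.5, 3.5, 2.0, 3.2, 2.4, 3.8, 2.6],
    "group":      ["study", "control", "study", "control",
                   "study", "control", "study", "control"],
})
print(fit_posttest_model(df).summary())
```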
Affiliation(s)
- Yuri Morgulis
- Department of Pathology, School of Medical Sciences, Faculty of Medicine, The University of New South Wales, Sydney, NSW, 2052, Australia
- Rakesh K Kumar
- Department of Haematology, Prince of Wales Hospital, Sydney, NSW, 2031, Australia
- Robert Lindeman
- Department of Pathology, School of Medical Sciences, Faculty of Medicine, The University of New South Wales, Sydney, NSW, 2052, Australia
- Gary M Velan
- Department of Pathology, School of Medical Sciences, Faculty of Medicine, The University of New South Wales, Sydney, NSW, 2052, Australia
302
Does an offer for a free on-line continuing medical education (CME) activity increase physician survey response rate? A randomized trial. BMC Res Notes 2012; 5:129. [PMID: 22397624] [PMCID: PMC3327628] [DOI: 10.1186/1756-0500-5-129] [Citation(s) in RCA: 17] [Impact Index Per Article: 1.4]
Abstract
BACKGROUND Achieving a high response rate in a physician survey is challenging. Monetary incentives increase response rates but obviously add cost to a survey project. We wondered whether an offer of a free continuing medical education (CME) activity would be effective in improving survey response rate. RESULTS As part of a survey of a national sample of physicians, we randomized half to an offer for a free on-line CME activity upon completion of a web-based survey and the other half to no such offer. We compared response rates between the groups. A total of 1214 out of 8477 potentially eligible physicians responded to our survey, for an overall response rate of 14.3%. The response rate among the control group (no offer of CME credit) was 16.6%, while among those offered the CME opportunity, the response rate was 12.0% (p < 0.0001). CONCLUSIONS An offer for a free on-line CME activity did not improve physician survey response rate. On the contrary, the offer for a free CME activity actually appeared to worsen the response rate.
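Editor's note: the response-rate comparison above (16.6% vs. 12.0% among roughly 8477 invited physicians) can be checked with a standard two-proportion test. The sketch below back-calculates counts from the reported percentages under the assumption of an approximately even 1:1 split of the sample; it is an illustration, not the authors' analysis.

```python
# Illustration only: chi-square test on a 2x2 table of responders vs. non-responders,
# with counts reconstructed from the reported rates (16.6% and 12.0%) assuming a
# roughly even 1:1 randomization of 8477 physicians (arm sizes not reported).
from scipy.stats import chi2_contingency

control_n, cme_n = 4239, 4238             # assumed arm sizes
control_resp = round(0.166 * control_n)   # ~704 responders
cme_resp = round(0.120 * cme_n)           # ~509 responders

table = [
    [control_resp, control_n - control_resp],
    [cme_resp, cme_n - cme_resp],
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"overall response rate: {(control_resp + cme_resp) / (control_n + cme_n):.1%}")
print(f"chi-square = {chi2:.1f}, p = {p:.2g}")
```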
303
Sangvai S, Mahan JD, Lewis KO, Pudlo N, Suresh S, McKenzie LB. The impact of an interactive Web-based module on residents' knowledge and clinical practice of injury prevention. Clin Pediatr (Phila) 2012; 51:165-74. [PMID: 21985892] [DOI: 10.1177/0009922811419027] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.7]
Abstract
OBJECTIVE To determine the effectiveness of an interactive Web-based module on knowledge acquisition, retention, and clinical practice by residents. METHODS Residents were randomized to complete an interactive Web-based module on injury prevention or a noninteractive Web-based module of identical content. Acquisition and retention of medical knowledge were measured by pretest, posttest, and long-term test scores, and change in clinical practice was measured by videotaped clinical encounters. RESULTS Fifty-seven residents completed the modules. The control group had higher posttest scores than the intervention group (P = .036). Thirty-seven residents completed the long-term test with scores that were significantly higher than pretest scores (P = .00). Thirty-six residents had videotaped encounter scores (232 visits), with no difference in these scores after the intervention (P = .432). CONCLUSION The noninteractive module was more effective in promoting knowledge acquisition. Residents successfully demonstrated knowledge retention with completion of either module. The modules were insufficient to change clinical practice.
Affiliation(s)
- Shilpa Sangvai
- Nationwide Children's Hospital, Ambulatory Pediatrics, Columbus, OH 43205, USA.
304
Ochoa J, Naritoku DK. Using a virtual training program to train community neurologists on EEG reading skills. Teaching and Learning in Medicine 2012; 24:26-28. [PMID: 22250932] [DOI: 10.1080/10401334.2012.641483] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.4]
Abstract
BACKGROUND EEG training requires iterative exposure to different patterns with continuous feedback from the instructor. This training is traditionally acquired through a fellowship program, but only 28% of neurologists in training plan to do a fellowship in EEG. PURPOSE The purpose of this study was to determine the value of online EEG training for improving EEG knowledge among general neurologists. METHODS The participants were general neurologists who were invited through bulk e-mail and paid a fee to enroll in the virtual EEG program. A 40-question pretest exam was administered before training. The training included 4 online learning units on basic EEG principles and 40 online clinical EEG tutorials. In addition, there were weekly live teleconferences for Q&A sessions. At the end of the program, the participants were asked to complete a posttest exam. RESULTS Fifteen of 20 participants successfully completed the program and took both the pre- and posttest exams. All the subjects scored significantly higher on the posttest compared with their baseline score. The average score on the pretest evaluation was 61.7% and the posttest average was 87.8% (p = .0002, two-tailed). CONCLUSIONS Virtual EEG training can improve EEG knowledge among community neurologists.
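Editor's note: the pre/post comparison above (mean 61.7% rising to 87.8% across 15 completers, two-tailed p = .0002) is the kind of result a paired test produces. The sketch below is illustrative only; the score vectors are hypothetical placeholders, not the study data.

```python
# Illustration of a paired (pre vs. post) comparison like the one reported above.
# The scores are invented; only the analysis pattern is the point.
from scipy.stats import ttest_rel

pre  = [55, 60, 58, 65, 70, 62, 57, 63, 66, 59, 61, 64, 60, 68, 58]   # hypothetical %
post = [82, 88, 85, 90, 93, 86, 84, 89, 91, 85, 87, 90, 86, 92, 84]   # hypothetical %

t_stat, p_value = ttest_rel(post, pre)    # two-tailed by default
print(f"mean pre = {sum(pre)/len(pre):.1f}%, mean post = {sum(post)/len(post):.1f}%")
print(f"paired t = {t_stat:.2f}, two-tailed p = {p_value:.2g}")
```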
Affiliation(s)
- Juan Ochoa
- Department of Neurology, University of South Alabama, Mobile, Alabama 366693, USA.
305
Kalet AL, Song HS, Sarpel U, Schwartz R, Brenner J, Ark TK, Plass J. Just enough, but not too much interactivity leads to better clinical skills performance after a computer assisted learning module. Medical Teacher 2012; 34:833-9. [PMID: 22917265] [PMCID: PMC3826788] [DOI: 10.3109/0142159x.2012.706727] [Citation(s) in RCA: 18] [Impact Index Per Article: 1.5]
Abstract
BACKGROUND Well-designed computer-assisted instruction (CAI) can potentially transform medical education. Yet little is known about whether specific design features such as direct manipulation of the content yield meaningful gains in clinical learning. We designed three versions of a multimedia module on the abdominal exam incorporating different types of interactivity. METHODS As part of their physical diagnosis course, 162 second-year medical students were randomly assigned (1:1:1) to Watch, Click or Drag versions of the abdominal exam module. First, students' prior knowledge, spatial ability, and prior experience with abdominal exams were assessed. After using the module, students took a posttest; demonstrated the abdominal exam on a standardized patient; and wrote structured notes of their findings. RESULTS Data from 143 students were analyzed. Baseline measures showed no differences among groups regarding prior knowledge, experience, or spatial ability. Overall there was no difference in knowledge across groups. However, physical exam scores were significantly higher for students in the Click group. CONCLUSIONS A mid-range level of behavioral interactivity was associated with small to moderate improvements in performance of clinical skills. These improvements were likely mediated by enhanced engagement with the material, within the bounds of learners' cognitive capacity. These findings have implications for the design of CAI materials to teach procedural skills.
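Editor's note: with three randomized arms (Watch, Click, Drag) and continuous outcome scores, a one-way ANOVA followed by pairwise contrasts is one conventional way to test for differences such as the higher physical-exam scores in the Click group. The sketch below is a generic illustration with invented scores, not the authors' analysis.

```python
# Generic three-group comparison, as one might run for Watch/Click/Drag arms.
# Scores are invented for illustration.
from scipy.stats import f_oneway, ttest_ind

watch = [68, 72, 70, 65, 74, 69, 71]   # hypothetical physical-exam scores
click = [75, 78, 80, 76, 79, 77, 81]
drag  = [70, 69, 73, 71, 68, 72, 74]

f_stat, p_overall = f_oneway(watch, click, drag)
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_overall:.3f}")

# Follow-up pairwise contrasts of interest (Click vs. each other arm).
for name, other in [("watch", watch), ("drag", drag)]:
    t, p = ttest_ind(click, other)
    print(f"click vs. {name}: t = {t:.2f}, p = {p:.3f}")
```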
Affiliation(s)
- A L Kalet
- Division of Educational Informatics, New York University School of Medicine, NY, USA.
306
Cook DA. Randomized controlled trials and meta-analysis in medical education: what role do they play? Medical Teacher 2012; 34:468-73. [PMID: 22489980] [DOI: 10.3109/0142159x.2012.671978] [Citation(s) in RCA: 33] [Impact Index Per Article: 2.8]
Abstract
Education researchers seek to understand what works, for whom, in what circumstances. Unfortunately, educational environments are complex and research itself is highly context dependent. Faced with these challenges, some have argued that qualitative methods should supplant quantitative methods such as randomized controlled trials (RCTs) and meta-analysis. I disagree. Good qualitative and mixed-methods research are complementary to, rather than exclusive of, quantitative methods. The complexity and challenges we face should not beguile us into ignoring methods that provide strong evidence. What, then, is the proper role for RCTs and meta-analysis in medical education? First, the choice of study design depends on the research question. RCTs and meta-analysis are appropriate for many, but not all, study goals. They have compelling strengths but also numerous limitations. Second, strong methods will not compensate for a pointless question. RCTs do not advance the science when they make confounded comparisons, or make comparison with no intervention. Third, clinical medicine now faces many of the same challenges we encounter in education. We can learn much from other fields about how to handle complexity in RCTs. Finally, no single study will definitively answer any research question. We need carefully planned, theory-building, programmatic research, reflecting a variety of paradigms and approaches, as we accumulate evidence to change the art and science of education.
Affiliation(s)
- David A Cook
- Office of Education Research, Division of General Internal Medicine, Mayo Clinic College of Medicine, Mayo 17, 200 First Street SW, Rochester, MN 55905, USA.
307
Triola MM, Huwendiek S, Levinson AJ, Cook DA. New directions in e-learning research in health professions education: report of two symposia. Medical Teacher 2012; 34:e15-20. [PMID: 22250691] [DOI: 10.3109/0142159x.2012.638010] [Citation(s) in RCA: 22] [Impact Index Per Article: 1.8]
Abstract
BACKGROUND The use of Computer Assisted Instruction (CAI) is rising across health professions education. Research to date is of limited use in guiding the implementation and selection of CAI innovations. AIMS In the context of two symposia, systematic reviews evaluating the literature on Internet-based learning, Virtual Patients, and animations were discussed. Each session included a debate with the goal of reaching consensus on best current practices and future research. METHODS Thematic analysis of the discussions was performed to arrange the questions by theme, eliminate redundancy, and craft them into a cohesive narrative. RESULTS The question analysis revealed that there are clear advantages to the use of CAI, and that established educational theories should certainly inform the future development and selection of CAI tools. Schools adopting CAI need to carefully consider the benefits, cost, available resources, and capacity for teachers and learners to accept change in their practice of education. Future research should focus on the effectiveness of CAI instructional features, integration of e-learning into existing curricula and with other modalities like simulation, and the use of CAI in assessment of higher-level outcomes. CONCLUSIONS There are numerous opportunities for future research and it will be important to achieve consensus on important themes.
Affiliation(s)
- Marc M Triola
- New York University School of Medicine, New York, NY, USA.
308
Wolbrink TA, Burns JP. Internet-based learning and applications for critical care medicine. J Intensive Care Med 2011; 27:322-32. [PMID: 22173562] [DOI: 10.1177/0885066611429539] [Citation(s) in RCA: 26] [Impact Index Per Article: 2.0]
Abstract
PURPOSE Recent changes in duty hour allowances and economic constraints are forcing a paradigm shift in graduate medical education in the United States. Internet-based learning is a rapidly growing component of postgraduate medical education, including the field of critical care medicine. Here, we define the key concepts of Internet-based learning, summarize the current literature, and describe how Internet-based learning may be uniquely suited for the critical care provider. METHODS A MEDLINE/PubMed search was performed from January 2000 to July 2011 using the search terms: "e-learning," "Web-based learning," "computer-aided instruction," "adult learning," "knowledge retention," "intensive care," and "critical care." RESULTS The growth of the Internet is marked by the development of new technologies, including more user-derived tools. Nonmedical fields have embraced Internet-based learning as a valuable teaching tool. A recent meta-analysis described Internet-based learning in the medical field as being more effective than no intervention and likely as efficacious as traditional teaching methods. Web sites containing interactive features are aptly suited for the adult learner, complementing the paradigm shift to more learner-centered education. Interactive cases, simulators, and games may allow for improvement in clinical care. The total time spent utilizing Internet-based resources, as well as the frequency of returning to those sites, may influence educational gains. CONCLUSION Internet-based learning may provide an opportunity for assistance in the transformation of medical education. Many features of Web-based learning, including interactivity, make it advantageous for the adult medical learner, especially in the field of critical care medicine, and further work is necessary to develop a robust learning platform incorporating a variety of learning modalities for critical care providers.
Affiliation(s)
- Traci A Wolbrink
- Division of Critical Care Medicine, Department of Anesthesia, Perioperative and Pain Management, Children's Hospital Boston, Boston, MA 02115, USA.
309
Waimey KE, Krausfeldt AD, Taylor RL, Wallach HD, Woodruff TK. Understanding technology and human interaction to catalyze oncofertility and adolescent and young adult oncology research. J Adolesc Young Adult Oncol 2011; 1:160-163. [PMID: 23610736] [DOI: 10.1089/jayao.2012.0001] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2]
Affiliation(s)
- Kate E Waimey
- Oncofertility Consortium, Feinberg School of Medicine, Northwestern University, Chicago, Illinois; Department of Obstetrics and Gynecology, Feinberg School of Medicine, Northwestern University, Chicago, Illinois
310
Kontio R, Lahti M, Pitkänen A, Joffe G, Putkonen H, Hätönen H, Katajisto J, Välimäki M. Impact of eLearning course on nurses' professional competence in seclusion and restraint practices: a randomized controlled study (ISRCTN32869544). J Psychiatr Ment Health Nurs 2011; 18:813-21. [PMID: 21985684] [DOI: 10.1111/j.1365-2850.2011.01729.x] [Citation(s) in RCA: 22] [Impact Index Per Article: 1.7]
Abstract
Education on the care of aggressive and disturbed patients is fragmentary. eLearning could ensure the quality of such education, but data on its impact on professional competence in psychiatry are lacking. The aim of this study was to explore the impact of ePsychNurse.Net, an eLearning course, on psychiatric nurses' professional competence in seclusion and restraint and on their job satisfaction and general self-efficacy. In a randomized controlled study, 12 wards were randomly assigned to ePsychNurse.Net (intervention) or education as usual (control). Baseline and 3-month follow-up data on nurses' knowledge of coercion-related legislation, physical restraint and seclusion, their attitudes towards physical restraint and seclusion, job satisfaction and general self-efficacy were analysed for 158 completers. Knowledge (the primary outcome) of coercion-related legislation improved in the intervention group, while knowledge of physical restraint improved and knowledge of seclusion remained unchanged in both groups. General self-efficacy improved in the intervention group, as did attitude to seclusion in the control group. In the between-group comparison, attitudes to seclusion (one of the secondary outcomes) favoured the control group. Although ePsychNurse.Net demonstrated only slight advantages over conventional learning, it may be worth further development with, e.g., a flexible time schedule and individualized content.
Affiliation(s)
- R Kontio
- Department of Psychiatry, University of Turku, Turku, Finland.
311
Standardized patient-narrated web-based learning modules improve students' communication skills on a high-stakes clinical skills examination. J Gen Intern Med 2011; 26:1374-7. [PMID: 21769506] [PMCID: PMC3208474] [DOI: 10.1007/s11606-011-1809-3] [Citation(s) in RCA: 10] [Impact Index Per Article: 0.8]
Abstract
BACKGROUND Use of web-based standardized patient (SP) modules is associated with improved medical student history-taking and physical examination skills on clinical performance examinations (CPX), but a benefit for communication skills has not been shown. AIM We describe an innovative web-based SP module using detailed SP and faculty commentary to teach communication skills. SETTING A public medical school in 2008-2009. PARTICIPANTS Fourth-year medical students. PROGRAM DESCRIPTION A 90-minute web-based module with three simulated clinical encounters was narrated by an expert clinician and SP to explain expected history-taking, physical examination, and communication skills behaviors. All 147 students were encouraged to review the module one month before the CPX. PROGRAM EVALUATION One hundred and six students (72%) viewed the web-based module. Students who watched the module performed significantly higher on the CPX communication score (+2.67%, p < 0.01) and overall score (+2.12%, p = 0.03), even after controlling for USMLE Step 1 and clerkship summary ratings. Use of the module did not significantly affect history/physical examination scores (+1.89%, p = 0.12). DISCUSSION Students who watched an optional web-based SP module prior to the CPX performed higher than those who did not on communication skills. The web-based module appears to be an effective CPX preparatory activity to enhance communication performance.
312
Marsh-Tootle WL, McGwin G, Kohler CL, Kristofco RE, Datla RV, Wall TC. Efficacy of a web-based intervention to improve and sustain knowledge and screening for amblyopia in primary care settings. Invest Ophthalmol Vis Sci 2011; 52:7160-7. [PMID: 21730344] [DOI: 10.1167/iovs.10-6566] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.6]
Abstract
PURPOSE To evaluate the efficacy of a physician-targeted website to improve knowledge and self-reported behavior relevant to strabismus and amblyopia ("vision") in primary care settings. METHODS Eligible providers (filing Medicaid claims for at least eight well-child checks at ages 3 or 4 years, 1 year before study enrollment), randomly assigned to control (chlamydia and blood pressure) or vision groups, accessed four web-based educational modules, programmed to present interactive case vignettes with embedded questions and feedback. Each correct response, assigned a value of +1 to a maximum of +7, was used to calculate a summary score per provider. Responses from intervention providers (IPs) at baseline and two follow-up points were compared to responses to vision questions, taken at the end of the study, from control providers (CPs). RESULTS Most IPs (57/65) responded at baseline and after the short delay (within 1 hour after baseline for 38 IPs). A subgroup (27 IPs and 42 CPs) completed all vision questions after a long delay averaging 1.8 years. Scores from IPs improved after the short delay (median score, 3 vs. 6; P = 0.0065). Compared to CPs, scores from IPs were similar at baseline (P = 0.6473) and higher after the short-term (P < 0.0001) and long-term (P < 0.05) delay. CONCLUSIONS Significant improvements after the short delay demonstrate the efficacy of the website and the potential for accessible, standardized vision education. Although improvements subsided over time, the IPs' scores did not return to baseline levels and were significantly better compared to CPs tested 1 to 3 years later.
Affiliation(s)
- Wendy L Marsh-Tootle
- School of Optometry, University of Alabama at Birmingham, 1716 University Boulevard, Birmingham, AL 35294, USA.
313
Brouwers MC, Makarski J, Durocher LD, Levinson AJ. E-learning interventions are comparable to user's manual in a randomized trial of training strategies for the AGREE II. Implement Sci 2011; 6:81. [PMID: 21791080] [PMCID: PMC3162563] [DOI: 10.1186/1748-5908-6-81] [Citation(s) in RCA: 13] [Impact Index Per Article: 1.0]
Abstract
Background Practice guidelines (PGs) are systematically developed statements intended to assist in patient and practitioner decisions. The AGREE II is the revised tool for PG development, reporting, and evaluation, comprised of 23 items, two global rating scores, and a new User's Manual. In this study, we sought to develop, execute, and evaluate the impact of two internet interventions designed to accelerate the capacity of stakeholders to use the AGREE II. Methods Participants were randomized to one of three training conditions. 'Tutorial'--participants proceeded through the online tutorial with a virtual coach and reviewed a PDF copy of the AGREE II. 'Tutorial + Practice Exercise'--in addition to the Tutorial, participants also appraised a 'practice' PG. For the practice PG appraisal, participants received feedback on how their scores compared to expert norms and formative feedback if scores fell outside the predefined range. 'AGREE II User's Manual PDF (control condition)'--participants reviewed a PDF copy of the AGREE II only. All participants evaluated a test PG using the AGREE II. Outcomes of interest were learners' performance, satisfaction, self-efficacy, mental effort, time-on-task, and perceptions of AGREE II. Results No differences emerged between training conditions on any of the outcome measures. Conclusions We believe these results can be explained by better than anticipated performance of the AGREE II PDF materials (control condition) or the participants' level of health methodology and PG experience rather than the failure of the online training interventions. Some data suggest the online tools may be useful for trainees new to this field; however, this requires further study.
314
Cook DA, Levinson AJ, Garside S. Method and reporting quality in health professions education research: a systematic review. Medical Education 2011; 45:227-38. [PMID: 21299598] [DOI: 10.1111/j.1365-2923.2010.03890.x] [Citation(s) in RCA: 93] [Impact Index Per Article: 7.2]
Abstract
CONTEXT Studies evaluating reporting quality in health professions education (HPE) research have demonstrated deficiencies, but none have used comprehensive reporting standards. Additionally, the relationship between study methods and effect size (ES) in HPE research is unknown. OBJECTIVES This review aimed to evaluate, in a sample of experimental studies of Internet-based instruction, the quality of reporting, the relationship between reporting and methodological quality, and associations between ES and study methods. METHODS We conducted a systematic search of databases including MEDLINE, Scopus, CINAHL, EMBASE and ERIC, for articles published during 1990-2008. Studies (in any language) quantifying the effect of Internet-based instruction in HPE compared with no intervention or other instruction were included. Working independently and in duplicate, we coded reporting quality using the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement, and coded study methods using a modified Newcastle-Ottawa Scale (m-NOS), the Medical Education Research Study Quality Instrument (MERSQI), and the Best Evidence in Medical Education (BEME) global scale. RESULTS For reporting quality, articles scored a mean±standard deviation (SD) of 51±25% of STROBE elements for the Introduction, 58±20% for the Methods, 50±18% for the Results and 41±26% for the Discussion sections. We found positive associations (all p<0.0001) between reporting quality and MERSQI (ρ=0.64), m-NOS (ρ=0.57) and BEME (ρ=0.58) scores. We explored associations between study methods and knowledge ES by subtracting each study's ES from the pooled ES for studies using that method and comparing these differences between subgroups. Effect sizes in single-group pretest/post-test studies differed from the pooled estimate more than ESs in two-group studies (p=0.013). No difference was found between other study methods (yes/no: representative sample, comparison group from same community, randomised, allocation concealed, participants blinded, assessor blinded, objective assessment, high follow-up). CONCLUSIONS Information is missing from all sections of reports of HPE experiments. Single-group pre-/post-test studies may overestimate ES compared with two-group designs. Other methodological variations did not bias study results in this sample.
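Editor's note: the subgroup analysis described above (comparing how far individual study effect sizes sit from a pooled estimate between studies that do and do not use a given method) can be sketched as follows. This is a heavily simplified reading of that procedure: the pooling is a plain inverse-variance estimate, absolute deviations are compared with a t-test, and all numbers are invented; the review's actual calculations may differ.

```python
# Simplified illustration of a deviation-from-pooled subgroup comparison.
# Effect sizes and variances are invented.
from scipy.stats import ttest_ind

def pooled_fixed_effect(es, var):
    """Inverse-variance weighted mean effect size."""
    w = [1.0 / v for v in var]
    return sum(wi * e for wi, e in zip(w, es)) / sum(w)

# (effect size, variance, is_single_group_pre_post_design)
studies = [
    (1.3, 0.05, True), (1.6, 0.06, True), (1.1, 0.04, True),
    (0.5, 0.03, False), (0.6, 0.04, False), (0.4, 0.05, False), (0.7, 0.03, False),
]
pooled = pooled_fixed_effect([s[0] for s in studies], [s[1] for s in studies])

dev_single = [abs(s[0] - pooled) for s in studies if s[2]]
dev_two    = [abs(s[0] - pooled) for s in studies if not s[2]]
t, p = ttest_ind(dev_single, dev_two)
print(f"pooled ES = {pooled:.2f}; |deviation| single-group vs two-group: t = {t:.2f}, p = {p:.3f}")
```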
Affiliation(s)
- David A Cook
- Division of General Internal Medicine, College of Medicine, Mayo Clinic, Rochester, Minnesota 55905, USA
315
Legris MÈ, Séguin NC, Desforges K, Sauvé P, Lord A, Bell R, Berbiche D, Desrochers JF, Lemieux JP, Morin-Bélanger C, Ste-Marie Paradis F, Lalonde L. Pharmacist Web-based training program on medication use in chronic kidney disease patients: impact on knowledge, skills, and satisfaction. The Journal of Continuing Education in the Health Professions 2011; 31:140-150. [PMID: 21953653] [DOI: 10.1002/chp.20119] [Citation(s) in RCA: 12] [Impact Index Per Article: 0.9]
Abstract
INTRODUCTION Chronic kidney disease (CKD) patients are multimorbid elderly patients at high risk of drug-related problems. A Web-based training program was developed based on a list of significant drug-related problems in CKD patients requiring a pharmaceutical intervention. The objectives were to evaluate the impact of the program on community pharmacists' knowledge and skills and their satisfaction with the training. METHODS Pharmacists were randomized to the training program or the control group. Training comprised a 60-minute Web-based interactive session supported by a clinical guide. Pharmacists completed a questionnaire on knowledge (10 multiple-choice questions) and skills (2 clinical vignettes) at baseline and a second time within 1 month. Trained pharmacists completed a written satisfaction questionnaire. Semidirected telephone interviews were conducted with 8 trained pharmacists. Changes in knowledge and skills scores were compared between the groups. RESULTS Seventy pharmacists (training: 52; control: 18) were recruited; the majority were women with <15 years' experience. Compared with the control group, an adjusted incremental increase in the knowledge score (22%; 95% confidence interval [CI]: 16%-27%) and skills score (24%; 95% CI: 16%-33%) was observed in the training group. Most pharmacists (87%-100%) rated each aspect of the program "excellent" or "very good." Additional training and adding a discussion forum were suggested to complement the program. DISCUSSION Pharmacists liked the Web-based continuing education program. Over a short time span, the program improved their knowledge and skills. Its impact on their clinical practices and quality of medication use in CKD patients remains to be assessed.
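Editor's note: an "incremental increase" with a 95% CI, like the 22% (16%-27%) reported above, is essentially a between-group difference in change scores. The sketch below shows an unadjusted version of that quantity with a normal-approximation CI; the per-pharmacist change values are invented, and the study's actual estimate was covariate-adjusted.

```python
# Illustrative difference-in-change estimate with a normal-approximation 95% CI.
# All numbers are invented; the real analysis was adjusted for covariates.
import math

def mean(xs):
    return sum(xs) / len(xs)

def se_of_mean_diff(a, b):
    """Standard error of the difference between two independent group means."""
    va = sum((x - mean(a)) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mean(b)) ** 2 for x in b) / (len(b) - 1)
    return math.sqrt(va / len(a) + vb / len(b))

# Per-pharmacist change in knowledge score (post minus pre), in percentage points.
change_training = [25, 20, 18, 30, 22, 27, 24, 19, 26, 21]
change_control  = [2, -1, 4, 0, 3, 1, -2, 5]

diff = mean(change_training) - mean(change_control)
se = se_of_mean_diff(change_training, change_control)
print(f"incremental increase = {diff:.1f} points, "
      f"95% CI {diff - 1.96*se:.1f} to {diff + 1.96*se:.1f}")
```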
Affiliation(s)
- Marie-Ève Legris
- Université de Montréal, Hôpital Maisonneuve-Rosemont, Montréal, Canada
316
Hauge LS, Frischknecht AC, Gauger PG, Hirshfield LE, Harkins D, Butz DA, Taheri PA. Web-based curriculum improves residents' knowledge of health care business. J Am Coll Surg 2010; 211:777-83. [DOI: 10.1016/j.jamcollsurg.2010.07.011] [Citation(s) in RCA: 18] [Impact Index Per Article: 1.3]
317
Cook DA, Levinson AJ, Garside S. Time and learning efficiency in Internet-based learning: a systematic review and meta-analysis. Advances in Health Sciences Education: Theory and Practice 2010; 15:755-70. [PMID: 20467807] [DOI: 10.1007/s10459-010-9231-x] [Citation(s) in RCA: 92] [Impact Index Per Article: 6.6]
Abstract
UNLABELLED Authors have claimed that Internet-based instruction promotes greater learning efficiency than non-computer methods. OBJECTIVES To determine, through a systematic synthesis of evidence in health professions education, how Internet-based instruction compares with non-computer instruction in time spent learning, and what features of Internet-based instruction are associated with improved learning efficiency. DATA SOURCES We searched databases including MEDLINE, CINAHL, EMBASE, and ERIC from 1990 through November 2008. STUDY SELECTION AND DATA ABSTRACTION We included all studies quantifying learning time for Internet-based instruction for health professionals, compared with other instruction. Reviewers worked independently, in duplicate, to abstract information on interventions, outcomes, and study design. RESULTS We identified 20 eligible studies. Random effects meta-analysis of 8 studies comparing Internet-based with non-Internet instruction (positive numbers indicating Internet longer) revealed a pooled effect size (ES) for time of -0.10 (p = 0.63). Among comparisons of two Internet-based interventions, providing feedback adds time (ES 0.67, p = 0.003, two studies), and greater interactivity generally takes longer (ES 0.25, p = 0.089, five studies). One study demonstrated that adapting to learner prior knowledge saves time without significantly affecting knowledge scores. Other studies revealed that audio narration, video clips, interactive models, and animations increase learning time but also facilitate higher knowledge and/or satisfaction. Across all studies, time correlated positively with knowledge outcomes (r = 0.53, p = 0.021). CONCLUSIONS On average, Internet-based instruction and non-computer instruction require similar time. Instructional strategies to enhance feedback and interactivity typically prolong learning time, but in many cases also enhance learning outcomes. Isolated examples suggest potential for improving efficiency in Internet-based instruction.
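Editor's note: a random-effects pooled effect size like the time ES of -0.10 above is commonly computed with a DerSimonian-Laird style estimator. The sketch below shows that standard calculation on invented per-study effect sizes and within-study variances; it is not the review's dataset or code.

```python
# Standard DerSimonian-Laird random-effects pooling, shown on invented
# per-study effect sizes (es) and within-study variances (var).
import math

def dersimonian_laird(es, var):
    k = len(es)
    w = [1.0 / v for v in var]                       # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, es)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, es))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)               # between-study variance
    w_star = [1.0 / (v + tau2) for v in var]
    pooled = sum(wi * e for wi, e in zip(w_star, es)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se, tau2

es  = [-0.30, 0.10, -0.05, 0.20, -0.25, 0.05, -0.15, 0.02]   # hypothetical
var = [0.04, 0.06, 0.05, 0.08, 0.05, 0.07, 0.06, 0.05]

pooled, se, tau2 = dersimonian_laird(es, var)
print(f"pooled ES = {pooled:.2f} "
      f"(95% CI {pooled - 1.96*se:.2f} to {pooled + 1.96*se:.2f}), tau^2 = {tau2:.3f}")
```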
Affiliation(s)
- David A Cook
- Division of General Internal Medicine and Office of Education Research, Mayo Clinic College of Medicine, Baldwin 4-A, 200 First Street SW, Rochester, MN 55905, USA.
318
Cook DA, Garside S, Levinson AJ, Dupras DM, Montori VM. What do we mean by web-based learning? A systematic review of the variability of interventions. Medical Education 2010; 44:765-74. [PMID: 20633216] [DOI: 10.1111/j.1365-2923.2010.03723.x] [Citation(s) in RCA: 125] [Impact Index Per Article: 8.9]
Abstract
OBJECTIVES Educators often speak of web-based learning (WBL) as a single entity or a cluster of similar activities with homogeneous effects. Yet a recent systematic review demonstrated large heterogeneity among results from individual studies. Our purpose is to describe the variation in configurations, instructional methods and presentation formats in WBL. METHODS We systematically searched MEDLINE, EMBASE, ERIC, CINAHL and other databases (last search November 2008) for studies comparing a WBL intervention with no intervention or another educational activity. From eligible studies we abstracted information on course participants, topic, configuration and instructional methods. We summarised this information and then purposively selected and described several WBL interventions that illustrate specific technologies and design features. RESULTS We identified 266 eligible studies. Nearly all courses (89%) used written text and most (55%) used multimedia. A total of 32% used online communication via e-mail, threaded discussion, chat or videoconferencing, and 9% implemented synchronous components. Overall, 24% blended web-based and non-computer-based instruction. Most web-based courses (77%) employed specific instructional methods, other than text alone, to enhance the learning process. The most common instructional methods (each used in nearly 50% of courses) were patient cases, self-assessment questions and feedback. We describe several studies to illustrate the range of instructional designs. CONCLUSIONS Educators and researchers cannot treat WBL as a single entity. Many different configurations and instructional methods are available for WBL instructors. Researchers should study when to use specific WBL designs and how to use them effectively.
Affiliation(s)
- David A Cook
- Department of Medicine, College of Medicine, Mayo Clinic, Rochester, Minnesota 55905, USA.
319
Shortt SED, Guillemette JM, Duncan AM, Kirby F. Defining quality criteria for online continuing medical education modules using modified nominal group technique. The Journal of Continuing Education in the Health Professions 2010; 30:246-50. [PMID: 21171030] [DOI: 10.1002/chp.20089] [Citation(s) in RCA: 14] [Impact Index Per Article: 1.0]
Abstract
INTRODUCTION The rapid increase in the use of the Internet for continuing education by physicians suggests the need to define quality criteria for accredited online modules. METHODS Continuing medical education (CME) directors from Canadian medical schools and academic researchers participated in a consensus process, Modified Nominal Group Technique, to develop agreement on the most important quality criteria to guide module development. Rankings were compared to responses to a survey of a subset of Canadian Medical Association (CMA) members. RESULTS A list of 17 items was developed, of which 10 were deemed by experts to be important and 7 were considered secondary. A quality module would: be needs-based; presented in a clinical format; utilize evidence-based information; permit interaction with content and experts; facilitate and attempt to document practice change; be accessible for later review; and include a robust course evaluation. There was less agreement among CMA members on criteria ranking, with consensus on ranking reached on only 12 of 17 items. In contrast to experts, members agreed that the need to assess performance change as a result of an educational experience was not important. DISCUSSION This project identified 10 quality criteria for accredited online CME modules that representatives of Canadian organizations involved in continuing education believe should be taken into account when developing learning products. The lack of practitioner support for documentation of change in clinical behavior may suggest that they favor traditional attendance- or completion-based CME; this finding requires further research.
Affiliation(s)
- S E D Shortt
- Knowledge Transfer & Practice Policy, Canadian Medical Association, Ottawa, ON, Canada.
320