1. Liu Z, Xu Y, Lin Y, Yu P, Ji M, Luo Z. A partially flipped physiology classroom improves the deep learning approach of medical students. Advances in Physiology Education 2024; 48:446-454. [PMID: 38602011] [DOI: 10.1152/advan.00196.2023]
Abstract
This study compared the impact of a partially flipped physiology classroom (PFC) and a traditional lecture-based classroom (TLC) on students' learning approaches. The study was conducted over 5 months at Xiangya School of Medicine, from February to July 2022, and comprised 71 students majoring in clinical medicine. The experimental group (n = 32) received PFC teaching, whereas the control group (n = 39) received TLC teaching. The Revised Two-Factor Study Process Questionnaire (R-SPQ-2F) was used to assess the impact of the different teaching methods on students' learning approaches. After the PFC, students scored significantly higher on the deep learning approach (Z = -3.133, P < 0.05). Conversely, after the TLC, students scored significantly higher on the surface learning approach (Z = -2.259, P < 0.05). After the course, students in the PFC group scored significantly higher on deep learning strategy than those in the TLC group (Z = -2.196, P < 0.05). The PFC model had a positive impact on deep learning motive and strategy, leading to an improvement in the deep approach, which benefits students' long-term development. In contrast, the TLC model improved only the surface learning approach. The study implies that educators should consider implementing the PFC to enhance students' learning approaches.
NEW & NOTEWORTHY: In this article, we compare the impact of the partially flipped classroom (PFC) and the traditional lecture classroom (TLC) in a physiology course on medical students' learning approaches. We found that the PFC benefited students by significantly enhancing their deep learning motive, strategy, and approach. The TLC model, however, improved only the surface learning motive and approach.
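The R-SPQ-2F scores used in this abstract are simple sums of Likert ratings over fixed item subsets (deep vs. surface). As a minimal illustrative sketch, assuming the item-to-scale key from Biggs's published version of the questionnaire (the key is not stated in this abstract):

```python
# Score the R-SPQ-2F: 20 items rated 1-5, split into deep and surface scales
# per the commonly cited Biggs (2001) key (assumed here, not from this abstract).
DEEP_ITEMS = [1, 2, 5, 6, 9, 10, 13, 14, 17, 18]
SURFACE_ITEMS = [3, 4, 7, 8, 11, 12, 15, 16, 19, 20]

def score_rspq2f(responses):
    """responses: dict mapping item number (1-20) to a 1-5 Likert rating."""
    if set(responses) != set(range(1, 21)):
        raise ValueError("expected ratings for items 1-20")
    deep = sum(responses[i] for i in DEEP_ITEMS)
    surface = sum(responses[i] for i in SURFACE_ITEMS)
    return {"deep": deep, "surface": surface}  # each scale ranges 10-50

# Example: a respondent rating every deep item 4 and every surface item 2
example = {i: (4 if i in DEEP_ITEMS else 2) for i in range(1, 21)}
print(score_rspq2f(example))  # {'deep': 40, 'surface': 20}
```

Group comparisons such as the Z statistics reported above would then be run on these scale totals.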
Affiliation(s)
- Ziqi Liu: Xiangya School of Medicine, Central South University, Changsha, China
- Yangting Xu: Xiangya School of Medicine, Central South University, Changsha, China; Shanghai Key Laboratory of Psychotic Disorders, Brain Health Institute, National Center for Mental Disorders, Shanghai Mental Health Center, Shanghai Jiaotong University School of Medicine, Shanghai, China
- Yicheng Lin: Xiangya School of Medicine, Central South University, Changsha, China
- Pei Yu: Xiangya School of Medicine, Central South University, Changsha, China
- Ming Ji: Department of Physiology, School of Basic Medicine, Central South University, Changsha, China
- Ziqiang Luo: Department of Physiology, School of Basic Medicine, Central South University, Changsha, China
2. Kusurkar RA, Orsini C, Somra S, Artino AR, Daelmans HE, Schoonmade LJ, van der Vleuten C. The Effect of Assessments on Student Motivation for Learning and Its Outcomes in Health Professions Education: A Review and Realist Synthesis. Academic Medicine 2023; 98:1083-1092. [PMID: 37146237] [PMCID: PMC10453393] [DOI: 10.1097/acm.0000000000005263]
Abstract
PURPOSE In health professions education (HPE), the effect of assessments on student motivation for learning and its consequences have been largely neglected. This is problematic because assessments can hamper motivation and psychological well-being. The research questions guiding this review were: How do assessments affect student motivation for learning in HPE? What outcomes does this lead to in which contexts? METHOD In October 2020, the authors searched PubMed, Embase, APA PsycInfo, ERIC, CINAHL, and Web of Science Core Collection for "assessments" AND "motivation" AND "health professions education/students." Empirical papers or literature reviews investigating the effect of assessments on student motivation for learning in HPE using quantitative, qualitative, or mixed methods from January 1, 2010, to October 29, 2020, were included. The authors chose the realist synthesis method for data analysis to study the intended and unintended consequences of this complex topic. Assessments were identified as stimulating autonomous or controlled motivation using sensitizing concepts from self-determination theory, and data on context-mechanism-outcome were extracted. RESULTS Twenty-four of 15,291 articles were ultimately included. Assessments stimulating controlled motivation seemed to have negative outcomes. An example of an assessment that stimulates controlled motivation is one that focuses on factual knowledge (context), which encourages studying only for the assessment (mechanism) and results in surface learning (outcome). Assessments stimulating autonomous motivation seemed to have positive outcomes. An example of an assessment that stimulates autonomous motivation is one that is fun (context), which through active learning (mechanism) leads to higher effort and better connection with the material (outcome). CONCLUSIONS These findings indicate that students strategically learned what was expected to appear in assessments at the expense of what was needed in practice. Therefore, health professions educators should rethink their assessment philosophy and practices and introduce assessments that are relevant to professional practice and stimulate genuine interest in the content.
Affiliation(s)
- Rashmi A. Kusurkar: professor and research programme leader, Research in Education, Amsterdam University Medical Centers location Vrije Universiteit Amsterdam; LEARN! Research Institute for Learning and Education, Faculty of Psychology and Education, VU University Amsterdam; and Amsterdam Public Health, Quality of Care, Amsterdam, the Netherlands. ORCID: http://orcid.org/0000-0002-9382-0379
- Cesar Orsini: associate professor in medical education, Norwich Medical School, University of East Anglia, Norwich, United Kingdom, and researcher in health professions education, Faculty of Dentistry, Universidad de Los Andes, Santiago, Chile. ORCID: http://orcid.org/0000-0002-5226-3625
- Sunia Somra: research assistant, Research in Education, Amsterdam University Medical Centers location Vrije Universiteit Amsterdam, Amsterdam, the Netherlands, at the time of this study
- Anthony R. Artino Jr: professor and associate dean for evaluation and educational research, School of Medicine & Health Sciences, George Washington University, Washington, DC. ORCID: http://orcid.org/0000-0003-2661-7853
- Hester E.M. Daelmans: director of the master of medicine programme, Faculty of Medicine, Vrije Universiteit Amsterdam, Amsterdam, the Netherlands
- Linda J. Schoonmade: information specialist, medical library, Vrije Universiteit Amsterdam, Amsterdam, the Netherlands. ORCID: https://orcid.org/0000-0002-2407-5977
- Cees van der Vleuten: professor, School of Health Professions Education, University of Maastricht, Maastricht, the Netherlands. ORCID: http://orcid.org/0000-0001-6802-3119
3. Gardner NP, Gormley GJ, Kearney GP. Is there ever a single best answer (SBA): assessment driving certainty in the uncertain world of GP? Education for Primary Care 2023; 34:180-183. [PMID: 37642400] [DOI: 10.1080/14739879.2023.2243447]
Abstract
Uncertainty is inherent in all areas of medical practice, not least in primary care, which is defined by its acceptance of uncertainty and complexity. Single best answer (SBA) questions are a ubiquitous assessment tool in undergraduate medical assessments; however clinical practice, particularly in primary care, challenges the supposition that a single best answer exists for all clinical encounters and dilemmas. In this article, we seek to highlight several aspects of the relationship between this assessment format and clinical uncertainty by considering its influence on medical students' views of uncertainty in the contexts of their medical education, personal epistemology, and clinical expectations.
Affiliation(s)
- Nick P Gardner: Centre for Medical Education, Queen's University Belfast, Belfast, UK
- Gerard J Gormley: Centre for Medical Education, Queen's University Belfast, Belfast, UK
- Grainne P Kearney: Centre for Medical Education, Queen's University Belfast, Belfast, UK
4. Planas De Lathawer V. An exploration of Third-Year student midwives' experiences of High-Risk Module Assessment in preparation for practice and real-world emergencies. Midwifery 2022; 114:103450. [DOI: 10.1016/j.midw.2022.103450]
5. Pham H, Court-Kowalski S, Chan H, Devitt P. Writing Multiple Choice Questions: Has the Student Become the Master? Teaching and Learning in Medicine 2022:1-12. [PMID: 35491868] [DOI: 10.1080/10401334.2022.2050240]
Abstract
CONSTRUCT We compared the quality of clinician-authored and student-authored multiple choice questions (MCQs) using a formative, mock examination of clinical knowledge for medical students. BACKGROUND Multiple choice questions are a popular format in medical programs of assessment. A challenge for educators is creating high-quality items efficiently; for expediency's sake, a standard practice is for faculties to repeat items in examinations from year to year. This study compares the quality of student-authored with clinician-authored items as a potential source of new items for faculty item banks. APPROACH We invited Year IV and V medical students at the University of Adelaide to participate in a mock examination. The participants first completed an online instructional module on strategies for answering and writing MCQs, then each submitted one original MCQ for potential inclusion in the mock examination. Two 180-item mock examinations, one for each year level, were constructed, each consisting of 90 student-authored and 90 clinician-authored items. Participants were blinded to the author of each item. Each item was analyzed for item difficulty and discrimination, number of item-writing flaws (IWFs) and non-functioning distractors (NFDs), and cognitive skill level (using a modified version of Bloom's taxonomy). FINDINGS Eighty-nine and 91 students completed the Year IV and V examinations, respectively. Student-authored items, compared with clinician-authored items, tended to be written at both a lower cognitive skill and difficulty level. They contained a significantly higher rate of IWFs (2-3.5 times) and NFDs (1.18 times). However, they discriminated as well as or better than clinician-authored items. CONCLUSIONS Students can author MCQ items with discrimination comparable to clinician-authored items, despite being inferior on other parameters. Student-authored items may be considered a potential source of material for faculty item banks; however, several barriers exist to their use in a summative setting. The overall quality of items remains suboptimal, regardless of author, which highlights the need for ongoing faculty training in item writing.
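The item difficulty and discrimination indices this study reports are standard classical test theory quantities. A hedged sketch of how they are typically computed (the study's exact indices and software are not stated; the data below are invented for illustration):

```python
# Classical item analysis for one MCQ item: difficulty (proportion correct)
# and point-biserial discrimination against the total test score.
from statistics import mean, pstdev

def item_difficulty(item_scores):
    """Proportion of examinees answering the item correctly (0/1 scores)."""
    return mean(item_scores)

def point_biserial(item_scores, total_scores):
    """Correlation between a dichotomous item and the total test score."""
    sd = pstdev(total_scores)
    if sd == 0:
        return 0.0
    correct = [t for s, t in zip(item_scores, total_scores) if s == 1]
    incorrect = [t for s, t in zip(item_scores, total_scores) if s == 0]
    if not correct or not incorrect:
        return 0.0  # no variation on the item
    p = len(correct) / len(item_scores)
    q = 1 - p
    return (mean(correct) - mean(incorrect)) / sd * (p * q) ** 0.5

# Hypothetical data: one item's 0/1 scores and the examinees' total scores
item = [1, 1, 0, 1, 0, 1, 1, 0]
totals = [78, 85, 52, 90, 48, 70, 88, 55]
print(round(item_difficulty(item), 2))       # 0.62
print(round(point_biserial(item, totals), 2))
```

A higher point-biserial means examinees who got the item right also scored higher overall, which is the sense in which student-authored items "discriminated as well as or better" here.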
Affiliation(s)
- Hannah Pham: Adelaide Medical School, University of Adelaide, Adelaide, South Australia
- Stefan Court-Kowalski: Adelaide Medical School, University of Adelaide, Adelaide, South Australia; Royal Adelaide Hospital, Adelaide, South Australia
- Hong Chan: SA Ambulance Service, Eastwood, South Australia
- Peter Devitt: Adelaide Medical School, University of Adelaide, Adelaide, South Australia
6. Lagoo JY, Joshi SB. Introduction of direct observation of procedural skills (DOPS) as a formative assessment tool during postgraduate training in anaesthesiology: Exploration of perceptions. Indian J Anaesth 2021; 65:202-209. [PMID: 33776110] [PMCID: PMC7989495] [DOI: 10.4103/ija.ija_124_20]
Abstract
Background and Aims: Formative assessment of the procedural skills of anaesthesiology postgraduate (PG) students is not conventionally conducted. Direct observation of procedural skills (DOPS) helps identify gaps in performance and provides structured feedback. The present study was undertaken to explore the perceptions of PG students and faculty about DOPS. Methods: This mixed-design interventional study was conducted on 12 PG students and 10 faculty members in the Department of Anaesthesiology. After conducting DOPS, participants completed a pre-validated questionnaire on their perceptions using a 3-point Likert scale along with open-ended questions. Statistical analysis used descriptive statistics of perception to calculate percentages, and themes were identified for the qualitative data. Results: Students responded positively about skill improvement (83%), time provided (75%), feedback (100%), interaction (83%), motivation (83%), satisfaction (83%), effectiveness (83%) and opportunity creation (92%). Faculty responded positively regarding change in attitude (100%), effectiveness (100%), scope (90%), feasible application (90%), ease (90%), opportunity creation (80%), gap identification (100%) and satisfaction (80%). However, 60% felt training was required, and 50% thought more time and commitment were required. The themes identified were that DOPS is comprehensive, interactive, student-friendly and a good teaching-learning tool; identifies gaps; provides focus for learning; provides systematic constructive feedback; improves skills; prepares students for future practice; requires planning; may not reflect competence; has assessor variability; and can be included in the PG curriculum. Conclusion: DOPS was perceived as an effective assessment and teaching-learning tool by PG students as well as faculty.
Affiliation(s)
- Jui Y Lagoo: Department of Anaesthesia, Symbiosis Medical College for Women, Pune, Maharashtra, India
- Shilpa B Joshi: Department of Anaesthesia, St John's Medical College Hospital, Bangalore, Karnataka, India
7. Schüttpelz-Brauns K, Karay Y, Arias J, Gehlhar K, Zupanic M. Comparison of the evaluation of formative assessment at two medical faculties with different conditions of undergraduate training, assessment and feedback. GMS Journal for Medical Education 2020; 37:Doc41. [PMID: 32685669] [PMCID: PMC7346285] [DOI: 10.3205/zma001334]
Abstract
Introduction: Both formative and summative assessments have their place in medical curricula: formative assessment to accompany the learning process and summative assessment to ensure that minimum standards are achieved. Depending on the conditions of undergraduate training, assessment and feedback, students place more or less importance on formative assessment, and thus the fulfilment of its function may be questionable. This study describes how the low-stakes formative Berlin Progress Test (BPT) is embedded at two medical faculties with partially different framework conditions and what effects these have on the students' test-taking effort and on their evaluation of the test, especially the perception of its benefits and (intangible) costs, such as non-participation in contemporaneous activities and emotional impairments. Methods: In this study, the proportion of non-serious BPT participants at two medical faculties (total sample: NF1=1,410, NF2=1,176) in winter term 2015/16 was determined both by the number of unanswered questions on the test itself and in a survey using a standardized instrument (NF1=415, NF2=234). Furthermore, open questions were asked in this survey about perceived benefits and perceived costs, which were analyzed with qualitative and quantitative methods. Results: The BPT is generally better accepted at Faculty 2. This can be seen in the higher proportion of serious test takers, the lower perceived costs and the higher reported benefit, as well as the higher proportion of constructive comments. Faculty 2 students better understood the principle of formative testing and used the results of the BPT as feedback on their own knowledge progress, motivation to learn and reduction of exam fear. Discussion: When medical faculties integrate formative assessments into the curriculum, they have to provide a framework in which these assessments are perceived as an important part of the curriculum. Otherwise, it is questionable whether they can fulfil their function of accompanying the learning process.
Affiliation(s)
- Yassin Karay: University of Cologne, Medical Faculty, Cologne, Germany
- Johann Arias: RWTH Aachen University, Medical Faculty, Aachen, Germany
- Kirsten Gehlhar: Carl von Ossietzky University, School of Medicine and Health Sciences, Oldenburg, Germany
8. Jasemi M, Ahangarzadeh Rezaie S, Hemmati Maslakpak M, Parizad N. Are workplace-based assessment methods (DOPS and Mini-CEX) effective in nursing students' clinical skills? A single-blind randomized, parallel group, controlled trial. Contemp Nurse 2020; 55:565-575. [PMID: 32107975] [DOI: 10.1080/10376178.2020.1735941]
Abstract
Background: Evaluation of clinical skills is critically important for nursing students; however, the quality of existing evaluation tools is poor. Objectives: To evaluate the effectiveness of Direct Observation of Procedural Skills (DOPS) and the Mini-Clinical Evaluation Exercise (Mini-CEX) on the clinical skills of nursing students. Methods: This study was conducted among 108 senior nursing students. Mini-CEX and DOPS were used to evaluate clinical skills in the intervention group. Results: The mean of students' scores in all five procedures was significantly higher in the intervention group than in the control group. Students' scores for the procedures rose significantly from the first stage of DOPS and Mini-CEX to the third stage. Conclusions: Using DOPS and Mini-CEX to evaluate the clinical skills of nursing students effectively enhances their learning. Implementing such assessment methods promotes students' clinical skills, which ultimately helps them provide high-quality care for their patients.
Affiliation(s)
- Madineh Jasemi: Faculty of Nursing and Midwifery, Urmia University of Medical Sciences, Urmia, Iran
- Masumeh Hemmati Maslakpak: Faculty of Nursing and Midwifery, Urmia University of Medical Sciences, Urmia, Iran; Maternal and Childhood Obesity Research Center, Urmia University of Medical Sciences, Urmia, Iran
- Naser Parizad: Faculty of Nursing and Midwifery, Urmia University of Medical Sciences, Urmia, Iran; Patient Safety Research Center, Urmia University of Medical Sciences, Urmia, Iran
9. Kordestani Moghaddam A, Khankeh HR, Shariati M, Norcini J, Jalili M. Educational impact of assessment on medical students' learning at Tehran University of Medical Sciences: a qualitative study. BMJ Open 2019; 9:e031014. [PMID: 31362972] [PMCID: PMC6677973] [DOI: 10.1136/bmjopen-2019-031014]
Abstract
OBJECTIVES It has been shown that assessment strongly affects students' performance, but a deeper insight is needed into the interplay of assessment and learning. The aim of the current study was to develop a model explaining the educational impact of assessments on students' learning before, during and after the test. DESIGN This study used semistructured interviews, focus group discussions, observation and the collection of field notes. A qualitative methodology using the grounded theory data analysis approach was then used to generate an explanation of how assessment impacts students' learning. SETTING School of Medicine, Tehran University of Medical Sciences. PARTICIPANTS Participants were medical students and teachers, selected for first-hand experience or expertise in assessment and willingness to participate in the study. Fifteen people (eight medical students, seven faculty members) were interviewed, and one focus group discussion (with five students) was held. RESULTS The extracted concepts were classified into four main categories: the elements of the assessment programme that affect learning, the mechanisms through which they exert their effects, contextual factors, and the impact they have on learning. These elements and their interplay occur within an environment with its antecedent characteristics. CONCLUSIONS This study suggests a model for understanding the elements of assessment which, within their context, affect learning, the mechanisms through which they exert their effects, and the final outcomes obtained.
Affiliation(s)
- Hamid Reza Khankeh: Health in Emergency and Disaster Research Center, University of Social Welfare and Rehabilitation Sciences, Tehran, Iran; Department of Clinical Science and Education, Karolinska Institute, Stockholm, Sweden
- Mohammad Shariati: Department of Medical Education, Department of Community Medicine, School of Medicine, Tehran University of Medical Sciences, Tehran, Iran
- John Norcini: Foundation for Advancement of International Medical Education and Research, Philadelphia, Pennsylvania, USA
- Mohammad Jalili: Department of Medical Education, Tehran University of Medical Sciences, Tehran, Iran; Department of Emergency Medicine, School of Medicine, Health Professions Education Research Center, Tehran University of Medical Sciences, Tehran, Iran
10. Bakoush O, Al Dhanhani A, Alshamsi S, Grant J, Norcini J. Does performance on United States national board of medical examiners reflect student clinical experiences in United Arab Emirates? MedEdPublish 2019; 8:4. [PMID: 38089293] [PMCID: PMC10712583] [DOI: 10.15694/mep.2019.000004.2]
Abstract
Background: A number of medical schools around the world use the United States National Board of Medical Examiners (NBME) subject examinations for clerkship assessment of student performance, yet these exams were blueprinted against the United States national core clerkship curriculum, which may differ from the local curricula to which they are applied in other parts of the world. In this study, we investigated the correlation between internal medicine clinical experiences at United Arab Emirates University and student performance on the NBME internal medicine subject examination. Methods: One hundred and seven of 145 junior clerkship students (74%) who finished their internal medicine clerkship during the academic years 2014-2015 and 2015-2016 participated in this study. The students' clinical experiences were measured with the Clinical Learning Evaluation Questionnaire (CLEQ) and by the logged number of meaningful patient contacts during their internal medicine clerkship. Results: Linear regression analysis showed no significant association between performance on the subject examination and student clinical experiences as measured by the CLEQ or the number of logged patients. NBME scores were weakly correlated with OSCE scores (φ = 0.20). Conclusions: The findings raise uncertainty about the suitability of the NBME subject examination in the clerkship assessment program in the United Arab Emirates.
Affiliation(s)
- Janet Grant: Centre for Medical Education in Context (CenMEDIC) and Department of Education in Medicine
- John Norcini: Foundation for Advancement of International Medical Education and Research
11. Direct observation of procedural skills (DOPS) evaluation method: Systematic review of evidence. Med J Islam Repub Iran 2018; 32:45. [PMID: 30159296] [PMCID: PMC6108252] [DOI: 10.14196/mjiri.32.45]
Abstract
Background: Evaluation is one of the most important aspects of medical education, and new methods of effective evaluation are needed in this area; direct observation of procedural skills (DOPS) is one such method. This study systematically reviewed the evidence on this type of assessment to support its effective use.
Methods: Data were collected by searching keywords such as evaluation, assessment, medical education, and direct observation of procedural skills (DOPS) in Google Scholar, PubMed, Science Direct, SID, Medlib and Google, and by searching unpublished sources (grey literature) and selected references (reference of reference).
Results: Of 236 papers, 28 were studied. Satisfaction with the DOPS method was found to be moderate. The major strengths of this evaluation method are as follows: providing feedback to participants and promoting independence and practical skills during assessment. However, stressful evaluation, time limitation for participants, and bias between assessors are its main drawbacks. A positive impact of the DOPS method on student performance has been noted in most studies, and the results showed that the validity and reliability of DOPS are relatively acceptable and the performance of participants relatively satisfactory. However, failure to provide the necessary training on how to conduct the DOPS test, failure to give essential feedback to participants, and insufficient time for the test are the major drawbacks of DOPS in practice.
Conclusion: According to the results of this study, DOPS can be applied as a valuable and effective evaluation method in medical education; however, more attention should be paid to the quality of these tests.
12. Yusuf L, Ahmed A, Yasmin R. Educational impact of Mini-Clinical Evaluation Exercise: A game changer. Pak J Med Sci 2018; 34:405-411. [PMID: 29805417] [PMCID: PMC5954388] [DOI: 10.12669/pjms.342.14667]
Abstract
Background and objective: Workplace-based assessment has a strong educational impact on students' clinical performance by steering their learning towards the desired outcomes. Educational impact is rarely measured in medical education, and this study attempts to measure the educational impact of workplace-based assessment on postgraduate residents. The aim of this study was to explore the educational impact of the Mini-CEX (mini-clinical evaluation exercise) on residents' learning. Methods: A convergent parallel mixed-methods design was used, with 10 participants identified through non-probability convenience sampling. Each participant completed four Mini-CEX encounters, which generated scores (the quantitative data); after the first two encounters, and again after the third and fourth, each participant was interviewed using a structured interview technique. Data were entered into SPSS version 21 to calculate descriptive statistics. Inferential statistics were determined using ANOVA to assess improvement in scores over time, with P values reported for statistical significance. Qualitative analysis used a thematic approach based on a priori themes derived from the interview questions, and NVivo was used for triangulation of themes. Results: The results indicate a statistically significant improvement in scores (P values were considered significant at 0.05). The qualitative analysis suggested reasons for the improvement in scores and residents' satisfaction, such as feedback, motivation, self-directed learning and peer-assisted learning. Conclusion: The study concluded that residents' learning behaviour and their satisfaction with the assessment method can be enhanced through workplace-based assessment, particularly the Mini-CEX, encouraging its use in similar settings. However, the scope for generalisation of the results remains limited owing to the small sample size.
Affiliation(s)
- Lamia Yusuf: MBBS, FCPS, MHPE; Assistant Professor (Gynaecology/Obstetrics), Rashid Latif Medical College, Lahore, Pakistan
- Amina Ahmed: MBBS, M.Phil, MCHP-HPE; Supervisor, Associate Professor, Medical Education, Fatima Memorial College, Lahore, Pakistan
- Raheela Yasmin: BDS, DCPS-HPE, MHPE, PhD-HPE; Co-supervisor, Medical Education, Riphah International University, Islamabad, Pakistan
13. Svirko E, Mellanby J. Teaching neuroanatomy using computer-aided learning: What makes for successful outcomes? Anatomical Sciences Education 2017; 10:560-569. [PMID: 28431201] [DOI: 10.1002/ase.1694]
Abstract
Computer-aided learning (CAL) is an integral part of many medical courses. The neuroscience course for medical students at Oxford University includes a CAL course in neuroanatomy, a subject particularly suited to CAL because it requires much detailed three-dimensional visualization, which can be presented on screen. The CAL course was evaluated using the concept of approach to learning: the aims of university teaching are congruent with the deep approach (seeking meaning and relating new information to previous knowledge) rather than the surface approach of concentrating on rote learning of detail. Seven cohorts of medical students (N = 869) completed an approach-to-learning scale and a questionnaire investigating their engagement with the CAL course. The students' scores on the CAL-course-based neuroanatomy assessment and later university examinations were obtained. Although the students reported less use of the deep approach for the neuroanatomy CAL course than for the rest of their neuroanatomy course (mean = 24.99 vs. 31.49, P < 0.001), deep approach for CAL was positively correlated with neuroanatomy assessment performance (r = 0.12, P < 0.001). Time spent on the CAL course, enjoyment of it, and the number of CAL videos watched and quizzes completed were each significantly positively related to deep approach. The relationship between deep approach and enjoyment was particularly notable (25.5% shared variance). The observed relationships between deep approach and academic performance support the desirability of the deep approach in university students. It is proposed that enjoyment of the course, and with it the deep approach, could be increased by incorporating more clinical material, which is what the students liked most.
Affiliation(s)
- Elena Svirko
- Oxford Group for Children's Potential, Department of Experimental Psychology, University of Oxford, Oxford, United Kingdom
- Jane Mellanby
- Oxford Group for Children's Potential, Department of Experimental Psychology, University of Oxford, Oxford, United Kingdom
|
14
|
Watling C, LaDonna KA, Lingard L, Voyer S, Hatala R. 'Sometimes the work just needs to be done': socio-cultural influences on direct observation in medical training. MEDICAL EDUCATION 2016; 50:1054-64. [PMID: 27628722 DOI: 10.1111/medu.13062] [Citation(s) in RCA: 72] [Impact Index Per Article: 9.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/07/2015] [Revised: 12/08/2015] [Accepted: 02/26/2016] [Indexed: 05/14/2023]
Abstract
CONTEXT Direct observation promises to strengthen both coaching and assessment, and calls for its increased use in medical training abound. Despite its apparent potential, the uptake of direct observation in medical training remains surprisingly limited outside the formal assessment setting. The limited uptake of observation raises questions about cultural barriers to its use. In this study, we explore the influence of professional culture on the use of direct observation within medical training. METHODS Using a constructivist grounded theory approach, we interviewed 22 residents or fellows (10 male, 12 female) about their experiences of being observed during training. Participants represented a range of specialties and training levels. Data collection and analysis were conducted iteratively. Themes were identified using constant comparative analysis. RESULTS Observation was used selectively; specialties tended to observe the clinical acts that they valued most. Despite these differences, we found two cultural values that consistently challenged the ready implementation of direct observation across specialties: (i) autonomy in learning and (ii) efficiency in health care provision. Furthermore, we found that direct observation was a primarily learner-driven activity, which left learners caught in the middle, wanting observation but also wanting to appear independent and efficient. CONCLUSIONS The cultural values of autonomy in learning and practice and efficiency in health care provision challenge the integration of direct observation into clinical training. Medical learners are often expected to ask for observation, but such requests are socially and culturally fraught, and likely to constrain the wider uptake of direct observation.
Affiliation(s)
- Christopher Watling
- Department of Clinical Neurological Sciences, Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada.
- Kori A LaDonna
- Centre for Education Research & Innovation, Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada
- Lorelei Lingard
- Department of Medicine and Centre for Education Research & Innovation, Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada
- Stephane Voyer
- Department of Medicine, University of British Columbia, Vancouver, British Columbia, Canada
- Rose Hatala
- Department of Medicine, University of British Columbia, Vancouver, British Columbia, Canada
|
15
|
Gerhard-Szep S, Güntsch A, Pospiech P, Söhnel A, Scheutzel P, Wassmann T, Zahn T. Assessment formats in dental medicine: An overview. GMS JOURNAL FOR MEDICAL EDUCATION 2016; 33:Doc65. [PMID: 27579365 PMCID: PMC5003142 DOI: 10.3205/zma001064] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.4] [Reference Citation Analysis] [Abstract] [MESH Headings] [Subscribe] [Scholar Register] [Received: 10/23/2015] [Revised: 03/24/2016] [Accepted: 05/09/2016] [Indexed: 05/25/2023]
Abstract
AIM At the annual meeting of German dentists in Frankfurt am Main in 2013, the Working Group for the Advancement of Dental Education (AKWLZ) initiated an interdisciplinary working group to address assessments in dental education. This paper presents an overview of the current work being done by this working group, some of whose members are also actively involved in the German Association for Medical Education's (GMA) working group for dental education. The aim is to present a summary of the current state of research on this topic for all those who participate in the design, administration and evaluation of university-specific assessments in dentistry. METHOD Based on systematic literature research, the testing scenarios listed in the National Competency-based Catalogue of Learning Objectives (NKLZ) have been compiled and presented in tables according to assessment value. RESULTS Different assessment scenarios are described briefly in table form addressing validity (V), reliability (R), acceptance (A), cost (C), feasibility (F), and the influence on teaching and learning (EI) as presented in the current literature. Infoboxes were deliberately chosen to allow readers quick access to the information and to facilitate comparisons between the various assessment formats. Following each description is a list summarizing the uses in dental and medical education. CONCLUSION This overview provides a summary of competency-based testing formats. It is meant to have a formative effect on dental and medical schools and provide support for developing workplace-based strategies in dental education for learning, teaching and testing in the future.
Affiliation(s)
- Susanne Gerhard-Szep
- Goethe-Universität, Carolinum Zahnärztliches Universitäts-Institut gGmbH, Poliklinik Zahnerhaltungskunde, Frankfurt am Main, Germany
- Arndt Güntsch
- Marquette University School of Dentistry, Department of Surgical Sciences, Milwaukee, USA, and Universitätsklinikum Jena, Zentrum für Zahn-, Mund- und Kieferheilkunde, Jena, Germany
- Peter Pospiech
- Universität Würzburg, Poliklinik für Zahnärztliche Prothetik, Würzburg, Germany
- Andreas Söhnel
- Universitätsmedizin Greifswald, Poliklinik für Zahnärztliche Prothetik, Alterszahnheilkunde und medizinischer Werkstoffkunde, Greifswald, Germany
- Petra Scheutzel
- Universitätsklinikum Münster, Poliklinik für Prothetische Zahnmedizin & Biomaterialien, Münster, Germany
- Torsten Wassmann
- Universitätsmedizin Göttingen, Poliklinik für Zahnärztliche Prothetik, Göttingen, Germany
- Tugba Zahn
- Goethe-Universität, Carolinum Zahnärztliches Universitäts-Institut gGmbH, Poliklinik für Zahnärztliche Prothetik, Frankfurt am Main, Germany
|
16
|
Cobb KA, Brown G, Hammond R, Mossop LH. Students' perceptions of the Script Concordance Test and its impact on their learning behavior: a mixed methods study. JOURNAL OF VETERINARY MEDICAL EDUCATION 2015; 42:45-52. [PMID: 25526762 DOI: 10.3138/jvme.0514-057r1] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/04/2023]
Abstract
The Script Concordance Test (SCT) is increasingly used in postgraduate and undergraduate education as a method of summative clinical assessment. It has been shown to have high validity and reliability, but there is little evidence of its use in veterinary education as assessment for learning. This study investigates students' perceptions of the SCT and its effects on their approaches to learning. Final-year undergraduates of the School of Veterinary Medicine and Science (SVMS) at the University of Nottingham participated in a mixed-methods study after completing three formative SCT assessments. A qualitative, thematic analysis was produced from transcripts of three focus group discussions. The quantitative study was a survey based on the analyses of the qualitative study. Of the 50 students who registered for the study, 18 participated in the focus groups and 28 completed the survey. Clinical experience was regarded as the most useful source of information for answering the SCT. The students also indicated that recall of facts was perceived as useful for multiple-choice questions but least useful for the SCT. Themes identified in the qualitative study related to the reliability, acceptability, educational impact, and validity of the SCT. The evidence from this study shows that the SCT has high face validity among veterinary students. They reported that it encouraged them to reflect upon their clinical experience, to participate in discussions of case material, and to adopt a deeper approach to clinical learning. These findings strongly suggest that the SCT is potentially a valuable method for assessing clinical reasoning and enhancing student learning.
|