1
Parekh P, Bahadoor V. The Utility of Multiple-Choice Assessment in Current Medical Education: A Critical Review. Cureus 2024; 16:e59778. PMID: 38846235; PMCID: PMC11154086; DOI: 10.7759/cureus.59778.
Abstract
In recent years, healthcare education providers have professed a conscious shift towards building clinical competence via assessments that promote more active learning. Despite this, multiple-choice questions remain among the most prevalent forms of assessment. Much of the literature justifies multiple-choice testing on the grounds of its high validity and reliability. Education providers also benefit from the lower resource and cost requirements of question development and from the relative ease of adapting questions to accommodate neurodiversity. However, when these (and other) variables are examined via a structured approach to utility, it becomes clear that the advantages depend largely on the quality of the questions written, the level of clinical competence to be attained by learners, and the extent to which confounding variables such as differential attainment are accounted for. This review discusses attempts at improving the utility of multiple-choice question testing in modern healthcare curricula, as well as the impact of these modifications on performance.
Affiliation(s)
- Priya Parekh
- Trauma and Orthopaedics, Wirral University Teaching Hospital, Wirral, GBR
- Vikesh Bahadoor
- Trauma and Orthopaedics, Wirral University Teaching Hospital, Wirral, GBR
2
Kim HS, Johnson TM, Stancoven BW, Inouye KA, Millan CP, Lincicum AR. Predictors of standardized in-service examination performance and residency outcomes in a graduate periodontics program. J Dent Educ 2024; 88:403-410. PMID: 38269493; DOI: 10.1002/jdd.13445.
Abstract
PURPOSE/OBJECTIVES The objectives of this study were to assess the influence of learner- and education-related factors on standardized in-service examination performance and determine whether in-service examination scores predict residency outcomes. METHODS American Academy of Periodontology (AAP) In-service Examination (AIE) scores from 10 periodontics residency classes at a single center were recorded and compared against a panel of learner- and education-related variables using multiple linear regression models. Defined residency outcome measures were analyzed against AIE scores using binomial logistic regression. RESULTS No evaluated learner- or education-related variable was a statistically significant predictor of AIE score in this study sample. Likewise, AIE score was not a statistically significant predictor of any assessed residency outcome. CONCLUSIONS The AAP has performed a tremendous service to periodontics residents and programs by marshaling the leadership and expertise necessary to offer a professionally constructed assessment instrument. However, in the current study, no relationship could be identified between AIE score and any outcome, including first-attempt board certification. The AAP In-service Committee appears well situated to provide additional leadership focusing on exam implementation, which may enhance AIE value in competency decision making.
Affiliation(s)
- Han S Kim
- Department of Periodontics, Army Postgraduate Dental School, Postgraduate Dental College, Uniformed Services University, Fort Eisenhower, Georgia, USA
- Thomas M Johnson
- Department of Periodontics, Army Postgraduate Dental School, Postgraduate Dental College, Uniformed Services University, Fort Eisenhower, Georgia, USA
- Brian W Stancoven
- Department of Periodontics, Army Postgraduate Dental School, Postgraduate Dental College, Uniformed Services University, Fort Eisenhower, Georgia, USA
- Kimberly Ann Inouye
- Department of Periodontics, Army Postgraduate Dental School, Postgraduate Dental College, Uniformed Services University, Fort Eisenhower, Georgia, USA
- Claudia P Millan
- Department of Periodontics, Army Postgraduate Dental School, Postgraduate Dental College, Uniformed Services University, Fort Eisenhower, Georgia, USA
- Adam R Lincicum
- Department of Periodontics, Army Postgraduate Dental School, Postgraduate Dental College, Uniformed Services University, Fort Eisenhower, Georgia, USA
3
Rath A. Back to basics: reflective take on role of MCQs in undergraduate Malaysian dental professional qualifying exams. Front Med (Lausanne) 2023; 10:1287924. PMID: 38098841; PMCID: PMC10719850; DOI: 10.3389/fmed.2023.1287924.
Affiliation(s)
- Avita Rath
- Faculty of Dentistry, SEGi University, Petaling Jaya, Selangor, Malaysia
- Edinburgh Medical School- Clinical Education, University of Edinburgh, Edinburgh, United Kingdom
4
Sims DA, Cilliers FJ. Clinician educators' conceptions of assessment in medical education. Adv Health Sci Educ Theory Pract 2023; 28:1053-1077. PMID: 36662334; PMCID: PMC10624725; DOI: 10.1007/s10459-022-10197-5.
Abstract
In pursuing assessment excellence, clinician-educators who design and implement assessment are pivotal. The influence of their assessment practice in university-run licensure exams on student learning has direct implications for future patient care. While teaching practice has been shown to parallel conceptions of teaching, we know too little about conceptions of assessment in medical education to tell whether the same holds for assessment practice and conceptions of assessment. To explore clinician-educators' conceptions of assessment, a phenomenographic study was undertaken. Phenomenography explores conceptions, the qualitatively different ways of understanding a phenomenon. Data analysis identifies a range of hierarchically inclusive categories of understanding, from simple to more complex, and the dimensions that distinguish each category or conception. Thirty-one clerkship convenors in three diverse Southern settings were interviewed in three cycles of iterative data collection and analysis. Four conceptions of assessment were identified: passive operator, awakening enquirer, active owner and scholarly assessor. Six dimensions were elucidated to describe and distinguish each conception: purpose of assessment; temporal perspective; role and responsibility; accountability; reflexivity and emotional valence. Additionally, three characteristics that appeared to track the progressive nature of the conceptions were identified: professional identity, assessment literacy and self-efficacy. These conceptions encompass and extend previously described conceptions across different educational levels, disciplines and contexts, suggesting applicability to other settings. There is some evidence of a relationship between conceptions and practice, suggesting, together with the hierarchical nature of these conceptions, that targeting conceptions during faculty development may be an effective approach to enhance assessment practice.
Affiliation(s)
- D A Sims
- University of the Western Cape, 14 Blanckenberg Street, Bellville, South Africa.
- F J Cilliers
- Faculty of Health Sciences, University of Cape Town, Cape Town, South Africa
5
Chan SCC, Choa G, Kelly J, Maru D, Rashid MA. Implementation of virtual OSCE in health professions education: A systematic review. Med Educ 2023; 57:833-843. PMID: 37080907; DOI: 10.1111/medu.15089.
Abstract
INTRODUCTION The Objective Structured Clinical Examination (OSCE) has been widely used in health professions education since the 1970s. The global disruption caused by the COVID-19 pandemic restricted in-person assessments and medical educators globally sought alternative means to assess and certify students and trainees to meet the acute demand for health-care workers. One such solution was through virtual OSCE (vOSCE), which modified traditional in-person OSCE using videoconference platforms. This meta-ethnography sought to synthesise qualitative literature on candidates' and assessors' experiences of vOSCE to evaluate whether it may have a role in future assessment practices. METHODS In June 2022, we systematically searched PsycINFO, Medline and ERIC for peer-reviewed qualitative and mixed-methods articles that described candidates' and assessors' experiences of virtual OSCE in health professions education. Of 1069 articles identified, 17 were synthesised using meta-ethnography. RESULTS The final synthesis represented 1190 candidates and assessors from faculties of medicine, dentistry, nursing, pharmacy and osteopathy. We developed our findings into four key concepts. 'Strengthening confidence in a virtual environment' highlighted attempts to overcome and mitigate concerns associated with transitioning from in-person to virtual assessment. 'Understanding the scope of use as an assessment' reflected on the suitability of vOSCE in assessing various skills. 'Refining operational processes' emphasised the technical challenges of implementing vOSCE and impacts on accessibility and resources. 'Envisioning its future role' considered the applicability of vOSCE in the climate of rapid development in telehealth. CONCLUSION This meta-ethnography highlighted that although vOSCE was primarily considered a temporary and crisis response, candidates and assessors recognised positive, as well as negative, consequences of the transition towards them. 
Moving forward, medical education policymakers should carefully consider the extent to which elements of vOSCE could be incorporated into assessment systems, particularly in light of the rise of telehealth in clinical practice.
Affiliation(s)
- See Chai Carol Chan
- Faculty of Medical Sciences, UCL Medical School, University College London, London, UK
- George Choa
- Department of Radiology, Cambridge University Hospitals NHS Foundation Trust, Cambridge, UK
- James Kelly
- Faculty of Medical Sciences, UCL Medical School, University College London, London, UK
- Devina Maru
- Faculty of Medical Sciences, UCL Medical School, University College London, London, UK
- Mohammed Ahmed Rashid
- Faculty of Medical Sciences, UCL Medical School, University College London, London, UK
6
Hermasari BK, Nugroho D, Maftuhah A, Pamungkasari EP, Budiastuti VI, Laras AA. Promoting medical student's clinical reasoning during COVID-19 pandemic. Korean J Med Educ 2023; 35:187-198. PMID: 37291847; DOI: 10.3946/kjme.2023.259.
Abstract
PURPOSE The development of students' clinical reasoning skills should be a consideration in the design of instruction and evaluation in medical education. In response to the coronavirus disease 2019 (COVID-19) pandemic, several changes to the medical curriculum were implemented to promote clinical reasoning. This study aims to explore medical students' perceptions and experiences with the clinical reasoning curriculum during the COVID-19 pandemic and determine their skills development. METHODS The study used a mixed-method design with a concurrent approach. A cross-sectional study was conducted to compare and examine the relationship between the outcomes of the structured oral examination (SOE) and the Diagnostic Thinking Inventory (DTI). For the qualitative arm, a focus group discussion using a semi-structured interview guide with open-ended questions was conducted, and the verbatim transcripts were subjected to thematic analysis. RESULTS SOE and DTI scores increase from second-year to fourth-year students. The diagnostic thinking domains and SOE are significantly correlated (r=0.302, 0.313, and 0.241; p<0.05). The three primary themes from the qualitative analysis are perceptions regarding clinical reasoning, clinical reasoning activities, and the learning component. CONCLUSION Even though students were still studying throughout the COVID-19 pandemic, their clinical reasoning skills improved. Medical students' clinical reasoning and diagnostic thinking skills increase with each year of study. Online case-based learning and assessment support the development of clinical reasoning skills, as do positive attitudes toward faculty and peers, case type, and prior knowledge.
Affiliation(s)
- Dian Nugroho
- Faculty of Medicine, Sebelas Maret University, Surakarta, Indonesia
- Atik Maftuhah
- Faculty of Medicine, Sebelas Maret University, Surakarta, Indonesia
7
Castanelli D. Sociocultural learning theory and assessment for learning. Med Educ 2023; 57:382-384. PMID: 36760219; DOI: 10.1111/medu.15028.
Affiliation(s)
- Damian Castanelli
- School of Clinical Sciences at Monash Health, Monash University, Clayton, Victoria, Australia
8
Zubiaurre Bitzer LA, Dathatri S, Fine JB, Swan Sein A. Building a student learning-focused assessment and grading system in dental school: One school's experience. J Dent Educ 2023; 87:614-624. PMID: 36607618; DOI: 10.1002/jdd.13158.
Abstract
PURPOSE/OBJECTIVES As health professions education moves toward competency-based education, there has been increased focus on the structure of assessment systems that support student competency development and learning. This has been buoyed by a growing body of research supporting assessment for learning processes to promote student growth and learning rather than relying on assessment systems primarily to measure performance. This paper presents the rationale and evidence for moving to an assessment for learning system and the results of a quasi-experimental interrupted time series study using data from 2015 to 2022 to evaluate the impacts of these changes. METHODS Columbia University College of Dental Medicine faculty voted to implement assessment for learning system changes beginning in 2017 with the graduating class of 2021. These changes included moving from using a grading system for didactic courses with Honors, Pass, and Fail as available grades to a grading system with only Pass and Fail as available grades, as well as creating synthesis and assessment weeks, weekly problem sets, post-exam review sessions, exam remediation opportunities, and formative progress exams throughout the curriculum. The revised assessment and grading system changes were communicated to residency program directors, and programmatic competency data about student performance across the curriculum were shared with programs in Dean's Letters. RESULTS Once assessment system changes were implemented, it was found that student exam failure rates were lower, course exam scores were the same or higher, and performance on board exams improved compared to the national average. Students reported positive perceptions with regard to well-being and learning climate that they associated with the adoption of Pass/Fail grading. Match outcomes, including student satisfaction and program director ratings, have remained consistently positive. 
CONCLUSION As dental educators, our goal is to nurture students to become life-long learners. Adopting a Pass/Fail grading structure and an assessment system that fosters learning allows students to shape learning practices that favor long-term retention and application of information, while also enhancing the learning environment and student well-being. These system changes may also facilitate the inclusion and support of students whose backgrounds are underrepresented in dentistry.
Affiliation(s)
- Shubha Dathatri
- Vagelos College of Physicians and Surgeons, Columbia University, New York, New York, USA
- James B Fine
- College of Dental Medicine, Columbia University, New York, New York, USA
- Aubrie Swan Sein
- Vagelos College of Physicians and Surgeons, Columbia University, New York, New York, USA
9
Taking the Big Leap: A Case Study on Implementing Programmatic Assessment in an Undergraduate Medical Program. Educ Sci 2022. DOI: 10.3390/educsci12070425.
Abstract
The concept of programmatic assessment (PA) is well described in the literature; however, studies on implementing and operationalizing this systemic assessment approach are lacking. The present case study developed a local instantiation of PA, referred to as Assessment System Fribourg (ASF), which was inspired by an existing program. ASF was utilized for a new competency-based undergraduate Master of Medicine program at the State University of Fribourg. ASF relies on the interplay of four key principles and nine main program elements based on concepts of PA, formative assessment, and evaluative judgment. We started our journey in 2019 with the first cohort of 40 students who graduated in 2022. This paper describes our journey implementing ASF, including the enabling factors and hindrances that we encountered, and reflects on our experience and the path that is still in front of us. This case illustrates one possibility for implementing PA.
10
Castanelli DJ, Weller JM, Molloy E, Bearman M. Trust, power and learning in workplace-based assessment: The trainee perspective. Med Educ 2022; 56:280-291. PMID: 34433230; PMCID: PMC9292503; DOI: 10.1111/medu.14631.
Abstract
For trainees to participate meaningfully in workplace-based assessment (WBA), they must have trust in their assessor. However, the trainee's dependent position complicates such trust. Understanding how power and trust influence WBAs may help us make them more effective learning opportunities. We conducted semi-structured interviews with 17 postgraduate anaesthesia trainees across Australia and New Zealand. Sensitised by notions of power, we used constructivist grounded theory methodology to examine trainees' experiences with trusting their supervisors in WBAs. In our trainee accounts, we found that supervisors held significant power to mediate access to learning opportunities and influence trainee progress in training. All episodes where supervisors could observe trainees, from simply working together to formal WBAs, were seen to generate assessment information with potential consequences. In response, trainees actively acquiesced to a deferential role, which helped them access desirable expertise and minimise the risk of reputational harm. Trainees granted trust based on how they anticipated a supervisor would use the power inherent in their role. Trainees learned to ration exposure of their authentic practice to supervisors in proportion to their trust in them. Trainees were more trusting and open to learning when supervisors used their power for the trainee's benefit and avoided WBAs with supervisors they perceived as less trustworthy. If assessment for learning is to flourish, then the trainee-supervisor power dynamic must evolve. Enhancing supervisor behaviour through reflection and professional development to better reward trainee trust would invite more trainee participation in assessment for learning. Modifying the assessment system design to nudge the power balance towards the trainee may also help. Modifications could include designated formative and summative assessments or empowering trainees to select which assessments count towards progress decisions. 
Attending to power and trust in WBA may stimulate progress towards the previously aspirational goal of assessment for learning in the workplace.
Affiliation(s)
- Damian J. Castanelli
- School of Clinical Sciences at Monash Health, Monash University, Clayton, Victoria, Australia
- Department of Anaesthesia and Perioperative Medicine, Monash Health, Clayton, Victoria, Australia
- Centre for Research and Assessment in Digital Learning (CRADLE), Deakin University, Geelong, Victoria, Australia
- Jennifer M. Weller
- Centre for Medical and Health Sciences Education, School of Medicine, University of Auckland, Auckland, New Zealand
- Department of Anaesthesia, Auckland, New Zealand
- Elizabeth Molloy
- Department of Medical Education, Melbourne Medical School, University of Melbourne, Melbourne, Victoria, Australia
- Margaret Bearman
- Centre for Research and Assessment in Digital Learning (CRADLE), Deakin University, Geelong, Victoria, Australia
11
Saiyad S, Bhagat P, Virk A, Mahajan R, Singh T. Changing Assessment Scenarios: Lessons for Changing Practice. Int J Appl Basic Med Res 2021; 11:206-213. PMID: 34912682; PMCID: PMC8633695; DOI: 10.4103/ijabmr.ijabmr_334_21.
Abstract
Assessment is a process that includes ascertainment of improvement in the performance of students over time, motivation of students to study, evaluation of teaching methods, and ranking of student capabilities. It is an important component of the educational process, influencing student learning. Although we have embarked on a new curricular model, assessment has remained largely ignored despite being the hallmark of competency-based education. During the earlier stages, assessment was considered akin to "measurement," believing that competence is "generic, fixed and transferable across content," could be measured quantitatively and could be expressed as a single score. Objective assessment was the norm, and subjective tools were considered unreliable and biased. It was soon realized that "competence is specific and nontransferable," mandating the use of multiple assessment tools across multiple content areas with multiple assessors. A paradigm change through "programmatic assessment" only occurred with the understanding that competence is "dynamic, incremental and contextual." Here, information about the students' competence and progress is gathered continually over time, analysed and supplemented with purposefully collected additional information when needed, using a carefully selected combination of tools and assessor expertise, leading to an authentic, observation-driven, institutional assessment system. In the conduct of any performance assessment, the assessor remains an important part of the process, making assessor training indispensable. In this paper, we look at the changing paradigms of our understanding of clinical competence and corresponding global changes in assessment, and then make a case for adopting the prevailing trends in the assessment of clinical competence.
Affiliation(s)
- Shaista Saiyad
- Department of Physiology, Smt N H L Municipal Medical College, Ahmedabad, Gujarat, India
- Purvi Bhagat
- M and J Western Regional Institute of Ophthalmology, B. J. Medical College, Ahmedabad, Gujarat, India
- Amrit Virk
- Department of Community Medicine, Adesh Medical College and Hospital, Kurukshetra, Haryana, India
- Rajiv Mahajan
- Department of Pharmacology, Adesh Institute of Medical Sciences and Research, Bathinda, Punjab, India
- Tejinder Singh
- Department of Medical Education, Sri Guru Ram Das Institute of Medical Sciences and Research, Amritsar, Punjab, India
12
Pearce J, Tavares W. A philosophical history of programmatic assessment: tracing shifting configurations. Adv Health Sci Educ Theory Pract 2021; 26:1291-1310. PMID: 33893881; DOI: 10.1007/s10459-021-10050-1.
Abstract
Programmatic assessment is now well entrenched in medical education, allowing us to reflect on when it first emerged and how it evolved into the form we know today. Drawing upon the intellectual tradition of historical epistemology, we provide a philosophically oriented historiographical study of programmatic assessment. Our goal is to trace its relatively short historical trajectory by describing shifting configurations in its scene of inquiry, focusing on questions, practices, and philosophical presuppositions. We identify three historical phases: emergence, evolution and entrenchment. For each, we describe the configurations of the scene; examine underlying philosophical presuppositions driving changes; and detail upshots in assessment practice. We find that programmatic assessment emerged in response to positivist 'turmoil' prior to 2005, driven by utility considerations and implicit pragmatist undertones. Once introduced, it evolved with notions of diversity and learning being underscored, and a constructivist ontology developing at its core. More recently, programmatic assessment has become entrenched as its own sub-discipline. Rich narratives have been emphasised, but philosophical underpinnings have been blurred. We hope to shed new light on current assessment practices in the medical education community by interrogating the history of programmatic assessment from this philosophical vantage point. Making philosophical presuppositions explicit highlights the perspectival nature of aspects of programmatic assessment and suggests reasons for perceived benefits as well as potential tensions, contradictions and vulnerabilities in the approach today. We conclude by offering some reflections on important points to emerge from our historical study, and suggest 'what next' for programmatic assessment in light of this endeavour.
Affiliation(s)
- J Pearce
- Tertiary Education (Assessment), Australian Council for Educational Research, 19 Prospect Hill Road, Camberwell, VIC, 3124, Australia.
- W Tavares
- The Wilson Centre and Post-MD Education, University Health Network and University of Toronto, Toronto, ON, Canada
13
Ross S, Hauer KE, Wycliffe-Jones K, Hall AK, Molgaard L, Richardson D, Oswald A, Bhanji F. Key considerations in planning and designing programmatic assessment in competency-based medical education. Med Teach 2021; 43:758-764. PMID: 34061700; DOI: 10.1080/0142159X.2021.1925099.
Abstract
Programmatic assessment as a concept is still novel for many in clinical education, and there may be a disconnect between the academics who publish about programmatic assessment and the front-line clinical educators who must put theory into practice. In this paper, we clearly define programmatic assessment and present high-level guidelines about its implementation in competency-based medical education (CBME) programs. The guidelines are informed by literature and by lessons learned from established programmatic assessment approaches. We articulate five steps to consider when implementing programmatic assessment in CBME contexts: articulate the purpose of the program of assessment, determine what must be assessed, choose tools fit for purpose, consider the stakes of assessments, and define processes for interpreting assessment data. In the process, we seek to offer a helpful guide or template for front-line clinical educators. We dispel some myths about programmatic assessment to help training programs as they look to design (or redesign) programs of assessment. In particular, we highlight the notion that programmatic assessment is not 'one size fits all'; rather, it is a system of assessment that results when shared common principles are considered and applied by individual programs as they plan and design their own bespoke model of programmatic assessment for CBME in their unique context.
Affiliation(s)
- Shelley Ross
- Department of Family Medicine, University of Alberta, Edmonton, Canada
- Canadian Association for Medical Education, Edmonton, Canada
- Keith Wycliffe-Jones
- Department of Family Medicine, Cumming School of Medicine, University of Calgary, Calgary, Canada
- Andrew K Hall
- Department of Emergency Medicine, Queen's University, Kingston, Canada
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Laura Molgaard
- University of Minnesota College of Veterinary Medicine, St. Paul, MN, USA
- Denyse Richardson
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Division of Physiatry, Department of Medicine, University of Toronto, Toronto, Canada
- Anna Oswald
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Department of Medicine and CBME lead for the Faculty of Medicine & Dentistry, University of Alberta, Edmonton, Canada
- Farhan Bhanji
- Royal College of Physicians and Surgeons of Canada, Ottawa, Canada
- Pediatrics at McGill University, Montreal, Canada
14
Pan TLT, Chen FG. Opportunity within a crisis - A push towards programmatic assessment in the COVID-19 pandemic situation. Med Educ 2021; 55:637. PMID: 33686734; PMCID: PMC8250595; DOI: 10.1111/medu.14486.
15
Swan Sein A, Rashid H, Meka J, Amiel J, Pluta W. Twelve tips for embedding assessment for and as learning practices in a programmatic assessment system. Med Teach 2021; 43:300-306. PMID: 32658603; DOI: 10.1080/0142159X.2020.1789081.
Abstract
Programmatic assessment supports the evolution from assessment of learning to fostering assessment for learning and as learning practices. A well-designed programmatic assessment system aligns educational objectives, learning opportunities, and assessments with the goals of supporting student learning, making decisions about student competence and promotion, and supporting curriculum evaluation. We present evidence-based guidance for implementing assessment for and as learning practices in the pre-clinical knowledge assessment system to help students learn, synthesize, master and retain content for the long term so that they can apply knowledge to patient care. Practical tips span several domains: the culture and motivation of assessment, including how an honour code and competency-based grading system can support an assessment system that develops student self-regulated learning and professional identity; curricular assessment structure, such as how and when to utilize low-stakes and cumulative assessment to drive learning; exam and question structure, including which authentic question and exam types best facilitate learning; and assessment follow-up and review considerations, such as exam retake processes that support learning, and academic success structures. A culture change is likely necessary for administrators, faculty members, and students to embrace assessment as, most importantly, a learning tool for students and programs.
Affiliation(s)
- Aubrie Swan Sein
- Columbia Vagelos College of Physicians and Surgeons, Columbia University, New York, NY, USA
- Hanin Rashid
- Rutgers Robert Wood Johnson Medical School, Piscataway, NJ, USA
- Jennifer Meka
- Jacobs School of Medicine and Biomedical Sciences, State University of New York (SUNY) at Buffalo, Buffalo, NY, USA
- Jonathan Amiel
- Columbia Vagelos College of Physicians and Surgeons, Columbia University, New York, NY, USA
- William Pluta
- Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, USA
16
Dart J, Twohig C, Anderson A, Bryce A, Collins J, Gibson S, Kleve S, Porter J, Volders E, Palermo C. The Value of Programmatic Assessment in Supporting Educators and Students to Succeed: A Qualitative Evaluation. J Acad Nutr Diet 2021; 121:1732-1740. [PMID: 33612437 DOI: 10.1016/j.jand.2021.01.013] [Citation(s) in RCA: 3] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/29/2020] [Revised: 01/13/2021] [Accepted: 01/17/2021] [Indexed: 11/28/2022]
Abstract
BACKGROUND Programmatic assessment has been proposed as the way forward for competency-based assessment, yet there is a dearth of literature describing the implementation and evaluation of programmatic assessment approaches. OBJECTIVE To evaluate the implementation of a programmatic assessment and explore its ability to support students and assessors. DESIGN A qualitative evaluation of programmatic assessment was employed. PARTICIPANTS/SETTING Interviews with graduates (n = 8) and preceptors (n = 12), together with focus groups with faculty assessors (n = 9), from one Australian university explored experiences of the programmatic approach, the role of assessment in learning, and the defensibility of assessment decisions in determining competence. ANALYSIS PERFORMED Data were analyzed into key themes using framework analysis. RESULTS The programmatic assessment increased confidence in the defensibility of assessment decisions, reduced the emotional burden of assessment, increased the value of assessment, and identified and remediated at-risk students earlier when philosophical and practice shifts in approaches to assessment were embraced. CONCLUSIONS Programmatic assessment supports a holistic approach to competency development and assessment and has multiple benefits for learners and assessors.
17
Implementing a competency-based midwifery programme in Lesotho: A gap analysis. Nurse Educ Pract 2018; 34:72-78. [PMID: 30466039 DOI: 10.1016/j.nepr.2018.11.005] [Citation(s) in RCA: 6] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/24/2018] [Revised: 10/05/2018] [Accepted: 11/06/2018] [Indexed: 11/21/2022]
Abstract
Global reforms in health professions education, including midwifery, support the transformation of education programmes to adopt competency-based models. Lesotho, a small sub-Saharan African country, with perennially high maternal and neonatal mortality, adopted a competency-based education model in the design and subsequent implementation of a one-year post-basic midwifery programme. Through a gap analysis involving administrators, educators and students in all the nursing education institutions in Lesotho, we explored their experiences related to the implementation of a competency-based midwifery programme after three years of continuous implementation. The findings revealed a vast gap between the described curriculum, and what was enacted in the nursing education institutions. The essential components of the midwifery programme had not been transformed to accommodate competency-based education. We argue that structural and operational elements of a programme should be adjusted before and during the implementation of such a curriculum innovation to enhance a positive teaching and learning experience, further sustaining the programme. Therefore, contextually relevant frameworks aimed at supporting the implementation and sustainability of the entire programme should be developed.