1
Torre D, Schuwirth L. Programmatic assessment for learning: A programmatically designed assessment for the purpose of learning: AMEE Guide No. 174. Medical Teacher 2024:1-16. [PMID: 39368061 DOI: 10.1080/0142159x.2024.2409936]
Abstract
Programmatic assessment for learning (PAL) involves programmatically structured collection of assessment data for the purpose of learning. In this guide, we examine and provide recommendations on several aspects: First, we review the evolution that has led to the development of programmatic assessment, providing clarification of some of its terminology. Second, we outline the learning processes that guide the design of PAL, including distributed learning, interleaving, overlearning, and test-enhanced learning. Third, we review the evolving nature of validity and provide insights into validity from a program perspective. Finally, we examine opportunities, challenges, and future directions of assessment in the context of artificial intelligence.
Affiliation(s)
- Dario Torre
- University of Central Florida College of Medicine, Orlando, FL, USA
- Lambert Schuwirth
- College of Medicine and Public Health, Flinders University, Adelaide, Australia
2
Parsons AS, Wijesekera TP, Olson APJ, Torre D, Durning SJ, Daniel M. Beyond thinking fast and slow: Implications of a transtheoretical model of clinical reasoning and error on teaching, assessment, and research. Medical Teacher 2024:1-12. [PMID: 38835283 DOI: 10.1080/0142159x.2024.2359963]
Abstract
From dual process to a family of theories known collectively as situativity, both micro and macro theories of cognition inform our current understanding of clinical reasoning (CR) and error. CR is a complex process that occurs in a complex environment, and a nuanced, expansive, integrated model of these theories is necessary to fully understand how CR is performed in the present day and in the future. In this perspective, we present these individual theories along with figures and descriptive cases for purposes of comparison before exploring the implications of a transtheoretical model of these theories for teaching, assessment, and research in CR and error.
Affiliation(s)
- Andrew S Parsons
- Medicine and Public Health, University of Virginia School of Medicine, Charlottesville, VA, USA
- Andrew P J Olson
- Medicine and Pediatrics, Medical Education Outcomes Center, University of Minnesota Medical School, Minneapolis, MN, USA
- Dario Torre
- Medicine, University of Central Florida College of Medicine, Orlando, FL, USA
- Steven J Durning
- Medicine and Pathology, Uniformed Services University of the Health Sciences, Bethesda, MD, USA
- Michelle Daniel
- Emergency Medicine, University of California San Diego School of Medicine, San Diego, CA, USA
3
Torre D, Daniel M, Ratcliffe T, Durning SJ, Holmboe E, Schuwirth L. Programmatic Assessment of Clinical Reasoning: New Opportunities to Meet an Ongoing Challenge. Teaching and Learning in Medicine 2024:1-9. [PMID: 38794865 DOI: 10.1080/10401334.2024.2333921]
Abstract
Issue: Clinical reasoning is essential to physicians' competence, yet its assessment remains a significant challenge. Clinical reasoning is a complex, evolving, non-linear, context-driven, and content-specific construct which arguably cannot be assessed at one point in time or with a single method. This has posed challenges for educators for many decades, despite significant development of individual assessment methods. Evidence: Programmatic assessment is a systematic assessment approach that is gaining momentum across health professions education. Programmatic assessment, and in particular assessment for learning, is well suited to address the challenges of clinical reasoning assessment. Several key principles of programmatic assessment are particularly well aligned with developing a system to assess clinical reasoning: longitudinality, triangulation, use of a mix of assessment methods, proportionality, implementation of intermediate evaluations/reviews with faculty coaches, use of assessment for feedback, and increase in learners' agency. Repeated exposure and measurement are critical to developing a clinical reasoning assessment narrative; the assessment approach should therefore be longitudinal, providing multiple opportunities for growth and development. Triangulation provides a lens for assessing the multidimensionality and contextuality of clinical reasoning and of its distinct yet related components, using a mix of assessment methods. Proportionality ensures that the richness of information on which conclusions are drawn is commensurate with the stakes of the decision. Coaching facilitates the development of a feedback culture and allows assessment of growth over time, while enhancing learners' agency. Implications: A programmatic assessment model of clinical reasoning that is developmentally oriented, optimizes learning through feedback and coaching, uses multiple assessment methods, and provides opportunities for meaningful triangulation of data can help address some of the challenges of clinical reasoning assessment.
Affiliation(s)
- Dario Torre
- Department of Medical Education, University of Central Florida, Orlando, FL, USA
- Michelle Daniel
- Department of Emergency Medicine, University of California, San Diego, CA, USA
- Temple Ratcliffe
- Department of Medicine, Joe R. and Teresa Lozano Long School of Medicine at University of Texas Health, San Antonio, TX, USA
- Steven J Durning
- Center for Health Professions Education and Center for Neuroscience and Regenerative Medicine, Uniformed Services University, Bethesda, MD, USA
- Eric Holmboe
- Milestones Development and Evaluation, Accreditation Council for Graduate Medical Education, Chicago, IL, USA
4
Scholte JBJ, Strehler JC, Dill T, van Mook WNKA. Trainee-supervisor collaboration, progress-visualisation, and coaching: a survey on challenges in assessment of ICU trainees. BMC Medical Education 2024;24:120. [PMID: 38321516 PMCID: PMC10848472 DOI: 10.1186/s12909-023-04980-0]
Abstract
BACKGROUND Assessing trainees is crucial for the development of their competence, yet it remains a challenging endeavour. Identifying the factors that influence this process is imperative for improvement. METHODS We surveyed residents, fellows, and intensivists working in an intensive care unit (ICU) at a large non-university hospital in Switzerland to investigate the challenges in assessing ICU trainees. Thematic analysis revealed three major themes. RESULTS Of 45 physicians, 37 (82%) responded. The first theme identified was trainee-intensivist collaboration discontinuity. The limited duration of trainees' ICU rotations, a large team operating in a discordant three-shift system, and busy, unpredictable day-planning hinder sustained collaboration. Potential solutions include a concise pre-collaboration briefing, shared bedside care, and a post-collaboration debriefing involving formative assessment and reflection on the collaboration. The second theme was the lack of visualisation of trainees' progress, caused by unsatisfactory familiarity with the trainees' development. The lack of an overview of a trainee's previous achievements, activities, strengths, weaknesses, and goals may result in inappropriate assessments. Participants suggested implementing digital assessment tools, a competence committee, and dashboards to facilitate progress visualisation. The third theme was insufficient coaching and feedback. Factors such as personality traits, hierarchy, and competing interests can impede coaching, while high-quality feedback is essential for correct assessment. Skilled coaches can define short-term goals and may optimise trainee assessment by seeking feedback from multiple supervisors and assisting in both formative and summative assessment. Based on these three themes and the suggested solutions, we developed the acronym "ICU-STAR", representing a potentially powerful framework to enhance short-term trainee-supervisor collaboration in the workplace and to co-scaffold the principles of adequate assessment. CONCLUSIONS According to ICU physicians, trainee-supervisor collaboration discontinuity, the lack of visualisation of trainees' development, and supervisors' insufficient coaching and feedback skills are the major factors hampering trainees' assessment in the workplace. Based on suggestions by the survey participants, we propose "ICU-STAR" as a framework with briefing, shared bedside care, and debriefing of the trainee-supervisor collaboration at the workplace as its core components. With attending intensivists acting as coaches, progress visualisation can be enhanced by actively collecting more data points. TRIAL REGISTRATION N/A.
Affiliation(s)
- Johannes B J Scholte
- Department of Intensive Care Medicine, Cantonal Hospital Lucerne, Lucerne, Switzerland
- Master of Medical Education Student, University of Bern, Bern, Switzerland
- Johannes C Strehler
- Department of Intensive Care Medicine, Cantonal Hospital Lucerne, Lucerne, Switzerland
- Tatjana Dill
- Master of Medical Education Student, University of Bern, Bern, Switzerland
- Department of Anaesthesiology and Pain Medicine, Inselspital, Bern University Hospital, Bern, Switzerland
- Swiss Air-Ambulance Ltd, Rega, Zurich, Switzerland
- Walther N K A van Mook
- Department of Intensive Care Medicine and Academy for Postgraduate Medical Training, Maastricht University Medical Centre, Maastricht, The Netherlands
- School of Health Professions Education, Maastricht University, Maastricht, The Netherlands
5
Maxson IN, Su E, Brown KA, Tcharmtchi MH, Ginsburg S, Bhargava V, Wenger J, Centers GI, Alade KH, Leung SK, Gowda SH, Flores S, Riley A, Thammasitboon S. A Program of Assessment Model for Point-of-Care Ultrasound Training for Pediatric Critical Care Providers: A Comprehensive Approach to Enhance Competency-Based Point-of-Care Ultrasound Training. Pediatric Critical Care Medicine 2023;24:e511-e519. [PMID: 37260313 DOI: 10.1097/pcc.0000000000003288]
Abstract
Point-of-care ultrasound (POCUS) is increasingly accepted in pediatric critical care medicine as a tool for guiding the evaluation and treatment of patients. POCUS is a complex skill that requires user competency to ensure accuracy, reliability, and patient safety. A robust competency-based medical education (CBME) program ensures user competency and mitigates patient safety concerns. A programmatic assessment model provides a longitudinal, holistic, and multimodal approach to teaching, assessing, and evaluating learners. The authors propose a fit-for-purpose and modifiable CBME model that is adaptable for different institutions' resources and needs for any intended competency level. This educational model drives and supports learning, ensures competency attainment, and creates a clear pathway for POCUS education while enhancing patient care and safety.
Affiliation(s)
- Ivanna Natasha Maxson
- Department of Pediatrics, Division of Critical Care Medicine, Baylor College of Medicine, Texas Children's Hospital, Houston, TX
- Erik Su
- Department of Pediatrics, Division of Critical Care Medicine, Baylor College of Medicine, Texas Children's Hospital, Houston, TX
- Kyle A Brown
- Department of Pediatrics, Texas Christian University School of Medicine, Cook Children's Medical Center, Fort Worth, TX
- M Hossein Tcharmtchi
- Department of Pediatrics, Division of Critical Care Medicine, Baylor College of Medicine, Texas Children's Hospital, Houston, TX
- Sarah Ginsburg
- Department of Pediatrics, Division of Critical Care Medicine, UT Southwestern Medical Center, Dallas, TX
- Vidit Bhargava
- Department of Pediatrics, Division of Critical Care Medicine, University of Alabama, Children's Hospital of Alabama, Birmingham, AL
- Jesse Wenger
- Department of Pediatrics, Division of Critical Care Medicine, University of Washington, Seattle Children's Hospital, Seattle, WA
- Gabriela I Centers
- Department of Pediatrics, Division of Critical Care Medicine, Indiana University, Riley Children's Hospital, Indianapolis, IN
- Kiyetta H Alade
- Department of Pediatrics, Division of Emergency Medicine, Baylor College of Medicine, Texas Children's Hospital, Houston, TX
- Stephanie K Leung
- Department of Pediatrics, Division of Emergency Medicine, Baylor College of Medicine, Texas Children's Hospital, Houston, TX
- Sharada H Gowda
- Department of Pediatrics, Division of Neonatology, Baylor College of Medicine, Texas Children's Hospital, Houston, TX
- Saul Flores
- Department of Pediatrics, Division of Critical Care Medicine and Cardiology, Baylor College of Medicine, Texas Children's Hospital, Houston, TX
- Alan Riley
- Department of Pediatrics, Division of Cardiology, Baylor College of Medicine, Texas Children's Hospital, Houston, TX
- Satid Thammasitboon
- Department of Pediatrics, Division of Critical Care Medicine, Baylor College of Medicine, Texas Children's Hospital, Houston, TX
- Department of Pediatrics, Center for Research, Innovation, and Scholarship in Medical Education, Baylor College of Medicine, Texas Children's Hospital, Houston, TX
6
Greenfield J, Qua K, Prayson RA, Bierer SB. "It Changed How I Think"-Impact of Programmatic Assessment Upon Practicing Physicians: A Qualitative Study. Medical Science Educator 2023;33:963-974. [PMID: 37546195 PMCID: PMC10403454 DOI: 10.1007/s40670-023-01829-5]
Abstract
Programmatic assessment is a systematic approach used to document and assess learner performance. It offers learners frequent formative feedback from a variety of contexts and uses both high- and low-stakes assessments to determine student progress. Existing research has explored learner and faculty perceptions of programmatic assessment, reporting a favorable impact on faculty understanding of the importance of assessment stakes and feedback to learners, while students report the ability to establish and navigate towards goals and to reflect on their performance. The Cleveland Clinic Lerner College of Medicine (CCLCM) of Case Western Reserve University adopted programmatic assessment methods at its inception. With more than 18 years' experience with programmatic assessment and a portfolio-based assessment system, CCLCM is well positioned to explore its graduates' perceptions of their programmatic assessment experiences during and after medical school. In 2020, the investigators interviewed 26 of the 339 physician graduates. Participants were purposefully sampled to represent multiple class cohorts (2009-2019), clinical specialties, and practice locations. The investigators analyzed interview transcripts using thematic analysis informed by the frameworks of self-determination theory and professional identity formation. The authors identified themes and supported each with participant quotes from the interviews. Based on these findings, the investigators compiled a series of recommendations for other institutions that have already incorporated, or plan to incorporate, elements of programmatic assessment into their curricula. The authors concluded by discussing future directions for research and additional avenues of inquiry.
Affiliation(s)
- Jessica Greenfield
- University of Virginia School of Medicine, Room 2008A Pinn Hall, Box 800866, Charlottesville, VA 22908-0366, USA
- Kelli Qua
- Case Western Reserve University School of Medicine, Cleveland, OH, USA
- Richard A Prayson
- Department of Anatomic Pathology, Cleveland Clinic Lerner College of Medicine of Case Western Reserve University, Cleveland Clinic, Cleveland, OH, USA
- S Beth Bierer
- Cleveland Clinic Lerner College of Medicine of Case Western Reserve University, Cleveland, OH, USA
7
Ogden K, Kilpatrick S, Elmer S. Examining the nexus between medical education and complexity: a systematic review to inform practice and research. BMC Medical Education 2023;23:494. [PMID: 37408005 PMCID: PMC10320888 DOI: 10.1186/s12909-023-04471-2]
Abstract
BACKGROUND Medical education is a multifarious endeavour integrating a range of pedagogies and philosophies. Complexity as a science or theory ('complexity') signals a move away from a reductionist paradigm to one which appreciates that interactions in multi-component systems, such as healthcare systems, can result in adaptive and emergent outcomes. This examination of the nexus between medical education and complexity theory aims to discover ways that complexity theory can inform medical education and medical education research. METHODS A structured literature review was conducted to examine the nexus between medical education and complexity; five databases were searched using relevant terms. Papers were included if they engaged fully with complexity as a science or theory and were significantly focused on medical education. All types of papers were included: conceptual papers (e.g. opinion and theoretical discussions), case studies, program evaluations, and empirical research. A narrative and thematic synthesis was undertaken to create a deep understanding of the use of complexity in medical education. RESULTS Eighty-three papers were included; the majority were conceptual papers. The context and theoretical underpinnings of complexity as a relevant theory for medical education were identified. Bibliographic and temporal observations were noted regarding the entry of complexity into medical education. Complexity was relied upon as a theoretical framework for empirical studies covering a variety of elements within medical education, including: knowledge and learning theories; curricular, program and faculty development; program evaluation and medical education research; assessment and admissions; professionalism and leadership; and learning for systems, about systems, and in systems. DISCUSSION There is a call for greater use of theory by medical educators. Complexity within medical education is established, although not widespread. The individualistic culture of medicine and comfort with reductionist epistemologies challenge its introduction. However, complexity was found to be a useful theory across a range of areas by a limited number of authors and is increasingly used by medical educators and medical education researchers. This review has further conceptualized how complexity is being used to support medical education and medical education research. CONCLUSION This literature review can assist in understanding how complexity can be useful in medical educationalists' practice.
Affiliation(s)
- Kathryn Ogden
- Tasmanian School of Medicine, University of Tasmania, Launceston, TAS, Australia
- Launceston Clinical School, Locked Bag 1377, Launceston, 7250, Australia
- Sue Kilpatrick
- School of Education, University of Tasmania, Launceston, TAS, Australia
- Shandell Elmer
- School of Nursing, University of Tasmania, Launceston, TAS, Australia
8
Jamieson J, Gibson S, Hay M, Palermo C. Teacher, Gatekeeper, or Team Member: supervisor positioning in programmatic assessment. Advances in Health Sciences Education 2022. [PMID: 36469231 DOI: 10.1007/s10459-022-10193-9]
Abstract
Competency-based assessment is undergoing an evolution with the popularisation of programmatic assessment. Fundamental to programmatic assessment are the attributes and buy-in of the people participating in the system. Our previous research revealed unspoken, yet influential, cultural and relationship dynamics that interact with programmatic assessment to influence its success. Pulling at this thread, we conducted a secondary analysis of focus groups and interviews (n = 44 supervisors) using the critical lens of Positioning Theory to explore how workplace supervisors experienced and perceived their positioning within programmatic assessment. We found that supervisors positioned themselves in two of three ways. First, supervisors universally positioned themselves as a Teacher, describing an inherent duty to educate students. Enactment of this position was dichotomous: some supervisors ascribed a passive and disempowered position to students, while others empowered students by cultivating an egalitarian teaching relationship. Second, two mutually exclusive positions were described, either Gatekeeper or Team Member. Supervisors positioning themselves as Gatekeepers had a duty to protect the community and were vigilant in detecting inadequate student performance. Programmatic assessment challenged this positioning by reorientating supervisor rights and duties, which diminished their perceived authority and led to frustration and resistance. In contrast, Team Members enacted a right to make a valuable contribution to programmatic assessment and felt liberated from the burden of assessment, enabling them to assent to power shifts towards students and the university. Identifying supervisor positions revealed how programmatic assessment challenged traditional structures and ideologies, impeding its success, and provides insights into supporting supervisors in programmatic assessment.
Affiliation(s)
- Janica Jamieson
- Monash University, Melbourne, Australia
- School of Medical and Health Sciences, Edith Cowan University, 270 Joondalup Drive, Joondalup, WA, 6027, Australia
9
Nair B, Moonen-van Loon JW. Programmatic assessment – What are we waiting for? Archives of Medicine and Health Sciences 2022. [DOI: 10.4103/amhs.amhs_259_22]