1.
Vennemeyer S, Kinnear B, Gao A, Zhu S, Nattam A, Knopp MI, Warm E, Wu DT. User-Centered Evaluation and Design Recommendations for an Internal Medicine Resident Competency Assessment Dashboard. Appl Clin Inform 2023;14:996-1007. PMID: 38122817; PMCID: PMC10733060; DOI: 10.1055/s-0043-1777103.
Abstract
OBJECTIVES: Clinical Competency Committee (CCC) members employ varied approaches to the review process, which makes it difficult to design a competency assessment dashboard that fits the needs of all members. This work details a user-centered evaluation of a dashboard currently used by the Internal Medicine Clinical Competency Committee (IM CCC) at the University of Cincinnati College of Medicine and presents the resulting design recommendations.
METHODS: Eleven members of the IM CCC participated in semistructured interviews with the research team. These interviews were recorded and transcribed for analysis. The three design research methods used in this study were process mapping (workflow diagrams), affinity diagramming, and a ranking experiment.
RESULTS: Through affinity diagramming, the research team identified and organized the opportunities for improvement in the current system expressed by study participants. These include a time-consuming preprocessing step, lack of integration of data from multiple sources, and different workflows for each step in the review process. Finally, the research team categorized nine dashboard components based on rankings provided by the participants.
CONCLUSION: We successfully conducted a user-centered evaluation of an IM CCC dashboard and generated four recommendations: programs should integrate quantitative and qualitative feedback, create multiple views to display these data based on user roles, work with designers to create a usable, interpretable dashboard, and develop a strong informatics pipeline to manage the system. To our knowledge, this type of user-centered evaluation has rarely been attempted in the medical education domain; this study therefore provides best practices for other residency programs to evaluate current competency assessment tools and to develop new ones.
Affiliation(s)
- Scott Vennemeyer
- Department of Biomedical Informatics, College of Medicine, University of Cincinnati, Ohio, United States
- Benjamin Kinnear
- Department of Pediatrics, College of Medicine, University of Cincinnati, Ohio, United States
- Department of Internal Medicine, College of Medicine, University of Cincinnati, Ohio, United States
- Andy Gao
- Department of Biomedical Informatics, College of Medicine, University of Cincinnati, Ohio, United States
- Medical Sciences Baccalaureate Program, College of Medicine, University of Cincinnati, Ohio, United States
- Siyi Zhu
- Department of Biomedical Informatics, College of Medicine, University of Cincinnati, Ohio, United States
- School of Design, College of Design, Architecture, Art, and Planning (DAAP), University of Cincinnati, Ohio, United States
- Anunita Nattam
- Department of Biomedical Informatics, College of Medicine, University of Cincinnati, Ohio, United States
- Medical Sciences Baccalaureate Program, College of Medicine, University of Cincinnati, Ohio, United States
- Michelle I. Knopp
- Department of Internal Medicine, College of Medicine, University of Cincinnati, Ohio, United States
- Division of Hospital Medicine, Cincinnati Children's Hospital Medical Center, Department of Pediatrics, College of Medicine, University of Cincinnati, Ohio, United States
- Eric Warm
- Department of Internal Medicine, College of Medicine, University of Cincinnati, Ohio, United States
- Danny T.Y. Wu
- Department of Biomedical Informatics, College of Medicine, University of Cincinnati, Ohio, United States
- Department of Pediatrics, College of Medicine, University of Cincinnati, Ohio, United States
- Medical Sciences Baccalaureate Program, College of Medicine, University of Cincinnati, Ohio, United States
- School of Design, College of Design, Architecture, Art, and Planning (DAAP), University of Cincinnati, Ohio, United States
2.
Eliasz KL, Nick MW, Zabar S, Buckvar-Keltz L, Ng GM, Riles TS, Kalet AL. Viewing Readiness-for-Residency through Binoculars: Mapping Competency-Based Assessments to the AAMC's 13 Core Entrustable Professional Activities (EPAs). Teach Learn Med 2023;35:436-441. PMID: 35668557; DOI: 10.1080/10401334.2022.2082432.
Abstract
Construct: The construct being assessed is readiness-for-residency of graduating medical students, as measured through two assessment frameworks.
Background: Readiness-for-residency of near-graduate medical students should be, but is not consistently, assessed. To address this, the Association of American Medical Colleges (AAMC) in 2014 identified and described 13 core Entrustable Professional Activities (EPAs): tasks that all residents should be able to perform unsupervised upon entering residency. However, the AAMC did not initially provide measurement guidelines or propose standardized assessments. We designed Night-onCall (NOC), an immersive simulation for our near-graduating medical students, to assess and address their readiness-for-residency, framed around tasks suggested by the AAMC's core EPAs. In adopting this EPA assessment framework, we built upon an established program of competency-based clinical skills assessments, repurposing competency-based checklists to measure components of the EPAs where possible and designing new checklists where necessary. This resulted in a blended suite of 14 checklists, which theoretically provide substantive assessment of all 13 core EPAs. In this paper, we describe the consensus-based mapping process conducted to ensure we understood the relationship between the competency and EPA assessment lenses and could therefore report meaningful feedback on both to transitioning students in the NOC exercise.
Approach: Between January and November 2017, five clinician and two non-clinician health professions educators at NYU Grossman School of Medicine conducted a rigorous consensus-based mapping process in which each rater mapped each of the 310 NOC competency-based checklist items to lists of entrustable behaviors expected of learners according to the AAMC's 13 core EPAs.
Findings: All EPAs were captured to varying degrees by the 14 NOC checklists (overall intraclass correlation coefficient (ICC) = 0.77). Consensus meetings resolved discrepancies and improved ICC values for three (EPA-9, EPA-10, EPA-12) of the four EPAs that initially showed poor reliability.
Conclusions: Findings suggest that, with some limitations (e.g., EPA-7, "form clinical questions/retrieve evidence"), established competency-based assessments can be repurposed to measure readiness-for-residency through an EPA lens, and both can be reported to learners and faculty.
Affiliation(s)
- Kinga L Eliasz
- Division of General Internal Medicine and Clinical Innovation, New York University Grossman School of Medicine, New York, New York, USA
- Department of Medicine, New York University Grossman School of Medicine, New York, New York, USA
- Michael W Nick
- Program on Medical Education and Technology, New York University Grossman School of Medicine, New York, New York, USA
- Sondra Zabar
- Division of General Internal Medicine and Clinical Innovation, New York University Grossman School of Medicine, New York, New York, USA
- Department of Medicine, New York University Grossman School of Medicine, New York, New York, USA
- Lynn Buckvar-Keltz
- Department of Medicine, New York University Grossman School of Medicine, New York, New York, USA
- Grace M Ng
- New York Simulation Center for the Health Sciences, A Partnership of the City University of New York and New York University Grossman School of Medicine, New York, New York, USA
- Thomas S Riles
- Departments of Surgery and Medical Education and Technology, New York University Grossman School of Medicine, New York, New York, USA
- Adina L Kalet
- Robert D. and Patricia E. Kern Institute for the Transformation of Medical Education at Medical College of Wisconsin, Wauwatosa, Wisconsin, USA
3.
Warm EJ, Carraccio C, Kelleher M, Kinnear B, Schumacher DJ, Santen S. The education passport: connecting programmatic assessment across learning and practice. Can Med Educ J 2022;13:82-91. PMID: 36091737; PMCID: PMC9441115; DOI: 10.36834/cmej.73871.
Abstract
Competency-based medical education (CBME) shifts us from static assessment of learning to developmental assessment for learning. However, implementation challenges associated with CBME remain a major hurdle, especially after training and into practice. The full benefit of developmental assessment for learning over time requires collaboration, cooperation, and trust among learners, regulators, and the public that transcends each individual phase. The authors introduce the concept of an "Education Passport" that provides evidence of readiness to travel across the boundaries between undergraduate medical education, graduate medical education, and the expanse of practice. The Education Passport uses programmatic assessment: a process of collecting numerous low-stakes assessments from multiple sources over time, judging these data using criterion referencing, and enhancing this with coaching and competency committees to understand, process, and accelerate growth without end. Information in the Passport is housed on a cloud-based server controlled by the student/physician over the course of training and practice. These data are mapped to various educational frameworks, such as Entrustable Professional Activities or milestones, for ease of longitudinal performance tracking. At each stage of education and practice, the student/physician grants Passport access to all entities that can provide data on performance. Database managers use learning analytics to connect and display information over time, which is then used by the student/physician, their assigned or chosen coaches, and review committees to maintain or improve performance. Global information is also collected and analyzed to improve the entire system of learning and care. Developing a true continuum that embraces performance and growth will be a long-term adaptive challenge across many organizations and jurisdictions and will require coordination from regulatory and national agencies. An Education Passport could also serve as an organizing tool and will require research and high-value communication strategies to maximize public trust in the work.
Affiliation(s)
- Eric J Warm
- Department of Internal Medicine, University of Cincinnati College of Medicine, Ohio, USA
- Correspondence to: Eric J. Warm
- Matthew Kelleher
- Department of Internal Medicine, University of Cincinnati College of Medicine, Ohio, USA
- Benjamin Kinnear
- Department of Pediatrics, University of Cincinnati College of Medicine, Ohio, USA
- Daniel J Schumacher
- Division of Emergency Medicine, Cincinnati Children's Hospital Medical Center and the University of Cincinnati College of Medicine, Ohio, USA
- Sally Santen
- Division of Emergency Medicine, Cincinnati Children's Hospital Medical Center and the University of Cincinnati College of Medicine, Ohio, USA
- Virginia Commonwealth University, Richmond, Virginia, USA
4.
Warm EJ, Kinnear B, Lance S, Schauer DP, Brenner J. What Behaviors Define a Good Physician? Assessing and Communicating About Noncognitive Skills. Acad Med 2022;97:193-199. PMID: 34166233; DOI: 10.1097/ACM.0000000000004215.
Abstract
Once medical students attain a certain level of medical knowledge, success in residency often depends on noncognitive attributes, such as conscientiousness, empathy, and grit. These traits are significantly more difficult to assess than cognitive performance, creating a potential gap in measurement. Despite its promise, competency-based medical education (CBME) has yet to bridge this gap, partly due to a lack of well-defined noncognitive observable behaviors that assessors and educators can use in formative and summative assessment. As a result, typical undergraduate to graduate medical education handovers stress standardized test scores, and program directors trust little of the remaining information they receive, sometimes turning to third-party companies to better describe potential residency candidates. The authors have created a list of noncognitive attributes, with associated definitions and noncognitive skills, called observable practice activities (OPAs), written for learners across the continuum to help educators collect assessment data that can be turned into valuable information. OPAs are discrete work-based assessment elements collected over time and mapped to larger structures, such as milestones, entrustable professional activities, or competencies, to create learning trajectories for formative and summative decisions. Medical schools and graduate medical education programs could adapt these OPAs or determine ways to create new ones specific to their own contexts. Once OPAs are created, programs will have to find effective ways to assess them, interpret the data, determine consequence validity, and communicate information to learners and institutions. The authors discuss the need for culture change surrounding assessment, even for the adoption of behavior-based tools such as OPAs, including grounding the work in a growth mindset and the broad underpinnings of CBME. Ultimately, improving assessment of noncognitive capacity should benefit learners, schools, programs, and most importantly, patients.
Affiliation(s)
- Eric J Warm
- E.J. Warm is professor of medicine and program director, Department of Internal Medicine, University of Cincinnati College of Medicine, Cincinnati, Ohio; ORCID: https://orcid.org/0000-0002-6088-2434
- Benjamin Kinnear
- B. Kinnear is associate professor of medicine and pediatrics and associate program director, Department of Pediatrics, University of Cincinnati College of Medicine, Cincinnati, Ohio; ORCID: https://orcid.org/0000-0003-0052-4130
- Samuel Lance
- S. Lance is associate professor of plastic surgery and craniofacial surgery and program director of plastic surgery, Division of Plastic Surgery, University of California San Diego, San Diego, California; ORCID: https://orcid.org/0000-0002-5186-2677
- Daniel P Schauer
- D.P. Schauer is associate professor of medicine and associate program director, Department of Internal Medicine, University of Cincinnati College of Medicine, Cincinnati, Ohio; ORCID: https://orcid.org/0000-0003-3264-8154
- Judith Brenner
- J. Brenner is associate professor of science education and medicine and associate dean for curricular integration and assessment, Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, Hempstead, New York; ORCID: https://orcid.org/0000-0002-8697-5401
5.
Feeney C, Hotez E, Wan L, Bishop L, Timmerman J, Haley M, Kuo A, Fernandes P. A Multi-Institutional Collaborative To Assess the Knowledge and Skills of Medicine-Pediatrics Residents in Health Care Transition. Cureus 2021;13:e20327. PMID: 35028223; PMCID: PMC8748002; DOI: 10.7759/cureus.20327.
Abstract
Background: Pediatric to adult health care transition (HCT) is an essential process in the care of youth with special health care needs (YSHCN). Many internal medicine-pediatrics (med-peds) residency programs have developed curricula to teach transition knowledge and skills for the care of YSHCN.
Objective: Using a national med-peds program director quality improvement collaborative to improve transition curricula, we aimed to identify curricular content areas for improvement by describing the baseline trainee knowledge and skills taught through existing transition curricula in med-peds programs.
Methods: We analyzed data collected during the 2018-2019 national med-peds program director quality improvement collaborative to improve transition curricula. Program directors assessed their programs, and trainees assessed themselves, on five transition goals by completing a Likert-scale questionnaire. In addition, trainees received an objective assessment of their knowledge through a multiple-choice questionnaire (MCQ).
Results: All 19 programs in the collaborative, and 193 of 316 trainees from these programs, completed the questionnaires. Most programs were based at academic centers (68%) and provided transition training via didactics (63%) and/or subspecialty rotations (58%). More programs had high confidence (95%) than trainees (58%) in goal 1 (knowledge and skills of the issues around transition), whereas more trainees had high confidence (60%) than programs (47%) in goal 2 (understanding the developmental and psychosocial aspects of transition). Programs and trainees self-assessed lower on goals related to health insurance, educational and vocational needs, and application of health care system knowledge to the practice environment (goals 3, 4, and 5, respectively).
Conclusions: Using the assessments of the program directors and resident trainees, we identified subject areas for improvement of transition curricula, including health insurance and the application of health care system knowledge to the practice environment.
Affiliation(s)
- Colby Feeney
- Medicine and Pediatrics, Duke University School of Medicine, Durham, USA
- Emily Hotez
- Internal Medicine, David Geffen School of Medicine, University of California Los Angeles, Los Angeles, USA
- Lori Wan
- Medicine and Pediatrics, University of California San Diego, San Diego, USA
- Laura Bishop
- Medicine and Pediatrics, University of Louisville School of Medicine, Louisville, USA
- Jason Timmerman
- Internal Medicine, David Geffen School of Medicine, University of California Los Angeles, Los Angeles, USA
- Madeline Haley
- Internal Medicine and Pediatrics, David Geffen School of Medicine, University of California Los Angeles, Los Angeles, USA
- Alice Kuo
- Internal Medicine and Pediatrics, David Geffen School of Medicine, University of California Los Angeles, Los Angeles, USA
- Priyanka Fernandes
- Internal Medicine and Pediatrics, David Geffen School of Medicine, University of California Los Angeles, Los Angeles, USA
6.
Shrivastava S, Shrivastava P. Employing clinical work sampling tool for monitoring the clinical competence among medical students. Med J DY Patil Vidyapeeth 2022. DOI: 10.4103/mjdrdypu.mjdrdypu_583_20.
7.
Weber DE, Kinnear B, Kelleher M, Klein M, Sall D, Schumacher DJ, Zhang N, Warm E, Schauer DP. Effect of resident and assessor gender on entrustment-based observational assessment in an internal medicine residency program. MedEdPublish 2021. DOI: 10.12688/mep.17410.1.
Abstract
Background: Implicit gender bias leads to differences in assessment. Studies examining gender differences in resident milestone assessment data demonstrate variable results. The purpose of this study was to determine whether observational entrustment scores differ by resident and assessor gender in a program of assessment based on discrete, observable skills.
Methods: We analyzed overall entrustment scores and entrustment scores by Accreditation Council for Graduate Medical Education (ACGME) core competency for 238 residents (49% female) from 396 assessors (38% female) in one internal medicine residency program from July 2012 to June 2019. We conducted analyses at 1-12 months, 1-36 months, 1-6 months, 7-12 months, and 31-36 months. We used linear mixed-effect models to assess the role of resident and assessor gender, with resident-specific and assessor-specific random effects to account for repeated measures.
Results: Statistically significant interactions existed between resident and assessor gender for overall entrustment at 1-12 months (p < 0.001), 1-36 months (p < 0.001), 1-6 months (p < 0.001), 7-12 months (p = 0.04), and 31-36 months (p < 0.001). However, group differences were not statistically significant. In several instances the interaction between resident and assessor gender was significant for an ACGME core competency, but there were no statistically significant group differences for any competency at any time point. When applicable, subsequent analysis of the main effects of resident or assessor gender independently of one another revealed no statistically significant differences.
Conclusions: No significant differences in entrustment scores were found based on resident or assessor gender in our large, robust entrustment-based program of assessment. Determining the reasons for our findings may help identify ways to mitigate gender bias in assessment.
8.
LaRochelle JS. From observation to judgment: Making sense of milestones 2.0 for cytopathology. Cancer Cytopathol 2021;129:673-674. PMID: 34358417; DOI: 10.1002/cncy.22494.
Affiliation(s)
- Jeffrey S LaRochelle
- Department of Medical Education, University of Central Florida College of Medicine, Orlando, Florida
9.
Schumacher DJ, Martini A, Kinnear B, Kelleher M, Balmer DF, Wurster-Ovalle V, Carraccio C. Facilitators and Inhibitors to Assessing Entrustable Professional Activities in Pediatric Residency. Acad Pediatr 2021;21:735-741. PMID: 33221495; DOI: 10.1016/j.acap.2020.11.013.
Abstract
OBJECTIVE: Research on entrustable professional activities (EPAs) has focused on EPA development, with little attention paid to implementation experiences. This constructivist grounded theory study sought to begin filling this gap by exploring the experiences of pediatric residency programs with implementing EPA-based assessment.
METHODS: Interviews with 19 program leader and clinical competency committee participants from 13 sites were held between January and July 2019. Participants were asked about their experiences with implementing EPA-based assessment. Data collection and analysis were iterative.
RESULTS: Participants described a range of facilitators and inhibitors that influenced their efforts to implement EPA-based assessment. These fell into four thematic areas: 1) alignment of the EPA construct with local views of performance and assessment, 2) assessing EPAs illuminates holes in the residency curriculum, 3) clinical competency committee structure and process impacts EPA-based assessment, and 4) faculty engagement and development drives the ability to assess EPAs. Areas described as facilitators by some participants were noted to be inhibitors by others. The sum of a program's facilitators and inhibitors led to greater or lesser overall ability to assess EPAs. Finally, the first area functions differently from the others: it can shift the entire balance toward or away from the ability to assess EPAs.
CONCLUSION: This study helps fill a void in implementation evidence for EPA-based assessment through a better understanding of facilitators and inhibitors to such efforts.
Affiliation(s)
- Daniel J Schumacher
- Department of Pediatrics, Cincinnati Children's Hospital Medical Center/University of Cincinnati College of Medicine (DJ Schumacher, A Martini, and V Wurster-Ovalle), Cincinnati, Ohio
- Abigail Martini
- Department of Pediatrics, Cincinnati Children's Hospital Medical Center/University of Cincinnati College of Medicine (DJ Schumacher, A Martini, and V Wurster-Ovalle), Cincinnati, Ohio
- Benjamin Kinnear
- Departments of Pediatrics and Medicine, Cincinnati Children's Hospital Medical Center/University of Cincinnati College of Medicine (B Kinnear and M Kelleher), Cincinnati, Ohio
- Matthew Kelleher
- Departments of Pediatrics and Medicine, Cincinnati Children's Hospital Medical Center/University of Cincinnati College of Medicine (B Kinnear and M Kelleher), Cincinnati, Ohio
- Dorene F Balmer
- Children's Hospital of Philadelphia and University of Pennsylvania Perelman School of Medicine (DF Balmer), Philadelphia, Pa
- Victoria Wurster-Ovalle
- Department of Pediatrics, Cincinnati Children's Hospital Medical Center/University of Cincinnati College of Medicine (DJ Schumacher, A Martini, and V Wurster-Ovalle), Cincinnati, Ohio
10.
Boursicot K, Kemp S, Wilkinson T, Findyartini A, Canning C, Cilliers F, Fuller R. Performance assessment: Consensus statement and recommendations from the 2020 Ottawa Conference. Med Teach 2021;43:58-67. PMID: 33054524; DOI: 10.1080/0142159X.2020.1830052.
Abstract
INTRODUCTION: In 2011, the Consensus Statement on Performance Assessment was published in Medical Teacher. That paper was commissioned by AMEE (Association for Medical Education in Europe) as part of the series of Consensus Statements following the 2010 Ottawa Conference. In 2019, it was recommended that a working group be reconvened to review and consider developments in performance assessment since the 2011 publication.
METHODS: Following a review of the original recommendations in the 2011 paper and shifts in the field across the past 10 years, the group identified areas of consensus and yet-to-be-resolved issues for performance assessment.
RESULTS AND DISCUSSION: This paper addresses developments in performance assessment since 2011, reiterates relevant aspects of the 2011 paper, and summarises contemporary best-practice recommendations for objective structured clinical examinations (OSCEs) and workplace-based assessments (WBAs), the fit-for-purpose methods for performance assessment in the health professions.
Affiliation(s)
- Katharine Boursicot
- Department of Assessment and Progression, Duke-National University of Singapore, Singapore, Singapore
- Sandra Kemp
- Curtin Medical School, Curtin University, Perth, Australia
- Tim Wilkinson
- Dean's Department, University of Otago, Christchurch, New Zealand
- Ardi Findyartini
- Department of Medical Education, Universitas Indonesia, Jakarta, Indonesia
- Claire Canning
- Department of Assessment and Progression, Duke-National University of Singapore, Singapore, Singapore
- Francois Cilliers
- Department of Health Sciences Education, University of Cape Town, Cape Town, South Africa