1. Dohms MC, Rocha A, Rasenberg E, Dielissen P, Thoonen B. Peer assessment in medical communication skills training in programmatic assessment: A qualitative study examining faculty and student perceptions. Medical Teacher 2024; 46:823-831. PMID: 38157436; DOI: 10.1080/0142159x.2023.2285248.
Abstract
INTRODUCTION Current literature recommends assessing communication skills in medical education by combining different settings and multiple observers. There is still a gap in understanding whether and how peer assessment facilitates learning in communication skills training. METHODS We designed a qualitative study using focus group interviews and thematic analysis in a medical course in the Netherlands. We aimed to explore medical students' and teachers' experiences, perceptions, and perspectives on challenges and facilitating factors in peer assessment in medical communication skills training (PACST). RESULTS Most participants reported that peer feedback was a valuable experience when learning communication skills. The major challenges for the quality and credibility of PACST reported by the participants were whether peer feedback is critical enough for learning and the difficulty of genuinely engaging students in the assessment process. CONCLUSION Teachers reviewing students' peer assessments may improve their quality and credibility, and the reviewed assessments are best used for learning purposes. We suggest paying sufficient attention to teachers' roles in PACST, ensuring a safe and trustworthy environment, and additionally helping students to internalize the value of being vulnerable during the evaluation process.
Affiliation(s)
- M C Dohms
- Clinique Bouchard, Marseille, France
- A Rocha
- DASA (Diagnósticos da América S/A), São Paulo, Brazil
- P Dielissen
- Medisch Centrum Onder de Linde, Nijmegen, Netherlands
- B Thoonen
- Radboud University, Nijmegen, Netherlands
2. Keshmiri F, Javadi A. Feedback-based learning from viewpoints of surgical nursing students: A mixed-method study. J Eval Clin Pract 2024. PMID: 38818690; DOI: 10.1111/jep.14024.
Abstract
BACKGROUND Feedback-based learning (FBL) focuses on guiding the learning process according to educational objectives and the student's needs. This study aimed to investigate surgical nursing students' perceptions of FBL and explore their experiences of it. METHOD The study used a mixed-methods sequential explanatory design conducted in quantitative and qualitative phases. Surgical nursing students (n = 105) participated in the quantitative phase, completing two questionnaires about FBL and clinical feedback. Semi-structured face-to-face interviews were used to collect qualitative data in the second phase, which were analysed using Graneheim and Lundman's inductive approach. RESULTS The mean (SD) score for students' perception of FBL was 3.99 (0.70). The qualitative results were captured in two themes, "motivational support for improvement" and "unpleasant learning". CONCLUSION This study described both positive and negative aspects of FBL. FBL is perceived as a motivational support mechanism that improves students' capabilities during their academic courses and prepares them for future careers. Conversely, students may experience FBL as unpleasant learning due to negative feedback and negative emotions.
Affiliation(s)
- Fatemeh Keshmiri
- Department of Medical Education, Education Development Center, Shahid Sadoughi University of Medical Sciences, Yazd, Iran
- Alireza Javadi
- Department of Surgical Technology, Student Research Committee, Shahid Sadoughi University of Medical Sciences, Yazd, Iran
3. Ohlin S, King S, Takashima M, Ossenberg C, Henderson A. Learning in the workplace: Development of a simple language statement assessment tool that supports second-level nurse practice. Nurse Educ Pract 2024; 77:103983. PMID: 38701684; DOI: 10.1016/j.nepr.2024.103983.
Abstract
AIM To focus learning by clarifying the enrolled nurse (EN) role (a second-tier nurse position) through development of a user-friendly workplace performance assessment tool commensurate with EN standards for practice. BACKGROUND Internationally, the nursing workforce comprises regulated and unregulated staff. In Australia, as in other western countries, there are two tiers of regulated workforce, namely Registered Nurses (RNs) and Enrolled Nurses (ENs). Differences between RN and EN standards based on education preparation are not always clearly differentiated in workplace practice, and the roles are often seen as interchangeable. At a time when numbers of healthcare workers are burgeoning, improved clarity of both regulated and unregulated roles assists performance assessment that guides further learning and safe practice. DESIGN Two-phase sequential, non-experimental design. METHODS Phase one used focus groups (n=48), an expert reference panel (n=8) and end-users (n=16) to develop simple language statements. Phase two involved field testing of the statements. FINDINGS A 30-item, criterion-based workplace performance tool was developed. Principal component analysis of completed tools indicated work could be organised around three key areas of practice: higher order thinking and problem solving, routine daily activities of care, and personal and social attributes. DISCUSSION Participants reported that the statement items assisted in determining suitable activities and accompanying cues when discussing learning needs. The analysis helped discriminate broader elements of EN workplace performance. CONCLUSIONS Workplace learning is important for nurses to continue building their capacity to deliver optimum care. Assessment tools that describe professional capability in plain language statements and provide examples of supportive behavioural cues help guide ongoing learning by improving the validity, and thereby consistency, of assessment processes. Furthermore, comprehensible and meaningful statements and cues can readily be adopted by students and educators to target learning and feedback, thereby enhancing clarity of the EN role and distinguishing it from other nursing roles.
Affiliation(s)
- Simone Ohlin
- Central Queensland University, Queensland, Australia
- Sue King
- Central Queensland University, Queensland, Australia
- Mari Takashima
- School of Nursing, Midwifery and Social Work, University of Queensland, Australia
- Christine Ossenberg
- Central Queensland University, Queensland, Australia; Princess Alexandra Hospital, Woolloongabba, Queensland, Australia
- Amanda Henderson
- Central Queensland University, Queensland, Australia; Princess Alexandra Hospital, Woolloongabba, Queensland, Australia
4. Fuentes-Cimma J, Sluijsmans D, Riquelme A, Villagran I, Isbej L, Olivares-Labbe MT, Heeneman S. Designing feedback processes in the workplace-based learning of undergraduate health professions education: a scoping review. BMC Medical Education 2024; 24:440. PMID: 38654360; PMCID: PMC11036781; DOI: 10.1186/s12909-024-05439-6.
Abstract
BACKGROUND Feedback processes are crucial for learning, guiding improvement, and enhancing performance. In workplace-based learning settings, the design and implementation of diverse teaching and assessment activities is advocated, generating feedback that students, with proper guidance, use to close the gap between current and desired performance levels. Since productive feedback processes rely on observed information about a student's performance, it is imperative to establish structured feedback activities within undergraduate workplace-based learning settings. However, these settings are unpredictable by nature, which can either promote learning or hinder the provision of structured learning opportunities for students. This scoping review maps the literature on how feedback processes are organised in undergraduate clinical workplace-based learning settings, providing insight into the design and use of feedback. METHODS A scoping review was conducted. Studies were identified from seven databases and ten relevant journals in medical education. The screening process was performed independently in duplicate with the support of the StArt program. Data were organized in a data chart and analyzed using thematic analysis. The feedback loop with a sociocultural perspective was used as a theoretical framework. RESULTS The search yielded 4,877 papers, and 61 were included in the review. Two themes were identified in the qualitative analysis: (1) the organization of feedback processes in workplace-based learning settings, and (2) sociocultural factors influencing the organization of feedback processes. The literature describes multiple teaching and assessment activities that generate feedback information. Most papers described experiences and perceptions of diverse teaching and assessment feedback activities; few described how feedback processes improve performance. Sociocultural factors such as establishing a feedback culture, enabling stable and trustworthy relationships, and enhancing student feedback agency are crucial for productive feedback processes. CONCLUSIONS This review identified concrete ideas on how feedback could be organized within the clinical workplace to promote feedback processes. The feedback encounter should be organized to allow follow-up of the feedback, i.e., working on required learning and performance goals on the next occasion. Educational programs should design feedback processes by appropriately planning subsequent tasks and activities. More insight is needed into designing a full-loop feedback process, with specific attention to effective feedforward practices.
Affiliation(s)
- Javiera Fuentes-Cimma
- Department of Health Sciences, Faculty of Medicine, Pontificia Universidad Católica de Chile, Avenida Vicuña Mackenna 4860, Macul, Santiago, Chile.
- School of Health Professions Education, Maastricht University, Maastricht, Netherlands.
- Arnoldo Riquelme
- Centre for Medical and Health Profession Education, Department of Gastroenterology, Faculty of Medicine, Pontificia Universidad Católica de Chile, Santiago, Chile
- Ignacio Villagran
- Department of Health Sciences, Faculty of Medicine, Pontificia Universidad Católica de Chile, Avenida Vicuña Mackenna 4860, Macul, Santiago, Chile
- Lorena Isbej
- School of Health Professions Education, Maastricht University, Maastricht, Netherlands
- School of Dentistry, Faculty of Medicine, Pontificia Universidad Católica de Chile, Santiago, Chile
- Sylvia Heeneman
- Department of Pathology, Faculty of Health, Medicine and Health Sciences, Maastricht University, Maastricht, Netherlands
5. Wang PZT, Wilson CA, Nair SM, Bjazevic J, Dave S, Davidson J, Saklofske DH, Chahine S. The Interactive Relationship Between Instructor Perceptions and Learner Personality on Surgical Skills Performance. Journal of Surgical Education 2022; 79:686-694. PMID: 35115267; DOI: 10.1016/j.jsurg.2022.01.002.
Abstract
OBJECTIVE The objective of this study was to examine the association between learner personality and capacity to be trained (i.e., performance improvement) on a surgical task, and how instructor perceptions of the learners' capacity to be trained interact with learner personality and performance during training and feedback. There is meaningful heterogeneity in the degree of learners' surgical skills acquisition despite their receiving the same amount of training. While learner personality may independently contribute to skill acquisition, the instructor-learner feedback process is also important to consider. To better understand this interpersonal relationship, it is necessary to also consider instructor factors (i.e., perceptions) and how these may contribute to learner variability in skills training. DESIGN This exploratory study employed a prospective two-phase design. Medical and non-medical undergraduate students (N = 62) completed measures of personality and participated in two 20-minute training sessions with expert feedback 2 weeks apart, performing an end-to-side anastomosis on a low-fidelity model. Learner performance and instructors' perceptions of a learner's capacity to be trained were assessed. PARTICIPANTS Sixty-two medical and non-medical undergraduate students. RESULTS There was a significant interaction between learner Extraversion and instructors' perceptions of learner capacity to be trained: higher learner Extraversion was associated with an increase in performance improvement only for those who were considered trainable (OR = 4.83, p = 0.017). Post hoc analysis revealed a significant difference in the amount of feedback provided to participants who were considered trainable (M = 9.45) versus not trainable (M = 16.48). CONCLUSIONS This study highlights the importance of both individual learner factors and instructor perceptions in surgical skill acquisition.
Affiliation(s)
- Claire A Wilson
- Department of Surgery, Western University, London, Ontario, Canada
- Shiva M Nair
- Department of Urology, Waikato Hospital, Hamilton, New Zealand
- Sumit Dave
- Department of Surgery, Western University, London, Ontario, Canada
- Jacob Davidson
- Division of Pediatric Surgery, London Health Sciences Centre, London, Ontario, Canada
- Saad Chahine
- Faculty of Education, Queen's University, Kingston, Ontario, Canada
6. Roberts C, Khanna P, Lane AS, Reimann P, Schuwirth L. Exploring complexities in the reform of assessment practice: a critical realist perspective. Advances in Health Sciences Education: Theory and Practice 2021; 26:1641-1657. PMID: 34431028; DOI: 10.1007/s10459-021-10065-8.
Abstract
Although the principles behind assessment for and as learning are well established, reforming a traditional assessment-of-learning system into a program that encompasses assessment for and as learning can be a struggle. When introducing and reporting reforms, tensions may arise among faculty because of differing beliefs about the relationship between assessment and learning and about the rules for the validity of assessments. Traditional systems of assessment of learning privilege objective, structured quantification of learners' performances, and are done to the students. Newer systems promote assessment for learning, emphasise subjectivity, collate data from multiple sources, emphasise narrative-rich feedback to promote learner agency, and are done with the students. This contrast has implications for implementation and evaluative research. Research on assessment done to students typically asks "what works?", whereas research on assessment done with students addresses more complex questions such as "what works, for whom, in which context, and why?" We applied such a critical realist perspective, drawing on the interplay between structure and agency and on a systems approach, to explore what theory says about introducing programmatic assessment in the context of pre-existing traditional approaches. Using a reflective technique, the internal conversation, we identified four factors that can assist educators considering major change to assessment practice in their own contexts: enabling positive learner agency and engagement; establishing argument-based validity frameworks; designing purposeful and eclectic evidence-based assessment tasks; and developing a shared narrative that promotes reflexivity in appreciating the complex relationships between assessment and learning.
Affiliation(s)
- Chris Roberts
- Faculty of Medicine and Health, Education Office, Sydney Medical School, The University of Sydney, Sydney, NSW, Australia.
- Priya Khanna
- Faculty of Medicine and Health, Education Office, Sydney Medical School, The University of Sydney, Sydney, NSW, Australia
- Andrew Stuart Lane
- Faculty of Medicine and Health, Education Office, Sydney Medical School, The University of Sydney, Sydney, NSW, Australia
- Peter Reimann
- Centre for Research on Learning and Innovation (CRLI), The University of Sydney, Sydney, NSW, Australia
- Lambert Schuwirth
- Prideaux Discipline of Clinical Education, College of Medicine and Public Health, Flinders University, Adelaide, South Australia, Australia
7. Haywood KL, Carr S, Tregonning AM. Midwives' experiences of completing written feedback: The emotions, challenges and solutions. Nurse Educ Pract 2021; 54:103097. PMID: 34058466; DOI: 10.1016/j.nepr.2021.103097.
Abstract
AIM Written feedback is a valued learning tool for midwifery students, providing information on clinical performance with the aim of improving future practice. One aim of this study was to explore the experiences of midwives in completing written feedback in the clinical setting. DESIGN This qualitative study is situated within a hermeneutic phenomenological framework. METHODS Data were collected through focus groups and individual interviews, then transcribed and subjected to thematic content analysis. RESULTS Three interconnected themes were identified: Emotions, Challenges and Solutions. Midwifery participants experienced strong emotional reactions (anxiety, guilt, frustration) around completing written feedback in the clinical setting due to four challenges (lack of time, continuity, clarity of feedback content and direct supervision), and employed solutions to offset or minimise problematic written feedback. CONCLUSIONS Completing written feedback in the clinical setting was a challenging experience for participants in this study, in some cases affecting their ability to do so. This is concerning, as the literature supports the positive impact of written feedback on the growth and potential of students.
Affiliation(s)
- Kirsty L Haywood
- King Edward Memorial Hospital, Bagot Road, Subiaco, Western Australia 6008, Australia; University of Western Australia, 35 Stirling Highway, Crawley, Western Australia 6009, Australia.
- Sandra Carr
- University of Western Australia, 35 Stirling Highway, Crawley, Western Australia 6009, Australia
- Alexandra M Tregonning
- University of Western Australia, 35 Stirling Highway, Crawley, Western Australia 6009, Australia
8. Lafleur A, Côté L, Witteman HO. Analysis of Supervisors' Feedback to Residents on Communicator, Collaborator, and Professional Roles During Case Discussions. J Grad Med Educ 2021; 13:246-256. PMID: 33897959; PMCID: PMC8054588; DOI: 10.4300/jgme-d-20-00842.1.
Abstract
BACKGROUND Literature examining the feedback supervisors give to residents during case discussions in the realms of communication, collaboration, and professional roles (intrinsic roles) focuses on analyses of written feedback and self-reporting. OBJECTIVES We quantified how much of the supervisors' verbal feedback time targeted residents' intrinsic roles and how well feedback time was aligned with the role targeted by each case. We analyzed the educational goals of this feedback. We assessed whether feedback content differed depending on whether the residents implied or explicitly expressed a need for particular feedback. METHODS This was a mixed-methods study conducted from 2017 to 2019. We created scripted cases for radiology and internal medicine residents to present to supervisors, then analyzed the feedback given both qualitatively and quantitatively. The cases were designed to highlight the CanMEDS intrinsic roles of communicator, collaborator, and professional. RESULTS Radiologists (n = 15) spent 22% of case discussions providing feedback on intrinsic roles (48% aligned): 28% when the case targeted the communicator role, 14% for collaborator, and 27% for professional. Internists (n = 15) spent 70% of discussions on intrinsic roles (56% aligned): 66% for communicator, 73% for collaborator, and 72% for professional. Radiologists' goals were to offer advice (66%), reflections (21%), and agreements (7%). Internists offered advice (41%), reflections (40%), and clarifying questions (10%). We saw no consistent effects when residents explicitly requested feedback on an intrinsic role. CONCLUSIONS Case discussions represent frequent opportunities for substantial feedback on intrinsic roles, largely aligned with the clinical case. Supervisors predominantly offered monologues of advice and agreements.
Affiliation(s)
- Alexandre Lafleur
- Alexandre Lafleur, MD, MHPE, is Associate Clinical Professor, Department of Medicine, Laval University Faculty of Medicine, Quebec City, Canada, and Co-Chairholder, CMA-MD Educational Leadership Chair in Health Professions Education
- Luc Côté
- Luc Côté, MSW, PhD, is Professor and Medical Education Researcher, Department of Family and Emergency Medicine, Office of Education and Continuing Professional Development, Laval University Faculty of Medicine, Quebec City, Canada
- Holly O. Witteman
- Holly O. Witteman, PhD, is Associate Professor, Department of Family and Emergency Medicine, Office of Education and Continuing Professional Development, Laval University Faculty of Medicine, Quebec City, Canada
9. Dupre J, Naik VN. The role of simulation in high-stakes assessment. BJA Educ 2021; 21:148-153. PMID: 33777413; DOI: 10.1016/j.bjae.2020.12.002.
Affiliation(s)
- J Dupre
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada
- V N Naik
- Royal College of Physicians and Surgeons of Canada, Ottawa, ON, Canada; University of Ottawa, Ottawa, ON, Canada
10. Schut S, Maggio LA, Heeneman S, van Tartwijk J, van der Vleuten C, Driessen E. Where the rubber meets the road - An integrative review of programmatic assessment in health care professions education. Perspectives on Medical Education 2021; 10:6-13. PMID: 33085060; PMCID: PMC7809087; DOI: 10.1007/s40037-020-00625-w.
Abstract
INTRODUCTION Programmatic assessment was introduced as an approach to designing assessment programmes that aims to simultaneously optimize the decision-making and learning functions of assessment. An integrative review was conducted to review and synthesize results from studies investigating programmatic assessment in health care professions education in practice. METHODS The authors systematically searched PubMed, Web of Science, and ERIC to identify studies published since 2005 that reported empirical data on programmatic assessment. Characteristics of the included studies were extracted and synthesized using descriptive statistics and thematic analysis. RESULTS Twenty-seven studies were included, which used quantitative methods (n = 10), qualitative methods (n = 12) or mixed methods (n = 5). Most studies were conducted in clinical settings (77.8%). Programmatic assessment was found to enable meaningful triangulation for robust decision-making and to act as a catalyst for learning. However, several problems were identified, including overload in assessment information and the associated workload, the counterproductive impact of using strict requirements and summative signals, lack of a shared understanding of the nature and purpose of programmatic assessment, and lack of supportive interpersonal relationships. Thematic analysis revealed that the successes and challenges of programmatic assessment were best understood through the interplay between the quantity and quality of assessment information, and the influence of social and personal aspects on assessment perceptions. CONCLUSION Although some of the evidence may seem compelling in supporting the effectiveness of programmatic assessment in practice, tensions will emerge when the development of competencies is stimulated and its results assessed at the same time. The identified factors and inferred strategies provide guidance for navigating these tensions.
Affiliation(s)
- Suzanne Schut
- School of Health Professions Education, Department of Educational Development and Research, Maastricht University, Maastricht, The Netherlands.
- Lauren A Maggio
- Department of Medicine, Uniformed Services University of the Health Sciences, Bethesda, MD, USA
- Sylvia Heeneman
- School of Health Professions Education, Department of Pathology, Cardiovascular Research Institute Maastricht, Maastricht University, Maastricht, The Netherlands
- Jan van Tartwijk
- Department of Education, Utrecht University, Utrecht, The Netherlands
- Cees van der Vleuten
- School of Health Professions Education, Department of Educational Development and Research, Maastricht University, Maastricht, The Netherlands
- Erik Driessen
- School of Health Professions Education, Department of Educational Development and Research, Maastricht University, Maastricht, The Netherlands
11. McEllistrem B, Barrett A, Hanley K. Performance in practice; exploring trainer and trainee experiences of user-designed formative assessment tools. Education for Primary Care 2020; 32:27-33. PMID: 33094687; DOI: 10.1080/14739879.2020.1815085.
Abstract
INTRODUCTION General Practice training in Ireland currently uses various methods of formative assessment and feedback for trainees. In 2018 the Irish College of General Practitioners commissioned the creation of two new user-designed formative feedback tools that would allow trainee feedback to drive learning. These became known as the Performance in Practice (PiP) tools. AIMS To explore the experiences of General Practice (GP) trainers and trainees after a four-month pilot of the PiP tools. METHODS An exploratory phenomenological approach was taken to understand the experiences of trainers and trainees. One-to-one interviews were conducted, and the transcripts analysed for themes and sub-themes via template analysis. RESULTS User experiences focused on two main areas: educational value and acceptability. In relation to educational value, the PiP tools were seen as an improvement over established forms of formative feedback, as they were centred on the curriculum and therefore reflected the unique multifaceted requirements of an independently practising GP. Acceptability primarily concerned data governance and structures, as well as practical issues such as ease of software use. CONCLUSIONS Overall, the experience of using the PiP tools was positive for both trainers and trainees. Future plans to further explore implementation of the PiP tools have been significantly informed by this research.
Affiliation(s)
- B McEllistrem
- General Practice Training Unit, Irish College of General Practitioners, Dublin, Ireland
- A Barrett
- General Practice Training Unit, Irish College of General Practitioners, Dublin, Ireland
- K Hanley
- General Practice Training Unit, Irish College of General Practitioners, Dublin, Ireland
12. Oudkerk Pool A, Jaarsma ADC, Driessen EW, Govaerts MJB. Student perspectives on competency-based portfolios: Does a portfolio reflect their competence development? Perspectives on Medical Education 2020; 9:166-172. PMID: 32274650; PMCID: PMC7283408; DOI: 10.1007/s40037-020-00571-7.
Abstract
INTRODUCTION Portfolio-based assessments require that learners' competence development is adequately reflected in portfolio documentation. This study explored how students select and document performance data in their portfolios and how representative they perceive these data to be of their competence development. METHODS Students uploaded performance data to a competency-based portfolio. During one clerkship period, twelve students also recorded an audio diary in which they reflected on experiences and feedback that they perceived as indicators of their competence development. Afterwards, these students were interviewed to explore the extent to which the performance documentation in the portfolio corresponded with what they considered illustrative evidence of their development. The interviews were analyzed using thematic analysis. RESULTS Portfolios provide an accurate but fragmented picture of student development. Portfolio documentation was influenced by tensions between learning and assessment, student beliefs about the goal of portfolios, student performance evaluation strategies, the learning environment, and portfolio structure. DISCUSSION This study confirms the importance of taking student perceptions into account when implementing a competency-based portfolio. Students would benefit from coaching on how to select meaningful experiences and performance data for documentation in their portfolios. Flexibility in portfolio structure and requirements is essential to ensure an optimal fit between students' experienced competence development and portfolio content.
Affiliation(s)
- Andrea Oudkerk Pool
- School of Health Professions Education (SHE), Maastricht University, Maastricht, The Netherlands.
- Department of Educational Development and Research, Faculty of Health Medicine and Life Sciences, Maastricht University, Maastricht, The Netherlands.
| | - A Debbie C Jaarsma
- Center for Education Development and Research in Health Professions (CEDAR), Faculty of Medical Sciences, University Medical Center Groningen, University of Groningen, Groningen, The Netherlands
| | - Erik W Driessen
- Department of Educational Development and Research, Faculty of Health Medicine and Life Sciences, Maastricht University, Maastricht, The Netherlands
| | - Marjan J B Govaerts
- Department of Educational Development and Research, Faculty of Health Medicine and Life Sciences, Maastricht University, Maastricht, The Netherlands
| |
Collapse
|
13
|
Bowen JL, Boscardin CK, Chiovaro J, Ten Cate O, Regehr G, Irby DM, O'Brien BC. A view from the sender side of feedback: anticipated receptivity to clinical feedback when changing prior physicians' clinical decisions-a mixed methods study. ADVANCES IN HEALTH SCIENCES EDUCATION : THEORY AND PRACTICE 2020; 25:263-282. [PMID: 31552531 DOI: 10.1007/s10459-019-09916-2] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.8] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 10/17/2018] [Accepted: 08/30/2019] [Indexed: 05/23/2023]
Abstract
When physicians transition patients, the physician taking over may change the diagnosis. Such a change could serve as an important source of clinical feedback to the prior physician. However, this feedback may not transpire if the current physician doubts the prior physician's receptivity to the information. This study explored facilitators of and barriers to feedback communication in the context of patient care transitions using an exploratory sequential, qualitative to quantitative, mixed methods design. Twenty-two internal medicine residents and hospitalist physicians from two teaching hospitals were interviewed and data were analyzed thematically. A prominent theme was participants' reluctance to communicate diagnostic changes. Participants perceived case complexity and physical proximity to facilitate, and hierarchy, unfamiliarity with the prior physician, and lack of relationship to inhibit communication. In the subsequent quantitative portion of the study, forty-one hospitalists completed surveys resulting in 923 total survey responses. Multivariable analyses and a mixed-effects model were applied to survey data with anticipated receptivity as the outcome variable. In the mixed-effects model, four factors had significant positive associations with receivers' perceived receptivity: (1) feedback senders' time spent on teaching services (β = 0.52, p = 0.02), (2) receivers' trustworthiness and clinical credibility (β = 0.49, p < 0.001), (3) preference of both for shared work rooms (β = 0.15, p = 0.006), and (4) receivers being peers (β = 0.24, p < 0.001) or junior colleagues (β = 0.39, p < 0.001). This study suggests that anticipated receptivity to feedback about changed clinical decisions affects clinical communication loops. Without trusting relationships and opportunities for low risk, casual conversations, hospitalists may avoid such conversations.
Collapse
Affiliation(s)
- Judith L Bowen
- Department of Medical Education and Clinical Sciences, Spokane Academic Center, Elson S Floyd College of Medicine, Washington State University, 412 E. Spokane Falls Blvd, Spokane, WA, 99202, USA.
- Portland Veterans Affairs Health Care System, Portland, OR, USA.
| | - Christy Kim Boscardin
- Department of Medicine and Center for Faculty Educators, University of California, San Francisco, CA, USA
| | - Joseph Chiovaro
- Portland Veterans Affairs Health Care System, Portland, OR, USA
- Division of General Internal Medicine and Geriatrics, Department of Medicine, Oregon Health and Science University, Portland, OR, USA
| | - Olle Ten Cate
- Department of Medicine and Center for Faculty Educators, University of California, San Francisco, CA, USA
- Center for Research and Development of Education, University Medical Center Utrecht, Utrecht, The Netherlands
| | - Glenn Regehr
- Department of Surgery and Centre for Health Education Scholarship, University of British Columbia, British Columbia, Canada
| | - David M Irby
- Department of Medicine and Center for Faculty Educators, University of California, San Francisco, CA, USA
| | - Bridget C O'Brien
- Department of Medicine and Center for Faculty Educators, University of California, San Francisco, CA, USA
| |
Collapse
|
14
|
Lörwald AC, Lahner FM, Mooser B, Perrig M, Widmer MK, Greif R, Huwendiek S. Influences on the implementation of Mini-CEX and DOPS for postgraduate medical trainees' learning: A grounded theory study. MEDICAL TEACHER 2019; 41:448-456. [PMID: 30369283 DOI: 10.1080/0142159x.2018.1497784] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/08/2023]
Abstract
Introduction: In order for Mini-Clinical Evaluation Exercise (Mini-CEX) and Direct Observation of Procedural Skills (DOPS) to actually have a positive effect on trainees' learning, the way in which the tools are implemented is of key importance. However, there are many factors influencing their implementation. In this study, we aim to develop a comprehensive model of such factors. Methods: Using a constructivist grounded theory approach, we performed eight focus groups. Participants were postgraduate trainees and supervisors from three different specialties; all were experienced with Mini-CEX and/or DOPS. Data were analyzed for recurring themes, underlying concepts and their interactions using constant comparison. Results: We developed a model demonstrating how the implementation of Mini-CEX and DOPS for trainees' learning is influenced by 13 factors relating to four categories: organizational culture (e.g. value of teaching and feedback), work structure (e.g. time for Mini-CEX and DOPS, faculty development), instruments (e.g. content of assessment), and users (e.g. relationship between trainees and supervisors), and their interaction. Conclusions: We developed a complex model of influencing factors relating to four categories. Consideration of this model might support successful implementation and trainees' learning with Mini-CEX and DOPS.
Collapse
Affiliation(s)
- Andrea C Lörwald
- Department of Assessment and Evaluation, Institute of Medical Education, Bern, Switzerland
| | - Felicitas-Maria Lahner
- Department of Assessment and Evaluation, Institute of Medical Education, Bern, Switzerland
| | - Bettina Mooser
- Department of Assessment and Evaluation, Institute of Medical Education, Bern, Switzerland
| | - Martin Perrig
- Department of General Internal Medicine, Bern University Hospital, University of Bern, Bern, Switzerland
| | - Matthias K Widmer
- Department of Cardiovascular Surgery, Bern University Hospital, University of Bern, Bern, Switzerland
| | - Robert Greif
- Department of Anaesthesiology and Pain Therapy, Bern University Hospital, University of Bern, Bern, Switzerland
| | - Sören Huwendiek
- Department of Assessment and Evaluation, Institute of Medical Education, Bern, Switzerland
| |
Collapse
|
15
|
Brunyé TT, Drew T, Weaver DL, Elmore JG. A review of eye tracking for understanding and improving diagnostic interpretation. COGNITIVE RESEARCH-PRINCIPLES AND IMPLICATIONS 2019; 4:7. [PMID: 30796618 PMCID: PMC6515770 DOI: 10.1186/s41235-019-0159-2] [Citation(s) in RCA: 49] [Impact Index Per Article: 9.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 11/05/2018] [Accepted: 02/01/2019] [Indexed: 12/29/2022]
Abstract
Inspecting digital imaging for primary diagnosis introduces perceptual and cognitive demands for physicians tasked with interpreting visual medical information and arriving at appropriate diagnoses and treatment decisions. The process of medical interpretation and diagnosis involves a complex interplay between visual perception and multiple cognitive processes, including memory retrieval, problem-solving, and decision-making. Eye-tracking technologies are becoming increasingly available in the consumer and research markets and provide novel opportunities to learn more about the interpretive process, including differences between novices and experts, how heuristics and biases shape visual perception and decision-making, and the mechanisms underlying misinterpretation and misdiagnosis. The present review provides an overview of eye-tracking technology, the perceptual and cognitive processes involved in medical interpretation, how eye tracking has been employed to understand medical interpretation and promote medical education and training, and some of the promises and challenges for future applications of this technology.
Collapse
Affiliation(s)
- Tad T Brunyé
- Center for Applied Brain and Cognitive Sciences, Tufts University, 200 Boston Ave., Suite 3000, Medford, MA, 02155, USA.
| | - Trafton Drew
- Department of Psychology, University of Utah, 380 1530 E, Salt Lake City, UT, 84112, USA
| | - Donald L Weaver
- Department of Pathology and University of Vermont Cancer Center, University of Vermont, 111 Colchester Ave., Burlington, VT, 05401, USA
| | - Joann G Elmore
- Department of Medicine, David Geffen School of Medicine at UCLA, University of California at Los Angeles, 10833 Le Conte Ave., Los Angeles, CA, 90095, USA
| |
Collapse
|
16
|
Da J, Ran Y, Pi M, Wu J, Dong R, Li Q, Zhang Q, Zhang X, Zha Y. Application of mini-clinical evaluation exercise for assessing the integrated-based learning during physical diagnostic course. BIOCHEMISTRY AND MOLECULAR BIOLOGY EDUCATION : A BIMONTHLY PUBLICATION OF THE INTERNATIONAL UNION OF BIOCHEMISTRY AND MOLECULAR BIOLOGY 2018; 46:417-423. [PMID: 30242954 DOI: 10.1002/bmb.21137] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Received: 01/05/2018] [Revised: 05/06/2018] [Accepted: 06/05/2018] [Indexed: 06/08/2023]
Abstract
The medical education paradigm has been questioned in light of demands to improve both the quality and quantity of medical student training. This study explored the efficiency of integrated-based learning (IBL), assessed with the mini-clinical evaluation exercise (mini-CEX), during a physical diagnostics course. One hundred and eleven volunteer students were randomly divided into three groups: lecture-based learning (LBL), case-based learning (CBL), and IBL. Nephrotic syndrome was the teaching content. In the IBL group, students were provided the guideline and additional interpretation from the instructor of the basic knowledge related to the disease, as a vertically integrated curriculum. Performance was evaluated by mini-CEX and a theoretical examination, respectively. All subjects completed the study. Differences in five mini-CEX factors (medical interview, physical examination, clinical judgment, organizational effectiveness, and competence) between the IBL, CBL, and LBL groups were statistically significant (p <0.05). The distribution of students rated below, at, and above expectations on the mini-CEX differed significantly across the instructional groups (χ2 =17.842, p =0.001). Final examination scores in the IBL and CBL groups were significantly higher than in the LBL group (F =41.553, p =0.000). Only in the IBL group was the final examination score positively correlated with medical interview (R =0.466, p =0.004), physical examination (R =0.328, p =0.048), professional attitude (R =0.366, p =0.026), and communication skill (R =0.412, p =0.011). Our study therefore demonstrates the effect of IBL on medical students' skills, highlighting that IBL can improve physical examination, organizational effectiveness, competence, and the application of basic knowledge. © 2018 International Union of Biochemistry and Molecular Biology, 46(5):417-423, 2018.
Collapse
Affiliation(s)
- Jingjing Da
- Renal Division, Department of Medicine, People's Hospital of Guizhou Province, Guiyang, Guizhou, 550002, China
- Department of Diagnostics, Medical College of Guizhou University, Guiyang, Guizhou, 550002, China
| | - Yan Ran
- Renal Division, Department of Medicine, People's Hospital of Guizhou Province, Guiyang, Guizhou, 550002, China
- Department of Diagnostics, Medical College of Guizhou University, Guiyang, Guizhou, 550002, China
| | - Mingjing Pi
- Renal Division, Department of Medicine, People's Hospital of Guizhou Province, Guiyang, Guizhou, 550002, China
- Department of Diagnostics, Medical College of Guizhou University, Guiyang, Guizhou, 550002, China
| | - Jing Wu
- Renal Division, Department of Medicine, People's Hospital of Guizhou Province, Guiyang, Guizhou, 550002, China
- Department of Diagnostics, Medical College of Guizhou University, Guiyang, Guizhou, 550002, China
| | - Rong Dong
- Renal Division, Department of Medicine, People's Hospital of Guizhou Province, Guiyang, Guizhou, 550002, China
- Department of Diagnostics, Medical College of Guizhou University, Guiyang, Guizhou, 550002, China
| | - Qian Li
- Renal Division, Department of Medicine, People's Hospital of Guizhou Province, Guiyang, Guizhou, 550002, China
- Department of Diagnostics, Medical College of Guizhou University, Guiyang, Guizhou, 550002, China
| | - Qian Zhang
- Medical College of Guizhou University, Guiyang, Guizhou, 550002, China
| | - Xiangyan Zhang
- Department of Clinical Education, People's Hospital of Guizhou Province, Guiyang, Guizhou, 550002, China
- Medical College of Guizhou University, Guiyang, Guizhou, 550002, China
| | - Yan Zha
- Renal Division, Department of Medicine, People's Hospital of Guizhou Province, Guiyang, Guizhou, 550002, China
- Department of Diagnostics, Medical College of Guizhou University, Guiyang, Guizhou, 550002, China
| |
Collapse
|
17
|
Schüler IM, Heinrich-Weltzien R, Eiselt M. Effect of individual structured and qualified feedback on improving clinical performance of dental students in clinical courses-randomised controlled study. EUROPEAN JOURNAL OF DENTAL EDUCATION : OFFICIAL JOURNAL OF THE ASSOCIATION FOR DENTAL EDUCATION IN EUROPE 2018; 22:e458-e467. [PMID: 29424934 DOI: 10.1111/eje.12325] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Accepted: 01/09/2018] [Indexed: 06/08/2023]
Abstract
AIM To analyze the effect of individual structured and qualified feedback (FB) on the development of dental students' practical skills during clinical courses. METHODS Fifty-three final-year dental students at Jena University Hospital participated in this prospective randomised controlled interventional study. Participants were randomly allocated to an intervention group (IG) and a control group (CG). Two calibrated assessors evaluated 128 pre- and post-assessments of 4 different dental treatment steps performed by dental students during the integrated clinical course in restorative dentistry and prosthodontics and the clinical course in paediatric dentistry. The assessment included direct observation and both graded and non-grading evaluation, and was documented with a specific FB assessment tool. Dental students in the IG received elaborated, structured and qualified FB after the pre-assessment that focussed on individual strengths and weaknesses, provided specific suggestions for improvement and established a personal learning goal. RESULTS In both groups, dental students significantly enhanced their performance, but the improvement was greater in the IG than in the CG. Large effect sizes (ES) were observed for all items, with FB having the largest effect on technical skills (ES = 1.6), followed by management (ES = 1.3) and communication skills (ES = 0.8). The factors with the greatest influence on the effectiveness of FB in enhancing dental students' clinical performance were their insight into their own mistakes or omissions, the observed dental treatment step and the duration of FB. CONCLUSION Individual structured and qualified FB is an effective method to enhance dental students' professional performance and to individually guide the learning process.
Collapse
Affiliation(s)
- I M Schüler
- Department of Preventive and Paediatric Dentistry, Jena University Hospital, Jena, Germany
| | - R Heinrich-Weltzien
- Department of Preventive and Paediatric Dentistry, Jena University Hospital, Jena, Germany
| | - M Eiselt
- Deanery, Medical Faculty, Friedrich-Schiller-University Jena, Jena, Germany
| |
Collapse
|
18
|
Ansari T, Usmani A. Students perception towards feedback in clinical sciences in an outcome-based integrated curriculum. Pak J Med Sci 2018; 34:702-709. [PMID: 30034443 PMCID: PMC6041542 DOI: 10.12669/pjms.343.15021] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 12/30/2022] Open
Abstract
Background & Objective: Feedback has been identified as one of the key strategies for learning in an outcome-based curriculum. Students who are more interested in their grades pay little attention to feedback and, because of their perceptions and beliefs, may not understand its importance and its effect on their performance. Non-constructive feedback will not improve students' performance. This study aims to explore students' perceptions of useful feedback, the purpose of feedback, and beliefs about written feedback. Methods: This analytical cross-sectional study was conducted from November 2017 to January 2018 at Majmaah University. Students studying in the clinical phase were recruited. Data were collected from 121 students by a self-structured questionnaire using a complete enumeration sampling method. Results: Nearly half of the students (45.5%) disagreed that feedback should always contain marks, and 49.6% commented that the tutor did not provide enough constructive feedback. When asked about the purpose of feedback, 62.8% agreed with its two-way nature and found it helpful in gauging their expected performance. Almost two-thirds (67.8%) of the students believed that limited feedback is a source of frustration and that they did not receive comments for improvement. Conclusions: Students are aware of the purpose of feedback. Senior students value feedback more and are of the opinion that feedback provides useful suggestions for future improvement and that limited feedback is a source of frustration. The results highlight the need for a more structured feedback mechanism and for faculty engagement in training to fill the existing gaps and create an effective educational alliance.
Collapse
Affiliation(s)
- Tahir Ansari
- Dr. Tahir Ansari, MCPS, FCPS, MRCPI. Department of Clinical Sciences, College of Medicine, Majmaah University, 11952 Almajmaah, Kingdom Saudi Arabia
| | - Ambreen Usmani
- Prof. Ambreen Usmani, MPhil, MCPS-HPE. Department of Anatomy and Medical Education, Baharia University Medical and Dental College, Karachi, Pakistan
| |
Collapse
|
19
|
Buchner HHF, Burger C, Ehlers JP. Does it matter who writes down the feedback? A comparison of teacher- vs. student-completed clinical encounter cards during clinical rotations in veterinary studies. GMS JOURNAL FOR MEDICAL EDUCATION 2018; 35:Doc23. [PMID: 29963613 PMCID: PMC6022583 DOI: 10.3205/zma001170] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Figures] [Subscribe] [Scholar Register] [Received: 10/23/2017] [Revised: 01/10/2018] [Accepted: 03/04/2018] [Indexed: 06/08/2023]
Abstract
Objective: Although feedback (FB) provided by teachers to students is a recognised, effective teaching tool, the successful use of feedback during clinical training depends on many factors. In addition to appropriate training and attitude of teachers, sustainable feedback requires an appropriate teaching culture and active commitment on the part of the students to receive, accept and use FB. This study examines the use of two different clinical encounter cards (CECs) during clinical rotation and investigates whether students take a more active part in the feedback process when using these cards, testing in particular whether students writing down the FB themselves has a positive effect. Methodology: 161 students in their 9th semester of veterinary studies each had to use two clinical encounter cards (types 1 and 2) during their rotations on 10 wards. For this, students had to ask teachers for FB before starting a clinical activity. The oral FB given by the teachers was written down on the CEC either by the teachers (CEC type 1) or by the students (CEC type 2). Furthermore, the students were asked to assess their own performance by means of anchor criteria and to evaluate the quality of the FB provided by the teachers. Based on the entries in the submitted CECs, the following indicators could be calculated for both CEC types: (1) FB quantity and quality (length and specificity), (2) differentiation of self-assessment, and (3) level of satisfaction with the FB provided by the teachers. Results: With 2,377 CECs submitted, the mean CEC return rate was 74%. 99% of the cards contained positive FB, 69% contained constructive FB with suggestions for improvement, and 87% suggested specific next steps. On average, FB written down by teachers was longer (12.4 versus 9.7 words) and more specific (1.9 versus 1.7 out of 3) than FB written down by students.
Length and specificity decreased in the course of the semester. Neither the differentiation of self-assessment (proportion of differentiated entering of self-assessment) nor the students' level of satisfaction with the FB differed between the two examined CEC variants. Conclusion: The use of CECs across the cohort was successfully possible; however, the fact that students formulated and wrote down the FB themselves did not result in more comprehensive or more specific FB. Self-assessment and level of satisfaction with the teachers' FB remained unchanged.
Collapse
Affiliation(s)
| | - Christoph Burger
- University of Vienna, Faculty of Psychology, Institute for Basic Psychological Research and Research Methods, Vienna, Austria
- University of Applied Sciences Upper Austria, Campus Linz, Department of Social Work, Linz, Austria
| | - Jan P. Ehlers
- Witten/Herdecke University, Faculty of Health, Chair of Didactics and Educational Research in Health Science, Witten, Germany
| |
Collapse
|
20
|
Bing-You R, Varaklis K, Hayes V, Trowbridge R, Kemp H, McKelvy D. The Feedback Tango: An Integrative Review and Analysis of the Content of the Teacher-Learner Feedback Exchange. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2018; 93:657-663. [PMID: 28991848 DOI: 10.1097/acm.0000000000001927] [Citation(s) in RCA: 50] [Impact Index Per Article: 8.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/07/2023]
Abstract
PURPOSE To conduct an integrative review and analysis of the literature on the content of feedback to learners in medical education. METHOD Following completion of a scoping review in 2016, the authors analyzed a subset of articles published through 2015 describing the analysis of feedback exchange content in various contexts: audiotapes, clinical examination, feedback cards, multisource feedback, videotapes, and written feedback. Two reviewers extracted data from these articles and identified common themes. RESULTS Of the 51 included articles, about half (49%) were published since 2011. Most involved medical students (43%) or residents (43%). A leniency bias was noted in many (37%), as there was frequently reluctance to provide constructive feedback. More than one-quarter (29%) indicated the feedback was low in quality (e.g., too general, limited amount, no action plans). Some (16%) indicated faculty dominated conversations, did not use feedback forms appropriately, or provided inadequate feedback, even after training. Multiple feedback tools were used, with some articles (14%) describing varying degrees of use, completion, or legibility. Some articles (14%) noted the impact of the gender of the feedback provider or learner. CONCLUSIONS The findings reveal that the exchange of feedback is troubled by low-quality feedback, leniency bias, faculty deficient in feedback competencies, challenges with multiple feedback tools, and gender impacts. Using the tango dance form as a metaphor for this dynamic partnership, the authors recommend ways to improve feedback for teachers and learners willing to partner with each other and engage in the complexities of the feedback exchange.
Collapse
Affiliation(s)
- Robert Bing-You
- R. Bing-You is professor, Tufts University School of Medicine, Boston, Massachusetts, and vice president for medical education, Maine Medical Center, Portland, Maine. K. Varaklis is clinical associate professor, Tufts University School of Medicine, Boston, Massachusetts, and designated institutional official, Maine Medical Center, Portland, Maine. V. Hayes is clinical assistant professor, Tufts University School of Medicine, Boston, Massachusetts, and faculty member, Department of Family Medicine, Maine Medical Center, Portland, Maine. R. Trowbridge is associate professor, Tufts University School of Medicine, Boston, Massachusetts, and director of undergraduate medical education, Department of Medicine, Maine Medical Center, Portland, Maine. H. Kemp is medical librarian, Maine Medical Center, Portland, Maine. D. McKelvy is manager of library and knowledge services, Maine Medical Center, Portland, Maine
| | | | | | | | | | | |
Collapse
|
21
|
Beck S, Schirlo C, Breckwoldt J. How the Start into the Clinical Elective Year Could be Improved: Qualitative Results and Recommendations from Student Interviews. GMS JOURNAL FOR MEDICAL EDUCATION 2018; 35:Doc14. [PMID: 29497699 PMCID: PMC5827187 DOI: 10.3205/zma001161] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Subscribe] [Scholar Register] [Received: 03/30/2017] [Revised: 07/25/2017] [Accepted: 08/17/2017] [Indexed: 05/10/2023]
Abstract
Background: Entering the Clinical Elective Year (CEY) is a challenging transition phase for undergraduate medical students. Students become members of a professional team and take over certain tasks, which they execute more or less independently. The factors that facilitate (or impede) this transition in the perception of students are not well described. We therefore wanted to explore what students perceived to be helpful during the first phase of the CEY and to derive respective recommendations. Methods: We conducted semi-structured interviews with 5th-year medical students after they had completed the first two months of their CEY. Students were asked which problems they had faced and how well they felt prepared for the CEY. Interviews were audio-recorded, transcribed, and analysed by qualitative content analysis. Results: Of 34 interviews, we included 28 in the analysis. Overall, 24 students were satisfied or very satisfied with their start into the CEY. Satisfaction was expressed with respect to workplace experiences, learning progress, responsibilities and team integration. In particular, students appreciated being integrated as active members of the team, being given responsibility for certain units of work, and receiving well-structured formal teaching and supervision. Students had divergent opinions about the quality of teaching and supervision, their own achievements, and the recognition they received. Students recommended improvements with respect to formal teaching and supervision by clinical supervisors, preparation for the CEY by the university, and supporting structures in the hosting institution. Conclusion: Students in this study were generally satisfied with the first two months of their CEY. Facilitating factors were active and responsible involvement in routine patient care and high-quality formal teaching and supervision. These findings may inform universities, teaching hospitals, and students about how to better shape the first phase of the CEY.
Collapse
Affiliation(s)
- Samuel Beck
- University of Zurich, Faculty of Medicine, Dean's Office, Zurich, Switzerland
| | - Christian Schirlo
- University of Zurich, Faculty of Medicine, Dean's Office, Zurich, Switzerland
| | - Jan Breckwoldt
- University of Zurich, Faculty of Medicine, Dean's Office, Zurich, Switzerland
- *To whom correspondence should be addressed: Jan Breckwoldt, University of Zurich, Faculty of Medicine, Dean's Office, Pestalozzistr. 3-5, CH-8091 Zurich, Switzerland, Tel.: +41 (0)44/634-1075, Fax: +41 (0)44/634-1088, E-mail:
| |
Collapse
|
22
|
Kogan JR, Hatala R, Hauer KE, Holmboe E. Guidelines: The do's, don'ts and don't knows of direct observation of clinical skills in medical education. PERSPECTIVES ON MEDICAL EDUCATION 2017; 6:286-305. [PMID: 28956293 PMCID: PMC5630537 DOI: 10.1007/s40037-017-0376-7] [Citation(s) in RCA: 86] [Impact Index Per Article: 12.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/12/2023]
Abstract
INTRODUCTION Direct observation of clinical skills is a key assessment strategy in competency-based medical education. The guidelines presented in this paper synthesize the literature on direct observation of clinical skills. The goal is to provide a practical list of Do's, Don'ts and Don't Knows about direct observation for supervisors who teach learners in the clinical setting and for educational leaders who are responsible for clinical training programs. METHODS We built consensus through an iterative approach in which each author, based on their medical education and research knowledge and expertise, independently developed a list of Do's, Don'ts, and Don't Knows about direct observation of clinical skills. Lists were compiled, discussed and revised. We then sought and compiled evidence to support each guideline and determine the strength of each guideline. RESULTS A final set of 33 Do's, Don'ts and Don't Knows is presented along with a summary of evidence for each guideline. Guidelines focus on two groups: individual supervisors and the educational leaders responsible for clinical training programs. Guidelines address recommendations for how to focus direct observation, select an assessment tool, promote high quality assessments, conduct rater training, and create a learning culture conducive to direct observation. CONCLUSIONS High frequency, high quality direct observation of clinical skills can be challenging. These guidelines offer important evidence-based Do's and Don'ts that can help improve the frequency and quality of direct observation. Improving direct observation requires focus not just on individual supervisors and their learners, but also on the organizations and cultures in which they work and train. Additional research to address the Don't Knows can help educators realize the full potential of direct observation in competency-based education.
Collapse
Affiliation(s)
- Jennifer R Kogan
- Perelman School of Medicine at the University of Pennsylvania, Philadelphia, PA, USA.
| | - Rose Hatala
- University of British Columbia, Vancouver, British Columbia, Canada
| | - Karen E Hauer
- University of California San Francisco, San Francisco, CA, USA
| | - Eric Holmboe
- Accreditation Council of Graduate Medical Education, Chicago, IL, USA
| |
Collapse
|
23
|
Bing-You R, Hayes V, Varaklis K, Trowbridge R, Kemp H, McKelvy D. Feedback for Learners in Medical Education: What Is Known? A Scoping Review. ACADEMIC MEDICINE : JOURNAL OF THE ASSOCIATION OF AMERICAN MEDICAL COLLEGES 2017; 92:1346-1354. [PMID: 28177958 DOI: 10.1097/acm.0000000000001578] [Citation(s) in RCA: 118] [Impact Index Per Article: 16.9] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/19/2023]
Abstract
PURPOSE To conduct a scoping review of the literature on feedback for learners in medical education. METHOD In 2015-2016, the authors searched the Ovid MEDLINE, ERIC, CINAHL, ProQuest Dissertations and Theses Global, Web of Science, and Scopus databases and seven medical education journals (via OvidSP) for articles published January 1980-December 2015. Two reviewers screened articles for eligibility with inclusion criteria. All authors extracted key data and analyzed data descriptively. RESULTS The authors included 650 articles in the review. More than half (n = 341) were published during 2010-2015. Many centered on medical students (n = 274) or residents (n = 192); some included learners from other disciplines (n = 57). Most (n = 633) described methods used for giving feedback; some (n = 95) described opinions and recommendations regarding feedback. Few studies assessed approaches to feedback with randomized educational trials (n = 49) or described changes in learner behavior after feedback (n = 49). Even fewer assessed the impact of feedback on patient outcomes (n = 28). CONCLUSIONS Feedback is considered an important means of improving learner performance, as evidenced by the number of articles outlining recommendations for feedback approaches. The literature on feedback for learners in medical education is broad, fairly recent, and generally describes new or altered curricular approaches that involve feedback for learners. High-quality, evidence-based recommendations for feedback are lacking. In addition to highlighting calls to reassess the concepts and complex nature of feedback interactions, the authors identify several areas that require further investigation.
Affiliation(s)
- R. Bing-You is professor, Tufts University School of Medicine, and vice president for medical education, Maine Medical Center, Portland, Maine. V. Hayes is clinical assistant professor, Tufts University School of Medicine, and faculty member, Department of Family Medicine, Maine Medical Center, Portland, Maine. K. Varaklis is clinical associate professor, Tufts University School of Medicine, and residency program director in obstetrics and gynecology, Maine Medical Center, Portland, Maine. R. Trowbridge is associate professor, Tufts University School of Medicine, and director of undergraduate medical education, Department of Medicine, Maine Medical Center, Portland, Maine. H. Kemp is medical librarian, Maine Medical Center, Portland, Maine. D. McKelvy is manager of library and knowledge services, Maine Medical Center, Portland, Maine.
24.
Ali K, Khan S, Briggs P, Jones E. An evaluation of a two-site pilot model for dental foundation training. Br Dent J 2017;223:287-292. PMID: 28840871. DOI: 10.1038/sj.bdj.2017.714.
25.
Duijn CCMA, Welink LS, Mandoki M, Ten Cate OTJ, Kremer WDJ, Bok HGJ. Am I ready for it? Students' perceptions of meaningful feedback on entrustable professional activities. Perspect Med Educ 2017;6:256-264. PMID: 28577253. PMCID: PMC5542892. DOI: 10.1007/s40037-017-0361-1.
Abstract
BACKGROUND Receiving feedback while in the clinical workplace is probably the most frequently voiced desire of students. In clinical learning environments, providing and seeking performance-relevant information is often difficult for both supervisors and students. The use of entrustable professional activities (EPAs) can help to improve student assessment within competency-based education. This study aimed to illustrate students' perceptions of meaningful feedback that they view as conducive to preparing them to perform an EPA unsupervised. METHODS In a qualitative multicentre study we explored students' perceptions of meaningful feedback related to EPAs in the clinical workplace. Focus groups were conducted in three different healthcare institutes. Based on concepts from the literature, the transcripts were coded, iteratively reduced and displayed. RESULTS Participants' preferences regarding meaningful feedback on EPAs were quite similar, irrespective of their institution or type of clerkship. Participants explicitly mentioned that feedback on EPAs could come from a variety of sources. Feedback must come from a credible, trustworthy supervisor who knows the student well, be delivered in a safe environment and stress both strengths and points for improvement. The feedback should be provided immediately after the observed activity and include instructions for follow-up. Students would appreciate feedback that refers to their ability to act unsupervised. CONCLUSION There is abundant literature on how feedback should be provided, and what factors influence how feedback is sought by students. This study showed that students who are training to perform an EPA unsupervised have clear ideas about how, when and from whom feedback should be delivered.
Affiliation(s)
- Chantal C M A Duijn, Chair Quality Improvement in Veterinary Education, Faculty of Veterinary Medicine, Utrecht University, Utrecht, The Netherlands
- Lisanne S Welink, Center for Research and Development of Education, University Medical Center Utrecht, Utrecht, The Netherlands
- Mira Mandoki, University of Veterinary Medicine, Budapest, Hungary
- Olle T J Ten Cate, Center for Research and Development of Education, University Medical Center Utrecht, Utrecht, The Netherlands
- Wim D J Kremer, Chair Quality Improvement in Veterinary Education, Faculty of Veterinary Medicine, Utrecht University, Utrecht, The Netherlands
- Harold G J Bok, Chair Quality Improvement in Veterinary Education, Faculty of Veterinary Medicine, Utrecht University, Utrecht, The Netherlands
26.
Peters H, Holzhausen Y, Boscardin C, Ten Cate O, Chen HC. Twelve tips for the implementation of EPAs for assessment and entrustment decisions. Med Teach 2017;39:802-807. PMID: 28549405. DOI: 10.1080/0142159x.2017.1331031.
Abstract
The concept of entrustable professional activities (EPAs) reframes the approach to assessment in competency-based medical education. Key to this concept is the linking of assessment to decision making about entrusting learners with clinical responsibilities. Based on recent literature and the authors' experiences with implementing EPAs, this article provides practical recommendations for how to implement EPAs for assessment and entrustment decisions in the workplace. Tips for supervising clinicians include talking to learners about trust, using EPA descriptions to guide learning and teaching, providing learners with greater ad hoc responsibilities, using EPAs to identify/create opportunities for assessment and feedback, including case-based discussions and acknowledging gut feelings about learner readiness for more autonomy. Tips for curriculum leaders entail enabling trust development, applying trust decisions at all levels of the supervision scale, employing all available information sources for entrustment, empowering learner ownership of the assessment process and using technology for learner tracking and program evaluation.
Affiliation(s)
- Harm Peters, Dieter Scheffner Center for Medical Education and Educational Research, Free and Humboldt University of Berlin, Berlin, Germany
- Ylva Holzhausen, Dieter Scheffner Center for Medical Education and Educational Research, Free and Humboldt University of Berlin, Berlin, Germany
- Christy Boscardin, Department of Medicine, University of California San Francisco, San Francisco, CA, USA
- Olle Ten Cate, Center for Research and Development of Education, University Medical Center Utrecht, Utrecht, The Netherlands
- H Carrie Chen, Center for Research and Development of Education, University Medical Center Utrecht, Utrecht, The Netherlands; Department of Pediatrics, University of California San Francisco, San Francisco, CA, USA; Department of Pediatrics, Georgetown University School of Medicine, Washington, DC, USA
27.
Norman EJ. Supervisor descriptions of veterinary student performance in the clinical workplace: a qualitative interview study. Vet Rec 2017;180:570. DOI: 10.1136/vr.104224.
Affiliation(s)
- E. J. Norman, Institute of Veterinary, Animal and Biomedical Sciences, Massey University, Private Bag 11222, Palmerston North 4442, New Zealand
28.
Ramani S, Post SE, Könings K, Mann K, Katz JT, van der Vleuten C. "It's Just Not the Culture": A Qualitative Study Exploring Residents' Perceptions of the Impact of Institutional Culture on Feedback. Teach Learn Med 2017;29:153-161. PMID: 28001442. DOI: 10.1080/10401334.2016.1244014.
Abstract
Phenomenon: Competency-based medical education requires ongoing performance-based feedback for professional growth. In several studies, medical trainees report that the quality of faculty feedback is inadequate. Sociocultural barriers to feedback exchanges are further amplified in graduate and postgraduate medical education settings, where trainees serve as frontline providers of patient care. Factors that affect institutional feedback culture, enhance feedback seeking, acceptance, and bidirectional feedback warrant further exploration in these settings. APPROACH Using a constructivist grounded theory approach, we sought to examine residents' perspectives on institutional factors that affect the quality of feedback, factors that influence receptivity to feedback, and quality and impact of faculty feedback. Four focus group discussions were conducted, with two investigators present at each. One facilitated the discussion, and the other observed the interactions and took field notes. We audiotaped and transcribed the discussions, and performed a thematic analysis. Measures to ensure rigor included thick descriptions, independent coding by two investigators, and attention to reflexivity. FINDINGS We identified five key themes, dominated by resident perceptions regarding the influence of institutional feedback culture. The theme labels are taken from direct participant quotes: (a) the cultural norm lacks clear expectations and messages around feedback, (b) the prevailing culture of niceness does not facilitate honest feedback, (c) bidirectional feedback is not part of the culture, (d) faculty-resident relationships impact credibility and receptivity to feedback, and (e) there is a need to establish a culture of longitudinal professional growth. Insights: Institutional culture could play a key role in influencing the quality, credibility, and acceptability of feedback.
A polite culture promotes a positive learning environment but can be a barrier to honest feedback. Feedback initiatives focusing solely on techniques of feedback giving may not enhance meaningful feedback. Further research on factors that promote feedback seeking, receptivity to constructive feedback, and bidirectional feedback would provide valuable insights.
Affiliation(s)
- Subha Ramani, Department of Medicine, Brigham and Women's Hospital and Harvard Medical School, Boston, Massachusetts, USA
- Sarah E Post, Department of Medicine, Brigham and Women's Hospital and Harvard Medical School, Boston, Massachusetts, USA
- Karen Könings, Education Development and Research, Maastricht University, Maastricht, The Netherlands
- Karen Mann, Department of Medical Education, Dalhousie University, Halifax, Nova Scotia, Canada
- Joel T Katz, Department of Medicine, Brigham and Women's Hospital and Harvard Medical School, Boston, Massachusetts, USA
- Cees van der Vleuten, Education Development and Research, Maastricht University, Maastricht, The Netherlands
29.
Watling C, LaDonna KA, Lingard L, Voyer S, Hatala R. 'Sometimes the work just needs to be done': socio-cultural influences on direct observation in medical training. Med Educ 2016;50:1054-1064. PMID: 27628722. DOI: 10.1111/medu.13062.
Abstract
CONTEXT Direct observation promises to strengthen both coaching and assessment, and calls for its increased use in medical training abound. Despite its apparent potential, the uptake of direct observation in medical training remains surprisingly limited outside the formal assessment setting. The limited uptake of observation raises questions about cultural barriers to its use. In this study, we explore the influence of professional culture on the use of direct observation within medical training. METHODS Using a constructivist grounded theory approach, we interviewed 22 residents or fellows (10 male, 12 female) about their experiences of being observed during training. Participants represented a range of specialties and training levels. Data collection and analysis were conducted iteratively. Themes were identified using constant comparative analysis. RESULTS Observation was used selectively; specialties tended to observe the clinical acts that they valued most. Despite these differences, we found two cultural values that consistently challenged the ready implementation of direct observation across specialties: (i) autonomy in learning and (ii) efficiency in health care provision. Furthermore, we found that direct observation was a primarily learner-driven activity, which left learners caught in the middle, wanting observation but also wanting to appear independent and efficient. CONCLUSIONS The cultural values of autonomy in learning and practice and efficiency in health care provision challenge the integration of direct observation into clinical training. Medical learners are often expected to ask for observation, but such requests are socially and culturally fraught, and likely to constrain the wider uptake of direct observation.
Affiliation(s)
- Christopher Watling, Department of Clinical Neurological Sciences, Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada
- Kori A LaDonna, Centre for Education Research & Innovation, Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada
- Lorelei Lingard, Department of Medicine and Centre for Education Research & Innovation, Schulich School of Medicine and Dentistry, Western University, London, Ontario, Canada
- Stephane Voyer, Department of Medicine, University of British Columbia, Vancouver, British Columbia, Canada
- Rose Hatala, Department of Medicine, University of British Columbia, Vancouver, British Columbia, Canada
30.
Quance MA. Nursing Students' Perceptions of Anecdotal Notes as Formative Feedback. Int J Nurs Educ Scholarsh 2016;13(1). PMID: 27564701. DOI: 10.1515/ijnes-2015-0053.
Abstract
Anecdotal notes are a method of providing formative feedback to nursing students following clinical experiences. The extant literature on anecdotal notes is written only from the educator perspective, focusing on rationale for and methods of production, rather than on evaluation of effectiveness. A retrospective descriptive study was carried out with a cohort of 283 third-year baccalaureate nursing students to explore their perceptions of anecdotal notes as effective formative feedback. The majority of students valued verbal as well as anecdotal note feedback. They preferred to receive feedback before the next learning experience. Students found the quality of feedback varied by instructor. The anecdotal note process was found to meet identified formative feedback requirements as well as the nursing program's requirement for transparency of evaluation and due process. It is necessary to provide professional development to clinical nurse educators to assist them in developing high-quality formative feedback using anecdotal notes.
31.
Barrett A, Galvin R, Steinert Y, Scherpbier A, O'Shaughnessy A, Walsh G, Horgan M. Profiling postgraduate workplace-based assessment implementation in Ireland: a retrospective cohort study. SpringerPlus 2016;5:133. PMID: 26933632. PMCID: PMC4761346. DOI: 10.1186/s40064-016-1748-x.
Abstract
In 2010, workplace-based assessment (WBA) was formally integrated as a method of formative trainee assessment into 29 basic and higher specialist medical training (BST/HST) programmes in six postgraduate training bodies in Ireland. The aim of this study is to explore how WBA is being implemented and to examine if WBA is being used formatively as originally intended. A retrospective cohort study was conducted and approved by the institution’s Research Ethics Committee. A profile of WBA requirements was obtained from 29 training programme curricula. A data extraction tool was developed to extract anonymous data, including written feedback and timing of assessments, from Year 1 and 2 trainee ePortfolios in 2012–2013. Data were independently quality assessed and compared to the reference standard number of assessments mandated annually where relevant. All 29 training programmes mandated the inclusion of at least one case-based discussion (max = 5; range 1–5). All except two non-clinical programmes (93 %) required at least two mini-Clinical Evaluation Exercise assessments per year and Direct Observation of Procedural Skills assessments were mandated in 27 training programmes over the course of the programme. WBA data were extracted from 50 % of randomly selected BST ePortfolios in four programmes (n = 142) and 70 % of HST ePortfolios (n = 115) in 21 programmes registered for 2012–2013. Four programmes did not have an eligible trainee for that academic year. In total, 1142 WBAs were analysed. A total of 164 trainees (63.8 %) had completed at least one WBA. The average number of WBAs completed by HST trainees was 7.75 (SD 5.8; 95 % CI 6.5–8.9; range 1–34). BST trainees completed an average of 6.1 assessments (SD 9.3; 95 % CI 4.01–8.19; range 1–76). Feedback—of varied length and quality—was provided on 44.9 % of assessments. The majority of WBAs were completed in the second half of the year. 
There is significant heterogeneity with respect to the frequency and quality of feedback provided during WBAs. The completion of WBAs later in the year may limit available time for feedback, performance improvement and re-evaluation. This study sets the scene for further work to explore the value of formative assessment in postgraduate medical education.
Affiliation(s)
- Aileen Barrett, Education and Professional Development Unit, Royal College of Physicians of Ireland, Frederick House, 19 South Frederick St, Dublin 2, Ireland; School of Medicine, College of Medicine and Health Sciences, Brookfield Health Sciences Complex, University College Cork, Cork, Ireland
- Rose Galvin, Discipline of Physiotherapy, Department of Clinical Therapies, Faculty of Education and Health Sciences, University of Limerick, Limerick, Ireland
- Yvonne Steinert, Centre for Medical Education, Faculty of Medicine, McGill University, Lady Meredith House, 1110 Pine Avenue West, Montreal, QC H3A 1A3, Canada
- Albert Scherpbier, Faculty of Health, Medicine and Life Sciences, University of Maastricht, Universiteitssingel 60, 6229 ER Maastricht, The Netherlands
- Ann O'Shaughnessy, Education and Professional Development Unit, Royal College of Physicians of Ireland, Frederick House, 19 South Frederick St, Dublin 2, Ireland
- Gillian Walsh, Education and Professional Development Unit, Royal College of Physicians of Ireland, Frederick House, 19 South Frederick St, Dublin 2, Ireland
- Mary Horgan, School of Medicine, College of Medicine and Health Sciences, Brookfield Health Sciences Complex, University College Cork, Cork, Ireland
32.
Lefroy J, Watling C, Teunissen PW, Brand P. Guidelines: the do's, don'ts and don't knows of feedback for clinical education. Perspect Med Educ 2015;4:284-299. PMID: 26621488. PMCID: PMC4673072. DOI: 10.1007/s40037-015-0231-7.
Abstract
INTRODUCTION The guidelines offered in this paper aim to amalgamate the literature on formative feedback into practical Do's, Don'ts and Don't Knows for individual clinical supervisors and for the institutions that support clinical learning. METHODS The authors built consensus by an iterative process. Do's and Don'ts were proposed based on authors' individual teaching experience and awareness of the literature, and the amalgamated set of guidelines was then refined by all authors and the evidence was summarized for each guideline. Don't Knows were identified as important questions for this international group of educators which, if answered, would change practice. The criteria for inclusion of evidence for these guidelines were not those of a systematic review, so indicators of strength of these recommendations were developed which combine the evidence with the authors' consensus. RESULTS A set of 32 Do and Don't guidelines with the important Don't Knows was compiled along with a summary of the evidence for each. These are divided into guidelines for the individual clinical supervisor giving feedback to their trainee (recommendations about both the process and the content of feedback) and guidelines for the learning culture (what elements of learning culture support the exchange of meaningful feedback, and what elements constrain it?). CONCLUSION Feedback is not easy to get right, but it is essential to learning in medicine, and there is a wealth of evidence supporting the Do's and warning against the Don'ts. Further research into the critical Don't Knows of feedback is required. A new definition is offered: Helpful feedback is a supportive conversation that clarifies the trainee's awareness of their developing competencies, enhances their self-efficacy for making progress, challenges them to set objectives for improvement, and facilitates their development of strategies to enable that improvement to occur.
Affiliation(s)
- Janet Lefroy, Keele University School of Medicine, Clinical Education Centre RSUH, ST4 6QG, Staffordshire, UK
- Chris Watling, Schulich School of Medicine and Dentistry, Western University, Ontario, Canada
- Pim W Teunissen, Maastricht University and VU University Medical Center, Amsterdam, The Netherlands
- Paul Brand, Isala Klinieken, Zwolle, The Netherlands
33.
Bok HGJ. Competency-based veterinary education: an integrative approach to learning and assessment in the clinical workplace. Perspect Med Educ 2015;4:86-89. PMID: 25814329. PMCID: PMC4404455. DOI: 10.1007/s40037-015-0172-1.
Abstract
When graduating from veterinary school, veterinary professionals must be ready to enter the complex veterinary profession. Therefore, one of the major responsibilities of any veterinary school is to develop training programmes that support students' competency development on the trajectory from novice student to veterinary professional. The integration of learning and assessment in the clinical workplace to foster this competency development in undergraduate veterinary education was the central topic of this thesis.
Affiliation(s)
- Harold G J Bok, Faculty of Veterinary Medicine, Quality Improvement in Veterinary Education, Utrecht University, Yalelaan 1, 3584 CL Utrecht, The Netherlands