1.
Bachmann C, Kropf R, Biller S, Schnabel KP, Junod Perron N, Monti M, Berendonk C, Huwendiek S, Breckwoldt J. Development and national consensus finding on patient-centred high stakes communication skills assessments for the Swiss Federal Licensing Examination in Medicine. Patient Education and Counseling 2021; 104:1765-1772. [PMID: 33358770] [DOI: 10.1016/j.pec.2020.12.003] [Received: 12/20/2019] [Revised: 10/30/2020] [Accepted: 12/05/2020]
Abstract
OBJECTIVE: To describe and evaluate a consensus finding and expert validation process for the development of patient-centred communication assessments for a national licensing examination in medicine.
METHODS: A multi-professional team of clinicians and experts in communication, assessment and role-play developed communication assessments for the Swiss Federal Licensing Examination. The six-month process, informed by a preceding national needs assessment, an expert symposium and a critical literature review, covered the application of patient-centred communication frameworks, the development of assessment guides, concrete assessments and pilot tests. The participants evaluated the process.
RESULTS: The multiple-step consensus process, based on expert validation of the medical and communication content, led to six high-stakes patient-centred communication OSCE assessments. The process evaluation revealed challenges such as calibrating rating scales and case difficulty to the graduates' competencies and integrating differing opinions. The main success factors were the outcome-oriented process and the multi-professional exchange of expertise. A model for developing high-stakes patient-centred communication OSCE assessments was derived.
CONCLUSIONS: Consensus finding was facilitated by using well-established communication frameworks, by ensuring outcome-oriented knowledge exchange among multi-professional experts, and by collaborative validation of content through experts.
PRACTICE IMPLICATIONS: We propose developing high-stakes communication assessments through multi-professional expert consensus and provide a conceptual model.
Affiliations
- C Bachmann: Institute for Medical Education, University of Bern, Switzerland; Office of Educational Affairs, Faculty of Medicine, University of Rostock, Ernst-Heydemann-Str. 8, 18057 Rostock, Germany
- R Kropf: Office of the Dean, Faculty of Medicine, University of Zurich, Switzerland
- S Biller: Office of Student Affairs, Faculty of Medicine, University of Basel, Switzerland
- K P Schnabel: Institute for Medical Education, University of Bern, Switzerland
- N Junod Perron: Unit of Development and Research in Medical Education, Faculty of Medicine, University of Geneva, Switzerland
- M Monti: Medical Education Unit, Faculty of Biology and Medicine, University of Lausanne, Switzerland
- C Berendonk: Institute for Medical Education, University of Bern, Switzerland
- S Huwendiek: Institute for Medical Education, University of Bern, Switzerland
- J Breckwoldt: Office of the Dean, Faculty of Medicine, University of Zurich, Switzerland; Institute of Anaesthesiology, University Hospital Zurich, Zurich, Switzerland
2.
Scully RE, Deal SB, Clark MJ, Yang K, Wnuk G, Smink DS, Fryer JP, Bohnen JD, Teitelbaum EN, Meyerson SL, Meier AH, Gauger PG, Reddy RM, Kendrick DE, Stern M, Hughes DT, Chipman JG, Patel JA, Alseidi A, George BC. Concordance Between Expert and Nonexpert Ratings of Condensed Video-Based Trainee Operative Performance Assessment. Journal of Surgical Education 2020; 77:627-634. [PMID: 32201143] [DOI: 10.1016/j.jsurg.2019.12.016] [Received: 06/09/2019] [Revised: 12/18/2019] [Accepted: 12/29/2019]
Abstract
OBJECTIVE: We examined the impact of video editing and of rater expertise in surgical resident evaluation on operative performance ratings of surgical trainees.
DESIGN: Randomized independent review of intraoperative video.
SETTING: Operative video was captured at a single tertiary hospital in Boston, MA.
PARTICIPANTS: Six common general surgery procedures, each performed by an attending-trainee dyad, were video recorded. Full-length and condensed versions (n = 12 videos) were then reviewed by 13 independent surgeon raters (5 evaluation experts, 8 nonexperts) using a crossed design. Trainee performance was rated using the Operative Performance Rating Scale (OPRS), the System for Improving and Measuring Procedural Learning (SIMPL) Performance scale, the Zwisch scale, and the ten Cate scale. Ratings were standardized before being compared using Bayesian mixed models, with raters and videos treated as random effects.
RESULTS: Editing had no effect on OPRS Overall Performance (-0.10, p = 0.30), SIMPL Performance (0.13, p = 0.71), Zwisch (-0.12, p = 0.27), or ten Cate (-0.13, p = 0.29) ratings. Rater expertise (evaluation expert vs. nonexpert) likewise had no effect on the same scales (-0.16 (p = 0.32), 0.18 (p = 0.74), 0.25 (p = 0.81), and 0.25 (p = 0.17)).
CONCLUSIONS: Operative performance assessment scores differ little whether raters view condensed or full-length videos, and whether or not the raters are experts in surgical resident evaluation. Future validation studies of operative performance assessment scales may be facilitated by using nonexpert surgeon raters viewing videos condensed with a standardized protocol.
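The standardization step mentioned in the abstract (putting ratings from scales with different ranges on a common unit before model comparison) can be illustrated with a minimal z-scoring sketch. This is not the authors' code, and the ratings below are made up:

```python
from statistics import mean, stdev

def standardize(scores):
    """Z-standardize one rating scale: subtract the mean and divide by
    the sample SD, so every scale ends up with mean 0 and SD 1."""
    m, s = mean(scores), stdev(scores)
    return [(x - m) / s for x in scores]

# Hypothetical ratings for the same five performances on two scales
# with different ranges (e.g. a 1-5 scale and a 1-4 scale)
oprs_like   = [3, 4, 2, 5, 4]
zwisch_like = [2, 3, 1, 4, 3]

z_a = standardize(oprs_like)
z_b = standardize(zwisch_like)
# After z-scoring, effect estimates from the two scales are in
# comparable SD units, as in the study's standardized comparisons.
```

The mixed-model fit itself (raters and videos as random effects) would then run on these standardized values; that part is omitted here because it needs a dedicated modelling library.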
Affiliations
- Rebecca E Scully: Department of Surgery, Brigham and Women's Hospital, Boston, Massachusetts
- Shanley B Deal: Department of Surgery, Virginia Mason Medical Center, Seattle, Washington
- Michael J Clark: Consulting for Statistics, Computing, and Analytics, University of Michigan, Ann Arbor, Michigan
- Katherine Yang: Consulting for Statistics, Computing, and Analytics, University of Michigan, Ann Arbor, Michigan
- Greg Wnuk: Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Douglas S Smink: Department of Surgery, Brigham and Women's Hospital, Boston, Massachusetts
- Jonathan P Fryer: Department of Surgery, Northwestern Memorial Hospital, Chicago, Illinois
- Jordan D Bohnen: Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts
- Ezra N Teitelbaum: Department of Surgery, Northwestern Memorial Hospital, Chicago, Illinois
- Shari L Meyerson: Department of Surgery, University of Kentucky Medical Center, Lexington, Kentucky
- Andreas H Meier: Department of Surgery, SUNY Upstate University Hospital, Syracuse, New York
- Paul G Gauger: Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Rishindra M Reddy: Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Daniel E Kendrick: University Hospitals Case Western Reserve, Cleveland, Ohio; Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Michael Stern: Department of Surgery, University of Michigan, Ann Arbor, Michigan
- David T Hughes: Department of Surgery, University of Michigan, Ann Arbor, Michigan
- Jeffrey G Chipman: Department of Surgery, University of Minnesota, Minneapolis, Minnesota
- Jitesh A Patel: Department of Surgery, University of Kentucky Medical Center, Lexington, Kentucky
- Adnan Alseidi: Department of Surgery, Virginia Mason Medical Center, Seattle, Washington
- Brian C George: Center for Surgical Training and Research, Department of Surgery, University of Michigan, Ann Arbor, Michigan
3.
Halwani Y, Sachdeva AK, Satterthwaite L, de Montbrun S. Development and evaluation of the General Surgery Objective Structured Assessment of Technical Skill (GOSATS). British Journal of Surgery 2019; 106:1617-1622. [DOI: 10.1002/bjs.11359] [Received: 04/26/2019] [Revised: 07/16/2019] [Accepted: 08/13/2019]
Abstract
BACKGROUND: Technical skill acquisition is important in surgery specialty training. Despite an emphasis on competency-based training, few tools are currently available for direct technical skills assessment at the completion of training. The aim of this study was to develop and validate a simulated technical skill examination for graduating (postgraduate year (PGY)5) general surgery trainees.
METHODS: A simulated eight-station, procedure-based general surgery technical skills examination was developed. Board-certified general surgeons blinded to the level of training rated the performance of PGY3 and PGY5 trainees by means of validated scoring. Cronbach's α was used to calculate reliability indices, and a conjunctive model was used to set a pass score with borderline regression methodology. Subkoviak methodology was employed to assess the reliability of the pass-fail decision. The relationship between passing the examination and PGY level was evaluated using χ² analysis.
RESULTS: Ten PGY3 and nine PGY5 trainees were included. Interstation reliability was 0.66, and inter-rater reliability for three stations was 0.92, 0.97 and 0.76. A pass score of 176.8 of 280 (63.1 per cent) was set. The pass rate for PGY5 trainees was 78 per cent (7 of 9), compared with 30 per cent (3 of 10) for PGY3 trainees. Reliability of the pass-fail decision had an agreement coefficient of 0.88. Graduating trainees were significantly more likely to pass the examination than PGY3 trainees (χ² = 4.34, P = 0.037).
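The reported χ² for the pass-rate comparison can be reproduced from the counts above (7 of 9 PGY5 vs. 3 of 10 PGY3 passing) with a plain Pearson test, no continuity correction:

```python
import math

def pearson_chi2(table):
    """Pearson chi-square statistic and p-value (1 df, no continuity
    correction) for a 2x2 contingency table given as [[a, b], [c, d]]."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    n = sum(rows)
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = rows[i] * cols[j] / n
            chi2 += (table[i][j] - expected) ** 2 / expected
    # survival function of the chi-square distribution with 1 df
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# Rows: PGY5 (pass, fail) and PGY3 (pass, fail), as reported above
chi2, p = pearson_chi2([[7, 2], [3, 7]])
print(round(chi2, 2), round(p, 3))  # 4.34 0.037
```

This matches the study's reported χ² = 4.34, P = 0.037.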
CONCLUSION: A summative general surgery technical skills examination was developed with reliability indices within the range needed for high-stakes assessments. Further evaluation is required before the examination can be used in decisions regarding certification.
Affiliations
- Y Halwani: Department of Surgery, University of Toronto, Toronto, Ontario, Canada
- A K Sachdeva: American College of Surgeons, Chicago, Illinois, USA
- L Satterthwaite: University of Toronto Surgical Skills Centre, Mount Sinai Hospital, Toronto, Ontario, Canada
- S de Montbrun: Department of Surgery, University of Toronto, Toronto, Ontario, Canada; Division of General Surgery, St Michael's Hospital, Toronto, Ontario, Canada