1. Alunno A, Najm A, Sivera F, Haines C, Falzon L, Ramiro S. Assessment of competences in rheumatology training: results of a systematic literature review to inform EULAR points to consider. RMD Open 2021;6:rmdopen-2020-001330. PMID: 32883720. PMCID: PMC7508213. DOI: 10.1136/rmdopen-2020-001330.
Abstract
Objective To summarise the literature on the assessment of competences in postgraduate medical training. Methods A systematic literature review was performed within a EULAR taskforce on the assessment of competences in rheumatology training and other related specialities (July 2019). Two searches were performed: one search for rheumatology and one for related medical specialities. Two reviewers independently identified eligible studies and extracted data on assessment methods. Risk of bias was assessed using the medical education research study quality instrument. Results Of 7335 articles in rheumatology and 2324 reviews in other specialities, 5 and 31 original studies were included, respectively. Studies in rheumatology were at variable risk of bias and explored only direct observation of practical skills (DOPS) and objective structured clinical examinations (OSCEs). OSCEs, including clinical, laboratory and imaging stations, performed best, with a good to very good internal consistency (Cronbach’s α=0.83–0.92), and intrarater reliability (r=0.80–0.95). OSCEs moderately correlated with other assessment tools: r=0.48 vs rating by programme directors; r=0.2–0.44 vs multiple-choice questionnaires; r=0.48 vs DOPS. In other specialities, OSCEs on clinical skills had a good to very good inter-rater reliability and OSCEs on communication skills demonstrated a good to very good internal consistency. Multisource feedback and the mini-clinical evaluation exercise showed good feasibility and internal consistency (reliability), but other data on validity and reliability were conflicting. Conclusion Despite consistent data on competence assessment in other specialities, evidence in rheumatology is scarce and conflicting. Overall, OSCEs seem an appropriate tool to assess the competence of clinical skills and correlate well with other assessment strategies. DOPS, multisource feedback and the mini-clinical evaluation exercise are feasible alternatives.
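The internal consistency quoted above (Cronbach's α = 0.83–0.92) can be reproduced from a raw trainees-by-stations score matrix using the standard formula. A minimal sketch follows; the OSCE scores are invented purely for illustration and are not from the study.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a trainees x items (e.g., OSCE stations) matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                         # number of stations
    item_vars = scores.var(axis=0, ddof=1)      # per-station variance
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical OSCE: 6 trainees scored 0-10 on 4 stations.
osce = np.array([
    [8, 7, 9, 8],
    [5, 6, 5, 6],
    [9, 9, 8, 9],
    [4, 5, 4, 5],
    [7, 6, 7, 7],
    [6, 6, 6, 5],
])
print(round(cronbach_alpha(osce), 2))  # → 0.96
```

Values above roughly 0.8, as in the rheumatology OSCEs, suggest the stations measure a coherent underlying competence.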
Affiliation(s)
- Alessia Alunno
- Rheumatology Unit, University of Perugia Department of Medicine, Perugia, Italy
- Aurélie Najm
- University Hospital, Inserm Umr 1238, Nantes, France
- Francisca Sivera
- Department of Rheumatology, Hospital General Universitario Elda, Elda, Spain; Department of Medicine, Miguel Hernandez University of Elche, Elche, Spain
- Louise Falzon
- Center for Personalized Health, Northwell Health Feinstein Institutes for Medical Research, Manhasset, New York, USA
- Sofia Ramiro
- Department of Rheumatology, Leiden University Medical Center, Leiden, Netherlands; Department of Rheumatology, Zuyderland Medical Centre Heerlen, Heerlen, Netherlands
2. Teaching and Assessing Professionalism in Radiology: Resources and Scholarly Opportunities to Contribute to Required Expectations. Acad Radiol 2018;25:599-609. PMID: 29478920. DOI: 10.1016/j.acra.2018.01.008.
Abstract
Teaching and assessing trainees' professionalism now represents an explicit expectation for Accreditation Council for Graduate Medical Education (ACGME)-accredited radiology programs. Challenges to meeting this expectation include variability in defining the construct of professionalism; the limits, for professionalism, of traditional teaching and assessment methods used for competencies historically more prominent in medical education; and emerging expectations for credible and feasible professionalism teaching and assessment practices in the current context of health-care training and practice. This article identifies promising teaching resources and methods that can be used strategically to augment traditional teaching of the cognitive basis for professionalism, including role modeling, case-based scenarios, debriefing, simulations, narrative medicine (storytelling), guided discussions, peer-assisted learning, and reflective practice. It also summarizes assessment practices intended to promote learning, as well as to inform how and when to assess trainees as their professional identities develop over time, settings, and autonomous practice, particularly in terms of measurable behaviors. These include assessment tools (such as mini-observations, critical incident reports, and appreciative inquiry) for authentic assessment in the workplace; engaging multiple sources (self, peers, other health professionals, and patients) in assessment; and intentional practices for trainees to take responsibility for seeking out actionable feedback and reflection. The article then examines the emerging evidence on the feasibility and added value of the assessment of medical competency milestones, including professionalism, coordinated by the ACGME in radiology and other medical specialties. Radiology has a strategic opportunity to contribute to scholarship and inform policies in professionalism teaching and assessment practices.
3. Li H, Ding N, Zhang Y, Liu Y, Wen D. Assessing medical professionalism: A systematic review of instruments and their measurement properties. PLoS One 2017;12:e0177321. PMID: 28498838. PMCID: PMC5428933. DOI: 10.1371/journal.pone.0177321.
Abstract
BACKGROUND Over the last three decades, various instruments have been developed and employed to assess medical professionalism, but their measurement properties have yet to be fully evaluated. This study aimed to systematically evaluate these instruments' measurement properties and the methodological quality of their related studies within a universally acceptable standardized framework and then provide corresponding recommendations. METHODS A systematic search of the electronic databases PubMed, Web of Science, and PsycINFO was conducted to collect studies published from 1990 to 2015. After screening titles, abstracts, and full texts for eligibility, the articles included in this study were classified according to their respective instrument's usage. A two-phase assessment was conducted: 1) methodological quality was assessed by following the COnsensus-based Standards for the selection of health status Measurement INstruments (COSMIN) checklist; and 2) the quality of measurement properties was assessed according to Terwee's criteria. Results were integrated using best-evidence synthesis to identify recommendable instruments. RESULTS After screening 2,959 records, 74 instruments from 80 existing studies were included. The overall methodological quality of these studies was unsatisfactory, for reasons including, but not limited to, unknown missing data, inadequate sample sizes, and vague hypotheses. Content validity, cross-cultural validity, and criterion validity were either unreported or received negative ratings in most studies. Based on best-evidence synthesis, three instruments were recommended: Hisar's instrument for nursing students, the Nurse Practitioners' Roles and Competencies Scale, and the Perceived Faculty Competency Inventory. CONCLUSION Although instruments measuring medical professionalism are diverse, only a limited number of studies were methodologically sound.
Future studies should give priority to systematically improving the performance of existing instruments and to longitudinal studies.
Affiliation(s)
- Honghe Li
- Research Center of Medical Education, China Medical University, Shenyang, Liaoning, China
- Ning Ding
- Research Center of Medical Education, China Medical University, Shenyang, Liaoning, China
- Yuanyuan Zhang
- School of Public Health, Dalian Medical University, Dalian, Liaoning, China
- Yang Liu
- School of Public Health, China Medical University, Shenyang, Liaoning, China
- Deliang Wen
- Research Center of Medical Education, China Medical University, Shenyang, Liaoning, China
4. Oktay C, Senol Y, Rinnert S, Cete Y. Utility of 360-degree assessment of residents in a Turkish academic emergency medicine residency program. Turk J Emerg Med 2017;17:12-15. PMID: 28345067. PMCID: PMC5357104. DOI: 10.1016/j.tjem.2016.09.007.
Abstract
OBJECTIVES This study was designed to test a 360-degree assessment tool for four of the emergency medicine resident competencies outlined by the Council of Residency Directors in Emergency Medicine (patient care, communication skills, professionalism, and systems-based practice) in an academic emergency department. MATERIAL AND METHODS Using the competency framework of the Accreditation Council for Graduate Medical Education, a 57-item assessment tool was created. Based on the different exposure aspects of the involved evaluator groups, the items were integrated into seven different evaluation forms. All 16 residents and members from each evaluator group voluntarily participated in the study. Internal consistency scores and multilayer and multilevel kappa values were measured. Evaluator group scores and resident ranks in competency areas were compared. All evaluators were asked to comment on the applicability and usefulness of the assessment tool in emergency medicine. RESULTS Seven groups completed a total of 1088 forms to evaluate 16 residents. The reliability coefficient was 0.99 for faculty members and 0.60 for ancillary staff. The inter-rater kappa values for faculty members, nurses, and peer assessments all exceeded 70%. DISCUSSION AND CONCLUSION Our results showed that the 360-degree assessment met the expectations of the evaluator groups and residents, and that this method was readily accepted in the setting of the Akdeniz University emergency medicine residency training program. However, only evaluations by faculty, nurses, self, and peers were reliable enough to be of value. A 360-degree evaluation is time- and effort-consuming and thus may not be an ideal tool for larger programs.
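The inter-rater kappa values reported above correct raw agreement for chance. The study used multilayer and multilevel kappas, which generalize the basic two-rater statistic; a minimal sketch of plain Cohen's kappa, with invented Likert ratings from two hypothetical evaluator groups, illustrates the idea.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Expected agreement if the two raters' category frequencies were independent.
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical 1-5 ratings of the same ten residents by a faculty
# member and a nurse (not data from the study).
faculty = [5, 4, 4, 3, 5, 4, 2, 5, 4, 3]
nurse   = [5, 4, 3, 3, 5, 4, 2, 5, 4, 4]
print(round(cohens_kappa(faculty, nurse), 2))  # → 0.71
```

Kappa above ~0.7, as for the faculty, nurse, and peer groups here, is conventionally read as substantial agreement.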
Affiliation(s)
- Cem Oktay
- Akdeniz University School of Medicine, Department of Emergency Medicine, Antalya, Turkey
- Yesim Senol
- Akdeniz University School of Medicine, Department of Medical Education, Antalya, Turkey
- Stephan Rinnert
- SUNY Downstate Medical Center, Department of Emergency Medicine, Brooklyn, NY, USA
- Yildiray Cete
- Akdeniz University School of Medicine, Department of Emergency Medicine, Antalya, Turkey
5. Riveros R, Kimatian S, Castro P, Dhumak V, Honar H, Mascha EJ, Sessler DI. Multisource feedback in professionalism for anesthesia residents. J Clin Anesth 2016;34:32-40. PMID: 27687342. DOI: 10.1016/j.jclinane.2016.03.038.
Abstract
STUDY OBJECTIVE To assess professionalism in anesthesiology residents, it is important to obtain evaluations from the people with whom they interact on a daily basis. The purpose of this study was to evaluate the effect of multisource feedback (MSF) on residents' professional behavior and to assess the effect of faculty feedback on resident performance. DESIGN This study was a two-group randomized clinical trial. SETTING Residents were recruited from Cleveland Clinic Children's Hospital. PATIENTS Participants included twenty-eight residents doing a two-month rotation in pediatric anesthesia. INTERVENTIONS MSF questionnaires were developed and then validated using face and content validity. Residents were randomly assigned to a feedback group or a control group. Both groups received the MSF evaluation; only the feedback group had a monthly 'coaching meeting' to create strategies for improvement. MEASUREMENTS MSF questionnaires were validated using face validation and expert content validity. The effect of MSF on a professionalism questionnaire was assessed using analysis of covariance and linear mixed-effects regression models. MAIN RESULTS Observed test-retest agreement was greater than 0.90 for all items, with more than half of the kappa statistics greater than 0.50. Cronbach's alpha was 0.71. The MSF increased the self-assessment score with an estimated effect of 0.21 (95% CI 0.06, 0.37), P=.015. There was no detected effect on patient family evaluation, with a mean difference (CI) in change from baseline of 0.03 (-0.15, 0.21), P=.77; faculty evaluation, 0.21 (-0.02, 0.44), P=.08; or coworker evaluation, 0.13 (-0.11, 0.37). CONCLUSIONS Our new multisource feedback questionnaire to assess professionalism had good reliability and internal consistency. Using our validated questionnaire, we assessed the effect of monthly feedback intended to improve professionalism in anesthesia residents.
While we did see improvement in anesthesiology residents' self-assessment, we did not see a similar effect on patient family, faculty or coworker evaluations.
Affiliation(s)
- Ricardo Riveros
- Department of Pediatric Anesthesia and Outcomes Research, Cleveland Clinic
- Pilar Castro
- Department of Pediatric Anesthesia, Cleveland Clinic
- Vipul Dhumak
- Department of Anesthesia, University of Iowa Hospitals and Clinic
- Hooman Honar
- Department of Outcomes Research, Cleveland Clinic
- Edward J Mascha
- Departments of Quantitative Health Sciences and Outcomes Research, Cleveland Clinic
6. Chan TM, Wallner C, Swoboda TK, Leone KA, Kessler C. Assessing interpersonal and communication skills in emergency medicine. Acad Emerg Med 2012;19:1390-402. PMID: 23279246. DOI: 10.1111/acem.12030.
Abstract
Interpersonal and communication skills (ICS) are a key component of several competency-based schemata and a key competency in the set of six Accreditation Council for Graduate Medical Education (ACGME) core competencies. With the shift toward a competency-based educational framework, the importance of robust learner assessment becomes paramount. The journal Academic Emergency Medicine (AEM) hosted a consensus conference to discuss education research in emergency medicine (EM). This article summarizes the preparatory research conducted to brief consensus conference attendees and reports the results of the consensus conference breakout session as it pertains to ICS assessment of learners. The goals of this consensus conference session were twofold: 1) to determine the state of assessment of observable learner performance and 2) to determine a research agenda within the ICS field for medical educators. The working group identified six key recommendations for medical educators and researchers.
Affiliation(s)
- Teresa M. Chan
- Department of Medicine, Division of Emergency Medicine, McMaster University, Hamilton, Ontario, Canada
- Clare Wallner
- Department of Emergency Medicine, Oregon Health Sciences University, Portland, OR
- Thomas K. Swoboda
- Department of Emergency Medicine, Louisiana State University Health Sciences Center, Shreveport, LA
- Katrina A. Leone
- Department of Emergency Medicine, Oregon Health Sciences University, Portland, OR
- Chad Kessler
- Department of Emergency Medicine, Jesse Brown VA Hospital, Chicago, IL
7. Davis LE, King MK, Wayne SJ, Kalishman SG. Evaluating Medical Student Communication/Professionalism Skills from a Patient's Perspective. Front Neurol 2012;3:98. PMID: 22723790. PMCID: PMC3379033. DOI: 10.3389/fneur.2012.00098.
Abstract
OBJECTIVE To evaluate medical students' communication and professionalism skills from the perspective of the ambulatory patient, and later compare these ratings with assessments made during the students' first year of residency. METHODS Students in third-year neurology clerkship clinics see patients alone, followed by a revisit with an attending neurologist. The patient is then asked to complete a voluntary, anonymous, Likert-scale questionnaire rating the student on friendliness, listening to the patient, respecting the patient, using understandable language, and grooming. For students who had completed 1 year of residency, these professionalism ratings were compared with those from their residency directors. RESULTS Seven hundred forty-two questionnaires for 165 clerkship students from 2007 to 2009 were analyzed. Eighty-three percent of forms were returned, with an average of 5 per student. In 64% of questionnaires, patients rated students very good in all five categories; in 35%, patients selected either very good or good ratings; and <1% rated any student fair. No students were rated poor or very poor. Sixty-two percent of patients wrote complimentary comments about the students. From the Class of 2008, 52% of students received "better than their peers" professionalism ratings from their PGY1 residency directors, and only one student was rated "below their peers." CONCLUSION This questionnaire allowed patient perceptions of students' communication/professionalism skills to be evaluated in a systematic manner. Residency director ratings of the professionalism of the same students at the end of their first year of residency confirm continued professional behavior.
Affiliation(s)
- Larry E Davis
- Neurology Service, New Mexico Veterans Affairs Health Care System, Albuquerque, NM, USA
8. Buchanan AO, Stallworth J, Christy C, Garfunkel LC, Hanson JL. Professionalism in practice: strategies for assessment, remediation, and promotion. Pediatrics 2012;129:407-9. PMID: 22371458. DOI: 10.1542/peds.2011-3716.
Affiliation(s)
- April O Buchanan
- Department of Pediatrics, University of South Carolina School of Medicine, Greenville Hospital System University Medical Center, Greenville, SC 29605, USA
9. Chandratilake M, McAleer S, Gibson J. Cultural similarities and differences in medical professionalism: a multi-region study. Med Educ 2012;46:257-66. PMID: 22324525. DOI: 10.1111/j.1365-2923.2011.04153.x.
Abstract
CONTEXT Over the last two decades, many medical educators have sought to define professionalism. Initial attempts to do so were focused on defining professionalism in a manner that allowed for universal agreement. This quest was later transformed into an effort to 'understand professionalism' as many researchers realised that professionalism is a social construct and is culture-sensitive. The determination of cultural differences in the understanding of professionalism, however, has been subject to very little research, possibly because of the practical difficulties of doing so. In this multi-region study, we illustrate the universal and culture-specific aspects of medical professionalism as it is perceived by medical practitioners. METHODS Forty-six professional attributes were identified by reviewing the literature. A total of 584 medical practitioners, representing the UK, Europe, North America and Asia, participated in a survey in which they indicated the importance of each of these attributes. We determined the 'essentialness' of each attribute in different geographic regions using the content validity index, supplemented with kappa statistics. RESULTS With acceptable levels of consensus, all regional groups identified 29 attributes as 'essential', thereby indicating the universality of these professional attributes, and six attributes as non-essential. The essentialness of the rest varied by regional group. CONCLUSIONS This study has helped to identify regional similarities and dissimilarities in understandings of professionalism, most of which can be explained by cultural differences in line with the theories of cultural dimensions and cultural value. However, certain dissonances among regions may well be attributable to socio-economic factors. Some of the responses appear to be counter-cultural and demonstrate practitioners' keenness to overcome cultural barriers in order to provide better patient care.
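The 'essentialness' determination above relies on the content validity index (CVI). A common item-level CVI is simply the proportion of raters who score an attribute at or above a cut-off; the sketch below uses a 1-5 scale, a threshold of 4, and invented ratings, none of which are taken from the study.

```python
def item_cvi(ratings, essential=4):
    """Item-level content validity index: the proportion of raters who
    score the attribute at or above the 'essential' threshold."""
    return sum(r >= essential for r in ratings) / len(ratings)

# Hypothetical: 10 practitioners rate the attribute 'honesty' on a 1-5 scale.
honesty = [5, 5, 4, 5, 4, 4, 5, 3, 5, 4]
print(item_cvi(honesty))  # → 0.9
```

An I-CVI near 1.0 across all regional groups is the kind of consensus that would mark an attribute as universally 'essential'; the study supplemented the index with kappa statistics to correct for chance agreement.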
10. O'Leary KJ, Sehgal NL, Terrell G, Williams MV. Interdisciplinary teamwork in hospitals: a review and practical recommendations for improvement. J Hosp Med 2012;7:48-54. PMID: 22042511. DOI: 10.1002/jhm.970.
Abstract
Recognizing the importance of teamwork in hospitals, senior leadership from the American College of Physician Executives (ACPE), the American Hospital Association (AHA), the American Organization of Nurse Executives (AONE), and the Society of Hospital Medicine (SHM) established the High Performance Teams and the Hospital of the Future project. This collaborative learning effort aims to redesign care delivery to provide optimal value to hospitalized patients. With input from members of this initiative, we prepared this report which reviews the literature related to teamwork in hospitals. Teamwork is critically important to provide safe and effective hospital care. Hospitals with high teamwork ratings experience higher patient satisfaction, higher nurse retention, and lower hospital costs. Elements of effective teamwork have been defined and provide a framework for assessment and improvement efforts in hospitals. Measurement of teamwork is essential to understand baseline performance, and to demonstrate the utility of resources invested to enhance it and the subsequent impact on patient care. Interventions designed to improve teamwork in hospitals include localization of physicians, daily goals of care forms and checklists, teamwork training, and interdisciplinary rounds. Though additional research is needed to evaluate the impact on patient outcomes, these interventions consistently result in improved teamwork knowledge, ratings of teamwork climate, and better understanding of patients' plans of care. The optimal approach is implementation of a combination of interventions, with adaptations to fit unique clinical settings and local culture.
Affiliation(s)
- Kevin J O'Leary
- Division of Hospital Medicine, Northwestern University Feinberg School of Medicine, Chicago, Illinois
11. Richmond M, Canavan C, Holtman MC, Katsufrakis PJ. Feasibility of implementing a standardized multisource feedback program in the graduate medical education environment. J Grad Med Educ 2011. PMID: 23205200. PMCID: PMC3244317. DOI: 10.4300/jgme-d-10-00088.1.
Abstract
BACKGROUND Multisource feedback (MSF) is emerging as a central assessment method for several medical education competencies. Planning and resource requirements for a successful implementation can be significant. Our goal is to examine barriers and challenges to a successful multisite MSF implementation, and identify the benefits of MSF as perceived by participants. METHODS We analyzed the 2007-2008 field trial implementation of the Assessment of Professional Behaviors, an MSF program of the National Board of Medical Examiners, conducted with 8 residency and fellowship programs at 4 institutions. We use a multimethod analysis that draws on quantitative process indicators and qualitative participant experience data. Process indicators include program attrition, completion of implementation milestones, number of participants at each site, number of MSF surveys assigned and completed, and adherence to an experimental rater training protocol. Qualitative data include communications with each program and semistructured interviews conducted with key field trial staff to elicit their experiences with implementation. RESULTS Several implementation challenges are identified, including communication gaps and difficulty scheduling implementation and training workshops. Participant interviews indicate several program changes that should enhance feasibility, including increasing communication and streamlining the training process. CONCLUSIONS Multisource feedback is a complex educational intervention that has the potential to provide users with a better understanding of performance expectations in the graduate medical education environment. Standardization of the implementation processes and tools should reduce the burden on program administrators and participants. Further study is warranted to broaden our understanding of the resource requirements for a successful MSF implementation and to show how outcomes change as MSF gains broader acceptance.
12. O'Leary KJ, Afsar-Manesh N, Budnitz T, Dunn AS, Myers JS. Hospital quality and patient safety competencies: development, description, and recommendations for use. J Hosp Med 2011;6:530-6. PMID: 22042766. DOI: 10.1002/jhm.937.
Abstract
BACKGROUND Hospitalists are poised to have a tremendous impact on improving the quality of care for hospitalized patients. However, many hospitalists are inadequately prepared to engage in efforts to improve quality, because medical schools and residency programs have not traditionally emphasized healthcare quality and patient safety in their curricula. METHODS Through a multistep process, the Society of Hospital Medicine (SHM) Quality Improvement Education (QIE) subcommittee developed the Hospital Quality and Patient Safety (HQPS) Competencies to provide a framework for developing and assessing curricula and other professional development experiences. This article describes the development, provides definitions, and makes recommendations on the use of the HQPS Competencies. RESULTS The 8 areas of competence include: Quality Measurement and Stakeholder Interests, Data Acquisition and Interpretation, Organizational Knowledge and Leadership Skills, Patient Safety Principles, Teamwork and Communication, Quality and Safety Improvement Methods, Health Information Systems, and Patient Centeredness. Reflecting differing levels of hospitalist involvement in healthcare quality, 3 levels of expertise within each area of competence have been established: basic, intermediate, and advanced. Standards for each competency area use carefully selected action verbs to reflect educational goals for hospitalists at each level. CONCLUSIONS Formal incorporation of the HQPS Competencies into professional development programs, and innovative educational initiatives and curricula, will help provide current hospitalists and the next generations of hospitalists with the needed skills to be successful.
Affiliation(s)
- Kevin J O'Leary
- Division of Hospital Medicine, Northwestern University Feinberg School of Medicine, Chicago, Illinois 60611, USA
13.
Abstract
In the context of professionalism being viewed increasingly as a social contract, a survey was conducted to investigate the importance placed by the general public on doctors' professional attributes. A quota sample of 953 people responded to a 55-item online inventory of professional attributes; the quotas closely represented the national census. The majority of the highly important attributes focused on the relationship with patients. Statistically, the responses emerged as a three-facet model (clinicianship, workmanship, and citizenship) of medical professionalism. The general public did not equate professionalism with social standing, wealth production, physique, or appearance. They recognised doctors as professionals by their good behaviour, high values, and positive attitudes as clinicians, workmen, and citizens. Although their preference for professional attributes varied with the setting (e.g., patient consultation, working with others, and behaving in society), they expected doctors to be confident, reliable, dependable, composed, accountable, and dedicated across all settings.
14. Warm EJ, Schauer D, Revis B, Boex JR. Multisource feedback in the ambulatory setting. J Grad Med Educ 2010;2:269-77. PMID: 21975632. PMCID: PMC2941386. DOI: 10.4300/jgme-d-09-00102.1.
Abstract
BACKGROUND The Accreditation Council for Graduate Medical Education has mandated multisource feedback (MSF) in the ambulatory setting for internal medicine residents. Few published reports demonstrate actual MSF results for a residency class, and fewer still include clinical quality measures and knowledge-based testing performance in the data set. METHODS Residents participating in a year-long group practice experience called the "long-block" received MSF that included self, peer, staff, attending physician, and patient evaluations, as well as concomitant clinical quality data and knowledge-based testing scores. Residents were given a rank for each data point compared with peers in the class, and these data were reviewed with the chief resident and program director over the course of the long-block. RESULTS Multisource feedback identified residents who performed well on most measures compared with their peers (10%), residents who performed poorly on most measures compared with their peers (10%), and residents who performed well on some measures and poorly on others (80%). Each high-, intermediate-, and low-performing resident had at least one aspect of the MSF that was significantly lower than the others, and this served as the basis of formative feedback during the long-block. CONCLUSION Use of multisource feedback in the ambulatory setting can identify high-, intermediate-, and low-performing residents and suggest specific formative feedback for each. More research needs to be done on the effect of such feedback, as well as on the relationships between the components of the MSF data set.
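The per-measure ranking described above (each resident ranked against classmates on every MSF data point) can be sketched in a few lines; the residents, feedback sources, and scores below are invented for illustration only.

```python
# Hypothetical MSF scores (0-100) for three residents from four sources.
msf = {
    "Resident A": {"self": 82, "peer": 75, "staff": 71, "patient": 88},
    "Resident B": {"self": 90, "peer": 55, "staff": 60, "patient": 70},
    "Resident C": {"self": 70, "peer": 85, "staff": 80, "patient": 84},
}

def rank_by_source(scores):
    """Rank residents within each feedback source (1 = best; ties broken
    arbitrarily). Returns {resident: {source: rank}}."""
    sources = next(iter(scores.values())).keys()
    ranks = {name: {} for name in scores}
    for src in sources:
        ordered = sorted(scores, key=lambda n: scores[n][src], reverse=True)
        for i, name in enumerate(ordered, start=1):
            ranks[name][src] = i
    return ranks

for name, r in rank_by_source(msf).items():
    print(name, r)
```

A resident who ranks first on self-assessment but last with peers and staff (Resident B here) is exactly the discordant profile the long-block review meetings used as the basis for formative feedback.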
Affiliation(s)
- Eric J. Warm
- Corresponding author: Eric J. Warm, MD, Department of Internal Medicine, University of Cincinnati Academic Health Center, 231 Albert Sabin Way, Cincinnati, OH 45267-0557, 513.558.2590,
15
Implementation of Peer Review into a Physical Medicine and Rehabilitation Program and its Effect on Professionalism. PM R 2010;2:117-24. [DOI: 10.1016/j.pmrj.2009.11.013]
16
Gillespie C, Paik S, Ark T, Zabar S, Kalet A. Residents' perceptions of their own professionalism and the professionalism of their learning environment. J Grad Med Educ 2009;1:208-15. [PMID: 21975980] [PMCID: PMC2931244] [DOI: 10.4300/jgme-d-09-00018.1]
Abstract
BACKGROUND The competency of professionalism encompasses a range of behaviors in multiple domains. Residency programs are struggling to integrate and effectively assess professionalism. We report results from a survey assessing residents' perceptions of their professional competence and the professionalism of their learning environment. METHODS A survey was developed to assess specific behaviors reflecting professionalism based on the conceptualizations of key accrediting bodies. Residents rated their ability to perform the behaviors and reported the frequency with which they observed their fellow residents failing to perform the behaviors. Eighty-five senior residents in emergency medicine, internal medicine, pediatrics, psychiatry, and surgery specialties completed the survey (response rate = 77%). Differences among domains (and among items within domains) were assessed. Correlations between perceived professionalism and the professionalism of the learning environment were described. RESULTS Cronbach alpha for professionalism competence was .93 and for professionalism in the learning environment it was .86. Residents reported feeling most competent in being accountable (mean score = 51.4%; F = 10.3, p<.001) and in demonstrating respect. Some residents reported having trouble being sensitive to patients (n = 5 to 23). Disrespectful behaviors were the most frequently witnessed professionalism lapse in the learning environment (mean = 41.1%; F = 8.1, p<.001). While serious lapses in professionalism were not witnessed with great frequency in the learning environment, instances of over-representing qualifications were reported. Problems in accountability in the learning environment were negatively associated with residents' perceived competence. CONCLUSIONS Residents reported being able to perform professionally most of the time, especially in terms of accountability and respect. However, disrespect was a feature of the learning environment for many residents, and several serious lapses were witnessed by a small number of residents. Accountability in the learning environment may be an important indicator of, or influence on, residents' professionalism.
Affiliation(s)
- Colleen Gillespie
- Corresponding author: Colleen Gillespie, PhD, New York University School of Medicine, VA New York Harbor Health System, 423 East 23rd Street, 15 Floor North (15028AN), New York, NY 10010, 212.263.4247,
17
Meng L, Metro DG, Patel RM. Evaluating professionalism and interpersonal and communication skills: implementing a 360-degree evaluation instrument in an anesthesiology residency program. J Grad Med Educ 2009;1:216-20. [PMID: 21975981] [PMCID: PMC2931243] [DOI: 10.4300/jgme-d-09-00014.1]
Abstract
OBJECTIVES To implement a 360-degree resident evaluation instrument on the postanesthesia care unit (PACU) rotation and to determine the reliability, feasibility, and validity of this tool for assessing residents' professionalism and interpersonal and communication skills. METHODS Thirteen areas of evaluation were selected to assess the professionalism and interpersonal and communication skills of residents during their PACU rotation. Each area was measured on a 9-point Likert scale (1, unsatisfactory performance, to 9, outstanding performance). Rating forms were distributed to raters after the completion of the PACU rotation. Raters included PACU nurses, secretarial staff, nurse aides, and medical technicians. Residents were aware of the 360-degree assessment and participated voluntarily. The multiple raters' evaluations were then compared with traditional faculty evaluations using the Pearson correlation coefficient, and intraclass correlation coefficients were calculated to measure the reliability of ratings within each category of raters. RESULTS Four hundred twenty-nine rating forms were returned during the study period. Fifteen residents were evaluated. The response rate was 88%. Residents were ranked highest on availability and lowest on management skill. The average rating across all areas was high (8.23). The average mean rating across all items from PACU nurses was higher (8.34) than from secretarial staff (7.99, P > .08). The highest ranked resident ranked high with all raters, and the lowest ranked was low with most raters. The intraclass correlation coefficients were 0.8719, 0.7860, 0.8268, and 0.8575. CONCLUSIONS This type of resident assessment tool may be useful for PACU rotations. It appears to correlate with traditional faculty ratings, is feasible to use, and provides formative feedback to residents regarding their professionalism and interpersonal and communication skills.
Affiliation(s)
- Li Meng
- Corresponding author: Li Meng, MD, MPH, B208 PUH, 200 Lothrop Street, Pittsburgh, PA 15213, 412.647.3260,
18
Collichio FA, Kayoumi KM, Hande KR, Hawkins RE, Hawley JL, Adelstein DJ, D'Angelo JM, Stewart JA. Developing an in-training examination for fellows: the experience of the American Society of Clinical Oncology. J Clin Oncol 2009;27:1706-11. [PMID: 19224843] [DOI: 10.1200/jco.2008.20.3091]
Abstract
The American Society of Clinical Oncology (ASCO) developed its own test -- the Medical Oncology In-Training Examination (MedOnc ITE) -- as a tool to assess trainees' knowledge of the clinical oncology subspecialty, establish consistency in educational standards across training programs, identify areas of strength and weakness in individual programs, and stimulate intraprogrammatic reading and discussion. The Accreditation Council for Graduate Medical Education Outcome Project provided additional incentive for ASCO to develop an ITE. The examination was developed over 4 years. The concept of the examination and the budget were approved by the ASCO governing board. The National Board of Medical Examiners was selected to work with ASCO. Fellowship programs were contacted to determine whether they had the information technology support to hold the examination. A blueprint for the examination was developed. The test format, including the number of questions and the selection of a case-based, single-best-answer format, was determined. Physician volunteers to write the questions were solicited from among program directors, various ASCO committees, and disease experts. A workshop was held to teach volunteers how to write proper case-based questions. From this pool, a smaller group of physicians was selected to develop the test and review all test questions. The final examination was developed and administered in February 2008, with scores provided to fellows and program directors in April 2008. Feedback received after the examination will be helpful for developing future MedOnc ITEs. The process ASCO went through to develop the MedOnc ITE serves as a model for other subspecialties interested in developing their own ITEs.
Affiliation(s)
- Frances A Collichio
- Division of Hematology/Oncology, University of North Carolina, Chapel Hill, NC 27599-7305, USA.
19
Ogunyemi D, Gonzalez G, Fong A, Alexander C, Finke D, Donnon T, Azziz R. From the eye of the nurses: 360-degree evaluation of residents. J Contin Educ Health Prof 2009;29:105-110. [PMID: 19530193] [DOI: 10.1002/chp.20019]
Abstract
INTRODUCTION Evaluations from the health care team can provide feedback useful in guiding residents' professional growth. We describe the significance of 360-degree evaluation of residents by the nursing staff. METHODS A retrospective analysis of 1642 nurses' anonymous evaluations on 26 residents from 2004 to 2007 was performed. Nurses' evaluations of residents on communication with patients, interactions with peers, and professionalism were compared to faculty evaluations and standard medical examination scores. Data were analyzed with the use of the chi-square test, the t test, analyses of variance (ANOVAs), and Spearman's correlation. A P value of <.05 was considered significant. RESULTS Strong correlations were noted between nursing evaluation categories (r = 0.74-0.80, P < .001), whereas weak correlations occurred between nursing and faculty evaluations (r = 0.065-0.119, P < .001). There were weak negative correlations between nursing evaluations and standard medical examination scores (r = -0.08 to -0.10, P < .001). Specific graduating resident classes, the obstetrical rotation, and senior or male residents were significantly associated with negative nursing evaluations. DISCUSSION Nursing staff can assess residents on the competencies of interpersonal and communication skills and professionalism. These evaluations provide different perceptions of residents' behavior, which can be useful for formative feedback in residents' development.
Affiliation(s)
- Dotun Ogunyemi
- Department of Obstetrics and Gynecology, Cedars Sinai Medical Center, David Geffen School of Medicine at UCLA, Los Angeles, CA 90048, USA.
20
Wiggins MN, Coker K, Hicks EK. Patient perceptions of professionalism: implications for residency education. Med Educ 2009;43:28-33. [PMID: 19148978] [DOI: 10.1111/j.1365-2923.2008.03176.x]
Abstract
OBJECTIVES The purpose of this study was three-fold: to identify which behavioural, communicative and personal presentation characteristics most closely represent patients' views of professionalism; to determine whether patients perceive resident doctors as displaying these characteristics; and to explore whether or not resident doctor professional behaviour creates an impression of clinical competence to the degree where patients perceive a decreased need for Attending Physician involvement. METHODS We carried out a descriptive, cross-sectional study at an academic centre. An anonymous, voluntary four-question survey with multiple items was administered to all adult patients or the parents of paediatric patients attending an ophthalmology clinic who were seen by a resident doctor followed by an Attending Physician. RESULTS A total of 133 of 148 (90%) surveys were returned. All the itemised characteristics of professionalism were reported to be important or very important to the majority of participants. The most important were: 'Pays attention to my concerns' (90%); 'Is compassionate' (83%), and 'Speaks in terms that I can understand' (83%). Although 85% of respondents reported that resident doctors demonstrated all the characteristics of professionalism listed on the survey, 83% of participants stated that it was important or very important that residents have Attending Physician involvement. CONCLUSIONS Patient-centred components of professionalism, such as communication skills and compassion, are more important to patients than social behaviours, such as appearance and acknowledgement of family members. Resident doctors are perceived to display a high level of professionalism during patient care. Patients clearly desire direct resident doctor
Affiliation(s)
- Michael N Wiggins
- Department of Ophthalmology, Jones Eye Institute, University of Arkansas for Medical Sciences, Little Rock, Arkansas, USA.
21
Lanza ML, Zeiss RA, Rierdan J. Multiple perspectives on assault: the 360-degree interview. J Am Psychiatr Nurses Assoc 2009;14:413-20. [PMID: 21665784] [DOI: 10.1177/1078390308327039]
Abstract
Workplace violence is common in health care settings. The authors review various models of this violence that have developed over time. From a linear model, understanding progressed to an interactional and then to a contextual model of assault that examines interactions of the aggressor, victim, and the environment. To date, there has not been a satisfactory research methodology to explore the complexities of the contextual model. This article proposes the 360-degree evaluation as an appropriate methodology for examination of multiple perspectives on assault. The 360-degree model allows comparison of perspectives of the assailant, victim, victim's peers, and victim's supervisor.
Affiliation(s)
- Marilyn Lewis Lanza
- Nurse Researcher, Edith Nourse Rogers Memorial Veterans Hospital, Bedford, MA
22
Paige JT, Aaron DL, Yang T, Howell DS, Hilton CW, Cohn I, Chauvin SW. Implementation of a Preoperative Briefing Protocol Improves Accuracy of Teamwork Assessment in the Operating Room. Am Surg 2008. [DOI: 10.1177/000313480807400909]
Abstract
This study examined the effect of implementing a new preoperative briefing protocol on self- and peer-assessments of individual operating room (OR) teamwork behaviors. From July 2006 to February 2007, OR teamwork performance at a rural community hospital was evaluated before and after training and implementation of the protocol. After each case, every member on the team completed a 360-degree-type teamwork behavior evaluation containing both self- and peer-assessments using a six-point Likert-type scale (1 = definitely no to 6 = definitely yes). Individual behavior change was measured using the mean scale score of pre- and postprotocol assessments. Statistical analysis included t tests for both pre/post and self/peer differences. Data were available for one general surgeon and nine OR staff (pre = 20 cases, post = 16 cases). The preprotocol self-assessment mean score was significantly higher than the peer-assessment score (5.63 vs 5.29, P < 0.0267). Pre- and postprotocol peer-assessment mean scores revealed a statistically significant gain in teamwork behaviors. No difference was observed in postassessment mean scores for self- and peer-assessments. Individuals overestimated their teamwork behaviors before protocol implementation. Using a preoperative protocol seems to improve OR staff teamwork behaviors and self-assessment accuracy. The use of a 360-degree assessment method targeting specific, observable behaviors may be useful in evaluating team-based interventions and enhancing teamwork effectiveness.
Affiliation(s)
- John T. Paige, Tong Yang, D. Shannon Howell, Charles W. Hilton, Isidore Cohn, Sheila W. Chauvin
- Louisiana State University Health Sciences Center, New Orleans, Louisiana
23
Stark R, Korenstein D, Karani R. Impact of a 360-degree professionalism assessment on faculty comfort and skills in feedback delivery. J Gen Intern Med 2008;23:969-72. [PMID: 18612726] [PMCID: PMC2517935] [DOI: 10.1007/s11606-008-0586-0]
Abstract
BACKGROUND Professionalism is identified as a competency of resident education. Best approaches to teaching and evaluating professionalism are unknown, but feedback about professionalism is necessary to change practice and behavior. Faculty discomfort with professionalism may limit their delivery of feedback to residents. OBJECTIVES To pilot a program implementing a 360-degree evaluation of observable professionalism behaviors and to determine how its use impacts faculty feedback to residents. DESIGN Internal Medicine (IM) residents were evaluated during ambulatory rotations using a 360-degree assessment of professional behaviors developed by the National Board of Medical Examiners. Faculty used evaluation results to provide individual feedback to residents. PATIENTS/PARTICIPANTS Fifteen faculty members. MEASUREMENTS AND MAIN RESULTS Faculty completed pre- and post-intervention surveys. Using a 7-point Likert scale, faculty reported increased skill in giving general feedback (4.85 vs 4.36, p < .05) and feedback about professionalism (4.71 vs 3.57, p < .01) after the implementation of the 360-degree evaluation. They reported increased comfort giving feedback about professionalism (5.07 vs 4.35, p < .05) but not giving feedback in general (5.43 vs 5.50). CONCLUSIONS A 360-degree professionalism evaluation instrument used to guide feedback to residents improves faculty comfort and self-assessed skill in giving feedback about professionalism.
Affiliation(s)
- Rachel Stark
- Department of Medicine, Montefiore Medical Center/Albert Einstein College of Medicine, New York, NY, USA.
24
Massagli TL, Carline JD. Reliability of a 360-Degree Evaluation to Assess Resident Competence. Am J Phys Med Rehabil 2007;86:845-52. [PMID: 17885319] [DOI: 10.1097/phm.0b013e318151ff5a]
Abstract
OBJECTIVE To determine the feasibility and psychometric qualities of a 360-degree evaluation of physical medicine and rehabilitation (PM&R) residents' competence. DESIGN Nurses, allied health staff, and medical students completed a 12-item questionnaire after each PM&R resident rotation from January 2002 to December 2004. The items were derived from five of the six competencies defined by the Accreditation Council for Graduate Medical Education (ACGME). RESULTS Nine hundred thirty evaluations of 56 residents were completed. The alpha reliability coefficient for the instrument was 0.89. Ratings did not vary significantly by resident gender. Senior residents had higher ratings than junior residents. A reliability of >0.8 could be achieved by ratings from just five nurses or allied health staff, compared with 23 ratings from medical students. Factor analysis revealed all items clustered on one factor, accounting for 84% of the variance. In a subgroup of residents with low scores, raters were able to differentiate among skills. CONCLUSION Resident assessment tools should be valid, reliable, and feasible. This Web-based 360-degree evaluation tool is a feasible way to obtain reliable ratings from rehabilitation staff about resident behaviors. The assignment of higher ratings for senior residents than junior residents is evidence for the general validity of this 360-degree evaluation tool in the assessment of resident performance. Different rater groups may need distinct instruments based on the exposure of rater groups to various resident activities and behaviors.
Affiliation(s)
- Teresa L Massagli
- Department of Rehabilitation Medicine, University of Washington, Seattle, Washington 98105, USA
25
Lee AG, Beaver HA, Boldt HC, Olson R, Oetting TA, Abramoff M, Carter K. Teaching and Assessing Professionalism in Ophthalmology Residency Training Programs. Surv Ophthalmol 2007;52:300-14. [PMID: 17472805] [DOI: 10.1016/j.survophthal.2007.02.003]
Abstract
The Accreditation Council for Graduate Medical Education (ACGME) has mandated that all residency training programs teach and assess new competencies including professionalism. This article reviews the literature on medical professionalism, describes good practices gleaned from published works, and proposes an implementation matrix of specific tools for teaching and assessing professionalism in ophthalmology residency. Professionalism requirements have been defined by the ACGME, subspecialty organizations, and other certifying and credentialing organizations. Teaching, role modeling, and assessing the competency of professionalism are important tasks in managing the ACGME mandate. Future work should focus on the field testing of tools for validity, reliability, feasibility, and cost-effectiveness.
Affiliation(s)
- Andrew G Lee
- Department of Ophthalmology, University of Iowa Hospital and Clinics, Iowa City, Iowa 52242, USA
26
Adams KE, O'Reilly M, Romm J, James K. Effect of Balint training on resident professionalism. Am J Obstet Gynecol 2006;195:1431-7. [PMID: 16996457] [DOI: 10.1016/j.ajog.2006.07.042]
Abstract
OBJECTIVE The study was designed to assess the impact of 6 months of Balint training on self- and faculty-assessed measures of professionalism in obstetrics and gynecology residents. STUDY DESIGN Pre- and post-Balint-training resident self-assessment and pre- and post-training faculty assessment using standard professionalism instruments were used to compare the resident Balint group to the group that did not participate. Participating residents also completed a qualitative assessment of the experience. RESULTS Residents who participated were enthusiastic regarding the value of Balint in promoting self-reflection and gaining insight into self- and patient-care issues, both key components of professionalism. There were no significant differences in self or faculty assessment of professionalism between residents who participated in Balint and those who did not. CONCLUSION Six months of Balint training was successful in providing resident education in professionalism, as measured by resident self-report. No differences were detected on 2 measures of professionalism between the training and control groups.
Affiliation(s)
- Karen E Adams
- Department of Obstetrics and Gynecology, Oregon Health and Sciences University, Portland, OR 97239, USA.
27
Abstract
Recent developments in assessing professionalism and remediating unprofessional behavior can curtail the inaction that often follows observations of negative as well as positive professionalism of learners and faculty. Developments include: longitudinal assessment models promoting professional behavior, not just penalizing lapses; clarity about the assessment's purpose; methods separating formative from summative assessment; conceptual and behavioral definitions of professionalism; techniques increasing the reliability and validity of quantitative and qualitative approaches to assessment such as 360-degree assessments, performance-based assessments, portfolios, and humanism connoisseurs; and systems-design providing infrastructure support for assessment. Models for remediation have been crafted, including: due process, a warning period and, if necessary, confrontation to initiate remediation of the physician who has acted unprofessionally. Principles for appropriate remediation stress matching the intervention to the cause of the professional lapse. Cognitive behavioral therapy, motivational interviewing, and continuous monitoring linked to behavioral contracts are effective remediation techniques. Mounting and maintaining robust systems for professionalism and remediating professional lapses are not easy tasks. They require a sea change in the fundamental goal of academic health care institutions: medical education must not only be a technical undertaking but also a moral process designed to build and sustain character in all its professional citizens.
Affiliation(s)
- Louise Arnold
- University of Missouri-Kansas City School of Medicine, Kansas City, MO 64108, USA.
28
Gay SB, Streeter JL, Ciambotti J, Jackson J. Electronic Evaluation Systems for Radiology Residency and Fellowship. J Am Coll Radiol 2006;3:358-65. [PMID: 17412081] [DOI: 10.1016/j.jacr.2006.01.023]
Abstract
Electronic evaluation systems are becoming more commonplace in radiology residencies. This article describes the authors' experience in the development of such a system and a method that a residency program or graduate medical education office could use to evaluate commercially available programs.
Affiliation(s)
- Spencer B Gay
- Department of Radiology, University of Virginia Medical Center, Charlottesville, VA 22908, USA.
29
Wood J, Collins J, Burnside ES, Albanese MA, Propeck PA, Kelcz F, Spilde JM, Schmaltz LM. Patient, faculty, and self-assessment of radiology resident performance: a 360-degree method of measuring professionalism and interpersonal/communication skills. Acad Radiol 2004;11:931-9. [PMID: 15288041] [DOI: 10.1016/j.acra.2004.04.016]
Abstract
RATIONALE AND OBJECTIVES To develop and test the reliability, validity, and feasibility of a 360-degree evaluation to measure radiology resident competence in professionalism and interpersonal/communication skills. MATERIALS AND METHODS An evaluation form with 10 Likert-type items related to professionalism and interpersonal/communication skills was completed by a resident, supervising radiologist, and patient after resident-patient interactions related to breast biopsy procedures. Residents were also evaluated by faculty, using an end-of-rotation global rating form. Residents, faculty, and technologists were queried regarding their reaction to the assessments after a 7-month period. RESULTS Fifty-six complete 360-degree data sets (range, 2-14 per resident) and seven rotational evaluations for seven residents were analyzed and compared. Internal consistency reliability estimates were 0.85, 0.86, and 0.87 for resident, patient, and faculty 360-degree evaluations, respectively. Correlations between resident-versus-patient, resident-versus-faculty, and patient-versus-faculty ratings for the 56 interactions were -0.06 (P = .64), 0.31 (P < .02), and 0.45 (P < .0006), respectively. The Pearson correlation between the faculty global rating and patient 360-degree scores approached significance (r = 0.70, P = .08), but the correlation with faculty 360-degree scores did not. Residents and faculty felt that completing the 360-degree forms was easy, but the requirement for faculty presence during the consent process was burdensome. CONCLUSION Results from this pilot study suggest that self, faculty, and patient evaluations of resident performance constitute a valid and reliable assessment of resident competence. Additional data are needed to determine whether the 360-degree assessment should be incorporated into residency programs and how frequently the assessment should be performed. Requiring only a specified number of assessments per rotation would make the process less burdensome for residents and faculty.
Affiliation(s)
- Jonathan Wood
- Department of Radiology, University of Wisconsin Hospital and Clinics, Madison, WI 53792, USA
30
Shrank WH, Reed VA, Jernstedt GC. Fostering professionalism in medical education: a call for improved assessment and meaningful incentives. J Gen Intern Med 2004;19:887-92. [PMID: 15242476] [PMCID: PMC1492501] [DOI: 10.1111/j.1525-1497.2004.30635.x]
Abstract
Increasing attention has been focused on developing professionalism in medical school graduates. Unfortunately, the culture of academic medical centers and the behaviors that faculty model are often incongruent with our image of professionalism. The need for improved role modeling, better assessment of student behavior, and focused faculty development is reviewed. We propose that the incentive structure be adjusted to reward professional behavior in both students and faculty. The third-year medicine clerkship provides an ideal opportunity for clinician-educators to play a leading role in evaluating, rewarding, and ultimately fostering professionalism in medical school graduates.