1. Oberoi KPS, Caine AD, Schwartzman J, Rab S, Turner AL, Merchant AM, Kunac A. A Resident-Driven Mobile Evaluation System Can Be Used to Augment Traditional Surgery Rotation Evaluations. Am Surg 2023;89:137-144. PMID: 33881951. DOI: 10.1177/00031348211011130.
Abstract
BACKGROUND The Accreditation Council for Graduate Medical Education requires residents to receive milestone-based evaluations in key areas. Shortcomings of the traditional evaluation system (TES) are a low completion rate and delays in completion. We hypothesized that adoption of a mobile evaluation system (MES) would increase the number of evaluations completed and improve their timeliness. METHODS Traditional evaluations for a general surgery residency program were converted into a web-based form via a widely available, free, and secure application and implemented in August 2017. After 8 months, MES data were analyzed and compared with those of our TES. RESULTS A total of 122 mobile evaluations were completed; 20% were solicited by residents. Introduction of the MES resulted in an increased number of evaluations per resident (P = .0028) and an increased proportion of faculty completing evaluations (P = .0220). Timeliness also improved, with 71% of evaluations completed during the associated clinical rotation. CONCLUSIONS A resident-driven MES is an inexpensive and effective method to augment traditional end-of-rotation evaluations.
Affiliation(s)
- Akia D Caine
- Department of Surgery, Rutgers - New Jersey Medical School, Newark, NJ, USA
- Jacob Schwartzman
- Department of Surgery, Rutgers - New Jersey Medical School, Newark, NJ, USA
- Sayeeda Rab
- Department of Surgery, Rutgers - New Jersey Medical School, Newark, NJ, USA
- Amber L Turner
- Department of Surgery, Saint Barnabas Medical Center, Livingston, NJ, USA
- Aziz M Merchant
- Division of General Surgery, Department of Surgery, Rutgers - New Jersey Medical School, Newark, NJ, USA
- Anastasia Kunac
- Division of Trauma and Surgical Critical Care, Department of Surgery, Rutgers - New Jersey Medical School, Newark, NJ, USA
2. Kalynych C, Edwards L, West D, Snodgrass C, Zenni E. Tuesday's Teaching Tips-Evaluation and Feedback: A Spaced Education Strategy for Faculty Development. MedEdPORTAL 2022;18:11281. PMID: 36475014. PMCID: PMC9678823. DOI: 10.15766/mep_2374-8265.11281.
Abstract
Introduction The ACGME requires faculty to participate annually in faculty development sessions. Barriers to this requirement include faculty lacking time and not perceiving benefits to participating. Effective evaluation and feedback are integral to resident training. Faculty often feel ill prepared to deliver feedback, and residents find accepting and recognizing feedback challenging. We provided faculty with a spaced education program via email that applied the cognitive theory of multimedia learning in its instructional design. Methods The 14-week program consisted of one microlecture and 13 skills-based teaching tips. One tip reinforcing knowledge and skills from the microlecture was emailed each week for faculty to practice in the clinical environment with trainees. Participants completed a short quiz, course evaluation, and self-reflection. The New World Kirkpatrick Model was used for program evaluation. Results Fifty-two physician participants received credit for participating; 34 completed the entire course. Of the 34, 32 (94%) identified at least one effective feedback technique, and 27 (79%) were able to define evaluation and recognize observation as the cornerstone of evaluation. Of the 15 effective feedback characteristics taught, 13 (87%) were identified. Fifty-one participants (98%) rated the program as good/excellent, 52 (100%) wanted more Tuesday's Teaching Tips programs, and the majority recognized a change in knowledge and/or skills. Discussion Participants rated the spaced education program as good/excellent and were able to meet the course objectives. This teaching strategy for faculty development was well received, as it was easily accessible and implemented in the clinical learning environment with trainees.
Affiliation(s)
- Colleen Kalynych
- Assistant Dean for Medical Education, Office of Educational Affairs, and Senior Lecturer, Department of Emergency Medicine, University of Florida College of Medicine–Jacksonville
- Linda Edwards
- Dean and Associate Professor of Medicine, University of Florida College of Medicine–Jacksonville
- Denise West
- Assistant Director, Office of Educational Affairs, University of Florida College of Medicine–Jacksonville
- Charity Snodgrass
- Administrative Support, Office of Educational Affairs, University of Florida College of Medicine–Jacksonville
- Elisa Zenni
- Senior Associate Dean for Educational Affairs, Office of Educational Affairs, and Professor of Pediatrics, Department of Pediatrics, University of Florida College of Medicine–Jacksonville
3. Waheed S, Maursetter L. Evaluation Evolution: Designing Optimal Evaluations to Enhance Learning in Nephrology Fellowship. Adv Chronic Kidney Dis 2022;29:526-533. PMID: 36371117. DOI: 10.1053/j.ackd.2022.06.006.
Abstract
Evaluations serve as the backbone of any educational program and can be broadly divided into formative and summative evaluations. Formative evaluations are "just in time" evaluations focused on informing the learning process, whereas summative evaluations compare fellows to a preset standard to determine their readiness for unsupervised practice. In nephrology fellowship programs, evaluations assess competence in the framework of ACGME Milestones 2.0. A variety of learning venues, evaluators, and tools should be incorporated into the measurement process. It is important to determine which milestones are best assessed in each education venue to decrease the burden of assessment fatigue. Additionally, programs can diversify the evaluators to include nurses, medical students, peers, and program coordinators in addition to faculty, providing a well-rounded assessment of the fellows and sharing the assessment burden. Lastly, evaluation data should be presented to fellows in a format that can inform goal setting. The evaluation system needs to evolve along with the changes being made in curriculum design; this will help make fellowship learning effective and efficient.
Affiliation(s)
- Sana Waheed
- Piedmont Nephrology and Internal Medicine, Atlanta, GA
- Laura Maursetter
- Division of Nephrology, Department of Medicine, University of Wisconsin School of Medicine and Public Health, Madison, WI.
4. Diwersi N, Gass JM, Fischer H, Metzger J, Knobe M, Marty AP. Surgery goes EPA (Entrustable Professional Activity) - how a strikingly easy to use app revolutionizes assessments of clinical skills in surgical training. BMC Med Educ 2022;22:559. PMID: 35854302. PMCID: PMC9295378. DOI: 10.1186/s12909-022-03622-1.
Abstract
OBJECTIVE Entrustable Professional Activities (EPAs) are increasingly being used in competency-based medical education. A general lack of time in clinical settings, however, prevents supervisors from providing their trainees with adequate feedback. With willingness to take on additional administrative tasks low among both trainees and educators, the authors developed a radically user-friendly mobile application based on the EPA concept called "Surg-prEPAred". DESIGN Surg-prEPAred is designed to collect micro-assessment data for building competency profiles for surgical residents according to their curriculum. The goal of Surg-prEPAred is to facilitate the performance and documentation of workplace-based assessments. From aggregated data, the app generates a personalized competency profile for every trainee. During a 4-month pilot run followed by ongoing use of the application, for a total duration of 9 months (August 2019 to April 2020), 32 residents and 33 consultants made daily use of the application as a rating tool. Every rating covered the knowledge, skills, and professional attitudes of the trainees. Before the initiation of the app and after the 9-month trial period, trainees and supervisors were both sent questionnaires to evaluate the user-friendliness and effectiveness of the app. RESULTS Five hundred ten app-based assessments were generated. Of 40 predefined EPAs, 36 were assessed. Fifteen trainees and 16 supervisors returned the questionnaires and rated the Surg-prEPAred app as very valuable, effective, and feasible for evaluating trainees in a clinical setting, providing residents with an individual competence portfolio for precision medical education. CONCLUSIONS The authors' expectation is that the Surg-prEPAred app will contribute to an improvement in the quality of medical education and thus to the quality of patient care and safety. In the future, the goal is for the app to become an integral part of the official Swiss surgical curriculum, accepted by the Swiss professional surgical society.
Affiliation(s)
- Nadine Diwersi
- Department of General Surgery, Cantonal Hospital of Lucerne, Spitalstrasse 16, 6000, Lucerne, Switzerland.
- Jörn-Markus Gass
- Department of General Surgery, Cantonal Hospital of Lucerne, Spitalstrasse 16, 6000, Lucerne, Switzerland
- Department of Health Sciences and Medicine, University of Lucerne, 6002, Lucerne, Switzerland
- Henning Fischer
- Department of General Surgery, Cantonal Hospital of Lucerne, Spitalstrasse 16, 6000, Lucerne, Switzerland
- Jürg Metzger
- Department of General Surgery, Cantonal Hospital of Lucerne, Spitalstrasse 16, 6000, Lucerne, Switzerland
- Matthias Knobe
- Department of General Surgery, Cantonal Hospital of Lucerne, Spitalstrasse 16, 6000, Lucerne, Switzerland
- Adrian Philipp Marty
- Institute of Anesthesiology, University Hospital Zürich, Rämistrasse 100, 8006, Zürich, Switzerland
- precisionED Ltd, Muehlebachstrasse 2, 8832, Wollerau, Switzerland
5. Thomas J, Sandefur B, Colletti J, Mullan A, Homme J. Integrating self-assessment into feedback for emergency medicine residents. AEM Educ Train 2022;6:e10721. PMID: 35155973. PMCID: PMC8823156. DOI: 10.1002/aet2.10721.
Abstract
BACKGROUND In 2013 the Accreditation Council for Graduate Medical Education (ACGME) introduced "Milestones" designed to nationally standardize the assessment of resident physicians. Previous studies have compared resident self-assessment on Milestones to faculty assessment, with varying degrees of agreement, but integration of self-assessment into the formative feedback process has not yet been directly studied. This study uses a conceptual framework of self-determination theory, integrated with concepts from adult learning theory, to compare the perception of feedback quality in semiannual reviews before and after the incorporation of resident self-assessment into the feedback process. METHODS This was an interventional study conducted in a single emergency medicine residency program at a major academic hospital over 1 calendar year. Residents first engaged in a semiannual review without self-assessment. At subsequent semiannual reviews, residents completed a Milestone-based self-assessment that was provided to the faculty member assigned to conduct their semiannual review. Residents and faculty completed surveys rating perception of feedback quality. Two-sided Wilcoxon signed-rank tests were used in the comparison analysis. RESULTS One resident did not self-assess prior to the semiannual review and was excluded, leaving 25 paired surveys for analysis. Residents found feedback after the self-assessment more actionable (p = 0.013), more insightful (p = 0.010), and better overall (p = 0.025). Similarly, faculty felt the feedback they provided was more actionable (p < 0.001), more insightful (p < 0.001), and better communicated (p < 0.001); led to improved resident understanding of Milestones (p < 0.001); and were overall more satisfied (p < 0.001). Free-text comments explored pre- and postintervention perceptions of feedback. CONCLUSIONS Integration of self-assessment into semiannual reviews improves the perception of feedback given to residents, as perceived by both residents and faculty. Although limited by sample size, the results are promising for a simple, evidence-based intervention to improve feedback during an existing mandated feedback opportunity.
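For readers reproducing this kind of paired pre/post comparison, the two-sided Wilcoxon signed-rank test named in the abstract above can be run in a few lines with SciPy. The ratings below are invented for illustration only; they are not the study's data.

```python
# Illustrative sketch of a paired, two-sided Wilcoxon signed-rank test,
# as used in the analysis above. All numbers are invented for
# demonstration; they are NOT the study's data.
from scipy import stats

# Paired feedback-quality ratings (1-5 Likert) from the same residents,
# before and after self-assessment was added to the semiannual review.
pre = [3, 2, 4, 3, 3, 2, 4, 3, 2, 3]
post = [4, 3, 5, 4, 5, 3, 5, 4, 3, 4]

res = stats.wilcoxon(pre, post, alternative="two-sided")
print(f"W = {res.statistic}, p = {res.pvalue:.4f}")
```

Because every post rating here exceeds its paired pre rating, the sum of negative ranks (the reported statistic) is 0 and the p-value falls below 0.05.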
Affiliation(s)
- Jenna Thomas
- Department of Emergency Medicine, University of Michigan, Ann Arbor, Michigan, USA
- Department of Emergency Medicine, Mayo Clinic, Rochester, Minnesota, USA
- James Colletti
- Department of Emergency Medicine, Mayo Clinic, Rochester, Minnesota, USA
- Aidan Mullan
- Department of Biostatistics and Informatics, Mayo Clinic, Rochester, Minnesota, USA
- James Homme
- Department of Emergency Medicine, Mayo Clinic, Rochester, Minnesota, USA
6. Atthota S, Griffiths A, Kangas-Dick A, Jesneck J, Thanawala R, Savel R, Rhee R. The Attending Meritocracy: Implementation of a Novel Team-Based Approach to Provide Effective Resident Feedback. J Surg Educ 2021;78:e78-e85. PMID: 34452853. DOI: 10.1016/j.jsurg.2021.08.005.
Abstract
OBJECTIVE Providing timely, quality feedback is an essential responsibility of teaching faculty and is critical for resident assessment and development throughout training. Numerous evaluation platforms have been created to provide both immediate and big-picture end-of-rotation feedback. Faculty suffer burnout from electronic documentation demands and workload; as a result, evaluation activity is relegated to a lower priority, leading to poor compliance. We implemented a novel team-based Attending Meritocracy (AM) program that encompasses monetary, automated-reminder, and punitive components, while adding a competition element to further engage faculty. The aim of this study is to determine the effectiveness of the AM program in increasing compliance with resident feedback. DESIGN, SETTING AND PARTICIPANTS Surgical faculty (n = 36) were divided into 5 teams according to service and subspecialty. Points could be earned by completing surgical (Firefly, MiniCEX) or rotation (New Innovations) evaluations, leaving comments, and performing other educational tasks. The prize for the highest-scoring team was a dinner financed by the non-winning teams. Data from the evaluation platforms were extracted. Continuous variables were compared using the Mann-Whitney U test, and categorical variables using the chi-squared test. RESULTS When comparing July 2019 to February 2020 (control period) with July 2020 to February 2021 (initial implementation period), we found a 237% increase in submitted NI evaluations (from 111 to 374) and a 42.5% decrease in median time to completion, from 60.4 (33.2-106.9) days to 34.7 (24.0-64.5) days (p = 0.001). We observed an increase in operative evaluations completed (MiniCEX from 4 to 97, Firefly from 150 to 1284). CONCLUSIONS Implementation of a team-based attending meritocracy program is an effective, budget-neutral method to increase completion of resident evaluations. Further investigation is needed to assess improvement in the quality of feedback as well as to explore its impact on the progression of resident autonomy.
7. Dickinson KJ, Bass BL, Pei KY. Public Perceptions of General Surgery Residency Training. J Surg Educ 2021;78:717-727. PMID: 33160942. DOI: 10.1016/j.jsurg.2020.09.026.
Abstract
OBJECTIVE Patients are integral to surgical training. Understanding our patients' perceptions of surgical training, resident involvement, and autonomy is crucial to optimizing surgical education and thus patient care. In the modern, connected world, many factors extrinsic to a patient's experience of healthcare may influence their opinion of our training systems (i.e., social media, television shows, and internet searches). The purpose of this article is to contextualize the literature investigating public perceptions of general surgery training so that we can effect patient education initiatives that optimize both surgical training and patient safety. DESIGN This is a perspective including a literature review summarizing current knowledge of public perceptions of general surgery training. CONCLUSIONS Little is published regarding patient and public perceptions of general surgery residency training and the role of residents within it. The current literature demonstrates that the majority of patients are willing to have residents participate in their care. Patients' attitudes toward resident involvement in their operation improve when educational materials are used and when a supervising attending is present in the operating room. These observations, coupled with future work delving deeper into the factors affecting public perceptions of surgical training and resident involvement, can guide strategies to improve surgical education.
Affiliation(s)
- Barbara L Bass
- George Washington University School of Medicine and Health Sciences, Washington, District of Columbia
- Kevin Y Pei
- Department of Graduate Medical Education, Parkview Health, Fort Wayne, Indiana
8. Novel Method of Evaluating Liver Transplant Surgery Fellows Using Objective Measures of Operative Efficiency and Surgical Outcomes. J Am Coll Surg 2021;233:111-118. PMID: 33836288. DOI: 10.1016/j.jamcollsurg.2021.03.029.
Abstract
BACKGROUND The majority of liver transplantations (LTs) in North America are performed by transplant surgery fellows with attending surgeon supervision. Although a strict case volume requirement is mandatory for graduating fellows, no guidelines exist on providing constructive feedback to trainees during fellowship. STUDY DESIGN A retrospective review of all adult LTs performed by abdominal transplant surgery fellows at a single American Society of Transplant Surgeons-accredited academic institution from 2005 to 2019 was conducted. Data from the most recent 5 fellows were averaged to generate reference learning curves for 8 variables representing operative efficiency (ie, total operative time, warm ischemia time, and cold ischemia time) and surgical outcomes (ie, intraoperative blood loss, unplanned return to the operating room, biliary complication, vascular complication, and patient/graft loss). Data for newer fellows were plotted against the reference curves at 3-month intervals to provide an objective assessment measure. RESULTS Three hundred fifty-two adult LTs were performed by 5 fellows during the study period. Mean patient age was 56 years; 67% were male; and mean Model for End-Stage Liver Disease score at transplantation was 22. For the 8 primary variables, mean values included the following: total operative time 330 minutes, warm ischemia time 28 minutes, cold ischemia time 288 minutes, intraoperative blood loss 1.59 L, biliary complication 19.6%, unplanned return to operating room 19.3%, and vascular complication 2.3%. A structure for feedback to fellows was developed using a printed report card and in-person meetings with faculty at 3-month intervals. CONCLUSIONS Comparative feedback using institution-specific reference curves can provide valuable objective data on the progression of individual fellows. It can aid in the timely identification of areas in need of improvement, which enhances the quality of training and has the potential to improve patient care and transplantation outcomes.
9. Pugh A, Ford T, Madsen T, Carlson C, Doyle G, Stephen R, Stroud S, Fix M. Impact of a financial incentive on the completion of educational metrics. Int J Emerg Med 2020;13:60. PMID: 33261553. PMCID: PMC7709397. DOI: 10.1186/s12245-020-00323-8.
Abstract
Background The Accreditation Council for Graduate Medical Education (ACGME) requires all emergency medicine (EM) training programs to evaluate resident performance and also requires core faculty to attend didactic conference. Assuring faculty participation in these activities can be challenging. Previously, our institution had neither a formal tracking program nor a financial incentive for participation in these activities. In 2017, we initiated an educational dashboard that tracked and published all full-time university faculty conference attendance and participation in resident evaluations and other educational activities. Objectives We sought to determine whether the implementation of a financially incentivized educational dashboard would lead to an increase in faculty conference attendance and the number of completed resident evaluations. Methods We conducted a pre- and post-intervention observational study at our EM residency training program between July 2017 and July 2019. Participants were 17 full-time EM attendings at one training site. We compared the number of completed online resident evaluations (MedHub) and the number of conference days attended (call-in verification) before and after the introduction of our financial incentive in June 2018. The incentive required 100% completion of resident evaluations and at least 25% attendance at eligible didactic conference days. We calculated pre- and post-intervention averages, and comparisons were made using a chi-square test. Results Prior to implementation of the intervention, the 90-day resident evaluation completion rate was 71.8%. This increased to 100% after implementation (p < 0.001). Conference attendance prior to implementation was 43.8%, which was essentially unchanged at 41.3% after implementation of the financial incentive (p = 0.920). Conclusions Attaching a financial incentive to a tracked educational dashboard increased faculty participation in resident evaluations but did not change conference attendance. This difference likely reflects the minimum thresholds required to obtain the financial incentive.
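As a side note for anyone replicating this kind of pre/post completion-rate comparison, the chi-square test named in the abstract above operates on a 2x2 contingency table and can be run with SciPy. The counts below are invented to roughly echo the reported rates (71.8% vs. 100%); they are not the study's data.

```python
# Illustrative sketch of a chi-square test on a 2x2 contingency table,
# as used in the comparison above. Counts are invented for
# demonstration; they are NOT the study's data.
from scipy import stats

# Completed vs. missed resident evaluations, before and after the
# financial incentive.
#            completed  missed
pre_post = [[28, 11],   # pre-incentive  (~72% completion)
            [39, 0]]    # post-incentive (100% completion)

chi2, p, dof, expected = stats.chi2_contingency(pre_post)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}, dof = {dof}")
```

With one degree of freedom and such a lopsided shift, the resulting p-value lands well below 0.05, mirroring the significance the abstract reports.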
Affiliation(s)
- Andrew Pugh
- Division of Emergency Medicine, University of Utah, Salt Lake City, UT, USA.
- Tabitha Ford
- Division of Emergency Medicine, University of Utah, Salt Lake City, UT, USA
- Troy Madsen
- Division of Emergency Medicine, University of Utah, Salt Lake City, UT, USA
- Christine Carlson
- Division of Emergency Medicine, University of Utah, Salt Lake City, UT, USA
- Gerard Doyle
- Division of Emergency Medicine, University of Utah, Salt Lake City, UT, USA
- Robert Stephen
- Division of Emergency Medicine, University of Utah, Salt Lake City, UT, USA
- Susan Stroud
- Division of Emergency Medicine, University of Utah, Salt Lake City, UT, USA
- Megan Fix
- Division of Emergency Medicine, University of Utah, Salt Lake City, UT, USA
10. Cooper CJ, Wehner P, Dailey C, O’Connor N, Kleshinski J, Shapiro JI. Information within residency monthly evaluation forms at two institutions. Med Educ Online 2019;24:1635844. PMID: 31246539. PMCID: PMC6610528. DOI: 10.1080/10872981.2019.1635844.
Abstract
Periodic review of resident performance is an important aspect of residency training. Among allopathic residency programs, resident physician performance, which can be grouped by the ACGME core competencies, is expected to be assessed so as to allow for effective feedback and continuous improvement. Review of monthly evaluation forms for residents in the core ACGME programs at Marshall University and the University of Toledo demonstrated a wide spread in the number of Likert questions that faculty were asked to complete, ranging from a low of 7 in Surgery to a high of 65 in Psychiatry (both Marshall programs). Correlation and network analyses were performed on these data. High degrees of correlation were noted between answers to questions (controlled for each resident) on the forms at both institutions. In other words, although evaluation scores varied tremendously among the different residents in all the programs studied, scores addressing different competencies tended to be very similar for the same resident, especially in some of the programs studied. Network analysis suggested that there were clusters of questions that produced essentially the same answer for a given resident, and these clusters were larger in some programs' assessment forms; this seemed to be more the rule in the residency programs with large numbers of Likert questions. The authors suggest that reducing the number of monthly questions used to address the core competencies in some programs may be possible without substantial loss of information.
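The inter-item redundancy the authors describe can be illustrated with a small simulation: when each resident's answers cluster around a single overall impression, pairwise correlations between Likert items are high and the items carry largely duplicate information. Everything below is a hypothetical sketch, not the authors' analysis code or data.

```python
# Hypothetical sketch of the kind of inter-item correlation analysis
# described above; simulated data, NOT the authors' data or code.
import numpy as np

rng = np.random.default_rng(0)

# Simulate 40 residents rated on 10 Likert questions (1-5), where each
# resident's answers cluster around a single overall "halo" impression.
halo = rng.integers(2, 5, size=(40, 1))     # one overall score per resident
noise = rng.integers(-1, 2, size=(40, 10))  # small per-question jitter
ratings = np.clip(halo + noise, 1, 5)

corr = np.corrcoef(ratings, rowvar=False)   # 10x10 inter-item matrix
off_diag = corr[~np.eye(10, dtype=bool)]
print(f"mean inter-item correlation: {off_diag.mean():.2f}")
```

A high mean off-diagonal correlation indicates that the questions answer alike for a given resident, which is exactly the redundancy that suggests shorter forms would lose little information.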
Affiliation(s)
- Christopher J. Cooper
- Department of Medicine, University of Toledo School of Medicine and Health Sciences, Toledo, USA
- Paulette Wehner
- Joan C. Edwards College of Medicine, Marshall University, Huntington, USA
- Cindy Dailey
- Joan C. Edwards College of Medicine, Marshall University, Huntington, USA
- Nanette O’Connor
- Department of Medicine, University of Toledo School of Medicine and Health Sciences, Toledo, USA
- James Kleshinski
- Department of Medicine, University of Toledo School of Medicine and Health Sciences, Toledo, USA
- Joseph I. Shapiro
- Joan C. Edwards College of Medicine, Marshall University, Huntington, USA