1. Cloney MB, Hopkins B, Roumeliotis A, El Tecle N, Dahdaleh NS. Variation in academic neurosurgery departments' #neurosurgery social media influence. World Neurosurg X 2023;20:100232. PMID: 37435398; PMCID: PMC10331579; DOI: 10.1016/j.wnsx.2023.100232.
Abstract
Background Social media use is increasingly common among academic neurosurgery departments, but its relationship with academic metrics remains underexamined. Methods We examined the relationship between American academic neurosurgery departments' number of followers on Twitter, Instagram, and Facebook and the following academic metrics: Doximity residency rankings, US News & World Report (USNWR) rankings of their affiliated medical schools, and the amount of NIH funding of those schools. Results A few departments had a disproportionate number of followers. A greater proportion of programs had Twitter accounts (88.9%) than had Instagram (72.2%) or Facebook (51.9%) accounts (p=0.0001). Programs identified as "Influencers" had more departmental NIH funding (p=0.044), more institutional NIH funding (p=0.035), better Doximity residency rankings (p=0.044), and better affiliated medical school rankings (p=0.002). The number of Twitter followers had the strongest correlation with academic metrics, yet only modest correlations were identified with departmental NIH funding (R=0.496, p=0.0001), institutional NIH funding (R=0.387, p=0.0072), Doximity residency rank (R=0.411, p=0.0020), and affiliated medical school ranking (R=0.545, p<0.0001). On multivariable regression, only affiliation with a medical school in the top quartile of the USNWR rankings, rather than neurosurgery departmental metrics, predicted having more Twitter (OR=5.666, p=0.012) and Instagram (OR=8.33, p=0.009) followers. Conclusion American academic neurosurgery departments preferentially use Twitter over Instagram or Facebook. Their Twitter and Instagram presences are associated with better performance on traditional academic metrics. However, these associations are modest, suggesting that other factors contribute to a department's social media influence. A department's affiliated medical school may contribute to the department's social media brand.
2. Guetz B, Bidmon S. The Credibility of Physician Rating Websites: A Systematic Literature Review. Health Policy 2023;132:104821. PMID: 37084700; DOI: 10.1016/j.healthpol.2023.104821.
Abstract
OBJECTIVES Increasingly, the credibility of online reviews is drawing critical attention due to the lack of control mechanisms, the constant debate about fake reviews and, not least, current developments in the field of artificial intelligence. For this reason, the aim of this study was to examine the extent to which assessments recorded on physician rating websites (PRWs) are credible, based on a comparison with other evaluation criteria. METHODS Following the PRISMA guidelines, a comprehensive literature search was conducted across different scientific databases. Data were synthesized by comparing individual statistical outcomes, objectives, and conclusions. RESULTS The chosen search strategy yielded a database of 36,755 studies, of which 28 were ultimately included in the systematic review. The literature review produced mixed results regarding the credibility of PRWs. While seven publications supported the credibility of PRWs, six publications found no correlation between PRWs and alternative datasets; fifteen studies reported mixed results. CONCLUSIONS This study has shown that ratings on PRWs seem to be credible when relying primarily on patients' perception. However, these portals seem inadequate to represent alternative comparative values such as the medical quality of physicians. For health policy makers, our results show that decisions based on patients' perceptions may be well supported by data from PRWs. For all other decisions, however, PRWs do not seem to contain sufficiently useful data.
Affiliation(s)
- Bernhard Guetz: Department of Marketing and International Management, Alpen-Adria-Universitaet Klagenfurt, Universitaetsstrasse 65-67, 9020 Klagenfurt am Woerthersee, Austria
- Sonja Bidmon: Department of Marketing and International Management, Alpen-Adria-Universitaet Klagenfurt, Universitaetsstrasse 65-67, 9020 Klagenfurt am Woerthersee, Austria
3. Butler LR, Tang JE, Hess SM, White CA, Arvind V, Kim JS, Allen AK, Ranade SC. Building better pediatric surgeons: A sentiment analysis of online physician review websites. J Child Orthop 2022;16:498-504. PMID: 36483646; PMCID: PMC9723867; DOI: 10.1177/18632521221133812.
Abstract
PURPOSE Physician review websites are a heavily utilized patient tool for finding, rating, and reviewing surgeons. Natural language processing techniques such as sentiment analysis provide a comprehensive approach to better understand the nuances of patient perception. This study utilizes sentiment analysis to examine how specific patient sentiments correspond to positive and negative experiences in online reviews of pediatric orthopedic surgeons. METHODS The online written reviews and star ratings of pediatric surgeons belonging to the Pediatric Orthopaedic Society of North America were obtained from healthgrades.com. A sentiment analysis package computed compound scores for each surgeon's reviews. Inferential statistics were used to analyze relationships between demographic variables and star/sentiment scores. Word frequency analyses and multiple logistic regression analyses were performed on key terms. RESULTS A total of 749 pediatric surgeons (3830 total online reviews) were included; 80.8% were male and 33.8% were below 50 years of age. Male surgeons and younger surgeons had higher mean star ratings. Surgeon attributes including "confident" (p < 0.01) and "comfortable" (p < 0.01) improved the odds of positive reviews, while "rude" (p < 0.01) and "unprofessional" (p < 0.01) decreased these odds. Comments regarding "pain" lowered the odds of positive reviews (p < 0.01), whereas "pain-free" increased these odds (p < 0.01). CONCLUSION Pediatric surgeons who were younger, communicated effectively, eased pain, and curated a welcoming office setting were more likely to receive positive written online reviews. This suggests that a spectrum of interpersonal and ancillary factors beyond surgical skill shapes patient experience and perceptions. These outcomes can advise pediatric surgeons on the behavioral and office qualities that patients and families prioritize when rating and recommending surgeons online. LEVEL OF EVIDENCE IV.
Affiliation(s)
- Sheena C Ranade: Department of Orthopaedic Surgery, Icahn School of Medicine at Mount Sinai, 5 East 98th Street, New York, NY 10029, USA
4. Quinones A, Tang JE, Vasan V, Li T, Schupper AJ, Ali M, White CA, Hannah TC, Asfaw Z, Li AY, Durbin J, Arvind V, Kim JS, Choudhri TF. Trends in Online Patient Perspectives of Neurosurgeons: A Sentiment Analysis. Neurosurgery Open 2022. DOI: 10.1227/neuopn.0000000000000023.
5. Gupta A, Gupta R, White MD, Reddy V, Chang YF, Agarwal P, Alan N, Agarwal N. Patient satisfaction reviews for 967 spine neurosurgeons on Healthgrades. J Neurosurg Spine 2022;36:869-875. PMID: 34891133; DOI: 10.3171/2021.8.spine21661.
Abstract
OBJECTIVE Patients are increasingly relying on independent physician rating websites (PRWs) to obtain information about healthcare providers. Healthgrades.com is a widely used PRW that allows patients to rate physicians on various metrics of performance and quality of care. This study categorically investigated the correlations between demographics of spine neurosurgeons and online ratings on Healthgrades to better understand the factors driving patient satisfaction in spine surgery in the United States. METHODS In August-December 2019, the authors performed a retrospective data analysis using Healthgrades. The American Association of Neurological Surgeons (AANS) membership database was used to identify spine neurosurgeons in the United States and extract biographical and career data. Individuals with an academic practice were further investigated for academic rank, leadership, and fellowship training. Scores from eight patient satisfaction metrics (PSMs) were collected for each surgeon from Healthgrades. RESULTS A total of 967 spine neurosurgeons were included in the study cohort. Patient satisfaction did not correlate with sex, PhD acquisition, academic status, or academic rank. Among those who were academic surgeons, completion of fellowship training was associated with higher ratings. Geographical location of practice did not influence patient satisfaction. Prolonged wait time was an independent predictor of decreased patient satisfaction and was a key confounding variable underlying trends seen with advanced career duration and age. CONCLUSIONS Overall, patients rated spine neurosurgeons highly favorably on the Healthgrades website. Due to the emerging role of PRWs in locating and assessing providers, it is important for both patients and clinicians to understand the factors that impact patient experience.
Affiliation(s)
- Arjun Gupta: Department of Neurological Surgery, Rutgers New Jersey Medical School, Newark, New Jersey
- Radhika Gupta: Department of Neurological Surgery, University of Pennsylvania Perelman School of Medicine, Philadelphia, Pennsylvania
- Michael D White: Department of Neurological Surgery, Barrow Neurological Institute, Phoenix, Arizona
- Vamsi Reddy: Department of Neurological Surgery, University of Texas Health San Antonio, San Antonio, Texas
- Yue-Fang Chang: Department of Neurological Surgery, University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania
- Prateek Agarwal: Department of Neurological Surgery, University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania
- Nima Alan: Department of Neurological Surgery, University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania
- Nitin Agarwal: Department of Neurological Surgery, Washington University School of Medicine, St. Louis, Missouri
6. Lamano JB, Riestenberg RA, Haskell-Mendoza AP, Lee D, Sharp MT, Bloch O. Correlation between social media utilization by academic neurosurgery departments and higher online patient ratings. J Neurosurg 2021:1-13. PMID: 34678765; DOI: 10.3171/2021.6.jns2122.
Abstract
OBJECTIVE Patients increasingly utilize online physician review websites (PRWs) and social media to inform healthcare-related decisions, which provides neurosurgeons with opportunities for increased patient engagement. Despite the growing use of social media among neurosurgeons, the relationship between social media utilization and online reviews remains unknown. The goal of this study was to characterize the relationship between social media utilization and PRW ratings across academic neurosurgery departments. METHODS Social media accounts (Twitter, Facebook, YouTube, Instagram) of academic neurosurgery departments were identified. Online reviews for individual faculty were obtained from Healthgrades, Vitals, WebMD, and Google. Reviews were aggregated to identify the total number of reviews per department, to generate a composite departmental rating, and to calculate a summed departmental score. US News & World Report (USNWR) and Doximity rankings were recorded for each department. Social media utilization by individual neurosurgeons and associated ratings were investigated within the departments with the highest social media utilization. RESULTS Seventy-eight percent of academic neurosurgery departments utilized social media. The most prevalent platform was YouTube (49.1%), followed by Twitter (46.5%), Facebook (38.6%), and Instagram (16.7%). Higher patient ratings on PRWs were associated with the utilization of YouTube (p = 0.048) or Twitter (p = 0.02). The number of social media platforms utilized demonstrated a significant, positive correlation with patient ratings (p = 0.006) and summed patient ratings (p = 0.048). Although USNWR (p = 0.02) and Doximity (p = 0.0008) rankings correlated with patient ratings, only the number of social media platforms utilized remained a significant predictor of patient ratings on multivariate analysis (p = 0.0001).
Thirty-one percent of academic neurosurgeons from departments with high social media utilization were active on social media. The most prevalent social media platform among individual neurosurgeons was Twitter (27.4%), followed by Instagram (8.4%), Facebook (4.9%), and YouTube (2.2%). Higher summed patient scores were associated with individual neurosurgeon utilization of YouTube (p = 0.04), Facebook (p < 0.0001), and Instagram (p = 0.01). Increased social media utilization among neurosurgeons was correlated with a greater number of patient reviews (p = 0.006) and higher summed patient scores (p = 0.003). On multivariate analysis, only Facebook use remained a significant predictor of the number of patient reviews received (p = 0.002) and summed patient satisfaction scores (p < 0.001). CONCLUSIONS An increased social media presence is associated with higher ratings on PRWs. As neurosurgeons continue to expand their online presence, they should be aware of the possible impact of social media on online patient reviews.
Affiliation(s)
- Jonathan B Lamano: Northwestern University, Feinberg School of Medicine, Chicago, Illinois
- Robert A Riestenberg: Department of Neurological Surgery, University of California, Davis, Sacramento, California
- Aden P Haskell-Mendoza: Department of Neurological Surgery, University of California, Davis, Sacramento, California
- Dennis Lee: Department of Neurological Surgery, University of California, Davis, Sacramento, California
- Michael T Sharp: Department of Neurological Surgery, University of California, Davis, Sacramento, California
- Orin Bloch: Department of Neurological Surgery, University of California, Davis, Sacramento, California
7. Emmert M, McLennan S. One Decade of Online Patient Feedback: Longitudinal Analysis of Data From a German Physician Rating Website. J Med Internet Res 2021;23:e24229. PMID: 34309579; PMCID: PMC8367114; DOI: 10.2196/24229.
Abstract
Background Feedback from patients is an essential element of a patient-oriented health care system. Physician rating websites (PRWs) are a key way patients can provide feedback online. This study analyzes an entire decade of online ratings for all medical specialties on a German PRW. Objective The aim of this study was to examine how ratings posted on a German PRW have developed over the past decade. In particular, it aimed to explore (1) the distribution of ratings according to time-related aspects (year, month, day of the week, and hour of the day) between 2010 and 2019, (2) the number of physicians with ratings, (3) the average number of ratings per physician, (4) the average rating, (5) whether differences exist between medical specialties, and (6) the characteristics of the patients rating physicians. Methods All scaled-survey online ratings that were posted on the German PRW jameda between 2010 and 2019 were obtained. Results In total, 1,906,146 ratings were posted on jameda between 2010 and 2019 for 127,921 physicians. The number of rated physicians increased constantly, from 19,305 in 2010 to 82,511 in 2018. The average number of ratings per rated physician increased from 1.65 (SD 1.56) in 2010 to 3.19 (SD 4.69) in 2019. Overall, 75.2% (1,432,624/1,906,146) of all ratings were in the best rating category of "very good," and 5.7% (107,912/1,906,146) were in the lowest category of "insufficient." However, the mean of all ratings was 1.76 (SD 1.53) on the German school-grade 6-point rating scale (1 being the best), with a relatively constant distribution over time. General practitioners, internists, and gynecologists received the highest numbers of ratings (343,242, 266,899, and 232,914, respectively). Male patients, those of higher age, and those covered by private health insurance gave significantly (P<.001) more favorable evaluations than their counterparts. Physicians with a lower number of ratings tended to receive ratings across the rating scale, while physicians with a higher number of ratings tended to have better ratings. Physicians with between 21 and 50 online ratings received the lowest ratings (mean 1.95, SD 0.84), while physicians with >100 ratings received the best ratings (mean 1.34, SD 0.47). Conclusions This study is one of the most comprehensive analyses of PRW ratings to date. More than half of all German physicians have been rated on jameda each year since 2016, and the overall average number of ratings per rated physician nearly doubled over the decade. Nevertheless, we also observed a decline in the number of ratings over the last 2 years. Future studies should investigate the most recent developments in the number of ratings on other German and international PRWs, as well as reasons for the heterogeneity in online ratings by medical specialty.
Affiliation(s)
- Martin Emmert: Institute for Healthcare Management & Health Sciences, University of Bayreuth, Bayreuth, Germany
- Stuart McLennan: Institute of History and Ethics in Medicine, Technical University of Munich, Munich, Germany; Institute for Biomedical Ethics, University of Basel, Basel, Switzerland
8. Pruvis TE, Holzman S, Hess DK, Levin SC, Maher DP. Online Ratings of Pain Physicians in a Regional Population: What Matters? Pain Med 2021;21:1743-1748. PMID: 32626891; DOI: 10.1093/pm/pnaa173.
Affiliation(s)
- Samuel Holzman: Division of Infectious Diseases, Department of Internal Medicine, Johns Hopkins School of Medicine, Baltimore, Maryland, USA
- Demere Kasper Hess: Division of Chronic Pain Management, Department of Anesthesiology and Critical Care Medicine, Johns Hopkins School of Medicine, Baltimore, Maryland, USA
- Steven C Levin: Division of Chronic Pain Management, Department of Anesthesiology and Critical Care Medicine, Johns Hopkins School of Medicine, Baltimore, Maryland, USA
- Dermot P Maher: Division of Chronic Pain Management, Department of Anesthesiology and Critical Care Medicine, Johns Hopkins School of Medicine, Baltimore, Maryland, USA
9. Basa K, Jabbour N, Rohlfing M, Schmoker S, Lawlor CM, Levi J, Sobin L, Tracy JC, Tracy LF. Online Reputations: Comparing Hospital- and Patient-Generated Ratings in Academic Otolaryngology. Ann Otol Rhinol Laryngol 2021;130:1317-1325. PMID: 33813874; DOI: 10.1177/00034894211005985.
Abstract
OBJECTIVES This study compares hospital-generated online ratings to patient-generated online ratings in academic otolaryngology and evaluates physician factors influencing these results. METHODS Websites of academic otolaryngologists were assessed for inclusion of hospital-generated Press Ganey surveys. Corresponding scores on Healthgrades and Vitals.com were identified via internet search. Hospital ratings were compared with patient-generated ratings, including score, demographics, and number of ratings. All data were collected between July 15 and August 22, 2019. RESULTS A total of 742 academic otolaryngologists with hospital-generated ratings were identified. The mean hospital-generated rating (4.70, 95% CI 4.69-4.72) was significantly higher than the patient-generated ratings (Vitals: 4.26, 95% CI 4.18-4.34; Healthgrades: 4.02, 95% CI 3.87-4.18; P < .001). Among patient-generated ratings, a higher number of ratings (>20) was associated with male gender, professor rank, and >30 years in practice (P < .005). Physician demographics did not affect the number of hospital-generated ratings. In patient-generated ratings, a lower aggregate score was associated with professor rank (P = .001); in hospital-generated ratings, a lower score was associated with >30 years in practice (P = .023). Across all platforms, comprehensive otolaryngologists and neurotologists/otologists were rated lower than other subspecialties (Press Ganey: P < .001; Vitals: P = .027; Healthgrades: P = .016). CONCLUSION Hospital-generated ratings yield higher mean scores than patient-generated platforms. Between sources, Healthgrades.com scores were lower than those of Vitals.com. Professors with >30 years of practice generated more reviews in patient-generated ratings, and these physicians were generally rated lower. Access to patient-generated ratings is universal, and physicians should be aware of variability between online rating platforms, as scores may affect referrals and practice patterns.
Affiliation(s)
- Krystyne Basa: Department of Otolaryngology-Head and Neck Surgery, Boston Medical Center, Boston, MA, USA
- Matthew Rohlfing: Department of Otolaryngology-Head and Neck Surgery, Boston Medical Center, Boston, MA, USA
- Claire M Lawlor: Department of Otolaryngology-Head and Neck Surgery, Children's National Medical Center, Washington, DC, USA
- Jessica Levi: Department of Otolaryngology-Head and Neck Surgery, Boston Medical Center, Boston, MA, USA; Boston University School of Medicine, Boston, MA, USA
- Lindsay Sobin: Department of Otolaryngology-Head and Neck Surgery, University of Massachusetts Medical School, Worcester, MA, USA
- Jeremiah C Tracy: Department of Otolaryngology-Head and Neck Surgery, Tufts Medical Center, Boston, MA, USA
- Lauren F Tracy: Department of Otolaryngology-Head and Neck Surgery, Boston Medical Center, Boston, MA, USA; Boston University School of Medicine, Boston, MA, USA
10. Correlates of Google Search Rankings for Spine Surgeons: An Analysis of Academic Pedigree, Social Media Presence, and Patient Ratings. Spine (Phila Pa 1976) 2020;45:1376-1381. PMID: 32453226; DOI: 10.1097/brs.0000000000003567.
Abstract
STUDY DESIGN Prospective observational study. OBJECTIVE The objective of this study was to identify correlates of search ranking among academic pedigree, online ratings, and social media following. SUMMARY OF BACKGROUND DATA Patients increasingly rely on online search when selecting healthcare providers. When choosing a spine surgeon, patients typically value surgical skill and experience as well as demeanor/bedside manner. It is unclear whether current search engine ranking algorithms reflect these preferences. METHODS A Google.com search for the top 25 spine surgeon websites by search ranking was conducted for each of the 25 largest American cities. Resulting websites were then reviewed for academic pedigree, experience, and practice characteristics. Surgeons' research output and impact were quantified via number of publications and H-index. Online ratings and followers on various social media outlets were also noted. These variables were assessed as possible correlates of search ranking via linear regression and multivariate analyses of variance. RESULTS A total of 625 surgeons were included. Three categorical variables were identified as significant correlates of higher mean Google search ranking: orthopedics (vs. neurosurgery) as a surgical specialty (P = 0.023), board certification (P = 0.024), and graduation from a top-40 residency program (P = 0.046). Although the majority of the identified surgeons received an allopathic medical education, there was no significant difference in the mean rank of surgeons who held an MD versus a DO degree (P = 0.530). Additionally, none of the continuous variables collected, including years in practice (P = 0.947), publications (P = 0.527), H-index (P = 0.278), social media following such as on Facebook (P = 0.105), or online ratings such as on Healthgrades (P = 0.080), were significant correlates of Google search ranking. CONCLUSIONS Google search rankings do not always align with patient preferences, currently promoting orthopedic over neurosurgical specialists, graduation from top residency programs, and board certification, while largely ignoring academic pedigree, research, social media presence, and online ratings. LEVEL OF EVIDENCE 3.
11. Goshtasbi K, Lehrich BM, Abouzari M, Bazyani D, Abiri A, Papagiannopoulos P, Tajudeen BA, Kuan EC. Academic Rhinologists' Online Rating and Perception, Scholarly Productivity, and Industry Payments. Am J Rhinol Allergy 2020;35:341-347. PMID: 32915651; DOI: 10.1177/1945892420958366.
Abstract
INTRODUCTION The emergence of popular online rating websites, social media platforms, and public databases for industry payments and scholarly output provides a comprehensive picture of a physician's online presence, which may guide patient choice and satisfaction. METHODS Websites of all U.S. otolaryngology academic institutions were queried for fellowship-trained rhinologists. Additional well-known and academically active rhinologists were identified by the senior author. Online ratings and comments were collected from the Google, Healthgrades, Vitals, and RateMD websites, and weighted rating scores (RS) were calculated on a 1-5 scale. RESULTS A total of 210 rhinologists with 16 ± 9 years of practice were included, for whom 6901 online ratings (33 ± 47 per rhinologist) yielded an average RS of 4.3 ± 0.6. RS did not differ by gender (p = 0.58), geographic quartile (p = 0.48), social media presence (p = 0.41), or attendance at a top-ranked medical school (p = 0.86) or residency program (p = 0.89). Years of practice negatively correlated with RS (R = -0.22, p < 0.01), and academic rank significantly influenced RS, with professors, associate professors, and assistant professors scoring 4.1 ± 0.6, 4.3 ± 0.4, and 4.4 ± 0.6, respectively (p = 0.03). Of the 3,304 narrative comments analyzed (3.1 ± 11.6 per rhinologist), 76% (positive) and 7% (negative) contained elements of clinical knowledge/outcomes, 56% (positive) and 7% (negative) of communication/bedside manner, and 9% (positive) and 7% (negative) of office staff, cost, and wait time. All negative comment categories had moderate negative correlations with RS, while positive comment categories regarding knowledge/competence and bedside manner correlated weakly with higher RS. Number of publications (48 ± 54) positively correlated with 2018 industry payments ($11,384 ± $19,025) among those receiving industry compensation >$300 (n = 113). Attending a top-ranked medical school was associated with higher industry payments (p < 0.01) and H-index (p = 0.02). CONCLUSION Academic rhinologists' online RS was not associated with gender, geographic location, or attending a top-ranked training program, whereas their scholarly productivity was significantly correlated with total industry payments.
Affiliation(s)
- Khodayar Goshtasbi: Department of Otolaryngology-Head and Neck Surgery, University of California, Irvine, California
- Brandon M Lehrich: Department of Otolaryngology-Head and Neck Surgery, University of California, Irvine, California
- Mehdi Abouzari: Department of Otolaryngology-Head and Neck Surgery, University of California, Irvine, California
- Dariush Bazyani: Department of Otolaryngology-Head and Neck Surgery, University of California, Irvine, California
- Arash Abiri: Department of Otolaryngology-Head and Neck Surgery, University of California, Irvine, California
- Bobby A Tajudeen: Department of Otolaryngology-Head and Neck Surgery, Rush University, Chicago, Illinois
- Edward C Kuan: Department of Otolaryngology-Head and Neck Surgery, University of California, Irvine, California
12. Freundlich RE, Li G, Grant B, St Jacques P, Sandberg WS, Ehrenfeld JM, Shotwell MS, Wanderer JP. Patient satisfaction survey scores are not an appropriate metric to differentiate performance among anesthesiologists. J Clin Anesth 2020;65:109814. PMID: 32388457; DOI: 10.1016/j.jclinane.2020.109814.
Abstract
STUDY OBJECTIVE With the focus on patient-centered care in healthcare organizations, patient satisfaction plays an increasingly important role in healthcare quality measurement. We sought to determine whether an automated patient satisfaction survey could be effectively used to identify outlying anesthesiologists. DESIGN Retrospective observational study. SETTING Vanderbilt University Medical Center (VUMC). MEASUREMENTS Patient satisfaction data were obtained between October 24, 2016 and November 1, 2017. A multivariable ordered probit regression was conducted on the mean scores of responses to Likert-scale questions on SurveyVitals' Anesthesia Patient Satisfaction Questionnaire 2. Fixed effects included demographics, clinical variables, providers, and surgeons. Hypothesis tests comparing each individual anesthesiologist with the median-performing anesthesiologist were conducted. MAIN RESULTS We analyzed 10,528 surveys, with a 49.5% overall response rate. Younger patient age (odds ratio (OR) 1.011 [per year of age]; 95% confidence interval (CI) 1.008 to 1.014; p < 0.001), regional anesthesia (versus general anesthesia) (OR 1.695; 95% CI 1.186 to 2.422; p = 0.004), and daytime surgery (versus nighttime surgery) (OR 1.795; 95% CI 1.091 to 2.959; p = 0.035) were associated with higher satisfaction scores. Compared with the median-ranked anesthesiologist, the adjusted odds ratio for an increase in satisfaction score ranged from 0.346 (95% CI 0.158 to 0.762) to 1.649 (95% CI 0.687 to 3.956) for the lowest- and highest-scoring providers, respectively. Only 10.10% of anesthesiologists at our institution had an odds ratio for satisfaction with a 95% CI not inclusive of 1. CONCLUSIONS Patient satisfaction is influenced by multiple factors, and after adjusting for confounding there was very little information in patient satisfaction scores with which to discriminate among providers. While patient satisfaction scores may facilitate identification of extreme outliers among anesthesiologists, there is no evidence that this metric is useful for the routine evaluation of individual provider performance.
Affiliation(s)
- Robert E Freundlich
- Department of Anesthesiology, Department of Biomedical Informatics, Vanderbilt University Medical Center, United States of America
- Gen Li
- Department of Anesthesiology, Vanderbilt University Medical Center, United States of America
- Paul St Jacques
- Department of Anesthesiology, Department of Biomedical Informatics, Vanderbilt University Medical Center, United States of America
- Warren S Sandberg
- Department of Anesthesiology, Department of Biomedical Informatics, Department of Surgery, Vanderbilt University Medical Center, United States of America
- Jesse M Ehrenfeld
- Department of Anesthesiology, Medical College of Wisconsin, United States of America
- Matthew S Shotwell
- Department of Biostatistics, Department of Anesthesiology, Vanderbilt University Medical Center, United States of America
- Jonathan P Wanderer
- Department of Anesthesiology, Department of Biomedical Informatics, Vanderbilt University Medical Center, United States of America
|
13
|
Dahdaleh NS. In Reply: Online Ratings of Neurosurgeons: An Examination of Web Data and its Implications. Neurosurgery 2019; 85:E166. [PMID: 31049570 DOI: 10.1093/neuros/nyz119] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/13/2022] Open
Affiliation(s)
- Nader S Dahdaleh
- Department of Neurological Surgery, Northwestern University Feinberg School of Medicine, Chicago, Illinois
|
14
|
Vilanilam GC. Letter: Online Ratings of Neurosurgeons: An Examination of Web Data and its Implications. Neurosurgery 2019; 85:E165. [PMID: 31044233 DOI: 10.1093/neuros/nyz118] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/12/2022] Open
Affiliation(s)
- George C Vilanilam
- Department of Neurosurgery, Sree Chitra Tirunal Institute for Medical Sciences and Technology, Trivandrum, India
|
15
|
Liu C, Uffenheimer M, Nasseri Y, Cohen J, Ellenhorn J. "But His Yelp Reviews Are Awful!": Analysis of General Surgeons' Yelp Reviews. J Med Internet Res 2019; 21:e11646. [PMID: 31038463 PMCID: PMC6658237 DOI: 10.2196/11646] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/23/2018] [Revised: 11/11/2018] [Accepted: 01/23/2019] [Indexed: 01/27/2023] Open
Abstract
Background Patients use Web-based platforms to review general surgeons. However, little is known about the free-form text and structured content of the reviews or how they relate to the physicians' characteristics or their practices. Objective This observational study aimed to analyze the Web-based reviews of general surgeons on the west side of Los Angeles. Methods Each surgeon's demographics, practice characteristics, and Web-based presence were recorded. We evaluated the frequency and types of Yelp reviews and assigned negative remarks to 5 categories. Tabulated results were evaluated using independent t test, one-way analysis of variance, and Pearson correlation analysis to determine associations between the number of total and negative reviews with respect to practice structure and physician characteristics. Results Of the 146 general surgeons, 51 (35%) had at least 1 review and 29 (20%) had at least 1 negative review. There were 806 total reviews: 679 (84.2%) positive and 127 (15.8%) negative. The negative reviews contained a total of 376 negative remarks, categorized into physician demeanor (124/376, 32.9%), clinical outcomes (81/376, 22%), office or staff (83/376, 22%), scheduling (44/376, 12%), and billing (44/376, 12%). Surgeons with a professional website had significantly more reviews than those without (P=.003). Surgeons in private practice had significantly more reviews (P=.002) and more negative reviews (P=.03) than surgeons who were institution employed. A strong and direct correlation was found between a surgeon's number of reviews and number of negative reviews (P<.001). Conclusions As the most common category of complaints concerned physician demeanor, surgeons may optimize their Web-based reputation by improving their bedside manner. A surgeon's Web presence, private practice, and the total number of reviews are significantly associated with both positive and negative reviews.
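The correlation analysis this abstract describes can be sketched in a few lines. This uses simulated data, not the study's dataset: it assumes review counts per surgeon and a roughly 16% negative-review rate (matching the study's overall 127/806 split) to show how a Pearson correlation between total and negative review counts would be computed.

```python
# Illustrative sketch (simulated data, not the study's dataset): correlating
# each surgeon's total review count with their negative review count.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(42)
n_surgeons = 51                                   # surgeons with >= 1 review
total = rng.poisson(lam=15, size=n_surgeons) + 1  # total reviews per surgeon
# assume ~16% of each surgeon's reviews are negative, as in the overall split
negative = rng.binomial(total, 0.16)

r, p = pearsonr(total, negative)
print(f"r = {r:.2f}, p = {p:.4g}")
```

Because negative counts are drawn as a fixed fraction of totals, the simulated correlation is strongly positive, which is the same "more reviews, more negative reviews" pattern the study reports.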
Affiliation(s)
- Cynthia Liu
- The Surgery Group of Los Angeles, Research Foundation, Los Angeles, CA, United States
- Meka Uffenheimer
- The Surgery Group of Los Angeles, Research Foundation, Los Angeles, CA, United States
- Yosef Nasseri
- The Surgery Group of Los Angeles, Research Foundation, Los Angeles, CA, United States
- Jason Cohen
- The Surgery Group of Los Angeles, Research Foundation, Los Angeles, CA, United States
- Joshua Ellenhorn
- The Surgery Group of Los Angeles, Research Foundation, Los Angeles, CA, United States
|
16
|
Hong YA, Liang C, Radcliff TA, Wigfall LT, Street RL. What Do Patients Say About Doctors Online? A Systematic Review of Studies on Patient Online Reviews. J Med Internet Res 2019; 21:e12521. [PMID: 30958276 PMCID: PMC6475821 DOI: 10.2196/12521] [Citation(s) in RCA: 51] [Impact Index Per Article: 10.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/17/2018] [Revised: 12/16/2018] [Accepted: 01/31/2019] [Indexed: 01/20/2023] Open
Abstract
Background The number of patient online reviews (PORs) has grown significantly, and PORs have played an increasingly important role in patients' choice of health care providers. Objective The objective of our study was to systematically review studies on PORs, summarize the major findings and study characteristics, identify literature gaps, and make recommendations for future research. Methods A major database search was completed in January 2019. Studies were included if they (1) focused on PORs of physicians and hospitals, (2) reported qualitative or quantitative results from analysis of PORs, and (3) were peer-reviewed empirical studies. Study characteristics and major findings were synthesized using predesigned tables. Results A total of 63 studies (69 articles) that met the above criteria were included in the review. Most studies (n=48) were conducted in the United States, including Puerto Rico, and the remaining were from Europe, Australia, and China. Earlier studies (published before 2010) used content analysis with small sample sizes; more recent studies retrieved and analyzed larger datasets using machine learning technologies. The number of PORs ranged from fewer than 200 to over 700,000. About 90% of the studies focused on clinicians, typically specialists such as surgeons; 27% covered health care organizations, typically hospitals; and some studied both. A majority of PORs were positive, and patients' comments on their providers were favorable. Although most studies were descriptive, some compared PORs with traditional surveys of patient experience and found a high degree of correlation, while others compared PORs with clinical outcomes and found a low level of correlation. Conclusions PORs contain valuable information that can generate insights into quality of care and the patient-provider relationship, but they have not been systematically used for studies of health care quality. With the advancement of machine learning and data analysis tools, we anticipate more research on PORs based on testable hypotheses and rigorous analytic methods. Trial Registration International Prospective Register of Systematic Reviews (PROSPERO) CRD42018085057; https://www.crd.york.ac.uk/PROSPERO/display_record.php?RecordID=85057 (Archived by WebCite at http://www.webcitation.org/76ddvTZ1C)
Affiliation(s)
- Y Alicia Hong
- Department of Health Administration and Policy, George Mason University, Fairfax, VA, United States; School of Public Health, Texas A&M University, College Station, TX, United States
- Chen Liang
- Arnold School of Public Health, University of South Carolina, Columbia, SC, United States
- Tiffany A Radcliff
- School of Public Health, Texas A&M University, College Station, TX, United States
- Lisa T Wigfall
- Department of Health and Kinesiology, Texas A&M University, College Station, TX, United States
- Richard L Street
- Department of Communication, Texas A&M University, College Station, TX, United States
|
17
|
Evans RW. Negative Online Patient Reviews in Headache Medicine. Headache 2018; 58:1435-1441. [DOI: 10.1111/head.13419] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/28/2018] [Revised: 07/08/2018] [Accepted: 07/08/2018] [Indexed: 12/01/2022]
|