1
Ho RA, Shaari AL, Cowan PT, Yan K. ChatGPT Responses to Frequently Asked Questions on Ménière's Disease: A Comparison to Clinical Practice Guideline Answers. OTO Open 2024;8:e163. PMID: 38974175; PMCID: PMC11225079; DOI: 10.1002/oto2.163.
Abstract
Objective: Evaluate the quality of responses from Chat Generative Pre-Trained Transformer (ChatGPT) models compared to the answers for "Frequently Asked Questions" (FAQs) from the American Academy of Otolaryngology-Head and Neck Surgery (AAO-HNS) Clinical Practice Guidelines (CPG) for Ménière's disease (MD).
Study Design: Comparative analysis.
Setting: The AAO-HNS CPG for MD includes FAQs that clinicians can give to patients for MD-related questions. The ability of ChatGPT to properly educate patients regarding MD is unknown.
Methods: ChatGPT-3.5 and 4.0 were each prompted with 16 questions from the MD FAQs. Each response was rated in terms of (1) comprehensiveness, (2) extensiveness, (3) presence of misleading information, and (4) quality of resources. Readability was assessed using the Flesch-Kincaid Grade Level (FKGL) and Flesch Reading Ease Score (FRES).
Results: ChatGPT-3.5 was comprehensive in 5 responses whereas ChatGPT-4.0 was comprehensive in 9 (31.3% vs 56.3%, P = .2852). ChatGPT-3.5 and 4.0 were extensive in all responses (P = 1.0000). ChatGPT-3.5 was misleading in 5 responses whereas ChatGPT-4.0 was misleading in 3 (31.3% vs 18.75%, P = .6851). ChatGPT-3.5 had quality resources in 10 responses whereas ChatGPT-4.0 had quality resources in 16 (62.5% vs 100%, P = .0177). The AAO-HNS CPG FRES (62.4 ± 16.6) met the recommended readability score of at least 60, while both ChatGPT-3.5 (39.1 ± 7.3) and 4.0 (42.8 ± 8.5) failed to meet this standard. All platforms had mean FKGLs that exceeded the recommended level of 6 or lower.
Conclusion: While ChatGPT-4.0 had significantly better resource reporting, both models have room for improvement in being more comprehensive, more readable, and less misleading for patients.
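The FRES "at least 60" and FKGL "6 or lower" targets used in this study come from two standard published formulas that combine sentence length and syllable density. A minimal sketch of both, computed from pre-tallied counts (syllable tallying itself is heuristic and tool-dependent, so it is left out here):

```python
def flesch_reading_ease(words, sentences, syllables):
    """Flesch Reading Ease Score (FRES): higher is easier to read;
    a score of at least 60 is the plain-language target cited above."""
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

def flesch_kincaid_grade(words, sentences, syllables):
    """Flesch-Kincaid Grade Level (FKGL): approximate US school grade;
    patient materials are recommended to score 6 or lower."""
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

# e.g. a 100-word passage with 5 sentences and 150 syllables:
# FRES  = 206.835 - 1.015*20 - 84.6*1.5 ≈ 59.6 (just misses the 60 target)
# FKGL  = 0.39*20 + 11.8*1.5 - 15.59   ≈ 9.9  (well above the grade-6 target)
```

Because both formulas weight syllables-per-word heavily, different syllable counters can shift scores by a grade level or more, which is worth remembering when comparing results across studies.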
Affiliation(s)
- Rebecca A. Ho
- Department of Otolaryngology–Head and Neck Surgery, Rutgers New Jersey Medical School, Newark, New Jersey, USA
- Ariana L. Shaari
- Department of Otolaryngology–Head and Neck Surgery, Rutgers New Jersey Medical School, Newark, New Jersey, USA
- Paul T. Cowan
- Department of Otolaryngology–Head and Neck Surgery, Rutgers New Jersey Medical School, Newark, New Jersey, USA
- Kenneth Yan
- Department of Otolaryngology–Head and Neck Surgery, Rutgers New Jersey Medical School, Newark, New Jersey, USA
2
Connors C, Gupta K, Khusid JA, Khargi R, Yaghoubian AJ, Levy M, Gallante B, Atallah W, Gupta M. Evaluation of the Current Status of Artificial Intelligence for Endourology Patient Education: A Blind Comparison of ChatGPT and Google Bard Against Traditional Information Resources. J Endourol 2024. PMID: 38441078; DOI: 10.1089/end.2023.0696.
Abstract
Introduction: Artificial intelligence (AI) platforms such as ChatGPT and Bard are increasingly utilized to answer patient health care questions. We present the first study to blindly evaluate AI-generated responses to common endourology patient questions against official patient education materials.
Methods: Thirty-two questions and answers spanning kidney stones, ureteral stents, benign prostatic hyperplasia (BPH), and upper tract urothelial carcinoma were extracted from official Urology Care Foundation (UCF) patient education documents. The same questions were input into ChatGPT 4.0 and Bard, limiting responses to within ±10% of the word count of the corresponding UCF response to ensure fair comparison. Six endourologists blindly evaluated responses from each platform using Likert scales for accuracy, clarity, comprehensiveness, and patient utility. Reviewers also identified which response they believed was not AI generated. Finally, Flesch-Kincaid Reading Grade Level formulas assessed the readability of each platform's responses. Ratings were compared using analysis of variance (ANOVA) and chi-square tests.
Results: ChatGPT responses were rated highest across all categories (accuracy, comprehensiveness, clarity, and patient utility), while UCF answers were consistently scored lowest (all p < 0.01). A subanalysis revealed that this trend was consistent across question categories (i.e., kidney stones, BPH, etc.). However, AI-generated responses were more likely to be classified at an advanced reading level, while UCF responses showed improved readability (college or higher reading level: ChatGPT = 100%, Bard = 66%, and UCF = 19%; p < 0.001). When asked to identify which answer was not AI generated, 54.2% of responses indicated ChatGPT, 26.6% indicated Bard, and only 19.3% correctly identified the UCF response.
Conclusions: In a blind evaluation, AI-generated responses from ChatGPT and Bard surpassed the quality of official patient education materials in endourology, suggesting that current AI platforms are already a reliable resource for basic urologic care information. AI-generated responses do, however, tend to require a higher reading level, which may limit their applicability to a broader audience.
Affiliation(s)
- Christopher Connors
- Department of Urology, Icahn School of Medicine at Mount Sinai, New York, New York, USA
- Kavita Gupta
- Department of Urology, Icahn School of Medicine at Mount Sinai, New York, New York, USA
- Johnathan A Khusid
- Department of Urology, Icahn School of Medicine at Mount Sinai, New York, New York, USA
- Raymond Khargi
- Department of Urology, Icahn School of Medicine at Mount Sinai, New York, New York, USA
- Alan J Yaghoubian
- Department of Urology, David Geffen School of Medicine at University of California, Los Angeles, California, USA
- Micah Levy
- Department of Urology, Icahn School of Medicine at Mount Sinai, New York, New York, USA
- Blair Gallante
- Department of Urology, Icahn School of Medicine at Mount Sinai, New York, New York, USA
- William Atallah
- Department of Urology, Icahn School of Medicine at Mount Sinai, New York, New York, USA
- Mantu Gupta
- Department of Urology, Icahn School of Medicine at Mount Sinai, New York, New York, USA
3
Gbedemah ZEE, Fuseini MSN, Fordjuor SKEJ, Baisie-Nkrumah EJ, Beecham RMEM, Amissah-Arthur KN. Readability and Quality of Online Information on Sickle Cell Retinopathy for Patients. Am J Ophthalmol 2024;259:45-52. PMID: 37918780; DOI: 10.1016/j.ajo.2023.10.023.
Abstract
PURPOSE: This study aims to evaluate the readability and quality of Internet-based health information on sickle cell retinopathy.
DESIGN: Retrospective cross-sectional website analysis.
METHODS: To simulate a patient's online search, the terms "sickle cell retinopathy" and "sickle cell disease in the eye" were entered into the top 3 search engines (Google, Bing, and Yahoo). The first 20 results of each search were retrieved and screened for analysis. The DISCERN questionnaire, the Journal of the American Medical Association (JAMA) standards, and the Health on the Net (HON) criteria were used to evaluate the quality of the information. The Flesch-Kincaid Grade Level (FKGL), the Flesch Reading Ease Score (FRES), and the Automated Readability Index (ARI) were used to assess the readability of each website.
RESULTS: Of 16 online sources, 12 (75%) scored moderately on the DISCERN tool. The mean DISCERN score was 40.91 (SD, 10.39; maximum possible, 80). None of the sites met all of the JAMA benchmarks, and only 3 (18.75%) of the websites had HONcode certification. All of the websites scored above the target American Medical Association grade level of 6 on both the FKGL and ARI. The mean FRES was 57.76 (±4.61), below the recommended FRES of 80 to 90.
CONCLUSION: There is limited online information available on sickle cell retinopathy. Most included websites were fairly difficult to read and of substandard quality. The quality and readability of Internet-based, patient-focused information on sickle cell retinopathy need to be improved.
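The Automated Readability Index used alongside FKGL in this study differs from the Flesch formulas in that it works from character counts rather than syllables, which makes it trivially automatable. A minimal sketch with the published constants:

```python
def automated_readability_index(characters, words, sentences):
    """ARI: approximate US grade level. Uses letters/digits per word
    instead of syllables, so no syllable-counting heuristic is needed."""
    return 4.71 * (characters / words) + 0.5 * (words / sentences) - 21.43

# e.g. 450 characters over 100 words and 5 sentences:
# 4.71*4.5 + 0.5*20 - 21.43 ≈ 9.8, i.e. roughly a 10th-grade text,
# well above the grade-6 target applied in the study above
```

Because ARI and FKGL penalize long words and long sentences in similar proportions, a page that fails one target will usually fail the other, which is consistent with the study's finding that every site exceeded grade 6 on both.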
Affiliation(s)
- Zulfiya Emefa Edugle Gbedemah
- University of Ghana Medical School (Z.E.E.G., M.-S.N.F., S.K.E.J.F., E.J.B.-N., R.-M.E.M.B.), College of Health Sciences, Korle Bu Teaching Hospital, Accra, Ghana
- Mohammed-Sherrif Napari Fuseini
- University of Ghana Medical School (Z.E.E.G., M.-S.N.F., S.K.E.J.F., E.J.B.-N., R.-M.E.M.B.), College of Health Sciences, Korle Bu Teaching Hospital, Accra, Ghana
- Sam Kwaku Esson Jonah Fordjuor
- University of Ghana Medical School (Z.E.E.G., M.-S.N.F., S.K.E.J.F., E.J.B.-N., R.-M.E.M.B.), College of Health Sciences, Korle Bu Teaching Hospital, Accra, Ghana
- Eugene Jojo Baisie-Nkrumah
- University of Ghana Medical School (Z.E.E.G., M.-S.N.F., S.K.E.J.F., E.J.B.-N., R.-M.E.M.B.), College of Health Sciences, Korle Bu Teaching Hospital, Accra, Ghana
- Rya-Marie Esi Mensima Beecham
- University of Ghana Medical School (Z.E.E.G., M.-S.N.F., S.K.E.J.F., E.J.B.-N., R.-M.E.M.B.), College of Health Sciences, Korle Bu Teaching Hospital, Accra, Ghana
- Kwesi Nyan Amissah-Arthur
- Ophthalmology Unit (K.N.A.-A.), Department of Surgery, Korle Bu Teaching Hospital, College of Health Sciences, School of Medicine and Dentistry, University of Ghana, Accra, Ghana
4
Al-Kharouf KFK, Khan FI, Robertson GAJ. Assessing the readability of online information about Jones fracture. World J Methodol 2023;13:439-445. PMID: 38229937; PMCID: PMC10789098; DOI: 10.5662/wjm.v13.i5.439.
Abstract
BACKGROUND: Hand in hand with technological advancements, treatment modalities continue to grow. With the turn of the century, the internet has become the number one source of information for almost every topic, and many patients look toward it as their primary source of information about their medical conditions. The American Medical Association (AMA) and National Institutes of Health (NIH) strongly recommend that online medical information be written at the 6th- to 8th-grade level to aid comprehension by patients of all literacy backgrounds.
AIM: To assess the readability of online information regarding Jones fracture. Our hypothesis was that the reading level of medical information published on websites far exceeds the 6th- to 8th-grade level recommended by the AMA and NIH. The results of this study can help formulate improved recommendations for publishing more comprehensible material and thus, eventually, improve patient compliance and clinical outcomes.
METHODS: The exact phrase "Jones fracture" was queried on the three most common search engines, Google, Yahoo!, and Bing, on December 28, 2022. As of December 2022, Google held 84%, Bing 9%, and Yahoo! 2% of the worldwide search engine market share. Webpage uniform resource locators from the first three pages of search results were recorded from each search engine. These web pages were classified as academic, physician-sponsored, governmental and non-governmental organization (NGO), commercial, or unspecified according to formally defined categories. Websites associated with an educational institution or medical organization were classified as academic. Websites with products for sale, corporate sponsorship, or advertisements were classified as commercial. Governmental and NGO websites comprised those that received government subsidies or grants. Webpages independently owned by physicians or physician groups were classified as physician-sponsored. The remaining websites were classified as unspecified.
RESULTS: A total of 93 websites were analyzed for reading assessment. Commercial websites accounted for 44%, followed by physician-sponsored websites at 22% and NGO websites at 15%; academic websites held 9%, and unspecified sites 3%. The table illustrates mean readability scores along with the average cumulative grade level. The average grade level was 10.95 ± 2.28 across all websites, with a range of 6.18 to 18.90. Since P values were greater than 0.05, there was no statistically significant difference between first-page results and the results of all pages; readability scores are therefore consistent throughout all pages of a website.
CONCLUSION: Our study demonstrates that current online medical information regarding Jones fracture is written at an extraordinarily high grade level, with an average across all websites of 10.95, nearly an 11th-grade level, far exceeding the 6th- to 8th-grade level recommended by the AMA and NIH. This is particularly relevant because readability scores are directly proportional to the level of comprehension attained by readers, thus directly impacting patient outcomes. We suggest that all online reading materials be rewritten at the 6th- to 8th-grade level in a public service effort to increase compliance with treatment goals and raise awareness of preventive measures.
Affiliation(s)
- Faisal Idrees Khan
- Internal Medicine, Tunbridge Wells Hospital, Tunbridge Wells E10 5NJ, United Kingdom
- Greg AJ Robertson
- Orthopaedic Surgery, Queen Alexandra Hospital, Portsmouth PO6 3LY, United Kingdom
5
Thomas ND, Mahler R, Rohde M, Segovia N, Shea KG. Evaluating the Readability and Quality of Online Patient Education Materials for Pediatric ACL Tears. J Pediatr Orthop 2023;43:549-554. PMID: 37694607; DOI: 10.1097/bpo.0000000000002490.
Abstract
BACKGROUND: As the rate of anterior cruciate ligament (ACL) tears increases in children, the internet has become a major source of information and education. In the United States, the average adult reads at about an eighth-grade level, and the National Institutes of Health (NIH) recommends that patient education materials not exceed a sixth-grade reading level. The most accessed resources on the internet should therefore be created with this in mind. The purpose of this study is to assess the readability and quality of online patient resources for pediatric ACL tears.
METHODS: Google was queried using the term "Pediatric ACL Tear" on May 26, 2022. The most popular sites were identified through page one of the Google search. All content was evaluated to ensure information was directed toward patients. To determine reading difficulty, the most widely accepted readability tests (Flesch Reading Ease Index, Flesch-Kincaid Grade Level, and Gunning Fog Index) were calculated from plain text in Microsoft Word and from each URL in the online readability checker Readable.io.
RESULTS: The average grade level for all resources was above the recommended reading level based on both Microsoft Word and Readable.io calculations. Each source exceeded the NIH recommendation by 2.6 grade levels on average (mean grade-level readability was 8.6 ± 1.9). Four of the 6 sites were above the average US reading level, exceeding the eighth grade by an average of 1.5 grade levels. The 6 sites analyzed had a mean DISCERN score of 61.9, meeting the "good quality" criteria.
CONCLUSION: The most readily available online materials for pediatric ACL tears were of good quality but above both the NIH-recommended readability level and the average US adult reading level. With the increasing need for treatment of ACL tears in pediatric and adolescent patients and greater internet accessibility in these populations, it is important to consider the readability of these resources in support of increased health literacy and improved outcomes.
CLINICAL RELEVANCE: It is important for physicians treating young patients with ACL tears to be aware of all sources of information and support, including content shared online, as these platforms are increasingly utilized, especially by patients and families of lower socioeconomic status.
6
Hartnett DA, Philips AP, Daniels AH, Blankenhorn BD. Readability and quality of online information on total ankle arthroplasty. Foot (Edinb) 2023;54:101985. PMID: 36827889; DOI: 10.1016/j.foot.2023.101985.
Abstract
The internet is a frequently utilized resource for acquiring health information. This study examines the readability and quality of online information pertaining to total ankle arthroplasty (TAA). "Ankle arthroplasty" OR "ankle replacement" was queried in three search engines, and the first 3 pages of results were identified. The readability of sites was calculated using six readability algorithms: Flesch-Kincaid Grade Level, Flesch Reading Ease, Gunning Fog, SMOG, Coleman-Liau Index, and Automated Readability Index. Quality was assessed using the JAMA benchmark, Global Quality Score (GQS), and DISCERN instrument. A total of 62 relevant sites were analyzed. Sources were primarily physician-sponsored (50%) or academic (31%) websites. The mean readability indices were above the recommended sixth-grade reading level, with an average grade level across scoring tools of 13.22 ± 2.07; no sites were at or below a sixth-grade reading level. Quality ratings were subpar across assessment tools: JAMA = 1.9 ± 1.0 (range, 1-4) out of 4; GQS = 3.4 ± 1.0 (range, 1-5) out of 5; DISCERN = 54.0 ± 11.2 (range, 31-75) out of 80. The readability and quality of online information regarding ankle arthroplasty are not optimal for the average patient, and improvement would be valuable in cultivating shared decision-making.
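Beyond the Flesch formulas, the six-algorithm battery used here includes three more closed-form indices, all simple functions of text statistics. A sketch with the published constants ("complex" and "polysyllabic" both mean words of three or more syllables):

```python
import math

def gunning_fog(words, sentences, complex_words):
    # Gunning Fog: grade level from sentence length plus the share of
    # complex (3+ syllable) words
    return 0.4 * ((words / sentences) + 100.0 * (complex_words / words))

def smog_index(polysyllables, sentences):
    # SMOG: grade level from polysyllable density; formally defined for
    # samples of at least 30 sentences
    return 1.0430 * math.sqrt(polysyllables * (30.0 / sentences)) + 3.1291

def coleman_liau(letters, words, sentences):
    # Coleman-Liau: grade level from letters and sentences per 100 words,
    # avoiding syllable counting entirely
    L = 100.0 * letters / words      # letters per 100 words
    S = 100.0 * sentences / words    # sentences per 100 words
    return 0.0588 * L - 0.296 * S - 15.8
```

Running several indices and averaging, as this study does with its 13.22 figure, smooths out the metrics' differing sensitivities (sentence length vs. word length vs. syllable density).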
Affiliation(s)
- Davis A Hartnett
- Warren Alpert Medical School of Brown University, 222 Richmond St, Providence, RI 02903, USA
- Alexander P Philips
- Warren Alpert Medical School of Brown University, 222 Richmond St, Providence, RI 02903, USA
- Alan H Daniels
- Department of Orthopaedic Surgery, Warren Alpert Medical School of Brown University, 1 Kettle Point Ave, East Providence, RI 02914, USA
- Brad D Blankenhorn
- Department of Orthopaedic Surgery, Warren Alpert Medical School of Brown University, 1 Kettle Point Ave, East Providence, RI 02914, USA
7
Hartnett DA, Philips AP, Daniels AH, Blankenhorn BD. Readability of Online Foot and Ankle Surgery Patient Education Materials. Foot Ankle Spec 2022. PMID: 35934974; DOI: 10.1177/19386400221116463.
Abstract
Background: Online health education resources are frequently accessed by patients seeking information on orthopaedic conditions and procedures. The objectives of this study were to assess the readability of information provided by the American Orthopaedic Foot and Ankle Society (AOFAS) and compare current levels of readability with previous online material.
Methods: This study examined 115 articles classified as "Conditions" or "Treatments" on FootCareMD.org. Readability was assessed using six readability assessment tools: Flesch Reading Ease, Flesch-Kincaid Grade Level (FKGL), Gunning Fog Score, Simple Measure of Gobbledygook (SMOG) Index, Coleman-Liau Index, and the Automated Readability Index.
Results: The mean readability score across all metrics ranged from 9.1 to 12.1, corresponding to a 9th- to 12th-grade reading level, with a mean FKGL of 9.2 ± SD 1.1 (range: 6.3-15.0). No articles were written below the recommended US sixth-grade reading level, and only 3 articles were at or below an eighth-grade level. Treatment articles had higher mean readability grade levels than condition articles (P = .03).
Conclusion: Although the volume and quality of the AOFAS resource website have increased, the readability of its information has worsened since 2008 and remains higher than the recommended reading level for optimal comprehension by the general population.
Levels of Evidence: Level IV: retrospective quantitative analysis.
Affiliation(s)
- Davis A Hartnett
- The Warren Alpert Medical School of Brown University, Providence, Rhode Island (DAH, APP)
- Alexander P Philips
- The Warren Alpert Medical School of Brown University, Providence, Rhode Island (DAH, APP)
- Alan H Daniels
- Department of Orthopaedic Surgery, Warren Alpert Medical School of Brown University, Providence, Rhode Island (AHD, BDB)
- Brad D Blankenhorn
- Department of Orthopaedic Surgery, Warren Alpert Medical School of Brown University, Providence, Rhode Island (AHD, BDB)
8
Smith S, Jupiter DC, Panchbhavi VK, Chen J. Quality and Readability of Information Regarding Total Ankle Arthroplasty Available to Patients on the Internet. Foot Ankle Spec 2022. PMID: 35848229; DOI: 10.1177/19386400221109423.
Abstract
This study sought to evaluate the reliability, comprehensiveness, and readability of ankle arthroplasty information available on the Internet. We evaluated websites based on category, Journal of the American Medical Association (JAMA) criteria, Health on the Net (HON) code, DISCERN score, an author-created Ankle Replacement Index (ARI), and readability metrics. Based on the ARI, 80 (62.5%) websites provided poor information. The mean reading level was 8.96 ± 2.66, above the recommended sixth-grade reading level for patient information. Academic websites had the highest mean DISCERN, ARI, and JAMA scores and a midrange reading level. The government category had high DISCERN and JAMA scores, a fair ARI score, and the lowest reading level. We found significant correlations between website class and DISCERN score, as well as between HON code and DISCERN score. Our results suggest that academic and government websites provide more reliable, complete information than other categories, and that websites with an HON code contain more reliable information than those without. We recommend that physicians create handouts to point patients to reliable resources and encourage them to critically evaluate information they read online. We also encourage physicians to take part in evaluating and updating information on their practice websites. Level of Clinical Evidence: N/A.
Affiliation(s)
- Sydney Smith
- Department of Preventive Medicine and Population Health (DCJ), and Department of Orthopaedic Surgery and Rehabilitation (SS, DCJ, VKP, JC), The University of Texas Medical Branch, Galveston, Texas
- Daniel C Jupiter
- Department of Preventive Medicine and Population Health (DCJ), and Department of Orthopaedic Surgery and Rehabilitation (SS, DCJ, VKP, JC), The University of Texas Medical Branch, Galveston, Texas
- Vinod K Panchbhavi
- Department of Preventive Medicine and Population Health (DCJ), and Department of Orthopaedic Surgery and Rehabilitation (SS, DCJ, VKP, JC), The University of Texas Medical Branch, Galveston, Texas
- Jie Chen
- Department of Preventive Medicine and Population Health (DCJ), and Department of Orthopaedic Surgery and Rehabilitation (SS, DCJ, VKP, JC), The University of Texas Medical Branch, Galveston, Texas
9
Pruneski JA, Kiapour AM. The readability of online patient education materials for slipped capital femoral epiphysis. J Pediatr Orthop B 2022;31:e167-e173. PMID: 34908028; DOI: 10.1097/bpb.0000000000000943.
Abstract
Given the long-term complications of undiagnosed slipped capital femoral epiphysis (SCFE) and the importance of readable health information materials on positive, equitable health outcomes, the objective of this study was to determine if the online patient education materials regarding SCFE are written at or below accepted recommendations. The secondary objective was to determine whether the readability of these materials varied when stratified by the type of website. 'Slipped capital femoral epiphysis', 'SCFE', and 'slipped femoral head' were used as search queries in three common search engines. The readability of each website was evaluated using five established metrics, and the scores were compared by website type and by the complexity of the search query. In this study of 53 unique websites about SCFE, we demonstrated that only one of the web pages was written at the recommended sixth-grade level, and the mean reading level of the online material was above the 10th-grade level. Post hoc testing showed that only websites associated with pediatric academic institutions were written at a significantly lower grade level than general health websites [P < 0.05 for all, range (0.003, 0.04)]. The materials about SCFE that are available to patients and their families continue to be written at an inappropriate level. To increase accessibility and allow for equitable long-term health outcomes, physicians, universities, hospitals and medical societies must ensure that they produce readable education to increase patients' understanding of SCFE, its symptoms and available treatment options. Future studies evaluating progress regarding these metrics are warranted.
Affiliation(s)
- James A Pruneski
- Harvard Medical School
- Department of Orthopedic Surgery, Boston Children's Hospital, Boston, Massachusetts, USA
- Ata M Kiapour
- Harvard Medical School
- Department of Orthopedic Surgery, Boston Children's Hospital, Boston, Massachusetts, USA
10
Irwin SC, Lennon DT, Stanley CP, Sheridan GA, Walsh JC. Ankle conFUSION: The quality and readability of information on the internet relating to ankle arthrodesis. Surgeon 2021;19:e507-e511. PMID: 33451875; DOI: 10.1016/j.surge.2020.12.001.
Abstract
BACKGROUND: The internet is an important source of information for patients undergoing surgery, and multiple studies have identified inappropriately high reading levels of patient information online. The average reading level in the United States is 7th-8th grade, and multiple organisations have recommended that patient information not exceed the 6th-grade level. This study aims to evaluate the reading levels and quality of information regarding ankle fusion surgery online.
METHODS: Google, Bing and Yahoo were searched (MeSH "ankle fusion", "ankle arthrodesis") and the top 30 URLs analysed. Readability was assessed using an online readability calculator to produce 3 scores (Gunning FOG, Flesch Kincaid Grade and Flesch Reading Ease). Quality was assessed using a HONcode detection web-extension and the JAMA benchmark criteria.
RESULTS: Ninety-eight webpages were identified. The mean Flesch Kincaid Grade level was 9.24 ± 2.33 (95% CI 8.78-9.71). The mean Gunning FOG grade was 10.88 ± 3.1 (95% CI 10.26-11.5). The mean Flesch Reading Ease score was 49.88 ± 14.46 (95% CI 46.98-52.78). Seven webpages were at or below the 6th-grade reading level. The mean JAMA score was 1.34 ± 1.32 out of 4 (95% CI 1.07-1.6), and 14 websites were HONcode accredited.
CONCLUSION: The overall readability of medical information online is too high for the average patient. Given the important role that health literacy plays in patient-reported outcomes, improving the readability and quality of these materials is imperative, and awareness by the general public is essential for them to critically appraise the information they receive online.
Affiliation(s)
- Shane C Irwin
- Department of Orthopaedics, Beaumont Hospital, Dublin, Ireland
- David T Lennon
- Department of Orthopaedics, Beaumont Hospital, Dublin, Ireland
- James C Walsh
- Department of Orthopaedics, Beaumont Hospital, Dublin, Ireland