1
Vought V, Vought R, Sharma R, Razdan G, Zhu L, Sutariya R, Wagner RS. Evaluating Pediatric Ophthalmic Care Using Sentiment Analysis of Physician Review Sites. J Pediatr Ophthalmol Strabismus 2024; 61:211-218. [PMID: 38275203] [DOI: 10.3928/01913913-20240108-01]
Abstract
PURPOSE To assess patient satisfaction within pediatric ophthalmology and identify trends in patient sentiment. METHODS Pediatric ophthalmologists in the United States were identified using the American Association for Pediatric Ophthalmology and Strabismus member directory. Demographic data were recorded using publicly available websites. Online written reviews and star ratings were obtained from Healthgrades.com. A sentiment analysis package, Valence Aware Dictionary for Sentiment Reasoning (VADER), was used to generate a compound score for each review, and word frequency analyses were applied. RESULTS A total of 377 pediatric ophthalmologists (2,640 online reviews) were evaluated. Physicians received an average of 4.22/5 stars and a compound sentiment score of 0.56, indicating positive sentiment. No differences in scores were observed by gender or location, although physicians with fewer years in practice had higher star ratings than their peers (P < .001). The three most common words in the word frequency analysis of all reviews were "surgery," "staff," and "time," with heavy emphasis on bedside manner and addressing patient concerns. CONCLUSIONS This study demonstrates overall high patient satisfaction with pediatric ophthalmology care, with differences in sentiment based on physician demographic features. The study highlights that patient perspective is influenced by non-clinical features of care. These data may be used by pediatric ophthalmologists seeking to improve health care delivery. [J Pediatr Ophthalmol Strabismus. 2024;61(3):211-218.].
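For readers unfamiliar with VADER's compound score, the sketch below illustrates the core idea: word valences are summed and squashed into [-1, 1] with VADER's normalization constant (alpha = 15). The five-entry lexicon and the sample review are invented for illustration; the real analyzer uses a lexicon of thousands of rated terms and also handles negation, intensifiers, capitalization, and punctuation.

```python
import math

# Toy valence lexicon; the real VADER lexicon has thousands of rated entries.
LEXICON = {"great": 3.1, "caring": 2.2, "rude": -2.0, "wait": -0.5, "recommend": 1.8}

def compound_score(review: str, alpha: float = 15.0) -> float:
    """Sum word valences, then squash into [-1, 1] with VADER's normalization."""
    total = sum(LEXICON.get(w, 0.0) for w in review.lower().split())
    return total / math.sqrt(total * total + alpha)

print(round(compound_score("great caring doctor highly recommend"), 2))  # → 0.88
```

By the usual convention, a compound score of at least 0.05 is read as positive and at most -0.05 as negative, which is how a review-level average like the 0.56 reported above is interpreted.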
2
Elhusseiny AM, Hassan AK, Hassan MA, Eleiwa TK, Ali HT, Abdelnaem S, Chauhan MZ, Shaikh O, Khouri AS, Sallam AB. Quality, Reliability, Technical Quality, and Readability of Google Online Information on Childhood Glaucoma. J Pediatr Ophthalmol Strabismus 2024; 61:198-203. [PMID: 38112390] [DOI: 10.3928/01913913-20231114-01]
Abstract
PURPOSE To evaluate the quality, reliability, technical quality, and readability of online information related to childhood glaucoma. METHODS In this cross-sectional study, no human subjects were studied; the analysis covered online websites on childhood glaucoma. The terms "childhood glaucoma," "pediatric glaucoma," "congenital glaucoma," "buphthalmos," and "big eyes" were entered into the Google search engine and the first 100 search results were assessed for quality, reliability, technical quality, and readability. Peer-reviewed articles, patient forum posts, dictionary definitions, and websites that appeared as targeted ads, were not in English, or were not focused on humans were excluded. Each website was evaluated for (1) quality and reliability using the DISCERN, HONcode, and JAMA criteria; (2) technical quality, assessing 11 technical aspects; and (3) readability, using six separate criteria (Flesch Reading Ease Score, Flesch-Kincaid Grade Level, Gunning Fog Index score, the Simple Measure of Gobbledygook Index, Coleman-Liau Index, and Automated Readability Index). RESULTS The median scores for the DISCERN, HONcode, and JAMA criteria were 2.6 (range = 1 to 4.75; 1 = worst, 5 = best), 10 (range = 0 to 16; 0 = worst, 16 = best), and 2 (range = 0 to 4; 0 = worst, 4 = best), respectively. The median technical quality score was 0.7. Readability was poor among most websites, with a median Flesch-Kincaid Grade Level score of 9.3. The median Gunning Fog Index score was 9.8. JAMA scores and Gunning Fog Index scores were statistically significantly higher among private websites than among institutional websites. However, institutional websites had higher technical quality. CONCLUSIONS Online information on childhood glaucoma had poor to moderate quality and reliability. Technical quality was good; however, most websites' readability was above the recommended 5th to 6th grade reading level. [J Pediatr Ophthalmol Strabismus. 2024;61(3):198-203.].
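The grade-level instruments named in this abstract are simple functions of sentence length and syllable counts. The sketch below implements two of them with a rough vowel-run syllable heuristic; published tools (e.g. the textstat library) count syllables more carefully, so scores will deviate slightly, but the ranking of easy versus hard text is preserved.

```python
import re

def syllables(word: str) -> int:
    """Rough syllable count: runs of vowels, minus a silent trailing 'e'."""
    word = word.lower()
    groups = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and groups > 1:
        groups -= 1
    return max(1, groups)

def fk_grade(text: str) -> float:
    """Flesch-Kincaid Grade Level: 0.39*(words/sentence) + 11.8*(syllables/word) - 15.59."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    syl = sum(syllables(w) for w in words)
    return 0.39 * len(words) / sentences + 11.8 * syl / len(words) - 15.59

def gunning_fog(text: str) -> float:
    """Gunning Fog: 0.4 * (avg sentence length + % of words with 3+ syllables)."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    complex_words = sum(1 for w in words if syllables(w) >= 3)
    return 0.4 * (len(words) / sentences + 100 * complex_words / len(words))

simple = "The eye can hurt. See a doctor now."
hard = "Congenital glaucoma necessitates immediate ophthalmological evaluation."
print(fk_grade(simple) < fk_grade(hard))  # → True
```

Both formulas rise with longer sentences and longer words, which is why dense clinical prose scores many grades above plain-language text on the same topic.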
3
Eid K, Eid A, Wang D, Raiker RS, Chen S, Nguyen J. Optimizing Ophthalmology Patient Education via ChatBot-Generated Materials: Readability Analysis of AI-Generated Patient Education Materials and The American Society of Ophthalmic Plastic and Reconstructive Surgery Patient Brochures. Ophthalmic Plast Reconstr Surg 2024; 40:212-216. [PMID: 37972974] [DOI: 10.1097/iop.0000000000002549]
Abstract
PURPOSE This study aims to compare the readability of patient education materials (PEMs) of the American Society of Ophthalmic Plastic and Reconstructive Surgery to that of PEMs generated by the AI chatbots ChatGPT and Google Bard. METHODS PEMs on 16 common American Society of Ophthalmic Plastic and Reconstructive Surgery topics were generated by 2 AI models, ChatGPT 4.0 and Google Bard, with and without a 6th-grade reading level prompt modifier. The PEMs were analyzed using 7 readability metrics: Flesch Reading Ease Score, Gunning Fog Index, Flesch-Kincaid Grade Level, Coleman-Liau Index, Simple Measure of Gobbledygook Index Score, Automated Readability Index, and Linsear Write Readability Score. Each AI-generated PEM was compared with the equivalent American Society of Ophthalmic Plastic and Reconstructive Surgery PEM. RESULTS Across all readability indices, PEMs generated by unprompted ChatGPT 4.0 were consistently the most difficult to read (Flesch Reading Ease Score: 36.5; Simple Measure of Gobbledygook: 14.7). Google Bard generated content that was easier to read than both the American Society of Ophthalmic Plastic and Reconstructive Surgery and ChatGPT 4.0 (Flesch Reading Ease Score: 52.3; Simple Measure of Gobbledygook: 12.7). When prompted to produce PEMs at a 6th-grade reading level, both ChatGPT 4.0 and Bard significantly improved their readability scores, with prompted ChatGPT 4.0 consistently generating the easiest-to-read content (Flesch Reading Ease Score: 67.9; Simple Measure of Gobbledygook: 10.2). CONCLUSION This study suggests that AI tools, when guided by appropriate prompts, can generate accessible and comprehensible PEMs in the field of ophthalmic plastic and reconstructive surgery, balancing readability with the complexity of the necessary information.
Affiliation(s)
- Kevin Eid
- Department of Ophthalmology, Moran Eye Center, University of Utah, Salt Lake City, Utah, U.S.A
- Alen Eid
- Department of Ophthalmology and Visual Sciences, West Virginia University, Morgantown, West Virginia, U.S.A
- Diane Wang
- Department of Ophthalmology and Visual Sciences, West Virginia University, Morgantown, West Virginia, U.S.A
- Rahul S Raiker
- Department of Medical Education, West Virginia University, Morgantown, West Virginia, U.S.A
- Stephen Chen
- Department of Medical Education, West Virginia University, Morgantown, West Virginia, U.S.A
- John Nguyen
- Department of Ophthalmology and Visual Sciences, West Virginia University, Morgantown, West Virginia, U.S.A
- Department of Otolaryngology and Head and Neck Surgery, West Virginia University, Morgantown, West Virginia, U.S.A
4
Cohen SA, Pershing S. Readability and Accountability of Online Patient Education Materials for Common Retinal Diseases. Ophthalmol Retina 2022; 6:641-643. [PMID: 35338025] [PMCID: PMC10728491] [DOI: 10.1016/j.oret.2022.03.015]
Abstract
Patients often utilize the internet to learn about retinal diseases. Our results demonstrate that online patient education materials related to common retinal diseases are often written at higher than recommended reading levels and lack accountability.
Affiliation(s)
- Samuel A Cohen
- Department of Ophthalmology, Stanford University School of Medicine, Stanford, California
- Suzann Pershing
- Department of Ophthalmology, Stanford University School of Medicine, Stanford, California; VA Palo Alto Health Care System, Palo Alto, California; Byers Eye Institute at Stanford, Stanford, California.
5
Cheng BT, Kim AB, Tanna AP. Readability of Online Patient Education Materials for Glaucoma. J Glaucoma 2022; 31:438-442. [PMID: 35283441] [DOI: 10.1097/ijg.0000000000002012]
Abstract
PRECIS We assessed the readability of online glaucoma patient education materials using seven validated instruments. Overall, glaucoma materials were written at a 10th to 11th grade level, above the recommended seventh grade reading level. PURPOSE Online health information is increasingly used by patients, yet previous studies show online patient education materials are often difficult to understand. As such, the American Medical Association recommends that patient education materials be written at or below a seventh grade reading level. This study aimed to assess the readability of online glaucoma patient education materials. METHODS The term "glaucoma" was entered into the Google search engine, and the first 30 search results were assessed for readability using seven validated readability instruments. Scientific articles, forums, and dictionary entries were excluded. Single-sample t tests were used to assess whether online glaucoma materials were written above the recommended seventh grade level. RESULTS Overall, glaucoma materials were written at a mean grade level of 10.33 (SD: 2.02). Across the 6 grade-level readability instruments, these patient education materials were written above the recommended seventh grade reading level (P<0.0001 for all). Glaucoma education materials on the first page of Google search results were of a similar reading level: mean 10.56 (SD: 2.13). The readability instruments used in this study showed strong consistency. CONCLUSIONS Glaucoma patient education materials are written above the reading level recommended to promote accessibility of education materials. This may contribute to lower patient engagement, worse clinical outcomes, and greater racial and ethnic disparities in glaucoma management. There is a need for reliable, simple glaucoma information to improve patient outcomes.
Affiliation(s)
- Brian T Cheng
- Department of Ophthalmology, Northwestern University Feinberg School of Medicine
- Anne B Kim
- Rush University Medical College, Chicago, IL
- Angelo P Tanna
- Department of Ophthalmology, Northwestern University Feinberg School of Medicine
6
Wang E, Kalloniatis M, Ly A. Assessment of patient education materials for age-related macular degeneration. Ophthalmic Physiol Opt 2022; 42:839-848. [PMID: 35521818] [PMCID: PMC9325046] [DOI: 10.1111/opo.12991]
Abstract
Purpose Age‐related macular degeneration (AMD) is a leading cause of vision loss. It is helpful for patients living with AMD to understand the prognosis, risk factors and management of their condition. Online education materials are a popular and promising channel for conveying this knowledge to patients with AMD. However, the quality of these materials—particularly with respect to qualities such as ‘understandability’ and ‘actionability’—is not yet known. This study assessed a collection of online materials about AMD based on these qualities of ‘understandability’ and ‘actionability’. Methods Online education materials about AMD were sourced through Google from six English‐speaking nations: Australia, New Zealand, USA, UK, Ireland and Canada. Three Australian/New Zealand trained and registered optometrists participated in the grading of the ‘understandability’ and ‘actionability’ of online education materials using the Patient Education Materials Assessment Tool (PEMAT). Results This study analysed a total of 75 online materials. The mean ‘understandability’ score was 74% (range: 38%–94%). The ‘understandability’ PEMAT criterion U11 (calling for a summary of the key points) scored most poorly across all materials. The mean ‘actionability’ score was 49% (range: 0%–83%). The ‘actionability’ PEMAT criterion A26 (using ‘visual aids’ to make instructions easier to act on) scored most poorly across all materials. Conclusion Most education materials about AMD are easy to understand, but difficult to act on, because of a lack of meaningful visual aids. We propose future enhancements to AMD education materials—including the use of summaries, visual aids and a habit tracker—to help patients with AMD improve their understanding of disease prognosis, risk factors and eye assessment schedule requirements.
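The PEMAT percentages reported in this abstract come from a simple scoring rule: each applicable item is rated agree (1) or disagree (0), and the domain score is the percentage of applicable items rated agree, with about 70% commonly treated as adequate. A minimal sketch of that calculation follows; the item names and ratings are hypothetical examples, not data from the study.

```python
def pemat_score(items):
    """PEMAT domain score: agree=1, disagree=0, None=not applicable.
    Returns the percentage of applicable items rated 'agree'."""
    applicable = [v for v in items.values() if v is not None]
    return 100 * sum(applicable) / len(applicable)

# Hypothetical ratings for a single material's understandability items
ratings = {"U1_plain_language": 1, "U5_defines_terms": 1,
           "U11_summary": 0, "U8_chunks": None}
print(round(pemat_score(ratings), 1))  # → 66.7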
Affiliation(s)
- Elisa Wang
- Centre for Eye Health, The University of New South Wales Sydney, Kensington, New South Wales, Australia; School of Optometry and Vision Science, The University of New South Wales Sydney, Kensington, New South Wales, Australia
- Michael Kalloniatis
- Centre for Eye Health, The University of New South Wales Sydney, Kensington, New South Wales, Australia; School of Optometry and Vision Science, The University of New South Wales Sydney, Kensington, New South Wales, Australia
- Angelica Ly
- Centre for Eye Health, The University of New South Wales Sydney, Kensington, New South Wales, Australia; School of Optometry and Vision Science, The University of New South Wales Sydney, Kensington, New South Wales, Australia
7
Gordejeva J, Zowalla R, Pobiruchin M, Wiesner M. Readability of English, German, and Russian Disease-Related Wikipedia Pages: Automated Computational Analysis. J Med Internet Res 2022; 24:e36835. [PMID: 35576562] [PMCID: PMC9152717] [DOI: 10.2196/36835]
Affiliation(s)
- Richard Zowalla
- Department of Medical Informatics, Heilbronn University, Heilbronn, Germany
- Consumer Health Informatics SIG, German Association for Medical Informatics, Biometry & Epidemiology (GMDS e. V.), Cologne, Germany
- Center for Machine Learning, Heilbronn University, Heilbronn, Germany
- Monika Pobiruchin
- Consumer Health Informatics SIG, German Association for Medical Informatics, Biometry & Epidemiology (GMDS e. V.), Cologne, Germany
- GECKO Institute for Medicine, Informatics & Economics, Heilbronn University, Heilbronn, Germany
- Martin Wiesner
- Department of Medical Informatics, Heilbronn University, Heilbronn, Germany
- Consumer Health Informatics SIG, German Association for Medical Informatics, Biometry & Epidemiology (GMDS e. V.), Cologne, Germany
8
Ndukwe T, Cole E, Scanzera AC, Chervinko MA, Chiang MF, Campbell JP, Chan RVP. Health Equity and Disparities in ROP Care: A Need for Systematic Evaluation. Front Pediatr 2022; 10:806691. [PMID: 35433564] [PMCID: PMC9010777] [DOI: 10.3389/fped.2022.806691]
Abstract
Retinopathy of prematurity (ROP) is a vasoproliferative retinal disorder that can have devastating visual sequelae if not managed appropriately. From an ophthalmology standpoint, ROP care is complex, since it spans multiple care settings and providers, including those in the neonatal intensive care unit (NICU), step-down nurseries, and the outpatient clinic setting. This requires coordination and communication between providers, ancillary staff, and, most importantly, effective communication with the patient's family members and caregivers. Often, factors related to the social determinants of health play a significant role in effective communication and care coordination with the family, and it is important for ophthalmologists to recognize these risk factors. The aim of this article is to (1) review the literature related to disparities in preterm birth outcomes and infants at risk for ROP; (2) identify barriers to ROP care and appropriate follow-up; and (3) describe patient-oriented solutions and future directions for improving ROP care through a health equity lens.
Affiliation(s)
- Tochukwu Ndukwe
- Department of Ophthalmology and Visual Sciences, Illinois Eye and Ear Infirmary, University of Illinois at Chicago, Chicago, IL, United States
- Emily Cole
- Department of Ophthalmology and Visual Sciences, Illinois Eye and Ear Infirmary, University of Illinois at Chicago, Chicago, IL, United States
- Angelica C Scanzera
- Department of Ophthalmology and Visual Sciences, Illinois Eye and Ear Infirmary, University of Illinois at Chicago, Chicago, IL, United States
- Margaret A Chervinko
- Department of Ophthalmology and Visual Sciences, Illinois Eye and Ear Infirmary, University of Illinois at Chicago, Chicago, IL, United States
- Michael F Chiang
- National Institutes of Health, National Eye Institute, Bethesda, MD, United States
- John Peter Campbell
- Department of Ophthalmology, Casey Eye Institute, Oregon Health & Science University, Portland, OR, United States
- Robison Vernon Paul Chan
- Department of Ophthalmology and Visual Sciences, Illinois Eye and Ear Infirmary, University of Illinois at Chicago, Chicago, IL, United States
9
Patel PA, Gopali R, Reddy A, Patel KK. The Readability of Ophthalmological Patient Education Materials Provided by Major Academic Hospitals. Semin Ophthalmol 2021; 37:71-76. [PMID: 33852375] [DOI: 10.1080/08820538.2021.1915341]
Abstract
INTRODUCTION The internet is an increasingly important resource for patients seeking health-related information. Because of this trend, the American Medical Association (AMA) and National Institutes of Health (NIH) recommend that online patient education materials (PEMs) be written between a third and seventh grade level. The present study evaluates the readability levels of ophthalmological PEMs provided by five major academic hospitals, quantifies the availability of accompanying videos and graphics, and examines the extent to which readability may be improved. METHODS In March 2021, 397 PEMs from five major academic hospitals were extracted for subsequent analysis by seven validated readability assessments. The presence of an accompanying video or graphic was noted. Statistical significance was assessed using the Kruskal-Wallis test with Dunn's multiple comparisons test and the chi-square test. RESULTS Nearly all articles were written above the recommended reading level of 7th grade. After averaging the scales for each article, the median grade level was 11.7 (interquartile range [IQR], 10.7-12.7). The PEMs with the highest median reading level were provided by the Johns Hopkins University Wilmer Institute (12.6; IQR, 11.3-13.6). Only 13.6% and 13.1% of articles had an accompanying video and graphic, respectively. Reducing sentence length to below 15 words improved readability by 2.7 grade levels. CONCLUSIONS The readability of online patient resources provided by major academic hospitals was above the literacy guidelines recommended by the NIH and AMA. Furthermore, most articles did not include a video or graphic, both of which could potentially improve patient understanding of educational materials. By altering these PEMs, as demonstrated here, institutions could increase the value these articles provide for patients and therefore the quality of the patient-physician relationship.
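The reported 2.7-grade-level gain from shorter sentences is consistent with the Flesch-Kincaid Grade Level formula, which is linear in average sentence length with a coefficient of 0.39. The quick arithmetic below illustrates this; the starting length of 22 words per sentence is an assumed value for illustration, not a figure from the study.

```python
# FKGL = 0.39 * (words/sentence) + 11.8 * (syllables/word) - 15.59,
# so with vocabulary held fixed, trimming average sentence length
# lowers the grade level linearly at 0.39 grades per word.
def fkgl_drop(old_len, new_len):
    return 0.39 * (old_len - new_len)

print(round(fkgl_drop(22, 15), 2))  # → 2.73 grade levels
```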
Affiliation(s)
- Parth A Patel
- Department of Ophthalmology, Medical College of Georgia, Augusta University, Augusta, GA, USA
- Rhea Gopali
- Department of Biological Sciences, North Carolina State University, Raleigh, NC, USA
- Anvith Reddy
- Department of Biological Sciences, University of Georgia, Athens, GA, USA
- Kajol K Patel
- Department of Ophthalmology, Medical College of Georgia, Augusta University, Augusta, GA, USA
10
Fortuna J, Riddering A, Shuster L, Lopez-Jeng C. Assessment of online patient education materials designed for people with age-related macular degeneration. BMC Ophthalmol 2020; 20:391. [PMID: 33008367] [PMCID: PMC7532594] [DOI: 10.1186/s12886-020-01664-x]
Abstract
Background Age-related macular degeneration (AMD) is a chronic eye condition that leads to permanent vision loss in the central visual field. AMD makes reading challenging and inefficient. People with AMD often find it difficult to access, process and understand written patient education materials (PEMs). To promote health literacy, the demands of written PEMs must match the literacy capacities of the target audience. This study aims to evaluate the readability (grade level) and suitability (appropriateness) of online PEMs designed for people with AMD. Methods Online PEMs were sourced from websites of national organizations providing patient education materials designed for people with AMD. The Flesch-Kincaid Grade Level formula and the Suitability Assessment of Materials instrument were used to assess the readability and suitability of PEMs. Descriptive statistics were used to compare online PEMs by organization based on national guidelines for readability level (≤ sixth grade) and the recommended suitability score (≥ 70%) for "superior" material. Results One hundred online PEMs were evaluated from websites of 16 professional organizations. The mean readability level was grade 9.3 (range 5.0–16.6). The mean suitability score was 53% (range 18–78%). Only six (6%) of the PEMs met the recommended guidelines for readability level and suitability score. Conclusion The majority of online PEMs designed for people with AMD were written above the recommended readability level and below the suggested suitability score. To promote health literacy, the demands of written health information must match the reading capacities of the target audience. Adhering to evidence-based guidelines for providing written information to patients with low health literacy and low vision benefits both patients and health care providers. Future research is warranted.
Affiliation(s)
- Jennifer Fortuna
- Occupational Science and Therapy Department, Grand Valley State University, 500 Lafayette Ave NE, Grand Rapids, MI, 49503, USA.
- Anne Riddering
- Department of Occupational Therapy, Western Michigan University, 1903 W. Michigan Ave, Kalamazoo, MI, 49008, USA
- Linda Shuster
- Department of Speech, Language and Hearing Sciences, Western Michigan University, 1903 W. Michigan Ave, Kalamazoo, MI, 49008, USA
- Cassie Lopez-Jeng
- School of Interdisciplinary Health Programs, Western Michigan University, 1903 W. Michigan Ave, Kalamazoo, MI, 49008, USA
11
Abstract
BACKGROUND CHD is the most common birth defect type, with one-fourth of patients requiring intervention in the first year of life. Caregiver understanding of CHD may vary. Health literacy may be one factor contributing to this variability. METHODS The study occurred at a large, free-standing children's hospital. Recruitment occurred at a free-of-charge CHD camp and during outpatient cardiology follow-up visits. The study team revised the CHD Guided Questions Tool from an eighth- to a sixth-grade reading level. Caregivers of children with CHD completed the "Newest Vital Sign" health literacy screen and demographic surveys. Health literacy was categorised as "high" (Newest Vital Sign score 4-6) or "low" (score 0-3). Caregivers were randomised to read either the original or revised Guided Questions Tool and completed a validated survey measuring understandability and actionability of the Guided Questions Tool. Understandability and actionability data analysis used two-sample t-testing, and within demographic group differences in these parameters were assessed via one-way analysis of variance. RESULTS Eighty-two caregivers participated who were largely well educated with a high income. The majority (79.3%) of participants scored "high" for health literacy. No differences in understanding (p = 0.43) or actionability (p = 0.11) of the original and revised Guided Questions Tool were noted. There were no socio-economic-based differences in understandability or actionability (p > 0.05). There was a trend towards improved understanding of the revised tool (p = 0.06). CONCLUSIONS This study demonstrated that readability of the Guided Questions Tool could be improved. Future work is needed to expand the study population and further understand health literacy's impact on the CHD community.
12
Moses C, Flegg K, Dimaras H. Patient knowledge, experiences and preferences regarding retinoblastoma and research: A qualitative study. Health Expect 2020; 23:632-643. [PMID: 32113195] [PMCID: PMC7321723] [DOI: 10.1111/hex.13043]
Abstract
BACKGROUND We launched a patient engagement strategy to facilitate research involvement of the retinoblastoma (childhood eye cancer) community in Canada. To inform our strategy, we aimed to uncover the experiences with retinoblastoma, knowledge of retinoblastoma and research engagement among retinoblastoma survivors and parents. METHODS Focus groups were held in Toronto and Calgary, including both in-person and remote participants (via videoconference). Discussions centred on experience with retinoblastoma, knowledge of the disease and engagement with research. Focus group transcripts were evaluated by inductive thematic analysis. RESULTS Four focus groups (3 in Toronto, 1 in Calgary) were held with a collective total of 34 participants. Retinoblastoma had a substantial impact on the life of participants, but overall, patients reported being able to adapt and persevere. Experiential knowledge of retinoblastoma was identified as distinct from the theoretical knowledge held by their clinicians. Participants indicated they often acted as a knowledge broker, communicating information about the cancer to their social networks. Participants were willing to engage in research as partners, but recognized barriers such as time and appropriate training. CONCLUSIONS Patients view their experiential knowledge of retinoblastoma as valuable to improving care and directing research. There is a unique role for research engagement in meeting the educational needs of patients.
Affiliation(s)
- Catherine Moses
- Department of Ophthalmology & Vision Sciences, The Hospital for Sick Children, Toronto, ON, Canada; The Department of Health Studies, University of Toronto, Toronto, ON, Canada
- Kaitlyn Flegg
- Department of Ophthalmology & Vision Sciences, The Hospital for Sick Children, Toronto, ON, Canada
- Helen Dimaras
- Department of Ophthalmology & Vision Sciences, The Hospital for Sick Children, Toronto, ON, Canada; Division of Clinical Public Health, Dalla Lana School of Public Health, University of Toronto, Toronto, ON, Canada; Department of Ophthalmology & Vision Sciences, Faculty of Medicine, University of Toronto, Toronto, ON, Canada; Child Health Evaluative Sciences Program, SickKids Research Institute, Toronto, ON, Canada; Center for Global Child Health, SickKids Research Institute, Toronto, ON, Canada
13
Brezar A, Heilman J. Readability of English Wikipedia's health information over time. WikiJournal of Medicine 2019. [DOI: 10.15347/wjm/2019.007]
14
Nascimento JC, Lima MA, Barros LM, Galindo Neto NM, Pagliuca LMF, Caetano JÁ. Technology for performing ocular self-examination: comparison between printed and virtual booklets. Rev Esc Enferm USP 2018; 52:e03326. [PMID: 29846487] [DOI: 10.1590/s1980-220x2017024703326]
Abstract
OBJECTIVE Comparing the results of the ocular self-examination performed with the aid of printed and virtual versions of an educational booklet. METHOD A quasi-experimental study carried out in a public (state) school in a state capital in northeast Brazil, with 100 students equally divided into control and intervention groups according to age, gender, schooling and economic status. Pearson's Chi-square test and Fisher's exact test were applied with a significance level of 5%. RESULTS The results of the self-examination obtained with the virtual and printed booklets were statistically similar, except for the item 'Alterations of the pupillary reflex', which the virtual booklet was more effective at identifying (p=0.049). CONCLUSION The printed and virtual versions of the ocular educational booklet have similar efficacy for performing ocular self-examination.
Affiliation(s)
| | | | - Lívia Moreira Barros
- Programa de Pós-Graduação em Enfermagem, Universidade Federal do Ceará, Fortaleza, Ceará, Brasil
| | | | | | - Joselany Áfio Caetano
- Programa de Pós-Graduação em Enfermagem, Universidade Federal do Ceará, Fortaleza, Ceará, Brasil
| |
15
Hansberry DR, Ayyaswami V, Sood A, Prabhu AV, Agarwal N, Deshmukh SP. Abdominal imaging and patient education resources: enhancing the radiologist-patient relationship through improved communication. Abdom Radiol (NY) 2017; 42:1276-1280. [PMID: 27838772] [DOI: 10.1007/s00261-016-0977-3]
Abstract
INTRODUCTION The relative ease of Internet access and its seemingly endless amount of information create opportunities for Americans to research medical diseases, diagnoses, and treatment plans. Our objective was to quantitatively evaluate the readability level of patient education websites, written for the lay public, pertaining to common radiologic diagnostic tests and radiologic diagnoses specific to abdominal imaging. METHODS In October 2015, 10 search terms were entered into the Google search engine, and the top 10 links for each term were collected and independently examined for their readability level using 10 well-validated quantitative readability scales. Search terms included CT abdomen, MRI abdomen, MRI enterography, ultrasound abdomen, X-ray abdomen, cholecystitis, diverticulitis, hepatitis, inflammatory bowel disease, and pancreatitis. Websites not written exclusively for patients were excluded from the analysis. RESULTS As a group, the 100 articles were written at an 11.7 grade level. Only 2% (2/100) were written at the 3rd to 7th grade level suggested by the National Institutes of Health (NIH) and American Medical Association (AMA) to meet the 8th grade average reading level in the United States. In fact, 49% were written at a level that required a high school education or higher (greater than 12th grade). CONCLUSIONS With websites like radiologyinfo.org generating over a million visitors a month, it is clear that there is public interest in learning about radiology. However, given the discordance noted in this study between the readability level of the majority of Internet articles on abdominal imaging and the NIH and AMA guidelines, it is likely that many readers do not fully benefit from these resources.
Affiliation(s)
- David R Hansberry
- Department of Radiology, Thomas Jefferson University Hospital, 132 South 10th Street, Philadelphia, PA, 19107, USA
- Varun Ayyaswami
- University of Maryland School of Medicine, Baltimore, MD, USA
- Anshum Sood
- University of Maryland School of Medicine, Baltimore, MD, USA
- Arpan V Prabhu
- Department of Radiation Oncology, University of Pittsburgh Cancer Institute, Pittsburgh, PA, USA
- Nitin Agarwal
- Department of Neurological Surgery, University of Pittsburgh Medical Center, Pittsburgh, PA, USA
- Sandeep P Deshmukh
- Department of Radiology, Thomas Jefferson University Hospital, 132 South 10th Street, Philadelphia, PA, 19107, USA

16
Mammography Patient Information at Hospital Websites: Most Neither Comprehensible Nor Guideline Supported. AJR Am J Roentgenol 2016; 207:947-951. [DOI: 10.2214/ajr.16.16436] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/18/2022]
17
Williams AM, Muir KW, Rosdahl JA. Readability of patient education materials in ophthalmology: a single-institution study and systematic review. BMC Ophthalmol 2016; 16:133. [PMID: 27487960 PMCID: PMC4973096 DOI: 10.1186/s12886-016-0315-0] [Citation(s) in RCA: 78] [Impact Index Per Article: 9.8] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/22/2015] [Accepted: 07/27/2016] [Indexed: 11/10/2022] Open
Abstract
Background Patient education materials should be written at a level that is understandable for patients with low health literacy. The aims of this study are (1) to review the literature on readability of ophthalmic patient education materials and (2) to evaluate and revise our institution’s patient education materials about glaucoma using evidence-based guidelines on writing for patients with low health literacy. Methods A systematic search was conducted on the PubMed/MEDLINE database for studies that have evaluated readability level of ophthalmic patient education materials, and the reported readability scores were assessed. Additionally, we collected evidence-based guidelines for writing easy-to-read patient education materials, and these recommendations were applied to revise 12 patient education handouts on various glaucoma topics at our institution. Readability measures, including Flesch-Kincaid Grade Level (FKGL), and word count were calculated for the original and revised documents. The original and revised versions of the handouts were then scored in random order by two glaucoma specialists using the Suitability Assessment of Materials (SAM) instrument, a grading scale used to evaluate suitability of health information materials for patients. Paired t test was used to analyze changes in readability measures, word count, and SAM score between original and revised handouts. Finally, five glaucoma patients were interviewed to discuss the revised materials, and patient feedback was analyzed qualitatively. Results Our literature search included 13 studies that evaluated a total of 950 educational materials. Among the mean FKGL readability scores reported in these studies, the median was 11 (representing an eleventh-grade reading level). At our institution, handouts’ readability averaged a tenth-grade reading level (FKGL = 10.0 ± 1.6), but revising the handouts improved their readability to a sixth-grade reading level (FKGL = 6.4 ± 1.2) (p < 0.001). 
Additionally, the mean SAM score of our institution’s handouts improved from 60 ± 7 % (adequate) for the original versions to 88 ± 4 % (superior) for the revised handouts (p < 0.001). Conclusions Our systematic review of the literature reveals that ophthalmic patient education materials are consistently written at a level that is too high for many patients to understand. Our institution’s experience suggests that applying guidelines on writing easy-to-understand material can improve the readability and suitability of educational materials for patients with low health literacy.
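The Flesch-Kincaid Grade Level (FKGL) used in this and the surrounding studies follows the published formula 0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59. A minimal stdlib Python sketch with a crude vowel-group syllable heuristic (dedicated readability tools use dictionary-based syllable counts, so their scores will differ somewhat):

```python
import re

def count_syllables(word):
    """Rough syllable count: number of contiguous vowel groups (crude heuristic)."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def fkgl(text):
    """Flesch-Kincaid Grade Level:
    0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
    """
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    if not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59

print(round(fkgl("Glaucoma damages the optic nerve and can cause vision loss."), 1))
```

A score of roughly 10 corresponds to a tenth-grade reading level; very short, monosyllabic sentences can score below zero, which is expected behavior for this formula.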
Affiliation(s)
- Andrew M Williams
- Michigan State University College of Human Medicine, Grand Rapids, MI, USA; Department of Ophthalmology, University of Pittsburgh Medical Center, Pittsburgh, PA, USA
- Kelly W Muir
- Duke University Department of Ophthalmology, Durham, NC, USA; Durham VA Medical Center, Health Services Research and Development, Durham, NC, USA

18
Prabhu AV, Hansberry DR, Agarwal N, Clump DA, Heron DE. Radiation Oncology and Online Patient Education Materials: Deviating From NIH and AMA Recommendations. Int J Radiat Oncol Biol Phys 2016; 96:521-8. [PMID: 27681748 DOI: 10.1016/j.ijrobp.2016.06.2449] [Citation(s) in RCA: 37] [Impact Index Per Article: 4.6] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 02/09/2016] [Revised: 06/14/2016] [Accepted: 06/15/2016] [Indexed: 11/19/2022]
Abstract
PURPOSE Physicians encourage patients to be informed about their health care options, but online health care-related resources are beneficial only if patients are capable of comprehending them. This study's aim was to assess the readability level of online patient education resources for radiation oncology to determine whether they meet the general public's health literacy needs as defined by the guidelines of the United States National Institutes of Health (NIH) and the American Medical Association (AMA). METHODS Radiation oncology-related Internet-based patient education materials were downloaded from 5 major professional websites (American Society for Radiation Oncology, American Association of Physicists in Medicine, American Brachytherapy Society, RadiologyInfo.org, and Radiation Therapy Oncology Group). Additional patient education documents were downloaded by searching for key radiation oncology phrases using Google. A total of 135 articles were downloaded and assessed for their readability level using 10 quantitative readability scales that are widely accepted in the medical literature. RESULTS When all 10 readability assessment tools were taken into account, the 135 online patient education articles were written at an average grade level of 13.7 ± 2.0. One hundred nine of the 135 articles (80.7%) required a high school graduate's comprehension level (12th-grade level or higher). Only 1 of the 135 articles (0.74%) met the AMA and NIH recommendation that patient education resources be written between the third-grade and seventh-grade levels. CONCLUSION Radiation oncology websites offer patient education material written at an educational level above the NIH and AMA recommendations; as a result, average American patients may not be able to fully understand it. Rewriting radiation oncology patient education resources would likely improve patients' understanding of their health and treatment options, making each physician-patient interaction more productive and efficient.
Affiliation(s)
- Arpan V Prabhu
- Department of Radiation Oncology, University of Pittsburgh Cancer Institute, Pittsburgh, Pennsylvania
- David R Hansberry
- Department of Radiology, Thomas Jefferson University Hospitals, Philadelphia, Pennsylvania
- Nitin Agarwal
- Department of Neurological Surgery, University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania
- David A Clump
- Department of Radiation Oncology, University of Pittsburgh Cancer Institute, Pittsburgh, Pennsylvania
- Dwight E Heron
- Department of Radiation Oncology, University of Pittsburgh Cancer Institute, Pittsburgh, Pennsylvania; Department of Otolaryngology, Head and Neck Surgery, University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania