Article

Assessment of the Readability and Quality of Online Patient Education Material for Chronic Medical Conditions

by Peter Minh Hoang 1,*,† and Courtney van Ballegooie 2,3,†
1 Department of Internal Medicine, Cumming School of Medicine, Calgary, AB T2N 4N1, Canada
2 Experimental Therapeutics, BC Cancer Research Institute, Vancouver, BC V5Z 1L3, Canada
3 Faculty of Medicine, University of British Columbia, Vancouver, BC V6T 1Z3, Canada
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Healthcare 2022, 10(2), 234; https://doi.org/10.3390/healthcare10020234
Submission received: 31 December 2021 / Revised: 20 January 2022 / Accepted: 21 January 2022 / Published: 26 January 2022
(This article belongs to the Section Healthcare Quality and Patient Safety)

Abstract
Patient education materials (PEMs) from chronic health condition associations were assessed to determine their quality and whether they were written above the 6th grade reading level (GRL) recommended by the Centers for Disease Control and the National Institutes of Health. PEMs from 55 associations were assessed for their GRL using ten readability scales and underwent a difficult word analysis. The quality of the associations was assessed using two methods: the Journal of the American Medical Association (JAMA) Benchmarks and Health On the Net Foundation Code of Conduct (HONCode) certification. Two thousand five hundred and ninety PEMs, collected between June and November 2021, were analyzed. The overall GRL average was 10.8 ± 2.8, with a range of 0 to 19. The difficult word analysis showed that 15.8% ± 4.8% of words were complex (three or more syllables) and 25.7% ± 6.3% were unfamiliar. No association displayed all four JAMA Benchmark indicators of quality or held an up-to-date HONCode certification. PEMs continue to be written at a level above the recommended GRL. Additionally, the lack of quality indicators on the associations’ websites may make it difficult for older adults to identify the sources as credible. This represents an opportunity to optimize materials so that they can be understood by a wider audience.

1. Introduction

Older adults are the most likely to experience chronic health conditions, which are the leading causes of morbidity and mortality. This population has also been shown to have lower levels of health literacy relative to the general population [1,2,3]. Health literacy is defined as an individual’s ability to access, understand, and utilize information to make an informed decision regarding their health, and is correlated with reading level [3,4,5,6]. Older adults who have higher levels of health literacy have been shown to have better management of their medical conditions, including diabetes and hypertension targets, medication adherence, and healthy lifestyle behaviors [7,8,9,10]. While the Centers for Disease Control (CDC) already recommend that health information provided to patients be written two levels below the population’s average grade reading level (GRL), corresponding to a 6th GRL, an even lower GRL may be required for those over the age of 65 [11].
Online health information has provided a unique opportunity to improve health literacy. Population studies have shown that 67% of older adults use the Internet, and up to 50% use it for health information [12,13,14]. Accordingly, online patient education materials (PEMs) provide an opportunity to further improve health education. Previous studies have shown that PEMs are written at a grade level above what most American adults can understand [15,16,17,18,19,20,21]. Similarly, two studies assessing online patient health information from geriatric associations identified that the average grade reading level was above the recommended 6th GRL [22,23]. Our study aims to examine PEMs from 55 associations that provide information on specific chronic health conditions common to older adults, using 10 validated readability scales with disease-cluster-specific recommendations [24]. In addition, the quality of the associations was assessed to determine whether their websites would easily allow one to identify them as credible sources of health-related information.

2. Materials and Methods

2.1. Sample Collection

From June 2021 to November 2021, all internet-based PEMs were extracted from the associations’ websites and pooled into their respective disease cluster (Table 1). The specific associations identified are described in Supplementary Table S1 and included both national organizations and foundations.
The most common conditions experienced by older adults were selected from CDC data: cardiovascular disease (including stroke), chronic lung disease (including asthma, chronic obstructive pulmonary disease, and sleep apnea), arthritis, chronic kidney disease, dementia, and osteoporosis [2,25]. Risk factors associated with chronic kidney disease and heart disease, such as hypertension and diabetes, were also included if PEMs from the respective associations were identified. Visual deficits and urinary incontinence were also identified as common conditions among older adults and were therefore included in the analysis [26,27]. The downloaded PEMs included materials on topics related to the specific disease clusters intended for use by patients and their caregivers. Materials were excluded if they were intended for health care providers. Only the five most common cancers for each sex, as identified by the American Cancer Society, were selected for assessment: for men, prostate, lung, colorectal, and bladder cancer as well as melanoma; for women, breast, lung, colorectal, uterine, and thyroid cancers [28]. Given that the majority of cancers are diagnosed after the age of 55, this study aimed to capture what older adults are most likely to experience [28]. Additionally, it should be noted that certain associations, particularly the American Geriatrics Society and the National Institute on Aging, produce PEMs that span multiple medical conditions. If a document was in Portable Document Format (PDF), it was manually converted to a Microsoft Word document (Microsoft Corp.) for further analysis. Sections of nonmedical information such as diagrams, tables, page numbers, disclaimers, and webpage navigation were excluded from assessment [15,18].

2.2. Document Readability Analysis

A readability assessment was performed on the PEMs using the software package Readability Studio professional edition version 2019.3 (Oleander Software, Ltd., Pune, India). The readability grade level scales included the Coleman–Liau Index (CLI), New Dale–Chall (NDC), Degrees of Reading Power and Grade Equivalent test (DRP-GE), Flesch–Kincaid Grade Level (FK), Gunning Fog Index (GF), New Fog Count (NFC), Simple Measure of Gobbledygook Index (SMOG), and Ford, Caylor, Sticht (FORCAST) scale. Two graphical scales were included: the Raygor Readability Estimate Graph (RREG) and the Fry Readability Graph (FRG). These ten scales estimate the GRL requirement, are frequently used when assessing medical text, and offer externally validated measures of readability [15,18]. Given the variation in the weighting and parameters used across these readability scales, assessing PEMs with multiple formulas provides greater assurance in the calculated GRL.
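As a sketch of how such formulas weight their inputs differently, two of the scales (FK and SMOG) can be approximated in Python. This is an illustration only: the vowel-group syllable counter below is a naive stand-in for the dictionary-based syllable counting that Readability Studio performs.

```python
import math
import re

def count_syllables(word: str) -> int:
    # Naive estimate: count vowel groups, dropping a trailing silent 'e'.
    # Commercial readability software uses dictionary-based counts instead.
    groups = re.findall(r"[aeiouy]+", word.lower())
    n = len(groups)
    if word.lower().endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def flesch_kincaid(text: str) -> float:
    # FK grade level = 0.39*(words/sentence) + 11.8*(syllables/word) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words) - 15.59)

def smog(text: str) -> float:
    # SMOG index = 1.0430*sqrt(polysyllables * 30/sentences) + 3.1291
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    poly = sum(1 for w in words if count_syllables(w) >= 3)
    return 1.0430 * math.sqrt(poly * 30 / len(sentences)) + 3.1291
```

Because FK leans heavily on sentence length while SMOG counts only polysyllabic words, the same PEM can receive noticeably different grade estimates from different scales, which is why averaging across multiple formulas gives greater assurance.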
Once extracted from the 55 associations, the PEMs were formatted to account for non-narrative text, which included the alteration of bullet points. PEMs were individually edited to create high- and low-sentence documents, wherein bullet points were treated as individual sentences or as a single sentence, respectively; the two estimates were then averaged for the numerical indices [21,22]. The readability levels from the eight numerical scales are shown in Figure 1, generated using Prism 9 software, and the two graphical scales can be seen in Supplementary Figures S1 and S2, generated using Readability Studio.
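A minimal sketch of this bullet-point treatment (the function names and the "-" bullet convention are illustrative, not the study's actual tooling):

```python
def bullet_variants(lines: list[str]) -> tuple[str, str]:
    # High-sentence variant: each bullet terminated as its own sentence.
    # Low-sentence variant: all bullets joined into a single sentence.
    bullets = [ln.lstrip("- ").rstrip(".") for ln in lines if ln.startswith("-")]
    prose = [ln for ln in lines if not ln.startswith("-")]
    high = " ".join(prose + [b + "." for b in bullets])
    low = " ".join(prose + [", ".join(bullets) + "."]) if bullets else " ".join(prose)
    return high, low

def averaged_grl(score_high: float, score_low: float) -> float:
    # The two estimates are averaged for each numerical index.
    return (score_high + score_low) / 2
```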

2.3. Difficult Word Analysis

Individual words from each of the PEMs were extracted using Readability Studio. The analysis identified the percentage of complex words (3+ syllables), the percentage of long words (6+ characters), and the percentage of unfamiliar words according to the NDC criteria. Words were compared against the NDC word list as well as the New General Service List (NGSL), and those that appeared in either list were removed and considered non-jargon words. All words with three or more syllables were then extracted. Hyphenated words were only included if one or more of their components contained unfamiliar words. Three-syllable words, prioritized by their frequency within each disease cluster and the number of PEMs they appeared in, were then combined with their alternate tenses. Alternative words were then proposed for the most frequent three-syllable words using the Readability Studio software, the Merriam-Webster Thesaurus, or a medical doctor (P.M.H.), in order to identify synonyms that could decrease a word’s difficulty by reducing its syllable count and/or character length. For certain difficult words, the alternatives were short phrases that avoided medical jargon or used vernacular language [29].
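The three word-level metrics described above can be sketched as follows (the vowel-group syllable counter and the tiny familiar-word set are illustrative stand-ins for Readability Studio's counts and the full NDC/NGSL lists):

```python
import re

def vowel_group_syllables(word: str) -> int:
    # Naive syllable approximation by vowel groups (illustrative only).
    return max(len(re.findall(r"[aeiouy]+", word.lower())), 1)

def difficult_word_stats(words, familiar_words):
    # Percent complex (3+ syllables), long (6+ characters), and
    # unfamiliar (absent from a familiar-word list such as NDC/NGSL).
    n = len(words)
    complex_pct = 100 * sum(vowel_group_syllables(w) >= 3 for w in words) / n
    long_pct = 100 * sum(len(w) >= 6 for w in words) / n
    unfamiliar_pct = 100 * sum(w.lower() not in familiar_words for w in words) / n
    return complex_pct, long_pct, unfamiliar_pct
```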

2.4. Quality Analysis

A quality analysis was performed using two well-established, validated tools: the Health On the Net (HON) Foundation Code of Conduct (HONCode) and the Journal of the American Medical Association (JAMA) benchmarks. HONCode evaluates the credibility and reliability of information on medical and health websites using the following criteria: disclosure of authors’ qualifications, attribution/citation of sources, data protection, justifiability, transparency, and disclosure of sources of funding and advertising [30,31,32]. Associations incur a fee to be evaluated by HONCode and receive a certification if they meet the HONCode criteria. Each association, regardless of whether it authored PEMs for the disease clusters of interest, had its HONCode certification status verified. The JAMA benchmarks were also used to assess the quality of each website. This instrument evaluates the presence of four components: authorship (including affiliations and credentials), references, disclosure (including ownership, advertising policy, sponsorship, and conflicts of interest), and currency (e.g., date of creation/update) [31,33]. Each association that contained PEMs relevant to this study was further evaluated: five PEMs were extracted from each association and evaluated by two independent reviewers according to the JAMA benchmark criteria. For each criterion, the mode across the five PEMs was taken to obtain the association’s final JAMA benchmark score.
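This mode-based scoring can be sketched as follows (a hypothetical helper, not the reviewers' actual workflow):

```python
from statistics import mode

def jama_benchmark_score(pem_ratings: list[dict]) -> tuple[dict, int]:
    # pem_ratings: one dict per sampled PEM, mapping each JAMA criterion
    # (authorship, references, disclosure, currency) to 1 (displayed)
    # or 0 (absent). The association's value for each criterion is the
    # mode across its sampled PEMs; the total is the benchmark score.
    criteria = pem_ratings[0].keys()
    per_criterion = {c: mode(r[c] for r in pem_ratings) for c in criteria}
    return per_criterion, sum(per_criterion.values())
```

With five (an odd number of) binary ratings per criterion, the mode is always well defined, which is presumably why five PEMs per association were sampled.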

2.5. Statistical Methods

The graphical data in Figure 1 were reported as the arithmetic mean of each numerical scale. The averages of the eight numerical readability tests for each of the disease types underwent a one-way Analysis of Variance (ANOVA) with a Tukey’s test comparing the disease types [21,22]. A pooled standard deviation was used whenever disease types were combined, such as in the difficult word analysis, to ensure that the number of PEMs extracted for each disease cluster was taken into consideration. Statistics were analyzed using GraphPad Prism 9.
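The pooled standard deviation weights each cluster's variance by its degrees of freedom, sqrt(Σ(n_i − 1)s_i² / Σ(n_i − 1)), where n_i and s_i are the PEM count and standard deviation of cluster i; a minimal sketch:

```python
import math

def pooled_sd(clusters: list[tuple[int, float]]) -> float:
    # clusters: (n_PEMs, sd) per disease cluster. Each cluster's
    # variance is weighted by its degrees of freedom (n_i - 1), so
    # large clusters contribute proportionally more.
    num = sum((n - 1) * s ** 2 for n, s in clusters)
    den = sum(n - 1 for n, _ in clusters)
    return math.sqrt(num / den)
```

Unlike a plain average of the per-cluster standard deviations, this keeps a cluster with many PEMs from being drowned out by a small one.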

3. Results

3.1. Document Readability Analysis

A total of 2590 PEMs from 55 associations were downloaded and assessed across ten chronic medical condition clusters. The average grade reading levels across the eight readability scales were as follows: age-related macular degeneration and cataracts (11.2 ± 1.6), cancer (10.0 ± 1.8), dementia (11.3 ± 1.6), cardiovascular disease, heart attack, and stroke (10.1 ± 1.9), kidney-associated diseases (9.5 ± 1.3), osteoarthritis (10.8 ± 2.0), osteoporosis (10.6 ± 1.7), Parkinson’s disease (11.9 ± 1.8), urinary incontinence (9.7 ± 1.8), and lung-associated diseases (10.0 ± 2.2). The ANOVA identified a significant difference (p < 0.0001), with Parkinson’s disease the most difficult to read relative to all other diseases. The overall mean for all medical condition clusters was 10.8 ± 2.8, with a grade level range of 0 to 19; zero suggests that the PEM can be read by any person. The numerical indicators showed that 95.3% and 82.7% of PEMs were above the 6th and 8th GRL, respectively. Figure 1 summarizes the results for the various disease clusters across the eight readability scales. The RREG of the high sentence estimate (Supplementary Figure S2) ranges from a 3rd grade reading level to a professor level (grade 17), with 98.5% and 89.5% of PEMs exhibiting a grade level above six and eight, respectively. The FRG of the high sentence estimate (Supplementary Figure S1) ranges from a 3rd grade to a 17th grade (university) reading level, with 99.0% and 89.6% exhibiting a grade level above six and eight, respectively.
Figure 1. Mean grade level of online patient education materials found for each disease cluster (Lung disease, Osteoporosis, Osteoarthritis, Parkinson’s, Kidney disease, Incontinence, Cardiovascular disease [CD], Hypertension, and Stroke, Cancer, Age related macular degeneration [AMD] and Cataracts, and Dementia) using eight numerical scales, including the Coleman–Liau Index, New Dale–Chall, Degrees of Reading Power and Grade Equivalent test, Flesch–Kincaid Grade Level, Gunning Fog Index, New Fog Count, Simple Measure of Gobbledygook Index (SMOG), and Ford, Caylor, Sticht (FORCAST) scale.

3.2. Difficult Word Analysis

The difficult word analysis found that, on average, the PEMs across all disease clusters comprised 15.8% ± 4.8% complex words of three or more syllables, 35.2% ± 5.6% words of six or more characters, and 25.7% ± 6.3% unfamiliar words. Supplementary Table S2 describes the top ten most frequent difficult words by disease cluster, and Supplementary Table S3 describes the complex, long, and unfamiliar word analysis for each disease cluster. Supplementary Table S4 displays the ANOVA and pairwise comparisons of the difficult word analysis, which indicated that Parkinson’s disease was the most difficult in all three analyses performed (p < 0.001).

3.3. Quality Analysis

The quality analysis found that none of the associations held a currently valid HONCode certification (Figure 2a); only seven of the 55 associations had ever held one. Additionally, the majority of the associations (58%) displayed zero JAMA Benchmark quality indicators (Figure 2c). Authorship and currency were the most commonly reported quality indicators, and disclosure the least common (Figure 2b).

4. Discussion

4.1. Implications

Online patient health materials are a potential mechanism to improve the management of chronic diseases, as they have been shown to promote non-pharmacological healthy lifestyle behaviors and improve adherence to physician advice [34,35]. They should therefore be written at a level consistent with the health literacy of the targeted population and be easily identifiable as credible sources of health information. Our study identified that PEMs from associations for chronic medical conditions are written at a GRL of 10.8 ± 2.8, and the graphical scales identified that over 98% of PEMs were written above the recommended 6th grade level. As chronic medical conditions are common in older adults, this suggests that PEMs from these associations are not written at the 6th GRL, the level recommended so that most American older adults can understand them.
This study aimed to identify whether different medical topics common to older adults have different levels of readability, which may help identify PEMs that require more focused efforts to improve readability. Grouped by disease cluster, the medical conditions with the lowest average GRL were kidney disease, cancer, and urinary incontinence. Certain associations have begun to use readability indices in the development of their PEMs; for example, the National Cancer Institute uses the SMOG readability test as its gold standard [36]. In addition, certain disease clusters had only a small number of associations creating PEMs, which may confound readability either positively or negatively depending on whether readability was factored in during authorship. Conditions with a typically higher frequency of medical jargon related to the disease, its pathophysiology, and its treatment (e.g., dementias, Parkinson’s disease, and macular degeneration) may be more likely to have higher grade level requirements. The difficult word analysis identified the top ten most frequently used terms that were complex, whether by syllable count, character count, or unfamiliarity; the majority of these words were medical jargon. Overall, to improve the readability of PEMs, associations should consider not only their word choices, such as choosing short, familiar, mono- or bi-syllabic words, but also the PEMs’ format (e.g., a simple style and layout) and use of multimedia (e.g., visuals and illustrations), as both have been shown to assist comprehension [11,37,38,39]. Associations should give additional consideration to the disease clusters identified as the most difficult to read (e.g., dementia and Parkinson’s disease).
Several other studies have examined PEM readability across many surgical and medical specialties outside of geriatrics [16,17,18,19,36,40,41,42,43,44]. In addition, we have previously assessed PEMs from geriatric-specific associations. Overall, these studies have shown that the GRL of PEMs remains above the GRL of the average American [22]. Compared to PEMs on internal medicine diagnoses and its subspecialties, we found a similar trend wherein PEMs were, on average, above the recommended 6th GRL [15,45,46]. Compared to studies examining geriatric associations, the overall GRL of these chronic disease associations was similar, suggesting that even if older adults use sources outside of geriatric-specific organizations, they will likely still face difficulties in finding accessible and reliable information for their health concerns [22,23]. Beyond readability, this paper is the first to identify that none of the associations related to chronic diseases held an up-to-date HONCode certification, a gold-standard quality indicator, nor did the majority of the associations display any JAMA Benchmark quality indicators (Figure 2). The most (authorship and currency) and least (disclosure) frequently identified quality indicators in this study align well with previous findings in other fields of medicine [33]. Quality has been assessed in many different medical fields in the U.S., such as surgery, oncology, ophthalmology, and infectious disease, with similar conclusions citing the need for higher quality information [32,33,47]. This study identifies a need for associations related to chronic diseases to better display quality indicators so that patients and caregivers can navigate medical information on the internet and identify credible sources of health information.

4.2. Limitations

This study has several limitations. Certain chronic medical conditions may be understood by many older adults, such as arthritis, Parkinson’s disease, or Alzheimer’s dementia, yet yield a high GRL because many of the readability indices use syllable counts, which can potentially overestimate the GRL. Conversely, medical jargon with few syllables, such as renal or Merkel cell cancer, may not be accurately captured by readability scales. Many of the PEMs also explain the medical condition being described, but repetition of medical jargon is frequently necessary, which may increase the overall GRL. Due to the limitations of text-based readability indices, the visual components of PEMs could not be analyzed; visual aids are well known to convey medical information effectively and efficiently and should be considered in future studies. Additionally, the comprehension of health information depends on many parameters, including health and cultural experiences as well as the reader’s specific goals, which were not evaluated in this study [37]. Authors of PEMs can consider guidelines such as the NIH Clear & Simple guide and the United States General Services Administration Universal Design and Accessibility guides in the development of their materials [38,39].

5. Conclusions

Over 2500 PEMs from 55 associations relating to specific medical conditions that older adults commonly experience have an average GRL of 10.8 ± 2.8, significantly above the recommended 6th GRL. Our study identifies an opportunity to provide highly accessible and accurate medical information for patients, but it is imperative that these materials be written at a level the majority of older adults can understand and be easily identifiable as credible sources of information. Associations should consider the use of readability scales in their PEM design and must better display the quality of their information, either through credible certification or the incorporation of quality benchmarks. Future studies incorporating patient understanding could assess education material readability among a representative sample of older adults.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/healthcare10020234/s1, Table S1: Associations identified for each chronic disease cluster; Table S2: Difficult Words with Alternative Word Recommendations; Figure S1: Fry readability graph assessment of all high sentence estimate online patient education materials by disease cluster; Figure S2: Raygor readability estimate graph of all high sentence estimate online patient education materials by disease cluster; Table S3: Difficult words analysis displaying the mean and standard deviation of the % 3+ syllable words, % 6+ character words, and % unfamiliar words found in the patient education material (PEMs) of each of the disease clusters; Table S4: Difficult word analysis statistics: Analysis of variance (ANOVA) and pairwise comparison of the difficult words analyses [e.g., the % 3+ syllable words, % 6+ character words, and % unfamiliar words found in the patient education material (PEMs) of each of the disease clusters].

Author Contributions

Conceptualization, C.v.B. and P.M.H.; methodology, C.v.B. and P.M.H.; formal analysis, C.v.B. and P.M.H.; investigation, C.v.B. and P.M.H.; resources, C.v.B.; data curation, C.v.B. and P.M.H.; writing—original draft preparation, C.v.B. and P.M.H.; writing—review and editing, C.v.B. and P.M.H.; visualization, C.v.B.; funding acquisition, C.v.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Canadian Institutes of Health Research, grant number 21R04868.

Institutional Review Board Statement

This study qualifies for exemption from ethical approval due to the use of non-human subject material (patient education materials).

Informed Consent Statement

This study did not involve human subjects and therefore is not subject to informed consent requirements.

Data Availability Statement

Data may be made available upon request.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

Age-Related Macular Degeneration (AMD)
Analysis of Variance (ANOVA)
Centers for Disease Control (CDC)
Cardiovascular Disease (CD)
Coleman–Liau Index (CLI)
Degrees of Reading Power (DRP)
Flesch–Kincaid Grade Level (FK)
Ford, Caylor, Sticht (FORCAST)
Fry Readability Graph (FRG)
Grade Equivalent (GE)
Gunning Fog Index (GF)
Grade Reading Level (GRL)
Health On the Net Foundation Code of Conduct (HONCode)
Journal of the American Medical Association (JAMA)
New Dale–Chall (NDC)
New Fog Count (NFC)
New General Service List (NGSL)
Patient Education Material (PEM)
Portable Document Format (PDF)
Raygor Readability Estimate Graph (RREG)
Simple Measure of Gobbledygook Index (SMOG)

References

1. Boersma, P.; Black, L.I.; Ward, B.W. Prevalence of Multiple Chronic Conditions Among US Adults, 2018. Prev. Chronic Dis. 2020, 17, 200130.
2. Ward, B.W.; Schiller, J.S. Prevalence of multiple chronic conditions among US adults: Estimates from the national health interview survey, 2010. Prev. Chronic Dis. 2013, 10, 120203.
3. Institute of Education Sciences National Center for Education Statistics. Available online: https://nces.ed.gov/pubs2006/2006483.pdf (accessed on 10 July 2021).
4. Joint Commission. Available online: http://www.nslij-healthliteracytraining.com/documents/What_Did_the_Doctor_Say.pdf (accessed on 10 July 2021).
5. Network of the National Library of Medicine. Available online: https://nnlm.gov/guides/intro-health-literacy (accessed on 10 August 2021).
6. US Department of Health and Human Services: Office of Disease Prevention and Health Promotion. Available online: https://health.gov/our-work/health-literacy/national-action-plan-improve-health-literacy (accessed on 12 September 2021).
7. Schillinger, D.; Grumbach, K.; Piette, J.; Wang, F.; Osmond, D.; Daher, C.; Palacios, J.; Sullivan, G.D.; Bindman, A.B. Association of health literacy with diabetes outcomes. J. Am. Med. Assoc. 2002, 288, 475–482.
8. Halladay, J.R.; Donahue, K.E.; Cené, C.W.; Li, Q.; Cummings, D.M.; Hinderliter, A.L.; Miller, C.L.; Garcia, B.A.; Little, E.; Rachide, M.; et al. The association of health literacy and blood pressure reduction in a cohort of patients with hypertension: The heart healthy lenoir trial. Patient Educ. Couns. 2017, 100, 542–549.
9. Miller, T.A. Health literacy and adherence to medical treatment in chronic and acute illness: A meta-analysis. Patient Educ. Couns. 2016, 99, 1079–1086.
10. Sharifirad, G.; Reisi, M.; Javadzade, S.; Heydarabadi, A.; Mostafavi, F.; Tavassoli, E. The relationship between functional health literacy and health promoting behaviors among older adults. J. Educ. Health Promot. 2014, 3, 119.
11. U.S. Department of Health and Human Services: Centers for Disease Control and Prevention. Available online: https://www.cdc.gov/healthliteracy/pdf/olderadults-508.pdf (accessed on 10 July 2021).
12. Pew Research Center. Available online: https://www.pewresearch.org/internet/2017/05/17/technology-use-among-seniors/ (accessed on 10 July 2021).
13. Flynn, K.E.; Smith, M.A.; Freese, J. When do older adults turn to the internet for health information? Findings from the Wisconsin Longitudinal Study. J. Gen. Intern. Med. 2006, 21, 1295–1301.
14. Turner, A.M.; Osterhage, K.P.; Taylor, J.O.; Hartzler, A.L.; Demiris, G. A Closer Look at Health Information Seeking by Older Adults and Involved Family and Friends: Design Considerations for Health Information Technologies. AMIA Annu. Symp. Proc. 2018, 2018, 1036–1045.
15. Agarwal, N.; Hansberry, D.R.; Sabourin, V.; Tomei, K.L.; Prestigiacomo, C.J. A comparative analysis of the quality of patient education materials from medical specialties. JAMA Intern. Med. 2013, 173, 1257–1259.
16. Agarwal, N.; Chaudhari, A.; Hansberry, D.R.; Tomei, K.L.; Prestigiacomo, C.J. A comparative analysis of neurosurgical online education materials to assess patient comprehension. J. Clin. Neurosci. 2013, 20, 1357–1361.
17. Vives, M.; Young, L.; Sabharwal, S. Readability of spine-related patient education materials from subspecialty organization and spine practitioner websites. Spine 2009, 34, 2826–2831.
18. Huang, G.; Fang, C.H.; Agarwal, N.; Bhagat, N.; Eloy, J.A.; Langer, P.D. Assessment of online patient education materials from major ophthalmologic associations. JAMA Ophthalmol. 2015, 133, 449–454.
19. Colaco, M.; Svider, P.F.; Agarwal, N.; Eloy, J.A.; Jackson, I.M. Readability assessment of online urology patient education materials. J. Urol. 2013, 189, 1048–1052.
20. Patel, C.R.; Sanghvi, S.; Cherla, D.V.; Baredes, S.; Eloy, J.A. Readability Assessment of Internet-Based Patient Education Materials Related to Parathyroid Surgery. Ann. Otol. Rhinol. Laryngol. 2015, 124, 523–527.
21. Perni, S.; Rooney, M.K.; Horowitz, D.P.; Golden, D.W.; McCall, A.R.; Einstein, A.J.; Jagsi, R. Assessment of Use, Specificity, and Readability of Written Clinical Informed Consent Forms for Patients with Cancer Undergoing Radiotherapy. JAMA Oncol. 2019, 5, e190260.
22. Van Ballegooie, C.; Hoang, P. Assessment of the Readability of Online Patient Education Material from Major Geriatric Associations. J. Am. Geriatr. Soc. 2020, 69, 1051–1056.
23. Weiss, B.D.; Mollon, L.; Lee, J.K. Readability of patient education information on the American Geriatrics Society Foundation’s health-in-aging website. J. Am. Geriatr. Soc. 2013, 61, 1845–1846.
24. Edmunds, M.R.; Barry, R.J.; Denniston, A.K. Readability assessment of online ophthalmic patient information. JAMA Ophthalmol. 2013, 131, 1610–1616.
25. U.S. Department of Health and Human Services: The Administration for Community Living. Available online: https://acl.gov/sites/default/files/Aging%20and%20Disability%20in%20America/2017OlderAmericansProfile.pdf (accessed on 29 September 2021).
26. National Institutes of Health: National Eye Institute. Age-Related Macular Degeneration (AMD) Data and Statistics. Available online: https://www.nei.nih.gov/learn-about-eye-health/outreach-campaigns-and-resources/eye-health-data-and-statistics/age-related-macular-degeneration-amd-data-and-statistics (accessed on 29 September 2021).
27. National Institutes of Health: National Eye Institute. Cataract Data and Statistics. Available online: https://www.nei.nih.gov/learn-about-eye-health/outreach-campaigns-and-resources/eye-health-data-and-statistics/cataract-data-and-statistics (accessed on 29 September 2021).
28. American Cancer Society. Available online: https://www.cancer.org/content/dam/cancer-org/research/cancer-facts-and-statistics/annual-cancer-facts-and-figures/2020/cancer-facts-and-figures-2020.pdf (accessed on 29 September 2021).
29. Van Ballegooie, C.; Hoang, P. Health Services: A Mixed Methods Assessment of Canadian Cancer Patient Education Materials Related to the 2019 Novel Coronavirus. Cancer Control 2021, 28, 1–10.
30. Health On the Net. Available online: https://www.hon.ch/HONcode/ (accessed on 20 September 2021).
31. Silberg, W.M. Assessing, Controlling, and Assuring the Quality of Medical Information on the Internet. JAMA 1997, 277, 1244.
32. Fefer, M.; Lamb, C.C.; Shen, A.H.; Clardy, P.; Muralidhar, V.; Devlin, P.M.; Dee, E.C. Multilingual Analysis of the Quality and Readability of Online Health Information on the Adverse Effects of Breast Cancer Treatments. JAMA Surg. 2020, 155, 781–784.
33. Kloosterboer, A.; Yannuzzi, N.A.; Patel, N.A.; Kuriyan, A.E.; Sridhar, J. Assessment of the Quality, Content, and Readability of Freely Available Online Information for Patients Regarding Diabetic Retinopathy. JAMA Ophthalmol. 2019, 137, 1240–1245.
34. Iverson, S.A.; Howard, K.B.; Penney, B.K. Impact of internet use on health-related behaviors and the patient-physician relationship: A survey-based study and review. J. Am. Osteopath Assoc. 2008, 108, 699–711.
35. Bujnowska-Fedak, M.M.; Węgierek, P. The Impact of Online Health Information on Patient Health Behaviours and Making Decisions Concerning Health. Int. J. Environ. Res. Public Health 2020, 17, 880.
36. Van Ballegooie, C. Assessment of Canadian patient education material for oncology pharmaceutics. J. Oncol. Pharm. Pract. 2020, 27, 1578–1587.
37. Institute of Medicine (US) Committee on Health Literacy. What Is Health Literacy? In Health Literacy: A Prescription to End Confusion; Nielsen-Bohlman, L., Panzer, A.M., Kindig, D.A., Eds.; National Academies Press: Washington, DC, USA, 2004; Volume 1, pp. 31–58.
  38. National Institutes of Health. Available online: https://www.nih.gov/institutes-nih/nih-office-director/office-communications-public-liaison/clear-communication/clear-simple (accessed on 20 September 2021).
39. US General Services Administration. Available online: https://www.section508.gov/create/universal-design (accessed on 20 September 2021).
  40. Edmunds, M.R.; Denniston, A.K.; Boelaert, K.; Franklyn, J.A.; Durrani, O.M. Patient information in Graves’ disease and thyroid-associated ophthalmopathy: Readability assessment of online resources. Thyroid 2014, 24, 67–72. [Google Scholar] [CrossRef] [Green Version]
  41. Sabharwal, S.; Badarudeen, S.; Unes Kunju, S. Readability of online patient education materials from the AAOS Web site. Clin. Orthop. Relat. Res. 2008, 466, 1245–1250. [Google Scholar] [CrossRef] [Green Version]
  42. Badarudeen, S.; Sabharwal, S. Assessing readability of patient education materials: Current role in orthopaedics. Clin. Orthop. Relat. Res. 2010, 468, 2572–2580. [Google Scholar] [CrossRef] [Green Version]
  43. Eloy, J.A.; Li, S.; Kasabwala, K.; Agarwal, N.; Hansberry, D.R.; Baredes, S.; Setzen, M. Readability assessment of patient education materials on major otolaryngology association websites. Otolaryngol. Head Neck Surg. 2012, 147, 848–854. [Google Scholar] [CrossRef]
  44. Kasabwala, K.; Agarwal, N.; Hansberry, D.R.; Baredes, S.; Eloy, J.A. Readability assessment of patient education materials from the American Academy of Otolaryngology—Head and neck surgery foundation. Otolaryngol. Head Neck Surg. 2012, 147, 466–471. [Google Scholar] [CrossRef]
  45. Ayyaswami, V.; Padmanabhan, D.; Patel, M.; Prabhu, A.V.; Hansberry, D.R.; Agarwal, N.; Magnani, J.W. A Readability Analysis of Online Cardiovascular Disease-Related Health Education Materials. Health Lit. Res. Pract. 2019, 3, e74–e80. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  46. Hutchinson, N.; Baird, G.L.; Garg, M. Examining the Reading Level of Internet Medical Information for Common Internal Medicine Diagnoses. Am. J. Med. 2016, 129, 637–639. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  47. Chande-Mallya, R.; Msonde, S.E.; Mtega, W.P.; Lwoga, E.T. Health Information on the Internet. Int. Encycl. Public Health 2016, 285, 414–417. [Google Scholar]
Figure 2. The quality analyses using the Health On the Net (HON) Foundation Code of Conduct (HONCode) and JAMA benchmarks. (a) The number of associations that either have never had a HONCode certification, have a certification that expired over two years ago, have a certification that expired under two years ago, or have a current certification. (b) The number of associations, out of 43 associations, that display each of the individual JAMA benchmarks, including authorship, attribution, disclosure, and currency. (c) The number of associations that display zero, one, two, three, or all four of the JAMA benchmarks.
Table 1. Number of associations and patient education documents of chronic medical conditions.
Disease Cluster | Associations, No. | Documents, No.
Age-Related Macular Degeneration and Cataracts | 11 * | 164
Cancer a | 11 * | 628
Dementia | 7 * | 243
Cardiovascular Disease, Hypertension and Stroke | 5 * | 402
Kidney | 2 | 262
Osteoarthritis | 8 * | 93
Osteoporosis | 12 * | 66
Parkinson's | 6 * | 244
Urinary Incontinence | 2 | 80
Lung | 8 | 408
* Identifies the presence of associations that provided PEMs on medical conditions across more than one disease cluster. a Only the top five most common cancers, as identified by the American Cancer Society by gender, were selected for assessment, which were prostate, lung, colorectal, urinary bladder cancers as well as melanoma in males, and breast, lung, colorectal, uterine, and thyroid cancer in females.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Hoang, P.M.; van Ballegooie, C. Assessment of the Readability and Quality of Online Patient Education Material for Chronic Medical Conditions. Healthcare 2022, 10, 234. https://doi.org/10.3390/healthcare10020234


