Brief Report

Digital Patient Education on Xanthelasma Palpebrarum: A Content Analysis

Kevin J. Varghese, Som P. Singh, Fahad M. Qureshi, Shreevarsha Shreekumar, Aarya Ramprasad and Fawad Qureshi
1 Department of Biomedical Sciences, University of Missouri Kansas City School of Medicine, Kansas City, MO 64108, USA
2 Department of Nephrology, Mayo Clinic Alix School of Medicine, Rochester, MN 55905, USA
* Authors to whom correspondence should be addressed.
Clin. Pract. 2023, 13(5), 1207-1214; https://doi.org/10.3390/clinpract13050108
Submission received: 21 July 2023 / Revised: 18 September 2023 / Accepted: 21 September 2023 / Published: 29 September 2023
(This article belongs to the Special Issue Teaching Pathology Towards Clinics and Practice)

Abstract

Patient education has been transformed by digital media and online repositories, which disseminate information with greater efficiency. In dermatology, this transformation has allowed patients to learn about common cutaneous conditions and improve their health literacy. Xanthelasma palpebrarum is one of the most common cutaneous conditions, yet little is known about how digital materials affect health literacy for this condition. Our study aimed to address this gap in the literature using Brief DISCERN, Rothwell’s Classification of Questions, and six readability calculations. The findings of this study indicate a poor quality profile (Brief DISCERN < 16) for these digital materials and readability scores that do not meet grade-level recommendations in the United States. This indicates a need to improve the current body of educational materials used by clinicians for diagnosing and managing xanthelasma palpebrarum.

1. Introduction

The utilization of digital applications in healthcare plays an essential role in the physician–patient relationship [1]. Mobile health applications, telehealth, and multiple internet-based resources allow for the efficient dissemination of information and serve as a primary medium for healthcare literacy [2,3,4]. Moreover, patient education and healthcare literacy are closely tied to patient outcomes: poor patient education and health literacy are associated with suboptimal healthcare outcomes [5,6]. Among dermatologic conditions, there is a growing emphasis on improving patient education with the goal of improving healthcare outcomes [7].
Xanthomas are common, well-circumscribed deposits of lipids that may be found in skin, tendons, or fasciae [1]. They present as asymptomatic papules, plaques, or nodules [8]. Xanthelasma palpebrarum, the occurrence of xanthomas near the upper eyelids, is the most common presentation of xanthoma [9]. Xanthelasma has an estimated prevalence of four percent in the general population [1]. Cutaneous xanthomas most commonly present in adulthood, with no apparent difference in prevalence between men and women [9]. The pathogenesis of xanthomas is related to the cutaneous deposition of lipids and subsequent inflammation. For xanthomas that occur in the setting of hyperlipidemia, it is thought that elevated serum lipoproteins extravasate into the extracellular space [10]. Macrophages are subsequently recruited to consume the lipoproteins and are converted to foam cells; these aggregates present clinically as xanthomas [10,11]. Primary hyperlipidemia (i.e., familial hypercholesterolemia) as well as secondary hyperlipidemia (i.e., obesity, diabetes, and hypothyroidism) may present with xanthomas [1,12,13]. The mechanism by which xanthomas occur in the absence of dyslipidemia is not well understood. One proposal is that monoclonal gammopathies generate antibodies that cause lipids to accrue in macrophages via immune complexes [14]. Subtypes of xanthomas include planar, eruptive, tuberous, tendinous, and verruciform, with planar xanthomas being the most common [12]. The evaluation of xanthelasma palpebrarum includes the identification and correction of any underlying dyslipidemia [1].
Because xanthomas can be a manifestation of highly prevalent cardiovascular conditions such as hypertension and diabetes, it is important to assess the body of information currently available to patients. Furthermore, patients may seek online education on the association of this condition with cardiovascular mortality. To our knowledge, the quality of online educational materials for xanthelasma palpebrarum has not been examined. Herein, we present a cross-sectional analysis of the quality and content of digital health education on xanthelasma palpebrarum. We aim for this profile of digital information to direct clinicians in improving patient education.

2. Materials and Methods

2.1. Ethics and Review

This study did not require Institutional Review Board approval, as all data utilized in this study are publicly available and no human or animal subjects were involved.

2.2. Summary

Extracted articles were required to provide pertinent information regarding xanthelasma palpebrarum; this was determined by the screeners after a joint education session on medically accurate and relevant information for xanthelasma. Additionally, articles had to meet the following criteria: (1) written in the English language; (2) containing over 200 words; (3) publicly available without a content subscription; (4) pertinent to the search query of interest. Articles were excluded if the associated access link was nonfunctional, was a duplicate of a previous link, or otherwise did not meet the inclusion criteria [15].
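For illustration only, the written criteria above could be expressed as a simple screening filter; the sketch below is not the screeners' actual workflow, and the field names are assumptions.

    # Hypothetical screening filter mirroring the criteria above; field names are
    # assumptions, not the screeners' actual workflow.
    seen_links = set()

    def passes_screening(article):
        if not article.get("link_functional", False) or article["url"] in seen_links:
            return False  # nonfunctional or duplicate link -> exclude
        seen_links.add(article["url"])
        return (
            article["language"] == "en"              # (1) written in English
            and len(article["text"].split()) > 200   # (2) over 200 words
            and article["is_public"]                 # (3) no subscription required
            and article["is_relevant"]               # (4) pertinent to the query
        )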
Additionally, frequently asked questions and their associated online articles regarding xanthelasma palpebrarum were extracted in May 2023 from publicly available data generated by the Google RankBrain (Google Inc., Mountain View, CA, USA) machine learning algorithm, accessed through Google’s search engine, after application of the inclusion and exclusion criteria [16,17]. The algorithm provided the most commonly asked questions and the single article link that Google pairs with each question in its search results. These were recorded by the study coordinators.
After questions and articles were extracted, two independent raters evaluated the questions using Rothwell’s Classification of Questions. The content of extracted articles was reformatted into plain text in Microsoft Word [18,19,20,21,22,23]. Irrelevant content was removed from the plain text by the screeners (FMQ, SS) when it was unrelated to education on xanthelasma palpebrarum; this included author information, copyright disclaimers, acknowledgments, references, and any webpage-navigation text. All remaining content was left unchanged when converted to individual plain text documents. Raters then applied six readability scales (e.g., Flesch Reading Ease) and Brief DISCERN (cut-off ≥ 16) to each educational article [24,25]. Descriptive statistics for the readability calculations were completed using the Stata 14 statistical package (StataCorp, College Station, TX, USA).
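For illustration only (the study relied on established readability calculators rather than custom code), the following minimal Python sketch shows two of the six formulas, Flesch Reading Ease and Flesch–Kincaid grade level; the vowel-group syllable counter is a simplifying assumption, and the sample sentence is hypothetical.

    import re

    def count_syllables(word):
        # Naive syllable estimate: count vowel groups (illustration only).
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def readability(text):
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        words_per_sentence = len(words) / len(sentences)
        syllables_per_word = syllables / len(words)
        flesch_reading_ease = 206.835 - 1.015 * words_per_sentence - 84.6 * syllables_per_word
        flesch_kincaid_grade = 0.39 * words_per_sentence + 11.8 * syllables_per_word - 15.59
        return flesch_reading_ease, flesch_kincaid_grade

    sample = ("Xanthelasma palpebrarum is a deposit of cholesterol under the skin of the "
              "eyelids. It is often linked to high levels of lipids in the blood.")
    print(readability(sample))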

3. Results

This study analyzed the first thirty unique frequently asked questions and associated online articles regarding xanthelasma palpebrarum. The profile of Rothwell’s Classification of Questions revealed that most questions were “Fact” questions, at 80.0% (n = 24). Upon subclassification, questions regarding “Technical Details” were most common at 70.0% (n = 21), followed by questions regarding cost at 6.7% (n = 2) and timeline of recovery at 3.3% (n = 1). The next most common classification was “Policy” at 13.3% (n = 4); upon subclassification, all of these questions pertained to the risks and complications of xanthelasma palpebrarum. The least common classification was “Value” at 6.7% (n = 2), with questions regarding “Evaluation” and “Pain” at 3.3% each (n = 1; n = 1). The source of questions and articles was most commonly a “Commercial” source at 40% (n = 12), followed by “Medical Practice” at 26.7% (n = 8) and “Government Website” at 20.0% (n = 6). Sources from an “Academic Institution” or a “Media Outlet” were the least common, at 6.7% each (n = 2; n = 2) (Figure 1). The inter-rater reliability of Rothwell’s Classification of Questions was 90.0%.
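The inter-rater reliability reported above is consistent with simple percent agreement between the two raters; the sketch below illustrates that calculation with hypothetical ratings (the actual rating data are not reproduced here).

    # Hypothetical ratings from two independent raters across 30 questions,
    # using Rothwell's categories (Fact, Policy, Value); 27/30 matches -> 90.0%.
    rater_a = ["Fact"] * 24 + ["Policy"] * 4 + ["Value"] * 2
    rater_b = ["Fact"] * 21 + ["Policy"] * 7 + ["Value"] * 2

    agreements = sum(a == b for a, b in zip(rater_a, rater_b))
    percent_agreement = 100 * agreements / len(rater_a)
    print(f"Percent agreement: {percent_agreement:.1f}%")  # 90.0%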
Regarding the quality of websites, the Brief DISCERN instrument has a maximum score of 30, and the average Brief DISCERN score in this analysis was 9.3 (SD = 4.9; range = 1 to 17) (Figure 2). The grade reading level of the extracted articles was calculated across six readability formulas. The average Flesch–Kincaid grade level was 11.4 (SD = 3.5). The average Flesch Reading Ease score was 43.6 (SD = 13.7), which is classified as “difficult to read” or college level (Figure 3). The average Gunning–Fog score was 13.1 (SD = 2.8). The average Coleman–Liau Index was 12.3 (SD = 2.4). The average SMOG score was 10.3 (SD = 2.8). The average Linsear Write score was 11.1 (SD = 4.9) (Table 1). There was no statistically significant correlation between Flesch Reading Ease scores and Brief DISCERN scores (p = 0.17).
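Descriptive statistics were computed in Stata; as an illustration of how the reported correlation could be tested, a Python sketch with hypothetical per-article scores is shown below. The choice of a Pearson correlation is an assumption, since the manuscript does not name the specific test.

    from scipy import stats

    # Hypothetical per-article scores (the study's raw data are not reproduced here).
    flesch_reading_ease = [43.6, 51.2, 38.0, 60.5, 29.8, 45.1, 55.0, 40.2]
    brief_discern       = [9,    12,   7,    17,   5,    10,   13,   8]

    r, p = stats.pearsonr(flesch_reading_ease, brief_discern)
    print(f"Pearson r = {r:.2f}, p = {p:.3f}")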

4. Discussion

This study aimed to describe the content, readability, and quality of publicly available digital educational resources on xanthelasma palpebrarum [26,27]. Utilization of the internet as a repository of educational resources has allowed for increased dissemination of information [15,26]. However, the paucity of regulatory mechanisms to monitor the complexity and authenticity of these resources likely contributes to the findings of our study. Specifically, the ability of patients to comprehend these educational resources was evaluated through readability calculations. These calculations have been established in multiple research studies and can directly indicate whether educational material meets standard metrics such as the recommended grade reading level in the United States, which is between the 6th- and 8th-grade reading levels [28,29,30]. This reading level can be quantified using the readability calculations implemented in this study. For example, a Gunning–Fog score translates directly to a grade category (i.e., a Gunning–Fog score of 8.3 suggests an 8th-grade reading level); this numerical association also applies to the Coleman–Liau Index, SMOG, Linsear Write, and Flesch–Kincaid calculations. The Flesch Reading Ease score instead translates to a level of difficulty, as follows: 0–30 (very difficult), 30–49 (difficult), 50–59 (fairly difficult), 60–69 (standard), 70–79 (fairly easy), 80–89 (easy), and 90–100 (very easy) [31,32,33]. Given the standardized nature of the Flesch Reading Ease score, this study considered scores between 60 and 89 as meeting grade reading level recommendations in the United States, as modeled by the previous literature [15,34]. The findings of this study indicate that the mean readability scores ranged from the 10th-grade to the college reading level (SMOG: 10.3; Gunning–Fog: 13.1). Similarly, Figure 3 demonstrates that the distribution of Flesch Reading Ease scores largely falls outside the recommended range of 60–89. These readability calculations indicate that the currently available patient educational resources on xanthelasma palpebrarum do not meet the recommended grade reading levels in the United States but are instead of higher complexity. This higher complexity may be due to a lack of paraphrasing or simplification of language regarding the subject matter, and it can result in poorer comprehension by patients regarding xanthelasma palpebrarum [35,36]. A potential way to improve these patient educational resources may be through transparent peer review of the articles. Future investigations ought to utilize readability calculations to identify whether there are specific “hotspots” that are more difficult to comprehend regarding xanthelasma palpebrarum (i.e., treatment options and costs).
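To make the band mapping above concrete, a minimal sketch is shown below; it is an illustration rather than part of the study's analysis, and the 60–89 window is the recommendation threshold used in this study.

    # Flesch Reading Ease bands as listed above; 60-89 is the window this study
    # treated as meeting United States grade reading level recommendations.
    FRE_BANDS = [
        (90, "very easy"), (80, "easy"), (70, "fairly easy"), (60, "standard"),
        (50, "fairly difficult"), (30, "difficult"), (0, "very difficult"),
    ]

    def classify_fre(score):
        band = next(label for floor, label in FRE_BANDS if score >= floor)
        meets_recommendation = 60 <= score <= 89
        return band, meets_recommendation

    print(classify_fre(43.6))  # mean score reported here -> ('difficult', False)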
Additionally, the findings of our readability calculations help explain the results of Rothwell’s Classification of Questions in this study. Rothwell’s Classification of Questions was implemented to categorize the frequently asked questions regarding xanthelasma palpebrarum. Specifically, this tool served to identify the specific aspects of xanthelasma palpebrarum on which patients’ inquiries focused. Rothwell’s classification has been implemented in previous literature using internet-based public resources in clinical medicine, but to the best of our knowledge, it has not been applied to xanthelasma palpebrarum before [37,38]. The large majority of questions asked by patients online were categorized as “Fact”-based questions (80%), which indicates that the currently available literature does not effectively answer the factual questions patients ask about xanthelasma palpebrarum. The further subclassification of these questions indicates that the technical details of xanthelasma palpebrarum are less well known, prompting patients to ask questions to fill this knowledge gap. Likewise, this knowledge gap may be partly attributable to the poor readability of the current educational materials on xanthelasma palpebrarum. Further statistical analysis ought to be performed in future studies to investigate this hypothesis.
Brief DISCERN was implemented as the third tool (alongside the readability calculations and Rothwell’s Classification of Questions) to address the aim of this study. This tool has been validated to serve as a threshold marker for quality, evidence-based information. In this dataset, however, most educational materials on xanthelasma palpebrarum were classified as low quality regardless of source (i.e., academic institutions, commercial sources, government websites, etc.). Only two resources were classified as good quality. Specifically, the Brief DISCERN instrument identified deficits in the clarity of the references used, in how well the associated question was addressed, and in the sources of support. The tool therefore categorizes these materials as low quality, compounding the poor readability found in this study [24,39,40,41,42,43,44].
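As context for the scoring described above, Brief DISCERN sums six items to a maximum total of 30, with a total of at least 16 used here as the good-quality threshold; the sketch below is purely illustrative, and the item labels are paraphrased rather than the instrument's exact wording.

    # Illustrative Brief DISCERN tally: six items, maximum total of 30, with a
    # total >= 16 treated as good quality in this study. Item labels are
    # paraphrased for illustration, not the instrument's exact wording.
    item_scores = {
        "clear aims": 2,
        "sources of information cited": 1,
        "clarity of references": 2,
        "balance and lack of bias": 2,
        "treatment benefits described": 1,
        "treatment risks described": 1,
    }

    total = sum(item_scores.values())
    print(total, "good quality" if total >= 16 else "low quality")  # 9 low quality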
To our knowledge, this is the first cross-sectional study to analyze the quality, readability, and content of patient educational materials on xanthelasma palpebrarum. As the first study to address this aim, it has limitations that future studies can address. For example, this study implemented the RankBrain algorithm. This tool has been utilized in previous literature, and Google holds more than 90% of the search engine market share [45,46]. However, this study did not account for other search engines, and future studies should determine whether the present findings are consistent across other search engine systems. Similarly, Rothwell’s Classification of Questions is subjective in nature, which may introduce bias into the raters’ scoring [16,38,47,48,49,50,51]. However, the high inter-rater reliability observed in this study supports the validity of the raters’ scoring.

5. Conclusions

Xanthelasma palpebrarum may indicate underlying chronic diseases, such as dyslipidemia, which significantly contribute to general morbidity and mortality. Patients with xanthelasma are turning to the internet for education on this condition. This study implemented tools established in the literature to profile the quality, readability, and content of patient educational materials on xanthelasma palpebrarum. The overall profile of these materials is poor, indicating a need for the dissemination of higher-quality materials. This finding is important for clinicians aiming to educate their patients on this condition. Moreover, we recommend that clinicians involved in the diagnosis and management of xanthelasma palpebrarum educate their patients directly in the clinical setting to circumvent the currently poor profile of online educational materials; in this way, patients may rely on their providers for health literacy rather than on unreliable online sources. Furthermore, we aim to repeat this analysis for another serious or high-burden disease.

Author Contributions

Conceptualization, K.J.V. and S.P.S.; methodology, S.P.S.; software, F.M.Q.; validation, K.J.V., S.P.S. and S.S.; formal analysis, S.P.S.; investigation, S.P.S.; resources, F.Q.; data curation, S.S.; writing—original draft preparation, K.J.V.; writing—review and editing, S.P.S.; visualization, A.R.; supervision, F.Q.; project administration, K.J.V. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Zak, A.; Zeman, M.; Slaby, A.; Vecka, M. Xanthomas: Clinical and Pathophysiological Relations. Biomed. Pap. Med. Fac. Univ. Palacky Olomouc. Czech. Repub. 2014, 158, 181–188. [Google Scholar] [CrossRef]
  2. Digital Health Education: The Need for a Digitally Ready Workforce|ADC Education & Practice Edition. Available online: https://ep.bmj.com/content/108/3/214 (accessed on 12 July 2023).
  3. Utilizing Digital Health Technologies for Patient Education in Lifestyle Medicine—PMC. Available online: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7092400/ (accessed on 12 July 2023).
  4. Schnitman, G.; Wang, T.; Kundu, S.; Turkdogan, S.; Gotlieb, R.; How, J.; Gotlieb, W. The Role of Digital Patient Education in Maternal Health: A Systematic Review. Patient Educ. Couns. 2022, 105, 586–593. [Google Scholar] [CrossRef]
  5. Shahid, R.; Shoker, M.; Chu, L.M.; Frehlick, R.; Ward, H.; Pahwa, P. Impact of Low Health Literacy on Patients’ Health Outcomes: A Multicenter Cohort Study. BMC Health Serv. Res. 2022, 22, 1148. [Google Scholar] [CrossRef]
  6. Miller, T.A. Health Literacy and Adherence to Medical Treatment in Chronic and Acute Illness: A Meta-Analysis. Patient Educ. Couns. 2016, 99, 1079–1086. [Google Scholar] [CrossRef]
  7. Zirwas, M.J.; Holder, J.L. Patient Education Strategies in Dermatology. J. Clin. Aesthet. Dermatol. 2009, 2, 28–34. [Google Scholar]
  8. Lynch, M.C.; Wood, L.; Anderson, B.E.; Clarke, L.E. JAAD Grand Rounds. Asymptomatic Dermal Plaques. J. Am. Acad. Dermatol. 2015, 72, 922–924. [Google Scholar] [CrossRef]
  9. Bergman, R. The Pathogenesis and Clinical Significance of Xanthelasma Palpebrarum. J. Am. Acad. Dermatol. 1994, 30, 236–242. [Google Scholar] [CrossRef]
  10. Parker, F.; Bagdade, J.D.; Odland, G.F.; Bierman, E.L. Evidence for the Chylomicron Origin of Lipids Accumulating in Diabetic Eruptive Xanthomas: A Correlative Lipid Biochemical, Histochemical, and Electron Microscopic Study. J. Clin. Investig. 1970, 49, 2172–2187. [Google Scholar] [CrossRef]
  11. Cruz, P.D.; East, C.; Bergstresser, P.R. Dermal, Subcutaneous, and Tendon Xanthomas: Diagnostic Markers for Specific Lipoprotein Disorders. J. Am. Acad. Dermatol. 1988, 19, 95–111. [Google Scholar] [CrossRef]
  12. Nair, P.A.; Singhal, R. Xanthelasma Palpebrarum—A Brief Review. Clin. Cosmet. Investig. Dermatol. 2017, 11, 1–5. [Google Scholar] [CrossRef]
  13. Malekzadeh, H.; Ormseth, B.; Janis, J.E. A Practical Review of the Management of Xanthelasma Palpebrarum. Plast. Reconstr. Surg. Glob. Open 2023, 11, e4982. [Google Scholar] [CrossRef] [PubMed]
  14. Szalat, R.; Arnulf, B.; Karlin, L.; Rybojad, M.; Asli, B.; Malphettes, M.; Galicier, L.; Vignon-Pennamen, M.-D.; Harel, S.; Cordoliani, F.; et al. Pathogenesis and Treatment of Xanthomatosis Associated with Monoclonal Gammopathy. Blood 2011, 118, 3777–3784. [Google Scholar] [CrossRef] [PubMed]
  15. Comprehension Profile of Patient Education Materials in Endocrine Care—PubMed. Available online: https://pubmed.ncbi.nlm.nih.gov/35899057/ (accessed on 14 July 2023).
  16. Shen, T.S.; Driscoll, D.A.; Islam, W.; Bovonratwet, P.; Haas, S.B.; Su, E.P. Modern Internet Search Analytics and Total Joint Arthroplasty: What Are Patients Asking and Reading Online? J. Arthroplast. 2020, 36, 1224–1231. [Google Scholar] [CrossRef] [PubMed]
  17. Sajjadi, N.B.; Shepard, S.; Ottwell, R.; Murray, K.; Chronister, J.; Hartwell, M.; Vassar, M. Examining the Public’s Most Frequently Asked Questions Regarding COVID-19 Vaccines Using Search Engine Analytics in the United States: Observational Study. JMIR Infodemiology 2021, 1, e28740. [Google Scholar] [CrossRef]
  18. Kanthawala, S.; Vermeesch, A.; Given, B.; Huh, J. Answers to Health Questions: Internet Search Results Versus Online Health Community Responses. J. Med. Internet. Res. 2016, 18, e95. [Google Scholar] [CrossRef] [PubMed]
  19. Corcelles, R.; Daigle, C.R.; Talamas, H.R.; Brethauer, S.A.; Schauer, P.R. Assessment of the Quality of Internet Information on Sleeve Gastrectomy. Surg. Obes. Relat. Dis. 2015, 11, 539–544. [Google Scholar] [CrossRef] [PubMed]
  20. Jones, J.; Cassie, S.; Thompson, M.; Atherton, I.; Leslie, S.J. Delivering Healthcare Information via the Internet: Cardiac Patients’ Access, Usage, Perceptions of Usefulness, and Web Site Content Preferences. Telemed J. E. Health 2014, 20, 223–227, quiz 228. [Google Scholar] [CrossRef]
  21. Alfayad, K.; Murray, R.L.; Britton, J.; Barker, A.B. Content Analysis of Netflix and Amazon Prime Instant Video Original Films in the UK for Alcohol, Tobacco and Junk Food Imagery. J. Public Health 2022, 44, 302–309. [Google Scholar] [CrossRef]
  22. Saleh, D.; Fisher, J.H.; Provencher, S.; Liang, Z.; Ryerson, C.J.; Weatherald, J. A Systematic Evaluation of the Quality, Accuracy, and Reliability of Internet Websites about Pulmonary Arterial Hypertension. Ann. Am. Thorac. Soc. 2022, 19, 1404–1413. [Google Scholar] [CrossRef]
  23. Kwan, J.Y.; Stocco, F.; Scott, D.J.A.; Bailey, M.A.; Coughlin, P.A. Assessment of Internet-Based Information on Statin Therapy. Eur. J. Cardiovasc. Nurs. 2023, zvad061. [Google Scholar] [CrossRef]
  24. Khazaal, Y.; Chatton, A.; Cochand, S.; Coquard, O.; Fernandez, S.; Khan, R.; Billieux, J.; Zullino, D. Brief DISCERN, Six Questions for the Evaluation of Evidence-Based Content of Health-Related Websites. Patient Educ. Couns. 2009, 77, 33–37. [Google Scholar] [CrossRef]
  25. DISCERN: An Instrument for Judging the Quality of Written Consumer Health Information on Treatment Choices—Google Search. Available online: https://www.google.com/search?q=DISCERN%3A+an+instrument+for+judging+the+quality+of+written+consumer+health+information+on+treatment+choices&rlz=1C1CHBF_enUS879US879&oq=DISCERN%3A+an+instrument+for+judging+the+quality+of+written+consumer+health+information+on+treatment+choices&aqs=chrome..69i57j69i58.1121j0j4&sourceid=chrome&ie=UTF-8 (accessed on 14 July 2023).
  26. Fishman, J.; Greenberg, P.; Bagga, M.B.; Casarett, D.; Propert, K. Comparing Strategies for Health Information Dissemination: Messengers That Can Help or Hinder. Am. J. Health Promot. 2018, 32, 932–938. [Google Scholar] [CrossRef]
  27. Huang, C.; Yan, D.; Liang, S. The Relationship between Information Dissemination Channels, Health Belief, and COVID-19 Vaccination Intention: Evidence from China. J. Environ. Public Health 2023, 2023, 6915125. [Google Scholar] [CrossRef]
  28. Eltorai, A.E.M.; Ghanian, S.; Adams, C.A.; Born, C.T.; Daniels, A.H. Readability of Patient Education Materials on the American Association for Surgery of Trauma Website. Arch. Trauma. Res. 2014, 3, e18161. [Google Scholar] [CrossRef] [PubMed]
  29. Readability of Patient Education Materials From High-Impact Medical Journals: A 20-Year Analysis—Michael K Rooney, Gaia Santiago, Subha Perni, David P Horowitz, Anne R McCall, Andrew J Einstein, Reshma Jagsi, Daniel W Golden. 2021. Available online: https://journals.sagepub.com/doi/full/10.1177/2374373521998847 (accessed on 14 July 2023).
  30. Hutchinson, N.; Baird, G.L.; Garg, M. Examining the Reading Level of Internet Medical Information for Common Internal Medicine Diagnoses. Am. J. Med. 2016, 129, 637–639. [Google Scholar] [CrossRef] [PubMed]
  31. Jindal, P.; MacDermid, J.C. Assessing Reading Levels of Health Information: Uses and Limitations of Flesch Formula. Educ. Health 2017, 30, 84–88. [Google Scholar] [CrossRef] [PubMed]
  32. Symons, T.; Davis, J.S. Creating Concise and Readable Patient Information Sheets for Interventional Studies in Australia: Are We There Yet? Trials 2022, 23, 794. [Google Scholar] [CrossRef] [PubMed]
  33. Priyanka, P.; Hadi, Y.B.; Reynolds, G.J. Analysis of the Patient Information Quality and Readability on Esophagogastroduodenoscopy (EGD) on the Internet. Can J. Gastroenterol. Hepatol. 2018, 2018, 2849390. [Google Scholar] [CrossRef]
  34. Akinleye, S.D.; Garofolo-Gonzalez, G.; Montuori, M.; Culbertson, M.D.; Hashem, J.; Edelstein, D.M. Readability of the Most Commonly Accessed Online Patient Education Materials Pertaining to Pathology of the Hand. Hand 2018, 13, 705–714. [Google Scholar] [CrossRef]
  35. Mazor, K.M.; Roblin, D.W.; Williams, A.E.; Greene, S.M.; Gaglio, B.; Field, T.S.; Costanza, M.E.; Han, P.K.J.; Saccoccio, L.; Calvi, J.; et al. Health Literacy and Cancer Prevention: Two New Instruments to Assess Comprehension. Patient Educ. Couns. 2012, 88, 54–60. [Google Scholar] [CrossRef]
  36. Peterson, P.N.; Shetterly, S.M.; Clarke, C.L.; Bekelman, D.B.; Chan, P.S.; Allen, L.A.; Matlock, D.D.; Magid, D.J.; Masoudi, F.A. Health Literacy and Outcomes among Patients with Heart Failure. JAMA 2011, 305, 1695–1701. [Google Scholar] [CrossRef] [PubMed]
  37. Rothwell, P.M.; Eliasziw, M.; Gutnikov, S.A.; Warlow, C.P.; Barnett, H.J.M. Carotid Endarterectomy Trialists Collaboration Endarterectomy for Symptomatic Carotid Stenosis in Relation to Clinical Subgroups and Timing of Surgery. Lancet 2004, 363, 915–924. [Google Scholar] [CrossRef] [PubMed]
  38. McCormick, J.R.; Kerzner, B.; Tuthill, T.A.; Khan, Z.A.; Hodakowski, A.J.; Damodar, D.; Fortier, L.M.; Dasari, S.P.; Nho, S.J.; Chahla, J. Patients With Femoroacetabular Impingement Obtain Information From Low-Quality Sources Online and Are Most Interested in Conservative Treatment and Expected Recovery. Arthrosc. Sports Med. Rehabil. 2023, 5, e21–e27. [Google Scholar] [CrossRef] [PubMed]
  39. Banasiak, N.C.; Meadows-Oliver, M. Evaluating Asthma Websites Using the Brief DISCERN Instrument. JAA 2017, 10, 191–196. [Google Scholar] [CrossRef]
  40. Kaicker, J.; Borg Debono, V.; Dang, W.; Buckley, N.; Thabane, L. Assessment of the Quality and Variability of Health Information on Chronic Pain Websites Using the DISCERN Instrument. BMC Med. 2010, 8, 59. [Google Scholar] [CrossRef]
  41. Rees, C.E.; Ford, J.E.; Sheard, C.E. Evaluating the Reliability of DISCERN: A Tool for Assessing the Quality of Written Patient Information on Treatment Choices. Patient Educ. Couns. 2002, 47, 273–275. [Google Scholar] [CrossRef]
  42. Batchelor, J.M.; Ohya, Y. Use of the DISCERN instrument by patients and health professionals to assess information resources on treatments for asthma and atopic dermatitis. Allergol. Int. 2009, 57, 141–145. [Google Scholar] [CrossRef]
  43. AutoDiscern: Rating the Quality of Online Health Information with Hierarchical Encoder Attention-Based Neural Networks—PMC. Available online: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7285491/ (accessed on 14 July 2023).
  44. Toward Automated Assessment of Health Web Page Quality Using the DISCERN Instrument|Journal of the American Medical Informatics Association|Oxford Academic. Available online: https://academic.oup.com/jamia/article/24/3/481/2907912 (accessed on 14 July 2023).
  45. Pratt, M.; Searles, G.E. Using Visual Aids to Enhance Physician-Patient Discussions and Increase Health Literacy. J. Cutan Med. Surg. 2017, 21, 497–501. [Google Scholar] [CrossRef]
  46. Skalitzky, M.K.; Gulbrandsen, T.R.; Lorentzen, W.; Gao, B.; Shamrock, A.G.; Weinstein, S.L.; Morcuende, J.A. Health Literacy in Clubfoot: A Quantitative Assessment of the Readability, Understandability and Actionability of Online Patient Education Material. Iowa. Orthop. J. 2021, 41, 61–67. [Google Scholar]
  47. Hodakowski, A.J.; McCormick, J.R.; Damodar, D.; Cohn, M.R.; Carey, K.D.; Verma, N.N.; Nicholson, G.; Garrigues, G.E. Rotator Cuff Repair: What Questions Are Patients Asking Online and Where Are They Getting Their Answers? Clin. Shoulder. Elb. 2023, 26, 25–31. [Google Scholar] [CrossRef]
  48. Foster, B.K.; Brule, N.R.; Callahan, C.; Baylor, J.; Klena, J.C.; Grandizio, L.C. Online Information Related to Symptoms of Carpal Tunnel Syndrome: A Google Search Analysis. Cureus 2023, 15, e35586. [Google Scholar] [CrossRef] [PubMed]
  49. Shepard, S.; Sajjadi, N.B.; Checketts, J.X.; Hughes, G.; Ottwell, R.; Chalkin, B.; Hartwell, M.; Vassar, M. Examining the Public’s Most Frequently Asked Questions About Carpal Tunnel Syndrome and Appraising Online Information About Treatment. Hand 2022, 15589447221142896. [Google Scholar] [CrossRef] [PubMed]
  50. Murray, E.; Hekler, E.B.; Andersson, G.; Collins, L.M.; Doherty, A.; Hollis, C.; Rivera, D.E.; West, R.; Wyatt, J.C. Evaluating Digital Health Interventions: Key Questions and Approaches. Am. J. Prev. Med. 2016, 51, 843–851. [Google Scholar] [CrossRef] [PubMed]
  51. Suresh, N.; Fritz, C.; De Ravin, E.; Rajasekaran, K. Modern Internet Search Analytics and Thyroidectomy: What Are Patients Asking? World J. Otorhinolaryngol.—Head Neck Surg. 2023, preprint. [Google Scholar] [CrossRef]
Figure 1. Data Source Classification.
Figure 2. Distribution of Brief DISCERN Scores.
Figure 3. Flesch Reading Ease Score Distribution.
Table 1. Average Readability Scores of Digital Resources regarding Xanthelasma Palpebrarum.

            Flesch–Kincaid   Flesch Reading Ease   Gunning–Fog   Coleman–Liau Index   SMOG   Linsear Write
Mean        11.4             42.1                  13.1          12.3                 10.3   11.1
σ (SD)      3.5              17.7                  2.8           2.4                  2.8    4.9
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
