
Tinnitus information online – does it ring true?

Published online by Cambridge University Press:  24 October 2018

R M McKearney*
Affiliation:
School of Psychological Sciences, University of Manchester, UK Audiology Department, Addenbrooke's Hospital, Cambridge University Hospitals NHS Foundation Trust, UK Audiology Department, Guy's and St Thomas’ NHS Foundation Trust, London, UK
R C MacKinnon
Affiliation:
School of Psychological Sciences, University of Manchester, UK Audiology Department, Addenbrooke's Hospital, Cambridge University Hospitals NHS Foundation Trust, UK Department of Vision and Hearing Sciences, Anglia Ruskin University, Cambridge, UK
M Smith
Affiliation:
Audiology Department, Addenbrooke's Hospital, Cambridge University Hospitals NHS Foundation Trust, UK
R Baker
Affiliation:
School of Psychological Sciences, University of Manchester, UK
*
Author for correspondence: Mr Richard M McKearney, Audiology Department, Guy's Hospital, London, SE1 9RT, UK E-mail: richard.mckearney@nhs.net

Abstract

Objective

To assess, using standardised tools, the quality and readability of online tinnitus information that patients are likely to access.

Methods

A standardised review was conducted of websites relating to tinnitus and its management. Each website was scored using the DISCERN instrument and the Flesch Reading Ease scale.

Results

Twenty-seven unique websites were evaluated. The mean DISCERN score of the websites was 34.5 out of 80 (standard deviation = 11.2). This would be considered ‘fair’ in quality. Variability in DISCERN score between websites was high (range, 15–57: ‘poor’ to ‘very good’). Website readability was poor, with a mean Flesch Reading Ease score of 52.6 (standard deviation = 7.7); this would be considered ‘difficult’ to read.

Conclusion

In general, the quality of tinnitus websites is fair and the readability is poor, with substantial variability in quality between websites. The Action on Hearing Loss and the British Tinnitus Association websites were identified as providing the highest quality information.

Type
Main Articles
Copyright
Copyright © JLO (1984) Limited, 2018 

Introduction

Tinnitus is a common health condition that has a marked impact upon some individuals’ quality of life.1 Tinnitus affects approximately 10 per cent of the UK population, with 4 out of 10 sufferers considering the condition to be moderately or severely annoying.2

Patients attending an ENT clinic will commonly use the internet to research their condition, but little is known about the quality of the information available or its readability.3 Patients who access online health information are unlikely to use academic or medical databases, but rather online search engines. Entering the term ‘tinnitus’ into the most widely used internet search engine, Google, currently produces 5.62 million search results (February 2018). Some patients place significant credence on the online information they find,4 and do not necessarily discuss this information with their clinician.5 It is therefore clearly important for clinicians to be aware of the types and quality of information that patients may access. The present study aimed to assess the quality and readability of tinnitus information accessible to patients on the internet.

The percentage of UK households with internet access has increased from 10 per cent in 1998 to 90 per cent in 2017, with 80 per cent of adults using it daily or almost daily.6 Some individuals use the internet to access health-related information. Diaz et al.5 conducted a survey of 1000 randomly selected primary care patients. Of the 512 respondents, 54 per cent reported having accessed online health information. Of these, 60 per cent considered the online information to be equal to or surpassing that provided by their general practitioner. Notably, 59 per cent of respondents did not discuss this online information with their general practitioner.

The impact of patients utilising online healthcare information is not fully known, although it is possible to speculate over the potential advantages and limitations. Potential benefits include enabling patients to take a more active role in managing their health, promoting autonomy and enabling them to make more well-informed decisions regarding their treatment.7 The internet can provide support to patients through online fora and through the provision of details of group support meetings.8 Information of poor quality may misinform patients, potentially leading to anxiety, stress and unnecessary visits to their general practitioner.9,10 Despite these limitations, it is postulated that online health information can serve patients in a positive manner, especially if its use is guided by clinicians.11

Patients with ENT conditions have been shown to readily access online health information prior to attending appointments. Tassone et al.3 surveyed 535 ENT outpatients prior to their clinic appointment. Sixty-four per cent (n = 344) had internet access, and, of those, 18 per cent had accessed the internet to seek information about their condition. In a similar study of parents of children visiting an ENT specialist,4 30 per cent of the 501 respondents had accessed online health information prior to their appointment, 26 per cent of whom reported that the online information accessed had influenced their management decisions regarding their child's care.

Despite the notable use of online health information by ENT patients, much remains unknown about the quality of the information available and its readability. One study by Pothier12 measured the readability of online information in relation to ‘glue ear’ using the Flesch Reading Ease score. The Flesch Reading Ease score is an objective, quantitative test of readability (how easy a text is to read).13,14 Pothier found the level of readability of the material to be ‘difficult’ to ‘very difficult’, placing it above the reading age of the average UK adult.12

A study by Kieran et al.15 sought to evaluate the accountability and quality of online information on tinnitus. The assessment of accountability comprised four criteria that contributed to an accountability score: authorship, attribution, disclosure and currency. A sample of 90 websites (30 from each of 3 separate search engines) was initially obtained using the search term ‘tinnitus’. The authors used their own 10-point Tinnitus Information Value scale to assess the 39 websites they reviewed. Kieran et al.15 found the mean Tinnitus Information Value score to be 5 out of 10. The accountability score of the websites was very low (2 out of 7), with 27 out of 39 websites omitting the name of the author of the information. The Tinnitus Information Value scale is not validated, and its narrow range may cause it to have a ceiling and/or floor effect, making differentiation between website quality at the ends of the scale less meaningful.

Assessing the quality of websites used by patients is challenging, not least because of the lack of consensus on how this task is best achieved. A systematic review by Eysenbach et al.16 identified 79 studies in which the authors systematically searched for online healthcare information and assessed it quantitatively. Eysenbach et al. considered the best studies to be those that used demonstrably reliable evaluation tools. Two of the studies evaluated used the DISCERN instrument to assess online information.17 The DISCERN instrument is a standardised assessment tool created by an expert panel (including clinicians and leaders in health information) to assess the quality of written health information. It has been shown to have a high degree of validity and suitable inter-observer agreement.18,19

With a significant proportion of patients likely to be accessing online health information and using it to inform their treatment decisions, it is important for clinicians to be aware of what information is available to patients online, in order to better counsel them and guide them towards resources of known quality.3,5 Whilst the DISCERN instrument has been used to examine the quality of tinnitus information accessed by general practitioners online,20 no study to date appears to have used a validated assessment tool, such as the DISCERN instrument, to carry out an in-depth assessment of the quality of online health information that would typically be accessible to patients on the subject of tinnitus and its management. The present study used a standardised method with validated tools to identify and examine websites likely to be accessed by patients with tinnitus.

Materials and methods

Systematic search

Websites containing information on tinnitus and its management were systematically identified. The three most commonly used search engines on desktops, tablets and consoles are: Google (www.google.co.uk), Bing (www.bing.com/?cc=uk), and Yahoo (uk.yahoo.com), which have a combined market share of 96.6 per cent.21 These three search engines were used to identify the websites for review.

In order to find a representative sample of websites that patients with tinnitus might access, the first 15 websites were taken from each search engine, using both the search terms ‘tinnitus’ and ‘noise in ears’, giving a total of 90 potential websites for analysis. This approach was expected to capture at least 95 per cent of any ‘click throughs’ to websites in the results, whilst keeping the sample constrained.22 The online searches were carried out on 12th August 2015. The information quality and readability assessments were conducted over the subsequent month.

Links that were sponsored advertisements were excluded because these are unlikely to be used.23 Other websites excluded from analysis included: websites inaccessible to the general public, non-English-language websites, and websites containing no written content or content irrelevant to the subject of tinnitus management.

Information quality assessment

The quality of the information provided by each website was assessed using the DISCERN instrument, which is available without charge online.17 It consists of 16 separate criteria, each assessing a different aspect of quality considered an essential feature of good quality information, and includes a score for overall quality. Each of these criteria is rated on a scale from 1 to 5 (except question 2, for which the scale is 0 to 5) and the scores are summed. This gives a total score range of 15 (poor) to 80 (very good). The DISCERN instrument is divided into three main sections assessing: the reliability of the information, information relating specifically to treatment choices, and the assessor's global rating of the publication as a whole. Detailed instructions on how to accurately score each criterion are provided in the DISCERN handbook, to promote consistency between assessors.17
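The scoring arithmetic described above is straightforward to make concrete. The sketch below is illustrative only (the criterion wording and detailed scoring guidance live in the DISCERN handbook); the quality bands are the pre-existing categorisation used in this study's results:

```python
def discern_total(ratings):
    """Sum the 16 DISCERN criterion ratings into a total score.

    `ratings` lists the 16 criteria in questionnaire order; question 2
    (index 1) is rated 0-5, all others 1-5, so totals span 15-80.
    """
    if len(ratings) != 16:
        raise ValueError("DISCERN has 16 criteria")
    for i, r in enumerate(ratings):
        low = 0 if i == 1 else 1
        if not low <= r <= 5:
            raise ValueError(f"question {i + 1} rating out of range")
    return sum(ratings)


def discern_band(total):
    """Map a total to the quality bands used in this study."""
    for upper, label in ((28, "poor"), (41, "fair"), (54, "good"),
                         (67, "very good"), (80, "excellent")):
        if total <= upper:
            return label
    raise ValueError("total above 80")
```

For example, a website rated 2 on every criterion would total 32 and fall in the ‘fair’ band, close to the mean of 34.5 reported below.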

In order to assess the inter-observer reliability of the website assessments using the DISCERN instrument, a random sample of 15 websites was selected for a blinded second assessor to score independently. The level of inter-observer reliability was then evaluated.

Readability assessment

The level of readability for each website was assessed using the Flesch Reading Ease score.13 This was calculated in Microsoft Word software using the formula shown in Figure 1.

Fig. 1. The Flesch Reading Ease (FRE) score, as calculated by the word processing software Microsoft Word. The formula provides a method of objectively assessing the readability of a text. Adapted from Flesch.13

The Flesch Reading Ease score takes factors such as the number of words per sentence and the number of syllables per word into account to give a score from 0 to 100, with a high-scoring text being more easily understood than one with a low score. A text with a score of 71–100 is considered ‘easy’ to read, with the average 11-year-old able to read it with ease. A score of 61–70 is considered of ‘standard’ difficulty, with children aged 13–15 years able to read it clearly. A text with a score of 60 or below is considered ‘difficult’ to read.24
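The formula referred to in Figure 1 is the standard Flesch (1948) expression, FRE = 206.835 − 1.015 × (words ÷ sentences) − 84.6 × (syllables ÷ words). A minimal sketch follows, with the reading-ease bands just described; syllable counting itself is left out, as word processors implement it heuristically:

```python
def flesch_reading_ease(words, sentences, syllables):
    """Flesch Reading Ease from raw text counts (Flesch, 1948)."""
    return (206.835
            - 1.015 * (words / sentences)   # sentence-length penalty
            - 84.6 * (syllables / words))   # word-length penalty


def fre_band(score):
    """Reading-ease bands as categorised by D'Alessandro et al."""
    if score >= 71:
        return "easy"       # readable by an average 11-year-old
    if score >= 61:
        return "standard"   # readable by 13-15 year olds
    return "difficult"
```

A 100-word passage of 5 sentences and 150 syllables scores about 59.6, i.e. ‘difficult’, close to this study's mean of 52.6.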

Statistical methods

A priori statistical tests were performed using IBM® SPSS software, version 24. The effect of the search term used (‘tinnitus’ or ‘noise in ears’) on DISCERN and Flesch Reading Ease scores was assessed with an independent-samples, two-tailed Student's t-test on results not duplicated between search terms. The effect of website ranking in the search engine results list, and the correlation between DISCERN and Flesch Reading Ease scores, were assessed by calculating Spearman's rank correlation coefficients. Inter-observer reliability between assessors using the DISCERN instrument was assessed by calculating Cohen's kappa with quadratic weighting for a random sample of 15 websites rated blindly and independently by a second assessor, in a manner similar to the DISCERN instrument's validation study.18
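For illustration, quadratic-weighted Cohen's kappa for a pair of raters can be sketched in a few lines; this is a plain re-implementation of the standard formula, not the SPSS routine used in the study:

```python
def quadratic_weighted_kappa(rater1, rater2, categories):
    """Cohen's kappa with quadratic weights for two raters.

    Disagreements are penalised by the squared distance between the
    ordered categories, so near-misses cost less than large misses.
    """
    k = len(categories)
    index = {c: i for i, c in enumerate(categories)}
    n = len(rater1)
    # Observed joint distribution of the two raters' scores
    observed = [[0.0] * k for _ in range(k)]
    for a, b in zip(rater1, rater2):
        observed[index[a]][index[b]] += 1 / n
    # Marginal distributions; chance-expected agreement comes from these
    marg1 = [sum(row) for row in observed]
    marg2 = [sum(observed[i][j] for i in range(k)) for j in range(k)]
    num = den = 0.0
    for i in range(k):
        for j in range(k):
            w = (i - j) ** 2 / (k - 1) ** 2   # quadratic weight
            num += w * observed[i][j]
            den += w * marg1[i] * marg2[j]
    return 1.0 - num / den
```

Identical ratings give κ = 1, while a one-step disagreement on an ordinal 1–5 scale lowers κ far less than an unweighted kappa would, which is the point of the quadratic weighting for ordered DISCERN-style ratings.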

Results

Websites analysed

Of the initial sample of 90 results (15 for each of the 2 search terms, giving 30 from each of the 3 search engines), 35 unique websites were identified. The remaining 55 results were duplicates of the 35 unique websites and were excluded. A further 8 of the 35 remaining unique websites were excluded from analysis because they did not contain any information relating to tinnitus management (3 online newspaper articles, 2 online magazine articles, 2 standard webpages and 1 webpage with no written information), leaving 27 unique relevant websites remaining for analysis (Figure 2).

Fig. 2. Flow chart of search results, showing steps of excluding duplicate and irrelevant results.

Information quality

The mean DISCERN score of the 27 unique websites was 34.5 (range, 15–57; standard deviation (SD) = 11.2) (Figure 3). Based on a pre-existing categorisation of scores,25,26 the mean DISCERN score of 34.5 would be considered to represent information of ‘fair’ quality. Of the 27 websites, 9 websites had a ‘poor’ quality score (range, 15–28), 12 websites had a ‘fair’ quality score (range, 29–41), 4 websites had a ‘good’ quality score (range, 42–54) and 2 websites had a ‘very good’ quality score (range, 55–67); no websites had an ‘excellent’ quality score (range, 68–80).

Fig. 3. Box-and-whisker plots for DISCERN and Flesch Reading Ease (FRE) scores for the 27 websites included in the analysis.

Readability

The mean Flesch Reading Ease score was 52.6 (SD = 7.7). Using a pre-existing categorisation of scores,24 text of this score would be considered ‘difficult’ to read. The range of Flesch Reading Ease scores was 35.7–64.2, spanning ‘difficult’ to ‘standard’ reading ease.

A summary table of DISCERN and Flesch Reading Ease scores for the 27 websites analysed is shown in Table 1 and in box-and-whisker plots in Figure 3.

Table 1. Summarised DISCERN and Flesch Reading Ease scores*

* Using previously published categorisations for the scores.24–26

Information quality and readability association

No statistically significant correlation was found between information quality (DISCERN) and readability (Flesch Reading Ease) scores (Spearman's r = 0.24, p = 0.24). A scatter plot of DISCERN scores against Flesch Reading Ease scores is shown in Figure 4.
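The correlation just reported is Pearson's correlation computed on ranks, with tied scores given their average rank; a self-contained sketch of that calculation:

```python
def spearman_r(x, y):
    """Spearman's rank correlation: Pearson's r on average ranks."""
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        out = [0.0] * len(values)
        i = 0
        while i < len(order):
            j = i
            # extend j over the run of tied values
            while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1   # average 1-based rank of the tie run
            for t in range(i, j + 1):
                out[order[t]] = avg
            i = j + 1
        return out

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sd = (sum((a - mx) ** 2 for a in rx)
          * sum((b - my) ** 2 for b in ry)) ** 0.5
    return cov / sd
```

Any monotonically increasing relationship yields r = 1 regardless of its shape, which is why rank correlation suits ordinal, non-normally distributed scores such as these.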

Fig. 4. Scatter chart of DISCERN score against Flesch Reading Ease (FRE) score. No significant correlation was detected (Spearman's r = 0.24, p = 0.24).

Inter-observer reliability

Quadratic-weighted kappa showed moderate agreement between the two independent raters (κ = 0.69, 95 per cent confidence interval (CI) = 0.50–0.88). This is slightly greater than that reported by Charnock et al.18 for an expert panel when validating the DISCERN instrument (κ = 0.53, 95 per cent CI = 0.48–0.59), and is greater than their stated level for acceptable agreement (κ = 0.4). This finding provides reassurance that the DISCERN instrument was used in a reliable manner.

Search terms used

Of the 27 unique websites, 11 appeared using both search terms and 16 appeared for one search term only. The mean DISCERN score when using the search term ‘tinnitus’ was 36.8 (SD = 10.2) and for ‘noise in ears’ was 29.9 (SD = 9.8). The mean Flesch Reading Ease score for ‘tinnitus’ was 51.8 (SD = 8.5) and for ‘noise in ears’ was 53.2 (SD = 7.7). When comparing websites produced using each search term, excluding duplicates, there was no statistically significant difference in DISCERN score (t(14) = 1.004, p = 0.33) or Flesch Reading Ease score (t(14) = 1.085, p = 0.29).
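The t statistics above (df = 14) correspond to a pooled-variance two-sample comparison of the 16 websites not duplicated across search terms; an illustrative sketch of the statistic (obtaining p values would still require a t distribution):

```python
def pooled_t(sample1, sample2):
    """Student's independent-samples t statistic with pooled variance.

    Returns (t, degrees of freedom); df = n1 + n2 - 2.
    """
    n1, n2 = len(sample1), len(sample2)
    m1, m2 = sum(sample1) / n1, sum(sample2) / n2
    # Unbiased sample variances
    v1 = sum((x - m1) ** 2 for x in sample1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in sample2) / (n2 - 1)
    pooled = ((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)
    t = (m1 - m2) / (pooled * (1 / n1 + 1 / n2)) ** 0.5
    return t, n1 + n2 - 2
```

With 16 observations split between the two search-term groups, the degrees of freedom come out as 16 − 2 = 14, matching the values reported above.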

Discussion

Online tinnitus information likely to be accessed by patients is typically ‘difficult’ to read and of only ‘fair’ information quality, as assessed by standardised tools.13,17 The information ranged from ‘standard’ to ‘difficult’ in terms of readability, and from ‘poor’ to ‘very good’ in terms of information quality. Readability was not significantly correlated with information quality. These factors are of potential significance to clinicians discussing and guiding patient access to online health information relating to tinnitus.

Kieran et al.15 used their own Tinnitus Information Value scale to rate websites on the subject of tinnitus. The mean score was 5 (range, 0–10), the midpoint of the range of possible scores. This appears slightly higher than the scores of websites evaluated in this study using the DISCERN instrument, whose mean of 34.5 falls below the midpoint (47.5) of the possible range of DISCERN scores (15–80). However, as the two evaluative tools differ considerably, the DISCERN instrument being a universal health information assessment tool and the Tinnitus Information Value scale being non-validated and tinnitus-specific, a direct comparison is difficult to make. Similarly to Kieran et al.,15 this study found that the quality of information between websites was highly variable.

The DISCERN tool has previously been used to evaluate information relating to tinnitus accessed by general practitioners. Fackrell et al.20 assessed the quality of online information sources used by general practitioners to guide their management decisions and help counsel patients. They used the DISCERN instrument to evaluate the top 10 information sources used by general practitioners to read about tinnitus, as previously identified by El-Shunnar et al.27 The mean DISCERN score of these 10 websites was 47.0 (extrapolated from the average score per question provided by the authors). This score is unsurprisingly greater than the mean DISCERN score of the publicly accessible websites identified using search engines in this study (34.5), as the websites assessed by Fackrell et al. were identified as the sources of information on tinnitus most commonly consulted by general practitioners.27 However, a score of 47.0 would still only be considered ‘good’, suggesting that information quality could be improved further to allow sources to reach the ‘very good’ and ‘excellent’ score categories.

Many other studies have assessed the quality of online patient information in areas of clinical practice other than tinnitus. For example, Som and Gunawardana assessed online patient information relating to chemotherapy, and obtained a mean DISCERN score of 56.1 (SD = 8.8), with a wide range of scores (41–69).28 Cajita et al.29 evaluated online patient information pertaining to heart failure; the mean DISCERN score was 46.0. Both of these studies report a mean DISCERN score higher than that obtained in this study. This could reflect differences in methodology between studies, or simply chance variation. Alternatively, this finding may represent differences in public awareness and funding for different conditions and the associated publicly accessible online information. There may also be more commercial websites targeting patients with tinnitus, which have been reported to contain lower quality information.15 Suggestions to improve the quality and readability of patient-accessible information include the award of accreditation by third-party organisations.30

Health literacy (the ability to apply one's literacy skills to health information) varies widely between patients and is a key social determinant of an individual's health.31,32 The present study found that the search term ‘noise in ears’ produced websites of slightly lower information quality and slightly higher readability than the search term ‘tinnitus’; however, these differences were not statistically significant. The readability of information is a key determinant of how well it is understood. The mean readability of the websites using the Flesch Reading Ease score was 52.6 (range, 35.7–64.2). Text of this score would be rated as ‘difficult’ to read, equating approximately to the reading standard of a 15–18 year old.

A cross-sectional study of 251 healthcare websites with information on 12 common conditions found the mean Flesch Reading Ease score to be 47.5.33 This suggests that health information text is generally ‘difficult’ to read, with slightly worse readability than the websites identified in the present study (mean Flesch Reading Ease score of 52.6). Healthcare information needs to be written in a style that is accessible to the majority of people. With almost a quarter of adult US citizens reading at or below the level expected of a 10–11 year old, the American Medical Association has advised healthcare information providers to write their material at this level or below in order to widen access to healthcare information.34

The readability of health information may act as a barrier to those with less education, and further compound the healthcare inequalities already experienced by these members of society. Simple measures to improve the clarity of language, as promoted by the Plain English Campaign,35 will allow greater and fairer access to healthcare information.

Clinical implications

Many patients seek online information to find out more about their health condition and help guide their decisions regarding treatment choices. Clinicians who treat patients with tinnitus should be aware that the quality and readability of online information is highly variable, and that many patients do not discuss with their clinician the online information that they have read.5

Directing patients towards information sources of known quality will help them make better-informed treatment decisions. The two websites with the highest DISCERN scores in this study were published by Action on Hearing Loss and the British Tinnitus Association.36,37 Although both websites had a Flesch Reading Ease score of 57 (considered ‘difficult’ to read), they scored higher than the mean Flesch Reading Ease score for all websites in this study (52.6).

Clinicians should be aware of the importance of writing healthcare information in a manner accessible to all patients. Tools to assess readability may be used in conjunction with a clear writing style to ensure that patients of all backgrounds have access to high quality information.

Limitations

As the searches were conducted on a computer with a UK Internet Protocol address, the search engines may have preferentially returned UK-based websites. Additionally, patients and clinicians from other parts of the world may use more localised search engines, potentially limiting the applicability of the presented findings to those regions. Whilst no non-English-language websites were identified in our English-language searches, information sources exist in a variety of languages, and may not have the same characteristics as those assessed here.

Although the rating of information quality between assessors using the DISCERN instrument has been shown to have a good degree of inter-observer agreement, the scoring process still requires some subjective assessor input.17 Whilst the DISCERN instrument evaluates a wide variety of factors that contribute towards information quality,17 there are factors it does not consider, such as the factual accuracy of the information, how current the information is, and its readability.

Although the Flesch Reading Ease formula is able to objectively provide information on readability, its approach of calculating a single value from only the numbers of words, syllables and sentences is a reductionist take on a complex psycholinguistic process. Whilst this is a suitable proxy, it does not capture all dimensions of the readability of a text.

Finally, compared to print media, the internet makes it easier both to publish new information and to update pre-existing information. Over time, this study's findings will inevitably become less reflective of currently published information as the information available online gradually changes. Beyond their contemporary utility, this study's results will also serve as a point of comparison for future assessments.

Conclusion

Online tinnitus information likely to be accessed by patients is typically ‘difficult’ to read and of only ‘fair’ information quality, as assessed by standardised tools. With most online resources rated as such, there is substantial room for improvement. Both the DISCERN and Flesch Reading Ease scores were highly variable, reflecting the range of information that can be accessed online by patients with tinnitus. Readability was not significantly correlated with information quality.

Clinicians who counsel patients with tinnitus should be aware of these findings, especially in the context of the number of patients who access online information to make decisions regarding their treatment and the likelihood that they may not discuss this information with their clinician.

  • Patients commonly use the internet to research their health condition

  • This study evaluated the quality and readability of information that patients with tinnitus are likely to access

  • Generally, the quality of the websites assessed was fair and the readability was poor

  • Information quality varies substantially between websites

The two best sources of online information identified by this study were published by Action on Hearing Loss and the British Tinnitus Association. Authors of patient information may use tools to evaluate readability and information quality, such as the DISCERN instrument, in order to widen access to quality healthcare information.

Competing interests

None declared

Footnotes

Mr R M McKearney takes responsibility for the integrity of the content of the paper

References

1 Nondahl, DM, Cruickshanks, KJ, Dalton, DS, Klein, BE, Klein, R, Schubert, CR et al. The impact of tinnitus on quality of life in older adults. J Am Acad Audiol 2007;18:257–66
2 Davis, A, El Rafaie, A. Epidemiology of tinnitus. In: Tyler, RS, ed. Tinnitus Handbook. San Diego: Singular, Thomson Learning, 2000;123
3 Tassone, P, Georgalas, C, Patel, NN, Appleby, E, Kotecha, B. Do otolaryngology out-patients use the internet prior to attending their appointment? J Laryngol Otol 2004;118:34–8
4 Glynn, RW, O'Duffy, F, O'Dwyer, TP. Patterns of Internet and smartphone use by parents of children attending a pediatric otolaryngology service. Int J Pediatr Otorhinolaryngol 2013;77:699–702
5 Diaz, JA, Griffith, RA, Ng, JJ, Reinert, SE, Friedmann, PD, Moulton, AW. Patients’ use of the Internet for medical information. J Gen Intern Med 2002;17:180–5
7 Gerber, BS, Eiser, AR. The patient–physician relationship in the Internet age: future prospects and the research agenda. J Med Internet Res 2001;3:E15
8 van Uden-Kraan, CF, Drossaert, CH, Taal, E, Seydel, ER, van de Laar, MA. Participation in online patient support groups endorses patients’ empowerment. Patient Educ Couns 2009;74:61–9
9 Eysenbach, G, Diepgen, TL. Towards quality management of medical information on the internet: evaluation, labelling, and filtering of information. BMJ 1998;317:1496–500
10 Murray, E, Lo, B, Pollack, F, Donelan, K, Catania, J, Lee, K et al. The impact of health information on the internet on the physician-patient relationship. Arch Intern Med 2003;163:1727–34
11 Wald, HS, Dube, CE, Anthony, DC. Untangling the Web – the impact of Internet use on health care and the physician-patient relationship. Patient Educ Couns 2007;68:218–24
12 Pothier, DD. Patients and the internet: are websites on glue ear readable? Clin Otolaryngol 2005;30:566
13 Flesch, R. A new readability yardstick. J Appl Psychol 1948;32:221–33
14 Shepperd, S, Charnock, D, Gann, B. Helping patients access high quality health information. BMJ 1999;319:764–6
15 Kieran, SM, Skinner, LJ, Donnelly, M, Smyth, DA. A critical evaluation of Web sites offering patient information on tinnitus. Ear Nose Throat J 2010;89:E11–14
16 Eysenbach, G, Powell, J, Kuss, O, Sa, ER. Empirical studies assessing the quality of health information for consumers on the World Wide Web: a systematic review. JAMA 2002;287:2691–700
17 Charnock, D. The DISCERN Handbook. Oxford: Radcliffe Medical Press, 1998
18 Charnock, D, Shepperd, S, Needham, G, Gann, R. DISCERN: an instrument for judging the quality of written consumer health information on treatment choices. J Epidemiol Community Health 1999;53:105–11
19 Ademiluyi, G, Rees, CE, Sheard, C. Evaluating the reliability and validity of three tools to assess the quality of health information on the Internet. Patient Educ Couns 2003;50:151–5
20 Fackrell, K, Hoare, DJ, Smith, S, McCormack, A, Hall, DA. An evaluation of the content and quality of tinnitus information on websites preferred by general practitioners. BMC Med Inform Decis Mak 2012;12:113
21 StatCounter Global Stats. Top 5 desktop, tablet & console search engines from June 2014 to June 2015. In: http://gs.statcounter.com/#search_engine-ww-monthly-201406-201506/ [22 July 2015]
22 Advanced Web Ranking. 2014 Google organic CTR study. In: https://www.advancedwebranking.com/google-ctr-study-2014.html [15 July 2017]
23 GroupM UK. Evaluating the UK search marketing landscape infographic. In: http://www.mecmanchester.co.uk/evaluating-the-uk-search-marketing-landscape-infographic/ [7 March 2016]
24 D'Alessandro, DM, Kingsley, P, Johnson-West, J. The readability of pediatric patient education materials on the World Wide Web. Arch Pediatr Adolesc Med 2001;155:807–12
25 Daraz, L, MacDermid, JC, Wilkins, S, Gibson, J, Shaw, L. The quality of websites addressing fibromyalgia: an assessment of quality and readability using standardised tools. BMJ Open 2011;1:e000152
26 Hargrave, DR, Hargrave, UA, Bouffet, E. Quality of health information on the Internet in pediatric neuro-oncology. Neuro Oncol 2006;8:175–82
27 El-Shunnar, SK, Hoare, DJ, Smith, S, Gander, PE, Kang, S, Fackrell, K et al. Primary care for tinnitus: practice and opinion among GPs in England. J Eval Clin Pract 2011;17:684–92
28 Som, R, Gunawardana, NP. Internet chemotherapy information is of good quality: assessment with the DISCERN tool. Br J Cancer 2012;107:403
29 Cajita, MI, Rodney, T, Xu, J, Hladek, M, Han, HR. Quality and health literacy demand of online heart failure information. J Cardiovasc Nurs 2016;32:156–64
30 Wilson, P. How to find the good and avoid the bad or ugly: a short guide to tools for rating quality of health information on the internet. BMJ 2002;324:598–602
31 Graham, S, Brookey, J. Do patients understand? Perm J 2008;2:67–9
32 Nutbeam, D. Health literacy as a public health goal: a challenge for contemporary health education and communication strategies into the 21st century. Health Promot Int 2000;15:259–67
33 Cheng, C, Dunn, M. Health literacy and the Internet: a study on the readability of Australian online health information. Aust N Z J Public Health 2015;39:309–14
34 Safeer, RS, Keenan, JK. Health literacy: the gap between physicians and patients. Am Fam Physician 2005;72:463–8
35 Plain English Campaign. In: http://www.plainenglish.co.uk/ [20 March 2016]
36 Action on Hearing Loss. Tinnitus. In: https://www.actiononhearingloss.org.uk/your-hearing/tinnitus.aspx [30 March 2016]
37 British Tinnitus Association. Get support. In: http://www.tinnitus.org.uk/for-health-professionals [28 January 2015]