Heart murmurs are common in healthy infants, children, and adolescents, and most are not pathological: the majority of murmurs in asymptomatic children are innocent. Also known as physiological or functional murmurs, innocent murmurs result from normal patterns of blood flow through the heart and vessels, and they vary with physiological changes in a child’s body, such as an increase in cardiac output or the development of anaemia. The overall incidence of congenital heart disease (CHD) is estimated to be 4–50 per 1000 live births. However, a murmur may be the sole manifestation of serious heart disease.[1–3] Distinguishing pathological from innocent murmurs can be challenging, and the examiner’s experience is crucial for identifying the distinctive properties of an innocent murmur. Heart murmurs in children remain a common cause of referral for assessment by paediatric cardiologists, and a reported 61% of patients referred to cardiology subspecialists for heart murmurs have innocent murmurs.[4]
When innocent murmurs cannot be distinguished from pathological murmurs, further assessment is required, and referral to a paediatric cardiologist is the appropriate next step. However, the identification of a heart murmur increases parental anxiety because heart disease is a possible aetiology; the provision of reassurance and education to family members is therefore crucial.[2] The Internet is increasingly used to learn about health-related issues because online medical information is accessible, relatively inexpensive, and empowers patients to make decisions.[5] It provides many resources that help patients understand and gain knowledge about their conditions, given that 50% of patients leave their doctors’ offices with a poor understanding of their diagnoses.[6] However, low literacy, and specifically low health literacy, can lead to misunderstanding of the literature available online. Poor health literacy also hampers effective communication between physicians and patients, even during face-to-face clinical consultations.[7] Therefore, the readability and quality of Internet-based patient educational materials are important.
Readability describes the level of understanding a person must have to comprehend written material, as determined by a set formula. The readability standard for patient educational materials set by the United States Department of Health and Human Services, American Medical Association, and National Institutes of Health is at or below the sixth-grade level.[4,7] Available algorithms for calculating readability include the Flesch Reading Ease Score, Flesch–Kincaid Grade Level, Simple Measure of Gobbledygook, and Gunning Frequency of Gobbledygook.[7,8]
The Agency for Healthcare Research and Quality developed the Patient Education Materials Assessment Tool for assessment of the overall understandability and actionability of any audiovisual or written information for patients.[9,10] The quality of information provided on websites can be evaluated using two instruments: the global quality score[11] and the Health on the Net code instrument (www.hon.ch/HONcode/).[11–13] The ALEXA tool is used to investigate website popularity and visibility.[14] In the present study, we evaluated the quality, readability, and understandability of online materials on heart murmur, assessing whether these resources are easily read and understood by the public. Our research questions were: for websites targeting the public, what is the quality of these information resources, and do the readability and understandability levels of these online resources match the levels recommended for the public?
Materials and methods
Study design
The Institutional Review Board of the Education Planning Board of the University of Health Sciences Konya Training and Research Hospital exempted this study protocol from review because it was “non-human subject research” (no. 02 May 2019/25–07). A search for heart murmur-related Internet-based patient educational materials was conducted using the Google search engine in May 2019, and the first 230 individual websites from the results were evaluated for quality, understandability, readability, and popularity. Materials that were not patient educational materials, those written in a language other than English, those containing descriptions mainly in graphic or table form, and those consisting of <10 sentences were excluded. Websites targeting physicians, such as UpToDate and Medscape, those requiring subscriptions or fees, and databases such as Google Scholar and PubMed were also excluded. The database was created from websites in the following source categories: academic departments, societies, and organisations; clinics and hospitals; and miscellaneous healthcare-associated (primarily Internet-only) sources. When evaluating the information provided by each website about heart murmurs, we recorded whether the following points were addressed:
Is the murmur defined?
What are the types of heart murmur?
What are heart murmurs in children?
What causes heart murmurs in a child?
What are the symptoms of heart murmurs in a child?
What are possible complications of heart murmurs in a child?
How are heart murmurs diagnosed in a child?
Will I always have a heart murmur?
Could this become a problem as he or she grows up?
How are heart murmurs treated in a child?
Do I need surgery?
When should I call my child’s healthcare provider?
After application of the exclusion criteria, the available information from each included website was stored as a single Microsoft Office Word (Microsoft Corporation, Redmond, WA, USA) file.
Quality assessment
The quality of each website was rated using the global quality score, a validated five-point Likert scale (1 = poor quality; 5 = excellent quality) used to rate the overall quality of information on a website (Table 1).[11] The amount of advertising on each website was scored as none, few, average, or many, with agreement between the two reviewers achieved through discussion. The reviewers assessed the accessibility, quality, and overall flow of information on each website and recorded how useful they believed the website would be to a reader. They assigned a global quality score after evaluating the entire website.
Table 1. Global quality score criteria applied to websites on heart murmur screening
![](https://static.cambridge.org/binary/version/id/urn:cambridge.org:id:binary:20200311151939063-0719:S104795111900307X:S104795111900307X_tab1.png?pub-status=live)
The Health on the Net Foundation developed an automated system for the evaluation of a website’s Health on the Net code conformity.[11] This system focuses on key aspects of health information provided on the Internet, with a code of conduct addressing eight principles: authority (authors’ qualifications), complementarity (the information supports, and does not replace, the doctor–patient relationship), privacy (respect for the privacy and confidentiality of personal data submitted to the site by visitors), attribution (citation of the source(s) of published information; dating of medical and health pages), justifiability (claims relating to benefits and performance are backed up), transparency (accessible presentation; accurate email contact information), financial disclosure (funding sources are identified), and advertising policy (clear distinction of advertising from editorial content). A Health on the Net code expert assesses a candidate website against precise guidelines for each principle. The Health on the Net code has been used widely to assess health-related websites.[11,15–18]
Assessment of understandability
The Patient Education Materials Assessment Tool is a validated tool used to evaluate the understandability and actionability of written and audiovisual Internet-based patient educational materials.[9] Individual understandability items are scored as 0 (disagree), 1 (agree), or not applicable. The total understandability score for an individual article is calculated by summing the item scores, dividing this value by the total possible score, and multiplying by 100. Patient Education Materials Assessment Tool scores are thus expressed as percentages ranging from 0 to 100%; scores of 70% or higher may be considered sufficiently understandable.[9]
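The scoring rule above can be sketched as a short function. This is an illustrative sketch of the published calculation (sum of applicable item scores divided by the possible total, multiplied by 100), not part of the official Auto-Scoring Form, and the item ratings shown are hypothetical:

```python
def pemat_understandability(item_scores):
    """PEMAT understandability as a percentage.

    item_scores: one entry per PEMAT item, each 0 (disagree),
    1 (agree), or None (not applicable). Not-applicable items are
    excluded from both the numerator and the possible total.
    """
    applicable = [s for s in item_scores if s is not None]
    if not applicable:
        raise ValueError("at least one item must be applicable")
    return 100.0 * sum(applicable) / len(applicable)

# Hypothetical 17-item rating: 12 agree, 3 disagree, 2 not applicable.
score = pemat_understandability([1] * 12 + [0] * 3 + [None] * 2)
print(score)          # 80.0
print(score >= 70.0)  # True: meets the 70% understandability threshold
```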
In our study, the Patient Education Materials Assessment Tool for printable materials, which includes 17 items exploring understandability, was used. Each criterion for understandability was scored as present or absent by two reviewers, and differences were reviewed and resolved through consensus with a third reviewer. Data were entered in Microsoft Office Excel, and the Patient Education Materials Assessment Tool Auto-Scoring Form was used to calculate the understandability scores, provided as percentages, with higher scores reflecting greater ease of understanding.
Assessment of readability
The text from the Internet-based patient educational materials was copied and saved as separate Microsoft Word and plain-text documents for analysis. The readability of the English-language patient educational materials was measured using the electronic system available at http://www.readabilityformulas.com/free-readabilityformula-tests.php and was assessed using four validated indices: the Flesch Reading Ease Score, Flesch–Kincaid Grade Level, Gunning Frequency of Gobbledygook, and Simple Measure of Gobbledygook.[7,8] The Flesch Reading Ease Score is a simple measure of how easily a text can be read and is best suited to school texts; it has become the standard readability assessment used by many US government agencies, including the US Department of Defense. Flesch Reading Ease Scores range from 0 to 100, with lower values indicating greater difficulty (0–30, very difficult; 30–50, difficult; 50–60, fairly difficult; 60–70, standard; 70–80, fairly easy; 80–90, easy; and 90–100, very easy; Table 2).[6,7] The Flesch–Kincaid Grade Level is directly proportional to the mean number of words per sentence and the mean number of syllables per word. The Simple Measure of Gobbledygook is directly proportional to the total number of polysyllabic words and inversely proportional to the total number of sentences. The Gunning Frequency of Gobbledygook is directly proportional to the average number of words per sentence and the ratio of polysyllabic words to the total number of words.[19] For the grade-level indices, scores of 0–12 reflect a precollege grade level, scores of 13–16 correspond to college level, and scores >16 correspond to a graduate degree level.
These methods have been validated for the assessment of readability, and their use has been described in the literature.[6,20,21] In this study, average readability grades were calculated from the scores of the three grade-level tests (Flesch–Kincaid Grade Level, Gunning Frequency of Gobbledygook, and Simple Measure of Gobbledygook).
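For reference, the four indices reduce to simple formulas over word, sentence, syllable, and polysyllabic-word counts. The sketch below uses the standard published coefficients; the online calculator's exact tokenisation and syllable counting may differ, so the counts are treated here as inputs supplied by the caller, and the example passage is hypothetical:

```python
import math

def fres(words, sentences, syllables):
    # Flesch Reading Ease Score: higher = easier (0-100 scale).
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

def fkgl(words, sentences, syllables):
    # Flesch-Kincaid Grade Level: US school grade required.
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

def smog(polysyllables, sentences):
    # Simple Measure of Gobbledygook, normalised to a 30-sentence sample.
    return 1.043 * math.sqrt(polysyllables * 30 / sentences) + 3.1291

def gfog(words, sentences, polysyllables):
    # Gunning fog index ("GFOG" in the text): grade level from sentence
    # length and the share of polysyllabic (3+ syllable) words.
    return 0.4 * ((words / sentences) + 100 * (polysyllables / words))

# Hypothetical passage: 100 words, 10 sentences, 150 syllables,
# 10 polysyllabic words.
print(round(fres(100, 10, 150), 1))  # 69.8 -> "standard" band in Table 2
print(round(fkgl(100, 10, 150), 1))  # 6.0 -> about sixth grade
print(round(gfog(100, 10, 10), 1))   # 8.0
```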
Table 2. Flesch reading ease score interpretation
![](https://static.cambridge.org/binary/version/id/urn:cambridge.org:id:binary:20200311151939063-0719:S104795111900307X:S104795111900307X_tab2.png?pub-status=live)
FRES = Flesch Reading Ease Score.
Popularity and visibility analysis
The ALEXA traffic tool (https://www.alexa.com/) was used to assess domain popularity and visibility.[14] It ranks how often a website is visited relative to all other sites on the web over the past 3 months; a lower rank corresponds to a more frequently visited site.
Statistical analysis
Data obtained in this study were analysed using the Statistical Package for the Social Sciences software (ver. 22; SPSS, Chicago, IL, USA). Numerical variables are presented as means ± SDs or as medians and interquartile ranges. The normality of the data distribution was assessed using the Shapiro–Wilk test. The one-sample t test was used to compare the mean readability level of the English-language Internet-based patient educational materials with the recommended United States Department of Health and Human Services and National Institutes of Health standard (sixth-grade reading level). Numerical variables were compared using the Kruskal–Wallis test. Categorical variables are presented as numbers and percentages and were analysed using the chi-squared test and Fisher’s exact test. The Pearson correlation coefficient was used to analyse the relationship between readability and the Patient Education Materials Assessment Tool score. p values < 0.05 were considered significant.
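The benchmark comparison can be illustrated with a minimal sketch of the one-sample t test against the sixth-grade standard. This is not the SPSS procedure used in the study: it uses only the standard library, the p-value is a normal approximation to the t distribution (reasonable only for large samples such as the 86 websites analysed), and the grade-level data shown are hypothetical.

```python
import math
from statistics import mean, stdev

def one_sample_t(sample, mu0):
    """t statistic for H0: population mean equals mu0."""
    n = len(sample)
    return (mean(sample) - mu0) / (stdev(sample) / math.sqrt(n))

def approx_one_tailed_p(t):
    # Upper-tail p-value via the normal approximation; adequate
    # only when the sample is large enough for t to be near normal.
    return 0.5 * math.erfc(t / math.sqrt(2))

# Hypothetical grade-level scores for a handful of websites,
# tested against the sixth-grade benchmark (mu0 = 6).
grades = [10.2, 9.8, 11.5, 8.9, 10.7, 12.0, 9.4, 10.9]
t = one_sample_t(grades, 6.0)
print(round(t, 2), approx_one_tailed_p(t) < 0.05)
```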
Results
The Google search using the term “heart murmur” yielded 230 total websites, and Internet-based patient educational materials on 86 (37.4%) of these websites fulfilled the previously defined inclusion criteria. Twenty-four Internet-based patient educational materials originated from academic departments and professional societies and organisations, 37 originated from clinical practices and hospitals, and 25 were provided on miscellaneous health information websites.
Quality of information
The overall quality of each website was evaluated using the global quality score and the Health on the Net code instrument; the results are presented in Table 3.
Table 3. Readability, understandability, and quality results for all websites
![](https://static.cambridge.org/binary/version/id/urn:cambridge.org:id:binary:20200311151939063-0719:S104795111900307X:S104795111900307X_tab3.png?pub-status=live)
FKGL = Flesch–Kincaid Grade Level, FRES = Flesch Reading Ease Score, GFOG = Gunning Frequency of Gobbledygook, SMOG = Simple Measure of Gobbledygook, PEMAT = Patient Education Materials Assessment Tool.
Variables are presented as means ± SDs, medians (Q1–Q3), ranges, or frequencies (%).
The average global quality score was 4.34 (SD = 0.71; range from 3 to 5; Table 3), indicating that the quality and flow of information on most websites were good and that most of the relevant information was provided, but that some topics were not covered or were not useful for patients. Only 14 (16.3%) websites had Health on the Net certification (Table 3). The distribution of Health on the Net certification and global quality scores by group is shown in Table 4.
Table 4. Evaluation of patient educational materials classified by source
![](https://static.cambridge.org/binary/version/id/urn:cambridge.org:id:binary:20200311151939063-0719:S104795111900307X:S104795111900307X_tab4.png?pub-status=live)
FKGL = Flesch–Kincaid Grade Level, FRES = Flesch Reading Ease Score, GFOG = Gunning Frequency of Gobbledygook, PEMAT = Patient Education Materials Assessment Tool, SMOG = Simple Measure of Gobbledygook.
Variables are presented as medians (Q1–Q3), ranges, or frequencies (%). The Kruskal–Wallis and chi-squared tests were used.
Understandability: Patient Education Materials Assessment Tool results
The mean understandability score for all Internet-based patient educational materials combined was 74.6% (SD = 12.8%; range from 31.2 to 93.7%). The median understandability score for Internet-based patient educational materials from academic departments and professional societies and organisations was 80% (interquartile range, 70–83%), that for materials from clinical practices and hospitals was 75% (interquartile range, 67–83%), and that for materials from miscellaneous health information websites was 75% (interquartile range, 65–85%). For all three groups, the understandability level exceeded 70%. The understandability scores for the Internet-based patient educational materials are summarised in Tables 3 and 4 and Figure 1. No correlation between readability and understandability was detected (r = −0.05, p = 0.62; Fig 2).
![](https://static.cambridge.org/binary/version/id/urn:cambridge.org:id:binary:20200311151939063-0719:S104795111900307X:S104795111900307X_fig1.png?pub-status=live)
Figure 1. PEMAT score distribution for all websites.
![](https://static.cambridge.org/binary/version/id/urn:cambridge.org:id:binary:20200311151939063-0719:S104795111900307X:S104795111900307X_fig2.png?pub-status=live)
Figure 2. Correlation of readability grade levels with Patient Education Materials Assessment Tool scores.
Readability results
The overall mean readability scores for the evaluated websites, determined using the readability formulas, were significantly higher than the sixth-grade level. The average grade level for all web pages was 10.4 ± 1.65 (range from 7.5 to 14.1; Table 3). The average Flesch Reading Ease Score was 55 ± 9.1 (range from 32.4 to 72.9), reflecting a “fairly difficult” writing style (Table 2, Fig 3). The average Flesch–Kincaid Grade Level score was 10 ± 1.81 (range from 6.8 to 14.5). The average Gunning Frequency of Gobbledygook and Simple Measure of Gobbledygook scores were 12.1 ± 1.85 (range from 8.9 to 16.4) and 9.1 ± 1.38 (range from 6.7 to 12.2), respectively (Figs 4, 5 and 6). No significant difference in readability scores was detected among the subcategories. For all three grade-level indices, the readability of the materials was significantly higher than the recommended sixth-grade reading level (all p < 0.0001, single-sample one-tailed t test; Table 5).
![](https://static.cambridge.org/binary/version/id/urn:cambridge.org:id:binary:20200311151939063-0719:S104795111900307X:S104795111900307X_fig3.png?pub-status=live)
Figure 3. Comparison of readability scores for IPEMs related to HM calculated using the FRES scale.
![](https://static.cambridge.org/binary/version/id/urn:cambridge.org:id:binary:20200311151939063-0719:S104795111900307X:S104795111900307X_fig4.png?pub-status=live)
Figure 4. Comparison of readability scores for IPEMs related to HM calculated using the FKGL scale.
![](https://static.cambridge.org/binary/version/id/urn:cambridge.org:id:binary:20200311151939063-0719:S104795111900307X:S104795111900307X_fig5.png?pub-status=live)
Figure 5. Comparison of readability scores for IPEMs related to HM calculated using the GFOG scale.
![](https://static.cambridge.org/binary/version/id/urn:cambridge.org:id:binary:20200311151939063-0719:S104795111900307X:S104795111900307X_fig6.png?pub-status=live)
Figure 6. Comparison of readability scores for IPEMs related to HM calculated using the SMOG scale.
Table 5. Readability scores for English-language web-based patient educational materials
![](https://static.cambridge.org/binary/version/id/urn:cambridge.org:id:binary:20200311151939063-0719:S104795111900307X:S104795111900307X_tab5.png?pub-status=live)
FKGL = Flesch–Kincaid Grade Level, GFOG = Gunning Frequency of Gobbledygook, SMOG = Simple Measure of Gobbledygook.
* Scores represent a grade level (e.g., 12 = 12th grade, 13 = first year of college). For comparison, readability scores were also calculated for this article. The one-sample t test was used.
Discussion
If written Internet-based patient educational materials on heart murmur are to be helpful, people must be able to read them and understand the information provided. However, the most important finding of our study was that a 10.4 grade reading level is required to read most currently available patient educational materials related to heart murmur that are disseminated via easily searched websites; this is well above the recommended sixth-grade reading level. The United States Department of Health and Human Services categorises patient educational materials as “easy to read” if they are written at a sixth-grade or lower reading level, and the average adult in the United States reads at a seventh-grade level. By this standard, material at the seventh- to ninth-grade reading levels is considered of “average difficulty” and material above the ninth-grade reading level is considered “difficult”.[22,23] Our study has shown that currently available Internet-based patient educational materials on heart murmur are written well above the recommended reading level.
Patients increasingly tend to use the Internet as a source of information.[24] The readability of a text represents the reading comprehension level a person must have to understand it and is an important determinant of a person’s ability to comprehend health information.[25] Families use the Internet to allay their anxiety about health situations or to increase their understanding of health issues. To communicate information effectively, patient educational materials must be presented at a level that is comprehensible to the target audience. National organisations encourage the composition of Internet-based patient educational materials at the fifth- to sixth-grade level.[26] Numerous studies in the medical field have shown that Internet-based patient educational materials do not comply with national readability recommendations.[6–8,16,19] To our knowledge, this study is the first to evaluate readability in the field of paediatric cardiology. Readability is a measure of how easily a text can be read and understood. It can be improved by various methods, including limiting sentence length to 8–10 words; replacing long, polysyllabic words with shorter, more commonly used synonyms; and replacing medical jargon with simpler lay terms when possible.[19] However, no gold standard has been established for the determination of readability. Our study showed that much of the heart murmur information provided on websites is too difficult to read for a large proportion of the population.
The Patient Education Materials Assessment Tool is a relatively new instrument for the systematic assessment of the understandability of Internet-based patient educational materials based on a variety of parameters, including clarity of purpose, simplicity of wording, organisation, and use of visual aids. Compared with readability formulas based on sentence structure and word length, it may enable a more critical evaluation of the quality of Internet-based patient educational materials.[27] The designers of the Patient Education Materials Assessment Tool defined scores ≥70% as indicative of understandability.[9]
In our study, the mean Patient Education Materials Assessment Tool score for the articles analysed was 74.6% (range from 31.2 to 93.7%), thus falling above, but close to, this threshold. Doruk et al[24] reported an average overall understandability score of 59% for online materials on vocal fold nodules, and Balakrishnan et al[27] obtained similar findings for online vocal fold paralysis materials (average score, 53%).
In our study, although the understandability of the articles was good, the readability level was above the recommended sixth-grade reading level, suggesting that the materials were difficult to read. The Patient Education Materials Assessment Tool includes important items for the evaluation of information quality, but the subjective nature of the assessment is a major limitation. One item pertains to the use of common everyday language in the article of interest; such assessment is subjective and may be biased by the reader’s education level, native language, and medical background. We attempted to overcome this limitation by having individuals with different levels of education and medical backgrounds rate each article. Another limitation of the Patient Education Materials Assessment Tool is that it does not address the accuracy or completeness of the material, and these aspects were not evaluated in this study. The mean global quality score in this study was 4.34 (range from 3 to 5), indicating that the quality of information on most websites was good to excellent. Previous studies have yielded lower scores; Schreuders et al[11] reported a median global quality grade of 3 for websites on colorectal cancer. Thus, the websites on heart murmur in our study scored higher than those on colorectal cancer, which may indicate that heart murmur articles are better in terms of content.
In another study, information on 13 websites was of moderate quality, and only 3 websites had high global quality grades.[16] Only 14 websites in our sample were found to have Health on the Net certification, with no difference among groups. ALEXA scores in our study ranged from 330 to 52,000,000, indicative of heterogeneity in the visibility and popularity of website domains providing information on heart murmur.
Individuals can visit websites published by professional organisations free of charge and obtain high-quality information about their illnesses, which further encourages them to turn to the Internet for information. However, incorrect, inadequate, or difficult-to-understand information has the potential to impair parents’ ability to make appropriate health-related decisions for their children. This raises concern, given that Berland et al[28] noted that patients using the web for medical information may have difficulty “finding complete and accurate information” and suggested that deficiencies in online health information may “negatively influence” patients’ decisions. Information that is of poor quality or difficult to understand may also lead to unnecessary use of emergency services and clinics in the absence of serious illness; the development of comprehensive, focused, concise, and easy-to-understand patient educational materials could therefore also benefit the national economy. Improving the readability of these materials should be a common goal, but we must be careful not to oversimplify information: the emphasis should be on striving for diversity rather than simplification, given that oversimplification could inadvertently penalise patients with a high reading literacy level who could derive great benefit from very detailed patient educational materials.
Limitations of this study include the computer-based analysis of readability, which may be controversial because it has been shown to overestimate the difficulty level of materials.[24] Many readability indices have been validated, and no consensus on the best index for the assessment of patient educational materials has been reached. Each readability index uses a different formula, and scores obtained with different indices may vary substantially. However, in our study, all of the indices used indicated that the mean readability of the patient educational materials was above the sixth-grade reading level.
Another limitation is the use of the Health on the Net code accreditation tool to evaluate website quality, because the Health on the Net code is not designed to assess the accuracy of the information provided by a website; it applies to a website’s editorial processes and transparency, based on the eight principles outlined in the methods section. However, we also used the global quality score to assess the quality of information provided on the websites, and we regarded Health on the Net code accreditation as a useful research tool for evaluating the quality of Internet-based patient educational materials.
In addition, only English-language website materials were reviewed in this study; thus, our findings will be less useful for families with limited English proficiency. Furthermore, we searched only for the term “heart murmur” and included websites providing information about heart murmurs in infants and children; we did not search using terms such as “baby heart murmur” or “child heart murmur”, so some relevant sites may have been missed. This is another limitation of our study.
Conclusion
We found that websites on heart murmur were understandable. However, the Internet-based patient educational materials were written above the recommended sixth-grade reading level. Optimisation of the most visible websites, particularly improvement of the readability of the information they provide, is desirable. We recommend further analysis and improvement of the readability, content, quality, popularity, and visibility of web-based English-language patient educational materials on heart murmur to help patients become more knowledgeable and to reduce their anxiety. Future studies should examine patient understanding of web-based materials and evaluate whether the provision of more readable patient educational materials leads to better comprehension.
Acknowledgements
None.
Financial Support
This research received no specific grant from any funding agency, commercial, or not-for-profit sectors.
Conflict of Interest
None.
Ethical Standards
This article does not contain any studies with human patients or animals performed by any of the authors.
The Institutional Review Board of the Education Planning Board of the University of Health Sciences Konya Training and Research Hospital exempted this study protocol from review because it was “non-human subject research” (no. 02 May 2019/25–07).