
SYSTEMATIC REVIEWS OF ECONOMIC EVALUATIONS: HOW EXTENSIVE ARE THEIR SEARCHES?

Published online by Cambridge University Press:  27 March 2017

Hannah Wood
Affiliation: York Health Economics Consortium Ltd, Enterprise House, Innovation Way, University of York (hannah.wood@york.ac.uk)
Mick Arber
Affiliation: York Health Economics Consortium Ltd, Enterprise House, Innovation Way, University of York
Julie M. Glanville
Affiliation: York Health Economics Consortium Ltd, Enterprise House, Innovation Way, University of York

Abstract

Objectives: Economic evaluation (EE) is an accepted element of decision making and priority setting in healthcare. As the number of published EEs grows, so does the number of systematic reviews (SRs) of EEs. Although search methodology makes an important contribution to SR quality, search methods in reviews of EEs have not been evaluated in detail. We investigated the resources used to identify studies in recent, published SRs of EEs, and assessed whether the resources reflected recommendations.

Methods: We searched MEDLINE for SRs of EEs published since January 2013 and extracted the following from eligible reviews: databases searched, health technology assessment (HTA) sources searched, supplementary search techniques used. Results were compared against the minimum search resources recommended by National Institute for Health and Care Excellence (NICE) (MEDLINE, Embase, NHS EED, EconLit) for economic evidence for single technology appraisals, and resource types suggested in the summary of current best evidence from SuRe Info (economic databases, general databases, HTA databases, HTA agency Web pages, gray literature).

Results: Sixty-five SRs met the inclusion criteria; data were extracted from forty-two. Five reviews (12 percent) met or exceeded the NICE recommended resources. Nine reviews (21 percent) searched at least four of the five types of resource recommended by SuRe Info. Five reviews (12 percent) searched all five. Twenty-three reviews (55 percent) did not meet the NICE recommendations or four of five of the SuRe Info recommended resource types. Search reporting was frequently unclear or incorrect.

Conclusions: Searches conducted for the majority of recently published SRs of EEs do not meet the recommendations of two published approaches.

Type: Assessments
Copyright © Cambridge University Press 2017

The economic evaluation of healthcare interventions is now an accepted element of healthcare decision making and priority setting in many countries and has a key role in the work carried out by national HTA agencies. In this context, the rate of publication of economic evaluations continues to grow rapidly. A search of the NHS Economic Evaluation Database (NHS EED) identified 867 economic evaluations with a publication date of 2007, and 1,552 with a publication date of 2012; an increase of 79 percent over 5 years. This increase in the volume of published economic evaluations is reflected in an increasing number of systematic reviews synthesizing this type of evidence. A targeted search of PubMed for systematic reviews of economic evidence retrieved thirty-four records with a publication date of 2007 and seventy-one records with a publication date of 2012; an increase of 108 percent over 5 years. This does not include the many reviews undertaken and published as part of technology assessments.
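As a point of reference, the first of these growth figures follows from a simple relative-change calculation on the record counts reported above (a worked example only, not additional data):

\[
\frac{1{,}552 - 867}{867} \approx 0.79 \quad \text{(an increase of 79 percent)}
\]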

Although a significant amount of research has been published on search methodology and search reporting in systematic reviews, much of this research has focused on systematic reviews of clinical effectiveness or diagnostic test accuracy studies. We have identified little published research assessing the quality of searches carried out as part of systematic reviews of economic evaluations. The most current research into search methodology in this context is summarized by SuRe Info, a Web resource providing assessments of current research-based information relating to the information retrieval aspects of producing systematic reviews and health technology assessments (1).

A study evaluating the quality of systematic reviews of economic evaluations was carried out by Jefferson, Demicheli, and Vale (2), but this research did not focus specifically on the assessment of search methodology. Moreover, the authors’ conclusion that “more attention needs to be paid to search methods” was based on an analysis of reviews of economic evaluations published between 1990 and 2001. It is possible that subsequent research impacting on information retrieval in health economics may have changed practices. This research includes a small number of studies on the utility of various search resources for identifying economic evidence (3-5), guidance from health technology assessment agencies such as the National Institute for Health and Care Excellence (NICE) on the resources required to identify economic evidence for HTA (6;7), and evidence-based recommendations from information professionals such as those in SuRe Info (1).

The objectives of this study were to identify which resources were used to identify studies in recent, published systematic reviews of economic evaluations, and to investigate whether the resources searched reflect recommendations for the conduct of such reviews.

METHODS

Search Strategy

A search to identify a sample of recent systematic reviews of economic evaluations (published since January 2013) was undertaken in January 2014 using MEDLINE and MEDLINE In-Process by means of Ovid SP. The strategy combined the Centre for Reviews and Dissemination's search filter (8) to identify economic studies for NHS EED with a focused search to identify studies described as systematic reviews (see Supplementary Figure 1). The strategy was designed to retrieve records that explicitly referred to a systematic review of economic evaluations in the title and/or abstract. A rapid analysis carried out by the research team of a sample of 1,300 MEDLINE records published in 2013 suggested that the majority of systematic reviews are described as such at title and abstract level. The strategy was limited to studies published from 2013 to date, and irrelevant publication types such as animal studies, news stories, letters and editorials were removed.

Reviews were eligible for inclusion in this study if they met the following criteria: (i) the review methods were explicitly described as systematic by the review authors in the title or abstract, or the review was published in the Cochrane Database of Systematic Reviews (approaches used by researchers who described their review methods as systematic were of interest, whether or not they followed standard systematic review methodology in practice); (ii) the title and/or abstract of the systematic review was judged to clearly and explicitly indicate that its objectives were to review economic evaluations of healthcare interventions (reviews of both clinical and cost effectiveness were excluded); (iii) the full text of the systematic review could be accessed either free of charge or by means of our local subscriptions; (iv) reviews were published in full; protocols were not included; (v) reviews were published in the English language.

Results were screened by one researcher (H.W. and M.A. each screened 50 percent of results). Any disagreement in the final selection of the papers was resolved by discussion and/or the input of a third reviewer (J.M.G.).

Data Extraction

For each review, data were collected on: (i) the general medical literature databases searched (such as MEDLINE or Embase); (ii) specialist economic databases searched (such as NHS EED or the Health Economic Evaluations Database [HEED]); (iii) health technology assessment sources searched (such as the HTA database); (iv) additional sources and supplementary search techniques used (e.g., checking of reference lists, expert contact, searches for conference abstracts).

For the purpose of analyzing results, we assumed that, where authors stated that they searched the Cochrane Library, the Centre for Reviews and Dissemination databases, or Evidence-Based Medicine Reviews (EBMR) but gave no further details, these searches included all of the individual databases which form part of these collections. The Cochrane Library includes the Cochrane Database of Systematic Reviews (CDSR), the Cochrane Central Register of Controlled Trials (CENTRAL), the Cochrane Methodology Register (CMR), the Database of Abstracts of Reviews of Effects (DARE), the Health Technology Assessment (HTA) database, and NHS EED. The Centre for Reviews and Dissemination databases include DARE, the HTA database, and NHS EED. EBMR includes CDSR, DARE, the HTA database, NHS EED, the ACP Journal Club, CENTRAL, and the CMR.
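This expansion rule can be expressed compactly. The following is a minimal sketch of that assumption only: the mapping restates the platform contents listed above, and the function and variable names are our own illustration, not the authors' extraction code.

```python
# Sketch of the analysis assumption described above: where a review reported
# searching a platform (e.g., "The Cochrane Library") without naming the
# individual databases, it was credited with all databases in that collection.
# The mapping restates the platform contents given in the text; the function
# and names are illustrative only.

PLATFORM_CONTENTS = {
    "Cochrane Library": {"CDSR", "CENTRAL", "CMR", "DARE", "HTA database", "NHS EED"},
    "CRD databases": {"DARE", "HTA database", "NHS EED"},
    "EBMR": {"CDSR", "DARE", "HTA database", "NHS EED",
             "ACP Journal Club", "CENTRAL", "CMR"},
}


def expand_reported_sources(reported):
    """Replace any bare platform name with the databases it contains."""
    expanded = set()
    for source in reported:
        expanded |= PLATFORM_CONTENTS.get(source, {source})
    return expanded


# Example: a review reporting only "MEDLINE" and "Cochrane Library" is
# credited with NHS EED and the HTA database as well.
print(sorted(expand_reported_sources({"MEDLINE", "Cochrane Library"})))
```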

Comparison with Current Recommendations

Results were compared against two guides. First, the search resources required as a minimum by NICE in 2014 when searching for published economic evidence for single technology appraisals (STAs) (6) (MEDLINE, Embase, NHS EED, EconLit). Second, the resource types recommended in the ‘Costs and economic evaluation’ chapter of the Web resource Summarized Research in Information Retrieval for HTA (SuRe Info) (1) when searching for economic evaluations (specialist economic databases, general databases, HTA databases, Web pages of HTA agencies, gray literature).

NICE standards were chosen as an example of reimbursement agency recommendations for specific resources to search for economic evaluations. We note that since our research was completed, NICE have changed their specification for submission of evidence for STAs, and no longer state which specific resources should be searched. However, we suggest that the minimum resources they recommend still constitute an appropriate example of reimbursement agency recommendations and they are still specified in other NICE guidance, for example, in the NICE Medical Technologies Evaluation Program submission template (7). The SuRe Info recommendations were chosen as an example of recommendations from information professionals based on a summary of the published research evidence on which resources should be searched for health technology assessment.
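To make the two comparisons concrete, the check can be sketched as follows. This is our own illustration, not the authors' code: the NICE minimum set and the five SuRe Info resource types are taken from the text, but the example databases assigned to each type are assumptions for demonstration only.

```python
# Sketch of the two comparisons described above (illustrative names only).
# NICE minimum resources for single technology appraisals, as stated in the text.
NICE_MINIMUM = {"MEDLINE", "Embase", "NHS EED", "EconLit"}

# The five SuRe Info resource types; the example members assigned to each type
# are assumptions for demonstration, not a published mapping.
SURE_INFO_TYPES = {
    "specialist economic databases": {"NHS EED", "HEED", "EconLit"},
    "general databases": {"MEDLINE", "Embase", "CINAHL", "PsycInfo"},
    "HTA databases": {"HTA database"},
    "HTA agency web pages": {"NICE website", "CADTH website"},
    "gray literature": {"conference abstracts", "internet searching"},
}


def meets_nice_minimum(searched):
    """True if the review searched at least the four NICE minimum resources."""
    return NICE_MINIMUM <= set(searched)


def sure_info_types_covered(searched):
    """Number of the five SuRe Info resource types with at least one source searched."""
    searched = set(searched)
    return sum(1 for members in SURE_INFO_TYPES.values() if searched & members)


# Example: MEDLINE + Embase + NHS EED misses the NICE minimum (no EconLit)
# and covers two of the five SuRe Info resource types.
example = {"MEDLINE", "Embase", "NHS EED"}
print(meets_nice_minimum(example), sure_info_types_covered(example))
```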

RESULTS

A total of 1,743 MEDLINE records were screened. Based on an assessment of titles and abstracts, sixty-five reviews met the inclusion criteria. Twenty-three reviews were not available to us (either free of charge or by means of our local subscriptions), leaving a sample of forty-two reviews for analysis. All of the reviews gave at least some details of the search resources used.

A summary of the resources searched by the reviews is given in Table 1.

Table 1. Summary of resources searched by reviews

General Databases Searched

MEDLINE was the most frequently searched general bibliographic database (40/42; 95 percent), followed by Embase (25/42; 60 percent). Seventy-one percent of reviews (30/42) included a search of general bibliographic databases beyond MEDLINE and Embase. These databases included resources which reflected the research question (e.g., CINAHL, PsycInfo, AMED), and resources which reflected the geographical context of the research team or the research question (e.g., Indice Medico Espanol, German Medical Science database, African Index Medicus).

Specialist Economic Databases Searched

NHS EED was the most frequently searched specialist economic database (33/42; 79 percent). The number of reviews searching other economic databases was low. In addition to HEED (4/42; 10 percent) and EconLit (7/42; 17 percent), resources searched included EUROHEED, the Cost Effectiveness Analysis (CEA) Registry and Business Source Complete.

Technology Assessment Sources Searched

The HTA database was the most frequently searched technology assessment resource (23/42; 55 percent). Other technology assessment resources included country-specific databases (e.g., German Agency of Health Technology Assessment database), Web sites of agencies producing health technology assessments and Web sites of networks of HTA agencies (e.g., International Network of Agencies for Health Technology Assessment).

Supplementary Search Techniques Used

Sixty-nine percent of reviews used supplementary search techniques (29/42). Reference-list checking was the most frequently used supplementary search technique (55 percent). Other techniques used included expert contact, searching for conference abstracts, hand-searches of journals, and Internet searching.

Comparison with Current Recommendations

Five reviews (12 percent) met or exceeded the search resources recommended by NICE (MEDLINE, Embase, NHS EED, EconLit) (Table 2). Failure to search EconLit was the primary reason that a review failed to meet the minimum resources recommended by NICE; only 17 percent of the reviews included a search of this database. Nine reviews (21 percent) searched at least four of the five types of resource recommended by SuRe Info (specialist economic databases, general databases, HTA databases, Web pages of HTA agencies, gray literature). Five reviews (12 percent) searched all five resource types (Table 2).

Table 2. Recommended / required search resources

DISCUSSION

An analysis of forty-two systematic reviews of economic evaluations found that the majority did not search a wide range of resources, which suggests that their identification of studies may not have been optimal.

Limitations of this Study

This study used a pragmatic sample of published systematic reviews for analysis, which may limit its conclusions; it is possible that relevant systematic reviews were not identified by the search. Because we only included studies where the full text could be accessed either free of charge or by means of our local subscriptions, we may also have introduced bias into the study. However, we do have access to a wide range of journals and we suggest that the sample is representative of economic reviews, if not exhaustive, as it includes reviews on a wide range of health topics including surgical, pharmacological, diagnostic, public health, and complex health interventions. The journals in which the reviews were published are listed in Supplementary Figure 2; they included general journals, health economics and health technology assessment journals, and journals specific to a particular healthcare topic.

We relied on the authors’ statement that they were conducting a systematic review to judge relevance to this study. It may be the case that the authors were mislabeling their studies and their reviews were intended to be pragmatic from a search perspective rather than extensive. If that were the case, those reviews would not be expected to have conducted the wide-ranging searches expected of a systematic review. We only included reviews published in English, which may have introduced bias into this study, because we have no evidence of the quality of the searches in reviews in languages other than English.

Limitations of the Included Studies

Determining which resources were searched by the included reviews was difficult due to a lack of clarity in the reporting of search methodology. Reviews frequently contained errors, or lacked clarity, in the names of databases and interfaces. It was particularly difficult to ascertain which sections of larger search platforms, such as the Cochrane Library and Web of Knowledge, had been included in the searches. This forced us to make assumptions; most frequently that a review reporting a search of “The Cochrane Library” searched all of the databases contained within this resource. This assumption may have led us to over-estimate the number of reviewers who included NHS EED and the HTA database (individual databases of The Cochrane Library) in their search for evidence. Poor reporting may also mean that authors actually carried out more evidence identification than their published methods indicate. Poor reporting of search resources used in published systematic reviews is not a problem specific to reviews of economic evidence; Golder, Loke, and Zorzela (9) identified the same issue in a recent evaluation of search methodology in systematic reviews of adverse effects.

Although it is acknowledged that journal editorial policies and word limits can make it difficult to fully describe the search process in published reviews, authors should clearly describe the named databases, rather than just the platform, in order for this aspect of their methodology to be accurately evaluated. Authors and journal editors need to be aware of, and comply with, reporting standards for systematic reviews, for example those outlined in the Methodological Expectations of Cochrane Intervention Reviews (MECIR) project (10) and the PRISMA checklist (11).

Which Resources Were Searched?

The most commonly used resources/techniques (in order) were MEDLINE, NHS EED, checking reference lists, Embase, and the HTA database. Searching of NHS EED and the HTA database was most frequently suggested by a reported search of the Cochrane Library, rather than these databases being named individually. The resources appear to reflect those most commonly used in other types of systematic review, such as reviews of clinical effectiveness and adverse effects (9). This perhaps suggests that reviewers are not adapting their search approaches for economic evidence beyond the resources included in the Cochrane Library: the number of reviews of economic evaluations which searched specialized economic or HTA resources was very low.

Funding to produce NHS EED ceased at the end of March 2015, with no bibliographic records being added after March 31, 2015 (12). The closed version of NHS EED can still be searched, but this change has significant implications for reviews of economic evidence, given that its use acts as a check on the completeness of searches and can lessen the impact of sub-optimal searches of general bibliographic databases. The closest similar resource to NHS EED was HEED, a resource only available by means of paid subscription. HEED also ceased production in 2014; however, unlike NHS EED, the closed database is no longer available or searchable (13). With the closure (to new records) of the key free economic evaluation database (NHS EED) and the subscription-based alternative (HEED), reviewers will need to place greater emphasis on identification of economic evaluations from major bibliographic databases such as MEDLINE and Embase.

The quality of the searches undertaken in these resources will become more important as authors will no longer be able to rely on databases such as NHS EED to identify economic evaluations missed by methodologically weak searches in other resources. The adverse impact of sub-optimal strategies will, therefore, be increased. Reviewers should be aware that searches of general bibliographic databases for new research will need to cover a range of databases and approaches to reflect the processes for producing NHS EED and HEED.

The increased reliance on general databases rather than subject-specific resources will also increase the importance of validated search filters to identify economic evaluations whilst maintaining an appropriate balance of sensitivity and precision. The InterTASC Information Specialists' Sub-Group Search Filter Resource may be a useful tool for reviewers as it provides access to published and unpublished search filters designed to retrieve research by study design or focus, in addition to information on the use and appraisal of search filters (14). Reviewers can keep up to date with developments in search sources and search approaches through online resources such as SuRe Info (1), search-related mailing lists such as the EXPERTSEARCHING list at pss.mlanet.org, health economics current awareness services such as that provided by healtheconomics.com, and by attending training events.

Authors of systematic reviews of economic evaluations should be aware that the search resources recommended by health technology assessment agencies, such as NICE, and groups which synthesize information retrieval evidence, such as SuRe Info, are likely to change to reflect the closure of NHS EED and HEED. Reviewers should ensure that they keep up-to-date with any such changes in the recommendations to ensure that their searches meet current best practice.

This research suggests that the information resources used to identify evidence for the majority of recently published systematic reviews of economic evaluations are not as extensive as the approaches suggested by HTA organizations, such as NICE. Nor do they seem to reflect the evidence on economic searches as summarized in the SuRe Info summaries.

In relation to the NICE guidance, a reasonably large percentage of reviews did not search Embase (40 percent) or NHS EED (21 percent), and a large majority did not search EconLit (83 percent). EconLit is a subscription-based resource; this may be one reason for the low number of reviews which included it as a search source. In addition, EconLit is a general economics database with no particular focus on healthcare literature, containing a low proportion of healthcare-related records. This may also be a reason why it was not included as a search source. We are not aware of any published evidence which indicates that EconLit is a key source of economic evaluations (if MEDLINE, Embase, and NHS EED are also being searched); the omission of EconLit from searches may, therefore, not have a significant impact on overall search sensitivity. The omission of Embase and NHS EED, however, would seem to indicate that the search methods for these reviews were not designed to be extensive, increasing the risk of publication bias and of missing relevant evidence (15).

The SuRe Info summary includes a wider range of resources than the NICE guidance, covering resources for unpublished as well as published literature. Examples of unpublished literature in the context of economic evaluations may include assessment reports from HTA agencies, manufacturers’ submissions to HTA agencies, studies from stakeholders and prepublication working papers from academic departments. It was noticeable that relatively few of the reviews searched those resources which potentially include unpublished economic evaluations or economic evaluations published outside the journal literature, for example, Technology Assessment databases, Web sites of HTA agencies, and gray literature.

Published research (4) on the utility of various search resources for identifying economic evidence indicates that a small range of core databases (MEDLINE, Embase, NHS EED) is sufficient to identify the majority of relevant published economic evaluations and that the incremental yield from additional resources is small. However, this research also indicates that a relatively high proportion of the economic evidence in reviews (much higher than in equivalent reviews of clinical evidence) will be found in unpublished studies (4). It is, therefore, important that reviewers consider sources for economic evaluations that are unpublished or published outside the journal literature.

There is little research evidence on the most efficient and useful ways of identifying such material, despite the fact that this element of the search is often the most resource intensive. The approaches used by the reviews included in this study were very variable, perhaps reflecting the lack of conclusive evidence on the value of various techniques and resources to identify this material. Further research into the unique yield of sources to identify unpublished economic evidence would assist reviewers to make evidence-based decisions on the most appropriate search resources to use and ensure that recommendations on the use of search resources are evidence based.

Reviewers should also consider publication language when searching for evidence. If the systematic review has a region-specific focus on a non-English-speaking setting (or is informing a region-specific economic model or cost-effectiveness estimate), efforts should be made to search beyond the English-language literature.

While this study has focused on the choice of resources used to identify studies in recent, published systematic reviews of economic evaluations, it is important to highlight that the effectiveness of search methods depends on both the choice of resources and the quality of the actual search strategies used. We are assessing the quality of the search strategies used in these reviews in another paper.

CONCLUSION

The information resources used to identify evidence for the majority of recently published systematic reviews of economic evaluations do not conform to the two published approaches we could identify. With the closure of two key economic evaluation databases, researchers conducting systematic reviews of economic evaluations should reassess the resources they search to ensure extensive searches congruent with the systematic review approach.

SUPPLEMENTARY MATERIAL

Supplementary Figure 1: https://doi.org/10.1017/S0266462316000660

Supplementary Figure 2: https://doi.org/10.1017/S0266462316000660

CONFLICTS OF INTEREST

The authors are part of the project team for the SuRe Info resource (provided by HTAi) mentioned in the manuscript. Their organization, YHEC, receives a very small payment to reimburse some of the time spent on the maintenance and development of SuRe Info.

REFERENCES

1. Kaunelis D, Glanville J. Summarized Research in Information Retrieval for HTA (SuRe Info): Costs and economic evaluation [webpage]. HTAi Vortal; 2014 [updated March 31, 2015]. Available from: http://vortal.htai.org/?q=node/336 (accessed March 1, 2016).
2. Jefferson T, Demicheli V, Vale L. Quality of systematic reviews of economic evaluations in health care. JAMA. 2002;287:2809-2812.
3. Alton V, Eckerlund I, Norlund A. Health economic evaluations: How to find them. Int J Technol Assess Health Care. 2006;22:512-517.
4. Royle P, Waugh N. Literature searching for clinical and cost-effectiveness studies used in health technology assessment reports carried out for the National Institute for Clinical Excellence appraisal system. Health Technol Assess. 2003;7:iii, ix-x, 1-51.
5. Sassi F, Archard L, McDaid D. Searching literature databases for health care economic evaluations: How systematic can we afford to be? Med Care. 2002;40:387-394.
6. National Institute for Health and Care Excellence. Single technology appraisal (STA). Specification for manufacturer/sponsor submission of evidence. London: NICE; 2012.
7. National Institute for Health and Care Excellence. Medical technologies evaluation programme. Sponsor's submission template. London: NICE; 2013.
8. University of York Centre for Reviews and Dissemination (CRD). Search strategies [webpage]. York: CRD; 2015. Available from: http://www.crd.york.ac.uk/crdweb/searchstrategies.asp (accessed August 10, 2015).
9. Golder S, Loke YK, Zorzela L. Comparison of search strategies in systematic reviews of adverse effects to other systematic reviews. Health Info Libr J. 2014;31:92-105.
10. Cochrane Editorial Unit. Methodological Expectations of Cochrane Intervention Reviews (MECIR) [updated March 31, 2015]. Available from: http://editorial-unit.cochrane.org/mecir (accessed August 11, 2015).
11. Moher D, Liberati A, Tetzlaff J, Altman DG, The PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Med. 2009;6:e1000097.
12. University of York Centre for Reviews and Dissemination (CRD). Changes to DARE [webpage]. York: CRD; 2015. Available from: http://www.crd.york.ac.uk/crdweb/newspage.asp#changesdare (accessed August 11, 2015).
13. Health Economic Evaluations Database (HEED) [Internet]. Available from: http://onlinelibrary.wiley.com/book/10.1002/9780470510933 (accessed August 11, 2015).
14. Glanville J, Lefebvre C, Wright K. The InterTASC Information Specialists' Sub-Group Search Filter Resource [webpage]. York and Oxford: InterTASC Information Specialists' Sub-Group; 2016. Available from: https://sites.google.com/a/york.ac.uk/issg-search-filters-resource/home (accessed March 1, 2016).
15. Chandler J, Churchill R, Higgins J, Lasserson T, Tovey D. Methodological standards for the conduct of new Cochrane Intervention Reviews. Methodological Expectations of Cochrane Intervention Reviews (MECIR). Version 2.3. Cochrane Editorial Unit; 2013. Available from: http://sti.cochrane.org/sites/sti.cochrane.org/files/public/uploads/Methodological%20standards%20for%20the%20conduct%20of%20Cochrane%20Intervention%20Reviews.PDF (accessed January 1, 2017).