
Identifying economic evaluations for health technology assessment

Published online by Cambridge University Press:  13 October 2010

Julie Glanville
Affiliation:
York Health Economics Consortium
Suzy Paisley
Affiliation:
University of Sheffield

Abstract

Objectives and Methods: Health technology assessment (HTA) often requires the identification and review of economic evaluations and models. This study surveys the specific and general resources available for identifying economic evaluations. It also provides information on searching those resources efficiently and comments on the current evidence base.

Results: Published checklists recommend searching for economic evaluations first in specific information resources that collect them, such as NHS EED and HEED, followed by top-up searches of large biomedical bibliographic databases such as MEDLINE and EMBASE. Other resources, such as the HTA and DARE databases, can also yield reports of economic evaluations. Searches within NHS EED and HEED can be made more efficient by using database-specific search options. Searches within large biomedical databases such as MEDLINE and EMBASE require the use of collections of economics search terms known as search filters. Search filters are highly sensitive, retrieving most economic evaluations, but suffer from low precision, returning many irrelevant records that need to be assessed.

Conclusions: It is relatively easy to identify a high proportion of economic evaluations rapidly, but more research is required to improve the efficiency of this process. There are key high-yield resources to search, but more evidence is required on their overlap and unique contributions to searches. The value of other resources, particularly those providing access to gray literature, should be explored. Research into efficient retrieval requires clear definitions of economic evaluations to allow comparison across studies.

THEME SECTION: INFORMATION RETRIEVAL FOR HTA
Copyright © Cambridge University Press 2010

Health technology assessment (HTA) often requires the identification and review of economic evaluations and models (3;14). Searching for economic evaluations, such as cost-effectiveness and cost-utility studies, involves identifying relevant resources to search and compiling search terms to capture relevant records in those resources. There are several well-established resources that provide access to economic evaluations, and there is increasing evidence on efficient ways to search for economic evidence. This study surveys current resources and search approaches.

WHAT RESOURCES ARE AVAILABLE?

A review of economic evaluations may involve searching a range of information resources such as economic evaluation databases, major biomedical databases, health technology assessment Web sites, health economic resources, and organizational Web sites (3;5;11;14;19). The search may include efforts to identify research published outside journals, such as working papers or reports (gray literature) (2;11;14).

Economic Evaluation Resources

HTA guidance documents list resources that are likely to be high yielding and that may constitute minimum search requirements (3;5;11;14). Guidance and checklists make recommendations and are pragmatic guides. As such, they are likely, as in much of research evidence retrieval, to be only partly evidence-based, because there is limited evidence on the yield and overlap of resources. Most published checklists recommend that searches for economic evaluations should begin with the specific information resources that collect economic evaluations, such as NHS EED and HEED (Table 1), followed by top-up searches of large biomedical bibliographic databases, such as MEDLINE and EMBASE. This approach is suggested because it is likely to maximize the efficiency of searching. Focused resources, such as NHS EED, reflect the expenditure of considerable resources, with experts identifying and categorizing economic evaluations from the large volumes of records yielded by searches for economic studies in MEDLINE and EMBASE (9;15). Published guidance recognizes possible delays before records are identified and added to resources such as NHS EED and HEED. To compensate for these delays, limited searches of recent years of records in larger biomedical databases (MEDLINE and/or EMBASE) are suggested.

Table 1. Selected Databases Collecting Economic Evaluations, Technology Assessments, and Systematic Reviews

The largest international collections of economic evaluations are the Health Economic Evaluations Database (HEED) and the NHS Economic Evaluation Database (NHS EED) (9;15). There are also subject-specific collections, such as the Paediatric Economic Database Evaluation (PEDE), and country-specific databases, such as COnnaissances et Décision en EConomie de la Santé (CODECS) (Table 1) (6;16).

Many economic evaluation resources are value-added, offering more than bibliographic details and author abstracts. NHS EED and CODECS, for example, offer a detailed summary and critical appraisal of economic evaluations (6;15). HEED offers abstracts and extensive categorizations that provide further access points into the records through added ICD-9 codes, drug names, and ATC codes (9). The CEA Registry extracts and presents tabulated information on utility weights and cost-effectiveness ratios (7). A range of other specialist resources is available, including journals that provide critical appraisals of economic evaluations; these can be identified from lists such as those provided by Healtheconomics.com and HTAi.org.

How Far Do These Resources Overlap?

These resources overlap in their coverage of published evaluations, and there is little published information on the scale of the overlap or the currency of the resources. Although Sassi and colleagues, searching in 1997, suggested that MEDLINE could be relied on as the key source for identifying published economic evaluations in healthcare (18), more recent research has suggested other approaches. Royle and Waugh (17) assessed 19 English health technology assessments (retrieving 130 studies) to explore the yield of different databases in terms of the economic evaluations reviewed. MEDLINE and EMBASE both identified 86.6 percent of the economic evaluations used in the reviews, with 78 studies common to both databases. A total of 40.2 percent of studies were identifiable in NHS EED. Searching MEDLINE, EMBASE, and NHS EED reached a cumulative 94.8 percent of the included studies and led Royle and Waugh (17) to recommend searching all three databases. More recently, Alton and colleagues (1) recommended a search of NHS EED supplemented by a search of a general database such as MEDLINE. Comparative studies such as these are hampered by differing definitions of economic evaluations and by difficulties in establishing the currency of the databases. Further retrospective analyses of completed reviews to identify the sources of included studies might provide more information on the value of searching a range of databases to identify economic evaluations. Such record analyses are relatively easy to achieve at the end of a technology assessment and, to a limited extent, are publishable. However, the prospect of repeated and regular journal publication of such data appears remote. A more realistic approach to increasing information on resource yield and value may be to develop collaborative Web sites that collect such data routinely, so that many contributors can provide information (at the end of projects) to build the evidence picture.

Other Resources

Econlit indexes the economic literature and is often suggested as a resource to search for economic evaluations. However, it yields few reports of economic evaluations, as it indexes economic journals and few economic evaluations in healthcare are published in economic journals. Databases of reviews and health technology assessments may also help to identify economic evaluations which have been included in reviews. Selected review resources are listed in Table 1.

SEARCH STRATEGIES

As well as identifying resources to be searched, searching for economic evaluations involves compiling search terms into strategies that are likely to capture records relevant to the research question.

Search Terms and Term Combinations

Search strategies are search terms structured into conceptual groups. One commonly used conceptual grouping is known as PICO, meaning that the search captures one or more of the following concepts: the population(s), intervention(s), comparator(s), and outcome(s) of interest (5). For example, a research question about the cost-effectiveness of nicotine replacement strategies to achieve smoking cessation might have a population concept representing “smokers” and an intervention concept comprising the various nicotine replacement therapies. The concepts are combined using the Boolean AND operator, as in the illustrative strategy below.
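
As a minimal sketch only, using generic syntax (the exact operators, truncation symbols, and phrase handling vary by interface and should be checked against each database's documentation), such a strategy might combine synonyms within each concept with OR and then combine the concepts with AND:

  • #1 smoker* OR smoking OR tobacco

  • #2 nicotine AND (replacement OR patch* OR gum)

  • #3 #1 AND #2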

Searching Economic Evaluation Databases

Searching databases dedicated to economic evaluations, such as NHS EED or HEED, does not require the addition of a lengthy string of economic evaluation search terms to focus the search. Typically, the search strategy will involve one or more PICO concepts combined with a limit to the economic evaluations subset within the database. For example, in NHS EED searches can be limited to economic evaluations by making use of indexing terms that have been added by the database producers. The indexing term describes the type of study and is placed in a field with the abbreviated name (label) “TY”. To find economic evaluations in NHS EED, a subject search can be combined with the following indexing terms in the TY field, as shown in line #2 below:

  • #1 (smok* or tobacco or cigarette*) AND nicotine

  • #2 Economic:ty OR provisional:ty

  • #3 #1 AND #2

Information on how to focus searches can be found in database help pages and online tutorials.

Searching Large Biomedical Databases

In large biomedical databases, such as MEDLINE, economic evaluation records form only a small proportion of the total records. Finding these records among many millions of records requires PICO search strategies combined with a detailed set of economics search terms. The key question is which search terms will both yield the most economic evaluations (have the highest sensitivity) and produce the fewest irrelevant records (have good precision). Moreover, the search terms need to be adapted to the different resources in which the strategy will be run, to reflect the following issues:

  (i) Database producers do not all use the same indexing term schemes; for example, MEDLINE records receive Medical Subject Headings (MeSH), whereas EMBASE records receive EMTREE indexing terms;

  (ii) Databases are provided by different publishers, who may offer very different sets of search commands and options. For example, one publisher may offer a truncation symbol of * to retrieve all words beginning with a common root, while another may offer $. In addition, the way that database publishers (such as Ovid or DIMDI) implement searches may differ. For example, when a term is typed into the PubMed search box, the PubMed software performs extensive processing to identify synonyms and related terms. This “behind the scenes” processing does not happen when a search term is typed into the Ovid interface. The comparison below illustrates how the same concept may need to be re-expressed for different interfaces.
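
As an illustration only (the subject heading, field labels, and operators shown are examples and should be verified against each interface's current documentation), a smoking cessation concept might be expressed as follows:

  • Ovid MEDLINE: exp Smoking Cessation/ OR (smok* adj3 cessation).ti,ab.

  • PubMed: "smoking cessation"[mh] OR (smoking[tiab] AND cessation[tiab])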

Focusing the search strategy on economic evaluations can be achieved by using database-specific search filters. Search filters are collections of search terms developed to capture specific themes, such as study design, in specific databases such as MEDLINE. Search filters can be identified from the ISSG Search Filter Resource (http://www.york.ac.uk/inst/crd/intertasc/). An illustrative fragment of an economics filter is sketched below.
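
The following fragment, in Ovid MEDLINE syntax, is a simplified sketch of the kind of terms such a filter typically combines; it is not one of the published, validated filters discussed in the next paragraph, and the subject headings and text words shown are examples only:

  • #1 exp "Costs and Cost Analysis"/

  • #2 Economics/ OR exp Economics, Medical/

  • #3 (cost or costs or economic* or pharmacoeconomic*).ti,ab.

  • #4 #1 OR #2 OR #3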

Although many economic search filters have been published, their performance in terms of record retrieval has not been extensively tested or validated. This means that until recently there has been little evidence on the comparative performance of the available filters to assist in choosing between them (13;17;18;20). Performance testing tends to focus on how far search filters succeed in retrieving all relevant studies (sensitivity) and how few irrelevant records they retrieve (precision and specificity). Recent research has tested the performance of thirteen MEDLINE filters and eight EMBASE filters in two large sets of economic evaluations: Glanville and colleagues (8) found that filters developed by CRD, NHS Quality Improvement Scotland, and Royle and Waugh performed with high sensitivity in MEDLINE, finding 99–100 percent of the known relevant records (4;12;17). However, the precision of these filters was well below 10 percent: to find one relevant record, more than nine irrelevant records would need to be assessed. This ratio of irrelevant to relevant records can be changed, to improve precision, by reducing sensitivity. The searcher makes this choice in the knowledge that, although less time is spent sifting irrelevant records, some relevant records will not be retrieved. The study identified the filters that offered the best compromise between sensitivity and precision for MEDLINE (8).
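
As a purely hypothetical illustration of these measures, suppose a gold standard set contains 100 known economic evaluations and a filter retrieves 2,000 records from the same database segment, 98 of them relevant: sensitivity is 98/100 (98 percent), but precision is 98/2,000 (4.9 percent), so roughly nineteen irrelevant records must be sifted for every relevant record found.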

For EMBASE, the high-sensitivity filters were those developed by NHS Quality Improvement Scotland, CADTH, Royle and Waugh, and CRD (4;10;12;17). Those filters also had low precision, yielding high proportions of irrelevant records (8).

The difficulty that search filter designers face in improving the precision of search filters may reflect imprecision about, and within, the target records. First, there is a lack of agreement on the definition of economic evaluations, which means that search filter designers may have to make filters over-sensitive (18;20). Second, some authors of economic evaluations may not report their methods clearly in abstracts or may use varying definitions of economic evaluations. NHS EED provides examples of records that authors described as economic evaluations but that NHS EED abstractors have categorized differently, perhaps as costing studies (15). Third, database indexers assign economic evaluation index terms to records, but these may not be applied consistently. Using indexing terms specific to economic evaluations (such as “Cost-benefit analysis/” in MEDLINE and “Cost effectiveness analysis/” in EMBASE, which indexers of those databases should add to relevant records) did not achieve high levels of sensitivity and precision in the recent search filter performance research (8). That research noted that some economic evaluations were not indexed with the most appropriate available indexing term, and that some records carrying the indexing term were not, in fact, economic evaluations. This means that search filters cannot benefit significantly from sensitive yet precise indexing and so underperform in terms of precision, yielding high proportions of irrelevant results (8;20). As a result, a high proportion of irrelevant records may still need to be processed to identify relevant records when searching biomedical databases such as MEDLINE and EMBASE.

DISCUSSION

Ideally the retrieval of economic evaluations should be informed by evidence on the most efficient strategies. This evidence is sparse at present and there are many areas which merit investigation. Research into efficient retrieval requires clear definitions of economic evaluations to allow comparison across studies. Research into comparative yield, search filter development, and performance testing will benefit from the development of gold standard sets of economic evaluations.

Further information on the comparative yield of different resources and different search approaches is required to enhance the efficient production of health technology assessments. Database producers should also be interested in identifying and promoting the unique value of their resources to users, including researchers, and in ensuring that researchers can retrieve studies consistently through study design indexing. Authors of studies should be aware that if they do not report their methods clearly, along with the research topic, their research risks being missed by indexers and by searchers. Ideally, the type of economic evaluation being reported (for example, cost-utility analysis) should be mentioned in the title and again in the Methods section of the structured abstract. Authors can also signal the type of evaluation in author keywords, where these are available. Ideally, authors would use MeSH or EMTREE terms in their keywords. Although EMTREE offers specific economic evaluation terms, MeSH offers only “Cost-benefit Analysis.” This is useful for indexing evaluations that are cost-benefit analyses, but at present all types of economic evaluation are indexed under that heading. MeSH would become an even more useful tool for searchers seeking to retrieve economic evaluations accurately if either additional subject headings were added (to reflect more accurately the differences between types of economic evaluation) or a better generic term (such as “Economic Evaluation”) were introduced to replace “Cost-benefit Analysis.” Such a change in MeSH is, however, not yet in sight, so authors of economic studies should continue to use the most specific term that describes their method as a keyword.
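
As an invented illustration of this advice, a report titled “Cost-utility analysis of nicotine replacement therapy versus standard care” that repeats “cost-utility analysis” in the Methods section of its structured abstract and supplies author keywords such as “cost-utility analysis,” “economic evaluation,” and the nearest MeSH term, “Cost-benefit Analysis,” is more likely to be indexed appropriately and retrieved by economics search filters.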

Identifying economic evaluations, on the whole, suffers from lack of precision rather than lack of sensitivity. We can find a large proportion of economic evaluations relatively easily, but the efficiency of this process could be greatly increased if it were possible to reduce the proportion of irrelevant records to be sifted.

CONTACT INFORMATION

Julie Glanville, BA, PGDipLib, MSc (), Project Director–Information Services, York Health Economics Consortium, University of York, Market Square, York, North Yorks YO10 5NH, United Kingdom

Suzy Paisley, MA (), Senior Research Fellow, School of Health and Related Research (ScHARR), University of Sheffield, Regent Court, 30 Regent Street, Sheffield S1 4DA, United Kingdom

CONFLICT OF INTEREST

S. Paisley received a Department of Health Fellowship for this work. J. Glanville managed the NHS EED database and promoted all CRD databases during her employment with CRD (1993 to 2008). Wiley Interscience provided a three-month free-access subscription to HEED, allowing the authors to prepare this paper.


REFERENCES

1. Alton, V, Eckerlund, I, Norlund, A. Health economic evaluations: How to find them. Int J Technol Assess Health Care. 2006;22:512–517.
2. Canadian Agency for Drugs and Technologies in Health. Grey matters: A practical search tool for evidence-based medicine. Ottawa: Canadian Agency for Drugs and Technologies in Health; 2008. http://www.cadth.ca/index.php/en/cadth/products/grey-matters (accessed September 3, 2010).
3. Canadian Agency for Drugs and Technologies in Health. Guidelines for the economic evaluation of health technologies: Canada. 3rd ed. Ottawa: Canadian Agency for Drugs and Technologies in Health; 2006.
4. Centre for Reviews and Dissemination. How are studies identified for NHS EED [Internet]. York: Centre for Reviews and Dissemination; 2009. http://www.crd.york.ac.uk/crdweb/html/help.htm (accessed September 3, 2010).
5. Centre for Reviews and Dissemination. Systematic Reviews: CRD's Guidance for Undertaking Reviews in Healthcare. 3rd ed. York: Centre for Reviews and Dissemination; 2009. http://www.york.ac.uk/inst/crd/pdf/Systematic_Reviews.pdf (accessed September 3, 2010).
6. CODECS [database on the Internet]. Paris: Collège des Economistes de la Santé; 2009. http://infodoc.inserm.fr/codecs/codecsanglais.nsf/(Web+English+Startup+Page)?OpenForm (accessed September 3, 2010).
7. Cost-effectiveness Analysis (CEA) Registry [database on the Internet]. Medford, MA: Center for the Evaluation of Value and Risk in Health, Tufts University; 2009. https://research.tufts-nemc.org/cear/default.aspx (accessed September 3, 2010).
8. Glanville, J, Kaunelis, D, Mensinkai, S. How well do search filters perform in identifying economic evaluations in MEDLINE and EMBASE. Int J Technol Assess Health Care. 2009;25:522–529.
9. HEED: Health Economics Evaluation Database [database on the Internet]. Wiley Interscience; 2009. http://onlinelibrary.wiley.com/book/10.1002/9780470510933 (accessed September 3, 2010).
10. Ho, C, Li, H, Noorani, H, et al. Appendix 2: Literature search strategy for cost-effectiveness studies. In: Implantable cardiac defibrillators for primary prevention of sudden cardiac death in high risk patients: A meta-analysis of clinical efficacy, and a review of cost-effectiveness and psychosocial issues [Technology report no 81]. Ottawa: Canadian Agency for Drugs and Technologies in Health (CADTH); 2007. http://www.cadth.ca/media/pdf/332_ICD_tr_appendices_e.pdf (accessed September 3, 2010).
11. Kristensen, FB, Sigmund, H, eds. Health technology assessment handbook. Copenhagen: Danish Centre for Health Technology Assessment; 2008.
12. Macpherson, K, Boynton, J. Economics filter [Internet]. York: InterTASC Information Specialists’ SubGroup; 2009. http://www.york.ac.uk/inst/crd/intertasc/econ3.htm (accessed September 3, 2010).
13. McKinlay, RJ, Wilczynski, NL, Haynes, RB. Optimal search strategies for detecting cost and economic studies in EMBASE. BMC Health Serv Res. 2006;6:67.
14. National Institute for Health and Clinical Excellence. Single technology appraisal (STA). Specification for manufacturer/sponsor submission of evidence. London: NICE; 2009. http://www.nice.org.uk/media/4D9/DD/ManufacturerConsulteeTemplateOct09.doc (accessed September 3, 2010).
15. NHS Economic Evaluation Database [database on the Internet]. York: Centre for Reviews and Dissemination; 2009. http://www.crd.york.ac.uk/crdweb/ (accessed September 3, 2010).
16. PEDE: Paediatric Economic Database Evaluation [database on the Internet]. Toronto: Hospital for Sick Children; 2008. http://pede.ccb.sickkids.ca/pede/index.jsp (accessed September 3, 2010).
17. Royle, P, Waugh, N. Literature searching for clinical and cost-effectiveness studies used in health technology assessment reports carried out for the National Institute for Clinical Excellence appraisal system. Health Technol Assess. 2003;7:161.
18. Sassi, F, Archard, L, McDaid, D. Searching literature databases for health care economic evaluations: How systematic can we afford to be? Med Care. 2002;40:387–394.
19. Shemilt, I, Mugford, M, Byford, S, et al. Chapter 15: Incorporating economics evidence. In: Higgins, JPT, Green, S, eds. Cochrane Handbook for Systematic Reviews of Interventions. Version 5.0.1 (updated September 2008). The Cochrane Collaboration; 2008. http://www.cochrane-handbook.org (accessed September 3, 2010).
20. Wilczynski, NL, Haynes, RB, Lavis, JN, et al. Optimal search strategies for detecting health services research studies in MEDLINE. CMAJ. 2004;171:1179–1185.