
VALUE OF DATABASES OTHER THAN MEDLINE FOR RAPID HEALTH TECHNOLOGY ASSESSMENTS

Published online by Cambridge University Press:  28 April 2014

Diane L. Lorenzetti
Affiliation:
Department of Community Health Sciences, Institute of Public Health, University of Calgary and the Institute of Health Economics
Leigh-Ann Topfer
Affiliation:
Health Technology & Policy Unit, School of Public Health, University of Alberta
Liz Dennett
Affiliation:
Institute of Health Economics and JWS Health Sciences Library, University of Alberta
Fiona Clement
Affiliation:
Department of Community Health Sciences, Institute of Public Health, University of Calgary

Abstract

Objectives: The objective of this study was to explore the degree to which databases other than MEDLINE contribute studies relevant for inclusion in rapid health technology assessments (HTA).

Methods: We determined the extent to which the clinical, economic, and social studies included in twenty-one full and four rapid HTAs published by three Canadian HTA agencies from 2007 to 2012 were indexed in MEDLINE. Other electronic databases, including EMBASE, were then searched, in sequence, to assess whether or not they indexed studies not found in MEDLINE. Assessment topics ranged from purely clinical (e.g., drug-eluting stents) to those with broader social implications (e.g., spousal violence).

Results: MEDLINE contributed the majority of studies in all but two HTA reports, indexing a mean of 89.6 percent of clinical studies across all HTAs, and 88.3 percent of all clinical, economic, and social studies in twenty-four of twenty-five HTAs. While EMBASE contributed unique studies to twenty-two of twenty-five HTAs, three rapid HTAs did not include any EMBASE studies. In some instances, PsycINFO and CINAHL contributed as many, if not more, non-MEDLINE studies than EMBASE.

Conclusions: Our findings highlight the importance of assessing the topic-specific relative value of including EMBASE, or more specialized databases, in HTA search protocols. Although MEDLINE continues to be a key resource for HTAs, the time and resource limitations inherent in the production of rapid HTAs require that researchers carefully consider the value and limitations of other information sources to identify relevant studies.

Type
Methods
Copyright
Copyright © Cambridge University Press 2014 

Health technology assessment (HTA) is a process for evaluating the clinical, economic, and social implications of health technologies such as drugs, devices, and medical or surgical procedures (1). HTA reports inform policy and practice with respect to the implementation or reassessment of these technologies (2;3). These reports are rigorous reviews of the existing evidence and typically take from 6 months to a year or more to complete (4).

In contrast, rapid health technology assessments are often completed within a much shorter time frame (5). The need for rapid HTAs can be “driven by clinical urgency...intense demands for uptake of technology...[or]...limited time and resources” (5). Although “there is no universally accepted definition of what constitutes a rapid review” (6), they are usually narrower in scope and intended to inform the more immediate information needs of policy and healthcare decision makers (5-8).

Producers of rapid HTAs use a variety of methods to expedite or streamline these reports, including adopting less extensive or rigorous study review, quality assessment, data extraction, and searching techniques than are typical of full HTAs (4;5;7;8). As literature searches provide the foundation for health technology assessments, a failure to identify all literature relevant to a research question may bias results and directly impact the quality of these assessments.

Typically, extensive literature searches are highly sensitive: in attempting to capture all relevant studies, searchers may retrieve large numbers of irrelevant references. The time required to screen and exclude these irrelevant studies may affect the timeliness and, ultimately, the utility of rapid HTAs.

While previous research has demonstrated that MEDLINE invariably identifies the majority of clinically relevant studies included in HTAs or systematic reviews, researchers have also shown that searching MEDLINE alone may bias the conclusions reached by the authors of these assessments (9-20). In 2005, Royle et al. (16) conducted a systematic review of meta-analyses of diabetes interventions and found that, in fifteen (34 percent) of forty-four meta-analyses reviewed, searching MEDLINE alone would have biased study results. As a result, other biomedical databases, such as EMBASE and the Cochrane Library, are often searched, in addition to MEDLINE, to identify potentially relevant studies.

In any context, an effective search protocol is one that achieves an acceptable balance between sensitivity (the degree to which searches retrieve all relevant studies) and precision (the degree to which searches exclude irrelevant studies) (21). Sampson and colleagues suggest that “in most retrieval situations, a threshold will be explicitly or implicitly set for one parameter...[sensitivity or precision]...and efforts will be made to maximize the other” (22) (Figure 1). Overlapping content between MEDLINE and EMBASE, reported as ranging from 10 percent to 87 percent, and inadequate indexing of studies in individual databases can significantly reduce the precision of a literature search, increasing the time required to identify studies and, ultimately, to complete a rapid HTA (10;14;17;23).

Figure 1. Definitions.
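Stated formally, the two measures correspond to the prose definitions given above (this is a minimal restatement in standard information-retrieval notation, not a reproduction of Figure 1):

\[
\text{sensitivity} = \frac{\lvert \text{relevant studies retrieved} \rvert}{\lvert \text{all relevant studies} \rvert},
\qquad
\text{precision} = \frac{\lvert \text{relevant studies retrieved} \rvert}{\lvert \text{all studies retrieved} \rvert}
\]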

In a 2005 study which compared indexing of publications in MEDLINE and EMBASE, researchers reported that, drug indexing aside, EMBASE EMTREE indexing terms were less specific than MEDLINE MeSH terms (24). Anecdotal evidence also suggests that EMBASE studies may be over-indexed, with many more indexing terms applied to individual studies than is typical of MEDLINE (24). Both phenomena may result in the identification of many irrelevant studies when searching EMBASE.

Given increasing demands for high-quality rapid health technology assessments, and ongoing improvements to core databases such as MEDLINE and EMBASE, it is important to assess the effectiveness of biomedical databases in identifying relevant studies for rapid HTAs. The objective of this study was to explore the degree to which databases other than MEDLINE contribute studies relevant for inclusion in rapid health technology assessments.

METHODS

Data Collection

We analyzed a convenience sample of twenty-one full and four rapid health technology assessments published from 2007 to 2012 by three Canadian health technology assessment units: the Institute of Health Economics (Edmonton, Alberta), the University of Alberta, and the University of Calgary. These agencies produce HTAs to inform the decision-making activities of a common provincial health authority, and they meet regularly to establish guidelines and procedures, including tools to support the design and execution of literature searches. The information professionals attached to these organizations have similar levels of searching experience and access to the same core clinical and health research databases. As such, there is a commonality of searching techniques and procedures both within and among the HTAs included in our study. The assessment topics in our sample ranged from purely clinical (e.g., drug-eluting stents) to those with broader social implications (e.g., spousal violence) (Table 1).

Table 1. Unique Clinical Studies Indexed in Electronic Databases (%)

We adopted a relative recall method to assess the contribution that each database made to the studies included in our sample (25;26). Relative recall is “the proportion of...relevant articles that any specific system, filter, or tool retrieves” (26). We isolated the studies included in the clinical, economic, and social sections of the HTAs from supplementary studies cited in the background or discussion sections of each report. Using a variety of techniques, including searches on title fragments and on author/date, author/journal, and title word/journal combinations, we then determined which of these studies were indexed in MEDLINE.
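In the context of this study, the relative recall of an individual database for a given HTA can be written as follows (a sketch of the quoted definition, using notation of our own choosing rather than that of the cited sources):

\[
\text{relative recall}(D) = \frac{\lvert \{\text{included studies indexed in database } D\} \rvert}{\lvert \{\text{all studies included in the HTA}\} \rvert}
\]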

Other electronic databases, identified in the methods sections of the HTAs reviewed, were then searched, in sequence, to determine whether or not they indexed those studies not found in MEDLINE. The databases searched, in order, were: MEDLINE (OVID), EMBASE (OVID), the Cochrane Library, PsycINFO (OVID), CINAHL (EBSCO), SocINDEX (EBSCO), Business Source Complete (EBSCO), and ERIC (EBSCO). Conference abstracts and other gray literature were excluded.
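The sequential attribution of included studies to databases can be summarized programmatically. The following Python sketch is illustrative only: the database order mirrors the sequence listed above, but the study identifiers and the indexing lookup are hypothetical placeholders, and the actual analysis was performed manually in a spreadsheet rather than in code.

# Illustrative sketch of the sequential attribution underlying Tables 1 and 2.
# 'indexed_in' maps each included study to the set of databases that index it;
# in practice this information was gathered by manual searching.

SEARCH_ORDER = [
    "MEDLINE", "EMBASE", "Cochrane Library", "PsycINFO",
    "CINAHL", "SocINDEX", "Business Source Complete", "ERIC",
]

def attribute_studies(indexed_in):
    """Attribute each study to the first database in SEARCH_ORDER that indexes it."""
    counts = {db: 0 for db in SEARCH_ORDER}
    not_found = 0
    for study, databases in indexed_in.items():
        for db in SEARCH_ORDER:
            if db in databases:
                counts[db] += 1
                break
        else:
            not_found += 1  # study not indexed in any of the searched databases
    return counts, not_found

def unique_contribution_percentages(indexed_in):
    """Express each database's unique contribution as a percentage of all included studies."""
    counts, _ = attribute_studies(indexed_in)
    total = len(indexed_in)
    return {db: 100.0 * n / total for db, n in counts.items()}

# Hypothetical example: three included studies from one HTA.
example = {
    "study_a": {"MEDLINE", "EMBASE"},   # counted for MEDLINE
    "study_b": {"EMBASE", "CINAHL"},    # counted for EMBASE (first non-MEDLINE hit)
    "study_c": {"PsycINFO"},            # counted for PsycINFO
}
print(unique_contribution_percentages(example))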

Data Analysis

An Excel spreadsheet was created to track and analyze study results. We calculated the proportion of clinical and all (clinical, economic, and social) studies indexed in each electronic database for each HTA.

RESULTS

Among twenty-five rapid and full HTAs, we identified 731 clinical and 1,621 economic or social references, for an average of 94 included studies per HTA. The number of databases searched during the preparation of these reports ranged from three to eleven, with a mean of 6.2 databases per HTA and 3.5 per rapid HTA. Although the number and type of electronic databases varied across assessments, MEDLINE and the Cochrane Library were included in all HTAs, and EMBASE in all but two assessments.

Clinical Studies

MEDLINE indexed an average of 89.6 percent of clinical studies across all HTAs (Table 1). In twenty of twenty-three HTAs with a clinical component, the proportion of clinical studies indexed in MEDLINE was equal to or greater than 80 percent, with MEDLINE indexing 100 percent of the clinical studies in eight HTAs (Assisted Reproductive Technologies Genetics Rapid Review, Drug Eluting Stents Rapid Review, Exercise Testing for Cardiac Events, HPV Testing, Islet Transplantation, Point of Care Testing, Sleep Disordered Breathing, and Vitamin D Testing Rapid Review). MEDLINE contributed less than 50 percent of included studies in two HTAs (Sex Offender Treatment and Spousal Violence).

Although MEDLINE and EMBASE, combined, indexed an average of 95.3 percent of clinical studies across all HTAs and rapid reviews, EMBASE contributed clinical studies not indexed in MEDLINE to only thirteen of twenty-five HTAs and rapid reviews. The proportion of clinical studies identified in EMBASE but not in MEDLINE averaged 5.7 percent across HTAs (ranging from 0 percent to 37.5 percent). In two assessments (Fetal Fibronectin and Spousal Violence), CINAHL or PsycINFO contributed more clinical non-MEDLINE studies than EMBASE.

All Studies

When we examined all studies included in the clinical, economic, and social components of the HTAs, we found that, in twenty-four of twenty-five assessments, MEDLINE indexed an average of 88.3 percent of studies across all three domains (Table 2). In only one case (Spousal Violence) were the majority of included studies not indexed in MEDLINE.

Table 2. Unique Studies (Clinical, Social & Economic) Indexed in Electronic Databases (%)

Combined, MEDLINE and EMBASE indexed 90.3 percent of all included studies. While EMBASE contributed unique studies in twenty-two of twenty-five HTAs, three of the four rapid reviews did not include any studies unique to EMBASE. In four reviews (Middle Ear Implants, Occupational Stress, Sex Offender Treatment, and Spousal Violence), PsycINFO or CINAHL contributed greater numbers of non-MEDLINE studies than EMBASE. Notably, PsycINFO contributed a larger percentage of all included studies (48 percent) to the HTA on Spousal Violence than either MEDLINE (39 percent) or EMBASE (11 percent).

DISCUSSION

MEDLINE contributed more than 50 percent of included studies to all but two HTAs included in our analysis. Whereas most reports incorporated studies not indexed in MEDLINE, the importance of EMBASE varied across HTAs and study types (clinical versus social or economic aspects).

Although we anticipated that MEDLINE and EMBASE would collectively index the majority of clinical studies in our sample, we did not predict the contribution that EMBASE would make to the identification of nonclinical studies. While MEDLINE indexed by far the largest number of nonclinical studies included in all but one HTA, EMBASE more frequently contributed unique nonclinical, rather than clinical, studies (23 versus 13 HTAs) to the HTAs in our sample. This finding suggests that searchers may wish to consider the benefits and feasibility of searching EMBASE to identify unique nonclinical literature. That said, in some cases, PsycINFO and CINAHL indexed as many, if not more, included studies than EMBASE, highlighting the importance of assessing the relative value of individual databases in the context of the topics being searched.

Our findings mirror previous research on the relevance of MEDLINE for biomedical systematic reviews and HTAs (15;18;23). This study updates the literature on this topic by reviewing HTAs published between 2007 and 2012. To our knowledge, it is also the first to compare and contrast the relevance of electronic databases in identifying both clinical and nonclinical studies appropriate for inclusion in health technology assessments.

Our study has caveats and limitations. First, although we included assessments produced by three HTA agencies in Canada, all of these agencies have access to similar university library-based electronic database resources; the inclusion of HTAs from international agencies may have produced different results. Second, the retrospective nature of this study precluded us from determining whether or not the exclusion of EMBASE and other non-MEDLINE databases could, in fact, have affected the conclusions reached by the authors of these assessments. Finally, although we determined which databases indexed the studies included in the HTAs in our sample, we did not assess the degree to which the search strategies implemented in these HTAs were capable of identifying all of the studies indexed in each database.

We used a relative recall approach to evaluate the contribution that each database made to the set of relevant studies in our sample. Through this method, we were able to determine, for example, that 64 percent of all studies included in our HTA on Occupational Stress were indexed in MEDLINE. While relative recall enables the collection of data that is fundamental to assessing the value of electronic and other information sources, this method of analysis is but a first step toward determining which information sources can best inform the production of topic-specific HTAs. One would also wish to establish whether or not a database-specific search strategy was able to, or could be adapted to, retrieve all relevant studies indexed in that database. As variability in study indexing can hamper the ability of even the most experienced searchers to identify relevant studies, the development of methods to address this variability would be an important area for further research.

Although it appears that authors will continue to be required to supplement MEDLINE searching with other sources of information to achieve acceptable levels of extensiveness in study identification, the question, in the context of rapid reviews, is which electronic databases should be searched to achieve an acceptable balance between sensitivity and precision. As Sampson and colleagues reported in 2003, “[although] [s]earching Medline but not Embase risks biasing a meta-analysis by finding studies that show larger estimates...their prevalence seems low enough that the risk may be slight, provided the rest of the search is comprehensive” (17).

Despite their size and breadth of content, databases such as MEDLINE and EMBASE index only a portion of the world's biomedical journals (27;28). In addition, whereas MEDLINE (by means of PubMed) has been freely available worldwide since 1997, the cost of accessing commercial databases, such as EMBASE and CINAHL, may be prohibitive for HTA agencies that are unable to link to these resources through their university or agency libraries.

There is, and will likely remain, a need for rapid access to current information to inform immediate healthcare decision making (5). As such, it behooves authors to continue to establish best practices for the production of rapid HTAs. Although researchers may contend that excluding EMBASE and other electronic databases from rapid reviews will bias study results, study identification can be achieved through a variety of means. Authors of full HTAs and systematic reviews typically incorporate non-database information sources in their search protocols. These strategies, including scanning the reference lists of included studies, cited reference searching, and consulting with experts, may enable searchers to match or exceed the precision achieved through searching EMBASE or other databases (15). Indeed, these alternative sources may complement MEDLINE searching in cases where the retrieval of large numbers of irrelevant records from EMBASE and other non-MEDLINE databases threatens the timely completion of health technology assessments.

CONCLUSIONS

Although MEDLINE continues to be a key resource for HTAs, the time and resource limitations inherent in the production of rapid HTAs require that researchers carefully consider the value and limitations of other information sources to support study identification. While EMBASE is undoubtedly a key resource in health research and HTA production, our study indicates that, where time and resources are at a premium and choices must be made, there may be some topics for which other, more specialized or topic-specific databases could make a greater contribution to study identification than EMBASE. Study identification protocols should reflect the information needs of the topic under review, as well as the timelines and resources available to complete these assessments.

CONTACT INFORMATION

Diane L. Lorenzetti, MLS, Department of Community Health Sciences, Institute of Public Health, University of Calgary, 3280 Hospital Dr NW, Calgary, Alberta, Canada T2N 4Z6

Leigh-Ann Topfer, MLS, Health Technology & Policy Unit, School of Public Health, University of Alberta, 3021 RTF Building, 8308-114 St., Edmonton, AB T6G 2V2

Liz Dennett, MLIS, JWS Health Sciences Library, WC Mackenzie Health Science Centre, Edmonton, Alberta, Canada T6G 2R7

Fiona Clement, PhD, Institute of Public Health, Department of Community Health Sciences, University of Calgary, 3280 Hospital Dr NW, Calgary, Alberta, Canada T2N 4Z6

CONFLICTS OF INTEREST

The authors report no conflicts of interest.

REFERENCES

1. Facey, K. Glossary. International Network of Agencies for Health Technology Assessment; 2006. http://www.inahta.org/Glossary/ (accessed August 15, 2013).
2. Hailey, D, Corabian, P, Harstall, C, Schneider, W. The use and impact of rapid health technology assessments. Int J Technol Assess Health Care. 2000;16:651-656.
3. Hailey, D. A preliminary survey on the influence of rapid health technology assessments. Int J Technol Assess Health Care. 2009;25:415-418.
4. Menon, D, Stafinski, T. Health technology assessment in Canada: 20 years strong? Value Health. 2009;12(Suppl 2):S14-19.
5. Ganann, R, Ciliska, D, Thomas, H. Expediting systematic reviews: Methods and implications of rapid reviews. Implement Sci. 2010;5:56.
6. Khangura, S, Konnyu, K, Cushman, R, et al. Evidence summaries: The evolution of a rapid review approach. Syst Rev. 2012;1:10.
7. Harker, J, Kleijnen, J. What is a rapid review? A methodological exploration of rapid reviews in health technology assessments. Int J Evid Based Healthc. 2012;10:397-410.
8. Watt, A, Cameron, A, Sturm, L, et al. Rapid reviews versus full systematic reviews: An inventory of current methods and practice in health technology assessment. Int J Technol Assess Health Care. 2008;24:133-139.
9. Bahaadinbeigy, K, Yogesan, K, Wootton, R. MEDLINE versus EMBASE and CINAHL for telemedicine searches. Telemed J E Health. 2010;16:916-919.
10. Betran, AP, Say, L, Gulmezoglu, AM, et al. Effectiveness of different databases in identifying studies for systematic reviews: Experience from the WHO systematic review of maternal morbidity and mortality. BMC Med Res Methodol. 2005;5:6.
11. Egger, M, Juni, P, Bartlett, C, et al. How important are comprehensive literature searches and the assessment of trial quality in systematic reviews? Empirical study. Health Technol Assess. 2003;7:1-76.
12. Lemeshow, AR, Blum, RE, Berlin, JA, et al. Searching one or two databases was insufficient for meta-analysis of observational studies. J Clin Epidemiol. 2005;58:867-873.
13. Parkhill, AF, Clavisi, O, Pattuwage, L, et al. Searches for evidence mapping: Effective, shorter, cheaper. J Med Libr Assoc. 2011;99:157-160.
14. Royle, P, Milne, R. Literature searching for randomized controlled trials used in Cochrane reviews: Rapid versus exhaustive searches. Int J Technol Assess Health Care. 2003;19:591-603.
15. Royle, P, Waugh, N. A simplified search strategy for identifying randomised controlled trials for systematic reviews of health care interventions: A comparison with more exhaustive strategies. BMC Med Res Methodol. 2005;5:23.
16. Royle, PL, Bain, L, Waugh, NR. Sources of evidence for systematic reviews of interventions in diabetes. Diabet Med. 2005;22:1386-1393.
17. Sampson, M, Barrowman, NJ, Moher, D, et al. Should meta-analysts search Embase in addition to Medline? J Clin Epidemiol. 2003;56:943-955.
18. Slobogean, GP, Verma, A, Giustini, D, et al. MEDLINE, EMBASE, and Cochrane index most primary studies but not abstracts included in orthopedic meta-analyses. J Clin Epidemiol. 2009;62:1261-1267.
19. Stevinson, C, Lawlor, DA. Searching multiple databases for systematic reviews: Added value or diminishing returns? Complement Ther Med. 2004;12:228-232.
20. Subirana, M, Sola, I, Garcia, JM, et al. A nursing qualitative systematic review required MEDLINE and CINAHL for study identification. J Clin Epidemiol. 2005;58:20-25.
21. Glanville, J, Paisley, S. Identifying economic evaluations for health technology assessment. Int J Technol Assess Health Care. 2010;26:436-440.
22. Sampson, M, Tetzlaff, J, Urquhart, C. Precision of healthcare systematic review searches in a cross-sectional sample. Res Synth Methods. 2011;2:119-125.
23. Topfer, LA, Parada, A, Menon, D, et al. Comparison of literature searches on quality and costs for health technology assessment using the MEDLINE and EMBASE databases. Int J Technol Assess Health Care. 1999;15:297-303.
24. Leclercq, E. Indexing of a clinical paper by EMBASE and MEDLINE. CILIP Health Libr Group Newsletter. 2005;22:10-13.
25. Sampson, M, Zhang, L, Morrison, A, et al. An alternative to the hand searching gold standard: Validating methodological search filters using relative recall. BMC Med Res Methodol. 2006;6:33.
26. Hoogendam, A, de Vries Robbe, PF, Stalenhoef, AF, Overbeke, AJ. Evaluation of PubMed filters used for evidence-based searching: Validation using relative recall. J Med Libr Assoc. 2009;97:186-193.
27. Key MEDLINE indicators. US National Library of Medicine; 2012. http://www.nlm.nih.gov/bsd/bsd_key.html (accessed August 16, 2013).
28. Ware, M, Mabe, M. The STM report: An overview of scientific and scholarly journal publishing, 3rd ed. The Netherlands: International Association of Scientific, Technical and Medical Publishers; 2012.
29. Cochrane Collaboration. Glossary. 2014. www.cochrane.org/glossary/ (accessed August 13, 2013).