
Economic evaluations conducted by Canadian health technology assessment agencies: Where do we stand?

Published online by Cambridge University Press:  01 October 2008

Jean-Eric Tarride
Affiliation:
McMaster University/St. Joseph's Healthcare Hamilton
Catherine Elizabeth McCarron
Affiliation:
McMaster University/St. Joseph's Healthcare Hamilton
Morgan Lim
Affiliation:
McMaster University/St. Joseph's Healthcare Hamilton
James M. Bowen
Affiliation:
McMaster University/St. Joseph's Healthcare Hamilton
Gord Blackhouse
Affiliation:
McMaster University/St. Joseph's Healthcare Hamilton
Robert Hopkins
Affiliation:
McMaster University/St. Joseph's Healthcare Hamilton
Daria O'Reilly
Affiliation:
McMaster University/St. Joseph's Healthcare Hamilton
Feng Xie
Affiliation:
McMaster University/St. Joseph's Healthcare Hamilton
Ron Goeree
Affiliation:
McMaster University/St. Joseph's Healthcare Hamilton

Abstract

Objectives: To examine the production of Health Technology Assessments (HTAs) with economic evaluations (EEs) conducted by Canadian HTA agencies.

Methods: This research used a three-step approach: (i) the Web sites of five Canadian organizations promoting HTA were searched to identify HTA reports with EEs; (ii) HTA agencies were surveyed to verify that our information was complete with respect to HTA activities and to describe the factors that influence the HTA process in Canada (i.e., selection of HTA topics, execution, dissemination of results and future trends); (iii) HTAs with EEs were appraised in terms of study design, retrieval of clinical and economic evidence, resource utilization and costing, effectiveness measures, treatment of uncertainty as well as presence of a budget impact analysis (BIA), and policy recommendations.

Results: Two hundred forty-nine HTA reports were identified, of which 19 percent included EEs (n = 48). Decision analytic techniques were used in approximately 75 percent of the forty-eight EEs, and probabilistic sensitivity analyses were commonly used by all agencies to deal with parameter uncertainty. BIAs or policy recommendations were given in 50 percent of the evaluations. Differences between agencies were observed in terms of selection of topics, focus of assessment, and production of HTAs (e.g., in-house activities). Major barriers to conducting HTAs with EEs were limited capacity, a lack of interest by decision makers, and a lack of robust clinical information.

Conclusions: The results of this research point to the need for increased HTA training, collaboration, evidence synthesis, and use of pragmatic “real world” evaluations.

Type: General Essays
Copyright: © Cambridge University Press 2008

Over the past few decades, health technology assessments (HTAs) and economic evaluations (EEs) have gained increasing importance in healthcare decision making worldwide and in Canada. Two reviews of Canadian HTA agencies' activities have been previously performed. The most recent review assessed 187 HTA documents produced between 1995 and 2001 by six Canadian agencies (8). Results indicated that while 90 percent of these reports evaluated the effectiveness of the technology, less than 25 percent of the studies included a cost-effectiveness analysis (8). In both this evaluation and in an earlier review by Menon and Topfer (10), which evaluated 97 HTAs conducted by four agencies between 1988 and 2001, no further information was available regarding the characteristics of the EEs.

Several changes have also occurred in Canada since 2000, which may have impacted the Canadian production of HTAs. In Québec, the Agence d'évaluation des technologies et des modes d'intervention en santé (AETMIS), an independent organization reporting to Québec's Minister of Health and Social Services, was formed in 2000 to replace the Conseil d'évaluation des technologies (CETS). At the hospital level, the Health Technology Assessment Unit (TAU) was established by the McGill University Health Centre (MUHC) in June 2001 to advise the hospital on difficult resource allocation decisions. A common governance with the Centre Hospitalier de l'Université de Montréal (CHUM) was established in December 2005. In British Columbia, the production of formalized HTA reports was halted when the funding to the Centre for Health Services and Policy Research (CHSPR) was removed in 2002. A new agency, the Ontario Health Technology Advisory Committee (OHTAC), working in collaboration with and supported by the previously established Medical Advisory Secretariat (MAS) of the Ontario Ministry of Health and Long-Term Care, was created in 2003 to evaluate and provide recommendations regarding the funding and diffusion of nondrug technologies in Ontario (5;9). In support of this program, the Programs for Assessment of Technology in Health (PATH) Research Institute, at St. Joseph's Healthcare Hamilton, was also created in 2003. In Alberta, the Alberta Health Technologies Decision Process/Alberta Advisory Committee on Health Technologies (AACHT) was established in 2004 to facilitate decisions on funding and coverage of health services in that province, with a focus on medical and acute services (1). As a consequence of the Decision Process, the Health Technology Assessment Unit of the Alberta Heritage Foundation for Medical Research (AHFMR) was moved to the Institute of Health Economics (IHE) in 2006. Finally, the Canadian Coordinating Office for Health Technology Assessment (CCOHTA) was renamed the Canadian Agency for Drugs and Technologies in Health (CADTH) in 2006 to reflect the new orientation of the agency.

Since 2004, Canada has also had a national health technology strategy, endorsed by the federal and provincial health ministers (6). Considering the recent developments in economic evaluation methods (e.g., the treatment of uncertainty), the publication in 2006 of the third edition of the Canadian guidelines for the economic evaluation of health technologies (2), the lack of information regarding economic evaluations performed by Canadian HTA agencies, and the changes in the Canadian HTA landscape, a new assessment was warranted to gain a better understanding of the Canadian capacity for undertaking HTAs with EEs.

Methods

The Web sites of five Canadian organizations promoting HTA at the time of the study and known to be active in conducting economic evaluations (AETMIS, CADTH, IHE, MAS/OHTAC, TAU) were initially searched to identify HTA reports conducted between January 2001 and June 2006. After screening the titles and abstracts/executive summaries of each identified report, the reports were classified as HTAs with or without an EE. Publication dates given in the HTA reports were used to classify HTAs by year of publication. As AETMIS reports can be published in both French and English, the dates of publication indicated in the French and English versions of an HTA report may not match. To ensure consistency across evaluations, the publication dates indicated in the English reports were chosen by default for AETMIS HTA reports, unless an English version was not available. Reports included in this review were HTAs with an EE as part of the assessment (i.e., cost-minimization analysis, cost-effectiveness analysis, cost-utility analysis, or cost-benefit analysis), as opposed to HTAs that included only a review of the clinical or economic evidence.

Key individuals from each HTA agency were identified and invited in December 2006 to validate the results of the search (i.e., total number of HTAs per year of publication and list of HTA reports with and without EEs) and to answer a survey developed to gain a better understanding of the Canadian HTA capacity in undertaking EEs. This five-page survey, which included open and closed questions, was designed around five major themes: (i) selection of HTA topics (e.g., request from government); (ii) barriers to conducting EEs (e.g., lack of economic expertise); (iii) HTA capacity (e.g., number of HTA employees with economic expertise); (iv) utilization (e.g., provincial funding) and dissemination of results (e.g., Web site posting); and (v) perceived future trends (e.g., increased collaboration between provinces).

A standardized abstraction form was developed and used to capture key information for each HTA report with an EE: technology type (e.g., drug); disease area (as per the main classifications of the International Classification of Diseases, 10th Revision, Canadian enhancement [ICD-10-CA]); assessment of the clinical and economic evidence (e.g., QUOROM diagram); specifics of the EE (study design, data sources for resource utilization and costing, type of effectiveness measures, treatment of uncertainty, and whether the limitations of the EE were identified); completion of a budget impact analysis; presence of policy recommendations; and discussion around ethical, social, or legal aspects. Data were abstracted by two reviewers (J.E.T., C.E.M.) and entered into an Excel spreadsheet. As in Lehoux et al. (8), agencies were referred to by numbers ranging from Agency 1 (A1) to Agency 5 (A5). The search was updated in October 2007 by searching the five agencies' Web sites to retrieve HTA reports posted between July 2006 and September 2007. No follow-up with the agencies was done for this period.

Results

HTA Activities

Our search resulted in a total of 249 HTA reports produced by these five agencies between January 2001 and September 2007. The total annual number of HTA reports increased by more than 150 percent between 2001 (n = 21) and 2006 (n = 55). The proportion of HTAs including an EE remained relatively constant between 2001 and 2006 (24 percent and 25 percent, respectively). For the first 9 months of 2007, twenty-nine HTA reports were retrieved, of which 28 percent included an economic evaluation. In total, 48 HTA reports with an EE (19 percent of the total) and 201 HTA reports without an EE were identified for the period January 2001 to September 2007.

Table 1 shows fluctuations over time in the production of HTA reports with and without an EE. For example, the number of HTAs without an EE was lowest for four agencies in 2004 (A1, A2, A3, and A4). While A3 saw a decrease in its HTA activities after 2003, A5 was the main driver of the increase observed in the total number of HTA reports in 2005 and 2006. It is not known whether the drop observed in 2007 for A5 reflects a decrease in HTA activities or other factors (e.g., the Web site not being updated). Although it is difficult to directly compare the agencies because of their differences in mandate, organizational structure, personnel, and budgets, HTAs with EEs were more frequently conducted by three agencies (A1: 28 percent of total HTA activities, A4: 26 percent, and A5: 18 percent, compared with 5 percent and 10 percent for A3 and A2, respectively).

Table 1. Number of HTA Reports by the Presence of Economic Evaluation (EE), Year, and Agency (A1 to A5)

Selection of Topics and Barriers to Conduct Economic Evaluations

When answering the survey, three agencies reported that HTA topics were selected following requests from healthcare facilities or professionals (A2, A4, A5) or the government (A2, A5). Promising technologies associated with a high degree of uncertainty or technologies with a potentially large impact on the healthcare system were also more likely to be evaluated by these three agencies. One agency (A1) specified that the origin of the request for an assessment was not important; instead, several criteria were used to select a particular topic (e.g., impact on the healthcare system, potential clinical and economic impact, number of currently available alternatives, and availability of HTA reports). Finally, topics for assessment were exclusively chosen by A3 without any external influences.

All agencies agreed that the factors determining the topic of an HTA with or without an EE were the same. Capacity (A1, A3, A5), limited clinical evidence or robust data (A1, A4, A5), and a lack of interest or relevance to policy makers (A1, A2) were cited as the main barriers to conducting EEs as part of an HTA.

HTA Production and Capacity

Differences between agencies were also observed in terms of production of HTAs. Two agencies indicated that they conducted their HTA reports entirely in-house (A3 and A4) while three agencies (A1, A2, A5) reported collaborating with other centers or research groups in conducting HTAs. Working in collaboration with other researchers or outsourcing was more common for HTA reports with an EE (Table 2). In terms of capacity, the five agencies had a total of twenty-eight employees with economic evaluation expertise in 2006.

Table 2. HTA Production by Presence of Economic Evaluation (EE), Year, and Agency (A1 to A5)

Technology Type and Disease Areas

As shown in Table 3, devices (39 percent) and drugs (35 percent) were more frequently evaluated than procedures (18 percent), programs (4 percent), or other (4 percent) (e.g., combination of drug and program). Two agencies (A2 and A5) never conducted an EE of a drug, while almost two-thirds of A1's economic reports assessed drugs. The evaluation of procedures was mainly performed by three agencies (A2, A4, and A5).

Table 3. Type of Technology Assessed in HTAs with Economic Evaluations (EEs), Total and by Agency (A1 to A5)

Approximately half of the forty-eight HTA reports with an EE evaluated technologies related to diseases of the circulatory system (n = 10), infectious and parasitic diseases (n = 8), and neoplasms (n = 7) (Table 4). The other half of the evaluations was spread across several disease conditions. There was little duplication among the agencies, with the exception of drug-eluting stents for percutaneous coronary artery stenting, whose cost-effectiveness was evaluated by three agencies.

Table 4. Disease Conditions Assessed in Health Technology Assessments (HTAs) with Economic Evaluations (EEs) per Agency (A1 to A5) (ICD-10CA)1 (Total and by Agency)

1. International Classification of Diseases, 10th Revision, Canadian enhancement (ICD-10-CA).

a. Total equals 50, as one technology (positron emission tomography) was assessed for three separate disease conditions.

Review of the Evidence

Overall, 85 percent of HTA reports with an EE reported details on the literature search used to identify clinical evidence (e.g., databases searched), and 67 percent specified the inclusion and exclusion criteria (Table 5). A QUOROM diagram summarizing the retrieval process for the clinical evidence was included in 29 percent of the reports. In comparison, approximately half (54 percent) of the reports described the literature searches and 40 percent specified the inclusion and exclusion criteria used to retrieve the economic evidence. Although meta-analyses synthesizing the clinical evidence were conducted in 31 percent of these forty-eight reports, this result should be interpreted with caution, as in some cases a meta-analysis could not be performed because of a lack of clinical evidence (e.g., only one randomized clinical trial available).

Table 5. Assessment of Clinical and Economic Evidence per Agency (Total and by Agency)

Characteristics of Economic Evaluations

Study Design

Seventy-four EEs were conducted within the forty-eight reports: forty-three cost-effectiveness analyses (CEAs), twenty-six cost-utility analyses (CUAs), and five cost-minimization analyses (CMAs). The perspective of the studies was primarily that of the payer (e.g., Provincial Ministry of Health, hospital). Approximately 15 percent of the HTAs with an EE also included indirect costs in their evaluations.

Decision analytic techniques were used in approximately 75 percent of the EEs with models based on decision trees being more common than Markov models. The remaining 25 percent of the reports calculated expected costs, outcomes and associated cost-effectiveness measures outside of a decision analytic framework. Bayesian techniques were used once by one agency to synthesize the clinical evidence and to perform the EE. Another agency conducted field evaluations to collect data to determine the costs and effects associated with two new technologies.
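To make the distinction concrete, the following is a minimal sketch, not drawn from any of the reviewed HTAs, of the kind of Markov cohort model that underpins many of these decision analytic evaluations. All transition probabilities, costs, utility weights, and the discount rate are invented for illustration.

```python
# Illustrative sketch only: a minimal two-state ("well"/"dead") Markov cohort
# model comparing a hypothetical new technology with standard care. Every
# input below is an assumption made up for this example.

def run_markov(p_death, cost_per_cycle, utility, n_cycles=20, discount=0.05):
    """Return discounted total cost and QALYs for a cohort starting in 'well'."""
    alive = 1.0            # proportion of the cohort still in the 'well' state
    total_cost = 0.0
    total_qalys = 0.0
    for cycle in range(n_cycles):
        df = 1.0 / (1.0 + discount) ** cycle   # per-cycle discount factor
        total_cost += alive * cost_per_cycle * df
        total_qalys += alive * utility * df
        alive *= 1.0 - p_death                 # transition: well -> dead
    return total_cost, total_qalys

# Hypothetical inputs: the new technology lowers per-cycle mortality at a higher cost.
cost_std, qaly_std = run_markov(p_death=0.10, cost_per_cycle=1000, utility=0.75)
cost_new, qaly_new = run_markov(p_death=0.07, cost_per_cycle=1500, utility=0.80)

icer = (cost_new - cost_std) / (qaly_new - qaly_std)
print(f"Incremental cost per QALY gained: ${icer:,.0f}")
```

A decision tree would replace the cycle loop with a one-off branching of events, which is why trees tend to suit short time horizons while Markov models suit chronic or recurring conditions.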

Resource Utilization and Costing

As several reports used direct costing information for their evaluations rather than valuing resources consumed, the resource utilization data used to populate the models were not always reported. Administrative databases were the most commonly cited sources for resource utilization. The unit costs or costs used were relevant to the province of origin, with the exception of one agency (A1), which used data from Alberta, Nova Scotia, or Ontario, depending on the study. Overall, unit costs of resources were clearly stated in all reports, but only a minority of reports presented unit costs in a table.

Effectiveness Measures

Effectiveness measures were distributed among quality-adjusted life-years (QALYs) (n = 26), disease-specific measures (n = 31), and life-years gained (n = 17). In CUAs, utility data sources were documented 50 percent of the time. When reported, utilities were derived from published papers (n = 5), patients (n = 3), the general public (n = 3), or clinical opinion (n = 1). One study collected primary data on patients using the EuroQol-5 Dimension (EQ-5D) instrument to derive utilities.
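As an aside, the arithmetic linking utility weights to QALYs is straightforward; the minimal sketch below, with hypothetical utility weights (such as might be derived from EQ-5D responses) and durations, shows how QALYs discount life-years by quality of life.

```python
# Illustrative sketch only: computing QALYs from a hypothetical health profile.
# Each entry pairs a utility weight (1 = full health, 0 = dead) with the number
# of years assumed to be spent at that weight; none of these values come from
# the reviewed HTA reports.
health_profile = [
    (0.60, 0.5),   # (utility weight, years lived at that weight)
    (0.80, 2.0),
    (0.95, 1.5),
]

qalys = sum(utility * years for utility, years in health_profile)
life_years = sum(years for _, years in health_profile)

print(f"Life-years: {life_years:.1f}, QALYs: {qalys:.2f}")
# 4.0 life-years but only about 3.3 QALYs once quality of life is accounted for.
```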

Treatment of Uncertainty

Overall, univariate sensitivity analyses (SAs) were conducted in over two-thirds (69 percent) of the EEs and Monte Carlo simulations were used in 50 percent of the HTAs (Table 6). The uncertainty was plotted 38 percent of the time and cost-effectiveness acceptability curves (CEACs) were used in 29 percent of the reports. To help prioritize future research, value of information analysis was conducted by one agency on two occasions. However, these numbers should be interpreted with caution as one-quarter of the HTA reports with an EE did not use any decision analytic techniques. As a percentage of the number of studies using modeling techniques (n = 36), 67 percent used Monte Carlo simulations, 48 percent depicted the uncertainty, and 39 percent used CEACs.

Table 6. Methods to Deal with Uncertainty per Agency (Total and by Agency)

Note. CEACs, cost-effectiveness acceptability curves.
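For readers less familiar with these techniques, the following is a minimal sketch of a probabilistic sensitivity analysis and the cost-effectiveness acceptability curve derived from it. The distributions and parameter values are assumptions invented for illustration, not inputs from any of the reviewed reports.

```python
# Illustrative sketch only: Monte Carlo draws of incremental costs and QALYs,
# and the share of draws that are cost-effective at several willingness-to-pay
# thresholds (the points of a CEAC). All distributions are hypothetical.
import random

random.seed(1)
N_SIM = 5000
# Assumed parameter uncertainty: incremental cost ~ Normal(4000, 1500),
# incremental QALYs ~ Normal(0.10, 0.05).
sims = [(random.gauss(4000, 1500), random.gauss(0.10, 0.05)) for _ in range(N_SIM)]

for wtp in (20_000, 50_000, 100_000):   # willingness-to-pay per QALY (CAD)
    prob_ce = sum(1 for d_cost, d_qaly in sims if wtp * d_qaly - d_cost > 0) / N_SIM
    print(f"WTP ${wtp:,}/QALY: probability cost-effective = {prob_ce:.2f}")
```

Plotting the probability of being cost-effective against a fine grid of willingness-to-pay values yields the CEAC reported in many of these HTAs.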

Study Limitations

Two-thirds (67 percent) of the studies identified the limitations associated with the economic results, and 54 percent compared their results with other economic studies.

Other Components of HTA Reports

Budget impact analyses (BIAs) were conducted in 50 percent of the HTA reports with an EE and mostly from the perspective of the jurisdiction. Equity, social or legal issues were discussed in 31 percent of the reports and policy recommendations were provided in approximately 50 percent of the HTAs with an EE.
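A budget impact analysis generally combines an eligible population, an assumed uptake path, and the incremental cost per patient. The minimal sketch below shows one such calculation; all figures are hypothetical and not taken from the reviewed reports.

```python
# Illustrative sketch only: a simple budget impact projection from a payer's
# perspective. Eligible population, uptake, and per-patient costs are invented.
eligible_patients = 12_000              # patients per year who could receive the technology
uptake = [0.10, 0.25, 0.40]             # assumed adoption rates over three years
cost_new, cost_current = 8_500, 6_000   # per-patient annual cost (CAD)

for year, rate in enumerate(uptake, start=1):
    treated = eligible_patients * rate
    impact = treated * (cost_new - cost_current)
    print(f"Year {year}: {treated:,.0f} patients treated, incremental budget impact ${impact:,.0f}")
```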

Utilization and Dissemination of the Results

The results of the survey indicated that HTA reports were used by policy makers for making decisions regarding funding and health services organization. Three agencies (A1, A3, and A5) also mentioned that HTA reports were used for guideline formulation or background information. Levels of government using these HTA reports were either federal (A1), provincial or regional (A1, A2, A3, A5), local (A1, A3, A5), or institutional (A1, A2, A4, A5). At the time of the survey, all HTA reports conducted by these five agencies were systematically posted on the Web sites and distribution lists were used to disseminate results to federal or provincial agencies, hospitals, physicians and various other users. Private insurers and employers were rarely included in these distribution lists. One agency indicated that their HTA reports were posted on the International Network of Agencies for Health Technology Assessment (INAHTA) Web site.

Future Trends

When asked about future trends over the next 3 years, three agencies (A2, A3, A5) expected an increase in personnel and budget. Four agencies (A2, A3, A4, and A5) anticipated an increase in the number of HTA reports, while one agency (A1) predicted a decrease in its activities and a status quo regarding personnel and budget. Agencies had mixed feelings regarding increased collaboration with other provinces or research groups, the conduct of field evaluations, and potential participation of the public in HTAs.

Discussion

Despite many changes in the Canadian HTA landscape over the past few years, HTA activities have continued to increase in Canada. In this review, a total of 249 HTAs conducted by five agencies were identified for the period from 2001 to September 2007, which compares with 187 HTAs for the period 1995–2001 (8) and 60 HTAs for the period 1989–95 (10). It is also important to note that our search was aimed at five agencies known to conduct economic evaluations and, as such, our results underestimate the total production of HTAs in Canada. For example, more than thirty HTA reports without an economic evaluation were produced by AHFMR between 2001 and 2006 but were not included in our study. Similarly, we did not survey academic groups conducting systematic reviews of technologies, as our intent was to gain a better understanding of the Canadian capacity for conducting economic evaluations. Although not directly comparable, while Lehoux et al. (8) reported that 24 percent of HTAs included an EE for the period 1995–2001, we observed a smaller proportion in our sample for the period 2001–07 (i.e., 19 percent of all HTA reports), mainly due to a drop in the number of economic activities in 2004. However, more EEs were identified between January 2001 and September 2007 (n = 48) than during the period 1995–2001 (n = 45) (8). More than one-third of our sample evaluated drugs (35 percent), while the two previous assessments reported lower percentages (26 percent in Menon and Topfer (10) and 28 percent in Lehoux et al. (8)). Legal, social, and ethical issues also seemed to be more frequently discussed in 2001–07 (31 percent) than in 1995–2001 (20 percent). It is currently unknown whether these differences were due to different samples of studies (all HTAs, as in the two previous reviews, versus all HTAs with an EE, as in our study), to a change in the Canadian HTA environment (e.g., different agencies), or to other reasons. Because of the economic scope of this study, we did not analyze the 201 HTAs without an EE to directly compare our results with the two previous assessments conducted in Canada. Instead, we concentrated on the Canadian capacity to undertake EEs.

Our review of forty-eight HTAs with an EE indicated that CEAs were the preferred type of economic evaluation and that the perspective of the studies was mainly that of the payer. Primary data collection was rarely used, and models relied mostly on the literature and administrative databases to answer the clinical and economic questions. The results of this study also suggest a slight improvement in the methods used to retrieve the clinical evidence: Menon and Topfer (10) reported that 58 percent of the studies did not specify the inclusion/exclusion criteria used to retrieve the clinical evidence, compared with 33 percent in our sample. However, our results suggest that economic literature searches were less rigorously documented than clinical searches. Encouragingly, probabilistic sensitivity analyses (e.g., Monte Carlo simulations) were commonly used to deal with parameter uncertainty, and the majority of the studies identified the limitations associated with their EEs. The fact that not all HTA reports with an EE included a BIA or policy recommendations may reflect differences among the mandates of the respective agencies (e.g., federal, provincial, and local). As in the first two Canadian assessments (8;10), we noted differences between agencies in terms of production, selection of topics, focus of assessment (drug versus nondrug), and ways of conducting activities (in-house versus outsourced production).

As previously reported in Canada (7) and elsewhere (11), a lack of capacity and a lack of interest by decision makers were cited as barriers to conducting EEs. While not identified as a potential barrier in the survey, the time needed to complete an economic evaluation may also limit the conduct of economic studies, which could in turn explain the lack of widespread demand by decision makers. Because many models use input data from systematic literature reviews, an economic evaluation generally takes longer to complete than a clinical review of the evidence. In some cases, decision makers faced with considerable pressure may not be able to wait this extra time before making a decision. Another reason that could explain the limited use of EEs by decision makers may be the difficulty of understanding complex economic evaluations. As such, presenting the information to decision makers in a succinct and meaningful way may be very important; for example, the production of a brief ten-page document to provide advice to the Ministry is part of the Decision Process implemented in Alberta (1). Considering the recent advances in HTA methods and the growing role of EEs in decision making, it appears important to continue training decision makers and students in economic methods for the evaluation of healthcare programs. The recent collaborations between academia and government in Ontario and Alberta should also provide good examples of successful relationships between governments and the academic community (1;5). Another barrier to the conduct of EEs was the limited availability of robust clinical information. Although the conduct of field evaluations to “collect primary research data on new and experimental technologies where data needed for decision making is insufficient” is an integral part of the Canadian HTA strategy (6), only one province had endorsed this concept at the time of the study.

Several limitations associated with this study should be taken into consideration when interpreting the results. First, face-to-face or telephone interviews were not undertaken with HTA agencies to gain a deeper understanding of their capacity and constraints in undertaking HTAs with EEs. We also did not survey decision makers, patient advocacy groups, or the medical community regarding the use of HTA reports. As such, the results of this research should be interpreted with caution, especially with respect to the survey sent to agencies. While the survey respondents were key persons within each agency, their responses may not reflect the views and opinions of other employees. With respect to our assessment of the economic HTA reports, we did not use a quality assessment tool (3;4) to appraise the EEs but rather developed a more general abstraction form: our intent was not to judge the “quality” of the EEs but to describe their main characteristics. Although our sample size (n = 48) was similar to Menon and Topfer's (n = 60) (10), the low number of EEs for some agencies makes direct comparisons between agencies difficult. Finally, while the abstraction form was designed to minimize interpretation, there is always a risk that some studies may have been misclassified.

Despite these limitations, this research provides new information regarding the Canadian HTA capacity in conducting EEs. The results indicate that some provinces in Canada (Alberta, Ontario, and Quebec) continue to be very active in terms of HTA activities although less than one-fourth of the HTA reports included an EE. The research also points to the need for increased HTA training, collaboration, evidence synthesis, and use of pragmatic “real world” evaluations to reduce the uncertainty associated with new technologies characterized by limited evidence. As Canada has seen the emergence of new “academic” groups over the past 2 years (e.g., University of Alberta/Capital Health Evidence-based Practice Center; McMaster/Network of Excellence for the Assessment of Health Technologies; and University of Toronto/Toronto Health Economics and Technology Assessment Collaboration) and the creation of new hospital-based HTA units in Québec, it will be important to continue to monitor the production of Canadian HTAs with and without an EE. Of particular interest is the conduct of field evaluations to reduce uncertainty and the use of Bayesian techniques for evidence synthesis in the presence of limited information.

CONTACT INFORMATION

Jean-Eric Tarride, PhD, Assistant Professor; Catherine Elizabeth McCarron, MA, MSc, PhD Candidate; Morgan Lim, MA, PhD Candidate; James M. Bowen, BScPhm, MSc, Program Manager; Gord Blackhouse, BComm, MBA, MSc, Research Associate; Robert Hopkins, BA, BSc, MA, PhD Candidate, Research Biostatistician; Daria O'Reilly, PhD, Assistant Professor; Feng Xie, PhD, Assistant Professor; Ron Goeree, MA, Associate Professor, Department of Clinical Epidemiology & Biostatistics, McMaster University/Programs for Assessment of Technology in Health (PATH) Research Institute, St. Joseph's Healthcare Hamilton, 25 Main Street West, Suite 200, Hamilton, Ontario L8P 1H1, Canada

References

1. Borowski HZ, Brehaut J, Hailey D. Linking evidence from health technology assessments to policy and decision making: The Alberta model. Int J Technol Assess Health Care. 2007;23:155-161.
2. Canadian Agency for Drugs and Technologies in Health. Guidelines for the economic evaluation of health technologies: Canada. 3rd ed. Ottawa: Canadian Agency for Drugs and Technologies in Health; 2006. http://www.cadth.ca/media/pdf/186_EconomicGuidelines_e.pdf.
3. Drummond MF, Jefferson TO. Guidelines for authors and peer reviewers of economic submissions to the BMJ. The BMJ Economic Evaluation Working Party. BMJ. 1996;313:275-283.
4. Drummond M, Sculpher M, Torrance G, O'Brien B, Stoddart G. Methods for the economic evaluation of health care programmes. 3rd ed. Oxford: Oxford University Press; 2005.
5. Goeree R, Levin L. Building bridges between academic research and policy formulation: The PRUFE framework – an integral part of Ontario's evidence-based HTPA process. Pharmacoeconomics. 2006;24:1143-1156.
6. Health Canada. Health Technology Assessment Task Group. Health technology strategy 1.0 final report. June 2004. http://www.cadth.ca/media/corporate/planning_documents/health_tech_strategy_1.0_nov2004_e.pdf.
7. Hivon M, Lehoux P, Denis JL, Tailliez S. Use of health technology assessment in decision-making: Coresponsibility of users and producers? Int J Technol Assess Health Care. 2005;21:268-275.
8. Lehoux P, Tailliez S, Denis JL, Hivon M. Redefining health technology assessment in Canada: Diversification of products and contextualization of findings. Int J Technol Assess Health Care. 2004;20:325-336.
9. Levin L, Goeree R, Sikich N, et al. Establishing a comprehensive continuum from an evidentiary base to policy development for health technologies: The Ontario experience. Int J Technol Assess Health Care. 2007;23:299-309.
10. Menon D, Topfer LA. Health technology assessment in Canada. A decade in review. Int J Technol Assess Health Care. 2000;16:896-902.
11. van Velden ME, Severens JL, Novak A. Economic evaluations of healthcare programmes and decision-making: The influence of economic evaluations on different healthcare decision-making levels. Pharmacoeconomics. 2005;23:1075-1082.