
QUALITY OF HEALTH TECHNOLOGY ASSESSMENT REPORTS PREPARED FOR THE MEDICAL SERVICES ADVISORY COMMITTEE

Published online by Cambridge University Press:  03 October 2016

Martin Hua
Affiliation:
Sydney School of Public Health, University of Sydney (mhua8891@uni.sydney.edu.au)
Tristan Boonstra
Affiliation:
Royal Melbourne Hospital
Patrick J. Kelly
Affiliation:
Sydney School of Public Health, University of Sydney
Andrew Wilson
Affiliation:
Sydney School of Public Health, University of Sydney; Menzies Centre for Health Policy, University of Sydney
Jonathan C. Craig
Affiliation:
Sydney School of Public Health, University of Sydney
Angela C. Webster
Affiliation:
Sydney School of Public Health, University of Sydney; Centre for Kidney Research, The Children's Hospital at Westmead; Centre for Transplant and Renal Research, Westmead Hospital

Abstract

Objectives: The Medical Services Advisory Committee (MSAC) makes recommendations to the Australian Government for funding health technologies under the Medicare Benefits Schedule (MBS). Differences in public, clinical, commercial, and political opinions on health expenditure emphasize the importance of defensible funding decisions. We aimed to evaluate the quality of health technology assessment (HTA) reports over time and among health technologies assessed for MSAC.

Main Outcome Measures: A cohort study was performed of HTA reports prepared for MSAC between 1998 and 2013. We measured the quality of HTA reports using reporting guidelines proposed by the European Collaboration for Assessment of Health Interventions. Individual component scores across eleven domains were calculated, and summed for an overall aggregate score. We used linear regression to investigate any change in quality over time and among the types of technologies assessed.

Results: We included 110 HTA reports. The safety (80 percent), effectiveness (84 percent), economic (74 percent), and organizational (99 percent) domains were better reported than the psychological, social, and ethical considerations (34 percent). The basic (75 percent), methodological (62 percent), background (82 percent), contextual (46 percent), status quo (54 percent), and technical information (66 percent) that framed each assessment were inconsistently reported. On average, overall quality scores increased by 2 percent (p < 0.001) per year, from approximately 60 percent to 80 percent over the 15-year period, with no significant difference among surgical, diagnostic or other nonpharmaceutical health technologies (p = 0.22).

Conclusions: HTA reports prepared for MSAC are a key tool in allocating scarce health resources. The overall quality of these reports has improved, but the reporting of specific domains and subthemes therein could be better addressed.

Copyright © Cambridge University Press 2016

Health technology encompasses products used to prevent, diagnose, monitor, or treat health conditions. It has been estimated that new health technologies account for approximately half of the improvements in the length and quality of life of Australians (1). Health technology assessment (HTA) is a multidisciplinary field of policy analysis that studies the medical, social, ethical, and economic implications of health technology development and diffusion (2). In Australia, the Pharmaceutical Benefits Scheme and Medicare Benefits Schedule (MBS) require formal HTA as part of the consideration for public funding (3). With combined expenditure in excess of $27 billion for the 2012–13 financial year, the Pharmaceutical Benefits Scheme and MBS are among the Federal Government's most significant health financing arrangements (4–6). As such, HTA is a key tool in allocating scarce health resources (4).

In Australia, the Medical Services Advisory Committee (MSAC) advises the Minister for Health on the listing of MBS subsidies for health technologies other than pharmaceuticals and prosthetic devices (7). In doing so, MSAC's Terms of Reference require it to consider “the strength of evidence in relation to the comparative safety, effectiveness, cost-effectiveness and total cost of the medical service” under assessment (7). Applicants submit information about their technology to the Department of Health, which assesses its suitability for the MSAC process. Each application may cover one or more indications for the technology. Before 2010, MSAC would then commission an assessment of the medical service: guided by an expert panel, an HTA report was prepared by one of the Department's external contracted evaluation groups. The HTA report represents one input into MSAC's decision to recommend that the Minister support, or not support, permanent or interim funding. Since 2010, either the applicant or the Department initiates development of a decision analytic protocol, which is reviewed by the Protocol Advisory Sub-Committee, released for public consultation, and re-reviewed by the Sub-Committee before the applicant or an external evaluator undertakes the HTA.

The HTA reports prepared by external contracted evaluation groups, MSAC recommendations to the Minister, and public summary documents are made publicly available online (8). However, stakeholder perceptions of inconsistencies among evidence, clinical opinion, and MSAC recommendations persist (4;9–11). Examination of the quality of published HTA reports remains limited (12). The use of reporting guidelines can help improve documentation of the HTA process, enhancing transparency and allowing appraisal of quality and relevance (13). We used reporting guidelines proposed by the European Collaboration for Assessment of Health Interventions (14) to evaluate the quality of HTA reports prepared for MSAC. Our secondary objectives were to investigate any change in the quality of HTA reports over time, or among the health technologies assessed.

METHODS

Data Source

We analyzed HTA reports prepared for MSAC between 1998 and 2013, identified by searching the MSAC website. We excluded reports not published in full, as well as referrals from the Department, the latter because they do not originate from external applicants. As HTA reports are made publicly available only following a recommendation by MSAC, the majority of the evaluations reviewed here commenced before the reforms of 2010. Both before and after the 2010 reforms, the formal mandate for MSAC focuses on safety, effectiveness, and economic considerations (7;15).

Data Extraction

We abstracted data on the general characteristics of each HTA, including the technology and year of assessment. We evaluated completeness of reporting as a proxy for HTA quality (13), using reporting guidelines proposed by the European Collaboration for Assessment of Health Interventions (14). Completeness of reporting appraised eleven domains: basic information; general methodology; context; background information; the technology's current status; technical description; safety; effectiveness; psychological, social, and ethical considerations; organizational implications; and economic evaluation. Each domain comprised several subthemes, each scored dichotomously (addressed or absent). Data were abstracted using a standardized data form by one investigator (M.H.). A random sample of 25 percent of included HTA reports was checked in duplicate by a second investigator (T.B.), working independently but without blinding to HTA details. A third investigator was available for adjudication in cases of disagreement.

Outcomes and Data Analysis

For every included HTA report, a score was calculated for each domain as the average of its subtheme scores. We calculated an aggregate quality score for each report, with domains weighted equally, as an indicator of overall quality. Linear regression models were fitted using Stata 12 (StataCorp LP, College Station, TX) to investigate any change in domain and overall quality over time, and whether the type of health technology influenced quality. An interrater reliability analysis using the weighted Kappa statistic was performed to determine consistency among investigators.
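As a minimal sketch of the scoring scheme described above, each subtheme is scored 0 or 1, a domain score is the mean of its subtheme scores, and the aggregate is the equally weighted mean over domains, with unassessed domains excluded from the denominator. The subtheme values below are hypothetical, not the study data, and the abstraction form itself is not reproduced here.

```python
# Sketch of the quality-scoring scheme: dichotomous subtheme scores
# (1 = addressed, 0 = absent) are averaged within each domain, and the
# aggregate report score is the unweighted mean of the domain scores.
# Domain names follow the paper; the subtheme values are hypothetical.

def domain_score(subthemes):
    """Average of dichotomous subtheme scores for one domain."""
    return sum(subthemes) / len(subthemes)

def aggregate_score(domains):
    """Equally weighted mean of domain scores. Domains not assessed
    (e.g. an explicitly omitted economic evaluation, marked None)
    are excluded from the denominator."""
    scores = [domain_score(s) for s in domains.values() if s is not None]
    return sum(scores) / len(scores)

report = {
    "basic information": [1, 1, 1, 0],
    "general methodology": [1, 0, 1, 1, 0, 1, 1],
    "safety": [1, 1, 0, 1],
    "economic evaluation": None,  # explicitly not conducted: excluded
}

print(round(aggregate_score(report), 3))
```

A report scoring 1 on every assessed subtheme would thus receive an aggregate score of 1.0 (100 percent), matching the percentage scale used in the Results.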

RESULTS

We identified 110 HTA reports for inclusion, conducted across surgical, diagnostic, and other nonpharmaceutical health technologies. Table 1 summarizes the characteristics of these reports. Because of the time taken between lodgement of an application and the MSAC recommendation, the last HTA report included was for an application lodged in 2010. Economic evaluations were explicitly not conducted in nineteen HTA reports; as we aimed to assess only the quality of reporting, this domain was excluded from the denominator in those cases. Otherwise, all domains were considered. Table 2 shows the quality of domain reporting and highlights its variability. The domain best addressed across all HTA reports was organizational implications (99 percent), whereas the worst was the psychological, social, ethical, and equity considerations (34 percent). Figure 1 shows the relationship between time and quality for reports that addressed all eleven domains (n = 91/110). The weighted Kappa for the ratings by the two investigators was 0.677 (95 percent confidence interval [CI], 0.617–0.751; p < 0.001).
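For dichotomous ratings such as the addressed/absent subtheme scores used here, a linearly weighted Kappa coincides with the unweighted Cohen's Kappa, so the unweighted form can be sketched directly. The rating vectors below are hypothetical illustrations, not the study data.

```python
# Sketch of interrater agreement on dichotomous subtheme ratings.
# With only two categories, a linearly weighted Kappa equals the
# unweighted Cohen's Kappa, so the unweighted form is computed here.
# The rating vectors are hypothetical, not the study data.

def cohens_kappa(r1, r2):
    """Cohen's Kappa for two raters' categorical ratings."""
    n = len(r1)
    categories = sorted(set(r1) | set(r2))
    # Observed agreement: proportion of items rated identically.
    p_o = sum(a == b for a, b in zip(r1, r2)) / n
    # Expected chance agreement from each rater's marginal distribution.
    p_e = sum((r1.count(c) / n) * (r2.count(c) / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

rater1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
rater2 = [1, 0, 0, 1, 0, 1, 1, 1, 1, 1]
print(round(cohens_kappa(rater1, rater2), 3))
```

Values between 0.61 and 0.80 are conventionally read as substantial agreement, which is how the reported 0.677 is interpreted in the Discussion.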

Table 1. Characteristics and Outcomes of HTA Prepared for MSAC

aFor the purposes of this study, surgery has been defined as the application of a manual or instrumental technique in the treatment of disease, injury or deformity.

bFor the purposes of this study, a therapeutic intervention has been defined as a non-surgical intervention in the treatment of disease, injury or deformity.

cThe number of applications can include resubmissions following a previous negative MSAC outcome and/or reviews following a recommendation of interim funding.

dExisting items listed on the MBS include those under interim funding arrangements and/or where some or all parts of a technology are listed under different indications.

eApplications are made per technology, but can be for multiple indications.

fMSAC may make this recommendation where some or all components of a technology are already listed on the MBS.

Table 2. Quality of HTA Reports Prepared for MSAC by Domain

aDenominator is 110 assessment reports, except for subthemes within the ‘Economic Evaluation’ domain, for which the denominator is 91.

Figure 1. The quality of HTA reports prepared for MSAC over time.

Basic Information

Reporting on authorship (99 percent; 109/110), funding (98 percent; 108/110), and expert review (97 percent; 107/110) assists readers in assessing the credibility of HTA reports (16–19). Acknowledging any potential conflicting interests of MSAC members within the HTA report may reduce any perceived risk of bias (12;16;17), but only 4 percent (4/110) of HTA reports made any such statement.

General Methodological Aspects

The general methodological aspects of HTA were inconsistently reported, but improved over time (p < 0.001). A clearly documented, predefined HTA protocol is considered central to transparency (14;17). Before the introduction of the decision analytic protocol, only one HTA referred to the use of a protocol. Other less well reported subthemes included research questions (47 percent; 52/110), scope of assessment (67 percent; 74/110), and reasons for aspects not assessed (0 percent; 0/110). The sources (99 percent; 109/110), selection criteria (96 percent; 106/110), appraisal of included information (100 percent; 110/110), and generalizability or transferability of results (80 percent; 88/110) were better reported.

Context of the Assessment

Context may guide scope and methodology. As such, reporting of policy reasons (39 percent; 43/110), applicants (35 percent; 38/110), and timing (14 percent; 15/110) supports assessment of comprehensiveness, validity, and relevance (14). These subthemes were poorly documented, but overall domain quality improved across time (p < 0.001).

Background Information

Background information provides the nexus between policy and research questions (14). The targeted conditions (84 percent; 92/110), target groups (91 percent; 100/110), and health technologies (88 percent; 97/110) were well reported. The outcomes assessed were adequately documented in only 66 percent (72/110) of HTA reports. The quality of this domain improved over time (p < 0.001).

Status quo and Technical Characteristics of the Technology

Reporting a technology's position within its lifecycle provides the context to assess clinical relevance and validity (14;17). Indications (96 percent; 105/110) and regulatory status (97 percent; 107/110) were well reported, but usage (22 percent; 24/110), diffusion (26 percent; 28/110), and time trends (16 percent; 18/110) were less so. This domain improved marginally over time (p = 0.07). Technical characteristics that directly affect effectiveness were described in 66 percent (72/110) of reports.

Safety and Effectiveness

A documented systematic approach to assessing safety and effectiveness allows users to judge an HTA report's clinical relevance and potential for bias (14;16;17). Data sources used in assessing safety were stated in 83 percent (91/110) of reports, and 87 percent (96/110) stated the selection criteria. However, only 59 percent (65/110) outlined a transparent assessment of the validity or quality of safety data, and 70 percent (77/110) transparently presented results through tables, graphs, or meta-analysis plots. In contrast, every report documented a systematic literature search for effectiveness, reporting search strategies, data sources, and years considered. Inclusion and exclusion criteria were defined in 96 percent (106/110) of reports, 99 percent (109/110) documented the checking of quality or validity, and 93 percent (102/110) transparently presented results. The method for data extraction and a list of excluded studies were included in 65 percent (71/110) of reports. Whereas the quality of safety reporting did not improve with time (p = 0.18), the quality of effectiveness reporting did (p < 0.001).

Organizational Implications

A health technology's organizational impact should be considered in an HTA, given its implications for stakeholders including staff, healthcare providers, and patients (14;17). Of the HTA reports assessed, 99 percent (109/110) discussed issues such as personnel requirements, regulation, and clinical need.

Psychological, Social, Ethical, and Equity Considerations

Transparent representation of stakeholder perspectives safeguards against perceived bias in an HTA (14;17). For the public, assessing psychological (43 percent; 47/110), social (47 percent; 52/110), and ethical implications (14 percent; 15/110) would be beneficial (14), but documentation was limited despite improving over time (p = 0.001). Many HTA reports noted that MSAC will take “into account other issues such as access and equity”, but only 32 percent (35/110) made reference to the actual equity implications of the technology.

Economic Evaluation

Economic evaluations were conducted in 83 percent (91/110) of the HTA reports. The remaining reports explicitly documented the exclusion of economic considerations, for example that the “cost-effectiveness of the procedure is not possible to assess due to a lack of strong evidence on clinical effectiveness and safety” (15). A documented systematic approach fosters impartiality (16) and is imperative, as financial impacts are perceived as significant determinants of public funding (20). Where assessed, 99 percent (90/91) of reports documented the type of economic evaluation used, but only 68 percent (62/91) documented a systematic literature search. The evaluation perspective (63 percent; 57/91), justification of assumptions (63 percent; 57/91), and transferability of analysis (76 percent; 69/91) were also less frequently reported. This domain improved significantly across time (p < 0.001).

Variability in Quality over Time and among Technology Types

Figure 1 shows the relationship between time and quality for reports that addressed all eleven domains (n = 91/110). The degree of change in each domain was variable. Overall quality scores improved on average by 2 percent per year (p < 0.001), from approximately 60 percent to 80 percent, with no significant difference among the three technological categories (p = 0.22). Figure 2 illustrates the differences in domain scores by technology type, which were generally minimal, although safety reporting was noticeably better among surgical technologies.
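The trend estimate above is an ordinary least-squares slope of aggregate quality on application year. A minimal sketch, using synthetic data constructed to rise roughly two percentage points per year as reported (not the study data):

```python
# Sketch of the time-trend estimate: ordinary least-squares slope of
# aggregate quality score (percent) on application year. The data
# below are synthetic, built to rise exactly 2 points per year.

def ols_slope(x, y):
    """Least-squares slope of y regressed on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # Slope = covariance(x, y) / variance(x), in summed (unscaled) form.
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

years = [1998, 2001, 2004, 2007, 2010, 2013]
scores = [60, 66, 72, 78, 84, 90]  # synthetic: exactly +2 per year

print(ols_slope(years, scores))  # → 2.0
```

With real scores the points scatter around the fitted line, and the reported p-value and confidence interval would come from the standard error of this slope, as produced by the Stata models described in the Methods.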

Figure 2. The quality of HTA reports prepared for MSAC, by quality domain, stratified by surgical, diagnostic, and non-surgical therapeutic technologies.

DISCUSSION

Stakeholders in Australia have voiced concerns over perceived inconsistencies between evidence, expert opinion, and MBS funding decisions (4;9), and have raised questions of political pressure in decision making (10;18). This emphasizes the importance of defensible funding decisions, but research on the quality of HTA reports prepared for MSAC is limited. The evaluation by Petherick et al. (12) of the methodology used in assessing effectiveness provides some basis for comparison. The authors predicted improvements in the overall quality of HTA reporting with the emergence of best practice guidelines. As such, we used guidelines proposed by the European Collaboration for Assessment of Health Interventions (14) as best practice for completeness in reporting to assess quality. Whereas the average overall quality scores for these reports increased by 2 percent (p < 0.001) per year, the reporting of specific domains and subthemes therein requires continued improvement.

The robust documentation of effectiveness provides confidence of clinical impartiality, but the quality of safety reporting was more limited and did not improve with time (p = 0.18). This may be because the Therapeutic Goods Administration assesses the safety of new medical technologies as a precursor to MSAC's HTA process (4), yet safety is nonetheless a central tenet of MSAC's Terms of Reference (7). Improvements in the quality of safety reporting may thus help to reduce any perceived inconsistencies in MSAC's recommendations or rationales. Strong qualitative themes within a standard template for HTA reports may also provide reassurance that stakeholder perspectives have been considered in decision making (17).

For the public, reporting of psychological, social, and ethical implications is integral to the perception of unbiased and informed decision making (14). However, the formal mandate for MSAC focuses on safety, effectiveness, and economic considerations (8). As such, despite a significant increase over time (p < 0.001), there were limited references to psychological (43 percent; 47/110), social (47 percent; 52/110), and ethical implications (14 percent; 15/110). This may reflect the published evidence on which HTA are based, but the current mandate may limit stakeholder support for recommendations made by MSAC.

Implementation of an HTA may be hindered if its contextual and methodological aspects are unclear and at risk of bias (18;21). The Australian Government has noted that “transparency is paramount to ensuring equity, fairness, and most importantly, scientific rigor on which healthcare decisions are based” (4). Yet the poor reporting of contextual information such as policy reasons, applicants, and the timing of HTAs has perhaps contributed to some MSAC stakeholders feeling that “the currently opaque arrangements reduce confidence in what might actually be sound outcomes” (4). A systematic template for reporting may address this, but the results show that before the 2010 reforms, clear protocols, policy questions, and research questions were infrequently reported. These influence scope and methodology, and their limited reporting impedes the assessment of comprehensiveness, validity, and relevance. The examination by Ware et al. (11) of the deliberations over funding of positron emission tomography demonstrates how such shortcomings contribute to a perception of bias.

The Australian Government has recognized HTA as an overarching determinant of policy and practice (1), with diffusion centered on the process of communication. As HTA reports are key to the dissemination of MSAC's recommendations and rationale (7), any perceived bias left unaddressed in a report may have a negative influence on the technology's diffusion. Acknowledging potential conflicting interests may mitigate perceptions of bias (12;16;17), but only 4 percent (4/110) of the HTA reports assessed made any such statement. Petherick et al. (12) noted similar concerns, and some stakeholders believe that the identity of evaluators and advisors should be made public as an essential component of natural justice (4). This is particularly pertinent given well-documented stakeholder concerns of inconsistency between evidence and MSAC's conclusions (4;9–11), and a lingering perception of political pressure in decision making (10;18). Given that all MSAC members, sub-committees, and consultants must declare conflicts of interest, these declarations should be routinely published in HTA reports to bolster confidence in the HTA enterprise.

Due to the time taken to perform an HTA, most of the reports reviewed were for HTA undertaken before the 2010 MSAC reforms. Among these reforms, MSAC aims to increase transparency through the use of decision analytic protocols (22), which may improve reporting, especially of methodological aspects such as protocols, policy questions, and research questions. Similarly, submission-based assessments offer applicants an alternative to MSAC-commissioned assessments (22), but require conformity to a protocol. The introduction of an MSAC report template to facilitate this indicates ongoing attempts to improve the quality and consistency of HTA reports. These are relatively recent measures, with the initial discussion papers made publicly available in 2010. Most of the HTA reports included here were prepared for applications received by the MSAC Secretariat before 2010 (96 percent; 106/110). Submission-based assessment reports have generally not been published at this time, contributing to the notable reduction in HTA reports assessed after 2009. Should they be made available for comparison, our results will be useful as a baseline for monitoring resulting changes in the quality of HTA reports.

We limited this study to HTA reports published online up to March 2013. For feasibility reasons, we cross-checked data for only 25 percent of included HTA reports using a second independent investigator. The weighted Kappa for the ratings by the two investigators indicates substantial agreement, but differences arose through reader interpretation. We used completeness of reporting, based on proposed best practice guidelines, as a proxy for the quality of the HTA process (13). A critique may be made of both this approach and the choice of guidelines, contending that it unnecessarily imposes a “one-size-fits-all” approach onto a unique agency. We have also assumed that all domains are equally important, and all subthemes of equal importance within each domain, which may not be true (23;24). Furthermore, the variability of subtheme reporting within domains may limit the value of domain quality scores. However, in the absence of any previous analysis, the methodology adopted provides a starting point for assessing HTA report quality. Our purpose is not to prescribe a generic guideline for the preparation of HTA reports for MSAC consideration. Rather, the results are aimed at providing an indication of overall quality for discussion purposes, and this discussion may serve as a guide for future refinements.

Finally, although this study focused on HTA reports prepared for MSAC in Australia, it is also the first study to use peer-reviewed best practice guidelines to evaluate the overall quality of health technology assessment reports prepared by a national HTA agency. The Australian HTA process is comparable to that of agencies such as the National Institute for Health and Clinical Excellence and the Canadian Agency for Drugs and Technologies in Health (4), but in Australia, there is more perceived procedural transparency (4). As such, it is hoped that this study prompts similar undertakings globally, with the issues highlighted here assisting other national agencies to improve their HTA reporting.

Conclusions and Policy Implications

The quality of HTA reports prepared for MSAC has several implications for the defensibility and implementation of MBS funding. This study has highlighted variability in the quality of these reports. Although quality has increased, specific domains and subthemes therein require continued improvement. We have established a baseline quality of HTA reports prepared for MSAC, a benchmark against which future improvement can be measured.

CONFLICTS OF INTEREST

Prof Wilson reports personal committee fees from the Australian Department of Health, outside the submitted work; and is current Chair, Protocol Advisory Sub Committee of MSAC, and current Chair, Pharmaceutical Benefits Advisory Committee of Australian Department of Health. No other financial support or conflict of interests were reported by the authors.


REFERENCES

1. The Australian Government Productivity Commission. Impacts of advances in medical technology in Australia. [Online]. Melbourne; 2005. http://www.pc.gov.au/projects/study/medical-technology/docs/finalreport (accessed February 23, 2014).
2. International Network of Agencies for Health Technology Assessment. The International Network of Agencies for Health Technology Assessment. [Online]; 2013. http://www.inahta.net/hta (accessed January 10, 2013).
3. The Australian Government Department of Health and Ageing. Health technology assessment. [Online]. http://www.health.gov.au/internet/hta/publishing.nsf/Content/reimbursement-1 (accessed February 23, 2014).
4. Department of Health and Ageing. Review of health technology assessment in Australia. Canberra: Australian Government, Department of Health and Ageing; 2009.
5. Hailey D. The history of health technology assessment in Australia. Int J Technol Assess Health Care. 2009;25(Suppl 1):61–67.
6. The Australian Government Department of Health. Australian government 2012-13 health and ageing portfolio additional estimates statements. [Online]; 2013. http://www.health.gov.au/internet/budget/publishing.nsf/Content/F9D011B7B4DD0C0ECA257B09000BCBD2/$File/2012-13%20PAES%204%20Feb%202013%2017.08pm.pdf.v (accessed February 23, 2014).
7. Medical Services Advisory Committee. Medical Services Advisory Committee. [Online]; 2014. http://www.msac.gov.au/internet/msac/publishing.nsf/Content/about-us-lp-1 (accessed February 23, 2014).
8. Medical Services Advisory Committee. Medical Services Advisory Committee (MSAC) terms of reference. [Online]; 2010. http://www.msac.gov.au/internet/msac/publishing.nsf/Content/msac-tor-1 (accessed February 23, 2014).
9. Medical Services Advisory Committee. Report of a review of the Medical Services Advisory Committee. Canberra: The Australian Government, Department of Health; 2005.
10. Van Der Weyden MB, Armstrong RM. Evidence and Australian health policy. Med J Aust. 2004;180:607–608.
11. Ware RE, Francis HW, Read KE. The Australian government's review of positron emission tomography: Evidence-based policy-making in action. Med J Aust. 2004;180:627–632.
12. Petherick ES, Villanueva EV, Bryan EJ, Dharmage S. An evaluation of methods used in health technology assessments produced for the Medical Services Advisory Committee. Med J Aust. 2007;187:289–292.
13. Altman DG, Simera I, Hoey J, Moher D, Schulz K. EQUATOR: Reporting guidelines for health research. Lancet. 2008;371:1149–1150.
14. Busse R, Orvain J, Velasco M, et al. Best practice in undertaking and reporting health technology assessments. Working group 4 report. Int J Technol Assess Health Care. 2002;18:361–422.
15. Medical Services Advisory Committee. Placement of artificial bowel sphincters in the management of faecal incontinence. MSAC application 1023. Canberra: Australian Government, Department of Health and Aged Care; 1999.
16. Hailey D. Toward transparency in health technology assessment. Int J Technol Assess Health Care. 2003;19:1–7.
17. Liberati A, Sheldon TA, Banta HD. EUR-ASSESS Project Subgroup Report on Methodology: Methodological guidance for the conduct of health technology assessment. Int J Technol Assess Health Care. 1997;13:186–219.
18. Gallego G, Casey R, Norman R, Goodall S. Introduction and uptake of new medical technologies in the Australian health care system: A qualitative study. Health Policy. 2011;102:152–158.
19. Johri M, Lehoux P. The Great Escape? Prospects for regulating access to technology through health technology assessment. Int J Technol Assess Health Care. 2003;19:179–193.
20. Chim L, Kelly PJ, Salkeld G, Stockler MR. Are cancer drugs less likely to be recommended for listing by the Pharmaceutical Benefits Advisory Committee in Australia? Pharmacoeconomics. 2010;28:463–475.
21. Drummond M, Weatherly H. Implementing the findings of health technology assessments: If the CAT got out of the bag, can the TAIL wag the dog? Int J Technol Assess Health Care. 2000;16:1–12.
22. Department of Health and Ageing. Proposal for changes to the Medical Services Advisory Committee processes for applications for public funding. Canberra: Australian Government, Department of Health and Ageing; 2010.
23. Drummond M, Neumann P, Jonsson B, Luce B. Can we reliably benchmark health technology assessment organizations? Int J Technol Assess Health Care. 2012;28:159–165.
24. International Working Group for HTA Advancement; Neumann PJ, Drummond MF, et al. Are key principles for improved health technology assessment supported and used by health technology assessment organizations? Int J Technol Assess Health Care. 2010;26:71–78.