
IMPACT OF HEALTH TECHNOLOGY ASSESSMENT REPORTS ON HOSPITAL DECISION MAKERS – 10-YEAR INSIGHT FROM A HOSPITAL UNIT IN SHERBROOKE, CANADA

Published online by Cambridge University Press:  19 July 2018

Thomas G. Poder
Affiliation:
UETMIS, CIUSSS de l'Estrie - CHUS, Sherbrooke; Department of Family Medicine and Emergency Medicine, Faculty of Medicine and Health Sciences, University of Sherbrooke; Centre de recherche du Centre hospitalier universitaire de Sherbrooke (CRCHUS); Department of Economics, École de Gestion, University of Sherbrooke. Email: thomas.poder@usherbrooke.ca
Christian A. Bellemare
Affiliation:
UETMIS, CIUSSS de l'Estrie - CHUS, Sherbrooke
Suzanne K. Bédard
Affiliation:
UETMIS, CIUSSS de l'Estrie - CHUS, Sherbrooke
Jean-François Fisette
Affiliation:
UETMIS, CIUSSS de l'Estrie - CHUS, Sherbrooke
Pierre Dagenais
Affiliation:
UETMIS, CIUSSS de l'Estrie - CHUS, Sherbrooke, Centre de recherche du Centre hospitalier universitaire de Sherbrooke (CRCHUS), Department of Medicine, Faculty of Medicine and Health Sciences, University of Sherbrooke

Abstract

Objectives:

The overarching goal of this research was to (i) evaluate the impact of reports with recommendations provided by a hospital-based health technology assessment (HB-HTA) unit on the local hospital decision-making processes and implementation activities and (ii) identify the underlying factors of the nonimplementation of recommendations.

Methods:

All reports produced by the HB-HTA unit between December 2003 and March 2013 were retrieved, and hospital decision makers who requested these reports were solicited for enrolment. Participants were interviewed using a mixed design survey.

Results:

Twenty reports, associated with fifteen decision makers, fulfilled the study criteria. Nine decision makers agreed to participate, corresponding to thirteen reports and twenty-three recommendations. Of the twenty-three recommendations issued, 65 percent were implemented, 9 percent were accepted for implementation but not implemented, and 26 percent were declined. In terms of the utility of each report to guide decision makers, 92 percent of the reports were considered in the decision-making process; 85 percent had one or more recommendations adopted; and 77 percent had recommendations implemented. The most frequently mentioned reasons for nonimplementation were related to contextual factors (64 percent), production/diffusion process factors (14 percent), content/format factors (14 percent), or other factors (9 percent). The contextual factors cited included the complexity of the changes (i.e., administrative reasons), budget and resource constraints, failure to identify the administrative responsibility for carrying out the recommendation, and the nonpriority status of the HTA recommendation.

Conclusions:

This study highlights that although HB-HTA reports are useful to hospital managers in their decision-making processes, certain barriers such as contextual factors need to be better addressed to improve HB-HTA efficiency and usefulness.

Type: Assessment
Copyright © Cambridge University Press 2018

In a global context of scarce funds and rapidly growing medical expenditures, there is an urgent need to provide decision makers with adequate information on policy alternatives so that they can make wiser decisions regarding the coverage of new health technologies (Reference Banta1). In this context, health technology assessment (HTA) was developed to assess the balance between the benefits/consequences of technology implementation and its costs/risks (Reference Abelson, Giacomini and Lehoux2).

HTA first emerged at the macro level to guide national or international decisions (Reference Banta1). As one example, the Medical Services Advisory Committee was created in 1998 to make recommendations to the Australian Government for funding health technologies and is now recognized as a key tool in allocating scarce health resources (Reference Hua, Boonstra and Kelly3). Later, HTA was implemented at the micro level, and recent reports suggest that hospital-based HTA (HB-HTA) units provide meaningful benefits, with the public, the staff, and the managers being the primary beneficiaries of these measures (Reference Gagnon, Desmartis and Poder4–Reference Hailey8).

Building on these international recommendations, the province of Quebec in Canada pioneered the first network of HB-HTA units (9). In the early 1990s, the Quebec Ministry of Health and Social Services legislated the assessment of health technology as one of the four mandatory missions of university health-care centers (Reference Gagnon, Ben AbdelJelil and Desmartins10), thus positioning the province as a world leader.

Since then, and particularly in the 2000s, the assessment of health technologies and services became organized into many local units throughout the province of Quebec. In accordance with the ministry's policies, and with the aim of strengthening HTA-related performance appraisal in tertiary hospital centers, the Centre hospitalier universitaire de Sherbrooke (CHUS) officially created the Health Services and Technology Assessment Unit (UETMIS in French) in 2004 (Reference Bellemare, Fisette, Poder, Sampietro-Colom and Martin11). The principal mandate of the UETMIS was to support and guide CHUS managers in decision making, using the highest level of evidence to promote the efficient use of resources and to improve the health care and services provided to patients.

Following the work conducted by Lafortune et al. (Reference Lafortune, Farand and Mondou12), the framework guiding the production process adopted by the UETMIS is largely inspired by Parsons' production function (Reference Parsons13), with the exception that we also integrated key elements from the conceptual framework for quality of care developed by Donabedian (Reference Donabedian14;Reference Donabedian15). In this framework, the HB-HTA unit sits at the crossroads between decision makers' queries (input) and a continuum of recommendation guidance (output) that runs from the decision-making process through implementation to the observation of the beneficial impact of these recommendations on patient health care (Figure 1). This kind of approach has proven relevant in many domains, including HTA (Reference Steuten, Vrijhoef and Severens16;Reference McDonald, Sundaram and Bravata17). Finally, it should be kept in mind that recommendations produced by a local HB-HTA unit must be evaluated in the context of the unique institution for which they were requested, as this factor is the one that most influences implementation (Reference Busse, Orvain and Velasco18).

Figure 1. Framework guiding the production process adopted by the UETMIS (Health Services and Technology Assessment Unit).

Following up on a 10-year insight of UETMIS activities, the overarching goal of this research was to (i) evaluate the impact of reports with recommendations on the local hospital decision-making processes and implementation activities and (ii) identify the underlying factors of the nonimplementation of recommendations.

METHODS

Background Review

A rapid review of the literature was conducted in January 2013 using PubMed and Google Scholar. The aim was to identify frameworks or tools developed to assess the impact of HB-HTA units. At that time, neither a specific framework nor a validated tool was identified. This was later confirmed by the systematic review conducted by Gagnon et al. (Reference Gagnon, Desmartis and Poder4), with the exception of the multidimensional framework developed by Schumacher and Zechmeister (Reference Schumacher and Zechmeister19), which was published a few months after we started this study. However, some studies assessing the impact of HTA activities were helpful in identifying items of importance for the design of the survey. These studies were primarily those of Hailey (Reference Hailey20), Bodeau-Livinec et al. (Reference Bodeau-Livinec, Simon and Montagnier-Petrissans21), Gerhardus et al. (Reference Gerhardus, Dorendorf, Rottingen, Garrido, Kristensen and Nielsen22), and McGregor (Reference McGregor23).

Study Design

A mixed design survey combining both open- and closed-ended questions was conducted face-to-face by an experienced interviewer not involved in the HB-HTA process. The survey (Supplementary File S1) was developed based on the rapid review described above. A first draft was generated and then validated internally by the UETMIS staff. An external validation was then conducted by external HB-HTA managers to ensure the consistency and clarity of the statements. Answers were either binary, framed on a Likert scale, or open-ended (i.e., qualitative data).

The impact of recommendations was assessed based on an eleven-level scale (Box 1), divided into six categories and adapted from the framework described earlier (Figure 1) with elements and concepts proposed by Gerhardus et al. (Reference Gerhardus, Dorendorf, Rottingen, Garrido, Kristensen and Nielsen22) and Hailey (Reference Hailey20).

Box 1. Levels of impact of recommendations

In cases where recommendations were not implemented, participants were asked to designate the applicable suggested reasons, which were grouped into four categories, as previously validated by Gerhardus et al. (Reference Gerhardus, Dorendorf, Rottingen, Garrido, Kristensen and Nielsen22): contextual factors (eleven items), factors related to report production/diffusion (six items), factors related to report content and/or format (four items), and other factors (two items). All items were identified in the literature or through discussions with HB-HTA specialists and managers.

Participant Recruitment

All reports produced by the UETMIS between December 2003 and March 2013 were retrieved (Supplementary File S2). Hospital decision makers who requested these reports were eligible and invited by email in June 2013 to participate in a semi-structured interview (the corresponding report and associated recommendations, the cover letter, the survey and the consent form were attached to the email). One week after email reception, a phone call was placed to set an interview date. This study was approved by the CHUS Human Ethics Board Committee (#2014-607, 13-072), and participants signed the informed consent form before participating in the interview.

Analyses

The primary objective of the study was to assess the utility of the HB-HTA reports and recommendations as perceived by managers in the decision-making process (i.e., the mandate given to the UETMIS by the hospital). The primary outcome was defined as the percentage of recommendations issued by the UETMIS that were accepted (i.e., the criterion of “acceptance”) by the decision makers (cf. question 2 in Supplementary File S1). The secondary objectives addressed the levels of impact observed following transmission of the reports and the factors impeding subsequent implementation. These secondary outcomes were defined as the percentage of recommendations reaching each level of impact, whether recommendations were implemented, and the reasons for nonimplementation according to the decision makers.

Qualitative data (i.e., open-ended questions) were analyzed using the content analysis method (Reference Berelson and Glencoe24). Interviews were designed to let participants elaborate on their answers without being influenced by the interviewer. The interviewer produced a written summary of each participant's objective statements, impressions/feelings, and highlights, grouping the participant's answers with the interviewer's notes (Reference Silverman25). In cases where verbal responses were sparse, the participant's attitude, behavior, or gestures were noted and transcribed using a restitution approach. Qualitative data were coded into a mixed evaluation grid, and subsequent thorough empirical analyses were conducted based on a seven-step scale (Reference Andreani and Conchon26).

RESULTS

Between December 2003 and March 2013, twenty-one reports highlighting specific recommendations were issued by the UETMIS. Of these twenty-one reports, one requested by the UETMIS manager was excluded to avoid bias. The twenty remaining reports were associated with fifteen decision makers, considering that four of them requested more than one report. Nine decision makers (60 percent) agreed to participate, which allowed the evaluation of twenty-three recommendations from thirteen reports. The other six decision makers declined to participate because of lack of time (n = 1); they were no longer in that position or retired (n = 4); or because they did not remember the content of the report (n = 1). The interviews were conducted between August and October 2013.

Results indicated that all reports and recommendations were considered useful in the decision-making process (i.e., acceptance). Of the twenty-three recommendations, fifteen were implemented (65 percent); two were accepted for implementation but were ultimately not implemented (9 percent); and six were declined (26 percent). The eleven levels of impact, according to the six evaluation criteria used for assessment, are described in Table 1.
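
The reported proportions follow directly from the raw recommendation counts. The short Python sketch below is purely illustrative and is not the authors' analysis code: it tallies hypothetical per-recommendation status labels reconstructed from the counts reported above and prints the corresponding percentages.

```python
# Minimal sketch of the outcome tally; the status labels are stand-ins for the
# per-recommendation survey data, reconstructed from the counts reported above.
from collections import Counter

statuses = (
    ["implemented"] * 15
    + ["accepted but not implemented"] * 2
    + ["declined"] * 6
)

counts = Counter(statuses)
total = len(statuses)  # 23 recommendations

for status in ("implemented", "accepted but not implemented", "declined"):
    print(f"{status}: {counts[status]}/{total} = {counts[status] / total:.0%}")
# implemented: 15/23 = 65%
# accepted but not implemented: 2/23 = 9%
# declined: 6/23 = 26%
```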

Table 1. Number of Recommendations and Impact Level According to Each Criterion

* Not evaluated: the decision maker did not know, or the answer was not applicable.

We observed that twelve of the thirteen reports (92 percent) were considered in the decision-making process (i.e., appropriation); eleven of the thirteen reports (85 percent) had one or more recommendations adopted; and ten (77 percent) had recommendations implemented, whereas seven of them (54 percent) had positive impacts on the efficient use of the resources (Table 2). Regarding the utility of the UETMIS reports and recommendations in the decision-making process, 100 percent of the decision makers found they were useful, with 54 percent of the respondents stating that these recommendations were highly useful, while 46 percent agreed that recommendations were partially useful.

Table 2. Number of Reports and Impact Level According to Each Criterion

a Not evaluated: the decision maker did not know, or the answer was not applicable.

We also analyzed the extent to which the nature of the recommendations influenced their implementation. Among the twenty-three recommendations, sixteen (70 percent) related to a device or an aspect of the hospital infrastructure, while the others related to a process or service. Recommendations concerning a process/service were less likely to be implemented (43 percent) than those concerning a device/infrastructure (75 percent). Similarly, recommendations for the introduction of a technology (N = 21) were less likely to be implemented than recommendations for its maintenance (N = 2) (62 percent versus 100 percent). Implementation also differed among negative (N = 1), positive (N = 19), and under-condition (N = 3) recommendations (100 percent versus 74 percent versus 0 percent).

Finally, recommendations targeting medical/healthcare staff (N = 13) were implemented in approximately 62 percent of cases, versus 70 percent for those targeting administrative/logistic staff (N = 10). However, these results should be interpreted with caution, because some of these attributes were seldom encountered (e.g., only one negative and three under-condition recommendations).
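
To make the stratified rates above easier to follow, the sketch below (again illustrative, not the authors' code) reconstructs the implemented/total counts implied by the reported percentages and recomputes each rate; the counts themselves are assumptions derived from those percentages.

```python
# Illustrative cross-tabulation of implementation rates by recommendation
# attribute; implemented/total counts are reconstructed from the percentages
# reported in the text and are assumptions, not the original dataset.

strata = {
    "device/infrastructure": (12, 16),                 # reported as 75 percent
    "process/service": (3, 7),                         # reported as 43 percent
    "introduction of a technology": (13, 21),          # reported as 62 percent
    "maintenance of a technology": (2, 2),             # reported as 100 percent
    "target: medical/healthcare staff": (8, 13),       # reported as ~62 percent
    "target: administrative/logistic staff": (7, 10),  # reported as 70 percent
}

for label, (implemented, total) in strata.items():
    print(f"{label}: {implemented}/{total} ({implemented / total:.0%})")
```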

Of the thirteen reports, six included one or more recommendations that were not implemented (i.e., eight recommendations were not implemented, Table 1). Based on the semi-structured interviews, decision makers reported nine different reasons, among the twenty-three proposed, to explain why recommendations were not implemented (see Supplementary File S1). Some reasons were provided for more than one report, for a total of twenty-two reasons cited. The most frequently mentioned reasons were related to contextual factors (64 percent), production/diffusion process factors (14 percent), content/format factors (14 percent), or other factors (9 percent).

The contextual factors cited were the complexity of the changes (i.e., administrative reasons) (18 percent), budget and resource constraints (23 percent), failure to identify the administrative responsibility for carrying out the recommendation (5 percent), and the nonpriority status of the HTA recommendation (18 percent). Indeed, although our HTA reports consider organizational and economic aspects, the recommendations do not account for the other competing requests that decision makers face. As a consequence, decision makers still need to make trade-offs when reaching a decision. This is particularly true in cases where the reason provided was “lack of resources” or “nonpriority recommendation.”

In addition, the context in a university hospital evolves rapidly, and what is considered a priority one month may no longer be one the next. In contrast, when decision makers referred to an “administrative reason,” this was more difficult to interpret, because in some cases the reason was not clearly explained and because, from the viewpoint of some HB-HTA evaluators, other reasons in the list could have been provided instead. Finally, in a minority of cases, the HB-HTA unit may be considered to have insufficiently involved stakeholders in the process (5 percent) or to have produced reports that were insufficiently convincing with respect to the level of evidence (14 percent). An exhaustive list of the reasons is detailed in Table 3.

Table 3. List of Reasons Reported to Explain Absence of Implementation of a Recommendation

DISCUSSION

The aim of this study was to evaluate the impact of the reports with recommendations issued by the UETMIS during its first 10 years of existence and to measure their utility in local hospital decision-making processes and implementation activities. Results revealed that the reports were deemed useful (100 percent) in the decision-making process of the hospital managers who consulted the UETMIS. This observation is similar to, if not better than, the reported rates of appreciation of the HB-HTA process at the Lausanne University Hospital (Reference Grenon, Pinget and Wasserfallen27) in Switzerland and in clinical engineering departments in the United States (Reference Cram, Groves and Foster28) (85 percent and 80 percent, respectively).

Our study also revealed that 65 percent of the HTA recommendations were implemented at different levels by the decision makers, which is very close to the 71 percent obtained by the HB-HTA unit at the McGill University Health Center (MUHC), also located in the province of Quebec, for the period 2004–11 (Reference McGregor23). Facilitators of recommendation adoption and implementation were not evaluated in this study, although they are usually known to be directly associated with the participation of clinical staff in the consultation or implementation phases (Reference Gagnon, Desmartis and Poder4;Reference Gerhardus, Dorendorf, Rottingen, Garrido, Kristensen and Nielsen22). Our study did, however, identify the main reasons that prevented the adoption and implementation of recommendations.

As such, these reasons are mainly related to the context, which can be shaped by the complexity of the changes required, the availability of material, financial, or human resources, and institutional priorities. To improve the utility of HB-HTA reports and enhance the applicability of their recommendations, it is therefore necessary to further analyze contextual aspects and to better involve stakeholders. Interestingly, discordant priorities between recommendations and the institutional vision were mentioned as a reason for nonimplementation in two-thirds of the relevant reports (i.e., four of six). This discordance highlights the need to better identify the stakeholders involved in the decision-making process and in recommendation implementation, and to validate the priority level of the HTA request before the assessment. This observation led the UETMIS to re-evaluate its procedures and to add a follow-up step to ensure alignment between HB-HTA goals and decision makers' priorities and context.

The framework developed by Schumacher and Zechmeister (Reference Schumacher and Zechmeister19), published a few months after this study started, shows strong similarities with our framework because it is likewise inspired by the work of Gerhardus et al. (Reference Gerhardus, Dorendorf, Rottingen, Garrido, Kristensen and Nielsen22). In particular, both frameworks consider that HB-HTA activities may impact not only decision-making processes but also outcomes. However, unlike Schumacher and Zechmeister (Reference Schumacher and Zechmeister19), our framework places more importance on disentangling the adoption and the implementation of recommendations. Indeed, in the context of an HB-HTA unit where reports are requested by local decision makers, many events can occur between the moment when recommendations are released and the moment when all conditions for implementation are met.

In addition, the framework used in this study considers “change in practice” as an outcome and not as a fully separate category, because requests addressed to HB-HTA units are not always directly oriented toward patients. In this sense, it was more appropriate to disentangle the most frequent types of outcomes that can be observed in a hospital setting. Finally, our framework did not consider the “enlightenment” criterion, described as the “establishment of an HTA culture,” which was a conceptual innovation of Schumacher and Zechmeister (Reference Schumacher and Zechmeister19). Retrospectively, it would have been very relevant to incorporate this criterion into our framework, as well as a criterion for the dissemination of HB-HTA reports outside the hospital (e.g., citations, re-use of reports). However, given the specific nature of this study (i.e., interviews), it would have been difficult to obtain objective answers for this last criterion. A more thorough framework may thus incorporate up to eight criteria and thirteen levels to assess the impact of HB-HTA reports.

A strength of the study is that sufficient time was allowed to conduct the interviews and to adequately collect the answers (i.e., one hour). Moreover, the questionnaire was validated by external HB-HTA unit managers, and the interviews were conducted by an experienced and unbiased interviewer (i.e., to avoid influencing respondents and any potential conflict of interest with the HB-HTA unit). One limitation is the low number of reports evaluated compared with similar published studies (Reference Hua, Boonstra and Kelly3;Reference Donabedian14), although this number is representative of the reality of smaller HB-HTA units.

Another limitation is the four-year median delay between the issuance of the reports and the interviews. This delay introduced recall bias, as some interviewed decision makers had poor recall of the report under discussion, which resulted in incomplete surveys, and recruitment bias, as most cases of nonparticipation arose because decision makers no longer held the position or had retired. Our design also favored the participation of engaged respondents, as only those who were interested volunteered for the study. A possible Hawthorne effect might also have positively influenced the interviewed decision makers, but this effect was counterbalanced by the use of binary and Likert-scale questions to capture more objective answers.

In conclusion, as previously reported by Gagnon et al. (Reference Gagnon, Desmartis and Poder4) in a recent systematic review, HB-HTA activities impact decision-making processes, although this influence may be limited by several factors related to the HB-HTA units themselves (composition, independence, procedures, understanding of the global issues) or to the management/clinical committee responsible for the decision (composition, dedicated time, resources, perception, priorities, structured process). This study showed that HB-HTA reports are useful to hospital managers in their decision-making processes. Although the HB-HTA reports were valued as high quality and very relevant, some of the reports or recommendations were not considered in the decision process. Certain barriers, such as contextual factors, need to be further addressed in the scientific process to improve HB-HTA efficiency and usefulness.

SUPPLEMENTARY MATERIAL

Supplementary File 1:

https://doi.org/10.1017/S0266462318000405

Supplementary File 2:

https://doi.org/10.1017/S0266462318000405

CONFLICTS OF INTEREST

The authors have completed and submitted the ICMJE Form for Disclosure of Potential Conflicts of Interest. Authors have no potential conflicts of interest to disclose.

Footnotes

The authors acknowledge the contributions of all participants who kindly agreed to take part in this study. The authors thank Martin Coulombe and Renald Lemieux for their valuable suggestions on the validation of our questionnaire. We acknowledge Louise Corbeil, Sylvain Bernier, and Linda Pinsonneault for their assistance and comments. We also thank the Unité de Recherche Clinique et Épidémiologique (URCE) of the Centre de recherche du CHU de Sherbrooke for their support in coordinating the preparation and revisions of this manuscript. Thomas G. Poder is a member of the FRQS-funded Centre de recherche du CHUS (CRCHUS). This project was funded by the operating budget of the UETMIS of the CHUS.

REFERENCES

1. Banta, D. What is technology assessment? Int J Technol Assess Health Care. 2009;25:7–9.
2. Abelson, J, Giacomini, M, Lehoux, P, et al. Bringing “the public” into health technology assessment and coverage policy decisions: From principles to practice. Health Policy. 2007;82:37–50.
3. Hua, M, Boonstra, T, Kelly, PJ, et al. Quality of health technology assessment reports prepared for the Medical Services Advisory Committee. Int J Technol Assess Health Care. 2016;32:315–323.
4. Gagnon, M-P, Desmartis, M, Poder, T, et al. Effects and repercussions of local/hospital-based health technology assessment (HTA): A systematic review. Syst Rev. 2014;3:129.
5. Attieh, R, Gagnon, M-P. Implementation of local/hospital-based health technology assessment initiatives in low- and middle-income countries. Int J Technol Assess Health Care. 2012;28:445–451.
6. Kahveci, R. Editorial: HB-HTA in low- and middle-income countries. Int J Hosp-Based Health Technol Assess. 2017;1:1–2.
7. Dupouy, C, Gagnon, M. The influence of hospital-based HTA on technology acquisition decision. Int J Hosp-Based Health Technol Assess. 2016;1:19–28.
8. Hailey, D. Commentary on “The influence of hospital-based HTA on technology acquisition decision.” Int J Hosp-Based Health Technol Assess. 2016;1:29–30.
9. CADTH. Informing decision-makers about emerging medical technologies, policies, practices, and research. Health technology update. https://www.cadth.ca/media/pdf/hta_htupdate_issue-13_e.pdf (accessed June 7, 2018).
10. Gagnon, MP, Ben AbdelJelil, A, Desmartins, M, et al. Opportunities to promote efficiency in hospital decision-making through the use of health technology assessment. http://www.cfhi-fcass.ca/sf-docs/default-source/commissioned-research-reports/Gagnon-Dec2011-EN.pdf?sfvrsn=0 (accessed June 7, 2018).
11. Bellemare, CA, Fisette, J-F, Poder, TG, et al. The Health Technology Assessment Unit of the Centre hospitalier universitaire de Sherbrooke (Canada). In: Sampietro-Colom, L, Martin, J, eds. Hospital-based health technology assessment. The next frontier for health technology assessment. Switzerland: Adis; 2016:185–200.
12. Lafortune, L, Farand, L, Mondou, I, et al. Assessing the performance of health technology assessment organizations: A framework. Int J Technol Assess Health Care. 2008;24:76–86.
13. Parsons, T. Social systems and the evolution of action theory. New York: Free Press; 1977.
14. Donabedian, A. Aspects of medical care administration. Cambridge, MA: Harvard University Press; 1973.
15. Donabedian, A. The quality of care. How can it be assessed? JAMA. 1988;260:1743–1748.
16. Steuten, L, Vrijhoef, B, Severens, H, et al. Are we measuring what matters in health technology assessment of disease management? Systematic literature review. Int J Technol Assess Health Care. 2006;22:47–57.
17. McDonald, K, Sundaram, V, Bravata, D, et al. Closing the quality gap: A critical analysis of quality improvement strategies (Vol. 7 - Care Coordination). Rockville, MD: Agency for Healthcare Research and Quality; 2007.
18. Busse, R, Orvain, J, Velasco, M, et al. Best practice in undertaking and reporting health technology assessments. Working group 4 report. Int J Technol Assess Health Care. 2002;18:361–422.
19. Schumacher, I, Zechmeister, I. Assessing the impact of health technology assessment on the Austrian healthcare system. Int J Technol Assess Health Care. 2013;29:84–91.
20. Hailey, D. Profile of an HTA program. The AHFMR Health Technology Assessment Unit, 2002–2003. Alberta: Alberta Heritage Foundation for Medical Research; 2004.
21. Bodeau-Livinec, F, Simon, E, Montagnier-Petrissans, C, et al. Impact of CEDIT recommendations: An example of health technology assessment in a hospital network. Int J Technol Assess Health Care. 2006;22:161–168.
22. Gerhardus, A, Dorendorf, E, Rottingen, J-A, et al. What are the effects of HTA reports on the health system? Evidence from the research literature. In: Garrido, M, Kristensen, F, Nielsen, C, et al., eds. Health technology assessment and health policy-making in Europe. Current status, challenges and potential. Observatory Studies Series No. 14. Brussels: European Observatory on Health Systems and Policies; 2008:109–137.
23. McGregor, M. The impact of reports of the technology assessment unit of the McGill University Health Centre. Montreal (Canada): Technology Assessment Unit (TAU) of the McGill University Health Centre (MUHC). Report no. 65. 2012. https://secureweb.mcgill.ca/tau/sites/mcgill.ca.tau/files/muhc_tau_2012_65_impact.pdf
24. Berelson, B. Content analysis in communication research. Glencoe, IL: Free Press; 1952.
25. Silverman, D. Doing qualitative research. A practical handbook. 4th ed. Thousand Oaks, CA: SAGE; 1999.
26. Andreani, JC, Conchon, F. Méthodes d'analyse et d'interprétation des études qualitatives: état de l'art en marketing [Methods of analysis and interpretation of qualitative studies: State of the art in marketing]. In: International Marketing Trends Conference, Paris; 2013.
27. Grenon, X, Pinget, C, Wasserfallen, J-B. Hospital-based health technology assessment (HB-HTA): A 10-year survey at one unit. Int J Technol Assess Health Care. 2016;32:116–121.
28. Cram, N, Groves, J, Foster, L. Technology assessment - A survey of the clinical engineer's role within the hospital. J Clin Eng. 1997;22:373–382.
