
Health technology assessment for digital technologies that manage chronic disease: a systematic review

Published online by Cambridge University Press:  26 May 2021

Amy von Huben*
Affiliation:
School of Public Health, Faculty of Medicine and Health, University of Sydney, Camperdown, New South Wales, Australia
Martin Howell
Affiliation:
School of Public Health, Faculty of Medicine and Health, University of Sydney, Camperdown, New South Wales, Australia
Kirsten Howard
Affiliation:
School of Public Health, Faculty of Medicine and Health, University of Sydney, Camperdown, New South Wales, Australia
Joseph Carrello
Affiliation:
School of Public Health, Faculty of Medicine and Health, University of Sydney, Camperdown, New South Wales, Australia
Sarah Norris
Affiliation:
School of Public Health, Faculty of Medicine and Health, University of Sydney, Camperdown, New South Wales, Australia
*
Author for correspondence: Amy von Huben, E-mail: amy.vonhuben@sydney.edu.au

Abstract

Objective

A growing number of evaluation frameworks have emerged over recent years addressing the unique benefits and risk profiles of new classes of digital health technologies (DHTs). This systematic review aims to identify relevant frameworks and synthesize their recommendations into DHT-specific content to be considered when performing Health Technology Assessments (HTAs) for DHTs that manage chronic noncommunicable disease at home.

Methods

Searches were undertaken of Medline, Embase, Econlit, CINAHL, and The Cochrane Library (January 2015 to March 2020), and relevant gray literature (January 2015 to August 2020) using keywords related to HTA, evaluation frameworks, and DHTs. Included framework reference lists were searched from 2010 until 2015. The EUNetHTA HTA Core Model version 3.0 was selected as a scaffold for content evaluation.

Results

Forty-four frameworks were identified, mainly covering clinical effectiveness (n = 30) and safety (n = 23) issues. DHT-specific content recommended by framework authors fell within 28 of the 145 HTA Core Model issues. A further twenty-two DHT-specific issues not currently in the HTA Core Model were recommended.

Conclusions

Current HTA frameworks are unlikely to be sufficient for assessing DHTs. The development of DHT-specific content for HTA frameworks is hampered by DHTs having varied benefit and risk profiles. By focusing on DHTs that actively monitor/treat chronic noncommunicable diseases at home, we have extended DHT-specific content to all nine HTA Core Model domains. We plan to develop a supplementary evaluation framework for designing research studies, undertaking HTAs, and appraising the completeness of HTAs for DHTs.

Type
Assessment
Copyright
Copyright © The Author(s), 2021. Published by Cambridge University Press

Introduction

Digital health technologies (DHTs) have the potential to overcome the barrier of geographical location to widen access to health care and improve connectivity between patients and their healthcare team. A DHT's ability to continuously monitor a patient's physiological indicators with preset alert thresholds can expedite treatment compared with traditional office visits.

Chronic diseases are long-lasting conditions with persistent effects, often affecting a patient's social and economic circumstances (1). DHTs that help patients self-manage a long-lasting condition at home and escalate treatment only when required may be particularly suited to these patients. With increasing personal investment in electronic devices, the growing burden of chronic disease, and a limited health budget and workforce, there is potential for DHTs to offer a comparatively safe, effective, and cost-effective treatment pathway for chronic disease.

Terms describing DHT classes (digital devices, mhealth, and ehealth) are numerous, not consistently defined, and rapidly changing (see Table 1 footnote d for DHT class terms and definitions). The DHTs that are the focus of this review are those specifically designed for patients with diagnosed chronic noncommunicable diseases to use at home for active monitoring or treatment; for example, remote monitoring via implants/wearables and web-based cognitive behavioral therapy treatment programs. These DHTs with a functional classification of “Active monitoring” or “Treat” are classified into the highest risk evidence tier, Evidence Tier 3b, under the United Kingdom's (UK) National Institute for Health and Care Excellence (NICE) Evidence Standards (37), and are regulated as Medical Device Software (MDSW) under the new European Union (EU) Medical Devices Regulation (MDR) (48).

Table 1. Summary of coverage and DHT-specific content by the HTA domain for each framework

WHO, World Health Organization; MDCG, Medical Device Co-ordinating Group; HAS, Haute Autorité de Santé; ACSQHC, Australian Commission on Safety and Quality in Health Care.

Despite the unique benefits of these DHTs, there are many risks/challenges associated with their use: technical reliability/stability of electronic sensors and data transmissions; transparency of algorithms for autonomous decisions; access and usability; reorganization of workflows/infrastructure; and security threats in data transmissions and storage. Given that patients with chronic disease may already be socially isolated and economically vulnerable, the use of DHTs in this population deserves careful consideration. A tailored approach to conducting health technology assessments (HTAs) of DHTs could assist such considerations by explicitly examining the unique benefits and risks of DHTs for these vulnerable patients.

Although HTA has multiple definitions, for this paper, we define HTA as a multidisciplinary process (49) to assess and prioritize new technologies against existing health care interventions based on comparative safety, clinical effectiveness, and cost-effectiveness (50) at the lifecycle stage of public funding assessment.

Given that the topics and issues within established HTA frameworks have evolved to guide the assessment of pharmaceuticals, medical devices, and medical services, it is not clear if such frameworks are fit for purpose in assessing DHTs. The last decade has seen an increase in DHT-specific evaluation frameworks, HTA agency guidance, and improved clarity in DHT regulation (EU MDR (48;51) and EU General Data Protection Regulation [GDPR] (52)); all important considerations for a DHT-specific HTA framework.

An exponential rise in clinical applications for DHTs has driven an increase in clinical trials of these technologies. Recent systematic reviews (53–56) of HTAs and economic evaluations for DHTs identify a wide variation in the scope and methods used, limiting the quality and consistency of evidence available to inform funding decisions. Identifying and defining DHT-specific content within generally accepted HTA frameworks may help researchers collect consistent and robust evidence for decision makers.

The aim of the current systematic review is twofold: first, to identify and synthesize the recommendations of DHT-specific HTA and evaluation frameworks using an established HTA model with a broad scope of content and applicability to multiple jurisdictions as a scaffold, and second, to develop a comprehensive list of DHT-specific content to be considered when undertaking an HTA to inform funding decisions for DHTs that manage chronic noncommunicable disease at home.

Methods

This systematic review was registered with PROSPERO (#CRD42020186888) and is reported in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines (57).

Inclusion Criteria

This review focuses on HTA frameworks for evaluating comparative effectiveness, cost-effectiveness, and safety for public funding purposes, not on the evaluation of effectiveness or safety for individual interventions. The review is limited to recently published frameworks because of the rapid development of DHTs, and frameworks also had to be suitable for MDSW. For these reasons, peer-reviewed journal articles, dissertations and theses, and HTA agency and health economic institute publications that discuss methods for performing an HTA, or an assessment of comparative effectiveness, safety, or cost-effectiveness, appropriate for MDSW and published between 2015 and 2020, were eligible for inclusion.

Exclusion Criteria

The following types of frameworks were excluded: Guidelines or regulations from medical device regulators; frameworks for evaluating DHTs used in clinical trials of nondigital interventions; and frameworks targeted solely at DHTs that are not MDSW. Frameworks for implementing digital technology for health systems, such as clinical decision support, electronic health record systems, and establishing telemedicine businesses, were also excluded.

Information Sources and Search Strategy

Medline, Embase, Econlit, CINAHL, and The Cochrane Library were searched from 1 January 2015 to 20 March 2020 using keywords related to HTA, evaluation frameworks, and DHT. The full search strategy is presented in Supplementary Table 1. The start date of January 2015 was selected given the rapid development of DHTs and the focus on up-to-date HTA frameworks.

Gray literature was searched using the Canadian Agency for Drugs and Technologies in Health (CADTH)'s Grey Matters (58). Agencies listed under HTA and Health Economics (see Supplementary Table 2) were searched for evaluation frameworks published between 1 January 2015 and 31 March 2020 using the keyword searches: “electronic health” or eHealth or “mobile health” or mHealth or telehealth or telemedicine or “digital health” or “digital medicine.” The ProQuest Dissertations and Theses Global (PQDT) database was searched using these same keywords. The gray literature search was updated on 31 August 2020 for releases post 31 March 2020.
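For illustration only, the gray literature keywords listed above can be assembled into a single boolean query string. The following Python sketch is not the registered search strategy (that is given in Supplementary Tables 1 and 2); it simply shows one way the quoted phrases and single terms could be combined with OR operators before being pasted into an agency site search.

```python
# Illustrative sketch only: combines the gray literature keywords into one
# OR-joined query string. The registered search strategies are reported in
# Supplementary Tables 1 and 2.

KEYWORDS = [
    '"electronic health"', "eHealth", '"mobile health"', "mHealth",
    "telehealth", "telemedicine", '"digital health"', '"digital medicine"',
]

def build_query(terms):
    """Join keyword terms with OR, preserving quoted phrases."""
    return " OR ".join(terms)

if __name__ == "__main__":
    print(build_query(KEYWORDS))
    # "electronic health" OR eHealth OR "mobile health" OR mHealth OR ...
```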

To reduce the risk of missing DHT-specific content from evaluation frameworks published before 2015 but not subsequently updated, pearling of included frameworks was conducted. The start date of 2010 for pearling was chosen because, prior to 2010, DHT evaluation frameworks had focused mainly on telecommunications as a replacement for face-to-face consultations (59–63), and these DHTs are out of scope for our review.

Study Selection

All authors participated in the title and abstract screening. Full-text screening was undertaken by AvH, with 10 percent of full texts reviewed independently by JC and conflicts resolved by SN.

Data Extraction

Data extracted for each framework included: First author/institution, the year of publication, the country/region that the framework is intended for, the Web site or journal citation, the author's affiliation (e.g., university, HTA agency, and government agency), the intended audience, the purpose of the framework (and if relevant, the name of the framework), and the DHT classes covered.

Data extraction was conducted by AvH and checked by JC.

Content Evaluation

The aspects covered by the included frameworks were analyzed using the European Network for Health Technology Assessment (EUNetHTA) HTA Core Model version 3.0 (HTA Core Model) (46). The HTA Core Model was selected as our analytic scaffold, because it is used across multiple countries to assess a range of health technologies, it includes a wide range of issues for content mapping, and it uses internationally accepted HTA terminology. The model has nine domains, with 51 topics and 145 issues (see Supplementary Table 3). Each of the 145 issues has a unique assessment element identifier (issue identifier) and a card that clarifies which content is common to all applications or is specific to applications within a technology class.

Content from the included frameworks was mapped to the 145 issues of the HTA Core Model in a two-stage process. Initially, DHT-specific topics and issues raised by the frameworks but not already included in the model were included to ensure a comprehensive collation of DHT content. For new DHT-specific topics, new topic names were proposed (indicated as NEW in tables), and for new DHT-specific issues, new issue identifiers were assigned using a DHT prefix. Subsequently, all content recommended by each framework was mapped to the extended set of issues. Decisions regarding whether to map content from the included frameworks to new DHT-specific issues or existing HTA Core Model issues were made by AvH and reviewed by SN.
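As a minimal sketch of the two-stage mapping described above (not the authors' actual extraction tool, and using hypothetical issue identifiers and framework labels), the extended issue set and the framework-to-issue mappings could be held in simple data structures such as the following.

```python
# Illustrative sketch of the two-stage content mapping (hypothetical identifiers).
# Stage 1: extend the HTA Core Model issue set with new DHT-prefixed issues.
# Stage 2: map each framework recommendation to an issue in the extended set.

from dataclasses import dataclass, field

@dataclass
class Issue:
    identifier: str          # e.g. "A0011" (existing, hypothetical) or "DHT01" (new)
    domain: str              # one of the nine HTA Core Model domains
    topic: str
    is_new_dht_issue: bool = False

@dataclass
class ExtendedModel:
    issues: dict = field(default_factory=dict)     # identifier -> Issue
    mappings: dict = field(default_factory=dict)   # identifier -> list of framework refs

    def add_issue(self, issue: Issue) -> None:
        self.issues[issue.identifier] = issue

    def map_content(self, framework_ref: str, identifier: str) -> None:
        self.mappings.setdefault(identifier, []).append(framework_ref)

model = ExtendedModel()
# Existing HTA Core Model issue (identifier shown is hypothetical).
model.add_issue(Issue("A0011", "CUR", "Utilization"))
# New DHT-specific issue proposed by the frameworks (DHT prefix, hypothetical).
model.add_issue(Issue("DHT01", "SAF", "Quality and safeguarding", is_new_dht_issue=True))

model.map_content("Framework 22", "A0011")
model.map_content("Framework 36", "DHT01")
```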

For each included framework, we recorded whether it partially or (near) completely covered each HTA domain and whether it recommended any DHT-specific content in each HTA domain.

Synthesis of Results

We calculated the number and proportion of frameworks covering, and recommending DHT-specific content in, each HTA domain.
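A minimal sketch of this per-domain tally (using an invented coverage table rather than the review's extraction data) is shown below: for each domain, count the frameworks that covered it and those recommending DHT-specific content, then report proportions of the forty-four included frameworks.

```python
# Minimal sketch of the per-domain synthesis. Data shown are hypothetical.

DOMAINS = ["CUR", "TEC", "SAF", "EFF", "ECO", "ETH", "ORG", "SOC", "LEG"]
N_FRAMEWORKS = 44

# Hypothetical per-framework records: domain -> (covered, recommends DHT-specific content)
frameworks = [
    {"EFF": (True, True), "SAF": (True, True)},
    {"EFF": (True, False), "ECO": (True, True)},
    # ... one record per included framework
]

for domain in DOMAINS:
    covered = sum(1 for f in frameworks if f.get(domain, (False, False))[0])
    dht_specific = sum(1 for f in frameworks if f.get(domain, (False, False))[1])
    print(f"{domain}: covered {covered}/{N_FRAMEWORKS} "
          f"({100 * covered / N_FRAMEWORKS:.0f}%), "
          f"DHT-specific {dht_specific}/{N_FRAMEWORKS} "
          f"({100 * dht_specific / N_FRAMEWORKS:.0f}%)")
```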

We summarized the content mapping results into two lists: The first comprised DHT-specific content to be considered when undertaking an HTA; the second comprised existing HTA content (i.e., content common across digital and nondigital technologies) but recommended by the frameworks as essential for undertaking HTAs on DHTs. For both lists, each item of content was reported by HTA domain, topic, issue identifier, and the reference(s) of the framework(s) that recommended it for ease of use and traceability.

Risk of bias and completeness of reporting assessments (beyond comparison with the HTA Core Model) were not relevant for this systematic review.

Results

Study Selection and Characteristics

The peer-reviewed literature and gray literature searches resulted in 9,236 unique records (Supplementary Figure 1). After applying our inclusion and exclusion criteria, forty-four frameworks were included (Table 1 and Supplementary Table 4). These frameworks were published between 2011 and 2020, with twenty-three dating from 2018 to 2020. Twenty-two frameworks were indicated as being international, eleven were intended for EU countries, seven for the UK, and four for the Asia Pacific region. Fifteen frameworks covered digital health, seven were limited to eHealth, fifteen further refined their scope to mHealth, five were strictly intended for MDSW, and two targeted sensors and wearables (digital devices). Twenty-six first authors were affiliated with universities, seven with HTA agencies, and seven with government bodies.

HTA Domain Coverage and Recommended HTA Content From Included Frameworks

Table 1 presents a summary of coverage and DHT-specific content by HTA domain for each framework, and Table 2 reports the number and proportion of frameworks covering, and recommending DHT-specific content for, each HTA domain.

Table 2. Summary of EUNetHTA HTA core model version 3.0 (46) domain coverage and digital health technology (DHT)-specific content of frameworks in review

HTA, health technology assessment; DHT, digital health technology.

Frameworks covering the domain: Framework provides any coverage of the domain.

Full or near-full coverage: Framework covers more than two-thirds of topics in the domain.

Partial coverage: Framework covers less than two-thirds of topics in the domain.

Rows of the table are the domains of the EUNetHTA HTA Core Model version 3.0 (46):

CUR: Describes the new technology's target population, target condition and current management, current and expected utilization, and regulatory status.

TEC: Describes the new technology's features in enough detail to differentiate it from comparators, and the investments, tools, and training required to use it.

SAF: Identifies unwanted or harmful effects of the new technology important to patients or the decisions of healthcare providers and policy makers.

EFF: Provides evidence of comparative effectiveness of the new technology in producing health benefits in the relevant healthcare setting.

ECO: Provides information on the new technology's costs, health-related outcomes, and economic efficiency to inform value for money judgments.

ETH: Considers potential harms to autonomy, respect for persons, justice, and equity from the use of the new technology or from performing the HTA.

ORG: Identifies the resources to be mobilized or organized to implement the new technology and the consequences (intra-/interorganizational and health system).

SOC: Considers issues related to the new technology relevant to patients, carers, and social groups.

LEG: Identifies rules and regulations protecting patients' rights and societal interests for consideration when evaluating the new technology.

As stated in Methods, we created two lists of HTA content recommended by the frameworks. Table 3 presents the list of DHT-specific content to be considered when undertaking an HTA. Table 4 presents the list of existing HTA content common across digital and nondigital technologies but recommended as essential for undertaking HTAs on DHTs. A more detailed listing of the recommended content can be found in Supplementary Table 5.

Table 3. Digital specific content to be considered when undertaking health technology assessments (HTAs) of DHTs

a From EUNetHTA HTA Core Model version 3.0 (46).

b New topic.

c A DHT prefix denotes a new issue (i.e., DHTXX).

Table 4. Existing health technology assessment (HTA) content that is common across DHTs and non-DHTs

a From EUNetHTA HTA Core Model version 3.0 (46).

The included frameworks recommended DHT-specific content in 28 of the 145 issues (18 of the 51 topics) and in all nine domains of the HTA Core Model (see Table 3). A further twenty-two issues (within eight topics) not included in the HTA Core Model were recommended across six HTA domains, predominantly Domain 3: Safety (SAF) and Domain 4: Clinical effectiveness (EFF).

The frameworks’ coverage of HTA domains, DHT-specific content, and HTA content recommendations are summarized below by HTA domain.

Domain 1: Health Problem and Current Use of the Technology (CUR)

More than one-third of frameworks covered CUR, but only three frameworks (7 percent) recommended DHT-specific content, the least out of all domains (see Table 2). The topics and issues raised by the frameworks for CUR were the same as the HTA Core Model. DHT-specific content was confined to issues of the new technology's current and expected utilization (see Table 3).

Domain 2: Description and Technical Characteristics of the Technology (TEC)

TEC was covered by nineteen frameworks (43 percent), with seventeen discussing DHT-specific content (see Table 2). The topics raised by the frameworks for TEC were the same as the HTA Core Model. However, thirteen frameworks suggested a new issue addressing how well the features of DHTs and their comparator(s) overcome technical barriers. DHT-specific content was recommended for HTA Core Model issues of material investments, training, and information required to use the technology (see Table 3).

Domain 3: Safety (SAF)

SAF had the most DHT-specific content, with all twenty-three frameworks covering this domain recommending DHT-specific content (see Table 2). The frameworks recommended three DHT topics (covering a total of ten issues) not in the HTA Core Model for SAF: Quality and safeguarding (data security and privacy, interoperability, usability and accessibility, transparency, and adequate disclosures for algorithms); technical safety (technical reliability and stability, continuity and updates); and communicating for safety (see Table 3).

Domain 4: Clinical Effectiveness (EFF)

EFF was the most commonly covered domain, with thirty frameworks (68 percent) making recommendations in this domain. The frameworks suggested four additional topics (and eight issues) for EFF: Demonstrating effectiveness (DHT-appropriate study design, comparators, outcome measures, and transparent reporting of effectiveness studies); ensuring reliable information content; the use of appropriate and best practice behavior change; and measures for assessing the external validity/generalizability of DHT effectiveness studies. DHT-specific content was also recommended for the HTA Core Model issue of patient satisfaction.

Domain 5: Costs and Economic Evaluation (ECO)

Nineteen frameworks covered ECO, with twelve making DHT-specific recommendations. This domain comprises cost-effectiveness and budget impact analyses. The topics raised by the frameworks for ECO were the same as the HTA Core Model. However, a new issue within the validity of the model(s) topic was recommended to ensure that changes in fixed costs when scaling up DHTs from the trial to the health-system level have been investigated. DHT-specific content was recommended for estimating resource utilization, costs, and health outcomes.
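A worked example, using invented figures rather than numbers from any included framework, illustrates why the new issue on fixed costs matters: the average per-patient cost of a DHT falls as trial-level fixed costs (platform licences, servers, helpdesk) are spread across a health-system population.

```python
# Hypothetical example: per-patient cost of a DHT when fixed costs are spread
# across more users at health-system scale than in a trial. All values invented.

def cost_per_patient(variable_cost, annual_fixed_cost, n_users):
    """Average annual cost per patient = variable cost + share of the fixed cost."""
    return variable_cost + annual_fixed_cost / n_users

trial_scale = cost_per_patient(variable_cost=50, annual_fixed_cost=100_000, n_users=200)
system_scale = cost_per_patient(variable_cost=50, annual_fixed_cost=100_000, n_users=20_000)
print(trial_scale, system_scale)   # 550.0 per patient in the trial vs 55.0 at scale
```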

Domain 6: Ethical Analysis (ETH)

Fourteen frameworks covered ETH, with ten making DHT-specific recommendations. The topics and issues raised by the frameworks for ETH were the same as the HTA Core Model. However, DHT-specific content was recommended for four HTA Core Model topics (seven issues): Benefit-harm balance (benefits and harms for stakeholders other than the patient, and hidden unintended consequences of the technology), autonomy (vulnerable persons, threats to autonomy, and supports required); respect for persons (privacy); and justice and equity (accessibility).

Domain 7: Organizational Aspects (ORG)

Fourteen frameworks covered ORG, with nine making DHT-specific recommendations. A new topic not in the HTA Core Model for ORG, namely, contextual issues for barriers and enablers to DHT implementation, was recommended. DHT-specific content was also recommended for two HTA Core Model topics (five issues): Health delivery process (changes to current work processes, resources, training, co-operation, and communication) and the structure of the health system (processes to ensure access to the new technology).

Domain 8: Patients and Social Aspects (SOC)

SOC was the least covered domain, with only eight frameworks making recommendations and only four making DHT-specific recommendations. The topics and issues raised by the frameworks for SOC were the same as the HTA Core Model. DHT-specific content was limited to two issues: Improving access to health care and upfront communication of the direct and data usage costs of DHTs to improve treatment adherence.

Domain 9: Legal Aspects (LEG)

Fourteen frameworks covered LEG, with almost all (thirteen) making DHT-specific recommendations. A new issue of professional liability was recommended for the HTA Core Model topic of ownership and liability. DHT-specific content was also recommended for the HTA Core Model topic of patient privacy, that is, designing DHTs to comply with laws/binding rules for data security and privacy.

Discussion

To our knowledge, we have conducted the most extensive systematic search of international peer-reviewed and gray literature for HTA and evaluation frameworks for DHTs designed to actively monitor or treat a diagnosed chronic noncommunicable disease at home. These DHTs, such as remote monitoring via digital devices or web-based treatment programs, are classified into the highest risk evidence tier under the NICE Evidence Standards (37) and are strictly regulated under medical device regulation (48). Deliberately focusing on a high-risk DHT class has allowed us to identify a fuller range of DHT-specific content, with the expectation that not all of this content will apply to lower-risk DHT classes.

The findings from this systematic review demonstrate that there is no single framework that is used uniformly across jurisdictions to assess the comparative safety, effectiveness, and cost-effectiveness of DHTs. The NICE's Evidence Standards for DHTs (37), although DHT-specific, focus primarily on the EFF and ECO domains. Our review highlights the need for more comprehensive technology-specific questions for undertaking HTAs of DHTs across all HTA domains.

Our analysis shows that HTA Core Model topics are relevant for funding assessment of DHTs, covering all topics raised by the frameworks in six domains. However, the included frameworks recommend adding DHT-specific content in 28 of 145 issues (18 of the 51 topics) and all nine domains of the HTA Core Model (see Table 3). They also recommend another twenty-two issues (eight topics) that are not currently included in the HTA Core Model (see Table 3). Collectively, this suggests that the HTA Core Model is not sufficiently comprehensive for undertaking HTAs of DHTs that manage chronic noncommunicable disease at home.

We also highlight the existing HTA content common to digital and nondigital technologies but essential for DHTs, as shown in Table 4. Given the rapid growth in DHTs over recent years, identifying current alternative DHTs available for patients with the targeted condition (22) assists in estimating the expected utilization of DHTs and understanding the DHTs available to comparator groups. Rapid growth in DHT development also makes identifying a DHT's stage in the product lifecycle crucial. The NICE (37) requires evidence that a DHT is relevant and has been piloted successfully in the healthcare system, and also evidence that a DHT can perform for an expected number of users, for example, adequate server size. Kidholm et al. (4) also stipulate that the technology is in a steady state to enable a robust economic analysis to be performed. The lack of face-to-face contact in remote monitoring/self-management interventions may also require heightened risk management controls. For example, defined parameters to identify and respond to a patient's acute deteriorating condition and controls for vulnerable users (40) may reduce patient risk. Remote-monitoring DHTs require a consideration of the management of incidental findings. All DHTs require evidence of improved access to health care.

Because the DHTs of interest to this study are used directly by patients for self-management, existing HTA content examining patient satisfaction is crucial. Identifying changes to infrastructure, services, and systems for existing and new care pathways associated with DHTs is also critical when changing health-care delivery from in-person consultations to remote care. An organizational enabler of the successful implementation of DHTs is their credibility with healthcare professionals; the NICE (37) requires published or publicly available evidence documenting the relevant healthcare experts' role in the development of DHTs.

There was much discussion in the included frameworks about innovative trial designs for assessing the clinical effectiveness of DHTs in EFF and the complexity of economic evaluation in ECO. However, no evidence was provided that these alternative trial designs are appropriate once DHTs have reached a steady state. The framework authors concluded that a high-quality randomized controlled trial (RCT) conducted in people with the target condition in a setting relevant to the health system (37) remains the most unbiased evidence of clinical effectiveness for DHTs (5;10;22;30;34;37). Advice for overcoming common methodological problems for RCTs of DHTs, such as blinding and informed consent, was given by the Haute Autorité de Santé (5). Little justification was provided for using a pre-test/post-test design for DHTs that are an adjunct to standard care (relevant to many DHTs that manage chronic noncommunicable disease at home), because the ideal comparator group, people receiving standard care (37), should not generally create ethical issues (5). For economic evaluation methods in ECO, frameworks state that DHTs are complex interventions implemented in a complex health system (8;15;62;64). This complexity presents challenges for economic evaluation, such as instability in preference values (8). However, McNamee et al. (15) consider that it is valid to use standard economic methods for DHTs, and where there are interactions, nonlinearity in changes, or multiplier effects, these can be dealt with by sensitivity analyses (8;15) and data from cluster trials (8).
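To make the point about handling multiplier effects through sensitivity analysis concrete, the following sketch (with invented parameter values, not drawn from any included framework) varies a multiplier on the DHT's incremental effect and recomputes the incremental cost-effectiveness ratio against standard care.

```python
# Hypothetical one-way sensitivity analysis: vary a multiplier on the DHT's
# incremental QALY gain and recompute the ICER. All parameter values are invented.

def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY gained."""
    return delta_cost / delta_qaly

BASE_DELTA_COST = 1_200.0    # extra cost of the DHT vs standard care, per patient
BASE_DELTA_QALY = 0.05       # base-case QALY gain per patient

for multiplier in (0.5, 1.0, 1.5, 2.0):
    print(f"effect multiplier {multiplier}: "
          f"ICER = {icer(BASE_DELTA_COST, BASE_DELTA_QALY * multiplier):,.0f} per QALY")
```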

Twenty of the twenty-eight existing HTA Core Model issues recommended for DHT-specific content are concentrated in four domains. The identification of DHT-specific content for the technical characteristics in TEC, the estimation of DHT-specific resource utilization and costs in ECO, and the DHT-specific changes to work processes in ORG were expected. The large amount of DHT-specific content identified in ETH is warranted when we consider the description by Sax et al. (28) of the unique risks of DHTs that collect a large amount of personal data to develop predictive algorithms of behavior. Consequently, there are ethical issues in terms of the potential for DHTs to influence the behavior of susceptible persons at critical times for commercial purposes.

A weakness of the included frameworks is the lack of discussion and recommendations on patients’ perspectives in the domain of SOC. We acknowledge that the ability of a DHT to engage and motivate a patient is implicit in any demonstration of DHT effectiveness, and we are not suggesting that effectiveness from a patient perspective should be re-evaluated during an HTA. Rather, we suggest that information regarding patient preferences and experience with a DHT will be informative to judgments regarding the transferability of effectiveness from one population setting to another.

The eight new topics (and nineteen of the twenty-two new issues) are concentrated in the three domains of SAF, EFF, and ORG. The new SAF topics address issues of technical reliability and stability, data security and privacy, accessibility, and communications that promote the safety of users and the autonomy of stakeholders. Although examples of overt data privacy breaches and threats are plentiful (e.g., Australia's HealthEngine, UK NHS ransomware attacks), the less overt breaches that occur when DHTs operate on personal devices that patients also use for social media and the internet (i.e., not purpose-built medical devices) are a unique threat for DHTs. Huckvale et al. (36) showed evidence of the prevalence of data transmissions containing linkable identifiers from depression and smoking cessation apps to technology companies for marketing and analytics purposes, without disclosure in privacy policies. The authors recommend regular audits of data transmissions rather than reliance on privacy disclosures.
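A minimal sketch of the kind of data-transmission audit recommended by Huckvale et al. is shown below. It assumes a hypothetical log of captured outbound requests; in practice, such an audit would sit on top of intercepted app traffic rather than a hard-coded list.

```python
# Hypothetical audit of captured outbound app traffic: flag transmissions that
# carry linkable identifiers to third-party (marketing/analytics) hosts.
# The request log and domain list are invented for illustration.

LINKABLE_FIELDS = {"device_id", "advertising_id", "email", "phone"}
THIRD_PARTY_DOMAINS = {"analytics.example.com", "ads.example.net"}

captured_requests = [
    {"host": "api.dht-vendor.example", "fields": {"blood_pressure", "device_id"}},
    {"host": "analytics.example.com", "fields": {"advertising_id", "screen_view"}},
]

def audit(requests):
    """Return requests that send linkable identifiers to third-party hosts."""
    return [
        r for r in requests
        if r["host"] in THIRD_PARTY_DOMAINS and r["fields"] & LINKABLE_FIELDS
    ]

for finding in audit(captured_requests):
    print("Potential undisclosed transfer:", finding["host"], finding["fields"] & LINKABLE_FIELDS)
```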

The new EFF topics focus on high-quality evidence generation, transparent and standardized reporting of effectiveness studies, ensuring the reliability of health information content, and the use of appropriate and best practice behavior change techniques. Contextual issues for barriers and enablers to DHT implementation in ORG are comprehensively addressed by Drury et al. (23), Lennon et al. (19), and Rojahn et al. (17).

A strength of our analysis is the use of many sources, including gray literature. Additionally, focusing on a particular class of DHT with its specific risk/benefit profile has allowed us to identify and extend DHT-specific content to all HTA Core Model domains. Identifying content specific to the chronic noncommunicable disease target population and the active monitoring/treatment MDSW DHT class may limit the applicability of our analysis to other clinical circumstances, but many of the issues are sufficiently generic to be broadly applicable across other health areas and DHT classes. We also aimed to identify content broadly applicable across jurisdictions; however, some tailoring to meet local HTA needs may be required. Although a focus on the most recent five years in our search strategy was appropriate given the rapid development of DHTs, we have managed the risk of missing DHT-specific content in earlier evaluation frameworks by pearling the included frameworks.

As DHT development continues apace, greater clarity is required regarding the evidence needed to inform policy makers and payers of the value of DHTs. By specifying additional DHT-specific content, we hope researchers can better plan to gather standardized and robust evidence that meets decision makers’ needs.

Future research is recommended on the applicability of the new topics and issues to lower-risk DHT classes and their relative importance to specific chronic diseases.

Conclusion

The development of DHT-specific content for HTA frameworks is hampered by DHTs having varied benefit and risk profiles. By focusing on a particular DHT class, we demonstrate that relevant evaluation frameworks from the peer-reviewed and gray literature can be used to extend DHT-specific content to all HTA Core Model domains. We plan to develop companion resources for designing research studies and undertaking HTAs of DHTs that manage chronic noncommunicable disease at home.

Supplementary material

The supplementary material for this article can be found at https://doi.org/10.1017/S0266462321000362.

Acknowledgments

We thank Ms. Bernie Carr, Academic Liaison Librarian, Fisher Library, University of Sydney, for assisting with the search strategy.

Funding Statement

AvH is supported by an Australian Government Research Scholarship and Postgraduate Scholarship in Health Economics (Patient-Centered Care and Outcomes in Chronic Disease) from the University of Sydney, School of Public Health. JC is supported by a Postgraduate Research Scholarship from the Australian Prevention Partnership Centre (TAPPC). MH is funded by an Australian Government National Health and Medical Research Council program grant.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References

1. Australian Institute of Health and Welfare. Chronic disease overview. 2020 [cited 2021 Feb 10]. Available from: https://www.aihw.gov.au/reports-data/health-conditions-disability-deaths/chronic-disease/overview.
2. Eysenbach, G. CONSORT-EHEALTH: Improving and standardizing evaluation reports of web-based and mobile health interventions. J Med Internet Res. 2011;13:e126.
3. Andalusian Health Quality Agency (ES) [Internet]. Complete list of recommendations on design, use and assessment of health apps. Seville (ES); 2012 [cited 2020 Aug 16]. Available from: www.calidadappsalud.com/en/listado-completo-recomendaciones-app-salud/.
4. Kidholm, K, Ekeland, AG, Jensen, LK, Rasmussen, J, Pedersen, CD, Bowes, A, et al. A model for assessment of telemedicine applications: MAST. Int J Technol Assess Health Care. 2012;28:44.
5. Haute Autorité de Santé [French National Authority for Health]. Methodological choices for the clinical development of medical devices. Paris (FR): The Authority; 2013.
6. Khoja, S, Durrani, H, Scott, RE, Sajwani, A, Piryani, U. Conceptual framework for development of comprehensive e-health evaluation tool. Telemed e-Health. 2013;19:48–53.
7. Lewis, TL, Wyatt, JC. Mhealth and mobile medical apps: A framework to assess risk and promote safer use. J Med Internet Res. 2014;16:e210.
8. Bergmo, TS. How to measure costs and benefits of ehealth interventions: An overview of methods and frameworks. J Med Internet Res. 2015;17:e254.
9. Mohr, DC, Schueller, SM, Riley, WT, Brown, CH, Cuijpers, P, Duan, N, et al. Trials of intervention principles: Evaluation methods for evolving behavioral intervention technologies. J Med Internet Res. 2015;17:e166.
10. Mookherji, S, Mehl, G, Kaonga, N, Mechael, P. Unmet need: Improving mhealth evaluation rigor to build the evidence base. J Health Commun. 2015;20:1224–9.
11. Steventon, A, Grieve, R, Bardsley, M. An approach to assess generalizability in comparative effectiveness research: A case study of the whole systems demonstrator cluster randomized trial comparing telehealth with usual care for patients with chronic health conditions. Med Decis Making. 2015;35:1023–36.
12. Ruck, A, Wagner Bondorf, S, Lowe, C (Consard Limited). Second draft of guidelines, EU guidelines on assessment of the reliability of mobile health applications. European Commission, Directorate-General of Communications Networks, Content & Technology; 2016.
13. Gorski, I, Bram, JT, Sutermaster, S, Eckman, M, Mehta, K. Value propositions of mhealth projects. J Med Eng Technol. 2016;40:400–21.
14. McMillan, B, Hickey, E, Patel, MG, Mitchell, C. Quality assessment of a sample of mobile app-based health behavior change interventions using a tool based on the National Institute for Health and Care Excellence behavior change guidance. Patient Educ Couns. 2016;99:429–35.
15. McNamee, P, Murray, E, Kelly, MP, Bojke, L, Chilcott, J, Fischer, A, et al. Designing and undertaking a health economics study of digital health interventions. Am J Prev Med. 2016;51:852–60.
16. Murray, E, Hekler, EB, Andersson, G, Collins, LM, Doherty, A, Hollis, C, et al. Evaluating digital health interventions: Key questions and approaches. Am J Prev Med. 2016;51:843–51.
17. Rojahn, K, Laplante, S, Sloand, J, Main, C, Ibrahim, A, Wild, J, et al. Remote monitoring of chronic diseases: A landscape assessment of policies in four European countries. PLoS ONE. 2016;11:e0155738.
18. Young, M. IRBs could address ethical issues related to tracking devices: Mobile devices raise new concerns. IRB Advisor. 2017;17:89.
19. Lennon, MR, Bouamrane, MM, Devlin, AM, O'Connor, S, O'Donnell, C, Chetty, U, et al. Readiness for delivering digital health at scale: Lessons from a longitudinal qualitative evaluation of a national digital health innovation program in the United Kingdom. J Med Internet Res. 2017;19:e42.
20. Maar, MA, Yeates, K, Perkins, N, Boesch, L, Hua-Stewart, D, Liu, P, et al. A framework for the study of complex mhealth interventions in diverse cultural settings. JMIR MHealth UHealth. 2017;5:e47.
21. Michie, S, Yardley, L, West, R, Patrick, K, Greaves, F. Developing and evaluating digital interventions to promote behavior change in health and health care: Recommendations resulting from an international workshop. J Med Internet Res. 2017;19:e232.
22. Philpott, D, Guergachi, A, Keshavjee, K. Design and validation of a platform to evaluate mhealth apps. Stud Health Technol Inform. 2017;235:37.
23. Drury, P, Roth, S, Jones, T, Stahl, M, Medeiros, D. Guidance for investing in digital health. Manila (PH): Asian Development Bank (ADB); 2018.
24. European Commission. Synopsis report, consultation: Transformation health and care in the digital single market. Luxembourg: The Commission; 2018.
25. Hogaboam, LS. Assessment of technology adoption potential of medical devices: Case of wearable sensor products for pervasive care in neurosurgery and orthopedics [PhD]. Ann Arbor: Portland State University; 2018.
26. Jurkeviciute, M. Planning of a holistic summative ehealth evaluation: The interplay between standards and reality [Licentiate]. Ann Arbor: Chalmers Tekniska Hogskola (Sweden); 2018.
27. Nielsen, S, Rimpiläinen, S. Report on international practice on digital apps. Glasgow (UK): Digital Health and Care Institute; 2018.
28. Sax, M, Helberger, N, Bol, N. Health as a means towards profitable ends: Mhealth apps, user autonomy, and unfair commercial practices. J Consumer Policy. 2018;41:103–34.
29. Academy of Medical Sciences (UK). Our data-driven future in healthcare: People and partnerships at the heart of health related technologies. London (UK): The Academy; 2018.
30. Wyatt, JC. How can clinicians, specialty societies and others evaluate and improve the quality of apps for patient use? BMC Med. 2018;16:225.
31. Beintner, I, Vollert, B, Zarski, AC, Bolinski, F, Musiat, P, Gorlich, D, et al. Adherence reporting in randomized controlled trials examining manualized multisession online interventions: Systematic review of practices and proposal for reporting standards. J Med Internet Res. 2019;21:e14181.
32. Caulfield, B, Reginatto, B, Slevin, P. Not all sensors are created equal: A framework for evaluating human performance measurement technologies. npj Digital Med. 2019;2:7.
33. Department of Health & Social Care (UK) [Internet]. Code of conduct for data-driven health and care technology. London (UK): The Department; 2019 [updated 2019 Jul 18; cited 2020 Aug 18]. Available from: https://www.gov.uk/government/publications/code-of-conduct-for-data-driven-health-and-care-technology/initial-code-of-conduct-for-data-driven-health-and-care-technology.
34. Haute Autorité de Santé [French National Authority for Health]. Guide to the specific features of clinical evaluation of a connected medical device (CMD) in view of its application for reimbursement. Paris (FR): The Authority; 2019.
35. Haute Autorité de Santé [French National Authority for Health]. Public consultation on the draft analysis grid intended for use by CNEDiMTS to contribute to its evaluation of medical devices embedding decision systems based on automatic learning processes ("artificial intelligence"). Paris (FR): The Authority; 2019.
36. Huckvale, K, Torous, J, Larsen, ME. Assessment of the data sharing and privacy practices of smartphone apps for depression and smoking cessation. JAMA Network Open. 2019;2:e192542.
37. National Institute for Health and Care Excellence (UK). Evidence standards framework for digital health technologies. London (UK): The Institute; 2019.
38. NHS Digital (UK) [Internet]. How we assess health apps and digital tools. London (UK): NHS Digital; 2019 [updated 2019 May 17; cited 2020 Apr 13]. Available from: https://digital.nhs.uk/services/nhs-apps-library/guidance-for-health-app-developers-commissioners-and-assessors/how-we-assess-health-apps-and-digital-tools.
39. Rajan, B, Tezcan, T, Seidmann, A. Service systems with heterogeneous customers: Investigating the effect of telemedicine on chronic care. Manag Sci. 2019;65:1236–67.
40. Australian Commission on Safety and Quality in Health Care. National safety and quality digital mental health standards - Consultation draft. Sydney (AU): The Commission; 2020.
41. Dick, S, O'Connor, Y, Thompson, MJ, O'Donoghue, J, Hardy, V, Wu, TJ, et al. Considerations for improved mobile health evaluation: Retrospective qualitative investigation. JMIR MHealth UHealth. 2020;8:e12424.
42. Federal Ministry of Health (DE). Regulation on the procedure and requirements for testing the eligibility for reimbursement of digital health applications in the statutory public health insurance (Digital Health Applications Ordinance - DiGAV) (Draft bill). Bonn (DE): The Ministry; 2020.
43. Health Information and Quality Authority (IE). International review of consent models for the collection, use and sharing of health information. Cork (IE): The Authority; 2020.
44. Medical Services Advisory Committee (AU). Draft guidelines for preparing assessment reports for the Medical Services Advisory Committee. Canberra (AU): The Committee; 2020.
45. Moshi, MR, Tooher, R, Merlin, T. Development of a health technology assessment module for evaluating mobile medical applications. Int J Technol Assess Health Care. 2020;36:252–61.
46. EUnetHTA Joint Action 2, Work Package 8. HTA Core Model® version 3.0 [PDF]; 2016. Available from: www.htacoremodel.info/BrowseModel.aspx.
47. World Health Organization. WHO guideline: Recommendations on digital interventions for health system strengthening. Geneva (CH): The Organization; 2019.
48. Medical Device Coordination Group. Guidance on qualification and classification of software in Regulation (EU) 2017/745 – MDR and Regulation (EU) 2017/746 – IVDR. European Commission; 2019.
49. O'Rourke, B, Oortwijn, W, Schuller, T. The new definition of health technology assessment: A milestone in international collaboration. Int J Technol Assess Health Care. 2020;36:187–90.
50. Australian Government Department of Health and Ageing. Review of health technology assessment in Australia. Canberra (AU): Commonwealth of Australia; 2009.
51. Regulation (EU) 2017/745 of the European Parliament and of the Council. Official Journal. 2017;L117:1–175.
52. Regulation (EU) 2016/679 of the European Parliament and of the Council. Official Journal. 2016;L119:1–88.
53. Moshi, MR, Tooher, R, Merlin, T. Suitability of current evaluation frameworks for use in the health technology assessment of mobile medical applications: A systematic review. Int J Technol Assess Health Care. 2018;34:464–75.
54. Iribarren, SJ, Cato, K, Falzon, L, Stone, PW. What is the economic evidence for mhealth? A systematic review of economic evaluations of mhealth solutions. PLoS ONE. 2017;12:e0170581.
55. Kidholm, K, Kristensen, MBD. A scoping review of economic evaluations alongside randomized controlled trials of home monitoring in chronic disease management. Appl Health Econ Health Policy. 2018;16:167–76.
56. Vukovic, V, Favaretti, C, Ricciardi, W, de Waure, C. Health technology assessment evidence on e-health/m-health technologies: Evaluating the transparency and thoroughness. Int J Technol Assess Health Care. 2018;34:87–96.
57. Moher, D, Liberati, A, Tetzlaff, J, Altman, DG. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Med. 2009;6:e1000097.
58. Grey Matters: A practical tool for searching health-related grey literature [Internet]. Ottawa (CA): CADTH; 2018 [updated 2019 Apr; cited 2020 Apr 4]. Available from: https://www.cadth.ca/resources/finding-evidence.
59. Hailey, D, Ohinmaa, A, Roine, R. Study quality and evidence of benefit in recent assessments of telemedicine. London (UK): SAGE Publications; 2004. p. 318–24.
60. Gagnon, M-P, Scott, R. Striving for evidence in e-health evaluation: Lessons from health technology assessment. J Telemed Telecare. 2005;11:S34–6.
61. Reardon, T. Research findings and strategies for assessing telemedicine costs. Telemed e-Health. 2005;11:348–69.
62. Shiell, A, Hawe, P, Gold, L. Complex interventions or complex systems? Implications for health economic evaluation. BMJ. 2008;336:1281–3.
63. Dávalos, ME, French, MT, Burdick, AE, Simmons, SC. Economic evaluation of telemedicine: Review of the literature and research guidelines for benefit-cost analysis. Telemed e-Health. 2009;15:933–48.
64. Rickles, D, Hawe, P, Shiell, A. A simple guide to chaos and complexity. J Epidemiol Community Health. 2007;61:933–7.