Electronic Informed Consent in Mobile Applications Research

Published online by Cambridge University Press:  01 January 2021

Abstract

The article covers electronic informed consent (eIC) from different dimensions so that practitioners might understand the history, regulation, and current status of eIC. It covers the transition of informed consent to electronic screens and the implications of that transition in terms of design, costs, and data analysis. The article explores the limits of regulation mandating eIC for mobile application research, and addresses some of the broader social context around eIC.

Type
Symposium Articles
Copyright
Copyright © American Society of Law, Medicine and Ethics 2020

Electronic Informed Consent (eIC): Regulatory Considerations

Electronic informed consent sits inside a multi-decade process in the United States to convert diverse types of physical, paper contracts into virtual documents that can be “marked” and “signed” electronically. For this paper I will use the term “electronic signatures” for this legally centered concept, to distinguish them from “digital signatures,” which are primarily concerned with cryptographic mechanisms.

Starting with commercial websites in the late 1990s, a variety of previously analog contracts moved online, with associated uncertainty as to the conditions under which the parties’ signatures were binding. The US Congress reacted by passing the Electronic Signatures in Global and National Commerce Act (“E-Sign”),1 signed into law by President Clinton on June 30, 2000. Since then, 47 US states, the District of Columbia, and the US Virgin Islands have passed a uniform piece of legislation implementing electronic signatures within their borders, with the remaining three states (NY, WA, and IL) passing non-uniform legislation with similar goals.2

Informed consent for research is documented in a written statement from the researcher to the participant, creating an interaction in which the participant can understand the research and make a recordable choice to enroll. Since at least the Camp Lazear yellow fever experiments,3 these documents have explained what the participant deserves to know and provided a place where they can sign.4 This structure became formalized after a series of atrocities at home and abroad led to international and national regulation of the conditions under which research may be conducted.5 Electronic informed consent (eIC) for research thus needs to satisfy the legal and technical requirements of a valid electronic signature, as well as fulfill the covenant between researcher and participant to disclose essential facts about the study.

The primary regulator of electronic signatures for research is the Food and Drug Administration (FDA), which describes a standard process for evaluating whether electronic records and signatures are “generally equivalent to a handwritten signature executed on paper.”6 eIC systems also need to capture and record dates of consent7 and provide a signed copy of the informed consent to the participant.8 But as a gateway to regulated research, the eIC is additionally subject to the regulations governing traditional informed consent, plus electronic-specific regulations above and beyond signature validity.
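
To make these capture-and-copy duties concrete, here is a minimal sketch, assuming a hypothetical record type of my own invention (the regulations prescribe no schema); it is written in Swift, the language of the ResearchKit framework discussed later, and shows a consent record that stores the date of consent and can be serialized back to the participant as their signed copy.

```swift
import Foundation

// Hypothetical eIC record -- field names are illustrative assumptions,
// not a regulatory schema. The regulations only require that the
// signature be attributable and dated, and that the participant
// receive a signed copy.
struct ElectronicConsentRecord: Codable {
    let participantID: String       // identity the system has verified
    let consentFormVersion: String  // ties the signature to the exact text shown
    let signedAt: Date              // date of consent, captured automatically
    let signatureImagePNG: Data     // drawn or typed-name signature rendering
}

// Produce the participant's signed copy as a file they can keep.
// A real system would deliver this by email or in-app document storage.
func signedCopyURL(for record: ElectronicConsentRecord) throws -> URL {
    let url = FileManager.default.temporaryDirectory
        .appendingPathComponent("consent-\(record.participantID).json")
    try JSONEncoder().encode(record).write(to: url)
    return url
}
```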

First, the eIC “must contain all elements of informed consent required by the Department of Health and Human Services (HHS) and/or FDA regulations”9 (see Box 1). Additionally, in interventional trials, the FDA requires that the participant also have the right to opt out of the electronic process and request a paper-based consent form. Specific eIC guidance indicates that “significant new findings developed during the course of the research that may affect the subject’s willingness to continue participation” must be provided electronically as well (a concept not unique to eIC, but one explicitly called out because the electronic medium uniquely enables it).10

Box 1 Mandatory Elements of Informed Consent

All regulation-compliant informed consent includes certain essential elements, presented in easily understandable language,11 as enumerated by the National Human Genome Research Institute:12

  • Voluntary Participation

  • Purpose of Research

  • Description of the Procedures

  • Risks

  • Confidentiality

  • Potential Benefits

  • Financial Considerations

  • Withdrawal from Research

  • Alternatives to Participation

  • Explanation of Resources Available in Case of Injury

  • Contact Information

eIC further illuminates where the traditional “human to human” informed consent conversation is difficult to translate into electronic form. For example, verifying the identity of study participants becomes far more complex in eIC, and FDA guidance mandates such verification (a mandate that diverges from general HHS guidance, which applies a risk-based approach to identity validation in minimal-risk research under the Common Rule).13 Another area where technology introduces complexity is the conversation between participant and study coordinator: the traditional interaction makes it easy to ask and answer questions, an opportunity that has to be purposefully designed into an electronic context. The joint FDA-OHRP14 guidance here notes that studies should have a method for questions to be asked and answered, but does not prescribe or require particular methods. The guidance also acknowledges that informedness is a complex topic, noting that eIC “may” use a variety of methods, such as multimedia and teach-back, to attempt to increase and/or assess informedness.

Of course, these regulations attach only to researchers who perform research in a regulated context. The Common Rule does not apply to the individual using eIC to study herself, or her family, or her community, or even a community unknown to her to which she has no ties and therefore may feel no ethical obligation. Nor does the FDA guidance.15 Thus, the only binding US federal consent regulations for the “unregulated” researcher using a mobile phone to study humans are those governing a valid electronic signature under E-Sign and the uniform state legislation. Interestingly, app store submission requirements can create a form of soft regulation requiring some form of consent akin to that required by statute, but only if the app stores find sufficient motivation to do so in the broader cultural and political environment.16

eIC: Design and Interface Considerations

Within eIC, all the elements of informed consent that once involved people speaking to people now require interfaces and designs on screens. In addition to added cost (the cost of in-person interaction having been implicit), these interfaces require very different skills to build and deploy than those found in traditional research settings: human-computer interaction (HCI), user experience (UX) design, and the translation of bioethical principles into software systems. The HHS and FDA regulations and guidance anticipated this transition and suggest directions for eIC interfaces: IRBs are explicitly tasked with reviewing everything from novel methods to study-material usability to version control over time.17

In the 2010s, stakeholders began to focus on these “human to human” functions. In 2013, the Electronic Data Methods Forum (a project funded by the Agency for Healthcare Research and Quality, through AcademyHealth) funded a Sage Bionetworks project18 in “portable” informed consent19 for data donation. That project evolved into a “participant centered design” group, aiming specifically to address questions of how to plausibly inform participants in a fully electronic informed consent process. To inform designs for eIC, the design work built on research showing that screen reading is often a “scanning” or “skimming” process compared to print reading, as well as on work demonstrating that users frequently sign complex legal agreements (e.g., copyright licenses, terms of service, privacy policies) without reading them.20

In its initial form at Sage, participant-centered design focused on the creation of screens that described the key concepts of the research study using small amounts of large-font text combined with semantically relevant iconography. An additional quiz module was added — questions covering key concepts like therapeutic misconception, voluntariness, and the goals of the research — to assess basic (yes/no) comprehension of the essential concepts conveyed (later versions used these questions as a teach-back method rather than an evaluation).21 Sage released an open source Participant-Centered Consent Toolkit22 (updated in 2018 as the Elements of Informed Consent Toolkit)23 comprising an icon library, annotated sample protocols, “walkthroughs” of eIC informing processes for use by in-house designers, and more. The first wave of apps to feature the consent process included Sage’s mPower app, developed to study Parkinson’s disease, which enrolled more than 16,000 participants.24

Apple incorporated significant elements of the initial toolkit into the consent-related templates of its ResearchKit framework in 2015.25 Google did not release its own research app framework for Android, but the community-led ResearchStack open source project replicates nearly all of ResearchKit’s key functions, including informed consent.26 The adoption and dissemination of the UX elements in these toolkits (and the implicit endorsement of at least Apple’s app store) may help incentivize researchers of all types to use design to communicate key IC concepts.
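
As a rough sketch of how those templates compose (identifiers, section text, and the quiz question here are invented for illustration, and ResearchKit API details vary across versions), a minimal consent task might chain icon-driven screens, a comprehension question in the teach-back spirit described above, and a review-and-signature step:

```swift
import ResearchKit

// Consent document built from ResearchKit's templated sections.
let document = ORKConsentDocument()
document.title = "Example Mobile Study"
let overview = ORKConsentSection(type: .overview)
overview.summary = "Short, large-font summary paired with standard iconography."
document.sections = [overview]  // a real study adds privacy, data use, etc.

let signature = ORKConsentSignature(forPersonWithTitle: "Participant",
                                    dateFormatString: nil,
                                    identifier: "participantSignature")
document.addSignature(signature)

// Step 1: icon-driven "visual consent" screens generated from the sections.
let visualStep = ORKVisualConsentStep(identifier: "visual", document: document)

// Step 2: a yes/no comprehension check, usable as quiz or teach-back.
let quizStep = ORKQuestionStep(identifier: "quiz")
quizStep.title = "Check your understanding"
quizStep.text = "Can you leave the study at any time, for any reason?"
quizStep.answerFormat = ORKBooleanAnswerFormat()

// Step 3: full consent text review plus electronic signature capture.
let reviewStep = ORKConsentReviewStep(identifier: "review",
                                      signature: signature,
                                      in: document)
reviewStep.reasonForConsent = "Tap Agree to join the study."

let consentTask = ORKOrderedTask(identifier: "consent",
                                 steps: [visualStep, quizStep, reviewStep])
// Present with ORKTaskViewController(task: consentTask, taskRun: nil).
```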

The move to eIC also opens up new fronts for researcher misbehavior in enrollment. User interfaces that abstract key elements of the consent form can also obscure or downplay key elements.27 Designers in consumer technology regularly use “dark patterns” to entice users to subscribe, spend money, share data, or otherwise make choices without full understanding.28 These patterns may correlate with higher “engagement” numbers29 — for unregulated researchers, higher enrollment numbers — and thus may be attractive to developers who do not need to worry about regulation, although recent legislation introduced in the US Senate attempts to close that loophole.30

These consumer technology policies operate under a completely different framework from informed consent for regulated research: the Fair Information Practice Principles (FIPPs). Although the FIPPs contain the word “consent,” its definition in the consumer context is quite divergent: consent is tied to “notice,” and there are few regulatory requirements for achieving either. Online services frequently require a simple consent via users clicking a button to indicate they have been given notice and agree to whatever terms the site or app proposes,35 despite evidence that even those who attempt to read the terms only skim the text.36

These app frameworks further accelerated the adoption of eIC, with more than 30 research apps33 launched in 2015-2017 by academic medical centers, nonprofit organizations, patient groups, and pharmaceutical companies.34 Nearly all of the first adopters of eIC frameworks conducted research in regulated contexts. However, given the cost and complexity of implementing eIC frameworks, it is possible that many future unregulated app developers will simply state clearly that they are not subject to either HHS or FDA regulations and implement the typical consent and privacy policies of consumer technology.

Box 2 What Is Essential or Informed?

There is little consensus on either what informedness means,31 or on what kinds of information are “essential” for participants to understand. Research has noted that expert groups reach one definition of “essential” information in an ideal context, but even the same expert group will redefine essentiality when faced with participants failing to correctly answer questions.32 Thus, even as the medium changes, and even as who does research opens up as never before, we still lack consensus within the research community about informedness or about what is essential for participants to “know.”

eIC in an Evolving Research Ecosystem

Apple and Google do not maintain formal lists of mobile research apps (or of whether those apps are in regulated research or not), but a 2016 review of consent in 24 of these mobile research apps found wide variation in how informed consent processes implement the eIC guidance, including an element not anticipated by the guidance — easy, rapid, and widespread sharing of data beyond the initial study.37 Data sharing can take many forms, from “open” data that can be downloaded and redistributed without restriction to a vast array of methods for collaboration,38 and it can be difficult to fully inform participants of risks given that many risks will emerge from the distribution itself. This onward sharing opens up another set of requirements for IRBs and researchers to contemplate, and for eIC to address as part of an ongoing relationship with the participant.

Liberalized sharing of data from electronically mediated research can also mean returning participant-level data to the participants themselves. This is a powerful, broad trend in medical data, driven by patients39 and increasingly supported by policy.40 This kind of data return in research is often held up as a form of returning value, although research indicates this is not always clear to all participants.41 Others have called for a deeper process of acknowledgement of participants in this context,42 but a larger study found that while respondents highly valued genetic results on medical response, disease prediction, and information about clinical trials and data use, the information they valued varied widely across demographic variables.43

The sharing potential for data from mobile devices introduces new complexities in how data are analyzed and, perhaps more importantly, re-analyzed. A multi-case study found that the text of consent documents does not always keep up with the technology, so that studies originally intending to use Global Positioning System (GPS) data tracking participant movement expanded to include Geographic Information System (GIS) data. Two cases of this particular expansion transformed data that could easily be obscured — by simply tracking total movement regardless of location — into data that could be tagged directly to elements on a map, vastly increasing the potential for re-identification, with no attempt to reconsent.44
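
The technical distinction is worth spelling out. A study can compute total movement without retaining any mappable coordinates, as in this minimal Swift sketch using CoreLocation (the reduction shown is an assumption about how such a study might obscure location, not the cited studies’ actual pipeline):

```swift
import CoreLocation

// Privacy-preserving reduction: collapse a GPS trace into a single
// total-distance figure and discard the mappable points themselves.
func totalDistanceMeters(of trace: [CLLocation]) -> CLLocationDistance {
    guard trace.count > 1 else { return 0 }
    return zip(trace, trace.dropFirst())
        .reduce(0) { running, leg in running + leg.1.distance(from: leg.0) }
}
```

Retaining the raw trace, or joining it to GIS map features, preserves exactly the point-by-point data that makes re-identification easy, which is why expanding from one to the other without reconsent changes the risk profile so sharply.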

Wearable devices in turn connect to mobile phones, and themselves represent further data collection technologies that may be leveraged for research. This can create a daisy chain of commercial contracts (e.g., Privacy Policies, Terms of Service, Terms of Use) that a participant has to accept in order to join a study, and the chain proliferates with every new measurement tool added over time. Each of these contracts holds the potential to complicate or counteract informed consent,45 and analysis of their terms indicates no meaningful commitments to privacy.46 These contracts are notably long, densely written,47 and rarely read.48 Various groups have reacted by releasing open source iconographic labels for privacy policies,49 nutrition labels for apps,50 standard design “patterns” for good privacy policies,51 and AI-enabled privacy policy interpreter software.52 However, the current state of practice seems little affected by these efforts to improve understandability, directly countervailing the informing requirement of informed consent.

As a further complication, there may even be reasons to obscure some forms of data collection and analysis in order to generate more accurate observations of “natural” behavior.53 These interacting pressures may increase the attractiveness of dark patterns to researchers who see significant drop-offs in mobile research studies after enrollment. And the desire to understand how participants, wearable technology, and the environment interact may further drive an expansion of dark patterns from people’s online behavior into people’s relationship to their increasingly digital environment. For example, a person’s mobile phone might send a signal to wireless sensors in a grocery store, or to a large display nearby, which might in turn change its behavior to deliver a personalized advertisement. The resulting dark patterns are explained by proxemics theory54 and may form new vectors for risk, harm, and particularly re-identification.

eIC: Growth and Issues of Scale

eIC is gaining adoption quickly. A 2017 industry survey projected 30% compound annual growth for eIC in the pharmaceutical and biotech industries, with more than 80% of the industry projected to implement eIC by 2020. Notably, 76% of survey respondents wanted to build in-house (i.e., without relying on external vendors), and 80% said they want to deploy eIC to replace or supplement traditional on-site informed consent.55

However, this growth in implementation at the industry level masks the complexity of enrolling and retaining participants, which has long-term implications for app design and thus for informed consent. While these are not novel risks and exist in traditional studies, eIC allows for the entrance of scale and speed far beyond traditional consent. For example, a traditional cohort study like the Framingham Heart Study has enrolled over 15,000 participants across six cohorts over 55 years.56 Leveraging eIC and an enrollment app, the AllofUs Research Program — explicitly designed along the Framingham format57 — enrolled 283,000 participants in 18 months.58 Early data on retention from these app-based studies show significant challenges: the Stanford MyHeart Counts study enrolled more than 50,000 participants, but only around 10% actually completed physical tasks, with a “marked dropoff in the initial 7 day monitoring period.”59 A more systematic review found similar drop-offs persisting across more than 100,000 mobile participants, with stronger engagement predicted by physician referral or payment than by any existing design approach.60

As with traditional consent, the ways in which study data are analyzed represent a vector for both benefits and harms. But as with enrollment, the sheer scale of systems that embed eIC creates different pressures on data analysis. Data analysis — or data science — as practiced in unregulated technology depends deeply on experimental processes, relabeled as “A/B testing,” by which companies study their customers.61 But, as with unregulated mobile research, much data science falls largely outside traditional biomedical ethics, and some data science practitioners choose to reject regulation outright.62 Human psychology also comes into play here: research reports that at least some portion of the population finds being part of an experiment comparing two policies worse than having either policy imposed alone, untested.63 eIC thus sits inside a larger cultural landscape of data science that is in constant flux and is subject to the rapid evolution of how data are analyzed outside the clinical context.
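
Part of why such testing pervades consumer software is that it costs almost nothing to implement; a deterministic bucketing function like the sketch below (names and logic invented for illustration) silently assigns every user to an experimental arm with no consent interaction at all:

```swift
// Deterministic A/B assignment: a stable hash of a user ID picks the arm.
// No notice, no signature, no opt-out -- the contrast with eIC is the point.
enum Arm { case control, variant }

// djb2-style string hash; stable across runs, unlike Swift's seeded Hasher.
func stableHash(_ s: String) -> UInt64 {
    s.utf8.reduce(UInt64(5381)) { ($0 << 5) &+ $0 &+ UInt64($1) }
}

func assignArm(userID: String, experiment: String) -> Arm {
    stableHash("\(experiment):\(userID)") % 2 == 0 ? .control : .variant
}
```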

Beyond data science, eIC must also grapple with the larger environment. Data collected from Facebook powered a variety of misuses in the 2016 election,64 increasing public awareness of and sensitivity to data science. But this public awareness is deeply contextual; studies of tweets about the scandals show that in countries with high levels of “power distance” there is greater acceptance of authority, and more blame placed on individuals, than in countries with low power distance.65 Data are also being used, and arguably misused, to automate hiring practices, with known racist and misogynist outcomes due to legacy training data.66 Immigration enforcement in the United States is actively seeking social media and other data to target undocumented immigrants,67 and aims to collect genetic data from detained immigrants.68 Meaningful eIC for both regulated and unregulated research should describe these risks and the processes and policies in place to mitigate them.

eIC embedded into unregulated mobile research also risks interacting with the long-running use of the internet to profit from false health information. From nearly the beginning of the web through to today,69 hucksters have used technology to advertise fake cures for cancer and other diseases.70 It is not a big jump from using Facebook to using an eIC framework built on standard apps to “healthify” what is actually a commercial data grab, or to marketing a look-alike unregulated research app to support false health claims.

The expansion of communication to screens in eIC represents a challenge for consent anticipated in the FDA/HHS guidance.71 How might one best describe a research study so that a prospective participant can make an informed choice, in the absence of a research coordinator? This textual challenge interacts with the technical requirements for eIC, resulting in a new set of costs and skills needed to launch a study. Of perhaps greatest concern, there is an explicit risk of transferring already familiar dark patterns of electronic engagement into eIC, weighing participant enrollment over participant informedness. Regulated researchers, at least, face the intersecting incentives of research institutions whose norms and structures create “strong incentives to protect research participants from harm and to engage with potential participants to develop trust regardless of what regulations require.”72 However, when unregulated researchers apply a model derived from modern consumer A/B testing, like Facebook’s, in an app that looks and feels like clinical research, the essential drive to inform may be lost from eIC.73

Acknowledgments

This article is supported by the AllofUs Research Program and the Helmsley Charitable Trust. Thanks to the Sage Bionetworks Governance team, particularly Megan Doerr, for careful review of and comments on early drafts of this article.

Research on this article was funded by the following grant: Addressing ELS Issues in Unregulated Health Research Using Mobile Devices, No. 1R01CA20738-01A1, National Cancer Institute, National Human Genome Research Institute, and Office of Science Policy and Office of Behavioral and Social Sciences Research in the Office of the Director, National Institutes of Health, Mark A. Rothstein and John T. Wilbanks, Principal Investigators.

Footnotes

The author has no conflicts of interest to disclose.

References

The Electronic Signatures in Global and National Commerce Act (E-Sign Act), available at <https://www.fdic.gov/regulations/compliance/manual/10/x-3.1.pdf> (last visited February 14, 2020).
Reed, W., Carroll, J., and Agramonte, A., “The Etiology of Yellow Fever: An Additional Note,” Journal of the American Medical Association 36, no. 7 (1901): 431-440.
21 C.F.R. § 11.1(a).
21 C.F.R. § 50.27(a).
21 C.F.R. § 50.27(a).
45 C.F.R. § 46.116 and 21 C.F.R. § 50.25.
45 C.F.R. § 46.116(b)(5) and 21 C.F.R. § 50.25(b)(5).
45 C.F.R. § 46.116 and 21 C.F.R. § 50.20.
21 C.F.R. § 11.100(b).
Use of Electronic Informed Consent in Clinical Investigations — Questions and Answers, available at <https://www.fda.gov/regulatory-information/search-fda-guidance-documents/use-electronic-informed-consent-clinical-investigations-questions-and-answers> (last visited February 14, 2020).
Clayton, E.W., “The Unbearable Requirement of Informed Consent” (2019), a comment on “Exploring Understanding of ‘Understanding’: The Paradigm Case of Biobank Consent Comprehension,” American Journal of Bioethics 19, no. 5 (2019): 6-18.
45 C.F.R. § 46.115; 21 C.F.R. § 56.115.
Collaborative Projects — EDM Forum, available at <https://www.edmforumresearchportal.org/edmhome/collaborate/collaborativeprojects> (last visited October 29, 2019).
Wilbanks, J., “Portable Approaches to Informed Consent and Open Data,” Privacy, Big Data, and the Public Good: Frameworks for Engagement 1 (2014): 234-252.
Wilbanks, J., “Design Issues in E-Consent,” Journal of Law, Medicine & Ethics 46, no. 1 (2018): 110-118.
Doerr, M., Suver, C., and Wilbanks, J., “Developing a Transparent, Participant-Navigated Electronic Informed Consent for Mobile-Mediated Research,” available at <https://ssrn.com/abstract=2769129> (last visited February 14, 2020).
GitHub — Sage-Bionetworks, “PCC-Toolkit,” available at <https://github.com/Sage-Bionetworks/PCC-Toolkit> (last visited February 14, 2020).
Elements of Informed Consent, available at <https://sage-bionetworks.org/tools_resources/elements-of-informed-consent/> (last visited February 14, 2020).
Bot, B.M. et al., “The mPower Study, Parkinson Disease Mobile Data Collected Using ResearchKit,” Scientific Data 3 (2016): 1-9.
Developer Human Interface Guidelines, available at <https://developer.apple.com/design/human-interface-guidelines/researchkit/overview/introduction/> (last visited February 14, 2020).
ResearchStack, available at <http://researchstack.org/> (last visited February).
See Wilbanks, supra note 19.
Gray, C.M. et al., “The Dark (Patterns) Side of UX Design,” in Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (ACM, 2018): 534.
Senators Introduce Bipartisan Legislation to Ban Manipulative “Dark Patterns,” available at <https://www.fischer.senate.gov/public/index.cfm/2019/4/senators-introduce-bipartisan-legislation-to-ban-manipulative-dark-patterns> (last visited October 29, 2019).
Sand, K., Stein, K., and Loge, J.H., “The Understanding of Informed Consent Information — Definitions and Measurements in Empirical Studies,” AJOB Primary Research 1, no. 2 (2010): 4-24.
Beskow, L.M. and Weinfurt, K.P., “Exploring Understanding of ‘Understanding’: The Paradigm Case of Biobank Consent Comprehension,” American Journal of Bioethics 19, no. 5 (2019): 6-18.
ResearchKit Apps and Studies Launched in 2016, available at <http://blog.appliedinformaticsinc.com/researchkit-appsand-studies-launched-in-2016/> (last visited October 29, 2019).
FocalView — Redefining Ophthalmology Clinical Research, available at <http://researchkit.org/blog.html#article-32> (last visited October 29, 2019).
Obar, J.A. and Oeldorf-Hirsch, A., “The Biggest Lie on the Internet: Ignoring the Privacy Policies and Terms of Service Policies of Social Networking Services,” Information, Communication & Society (2018): 1-20.
Steinfeld, N., “I Agree to the Terms and Conditions: (How) Do Users Read Privacy Policies Online? An Eye-tracking Experiment,” Computers in Human Behavior 55, Part B (2016): 992-1000.
Moore, S. et al., “Consent Processes for Mobile App Mediated Research: Systematic Review,” JMIR mHealth and uHealth 5, no. 8 (2017): e126.
See Doerr, supra note 21.
Benham-Hutchins, M. et al., “‘I Want to Know Everything’: A Qualitative Study of Perspectives from Patients with Chronic Diseases on Sharing Health Information During Hospitalization,” BMC Health Services Research 17, no. 1 (2017): 529.
A Big Step Toward Giving Patients Control Over Their Health Care Data, available at <https://hbr.org/2019/03/a-big-step-toward-giving-patients-control-over-their-health-care-data> (last visited February 14, 2020).
Wilkins, C.H. et al., “Understanding What Information Is Valued by Research Participants, and Why,” Health Affairs 38, no. 3 (2019): 399-407.
See Clayton, supra note 15.
Wilkins et al., supra note 41.
Nebeker, C., Linares-Orozco, R., and Crist, K., “A Multi-Case Study of Research Using Mobile Imaging, Sensing and Tracking Technologies to Objectively Measure Behavior: Ethical Issues and Insights to Guide Responsible Research Practice,” Journal of Research Administration 46, no. 1 (2015): 118-137.
Schairer, C.E., Rubanovich, C.K., and Bloss, C.S., “How Could Commercial Terms of Use and Privacy Policies Undermine Informed Consent in the Age of Mobile Health?” AMA Journal of Ethics 20, no. 9 (2018): 864-872.
Greig, P. and Irvine, J., “Privacy Implications of Wearable Health Devices,” in Proceedings of the 7th International Conference on Security of Information and Networks (ACM, 2014): at 117.
Reidenberg, J.R. et al., “Disagreeable Privacy Policies: Mismatches Between Meaning and Users’ Understanding,” Berkeley Technology Law Journal 30, no. 1 (2015): 39-88.
See Obar, supra note 35.
“Privacy Icons: Alpha Release,” available at <http://www.azarask.in/blog/post/privacy-icons/> (last visited October 29, 2019).
Gropper, A., “Patient Privacy Rights Information Governance Label,” August 19, 2019, available at <https://ssrn.com/abstract=3439701> (last visited February 14, 2020).
“Privacy Toolkit,” available at <https://privacytoolkit.sagebionetworks.org/> (last visited February 14, 2020).
“AI-Powered Privacy Policies,” available at <https://pribot.org/> (last visited February 14, 2020).
Vaidhyanathan, S. and Bulock, C., “Knowledge and Dignity in the Era of ‘Big Data,’” The Serials Librarian 66, no. 1-4 (2014): 49-64.
Greenberg, S. et al., “Dark Patterns in Proxemic Interactions: A Critical Perspective,” in Proceedings of the 2014 Conference on Designing Interactive Systems (ACM, 2014): 523-532.
“CRF Health: eConsent Adoption to Reach 82% by 2020,” available at <https://www.outsourcing-pharma.com/Article/2017/03/16/eConsent-adoption-to-reach-82-by-2020-CRF-Health> (last visited February 14, 2020).
“About the Framingham Heart Study Participants,” available at <https://www.framinghamheartstudy.org/about-the-fhs-participants/> (last visited February 14, 2020).
“The Director of the NIH Lays Out His Vision of the Future of Medical Science,” available at <https://time.com/5709207/medical-science-age-of-discovery/> (last visited February 14, 2020).
“Data — AllofUs Research Hub,” available at <https://www.researchallofus.org/data/> (last visited February 14, 2020).
McConnell, M.V. et al., “Feasibility of Obtaining Measures of Lifestyle From a Smartphone App: The MyHeart Counts Cardiovascular Health Study,” JAMA Cardiology 2, no. 1 (2017): 67-76.
Pratap, A. et al., “Indicators of Retention in Remote Digital Health Studies: A Cross-Study Evaluation of 100,000 Participants,” available at <https://arxiv.org/pdf/1910.01165> (last visited February 14, 2020).
Meyer, M.N., “Ethical Considerations When Companies Study — and Fail to Study — Their Customers,” in The Cambridge Handbook of Consumer Privacy (2018): 207.
Metcalf, J. and Crawford, K., “Where Are Human Subjects in Big Data Research? The Emerging Ethics Divide,” Big Data & Society 3, no. 1 (2016): 1-14.
Heck, P.R. et al., “Sometimes People Dislike Experiments More than They Dislike Their Worst Conditions: Within-Subjects Evidence for ‘Experiment Aversion’ and the A/B Effect,” PsyArXiv (2019), available at <https://psyarxiv.com/jmxgc> (last visited February 14, 2020).
Cadwalladr, C. and Graham-Harrison, E., “Revealed: 50 Million Facebook Profiles Harvested for Cambridge Analytica in Major Data Breach,” The Guardian (March 17, 2018).
González, F. et al., “Global Reactions to the Cambridge Analytica Scandal: An Inter-Language Social Media Study,” WWW ’19 Companion, available at <https://faculty.washington.edu/aragon/pubs/LA_WEB_Paper.pdf> (last visited February 14, 2020).
Ajunwa, I., Crawford, K., and Schultz, J., “Limitless Worker Surveillance,” California Law Review 105, no. 3 (2017): 735-776.
Funk, M., “How ICE Picks Its Targets in the Surveillance Age,” The New York Times Magazine, available at <https://www.nytimes.com/2019/10/02/magazine/ice-surveillance-deportation.html> (last visited February 14, 2020).
Dickerson, C., “U.S. Government Plans to Collect DNA From Detained Immigrants,” New York Times, available at <https://www.nytimes.com/2019/10/02/us/dna-testing-immigrants.html> (last visited February 14, 2020).
Geekwire, “FTC Files First-Ever Charges Against Company Accused of Paying for Fake Amazon Reviews,” available at <https://www.geekwire.com/2019/ftc-files-first-ever-charges-company-accused-paying-fake-amazon-reviews/> (last visited February 14, 2020).
“New York Arm-wrestling Legend and His Mom Arrested for Selling ‘Apricots From God’ as Bogus Cancer Cure,” The Washington Post, available at <https://www.washingtonpost.com/nation/2019/10/24/jason-vale-apricot-seeds-cancer-cure-arrest-arm-wrestling/> (last visited February 14, 2020).
Boyd, D. and Crawford, K., “Critical Questions for Big Data: Provocations for a Cultural, Technological, and Scholarly Phenomenon,” Information, Communication & Society 15, no. 5 (2012): 662-679.
See Clayton, supra note 15.
Fiske, S.T. and Hauser, R.M., “Protecting Human Research Participants in the Age of Big Data,” Proceedings of the National Academy of Sciences of the United States of America 111, no. 38 (2014): 13675-13676.