
There Oughta Be a Law: When Does(n’t) the U.S. Common Rule Apply?

Published online by Cambridge University Press:  01 January 2021

Abstract

Using mobile health (mHealth) research as an extended example, this article provides an overview of when the Common Rule “applies” to a variety of activities, what might be meant when one says that the Common Rule does or does not “apply,” the extent to which these different meanings of “apply” matter, and, when the Common Rule does apply (however that term is defined), how it applies.

Symposium Articles

Copyright © American Society of Law, Medicine and Ethics 2020

I. Introduction

Many people have a sense that, in the modern era, “human subjects research” is, happily, well regulated. That sense might well include the belief that the law ensures that research with humans always involves prospective approval by an ethics committee called an Institutional Review Board (IRB) and the informed consent of participants. The reality is more complicated. Strictly speaking, the U.S. Federal Policy for the Protection of Human Subjects — better known as the Common Rule1 — applies only to research that is conducted or funded by (most parts of) the federal government. Even then, the Common Rule does not regulate all unexpected or potentially risky uses of human data or all interactions or interventions with humans. Nor does it require all human subjects research to be approved by an IRB or to proceed only with participant consent. As a result, especially with respect to some kinds of activities, both the portion of what one might think of as human subjects research that is regulated and the gap between “regulated” and “unregulated” research are smaller than one might expect.

Using the emerging field of mobile health (mHealth) research as an extended example, this article provides an overview of when the Common Rule “applies” to a variety of activities, what might be meant when one says that the Common Rule does or does not “apply” and the extent to which these different meanings matter, and, when the Common Rule does apply (however that term is defined), how it applies.

II. When Does the Common Rule Apply?

A. Two Ways in Which the Common Rule Can “Apply”

1. direct application to federally-funded research

By its terms, the Common Rule applies to all non-“exempt” “research” involving “human subjects” that is conducted or funded by any of the 18 federal departments and agencies that are either signatories to the Common Rule or follow those regulations pursuant to presidential Executive Order (hereinafter, Common Rule departments).2 Any institution, domestic or foreign, that is “engaged in” non-exempt human subjects research funded by a Common Rule department must submit to the Office for Human Research Protections (OHRP) a Federalwide Assurance (FWA). The FWA is the means by which such an institution provides written assurance to the Common Rule department that it will comply with the Common Rule. The FWA is not study-specific; once executed, it constitutes a promise by the institution (hereinafter an “assured institution”) to apply the Common Rule to all non-exempt human subjects research in which the institution is engaged that is supported by a Common Rule department.

2. proposals to extend and contract direct application to non-federally funded research

For years — but apparently not for much longer (see below) — the FWA form has invited domestic institutions to voluntarily extend the scope of their commitment to cover all non-exempt human subjects research in which the institution is engaged, regardless of the source of funding for the research (if any).3 Among members of the regulated community, this is known as “checking the box” and the large majority of assured institutions have historically done so,4 although with a downward trend over time.5 Assured institutions that agree to expand their commitment and are found to be noncompliant are subject to OHRP’s compliance oversight authority and face the same penalties (discussed below), regardless of whether the study in question is funded by a Common Rule department or not. As a result, although checking the box is voluntary, once checked, the Common Rule effectively “applies directly” to all non-exempt human subjects research in which the institution is engaged.

That said, this voluntary policy captures neither non-federally funded research conducted by assured institutions that decline to check the box nor human subjects research conducted by non-assured institutions, such as many non-profits and industry organizations. Over the decades, various commissions and commentators have lamented the fact that the Common Rule applies directly only to research conducted or supported by Common Rule departments.6 From the perspective of an individual who might be harmed by research, after all, it hardly matters who funded the harm.

During the years-long process of revising the Common Rule, federal regulators explored a compromise strategy of requiring assured institutions to apply the Common Rule to all non-exempt human subjects research in which they are engaged, regardless of funding. In other words, once an institution accepts any research funds from a Common Rule department, the Common Rule would apply to all non-exempt human subjects research in which that institution was engaged. This proposal was included in a 2011 advance notice of proposed rulemaking (ANPRM),7 but officially submitted public comments were generally less than enthusiastic. In particular, many argued that this dramatic expansion of coverage ignored the ANPRM’s stated goal of balancing increased protections of participants with reduced burdens on researchers, since the expansion would apply to both low- and high-risk research. As a result, the proposal was dropped.8

Instead, in a 2015 notice of proposed rulemaking (NPRM), regulators proposed extending the Common Rule to a narrower class of non-federally funded research that, at first blush, would seem to include only high-risk research. Specifically, the NPRM proposed that the Common Rule be revised to apply to all “clinical trials” in which domestic assured institutions were engaged, regardless of funding, unless the trials were already subject to FDA regulation.9 At the same time, the NPRM announced regulators’ intent to enact a non-regulatory change to the FWA forms: no longer would institutions have the option of checking the box.10 Thus, the NPRM proposed that assured institutions be required to apply the Common Rule to clinical trials, but, on the other hand, the federal government would no longer have compliance oversight of any other kinds of non-federally funded research. Institutions would remain free to apply the Common Rule to whatever research they like, and to impose whatever penalties for noncompliance they like, as matters of institutional policy (see sections 3 and 4, below).

The NPRM, however, defined “clinical trials” exceptionally broadly to include, essentially, all interventional research.11 Again, the “slim majority” of public comments opposed the proposal.12 Regulators conceded commenters’ claim that the proposal would have failed to accomplish regulators’ goal of only “cover[ing] the most risky types of research … given [that] the definition of ‘clinical trial’ … encompassed research that would pose no more than minimal risk to subjects.”13 Other commenters cast doubt — not without reason14 — on whether the Common Rule’s enabling statute, the Public Health Service Act, permitted regulators to extend the Common Rule to non-federally funded research at all15 and argued that any requirement that nonfederally-funded research be regulated must come from Congress.16

Thus, when regulators announced the final rule in 2017, this narrower proposal, too, had been dropped, at least for the time being.17 At the same time, however, regulators announced that they still plan to discontinue the portion of the FWA process in which assured institutions are invited to extend the Common Rule (and OHRP oversight) to all of their research.18 In the near future, then, the only activity to which the Common Rule will directly apply is non-exempt human subjects research funded or conducted by a Common Rule department.

3. indirect application

Although the Common Rule has limited direct application, there are several other ways in which it meaningfully applies indirectly. First, some states have laws that impose some or all of the Common Rule on some or all human subjects research conducted in their jurisdiction.19

Second, virtually all academic institutions apply the Common Rule to all human subjects research in which they are engaged as a matter of institutional policy, even if they do not officially check the box.20 Such institutional policy is often incorporated by reference into employment contracts, making compliance a matter of employment law.

Third, non-academic research institutions (e.g., nonprofits, industry) sometimes submit their human subjects research to independent IRBs for review, either as a matter of institutional policy or on an ad hoc basis. For instance, at least some of the research conducted by 23andMe, Fitbit, and Microsoft is reviewed by independent IRBs.21 Companies including Facebook, Google, Microsoft, and Fitbit have instead — or in addition — established their own internal review bodies,22 although, among those bodies, some look more like a Common Rule IRB than others and some apply rules and norms that adhere more closely to the Common Rule than others.23 The Consumer Privacy Protection Act of 201524 similarly would have protected consumer data while permitting research and other non-contextual uses of consumer data if a “privacy review board” reviewed the proposed use and determined that it met certain criteria,25 but the bill did not make it out of the Senate Committee on the Judiciary.

Finally, sometimes gatekeepers downstream from the conduct of research make something valuable to researchers conditional on IRB oversight. For instance, journals often require researchers who wish to publish their results to indicate that an IRB either reviewed the reported research or determined that it was exempt or constituted non-human subjects research.26 Similarly, Apple requires that apps conducting “health-related human subject research” be “approv[ed by] an independent ethics review board” and evidence of this review must be made available on request.27 Apple also requires that such apps “must obtain consent from participants or, in the case of minors, their parent or guardian,” and the company specifies several elements of information that must be disclosed in the consent that resemble the Common Rule elements.28

4. the consequences of direct versus indirect application of the common rule

Whether one wishes to deem research to which the Common Rule indirectly applies “unregulated”29 depends on what aspects of regulation, and what non-regulatory incentives, one finds meaningful. The penalties for noncompliance when the Common Rule merely indirectly applies might be greater than one might imagine, while those for noncompliance when the Common Rule does directly apply might be less than one might expect, for several reasons.

First, the Common Rule affords no private right of action for research participants who suffer any form of research-related injury — whether a physical injury or a “dignitary harm” from being enrolled in research without voluntary, informed consent — even when the research is conducted or supported by a Common Rule department (i.e., when the Common Rule applies directly).30

Instead, enforcement of compliance with the Common Rule rests with OHRP, which has the statutory responsibility for developing a process for receiving allegations of noncompliance and “taking appropriate action.”31 Pursuant to that authority, when OHRP receives “substantive written allegations or indications of noncompliance” with respect to research under its jurisdiction — i.e., non-exempt human subjects research to which the Common Rule directly applies or in which an assured institution that checked the box is engaged — it may, in its discretion, open a for-cause investigation.32 OHRP may also, instead, “choose to use other mechanisms” to respond to such allegations.33 When OHRP does open noncompliance investigations, they are primarily “paper investigations”: OHRP and the relevant institution exchange a series of letters in which OHRP describes the allegations and the institution provides written responses. On-site visits are relatively uncommon with for-cause investigations. OHRP then issues one or more determination letters specifying any instances of noncompliance. A finding of noncompliance can trigger (sometimes with input from other agencies) any of several responses (in increasing order of severity):

  • Corrective action plan: the institution is required to develop and implement a corrective action plan, such as providing additional education or training of IRB members or staff34;

  • Restricted FWA: OHRP restricts or places conditions on the institution’s approved Federalwide Assurance, such as requiring quarterly reports to OHRP, requiring prior OHRP approval of some or all research subject to the FWA, or suspending a particular study until corrective actions have been taken;

  • Suspended FWA: all research conducted under the FWA is suspended until the FWA is reinstated;

  • Suspended institution or investigator: an institution or an investigator is temporarily suspended or permanently removed from participation in specific studies, or grant study sections are notified of an institution’s or an investigator’s past noncompliance prior to review of new grants;

  • Government-wide debarment: in order to protect the public interest, the institution or one or more of its investigators is debarred from receiving any federal research funds.35

Final determination letters are published on OHRP’s website.36 OHRP also conducts an average of two or three not-for-cause oversight evaluations of institutional human research protection programs (HRPPs) per year, selecting institutions on the basis of a variety of factors.37

Although very serious sanctions for noncompliance with the Common Rule — such as temporarily suspending all FWA-covered research at an institution and government-wide debarment — are possible, by all appearances, by far the most common responses to noncompliance investigations are corrective action plans, which often amount to revision of an institution’s standard operating procedures or remedial education or training of investigators or IRB staff.

Moreover, as noted above, when presented with written allegations of noncompliance, OHRP retains discretion whether to open an investigation at all. Between 2000 and 2015, OHRP received an average of 123 complaints per year, but while it conducted 60 compliance evaluations in 2000, it conducted only an average of 5 evaluations per year from 2010 to 2015.38 That trend has continued to date.39 Similarly, assured institutions are required by regulation to report certain incidents, such as adverse events, to OHRP. Those incidents tripled between 2000 and 2015, but of the several hundred per year OHRP reviewed, it responded by initiating a compliance evaluation in “only a few cases,” such as when a research participant died.40

It is not possible to know whether the relatively low number of opened investigations reflects a lax view of compliance oversight by OHRP without reviewing the allegations the agency receives, which are not public.41 However, OHRP has provided other reasons for this decline, including its practice of formally investigating only the lead institution in multiple-site studies and its use of alternative mechanisms, such as informally resolving complaints and approving corrective actions without opening an investigation or issuing a determination letter.42 Moreover, in recent years, OHRP has come to view itself as more of a policy body than a compliance body, with any compliance evaluations that are conducted and the resulting published determination letters seen as educational opportunities for the research community at large to improve research oversight.43 As a result, OHRP has decided “to initiate fewer compliance evaluations both to better leverage its limited resources and to focus the evaluations on broad policy issues in protections for human subjects.”44 Still, influential commenters and bodies have argued that OHRP’s compliance efforts are subpar and in fact do not provide meaningful protection to research participants.45

Conversely, sanctions for noncompliance with the Common Rule, even when it applies only indirectly, can be substantial and have a significant deterrent effect. HRPPs can and do impose on researchers most of the sanctions that OHRP is authorized to impose, including requiring investigators to develop and implement corrective action plans and placing additional constraints on, or suspending, particular studies or investigators. Institutions have no power to literally issue government-wide debarments, of course, but they do have the power to remove an investigator’s ability to apply for any external research grants or to conduct any human subjects research, which has more or less the same effect, and, unlike OHRP, they have the authority to terminate the employment of a repeat offender.

As for journals’ common requirement of IRB review or determination regarding submitted work, publications, too, are of obvious importance to researchers who work in academic settings; indeed, they are the coin of that realm. But they are also important to, e.g., data scientists who work in industry but often have careers that span academia and industry. And app developers who wish to reach iOS users must distribute their apps through Apple’s App Store.

In any case, to determine whether the Common Rule applies — whether directly or indirectly — both the actor and the activity must be covered.

B. Is the Actor Covered?

1. actors who “engage” common rule institutions

As explained above, the FWA is a contract between the federal government and an institution, not an individual researcher.46 In particular, it is a contract between the government and an institution that is “engaged in research” by virtue of what its employees or agents do.47 Even when the Common Rule applies only indirectly (i.e., a research study is not federally funded), most IRBs consider themselves to have jurisdiction over the study only if the institution is “engaged;” that is, they adopt the Common Rule’s jurisdictional provision. For the Common Rule to “apply,” then, a non-exempt human subjects research study generally must have a sufficient nexus to at least one institution that has committed to the regulations (directly or indirectly).

The Common Rule does not define what it means for an institution to be “engaged in research.” However, OHRP guidance provides a non-exhaustive list of research-related activities that an institution’s employees or agents may participate in without thereby “engaging” that institution in research.48 Human subjects research involves a trajectory of activities, from conception of the research question(s) and study design to (sometimes) consenting participants and data collection to dissemination of results, and all of this activity is often fragmented across multiple sites. Under OHRP guidance, employees or agents of one “Common Rule institution” can participate in some aspects of that trajectory without triggering their IRB’s jurisdiction.

For example, recruiting prospective participants (but stopping short of facilitating the consent process) does not, itself, engage one’s institution in research.49 Nor does an employee who releases identifiable, private data to a researcher elsewhere engage her own institution in research (receiving such data, on the other hand, will tend to engage that institution in research).50 Nor does co-authoring a paper reporting the results of a study, without more, engage the co-author’s institution in that research.51 To be clear, this fragmentation only goes so far; for any research study, at least one institution must be engaged. But that institution might be one that has not committed to the Common Rule, directly or indirectly. And so it is possible that an employee of a Common Rule institution can participate in — indeed, accelerate, legitimate, and be a but-for cause of — non-exempt mHealth (or other) research without the Common Rule ever being triggered.52

2. citizen scientists and self-experimenters

On the other hand, the Common Rule does (potentially) apply to some actors whom one might think would escape its scope. Citizen scientists are — almost by definition — unaffiliated with traditional research institutions and not historically recognized by traditional funders. For that reason, the Common Rule is unlikely to apply to their research directly. Nor would voluntary adoption of the Common Rule by a group of citizen scientists carry the same sanctions as when the Common Rule indirectly applies in an academic or even corporate setting. However, assuming that a citizen scientist or a citizen science organization wanted to adopt the Common Rule, it would, in fact, almost certainly cover typical citizen science projects, including N of 1 studies (including, but not limited to, “self-experimentation”) and studies in which the research “subjects” are all also researchers.

Although the Common Rule is silent about scenarios in which researchers study themselves, there is little reason to believe that self-study per se falls outside of the Common Rule. The regulations provide a definition of “human subject” (discussed below) and no part of that definition hinges on the relationship of the researcher to the participant, the presence or absence of power or information asymmetries, or the parties’ identities. All of these things can, of course, affect whether research participants are “vulnerable” and in need of additional protections under the Common Rule or the additional Subparts of Part 46 of the Code of Federal Regulations,53 but not whether they meet the Common Rule’s basic definition of “human subject,” which is a threshold criterion. Indeed, several IRBs have explicit policies clarifying that researchers are indeed required to obtain IRB approval before studying themselves.54

C. Is the Activity Covered?

Although the Common Rule has its roots in scandals involving biomedical and behavioral research and although its enabling statute limits the Common Rule’s scope to biomedical and behavioral research,55 the regulations themselves are, for better or worse, almost perfectly agnostic about the topic or discipline of research. What matters, instead, is whether the activity meets the Common Rule’s definitions of both “research” and “human subjects” and, if so, whether that human subjects research is nevertheless “exempt.”

1. research

In order to be covered by the Common Rule, an activity must constitute “a systematic investigation, including research development, testing, and evaluation, designed to develop or contribute to generalizable knowledge.”56 The Common Rule unhelpfully provides no definitions of, or further clarification about, terms such as “systematic,” “investigation,” and “generalizable knowledge,” which can be understood in different ways.

This ambiguous definition of research from the late 1970s has not fared well under the burden of the modern learning health system, which, although itself not crisply defined, broadly seeks to routinely embed various “learning activities” (including data collection and analysis and both observational and experimental methods) into the practices of medicine and health care delivery in order to continuously improve those practices.57 The Common Rule’s definition of “research” was explicitly meant to be distinguished from “practice” (which is not defined in the Common Rule). But the modern learning health system explicitly seeks to integrate learning and practice, making it unclear how — or whether — the Common Rule applies to various learning activities.58 As one prominent research ethicist described the current state of affairs: “Nobody knows, anymore, what is permitted, forbidden, required, or optional. There is serious debate going on about what should be permitted and what should not.”59

OHRP has posted FAQs to its website on various subjects which “provide guidance that represents OHRP’s current thinking ont [sic] hese [sic] topics and should be viewed as recommendations, unless specific regulatory requiremtns [sic] are cited.”60 An FAQ on quality improvement (QI) activities states that “most quality improvement efforts are not research subject to the [Common Rule]. However, in some cases quality improvement activities are designed to accomplish a research purpose as well as the purpose of improving the quality of care, and in these cases the [Common Rule] may apply.”61 Specifically, the FAQs restate the Common Rule’s jurisdictional provision that if an activity constitutes non-exempt human subjects research in which an assured institution is engaged, the Common Rule applies. In those cases, however, the FAQs note that “the regulations provide great flexibility in how the regulated community can comply.”62

The OHRP FAQs do attempt to distinguish “pure” QI activities from those that include elements of non-exempt human subjects research, but the FAQs are controversial.63 In any case, the FAQs are not binding, even when an institution is committed, directly or indirectly, to the Common Rule. At least some IRBs have determined that rigorous learning health system activities constitute quality improvement activities rather than human subjects research to which the Common Rule might otherwise apply (directly or indirectly). For instance, employees of NYU Langone Health recently described, in the pages of The New England Journal of Medicine, ten “randomized quality-improvement projects” conducted under the auspices of “turn[ing]” the system “into a learning health system.”64 These field experiments or A/B tests, which were registered at ClinicalTrials.gov, “fall[] squarely into the challenging gray zone of quality improvement versus research.”65 They were ultimately conducted without IRB review, following an IRB determination that they constituted QI rather than human subjects research. That determination was apparently made because the QI activities:

are conducted by persons involved in the care of patients for the specific purpose of improving care at our local institution, positive results are promptly incorporated into practice, the projects involve minimal risk, the lessons we learn are likely to be specific to our culture and workflow and are not necessarily generalizable to other institutions, and the projects are intended to increase the provision or uptake of recommended practices to improve care or avoid harm.66

Nor were patients or providers permitted to opt out of these projects, “because this is largely not feasible for wholesale systems interventions, nor is it ethically required for quality-improvement work.”67

Importantly, health systems are not the only entities to take advantage of how cheap and easy — and, arguably, often ethically imperative68 — it has become to collect and analyze data or to use A/B testing to ensure that existing or contemplated policies and practices work as intended. Mobile health app owners, for instance, might engage in a variety of “learning activities” designed to improve or assure the quality of their app (rather than to contribute to generalizable knowledge) that an IRB could find fall outside the scope of the Common Rule, even if those regulations would otherwise directly or indirectly apply to the app owner.

2. human subjects

Even if an activity constitutes “research” under the Common Rule, it must also involve at least one “human subject,” who is “a living individual about whom an investigator”:

  • Obtains information … through intervention or interaction with the individual, and uses, studies, or analyzes the information …; or

  • Obtains, uses, studies, analyzes, or generates identifiable private information … 69

“Intervention” is not limited to physical procedures through which data are collected but, of relevance to mHealth, also includes “manipulations of the subject or the subject’s environment that are performed for research purposes.”70 Examples of interventional research involving mHealth include (but are not limited to): randomly assigning participants either to use or not use an app; A/B tests of various aspects of the app conducted on some or all users, which involves researchers intervening in the user’s “app environment”; and exercises that an mHealth app might ask a user to perform for research purposes, such as finger tapping, cognitive games, or pacing up and down a hallway in order to measure gait.

“Interaction” includes “communication or interpersonal contact between investigator and subject.”71 Traditional forms of research interactions include surveys, focus groups, and interviews. In the mHealth context, research interactions might again include surveys or other solicitation of user information for research purposes (e.g., phenotype surveys), but also a variety of researcher-to-user communications (e.g., reminders or motivational messages).

It is certainly possible for an mHealth app to be involved in human subjects research without there being any research intervention or interaction. This is most likely when the original purpose of the app is for something other than research. For instance, imagine a non-research health or lifestyle app that allows users to track the timing and symptoms of their menstrual periods. Assume that any interactions (e.g., reminders to the user pushed out through the app to log in that day, reminders that they should expect their next period soon, or invitations to the user to enter symptoms experienced during that cycle) or suggested interventions (e.g., admonitions to the user who reports symptoms to take a hot bath, apply a warm compress, or take a pain reliever) are built into the app to facilitate those health or lifestyle purposes, and not for research purposes.

If non-exempt research involves no intervention or interpersonal interaction, it will involve “human subjects” (and, hence, fall within the Common Rule’s scope) only if it involves the collection, analysis, or other use of data that are both “identifiable” and “private.”
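
To make that threshold inquiry concrete, the sketch below encodes the two-prong “human subject” test just described as simple decision logic. It is a hypothetical illustration only: the field names are invented, the first prong is simplified (the regulation also requires that information obtained through intervention or interaction be used, studied, or analyzed by the investigator), and an actual determination turns on context that an IRB or institution must weigh.

```python
# A hypothetical sketch of the Common Rule's threshold "human subject" test,
# as summarized above. Field names are invented for illustration.
from dataclasses import dataclass


@dataclass
class Activity:
    involves_intervention: bool   # e.g., A/B tests of the app "environment" for research
    involves_interaction: bool    # e.g., research surveys or researcher-to-user messages
    data_identifiable: bool       # identity "readily ascertainable"
    data_private: bool            # reasonable expectation of no observation/recording, or
                                  # provided for specific purposes and not to be made public


def involves_human_subjects(a: Activity) -> bool:
    """Rough rendering of the definition at 45 C.F.R. 46.102(e): intervention or
    interaction (simplified), or identifiable private information."""
    if a.involves_intervention or a.involves_interaction:
        return True
    return a.data_identifiable and a.data_private


# Example: secondary analysis of app data that are private but not identifiable
print(involves_human_subjects(Activity(False, False, False, True)))  # -> False
```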

a. “Identifiable” information. Consider, first, the requirement that data be identifiable. Some people reject any meaningful distinction between research with identifiable and non-identifiable data.72 Privacy, after all, does not exhaust the interests that someone can have in data about them; they may also have autonomy interests in controlling how data they contributed (wittingly or not) are used.73 Moreover, many have argued that the distinction between identifiable and non-identifiable data (or, in HIPAA terms, between identified and de-identified data) is illusory. A series of “re-identification attacks” by privacy researchers has demonstrated that, under certain circumstances, a variety of anonymous or pseudonymous data can be re-identified, including geolocation data, genomic data, other biometric data, Internet search data, and consumer data.74 Yet not only does the applicability of the Common Rule hinge on whether information is identifiable; the Common Rule’s bar for rendering data non-identifiable is also fairly low.

Information is “identifiable” under the Common Rule if “the identity of the subject is or may readily be ascertained by the investigator or associated with the information.”75 Although the Common Rule does not define “readily ascertainable,” and therefore it is left to individual IRBs to interpret and apply that standard, few of the aforementioned re-identification methods would seem to qualify as rendering a data source’s identity “readily ascertainable.” OHRP guidance moreover suggests that information are not individually identifiable “when they cannot be linked to specific individuals by the investigator(s)[,] either directly or indirectly through coding systems.”76 To prevent coded data from being indirectly (re)identifiable under OHRP’s guidance, “the investigators and the holder of the key enter into an agreement prohibiting the release of the key to the investigators under any circumstances, until the individuals are deceased,” and no IRB needs to review that agreement.77 Thus, if an mHealth app company has a research arm, the business arm of the company could obtain individual-level data via the app, as usual, then replace identifiers with codes, and provide the coded dataset to the research arm under an agreement that the research arm will never obtain the key to the code — all without ever triggering the Common Rule, even if it directly applied.
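
The following sketch illustrates, in hypothetical form, the coding workflow just described: the business arm strips direct identifiers, retains the key, and shares only the coded dataset with the research arm under an agreement never to release the key. The function and column names are invented for illustration; whether such an arrangement actually takes the research outside the Common Rule depends on the terms of the agreement and on how those applying OHRP’s guidance judge the facts.

```python
# A hypothetical sketch of the coding workflow described above. Column names
# and the function itself are invented; this is not a compliance recipe.
import uuid
import pandas as pd


def split_identifiers(raw: pd.DataFrame, id_cols: list[str]):
    """Return (coded_dataset, key_table). The key_table stays with the data
    holder (the business arm); the coded_dataset is what the research arm
    would receive, under an agreement that the key is never released."""
    coded = raw.copy()
    coded["subject_code"] = [uuid.uuid4().hex for _ in range(len(coded))]
    key_table = coded[["subject_code"] + id_cols].copy()  # retained by the business arm
    coded = coded.drop(columns=id_cols)                   # shared with the research arm
    return coded, key_table


raw = pd.DataFrame({
    "name": ["A. User"],
    "email": ["a.user@example.com"],
    "cycle_length_days": [29],
    "symptom": ["cramps"],
})
coded, key = split_identifiers(raw, ["name", "email"])
print(list(coded.columns))  # ['cycle_length_days', 'symptom', 'subject_code']
```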

Federal regulators are not unaware of either these emerging re-identification techniques or the autonomy interests that research participants might have in even non-identifiable data. To the contrary, during the years-long process of revising the Common Rule, regulators cited both factors in proposing that the Common Rule’s jurisdiction be expanded to cover research with all biospecimens, whether or not those biospecimens were identifiable.78 Inexplicably, regulators did not propose to expand jurisdiction over non-identifiable data, even though the privacy and autonomy interests are largely the same.79 In any event, the proposals failed. Instead, the 2018 Common Rule requires Common Rule agencies, within one year of the revised regulations going into effect and at least every four years thereafter, to (a) reconsider the Common Rule’s definition of “identifiable” and (b) consider whether there are analytic technologies or techniques (such as whole genome sequencing) that should be considered to necessarily produce identifiable data. In the near term, regulatory efforts to tighten up the Common Rule’s definition of identifiability are likely to focus, once again, on biospecimens rather than data.80

b. “Private” information. Even if data are identifiable under the Common Rule’s relatively weak current definition, for data analysis to constitute human subjects research, the data must also be “private.” “Private,” here, does not refer to the extent to which data are or are not sensitive. Under the Common Rule, “private” information “includes information about behavior that occurs in a context in which an individual can reasonably expect that no observation or recording is taking place, and information that has been provided for specific purposes by an individual and that the individual can reasonably expect will not be made public (e.g., a medical record).”81 This is less a definition of “private information” than it is a listing of two kinds of private data.

With respect to the first kind — “information about behavior that occurs in a context in which an individual can reasonably expect that no observation or recording is taking place” — it will be hard to argue that mHealth app users have a reasonable expectation that “no” observation or recording of their in-app behavior, for any reason, is taking place.

The second example of private information — “information that has been provided for specific purposes by an individual and that the individual can reasonably expect will not be made public (e.g., a medical record)” — is difficult to parse, but seems more likely to apply to mHealth research. In the case of secondary research use of mHealth data, for instance, the user provides data for specific purposes (e.g., to be able to track her menstrual cycle) and does not expect that information to be “made public.” (Although research use of data is not usually synonymous with making data public, the Common Rule seems to use the latter as an odd proxy for the former.) Similarly, if a user of an mHealth research app knowingly provides data to the app developer for the specific purpose of research, she maintains a reasonable expectation that her data will not be made public, which renders those data “private” under the Common Rule.

3. exempt human subjects research

Finally, an activity can meet the definitions of “research” and “human subjects” and take place in a Common Rule environment and still fall outside the scope of the Common Rule: “research activities in which the only involvement of human subjects will be in one or more of [8 specified] categories” are (more or less) “exempt” from the Common Rule.82 Under the 2018 Common Rule, the qualifier “more or less” exempt is necessary because some “exempt” human subjects research nevertheless requires “limited IRB review” for things like an appropriate data security plan or confirmation that secondary research use of existing data collected under broad consent falls within the scope of that consent.83

The Common Rule is silent about who must or should make exemption determinations, but OHRP guidance “recommends that, because of the potential for conflict of interest, investigators not be given the authority to make an independent determination that human subjects research is exempt.”84 During the Common Rule revision process, regulators proposed to develop a “decision tool” by which investigators would be permitted to make and certify exemption determinations by entering accurate answers to questions. Because the tool had not been developed and therefore the public could not comment on it sufficiently, this proposal did not become part of the 2018 Common Rule, but regulators have said that they will continue to explore this option.85 Some IRBs already use such a tool.86

One exemption concerns studies that involve only surveys and/or educational tests (such as cognitive or aptitude tests, which are sometimes used in mHealth research). Such studies are exempt from the Common Rule so long as one of the following three conditions is met: (1) the information is recorded by the investigator in a non-identifiable way; (2) the data are identifiable but not sensitive (i.e., disclosure of the data outside the research “would not reasonably place the subjects at risk of criminal or civil liability or be damaging to the subjects’ financial standing, employability, educational advancement, or reputation”); or (3) the data are identifiable (and, presumably, sensitive) and an IRB conducts a “limited IRB review” to ensure that appropriate data security measures are in place.87 That limited IRB review does not involve risk-benefit assessment, nor do the Common Rule’s elaborate informed consent provisions apply. In short, one way or another, all research involving surveys and educational tests is exempt from the Common Rule.
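
The “decision tool” regulators contemplated (discussed above) might encode an exemption such as this one as a short series of questions. The sketch below is a hypothetical rendering of the three alternative conditions for the survey/educational-test exemption just described; the inputs and their names are assumptions, and, consistent with OHRP’s recommendation, the determination would ordinarily rest with the institution or IRB rather than the investigator.

```python
# A hypothetical rendering of the three alternative conditions for the
# survey/educational-test exemption described above. Inputs are assumptions.
def survey_exemption_status(recorded_identifiably: bool,
                            disclosure_could_harm: bool,
                            limited_irb_review_of_security: bool) -> str:
    if not recorded_identifiably:
        return "exempt (condition 1: recorded in a non-identifiable way)"
    if not disclosure_could_harm:
        return "exempt (condition 2: identifiable but not sensitive)"
    if limited_irb_review_of_security:
        return "exempt (condition 3: limited IRB review of data security)"
    return "not exempt under this category"


# Identifiable, sensitive survey data with no limited IRB review:
print(survey_exemption_status(True, True, False))  # -> "not exempt under this category"
```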

Research involving non-deceptive “benign behavioral interventions” — which are “brief in duration, harmless, painless, not physically invasive, not likely to have a significant adverse lasting impact on the subjects, and [not likely to be] offensive or embarrassing” — is also exempt if the participant prospectively consents to the intervention and at least one of the same three criteria described above is met.88 mHealth research that involves only such interventions as finger tapping, pacing, Stroop tests, games, puzzles, and the like (or other exempt research activities) is likely to be exempt from IRB review.

Note, however, that although some of these interventions are benign themselves, return of the individual results of these tasks might not be. Consider, for instance, an mHealth app designed to track symptoms of Parkinson’s (whether for ostensibly healthy individuals volunteering as controls, those at risk for Parkinson’s, or those with an existing diagnosis) via finger tapping and pacing tasks.89 Those tasks in and of themselves are benign. But if participants’ results suggest onset or progression of symptoms (or are merely interpreted that way by participants), those results — but not the interventions — could (at least in theory90) have “a significant adverse lasting impact on” participants.

It is not entirely clear how return of results might affect such a study’s exempt status. The revised Common Rule clearly perceives return of individual results to be a source of potential harm to participants and to sometimes merit IRB review. For instance, secondary research on identifiable data that were collected under broad consent is exempt if, among other things, “[t]he investigator does not include returning individual research results to subjects as part of the study plan.”91 No such qualification is made for the exemption pertaining to benign behavioral interventions, however. Moreover, the Common Rule provides that “research activities in which the only involvement of human subjects will be in one or more of [8 specified] categories” are exempt. But investigators might wish to return individual results for a number of reasons unrelated to research purposes. For instance, instead of returning individual results in order to study how participants react to this information, investigators might return results in order to comply with the HIPAA Privacy Rule’s right of access or to express gratitude to participants or because the investigator believes participants have a right to individual research results.92 It is possible that an IRB could find that returning results under such circumstances does not constitute a “research activity” and therefore is no barrier to an exemption determination.93

Another important exemption is “secondary research for which consent is not required.” Recall that research with non-identifiable and/or non-private data does not (without more, i.e., intervention or interaction) involve human subjects and so falls outside the Common Rule. Although research with identifiable private data is covered by the Common Rule, it is nevertheless exempt if those data are either (a) “publicly available” or (b) “recorded by the investigator in such a manner that the identity of the human subjects cannot readily be ascertained directly or through identifiers linked to the subjects” (and the investigator neither contacts nor attempts to re-identify the participants). The Common Rule does not define any of the important terms in this exemption, including “publicly available” and “recorded.” The line between public and private spaces, and hence between data that are and are not publicly available, is not sharp.94 As for the second option under this exemption, the general idea is that data that were collected for any purpose other than the present research study — whether that be clinical purposes, consumer purposes, administrative purposes, or for another research project — may be used in new, unrelated research without consent or IRB review, so long as identifiers are separated from the data before the remainder is used in research.95

III. When the Common Rule Applies, How Well Does(n’t) It Apply?

Assume that an activity meets the definitions of “research” and “human subjects,” is not “exempt,” and occurs at an institution that applies the Common Rule (either directly or indirectly) and is “engaged” in the research. Now that it “applies,” what does the Common Rule actually require? In brief:

IRB review is designed to protect research participants, and IRBs approve, disapprove, or require changes to each study accordingly. Before researchers recruit a single participant, IRBs review their recruitment plans, the detailed information disclosures that form the basis of participants’ voluntary, informed consent, and the protocol itself. They ensure that these materials fully, accurately, and in “understandable” language disclose to prospective participants, inter alia, “any reasonably foreseeable risks or discomforts to [them]” and “any benefits to [them] or to others which may reasonably be expected from the research.” They then consider these risks and expected benefits themselves, and approve only those studies whose “[r]isks to subjects are reasonable in relation to anticipated benefits, if any, to subjects, and the importance of the knowledge that may reasonably be expected to result.”96

There are no special Common Rule provisions governing mHealth research. Instead, like all other non-exempt human subjects research, mHealth research must meet the criteria for IRB approval. The protections that are potentially afforded by such IRB review might be substantial — though empirical evidence supporting that conclusion is, as yet, scarce.97 But even by its own terms, the Common Rule’s protections are limited in ways that might surprise or disappoint some.

For instance, unlike the Belmont Report’s interpretation of the principle of beneficence,98 and despite the otherwise fairly tight nexus between the Belmont principles and their codification in the regulations, the Common Rule requires that the risks of research be minimized,99 but it does not require researchers to maximize the benefits of research.100 IRBs must consider the risks of research to participants, and virtually any probability of risk — no matter how speculative — and virtually any kind of risk — from physical to emotional to reputational — is fair game.101 But the Common Rule directs IRBs to consider only risks to the direct participants in research, not to any third parties, such as bystanders whose privacy interests might become entangled with those who are enrolled in research or groups who might be stigmatized by the results of research conducted with members of that group.102 Moreover, the Common Rule instructs IRBs not to “consider possible long-range effects of applying knowledge gained in the research (e.g., the possible effects of the research on public policy).”103

Finally, a common myth about human subjects research is that informed consent is always required. But this is not the case, even under the Common Rule and the Belmont principles.104 IRBs can and do permit alterations to the information that researchers normally must disclose to prospective participants — or waive consent altogether. It is true that alteration and waiver are possibilities only when the research is “minimal risk” and certain other conditions are met, chiefly, that the research could not be “practicably” conducted without the alteration or waiver.105 But although the Common Rule does define “minimal risk,”106 the term has been criticized as “ambiguous and poorly defined.”107 For its part, the critical term “practicable” is not defined in the Common Rule at all, nor is it known how different IRBs interpret and apply it.

IV. Conclusion

The U.S. federal regulations that are designed to protect research participants directly apply to only a limited set of activities. On the other hand, the Common Rule indirectly applies to an increasing amount of activity as already-customary voluntary adoption spreads from the academic to the industry and nonprofit sectors. Moreover, the penalties for noncompliance when the Common Rule indirectly applies approximate those for noncompliance when it directly applies. This makes the gap between “regulated” and “unregulated” research somewhat less troublesome than one might assume. Instead, the greater difficulty for those who favor meaningful regulation of research and other learning activities might be that even when the Common Rule “applies” (directly or indirectly), its substantive application can be wanting. For instance, by limiting its scope to activities that are “designed to develop or contribute to generalizable knowledge,” the Common Rule omits other learning activities (and non-learning activities) that might present equal or greater risk, such as QI or innovation. Because the Common Rule also limits its scope to research that involves interaction, intervention, or the use of identifiable, private information, “big data” research with non-identifiable data (weakly defined) eludes its grasp, and other activities meet the requisite definition of “human subjects research” only to be exempt. As for non-exempt human subjects research, research benefits need not be maximized, informed consent is not always required, and IRBs do not consider risks to third parties or the long-term social risks of research. For mHealth research and other emerging activities, this means that the development and voluntary adoption of relatively new standards will be critical.108

Acknowledgment

Research on this article was funded by the following grant: Addressing ELS Issues in Unregulated Health Research Using Mobile Devices, No. 1R01CA20738-01A1, National Cancer Institute, National Human Genome Research Institute, and Office of Science Policy and Office of Behavioral and Social Sciences Research in the Office of the Director, National Institutes of Health, Mark A. Rothstein and John T. Wilbanks, Principal Investigators.

Footnotes

The author has no conflicts of interest to disclose.

References

45 C.F.R. Subpart A (2018).Google Scholar
Id. § 46.101(a). As of October of 2019, the following 16 federal departments and agencies are official signatories to the revised Common Rule, known as the “2018 Common Rule”: Department of Homeland Security, 6 C.F.R. Pt. 46; Department of Agriculture, 7 C.F.R. Pt. 1c; Department of Energy, 10 C.F.R. Pt. 745; National Aeronautics and Space Administration, 14 C.F.R. Pt. 1230; Department of Commerce, 15 C.F.R. Pt. 27; Social Security Administration, 20 C.F.R. Pt. 431; Agency for International Development, 22 C.F.R. Pt. 225; Department of Housing and Urban Development, 24 C.F.R. Pt. 60; Department of Labor, 29 C.F.R. Pt. 21; Department of Defense, 32 C.F.R. Pt. 219; Department of Education, 34 C.F.R. Pt. 97; Department of Veterans Affairs, 38 C.F.R. Pt. 16; Environmental Protection Agency, 40 C.F.R. Pt. 26; Department of Health and Human Services, 45 C.F.R. Pt. 46; National Science Foundation, 45 C.F.R. Pt. 690; Department of Transportation, 49 C.F.R. Pt. 11. In addition, according to OHRP, two federal entities that were signatories to the pre-2018 Common Rule — Department of Justice, 28 C.F.R. Pt. 46, and Consumer Product Safety Commission, 16 C.F.R. Pt. 1028 — intend to become signatories to the revised Common Rule. Office for Human Research Protections, Federal Policy for the Protection of Human Subjects (“Common Rule”), available at <https://www.hhs.gov/ohrp/regulations-and-policy/regulations/common-rule/index.html> (last visited January 20, 2020). Finally, both the Central Intelligence Agency and the Office of the Director of National Intelligence follow the 2018 Common Rule, per Executive Order 12333 (1981), as amended by Executive Orders 13284 (2003), 13355 (2004), and 13470 (2008). Id.+(last+visited+January+20,+2020).+Finally,+both+the+Central+Intelligence+Agency+and+the+Office+of+the+Director+of+National+Intelligence+follow+the+2018+Common+Rule,+per+Executive+Order+12333+(1981),+as+amended+by+Executive+Orders+13284+(2003),+13355+(2004),+and+13470+(2008).+Id.>Google Scholar
Meyer, M.N., “Regulating the Production of Knowledge: Research Risk-Benefit Analysis and the Heterogeneity Problem,” Administrative Law Review 65, no. 2 (2013): 237-298, at 246 (“Historically, between 74% and 90% of institutions have [checked the box].”); Office of the Secretary and Food and Drug Administration, Department of Health and Human Services, “Human Subject Research Protections: Enhancing Protections for Research Subjects and Reducing Burden, Delay, and Ambiguity for Investigators,” Federal Register 76 (July 26, 2011): 44,512-44,531, at 44,528 (hereinafter ANPRM) (“Most institutions voluntarily extend the applicability of their FWAs to all the research conducted at their institutions, even research not conducted or supported by one of the Federal departments or agencies that have adopted the Common Rule.”).Google Scholar
Meyer, supra note 4, at 246 n.33.Google Scholar
See, e.g., Federman, D.D., Hanns, K.E., and Rodriguez, L.L., eds., Responsible Research: A Systems Approach to Protecting Research Participants (Washington, D.C.: National Academies Press, 1993); National Bioethics Advisory Commission, Ethical and Policy Issues in Research Involving Human Participants (Bethesda, MD: 2001).Google Scholar
ANPRM, supra note 4, at 44,528 (proposal “requiring domestic institutions that receive some Federal funding from a Common Rule agency for research with human subjects to extend the Common Rule protections to all research studies conducted at their institution”).Google Scholar
Federal Policy for the Protection of Human Subjects, Federal Register 80 (Sept. 8, 2015): 53,933, 54,034 (NPRM).Google Scholar
Id., at 53,989-53,990.Google Scholar
Id., at 53,991.Google Scholar
Id., at 54,047 (“Clinical trial means a research study in which one or more human subjects are prospectively assigned to one or more interventions (which may include placebo or other control) to evaluate the effects of the interventions on biomedical or behavioral health-related outcomes.”).Google Scholar
Federal Policy for the Protection of Human Subjects, Federal Register 82 (January 19, 2017): 7149, 7155 (Final Rule).Google Scholar
Id., at 7156.Google Scholar
See Public Health Service Act, 42 U.S.C. ch. 6A §201 et seq at §§289(a) (“The Secretary shall by regulation require that each entity which applies for a grant, contract, or cooperative agreement under this chapter for any project or program which involves the conduct of biomedical or behavioral research involving human subjects submit … assurances satisfactory to the Secretary that it has established … an ‘Institutional Review Board’… to review biomedical and behavioral research involving human subjects conducted at or supported by such entity in order to protect the rights of the human subjects of such research.” (emphasis added)).Google Scholar
Final Rule, supra note 12, at 7155-7156.
Id., at 7156.
Id., at 7156 (acknowledging the proposal “would benefit from further deliberation”); id. (“[W]e are persuaded that the proposed extension of the Common Rule is not appropriate to include in a final rule at this time. We will continue to carefully consider the related issues.”).
Id. (“We concluded that [maintaining the “check the box” option] would not further the expressed goal of increasing the application of consistent protections to clinical trials, regardless of the source of support, because the extension of the FWA would be optional. We therefore plan to implement the proposed non-regulatory change to the assurance mechanism to eliminate the voluntary extension of the FWA to non-federally funded research.”).
See Tovino, S.A., “Mobile Research Applications and State Research Laws,” Journal of Law, Medicine & Ethics 48, no. 1, Suppl. 1 (2020): 82-86.
NPRM, supra note 8, at 54,034 (“academic institutions… generally extend protections to all human subjects research at their institution, even if they have not ‘checked the box’ on their FWA indicating that they do so”); Final Rule, supra note 12, at 7156 (“We recognize that institutions may choose to establish an institutional policy that would require IRB review of research that is not funded by a Common Rule department or agency (and indeed, as commenters noted, almost all institutions already do this).”).
23andMe subjects “much” of its research to review by an independent IRB, even when the research involves non-identifiable information and hence would fall outside the scope of the Common Rule even if those regulations directly applied to 23andMe. See 23andMe, “Protecting People in People Powered Research,” 23andMeBlog, July 30, 2014, available at <https://blog.23andme.com/23andme-research/protecting-people-in-people-powered-research/> (last visited October 21, 2019); 23andMe, “23andMe Improves Research Consent Process,” 23andMeBlog, June 24, 2010, available at <https://blog.23andme.com/23andme-research/23andme-improves-research-consent-process/> (last visited January 20, 2020). See also Hernandez, D. and Seetharaman, D., “Facebook Offers Details on How It Handles Research,” Wall Street Journal, June 14, 2016, available at <https://www.wsj.com/articles/facebook-offers-details-how-it-handles-research-1465930152> (last visited January 20, 2020) (“Microsoft and wearables maker Fitbit Inc…. contract with external IRBs for some of their research projects.”).
See Jackman, M. and Kanerva, L., “Evolving the IRB: Building Robust Review for Industry Research,” Washington & Lee Law Review Online 72, no. 3 (2016): 442-457 (describing Facebook’s internal research review process); Hernandez and Seetharaman, supra note 21 (noting that, of the research at Microsoft and Fitbit that is not reviewed by independent IRBs, “[t]he rest are largely reviewed internally”); De Mooy, M. and Yuen, S., “Toward Privacy Aware Research and Development in Wearable Health: A Report from the Center for Democracy & Technology and Fitbit, Inc.,” May 2016, available at <https://healthblawg.com/images/2016/06/CDTFitbit-report.pdf> (last visited January 20, 2020) (describing Fitbit’s internal research review process).
See Meyer, M.N., “Ethical Considerations When Companies Study—and Fail to Study—Their Customers,” in Selinger, E., Polonetsky, J. and Tene, O., eds., The Cambridge Handbook of Consumer Privacy (Cambridge University Press, 2018): 207-231, at 224-227 (discussing how the Facebook and Fitbit internal review processes compare to the Belmont Report principles and the Common Rule and factors companies should consider in establishing and operating internal research ethics review boards).
Consumer Privacy Protection Act of 2015, S. 1158, 114th Cong.
See White House, “Administration Discussion Draft: Consumer Privacy Bill of Rights Act of 2015,” February 27, 2015, available at <https://www.democraticmedia.org/sites/default/files/field/public/2015/draft_consumer_privacy_bill_of_rights_act.pdf> (last visited January 20, 2020). The criteria resemble the Common Rule’s criteria for waiving consent. See infra Part III.
See Meyer, supra note 4, at 246 n.36.
Apple, “App Store Review Guidelines,” at § 5.1.3(iv), available at <https://developer.apple.com/app-store/review/guidelines/#healthkit> (last visited January 20, 2020).
Compare id., at § 5.1.3(iii) (last visited January 20, 2020) with 45 C.F.R. § 46.116.
See, e.g., Rothstein, M.A. et al., “Unregulated Health Research Using Mobile Devices: Ethical Considerations and Policy Recommendations,” Journal of Law, Medicine & Ethics 48, no. 1, Suppl. 1 (2020): 196-226.
The Common Rule merely requires that the informed consent form or process — when consent is required at all — disclose to prospective participants whether or not compensation or medical treatment for research-related injuries is available. 45 C.F.R. § 46.116(b)(6).
Public Health Service Act, supra note 14, at § 289(b)(2) (“The Secretary shall establish a process for the prompt and appropriate response to information provided to the Director of NIH respecting incidences of violations of the rights of human subjects of research for which funds have been made available under this chapter. The process shall include procedures for the receiving of reports of such information from recipients of funds under this chapter and taking appropriate action with respect to such violations.”). See also 45 C.F.R. § 46.123.
Office for Human Research Protections, Department of Health and Human Services, “Compliance Oversight Procedures for Evaluating Institutions,” October 14, 2009, available at <https://www.hhs.gov/ohrp/compliance-and-reporting/evaluating-institutions/index.html> (last visited January 20, 2020).
See Ramnath, K. et al., “Incident Reports and Corrective Actions Received by OHRP,” IRB: Ethics & Human Research 38, no. 6 (2016), available at <https://www.thehastingscenter.org/irb_article/incident-reports-corrective-actions-received-ohrp> (last visited January 20, 2020) (OHRP staff reporting the five most common kinds of corrective actions institutions reported to OHRP).
See Office for Human Research Protections, Department of Health and Human Services, “OHRP Determination Letters and Other Correspondence,” available at <https://www.hhs.gov/ohrp/compliance-and-reporting/determination-letters/index.html> (last visited January 20, 2020).
Office for Human Research Protections, supra note 32; Department of Health and Human Services Office of Inspector General (OIG), “OHRP Generally Conducted Its Compliance Activities Independently, but Changes Would Strengthen Its Independence” (July 2017): 1-29, at 24, available at <https://oig.hhs.gov/oei/reports/oei-01-15-00350.pdf> (last visited January 20, 2020) (reviewing OHRP compliance oversight activities from 2000 through 2015).
OHRP closed a total of 15 for-cause and not-for-cause evaluations in 2016, 1 evaluation in 2017, 1 evaluation in 2018, and 2 evaluations in 2019 to date. OHRP, supra note 36.
OIG, supra note 37, at 8-9.
In 2016, OHRP did publish, in an academic journal, aggregate statistics about the kinds of incident reports it received between 2008 and 2014 and the kinds of corrective actions institutions implemented as a result. See Ramnath et al., supra note 34.
Id., at 7-8.
OIG, supra note 37, at 14.
Id., at 7.
See, e.g., “‘Nothing Short of Appalling:’ Inaction by HHS Oversight Agencies Sets off Alarms,” Report on Research Compliance 14, no. 6 (2017): 1-5 (quoting critical comments by bioethicist Ruth Macklin and bioethicist and legal scholar Lois Shepherd); Delfino, T., “With Just One Investigation in 2013, OHRP Seems ‘Invisible’ After SUPPORT Dust-Up,” Report on Research Compliance (2014): 1-5, available at <https://www.bmj.com/sites/default/files/response_attachments/2014/07/rrc-reprint-0514.pdf> (last visited January 20, 2020) (quoting critical comments by bioethicist Art Caplan, Senator Charles Grassley, and Public Citizen’s Michael Carome).
45 C.F.R. §§ 46.101(a), 46.103.
Office for Human Research Protections, Department of Health and Human Services, “Engagement of Institutions in Human Subjects Research,” available at <https://www.hhs.gov/ohrp/regulations-and-policy/guidance/guidance-on-engagement-of-institutions/index.html> (last visited January 20, 2020). See also Office for Human Research Protections, Department of Health and Human Services, “Determining When Institutions are Engaged in Research,” January 13, 2009, available at <https://www.hhs.gov/ohrp/regulations-and-policy/guidance/determining-when-institutions-are-engaged-in-research/index.html> (last visited January 20, 2020).
Office for Human Research Protections (2008), supra note 48, at § B(4).
Id., at § B(6).
Id., at § B(11).
This is what happened, for example, when researchers at Cornell and Facebook collaborated on the infamous emotional contagion experiment. See Meyer, M.N., “Everything You Need to Know about Facebook’s Controversial Emotion Experiment,” Wired, June 30, 2014, available at <http://www.wired.com/2014/06/everything_you_need_to_know_about_facebooks_manipulative_experiment> (last visited January 20, 2020); Meyer, M.N., “Two Cheers for Corporate Experimentation: The A/B Illusion and the Virtues of Data-Driven Experimentation,” Colorado Technology Law Journal 13, no. 2 (2015): 273-331, at 311-312 and n.139.
See 45 C.F.R. §§ 46.107, 46.111(a)(3), 46.111(b). See also 45 C.F.R. pt. 46, Subparts B, C, and D (additional regulations, which Common Rule departments may but need not adopt, providing special protections for pregnant women, human fetuses and neonates; prisoners; and children, respectively).
Meyer, M.N., “Whose Business Is It If You Want To Induce a Bee To Sting Your Penis?” Bill of Health Blog, April 4, 2014, available at <http://blog.petrieflom.law.harvard.edu/2014/04/04/whose-business-is-it-if-you-want-to-induce-a-bee-to-sting-your-penis/> (last visited October 18, 2019).
Public Health Service Act, supra note 14, at § 289(a) (“The Secretary shall by regulation require that each entity which applies for a grant, contract, or cooperative agreement under this chapter for any project or program which involves the conduct of biomedical or behavioral research involving human subjects submit in or with its application for such grant, contract, or cooperative agreement assurances satisfactory to the Secretary that it has established (in accordance with regulations which the Secretary shall prescribe) a board (to be known as an ‘Institutional Review Board’) to review biomedical and behavioral research involving human subjects conducted at or supported by such entity in order to protect the rights of the human subjects of such research.” (emphasis added)).
45 C.F.R. § 46.102(l).
See Smith, M.D. et al., eds., Best Care at Lower Cost: The Path to Continuously Learning Health Care in America (Washington, DC: National Academies Press, 2012). It is sometimes claimed that the concept of a learning health system encompasses only observational methods such as “big data” analysis. For a contrary view, see, e.g., Faden, R.R., Beauchamp, T.L., and Kass, N.E., “Learning Health Care Systems and Justice,” Hastings Center Report 41, no. 4 (2011): 3 (“[I]t can be ethically acceptable to randomize patients without express consent in trials comparing widely used, approved interventions that pose no additional risk. With appropriate oversight, learning health care systems ought to conduct such trials on a regular basis.”). See also Horwitz, L.I., Kuznetsova, M., and Jones, S.A., “Creating a Learning Health System through Rapid-Cycle, Randomized Testing,” New England Journal of Medicine 381, no. 12 (2019): 1175-1179 (describing several “randomized quality-improvement projects” conducted as part of a learning health system).
The research/practice distinction dates back to the work of the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research, which was called upon by Congress, in the National Research Act of 1974, to distinguish between biomedical and behavioral research, on the one hand, and “the accepted and routine practice of medicine,” on the other, in order to subject the former to special regulation. Title II of the National Research Act, Pub. L. 93–348, 88 Stat. 342 (1974) at §202(a)(1)(B)(i). See also Beauchamp, T.L. and Saghai, Y., “The Historical Foundations of the Research-Practice Distinction in Bioethics,” Theoretical Medicine and Bioethics 33, no. 1 (2012): 45-56. The Commission produced several important reports, including the Belmont Report, which formed the basis for the Common Rule.
Delfino, supra note 45, at 5 (quoting John Lantos).
Office for Human Research Protections, “Frequently Asked Questions,” available at <https://www.hhs.gov/ohrp/regulations-and-policy/guidance/faq/index.html> (last visited January 20, 2020).
Office for Human Research Protections, “Quality Improvement Activities FAQs,” available at <https://www.hhs.gov/ohrp/regulations-and-policy/guidance/faq/quality-improvement-activities/index.html> (last visited January 20, 2020).
See Meyer, supra note 52, at 321-324.
See, e.g., Horwitz et al., supra note 57, at 1175.
Id., at 1178.
Id. (citing Finkelstein, J.A. et al., “Oversight on the Borderline: Quality Improvement and Pragmatic Research,” Clinical Trials 12, no. 5 (2015): 457-466; Baily, M.A., “Harming Through Protection?” New England Journal of Medicine 358, no. 8 (2008): 768-769).
Id. (citing Baily, M.A. et al., “The Ethics of Using QI Methods to Improve Health Care Quality and Safety,” Hastings Center Report 36, no. 4 (2006): S1-S40).
See Meyer, supra note 23.
45 C.F.R. § 46.102(e)(1). References to the collection and analysis of biospecimens in the definition of “human subject” have been omitted for conciseness, since mHealth apps alone cannot (yet) collect biospecimens and transfer them to researchers for use. mHealth apps can, of course, be used in conjunction with biospecimen collection and analysis, whether for clinical, research, or infotainment purposes.
Id., at §46.102(e)(2).
Id., at §46.102(e)(3).
See, e.g., Rothstein et al., supra note 29, at 3 (“Although research with deidentified specimens or data is not considered human subjects research under the Common Rule, our focus on unregulated research does not make such a distinction.”).
See Meyer, M.N., “Research Ethics Issues Raised in Collecting and Maintaining Large Scale, Sensitive Online Data” (2019): 1-23, at 4, available at <https://securelysharingdata.com/resources/meyer.pdf> (last visited January 20, 2020) (white paper).
For an overview, see Meyer, M.N., “Online Symposium on the Law, Ethics & Science of Re-identification Demonstrations,” Bill of Health Blog, May 13, 2013, available at <http://blog.petrieflom.law.harvard.edu/2013/05/13/online-symposium-on-the-law-ethics-science-of-re-identification-demonstrations/> (last visited January 20, 2020).
45 C.F.R. §46.102(e)(5).
Office for Human Research Protections, Department of Health and Human Services, “Coded Private Information or Specimens Use in Research, Guidance (2008),” October 16, 2008, available at <https://www.hhs.gov/ohrp/regulations-and-policy/guidance/research-involving-coded-private-information/index.html> (last visited January 20, 2020).
ANPRM, supra note 4, at 44,524; NPRM, supra note 8, at 53,942.
Lynch, H.F., Bierer, B.E., and Cohen, I.G., “Confronting Biospecimen Exceptionalism in Proposed Revisions to the Common Rule,” Hastings Center Report 46, no. 1 (2016): 4-5.
Lynch, H.F. and Meyer, M.N., “Regulating Research with Biospecimens under the Revised Common Rule,” Hastings Center Report 47, no. 3 (2017): 3-4.
45 C.F.R. § 46.102(e)(4).
Id., at § 46.104(d).
Id., at §§ 46.104(d)(2)(iii), 46.104(d)(3)(i)(C), 46.104(d)(7), 46.104(d)(8)(iii).
Office for Human Research Protections, Department of Health and Human Services, “Exempt Research Determination FAQs,” available at <https://www.hhs.gov/ohrp/regulations-and-policy/guidance/faq/exempt-research-determination/index.html> (emphasis added) (last visited January 20, 2020).
Final Rule, supra note 12, at 7183-7184.
See, e.g., University of California, Irvine Office of Research, “Exempt Registration Confirmation,” available at <https://research.uci.edu/compliance/human-research-protections/researchers/how-to-submit-electronic-irb-applications-for-review.html#Exempt> (last visited January 20, 2020).
45 C.F.R. §46.104(d)(2).
Id., at §46.104(d)(3).
See, e.g., mPower, available at <https://parkinsonmpower.org/your-story> (last visited January 20, 2020).
The ethics community has a long history of worrying about the negative psychosocial effects of learning information about oneself — especially genomic information — with little evidence to support those concerns. See generally “Special Report: Looking for the Psychosocial Impacts of Genomic Information,” Hastings Center Report 49, no. S1 (2019).
45 C.F.R. §46.104(d)(8)(iv).
See Evans, B.J. and Wolf, S.M., “A Faustian Bargain that Undermines Research Participants’ Privacy Right and Return of Results,” Florida Law Review 71 (2019): 1281-1345; Bobe, J., Meyer, M.N., and Church, G., “Privacy and Agency Are Critical to a Flourishing Biomedical Research Enterprise: Misconceptions about the Role of CLIA,” Florida Law Review 71 (forthcoming 2019).
The Common Rule itself acknowledges non-research reasons for returning individual results, including complying with legal requirements to do so. See 45 C.F.R. § 46.104(d)(8)(iv) (noting that “[t]his provision does not prevent an investigator from abiding by any legal requirements to return individual research results”). Whether an IRB would recognize other reasons for returning individual results as non-research activities is a closer call.
See Meyer, M.N., “Practical Tips for Ethical Data Sharing,” Advances in Methods and Practices in Psychological Science 1, no. 1 (2018): 1-14, at 12.
See Email correspondence between Brenda Belcher, Research Compliance Analyst, University of California Santa Cruz, and Misti Ault Anderson, HHS Office for Human Research Protections, April 9, 2019-July 11, 2019, available at <https://drive.google.com/file/d/1GrxMHOl2L8GIinE-hp2V9VLvMTaIpD3D/view> (last visited January 20, 2020) (cited with permission of Ms. Belcher).
Meyer, supra note 4, at 244-245.
See Lynch, H.F. et al., on behalf of the Consortium to Advance Effective Research Ethics Oversight (AEREO), “Of Parachutes and Participant Protection: Moving Beyond Quality to Advance Effective Research Ethics Oversight,” Journal of Empirical Research on Human Research Ethics 14, no. 3 (2019): 190-196.
National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research, The Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of Research (Washington, DC: US Gov’t Printing Office, 1979): section B-2.
45 C.F.R. § 46.111(a)(1).
Cf. Rothstein et al., supra note 29, at 17-19 (describing and interpreting the obligation to maximize the benefits of research).
Meyer, supra note 4, at 251-263.
See, e.g., Drabiak-Syed, K., “Lessons from Havasupai Tribe v. Arizona State University Board of Regents: Recognizing Group, Cultural, and Dignitary Harms as Legitimate Risks Warranting Integration into Research Practice,” Journal of Health & Biomedical Law 6, no. 1 (2010): 175-225.
45 C.F.R. § 46.111(a)(2).
See Meyer, supra note 4, at 292-298.
45 C.F.R. § 46.116(f).
Id., at § 46.102(j) (“Minimal risk means that the probability and magnitude of harm or discomfort anticipated in the research are not greater in and of themselves than those ordinarily encountered in daily life or during the performance of routine physical or psychological examinations or tests.”).
Resnik, D.B., “Eliminating the Daily Life Risks Standard from the Definition of Minimal Risk,” Journal of Medical Ethics 31 (2005): 35-38. See also Joffe, S. and Wertheimer, A., “Determining Minimal Risk for Comparative Effectiveness Research,” IRB: Ethics & Human Research 36, no. 3 (2014), available at <https://www.thehastingscenter.org/irb_article/determining-minimal-risk-for-comparative-effectiveness-research/?s=> (last visited January 20, 2020).
See Meyer, supra note 23, at 224-227 (discussing principles and process elements of the Common Rule and IRB review that unregulated entities might wish to voluntarily adopt).