
How Should Health Data Be Used?

Privacy, Secondary Use, and Big Data Sales

Published online by Cambridge University Press:  09 March 2016


Abstract:

Electronic health records, data sharing, big data, data mining, and secondary use are enabling exciting opportunities for improving health and healthcare while also exacerbating privacy concerns. Two court cases about selling prescription data, the Sorrell case in the U.S. and the Source case in the U.K., raise questions of what constitutes “privacy” and “public interest”; they present an opportunity for ethical analysis of data privacy, commodifying data for sale and ownership, combining public and private data, data for research, and transparency and consent. These interwoven issues involve discussion of big data benefits and harms and touch on common dualities of the individual versus the aggregate or the public interest, research (or, more broadly, innovation) versus privacy, individual versus institutional power, identification versus identity and authentication, and virtual versus real individuals and contextualized information. Transparency, flexibility, and accountability are needed for assessing appropriate, judicious, and ethical data uses and users, as some are more compatible with societal norms and values than others.

Type: Departments and Columns

Copyright © Cambridge University Press 2016

Introduction

Electronic health records, data sharing, big data, and secondary use of health data enable exciting opportunities for improving health and healthcare. They also contribute to new concerns over privacy, confidentiality, and data protection. Two court cases, one in the United Kingdom and one in the United States, provide opportunities for thinking through ethical issues related to these developments. Each case involved selling data for marketing prescription drugs, and in each case the court decided in favor of selling the data. However, the cases were decided on different grounds, raising more general issues of secondary use of health data and the growth of health-related databases, data sharing, data aggregation, and biometric identification.

Significant health data protection, policy, and ethical considerations are inherent in these cases. The cases call into question just what constitutes “privacy” and “public interest,” and considerations for balancing them. They provide an opportunity to weigh privacy against the numerous beneficial uses of data: for individual patient care, public health, research, biosurveillance, and marketing. The cases prompt ethical questions of commodifying medical information and of harmonizing policy across jurisdictional boundaries. They raise concerns of how health data can, and should, be used. Their consequences may affect biomedical informatics, patient and provider privacy, and regulation in ways this article explores, both in the United States and elsewhere.

How health data can, and should, be used is at the intersection of public health, research, care, privacy, and ethics. This article provides an ethical analysis of these interwoven ethical issues involving appropriate, judicious, and ethical secondary data use, reflecting a more general discussion of big data benefits and harms, and touching on common dualities of the individual versus the aggregate or the public interest, research (or, more broadly, outside the health field, innovation) versus privacy, individual versus institutional power, identification versus identity, identification versus authentication, and virtual versus real individuals and contextualized information.Footnote 1

I start by discussing what makes health data special, including international consensus on the importance of the clinician’s duty of confidentiality and on health data privacy or protection. Next I summarize the court cases. Then I consider who benefits from data disclosure and aggregation, and from secondary use for data mining, research, and sale. Throughout, I highlight potential benefits and harms and argue that transparency, flexibility, and accountability are needed. Ethical and policy analysis should assess data uses and users, as some are more compatible with societal norms and values than others.

Considering how health data should be used in light of these issues suggests policy opportunities concerning patient data and privacy protection. As the use of electronic health records, electronic medical devices, mobile and e-health applications, and biometric, social and behavioral, and genomic data spreads, these considerations are becoming more relevant worldwide.

What’s Special about Health Data?—International Principles

All countries recognize confidentiality as a patient’s rightFootnote 2 that is good for individual patients and clinicians, and for society as a whole.Footnote 3 Intimacies are revealed in the interest of good healthcare, so clinicians’ professional and fiduciary duties include a duty of confidentiality. Therefore, health information is given special protection internationally, though specific ways of achieving it differ. Lifestyle choices, reproductive abilities, and stigmatizing conditions are considered highly sensitive. But what is included in these categories differs with cultural background, from place to place, and from time to time. What is considered very private, embarrassing, stigmatizing, or grounds for discrimination varies among individuals and groups.Footnote 4 Countries, likewise, vary in what personal information is treated as needing restricted collection, use, and disclosure.Footnote 5,Footnote 6 They also balance privacy and other considerations differently; thus privacy protection is more lax in some places than in others. In India, for example, the judiciary considers privacy on a case-by-case basis, as an exception to the rule that permits government interference in private life. Unlike in Europe and the United States, public interest, welfare, and safety take precedence over individual rights, liberty, and autonomy.Footnote 7

Yet, as discussed subsequently, even if individual clinicians scrupulously meet the professional obligation of confidentiality, confidentiality can be compromised by legal requirements to collect, document, and disseminate personal health information, especially when maintained in computer databases that can be combined easily with other sources of information about the person.Footnote 8 What patients reveal for the purpose of healthcare may then be used in ways they never intended. Privacy practices have not caught up to these trends.

Fair Information Practices and De-Identification

The same Fair Information Practices (FIPs) underpin privacy policies in both the European Union and the United States. The European Union and the United States each protect personal data, including data concerning health, albeit differently.

The United States approaches privacy by sector; separate laws address confidentiality in distinct domains, such as finance and healthcare. The privacy of health data collected in the course of clinical care is governed by the U.S. Health Insurance Portability and Accountability Act (HIPAA) of 1996, extended by the HIPAA Privacy Rule in 2001 and again in 2013 by changes mandated by the Health Information Technology for Economic and Clinical Health (HITECH) Act (part of the American Recovery and Reinvestment Act [ARRA] of 2009) and the Genetic Information Nondiscrimination Act (GINA) of 2008.Footnote 9,Footnote 10,Footnote 11

The European Union takes a more comprehensive, general approach to privacy; Article 8 of the European Convention on Human Rights includes the right to data protection. This right is embodied in the 1995 Directive 95/46/EC on the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of Such Data.Footnote 12 Member states implement directives differently, but the forthcoming EU Data Protection Regulation will establish a single set of rules for data protection across the EU; the final texts are expected to be adopted by the European Parliament at the beginning of 2016, and the new rules to become applicable two years later.Footnote 13,Footnote 14

Despite their differences, both the United States and the EU construe privacy as control and protection of data rather than other conceptions of privacy.Footnote 15 Both the United States and the EU also make special note of health information, and both rely on stripping data of content presumed to identify the individual represented by the data. As Paul Ohm points out: “In addition to HIPAA and the EU Data Protection Directive, almost every single privacy statute and regulation ever written in the U.S. and the EU embraces—implicitly or explicitly, pervasively or only incidentally—the assumption that anonymization protects privacy, most often by extending safe harbors from penalty to those who anonymize their data.”Footnote 16

As these safe harbors stipulate, neither HIPAA nor the EU Data Protection Directive applies after data are de-identified. However, relying on de-identification contributes to what is considered an inadequate and problematic legal framework for data protection.Footnote 17 Addressing concerns over de-identification “would require a significant shift in approach towards data-protection across Europe.”Footnote 18 Similar deficiencies plague the United States.Footnote 19,Footnote 20

Privacy protection, then, depends not merely on de-identification but on expectations, transparency, and how data are used. De-identification, or anonymization, presumes that it is possible to identify and enumerate the kinds of data that might contribute to privacy risks and to specify how to prevent harms,Footnote 21 that such a list is static and sufficient in all contexts,Footnote 22 and that there are no privacy harms if the individual is not identified, even though individuals may object to uses of their personal data even if they themselves are anonymous.Footnote 23 Furthermore, HIPAA permits secondary uses of data for research, public health, law enforcement, judicial proceedings, and other “public interest and benefit activities,” without individual authorization, thereby assuming that “public interest” is clearly understood.Footnote 24,Footnote 25 All are questionable assumptions.

Duty of Confidentiality

Health data privacy relates not only to expectations about privacy in general but also to norms involving professional practice, privilege, autonomy, paternalism, and protected communication and the duty of confidentiality, as well as to requirements for data collection, dissemination, and retention.

Physicians and nurses have duties both to their individual patients and to the health of their communities.Footnote 26 At least since the time of the Hippocratic Oath, it is believed, societal norms and common law have recognized that clinicians’ duty to patients includes maintaining confidentiality, except where protecting the public interest or other individuals requires overriding it. The International Code of Ethics of the World Medical Association (WMA) makes respecting the right to confidentiality a duty integral to a physician’s responsibility to patients.Footnote 27 The WMA Declaration of Helsinki—Ethical Principles for Medical Research Involving Human Subjects (revised 2013) places a duty on physicians “to protect the life, health, dignity, integrity, right to self-determination, privacy, and confidentiality of personal information of research subjects . . . even though they have given consent.”Footnote 28

Recognizing that this personal information, whether collected for research or clinical practice, increasingly is held in databases, in 2002 the WMA adopted the Declaration on Ethical Considerations Regarding Health Databases: “Confidentiality is at the heart of medical practice and is essential for maintaining trust and integrity in the patient-physician relationship. Knowing that their privacy will be respected gives patients the freedom to share sensitive personal information with their physician.”Footnote 29

In this 2002 declaration, the WMA reaffirmed that violating this duty could “inhibit patients from confiding information for their own health care needs, exploit their vulnerability or inappropriately borrow on the trust that patients invest in their physicians” while at the same time recognizing the value of secondary health data use for quality assurance, risk management, and retrospective study.Footnote 30

Thus, a key reason for treating health data as requiring special protection is to maintain trust between clinician and patient, in the interest of social and public order as well as better care for each individual patient. In recognition of this ethical duty, confidentiality is seen worldwide as a health professional’s legal duty, one that protects the professional from giving legal testimony, thereby serving the interests of patient and public by maintaining trust during medical encounters. Nowhere can private data about a patient rightly be passed to a third party without that patient’s permission, except as required by law. French criminal law makes this universal spirit apparent by criminalizing a physician’s breach of confidentiality even in court testimony, and even if the patient would allow it.Footnote 31

How people value and respond to concerns about health data privacy is affected by context and common expectations of privacy.Footnote 32 Many recognize that clinicians need highly personal information in order to care for them, and, because of the long-standing history of trust in professional confidentiality, such patients willingly share sensitive information with those who treat them. As Deryck Beyleveld and Elise Histed eloquently point out:

Information that patients provide for their treatment is about very personal and sensitive areas of their lives. Indeed, it relates to their very existence, both physically and symbolically. As such, it is not information that they may be presumed to be prepared to disclose or have used freely. It is their vulnerability, constituted by pain and distress, or fears about their health and lives, that leads them to disclose this information to health professionals. At the same time, people are apt to attach great importance to intimate information about themselves and their bodies, and this can be associated with mystical and religious beliefs, which by their very nature can be idiosyncratic.Footnote 33

Patient Benefits and Harms

Individuals also may provide health information freely via health-related social networking, web postings, and searches; or because it is required legally, as for prescriptions. Such information could be consolidated and linked with other data for beneficial or nefarious purposes, sometimes without individuals’ knowledge. Patients benefit from having their record information available from previous clinical visits, whether or not those visits were with the same clinician or in the same facility, because clinicians can make better care decisions in light of fuller understanding of their patients’ past clinical histories. Patients also benefit from public health surveillance and research that depends on combining health information from individual patients to improve public health and develop better treatments. Patients may benefit from making identifiable information concerning adverse drug events available to pharmaceutical companies so that those companies can follow up with patients and improve drug safety, as Source Informatics argued in the U.K. court case discussed subsequently, and as the International Pharmaceutical Privacy Consortium argues more generally.Footnote 34,Footnote 35 Data aggregator IMS Health Canada (IMS Health, Inc., was a plaintiff in one of the court cases) unsurprisingly takes the position that analyzing doctors’ prescribing habits contributes to patients becoming informed consumers.Footnote 36

Yet patients can be harmed when data about them are used to violate privacy: to deny employment, credit, housing, or insurance, and for identity theft and other unsavory purposes. Some fear that patients who are insecure about the confidentiality of prescription or other health record information may withhold information, refuse diagnostic and genetic testing, or decline electronic prescriptions.Footnote 37,Footnote 38,Footnote 39 People do change their behavior and withhold information in order to protect their health information privacy.Footnote 40 Even before the widespread use of electronic health records, a 2000 Gallup poll indicated that the vast majority of people in the United States opposed third-party access to medical data without a patient’s permission, and, furthermore, that 67 percent of those polled opposed the release of data to medical researchers.Footnote 41 Similarly, the Pew Internet and American Life Project reported, based on a 1999 survey, that nearly one in six patients had withheld information, provided inaccurate information, doctor-hopped, paid out of pocket instead of using insurance, or even avoided care in order to protect privacy. Eighty-five percent feared that seeking health information on the Internet would result in changes in insurance coverage or otherwise reveal their information.Footnote 42

Transparency and Consent

As information resources become more ubiquitous and information sharing becomes more profitable, more thought is needed concerning which data uses are acceptable and what control individuals should have over data about themselves. Privacy violations may compromise patient care, the information in patients’ records, and patient-clinician relationships. The principles of data protection—transparency, legitimacy, and proportionality—embodied in the EU Data Protection Directive, therefore, specify that the person from whom data are obtained should be informed of what will be done with this information and to whom it will be disclosed. This allows the individual to consent or object and to withdraw or correct the data. Also, according to the directive, the data should be kept only as long as necessary for the specified purpose,Footnote 43 even though that could compromise later retrospective research.

Patients’ privacy concerns are exacerbated when patients, and even clinicians, have little idea of what becomes of their data, or just what is protected and what is not.Footnote 44 Withholding information from one’s clinician is neither in the public interest nor beneficial to that patient’s individual interest in proper healthcare. Yet, removing identifying information from patient records may not alleviate concerns, especially in light of increasing public awareness of privacy violations surrounding big data and the ease with which data sets that were meant to be kept apart now are combined and used for reidentification.Footnote 45,Footnote 46,Footnote 47,Footnote 48,Footnote 49 Further, without transparency, consent is meaningless.
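The linkage mechanism behind such reidentification can be made concrete with a minimal, purely illustrative Python sketch (all names, dates, and postcodes below are invented): removing names while retaining quasi-identifiers such as birth date, postcode, and gender leaves enough to join a “de-identified” health data set with a public record that still carries names.

```python
# Illustrative sketch with hypothetical data: re-identification by linking a
# "de-identified" health data set to a public data set on shared
# quasi-identifiers (birth date, postcode, gender).

# De-identified prescription records: names removed, quasi-identifiers kept.
health_records = [
    {"dob": "1961-07-14", "postcode": "SW1A 1AA", "gender": "F", "drug": "drug A"},
    {"dob": "1984-03-02", "postcode": "M1 4BT", "gender": "M", "drug": "drug B"},
]

# A public data set (e.g., an electoral roll) that includes names.
public_records = [
    {"name": "J. Smith", "dob": "1961-07-14", "postcode": "SW1A 1AA", "gender": "F"},
    {"name": "A. Jones", "dob": "1990-11-30", "postcode": "M1 4BT", "gender": "M"},
]

def reidentify(health, public):
    """Link records whose quasi-identifiers match exactly."""
    matches = []
    for h in health:
        for p in public:
            if (h["dob"], h["postcode"], h["gender"]) == (
                p["dob"], p["postcode"], p["gender"]
            ):
                # The "anonymous" prescription is now attached to a name.
                matches.append({"name": p["name"], "drug": h["drug"]})
    return matches

print(reidentify(health_records, public_records))
# → [{'name': 'J. Smith', 'drug': 'drug A'}]
```

No sophisticated tooling is required; whenever two data sets share a few quasi-identifying fields, an exact join of this kind can reattach identities to records that were stripped of names.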

Two Court Cases

Two court cases provide occasion for thinking about the ethical implications of data sale and secondary use in light of international principles of health data privacy and protection. Each case involves selling prescription data for pharmaceutical marketing. In both the United States and the United Kingdom, data aggregators successfully challenged restrictions on such data use and sale.

The 2011 U.S. Supreme Court case Sorrell v. IMS Health Inc. et al. Footnote 50 was decided on free speech grounds. Although the legalities involve unique features of U.S. constitutional law, a similar case in the U.K. in 2000, R v. Department of Health, Ex Parte Source Informatics Ltd.,Footnote 51 points to the international nature of the ethical issues. That case was decided on the grounds that selling anonymized (de-identified) data did not violate pharmacists’ duty of confidentiality.

The decision in each case runs counter to public expectations of health data confidentiality. The public is hardly aware that aggregating and selling prescription and other health data constitute an international enterprise. Thus, the Sorrell and Source cases raise more general global concerns of privacy and data protection, on the one hand, and appropriate use and secondary use of data for data mining, marketing, research, public health, and healthcare, on the other. Elsewhere I address data de-identification, prescription and other health data aggregation and sale, and issues more specific to these two cases.Footnote 52 This article explores other ethical issues related to the cases—the benefits and harms of data sale; the trade-offs among privacy, individual health, and public health; and the need for transparency—so that the dimensions of responsible and ethical health data collection and use can be assessed.

Who Benefits?

Clinical data include data that patients are required to provide to receive care. In both the Sorrell and Source cases, prescription data were aggregated and sold. Patients, prescribers, and pharmacies are required by law to collect information related to prescribing. Data aggregators perform a valuable service in collecting, cleaning, and combining these and other data into useful resources, though the value does not accrue directly to those who are the original source of the data. Data aggregators should be compensated for the value added, but the sources deserve some benefit as well. Currently, they primarily bear costs, both financial and privacy-related.

The combination of required disclosure of personal data and the ease with which data can be collected and disseminated is not unique to pharmacies. Collecting and storing patient records is a cost of healthcare, one passed on to patients and payers, whether private or governmental. The organizations providing these data obtain them from those legally required to provide them—from individuals who pay, directly or indirectly through their private or public insurers, for their collection and maintenance. These individuals gain little direct benefit from the aggregation and sale of data about them, and they may be harmed by it. It mostly occurs without their knowledge or permission. Even in light of arguments that patients should be required, as a condition of treatment, to allow data about them to be used for research—a requirement counter to professional norms to provide care—it seems improper to require either patients or clinicians to disclose data they would otherwise choose to keep private so that others may profit financially from them, whether or not the data are de-identified.

Secondary use and big data analytics also are affected by the costs of collecting, storing, and organizing data, as well as by the costs of meeting regulatory requirements. To reduce costs, health data processing is outsourced from countries with stronger privacy protections to countries with weaker ones, despite its sensitive nature and consequent privacy risks.Footnote 53 Also to reduce costs, U.S. marketing organizations oppose opt-in consenting on the grounds that it would increase the cost of doing business.Footnote 54

But costs must be paid somehow. Both the Source and Sorrell cases were fought to protect the commercial value of health information. One way of recovering costs is by selling these data. Though some sources provide some data sets at little or no cost to researchers, cost could make data more accessible to pharmaceutical companies and other commercial enterprises than to researchers.Footnote 55,Footnote 56 Some fear that the trend toward treating data as private property could make it more difficult to develop comprehensive databases crucial for public health and research.Footnote 57

Research, trade, and innovation, as well as the globalized healthcare industry, provide considerable public benefit. There are ethical as well as economic costs to privileging privacy, but economic value may not be more important than privacy or other considerations. Law and common ethical practice prevent releasing medical information without a patient’s permission, but U.S. law does not prevent selling or transferring rights to records.Footnote 58 Data that can be sold can be replicated anywhere and, once sold, may be used for good or ill. Tracing the chain of data sales and use is difficult, making transparency and consent nearly impossible the further data are transferred from the original source.

Health Data Uses: Big Data, Data Mining, Research, and Biosurveillance

Electronic health records and health information networks provide a wealth of data for public health, health outcome improvements, and research. Data could be used for a range of beneficial purposes, from outcomes and comparative effectiveness research to designing clinical trials and monitoring drug safety. The benefits of these data for public health, marketing, research, drug development, identifying adverse effects, and biosurveillance; for reducing costs and overprescribing; and for regulating devices and software all are intertwined with privacy concerns. For some of these purposes, it is crucial to be able to identify individuals and link together an individual’s records, so a requirement for de-identification may further impair research.

However, there also could be harms. Patients may withhold sensitive information if they fear it will be used against them, even though it may be useful for other purposes. Studies based on analyzing large data sets could be compromised if individual prescribers or patients withhold information or their consent for data use.Footnote 59

Privacy advocates, researchers, and public health officials can be at odds over how to achieve benefits while protecting privacy; their disagreements may stem from different values and historical legacies. For example, the U.K.’s National Health Service (NHS), Royal College of Physicians, and the Wellcome Trust led a coalition of leading medical research organizations opposed to the proposed European General Data Protection Regulation, which, unlike the Data Protection Directive it would replace, would bind all 28 member countries. The proposal was acceptable to most EU nations; the European Parliament approved the committee report in full in 2014.Footnote 60

The regulation affects any organization that gathers, processes, and stores data, whether operating within the EU, doing business with organizations within the EU, or storing data in EU-member countries. As of this writing, most organizations were not ready for compliance. Research organizations were among those concerned about its impact. It is especially relevant here that the regulation defined personal data as any information about an individual, whether it relates to his or her private, professional, or public life; such data thus include medical information. Many of these personal data—a name, a photo, an email address, bank details, posts on social networking websites, or a computer’s IP addressFootnote 61—also appear in medical records. The original proposed regulation, therefore, increased health data protection and would have made illegal the NHS mass database of citizens’ health information, which could provide a valuable resource for improving care.Footnote 62,Footnote 63,Footnote 64 Opposition from the NHS and other research organizations contributed to changes put forward by the EU justice ministers in March 2015 to improve data sharing across healthcare services. They also tabled amendments regarding how to manage such special forms of data as health and genetic data, and when patient consent is needed.Footnote 65 The European Parliament, the Council, and the Commission agreed on the new regulation late in 2015, and it is expected to be adopted by the European Parliament at the beginning of 2016.Footnote 66

This NHS database also provokes privacy concerns while providing financial benefit, as the NHS sells the data.Footnote 67 Individuals can opt out of the new care.data database, which was to contain, for the first time, records from primary care (GP) practices. Privacy concerns delayed including those GP records.Footnote 68 Although other rules allow greater third-party access to other NHS databases,Footnote 69 insurers, pharmaceutical companies, and other private commercial enterprises will receive “pseudoanonymized” records that the NHS claims “will not contain information that identifies you,” but that instead include NHS numbers, birth dates, postcodes, and ethnicity and gender information.Footnote 70 The database was created, according to the NHS England website, to improve NHS services,Footnote 71 and to “drive economic growth by making England the default location for world-class health services research.”Footnote 72

In the United States, too, researchers and bioethicists recognize that privacy protections can impede research and healthcare quality improvement, with calls from such influential agencies as the Institute of Medicine to change the HIPAA Privacy Rule to allow for information-based research—that is, research using medical records or stored biological samples.Footnote 73

Some innovative approaches to meeting privacy, research, and commercial needs for data sharing include the new international Open Humans Network, which “attempts to break down health data silos through an online portal that will connect participants willing to share data about themselves publicly with researchers who are interested in using that public data and contributing their analyses and insight to it,”Footnote 74 and businesses based on similar ideas, such as PatientsLikeMe. Using the data people post, PatientsLikeMe produces publishable material on patient outcomes and comparative effectiveness, which is valuable for effectiveness research. Epidemiologic trends also can be identified through social media postings.Footnote 75,Footnote 76,Footnote 77 Those engaging in this social networking presumably feel it is beneficial to them. Even so, it would be better if they were aware of what is done with their data, instead of being surprised if they have not read subscription agreements carefully enough to know that PatientsLikeMe sells data to pharmaceutical and other companies and that sites such as Facebook are not private places.Footnote 78,Footnote 79

Who Sells and Uses Data? One Man’s Bread is Another Man’s Poison

As is evident from the multiplicity of uses, health data are valuable. Internationally, the idea of “liberating” data for secondary use is recognized as beneficial for individual and public benefit, research, entrepreneurship, and policy. Though transborder data flow is regulated by international agreements, such as the EU Data Protection Directive, presumably health data could be sold worldwide, to anyone, for any purpose. Balancing this with privacy concerns is fraught.Footnote 80 Strong privacy protection, such as the rights-centric approach of the European Court of Human Rights, could adversely affect the globalized healthcare industry, and innovation and trade.Footnote 81,Footnote 82,Footnote 83,Footnote 84

Entire patient records are among the many possible sources of data for which there is a lucrative market, for laudable as well as unsavory purposes. Incidents of medical identity theft increased by more than 20 percent in 2014 compared to 2013.Footnote 85 In the active black market in identifiable medical record information, health information is more valuable than U.S. Social Security numbers for identity theft.Footnote 86,Footnote 87 Though prices vary, such information sells for about ten times more than credit card numbers (which typically sell for no more than a few U.S. dollars) because it can be monetized by obtaining treatment paid for through identity theft or by extorting money from hacked corporations.Footnote 88,Footnote 89

Electronic records also make it possible for computer or software vendors, intermediaries, or newly created organizations to bundle and sell rights and data,Footnote 90 a practice useful for research, policy, marketing, and business. In the United States, there is an extensive list of organizations that can use and legally sell health information,Footnote 91 some for purposes patients and clinicians would not anticipate. Data sold by both U.S. state and federal agencies can be linked to individuals by using publicly available information, even if some of the data are de-identified.Footnote 92,Footnote 93

What is done with these data may harm some of the individuals who provided them while benefiting others, depending on what the data reveal. This combination of benefits and harms is evident in a variety of examples in which one’s records affect one’s services and costs. In the United States, where private medical insurance is the norm, private insurers use prescription and other claims data to deny insurance, charge differential premiums, or exclude some conditions.Footnote 94 Businesses often check the MIB (Medical Information Bureau) for job applicants’ underwriting data.Footnote 95 Aggregators purchase and combine data from the states as well as from pharmacies.Footnote 96 Credit agencies are the most frequent buyers of multistate health profiles, though IMS Health also purchases data from the states.Footnote 97 Government fusion centers, designed to promote data sharing among federal agencies and state and local governments, combine data from multiple sources—including health record information—for law enforcement, immigration control, and homeland security.Footnote 98,Footnote 99

Organizations, too, may benefit financially while providing social benefit through data sales. The American Medical Association and the U.S. Centers for Medicare and Medicaid Services sell provider data, and state Health Information Exchanges (HIEs) sell secondary data.Footnote 100,Footnote 101,Footnote 102 The U.K.’s National Health Service, too, sells data.Footnote 103 Insurance companies or health information technology vendors might aggregate and sell provider-identified data on performance and quality measures, the number of procedures performed, U.S. meaningful use criteria, data security breaches, and other useful compilations. Cash-strapped community health organizations, state Regional Extension Centers (RECs), county hospitals, the U.S. Veterans Administration, the Indian Health Service, the Joint Commission, or hospital associations also could sell data for similarly beneficent purposes. Hospitals routinely sell birth records.Footnote 104

Genetic data are also double-edged. Such data are needed for research, personalized medicine, and biobanking but also can make individuals and communities vulnerable. For example, in 2000, Iceland’s parliament sold exclusive rights to all the genetic and genealogical data from each of its 275,000 citizens to the U.S. company deCODE Genetics. Soon thereafter, deCODE signed a $200 million contract with Hoffmann-La Roche to search for several common human genetic diseases. Iceland had an opt-out policy, and the data were encrypted to de-identify individuals. Nevertheless, the Icelandic Supreme Court later ruled that creating the database was unconstitutional because it did not adequately protect personal privacy.Footnote 105

Clearly, provider or patient information is valuable. Hospitals could purchase data about competitors, providers could identify populations for treatment, researchers could conduct studies involving healthcare and public health practices, and government agencies could identify and influence health trends. If such sales were restricted, some fear, the data would not be collected or maintained at all, which could compromise research and new drug development.Footnote 106,Footnote 107 The Iceland genetic database sale, for example, led to identification of genes linked to disease,Footnote 108,Footnote 109 though capitalizing on these discoveries was limited to the company holding exclusive rights to them. DeCODE’s 2009 bankruptcy, which transferred ownership of the database from a scientific research company to Saga Investments LLC, and the subsequent sale of the database in 2012 to biotech pioneer Amgen again raised questions about data privacy and use.Footnote 110,Footnote 111

Countries as different as Canada, Estonia, Sweden, Singapore, and the Kingdom of Tonga have developed various models for protecting privacy and differing policies regarding commercial involvement and rights to samples for gene banks, all with the goal of improving the public health of the studied population and, in some cases, generating revenue for national healthcare budgets. Though all these policies are intended to maintain confidentiality, all of the data uses require personal identifiers to link individuals’ records across genetic, medical, genealogical, and lifestyle databases. International controversy over such databases therefore centers on confidentiality, consent, the extent to which commercial interests should influence policy, and whether commercial ownership facilitates or impedes research,Footnote 112,Footnote 113,Footnote 114 all of which are concerns related to collecting and selling healthcare data in general.

To raise additional considerations, I pose possibilities that might arise were the selling of health data unrestricted. Abortion opponents presumably could buy aggregated prescription information for medications that cause abortions, or animal rights activists could buy information about researchers’ animal or veterinary medicine purchases. Depending on who buys it and their purpose, such information could threaten or protect researchers’, clinicians’, and patients’ safety and might have adverse effects on research and clinical practice or might open new avenues. Physicians, patients, hospitals, and so on, in one country may be targeted for marketing by commercial ventures or medical tourism facilities in another. Some may welcome learning of such opportunities, whereas others may feel harassed or violated. Individuals in one country may experience salutary or salacious effects from having (identified or possibly re-identified) data available elsewhere. But without transparency, there is little chance of gaining individual consent or, on both individual and societal levels, assessing harm or benefit.

Ownership, Commodification, and De-Contextualization

The right to sell data is muddied by lack of clarity over the legalities of data ownership. Law in and outside the United States does not address health data ownership clearly; who the owner should be, and whether ownership is preferable to current or alternative approaches, remain open questions.Footnote 115,Footnote 116 Nor is it clear where those who sell data analytics services obtain the data, or how they might use them.Footnote 117 Well-known electronic health record vendors have sold de-identified copies of their patient databases to pharmaceutical companies, medical device makers, and health services researchers.Footnote 118 Vendor contracts are unusual in that some vendors lay claim to patient record data, whereas businesses and financial institutions typically do not give up their data to their software vendors.Footnote 119 Regardless of whether the data themselves or the means of access to them are owned by electronic health record vendors, some academic medical centers pay to get data from their own patients’ records. Vendors often consider their contracts intellectual property and do not reveal these and other contract provisions, a practice the American Medical Informatics Association considers unethical.Footnote 120

If health data are property, presumably, whoever owns the data can sell them. Some advocate clearly defined property rights in medical information, giving patients the right to monetize their access and control rights, as a way for individuals to control and benefit from what happens to data about them.Footnote 121 Others argue against property rights in patient data and advocate instead for public ownership akin to a data commons so that data from multiple sources can be de-identified and combined population-wide for public benefit.Footnote 122 Commodifying medical information strikes still others as anathema to professional values and the special relationship between doctor and patient. Privacy is valued because it facilitates ideals of personhood involving autonomy, individuality, respect, dignity, and worth as a human being.Footnote 123 Therefore, the idea of selling personal health data also disturbs those who think the practice commodifies the self and sullies ideas of personhood.Footnote 124,Footnote 125 This compromise of personhood is compounded because data in databases are necessarily de-contextualized. De-identification is an attempt to remove any connection with the person, but even identifiable health record data typically do not include all information a person may consider central to the self.

Conclusions

Widespread use of electronic patient record systems enables opportunities to improve healthcare through data sharing, secondary use, and big data analytics. Multiple healthcare professionals, payers, researchers, and commercial enterprises can access data and reduce costs by eliminating duplication of services and by conducting research on effective care. However, widespread use of electronic patient record systems also creates more opportunities for privacy violations, data breaches, and inappropriate uses.

Ethical and policy analysis related to health data and informatics should consider benefits and harms, taking into account both the uses and users of the information.Footnote 126,Footnote 127 Embarrassing an estranged spouse by publishing his or her mental health records is more distasteful than using those records combined with others’ to study and improve mental health. As this example suggests, some users (the researcher) are more appealing than others (the spouse). Moreover, an uncontroversial use may be morally offensive if the user is unsavory or controversial.Footnote 128 How should distinctions be made so that some data uses and users are permissible and some not? On what grounds? And who is best placed to make such decisions: the courts or legislators, clinicians and researchers who are most familiar with their data needs, companies that develop and market new medications, or patients and prescribers, who are most affected by privacy violations and can best weigh the relative importance of various values to themselves?Footnote 129

Those most familiar with, closest to, and affected by the potential use should have a strong say. They need to know about those uses, though, to express their preferences in an informed, thoughtful way. Many patients do not know what is, or can be, done with data about them, but keeping them ignorant is not the way to address concerns. Lack of accountability and transparency about health data uses feeds the public’s privacy concerns,Footnote 130 undermines the possibility of informed consent, and impairs research, care, and public health.

Ethical considerations over data use will, and should, evolve as the public becomes more aware of the value and pitfalls of data sharing, data aggregation, and data mining. Cases like Source and Sorrell encourage debate over propriety and values related to different kinds of data use. They also lead to examining when it is in the public interest for personal health data to be made available, just what that “public interest” is,Footnote 131,Footnote 132 and, for that matter, just what “privacy” comprises and entails as norms evolve.Footnote 133 The issues include considering, in a healthcare context, the dualities playing out with respect to big data in domains other than healthcare: the individual versus the aggregate, research versus privacy, individual versus institutional power, identification versus identity, identification versus authentication, and virtual people versus real people and contextualized information. They involve big data harms and benefits related to innovation and economic advancement, power shifts, access to knowledge, and freedom of communication.

Societies and governments need to grapple with these ethical issues, tensions between privacy and other considerations, and shifting norms. The numerous cross-cutting issues suggest that other areas of law, ethics, and social policy also can inform related ethical and legal considerations. For some time, the legal, bioethics, and informatics communities have been considering issues such as appropriate secondary use of data; patient and clinician relationships in light of the growth of electronic health records and health information technologies;Footnote 134,Footnote 135,Footnote 136 reliance on increasingly untenable de-identification; burgeoning electronic data collection, sharing, transmission, and aggregation; data use for public health, research, and innovation; and the privacy and security of health data.

As health tourism develops; as lifetime electronic health records that follow patients across governmental and institutional boundaries are used more widely; as databases grow and biobanks become digital; as biometric identification becomes more common; as radio frequency identification devices (RFIDs) are embedded in medical devices, smartpills, and patients; as home sensors and monitors are increasingly used; as mobile, wearable, and e-health applications expand; and as health information exchanges develop,Footnote 137,Footnote 138,Footnote 139,Footnote 140 informaticians can add to the conversation among governments, courts, regulatory agencies, professional societies, and other organizations to consider responses to issues involving health-related data. Combining legal and ethics scholarship with informaticians’ expertise concerning judicious and ethical data collection and use, together with their technical knowledge of data aggregation and identification, can contribute to more informed policies.

The Source and Sorrell court cases can provoke an initial reaction of outrage over privacy violations and data use without consent. Consequently, they call into question just what constitutes “privacy” and “public interest” and stimulate considerations as to how to balance them. They provide an opportunity to weigh privacy against numerous beneficial uses for data. Transparency and accountability are needed so that harms and benefits can be judged through public discussion and so that individual as well as societal decisions can be made on more informed and thoughtful grounds. Using data collected for one purpose (such as prescriptions) for another purpose (such as pharmaceutical marketing) can undermine public confidence, especially if the public is unaware of the reuse. Doing so without individuals’ permission violates international principles of data privacy.Footnote 141,Footnote 142,Footnote 143,Footnote 144,Footnote 145 The court cases prompt ethical questions about commodifying medical information and harmonizing policy across jurisdictional boundaries. Their consequences may affect biomedical informatics, patient and clinician privacy, and regulation in ways this article has explored, in the United States, the United Kingdom, and elsewhere.

References

Notes

1. Laura Wexler’s comments as a respondent at “The Critical Life of Information,” a conference at Yale University, April 11, 2014, outlined dualities related to big data; see http://wgss.yale.edu/sites/default/files/files/Critical%20Life%20of%20Information%20Program%20spreads.pdf (last accessed 19 Aug 2014) for conference information.

2. Jost, TS. Readings in Comparative Health Law and Bioethics. 2nd ed. Durham, NC: Carolina Academic Press; 2007.

3. Institute of Medicine (IOM). Beyond the HIPAA Privacy Rule: Enhancing Privacy, Improving Health Through Research. Washington, DC: The National Academies Press; 2009, at 78.

4. See note 3, IOM 2009, at 79.

5. Jones, P. Permission-based marketing under Canada’s new privacy laws. Franchise Law Journal 2004;24(2):267–303.

6. Walden, I. Anonymising personal data. International Journal of Law and Information Technology 2002;10(2):224–37.

7. Srinivas, N, Biswas, A. Protecting patient information in India: Data privacy law and its challenges. NUJS Law Review 2012;5(3):411–24.

8. Kaplan, B. Selling health data: De-identification, privacy, and speech. Cambridge Quarterly of Healthcare Ethics 2015;24(3):256–71.

9. United States Government, Department of Health and Human Services, Office for Civil Rights. Summary of the HIPAA Privacy Rule; available at http://www.hhs.gov/ocr/privacy/hipaa/understanding/summary/ (last accessed 30 June 2013).

10. United States Government, Department of Health and Human Services, Office for Civil Rights. Standards for Privacy of Individually Identifiable Health Information; available at http://aspe.hhs.gov/admnsimp/final/pvcguide1.htm (last accessed 19 Jan 2014).

11. United States Government, Department of Health and Human Services, HSS Press Office, New rule protects patient privacy, secures health information 2013 Jan 17; available at http://www.hhs.gov/about/news/2013/01/17/new-rule-protects-patient-privacy-secures-health-information.html (last accessed 1 Jan 2016). See also United States Government, Department of Health and Human Services, Office of the Secretary. 45 CFR Parts 160 and 164: Modifications to the HIPAA Privacy, Security, Enforcement, and Breach Notification Rules Under the Health Information Technology for Economic and Clinical Health Act and the Genetic Information Nondiscrimination Act; other modifications to the HIPAA Rules; final rule. Federal Register 2013 Jan 25:5565–702; available at http://www.gpo.gov/fdsys/pkg/FR-2013-01-25/pdf/2013-01073.pdf (last accessed 2 July 2014).

12. European Union. EU Directive 95/46/EC—The Data Protection Directive; available at http://www.dataprotection.ie/docs/EU-Directive-95-46-EC--Chapter-2/93.htm (last accessed 23 Mar 2014).

13. European Commission, Directorate General for Justice and Consumers. Agreement on Commission's EU data protection reform will boost Digital Single Market 2015 Dec 15; available at http://europa.eu/rapid/press-release_IP-15-6321_en.htm (last accessed 5 Jan 2016). See also European Commission, Directorate General for Justice and Consumers. Reform of EU data protection rules; available at http://ec.europa.eu/justice/data-protection/reform/index_en.htm (last accessed 5 Jan 2016).

14. Rossi B. Countdown to the EU General Data Protection Regulation: 5 steps to prepare. Information Age 2015 Mar 24; available at http://www.information-age.com/it-management/risk-and-compliance/123459219/countdown-eu-general-data-protection-regulation-5-steps-prepare (last accessed 13 May 2015).

15. Solove, DJ. A taxonomy of privacy. University of Pennsylvania Law Review 2006;154(3):477–560.

16. Ohm, P. Broken promises of privacy: Responding to the surprising failure of anonymization. UCLA Law Review 2010;57:1701–77, at 270.

17. Taylor, MJ. Health research, data protection, and the public interest in notification. Medical Law Review 2011;19(2):267–303.

18. See note 17, Taylor 2011, at 303.

19. Kaplan B. Patient health data privacy. In: Yanisky-Ravid S, ed. The Challenges of the Digital Era: Privacy, Information and More. New York: Fordham University Press; forthcoming.

20. See note 8, Kaplan 2015.

21. See note 16, Ohm 2010.

22. See note 19, Kaplan forthcoming.

23. Beyleveld, D, Histed, E. Betrayal of confidence in the Court of Appeal. Medical Law International 2000;4:277–311.

24. Koontz L. What is privacy? In: Koontz L, ed. Information Privacy in the Evolving Healthcare Environment. Chicago: Healthcare Information and Management Society (HIMSS); 2013:1–20.

25. See note 19, Kaplan forthcoming.

26. See note 8, Kaplan 2015.

27. World Medical Association. International Code of Medical Ethics; available at http://www.wma.net/en/30publications/10policies/c8/index.html (last accessed 2 May 2014).

28. World Medical Association. Declaration of Helsinki—Ethical Principles for Medical Research Involving Human Subjects; available at http://www.wma.net/en/30publications/10policies/b3/ (last accessed 2 May 2014).

29. World Medical Association. Declaration on Ethical Considerations Regarding Health Databases; available at http://www.wma.net/en/30publications/10policies/d1/ (last accessed 2 May 2014).

30. See note 29, WMA 2014.

31. See note 2, Jost 2007.

32. Malin, BA, El Emam, K, O’Keefe, CM. Biomedical data privacy: Problems, perspectives, and recent advances. JAMIA (Journal of the American Medical Informatics Association) 2013;20(1):2–6.

33. See note 23, Beyleveld, Histed 2000, at 296.

34. Dunkel, YF. Medical privacy rights in anonymous data: Discussion of rights in the United Kingdom and the United States in light of the Source Informatics cases. Loyola of Los Angeles International and Comparative Law Review 2001;23(1):41–80.

35. See note 7, Srinivas, Biswas 2012.

36. See note 5, Jones 2004.

37. Powell, J, Fitton, R, Fitton, C. Sharing electronic health records: The patient view. Informatics in Primary Care 2006;14(1):55–7.

38. Schers, H, van den Hoogen, H, Grol, R, van den Bosch, W. Continuity of information in general practice: Patient views on confidentiality. Scandinavian Journal of Primary Health Care 2003;21(1):21–6.

39. See note 23, Beyleveld, Histed 2000.

40. See note 32, Malin et al. 2013.

41. See note 34, Dunkel 2001, at 70.

42. Choy C, Hudson Z, Pritts J, Goldman J. Exposed Online: Why the New Federal Health Privacy Regulation Doesn’t Offer Much Protection to Internet Users. Health Privacy Project, Institute for Healthcare Research and Policy, Georgetown University: Pew Internet and American Life Project; 2001, at 4; available at http://www.pewinternet.org/files/old-media/Files/Reports/2001/PIP_HPP_HealthPriv_report.pdf.pdf (last accessed 11 May 2015).

43. See note 12, EU 2014.

44. McGraw, D. Building public trust in uses of Health Insurance Portability and Accountability Act de-identified data. JAMIA (Journal of the American Medical Informatics Association) 2013;20(1):29–34.

45. Curfman, GD, Morrissey, S, Drazen, JM. Prescriptions, privacy, and the First Amendment. New England Journal of Medicine 2011;364(21):2053–5.

46. Tien L. Online behavioral tracking and the identification of Internet users. Paper presented at: From Mad Men to Mad Bots: Advertising in the Digital Age [conference]. The Information Society Project at the Yale Law School. New Haven, CT; 2011.

47. Benitez, K, Malin, B. Evaluating re-identification risks with respect to the HIPAA Privacy Rule. JAMIA (Journal of the American Medical Informatics Association) 2010;17(2):169–77.

48. See note 16, Ohm 2010.

49. See note 8, Kaplan 2015.

50. Sorrell v. IMS Health, Inc., et al., 131 S. Ct. 2653 (2011).

51. R v. Department of Health, Ex Parte Source Informatics Ltd. [C.A. 2000] 1 All ER 786. See also R v. Department of Health, Ex Parte Source Informatics Ltd. European Law Report 2000;4:397–414.

52. See note 8, Kaplan 2015.

53. See note 7, Srinivas, Biswas 2012.

54. See note 5, Jones 2004.

55. Baxter, AD. IMS Health v. Ayotte: A new direction on commercial speech cases. Berkeley Technology Law Journal 2010;25:649–70.

56. Pasquale, F. Restoring transparency to automated authority. Journal on Telecommunications and High Technology Law 2011;9:235–54.

57. Rodwin, MA. Patient data: Property, privacy, and the public interest. American Journal of Law and Medicine 2010;36:586–618, at 589.

58. Hall, MA, Schulman, KA. Ownership of medical information. JAMA 2009;301(12):1282–4.

59. Gooch, GR, Rohack, JJ, Finley, M. The moral from Sorrell: Educate, don’t legislate. Health Matrix 2013;23(1):237–77.

60. NHS European Office. Data Protection; 2015 Mar 24; available at http://www.nhsconfed.org/regions-and-eu/nhs-european-office/influencing-eu-policy/data-protection (last accessed 15 May 2015).

61. See note 14, Rossi 2015.

62. O’Donoghue C. EU research group condemns EU regulation for restricting growth in life sciences sector; 2014; available at http://www.globalregulatoryenforcementlawblog.com/2014/02/articles/data-security/eu-research-group-condemns-eu-regulation-for-restricting-growth-in-life-sciences-sector/ (last accessed 23 Mar 2014).

63. Farrar J. Sharing NHS data saves lives; EU obstruction will not. The Telegraph 2014 Jan 14; available at http://www.telegraph.co.uk/health/nhs/10569467/Sharing-NHS-data-saves-lives-EU-obstruction-will-not.html (last accessed 23 Mar 2014).

64. European Public Health Alliance. [Update] General Data Protection Regulation; available at http://www.epha.org/5926 (last accessed 23 Mar 2014).

65. NHS Confederation. EU ministers table changes to data privacy; 2015 Mar 13; available at http://nhsconfed.org/news/2015/03/eu-ministers-table-changes-to-data-privacy-laws (last accessed 14 May 2015).

66. See note 13, European Commission 2015.

67. Doctorow C. UK set to sell sensitive NHS records to commercial companies with no meaningful privacy protections—UPDATED; 2014 Feb 4; available at http://boingboing.net/2014/02/04/uk-set-to-sell-sensitive-nhs-r.html (last accessed 5 Feb 2014).

68. Donnelly L. Hospital records of all NHS patients sold to insurers. The Telegraph 2014 Feb 23; available at http://www.telegraph.co.uk/health/healthnews/10656893/Hospital-records-of-all-NHS-patients-sold-to-insurers.html (last accessed 24 July 2014).

69. See note 68, Donnelly 2014.

70. NHS Choices. Your records: Better information means better care; available at http://www.nhs.uk/nhsengland/thenhs/records/healthrecords/pages/care-data.aspx (last accessed 24 July 2014).

71. See note 70, NHS Choices 2014.

72. Ramesh R. NHS patient data to be made available for sale to drug and insurance firms. The Guardian 2014 Jan 19; available at http://www.theguardian.com/society/ 2014/jan/19/nhs-patient-data-available-companies-buy (last accessed 24 July 2014).

73. Institute of Medicine. Beyond the HIPAA Privacy Rule: Enhancing Privacy, Improving Health Through Research. Washington, DC: National Academies; 2009; available at http://www.iom.edu/∼/media/Files/Report%20Files/2009/Beyond-the-HIPAA-Privacy-Rule-Enhancing-Privacy-Improving-Health-Through-Research/HIPAA%20report%20brief%20FINAL.pdf (last accessed 22 Jan 2014).

74. Open Humans Network. Open Humans Network wins Knight News Challenge: Health Award; available at http://openhumans.org/ (last accessed 1 July 2014).

75. Christakis, NA, Fowler, JH. Social network visualization in epidemiology. Norwegian Journal of Epidemiology 2009;19(1):5–16.

76. Christakis, NA, Fowler, JH. Social network sensors for early detection of contagious outbreaks. PLoS ONE 2010;5(9):e12948.

77. Velasco, E, Agheneza, T, Denecke, K, Kirchner, G, Eckmanns, T. Social media and Internet-based data in global systems for public health surveillance: A systematic review. The Milbank Quarterly 2014;93(1):7–33.

78. Andrews, L. I Know Who You Are and I Saw What You Did: Social Networks and the Death of Data Privacy. New York: Free Press; 2011, at 1–3.

79. Angwin, J. Dragnet Nation: A Quest for Privacy, Security, and Freedom in a World of Relentless Surveillance. New York: Times Books, Henry Holt; 2014, at 33–4.

80. Geissbuhler, A, Safran, C, Buchan, I, Bellazzi, R, Labkoff, S, Eilenberg, K, et al. Trustworthy reuse of health data: A transnational perspective. International Journal of Medical Informatics 2013;83(1):1–9.

81. See note 7, Srinivas, Biswas 2012.

82. See note 17, Taylor 2011.

83. Bambauer JR. Is data speech? Stanford Law Review 2014;66:57–120.

84. Zarsky TZ. The privacy/innovation conundrum. Lewis & Clark Law Review 2015;19(1); available at http://ssrn.com/abstract=2596822 (last accessed 19 May 2015).

85. Dvorak K. Med identity theft continues to rise; 2015 Feb 23; available at http://www.fiercehealthit.com/story/med-identity-theft-continues-rise/2015-02-23?utm_medium=nl&utm_source=internal (last accessed 14 May 2015).

86. Avila J, Marshall S. Your medical records may not be private: ABC News Investigation. ABC News 2012 Sept 13; available at http://abcnews.go.com/Health/medical-records-private-abc-news-investigation/story?id=17228986&page=2 (last accessed 22 Mar 2014).

87. Nguyen V, Nious K, Carroll J. Your medical records could be sold on black market: NBC Investigative Unit surprises strangers with private medical details. NBC Bay Area 2013 June 18; available at http://www.nbcbayarea.com/news/local/Medical-Records-Could-Be-Sold-on-Black-Market-212040241.html (last accessed 22 Mar 2014).

88. Lawrence D. End of Windows XP support means added opportunity for hackers. Businessweek 2014 Apr 4; available at http://www.businessweek.com/articles/2014-04-04/end-of-windows-xp-support-means-added-opportunity-for-hackers (last accessed 1 July 2014).

89. Shahani A. The black market for stolen health care data. NPR; 2015 Feb 13; available at http://www.npr.org/sections/alltechconsidered/2015/02/13/385901377/the-black-market-for-stolen-health-care-data (last accessed 14 May 2015).

90. See note 58, Hall, Schulman 2009.

91. See note 34, Dunkel 2001.

92. See note 47, Benitez, Malin 2010.

93. Roberston J. States’ hospital data for sale puts privacy in jeopardy. Health Leaders Media; 2013; available at http://www.healthleadersmedia.com/content/QUA-292963/States-hospital-data-for-sale-puts-privacy-in-jeopardy (last accessed 14 June 2013).

94. Brief for the New England Journal of Medicine, the Massachusetts Medical Society, the National Physicians Alliance, and the American Medical Students Association as Amici Curiae Supporting Petitioners, William H. Sorrell v. IMS Health, Inc. et al., 2010 U.S. Briefs 779 (No. 10-779), 2011 U.S. S. Ct. Briefs LEXIS 299.

95. Holtzman, DH. Privacy Lost: How Technology Is Endangering Your Privacy. San Francisco: Jossey-Bass; 2006, at 195.

96. See, for example, RPC Health Data Store. CMS MedPAR Hospital Data File; available at http://www.healthdatastore.com/cms-medpar-hospital-data-file.aspx (last accessed 13 Sept 2013).

97. [Winston JS]. States’ hospital data for sale puts patient privacy in jeopardy; 2013 June 7; available at https://www.annualmedicalreport.com/states-hospital-data-for-sale-puts-patient-privacy-in-jeopardy/ (last accessed 19 Jan 2014).

98. Bady A. World without walls—privacy laws should be recrafted for the data fusion age. Technology Review 2011;114(6):66–71.

99. United States Government, Department of Justice. Fusion Center Guidelines: Developing and Sharing Information and Intelligence in a New Era; 2006; available at http://www.it.ojp.gov/documents/fusion_center_guidelines.pdf (last accessed Mar 2012).

100. See note 45, Curfman et al. 2011.

101. United States Government, Department of Health and Human Services, Centers for Medicare and Medicaid Services. Agreement for Use of Centers for Medicare & Medicaid Services (CMS) Data Containing Unique Identifiers, Form CMS-R-0235, OMB No. 0938-0734; available at http://www.cms.gov/Medicare/CMS-Forms/CMS-Forms/downloads//cms-r-0235.pdf (last accessed 13 Sept 2013).

102. Hebda, T, Czar, P. Handbook of Informatics for Nurses and Healthcare Professionals. 4th ed. Upper Saddle River, NJ: Pearson/Prentice Hall; 2009, at 321.

103. See note 68, Donnelly 2014.

104. See note 95, Holtzman 2006, at 192.

105. McGraw Hill General and Human Biology Case Studies. Gene Banks versus Privacy Invasion; available at http://www.mhhe.com/biosci/genbio/casestudies/sellinggenes.mhtml (last accessed 2 May 2014).

106. Brief for the Association of Clinical Research Organizations as Amici Curiae Supporting Respondents, William H. Sorrell v. IMS Health, Inc., et al., 2011 WL 2647130 (2011) (No. 10-779), (2011).

107. See note 59, Gooch et al. 2013.

108. See note 105, McGraw Hill 2014.

109. Austin MA, Harding S, McElroy C. Genebanks: A comparison of eight proposed international genetic databases. Community Genetics 2003;6(1):37–45.

110. Gillham WW. Genes, Chromosomes, and Disease: From Simple Traits, to Complex Traits, to Personalized Medicine. Upper Saddle River, NJ: Pearson Education, published as FT Press Science; 2011, at 18–19.

111. Amgen. Amgen to Acquire deCODE Genetics, a Global Leader in Human Genetics; available at www.amgen.com/media/media_pr_detail.jsp?releaseID=1765710 (last accessed 2 May 2014).

112. See note 109, Austin et al. 2003.

113. Annas GJ. Rules for research on human genetic variation—lessons from Iceland. New England Journal of Medicine 2000;342(24):1830–3.

114. Gulcher JR, Stefánsson K. The Icelandic Healthcare Database and informed consent. New England Journal of Medicine 2000;342(24):1827–9.

115. See note 19, Kaplan forthcoming.

116. Evans BJ. Much ado about data ownership. Harvard Journal of Law & Technology 2011;25(1):69–130.

117. For example, GE Data Visualization uses information “based on 7.2 million patient records from GE’s proprietary database”; available at http://visualization.geblogs.com/visualization/network/ (last accessed 27 Sept 2013). GE Healthcare’s Healthcare IT Solutions—available at http://www3.gehealthcare.com/en/Products/Categories/Healthcare_IT?gclid=CIKQ4Z6P7LkCFcE7OgodTDIAPQ and http://www3.gehealthcare.com/en/Products/Categories/Healthcare_IT/Knowledge_Center (last accessed 27 Sept 2013)—includes patient records and patient portals.

118. Sittig DF, Singh H. Legal, ethical, and financial dilemmas in electronic health record adoption and use. Pediatrics 2011 Apr;127(4):e1042–7.

119. Moore J, Tholemeier R. Whose data is it anyway? The Health Care Blog; 2013 Nov 20; available at http://thehealthcareblog.com/blog/2013/11/20/whose-data-is-it-anyway-2/ (last accessed 3 Feb 2014).

120. Goodman KW, Berner E, Dente MA, Kaplan B, Koppel R, Rucker D, et al. Challenges in ethics, safety, best practices, and oversight regarding HIT vendors, their customers, and patients: A report of an AMIA special task force. JAMIA (Journal of the American Medical Informatics Association) 2011;18(1):77–81.

121. Hall MA. Property, privacy, and the pursuit of interconnected electronic health records. Iowa Law Review 2010;95:631–63.

122. See note 57, Rodwin 2010.

123. See note 3, IOM 2009, at 77.

124. See note 58, Hall, Schulman 2009.

125. Atherley G. The public-private partnership between IMS Health and the Canada Pension Plan. Fraser Forum 2011:5–7.

126. Miller RA, Schaffner KF, Meisel A. Ethical and legal issues related to the use of computer programs in clinical medicine. Annals of Internal Medicine 1985;102:529–36.

127. Goodman KW. Health information technology: Challenges in ethics, science and uncertainty. In: Himma K, Tavani H, eds. The Handbook of Information and Computer Ethics. Hoboken, NJ: Wiley; 2008:293–309.

128. See note 127, Goodman 2008.

129. Data mining case tests boundaries of medical privacy. CMAJ 2011;183(9):E509–10.

130. See note 44, McGraw 2013.

131. See note 17, Taylor 2011.

132. See note 57, Rodwin 2010, at 617–18.

133. See note 15, Solove 2006.

134. Goodman KW. Ethics, information technology, and public health: New challenges for the clinician-patient relationship. Journal of Law, Medicine and Ethics 2010 Spring:58–63.

135. Kaplan B, Litewka S. Ethical challenges of telemedicine and telehealth. Cambridge Quarterly of Healthcare Ethics 2008;17(4):401–16.

136. See note 19, Kaplan forthcoming.

137. See note 134, Goodman 2010.

138. See note 135, Kaplan, Litewka 2008.

139. See note 19, Kaplan forthcoming.

140. Roland D. UK to get 200 high-tech factory jobs making “swallowable sensors.” The Telegraph 2014 Mar 10; available at http://www.telegraph.co.uk/finance/10687395/UK-to-get-200-high-tech-factory-jobs-making-swallowable-sensors.html (last accessed 17 July 2014).

141. See note 24, Koontz 2013.

142. See note 44, McGraw 2013.

143. See note 23, Beyleveld, Histed 2000.

144. See note 12, EU 2014.

145. Rodrigues, RJ, Wilson, P, Schanz, SJ. The Regulation of Privacy and Data Protection in the Use of Electronic Health Information: An International Perspective and Reference Source on Regulatory and Legal Issues Related to Person-Identifiable Health Databases. Washington, DC: World Health Organisation (WHO); 2001.Google Scholar