
Risk before Justice: When the Law Contests Its Own Suspension

Published online by Cambridge University Press:  01 December 2008


Abstract

Contemporary security practices pose a particular paradox in the relationship between law and norm. On the one hand, the institution of risk practices in advance of, and in place of, juridical decisions appears to have become the technical resolution of choice to the politics of targeted security in the ‘war on terror’. The risk calculus makes possible an array of interventions – from detention, deportation, or ‘secondary’ security to asset freezing and ‘blacklisting’ – that operate in place of, and in advance of, the legal thresholds of evidence and decision. And yet, this article demonstrates, it is not the case that law recedes as risk advances, but rather that law potentially both authorizes and contests specific modes of risk management. As risk practices in the war on terror operate on and through a distinctive and novel terrain of the uncertain future, the capacity of juridical intervention to contest the exposure of people to dehumanizing technologies itself faces new potentials and limits.

Copyright © Foundation of the Leiden Journal of International Law 2008

The instruments of government will become diverse tactics rather than laws. Consequently, law recedes.Footnote 1

Instead of beginning with very specific rigid, legalistic rules, we ought to begin with shared principles. Sharing identity information will help us better understand who actually poses a risk and should receive more targeted security. Our risk management philosophy drives all that we do.Footnote 2

1. Introduction: ‘potentially dangerous people’

In May 2007, the US Secretary of Homeland Security, Michael Chertoff, addressed the European Parliament's Committee on Civil Liberties, Justice and Home Affairs. Amidst widespread European disquiet as to the deployment by US authorities of passenger name record (PNR) data for the pre-emptive risk scoring of air passengers entering the United States, Chertoff sought to place emphasis not on the question of the juridical grounds of the transnational extradition of data on European citizens, but on the establishment of norms and practices of risk management.Footnote 3 Indeed, during his visit to Europe, Chertoff was reported to have proffered a somewhat stark choice to the British government, between on the one hand the authorization of norms of data mining for the risk management of air passengers and on the other the withdrawal of the legal entitlement to visa waiver from some British citizens. Following the apparent foiling of a transatlantic aircraft bomb plot in August 2006, Chertoff discussed with the UK Home Office the intention to withdraw the right of visa waiver from ‘British citizens of Pakistani origin’ – categorizing British citizenship into degrees of risk, singling out those ‘potentially dangerous people’ to whom ‘we should pay greater attention’.Footnote 4

The alternative to what would be, in effect, the drawing of a legal exception to the exception of visa waiver has emerged in the form of an apparently extra-legal administrative complex of fused public and private data mining. ‘We use this data’, argues Chertoff, ‘to focus on behaviour, not race and ethnicity. In fact, it allows us to move beyond crude profiling based on prejudice, and to look at actual conduct.’Footnote 5 Of course, a risk-based system that is ‘not racial profiling’ but that treats past travel or remittances to Pakistan, specific name algorithms, and many other associations as identifiers of ‘potentially dangerous people’ is arguably precisely the withdrawal of the right to visa waiver by other means.

The deal struck between the US and UK governments, to accept ‘screening at their end, sharing intelligence with the Americans’ and ‘deporting Britons who failed screening once they arrived at an airport in the US’, leaves the mask of the legal category of visa waiver in place, while allowing the use of risk norms and categories to proliferate.Footnote 6 It is this institution of risk practices in advance of, and in place of, juridical decision that has become the technical resolution of choice to the politics of targeted security in the war on terror. At the time of writing, new memoranda of understanding governing the terms of the granting of US visa waiver to European citizens are being signed. One might say that, once instituted, such measures surpass and supersede all political and juridical debates on PNR or data mining of many kinds. The text of the agreements reveals the legal authorizations required in order for apparently extra-legal practices and procedures to be instituted: ‘section 711 of the Implementing Recommendations of the 9/11 Commission Act of 2007 reforms the visa waiver program, expanding opportunities for participation in the program subject to certain enhanced security requirements’.Footnote 7 Such enhanced security measures are outlined as ‘information exchange, screening information to combat terrorism and serious crime, general information on migration matters’, and ‘allowing for further dissemination of transferred information within the US Government’. The screening and risk scoring of information ‘in advance of travel’, limitless at least insofar as ‘the US Secretary of Homeland Security deems it necessary’, will determine the ‘eligibility of a citizen to travel to the US without a visa’ or ‘whether the citizen poses a law enforcement or security risk’.Footnote 8 In sum, the juridical designation of exceptions to the visa waiver clears space for multiple and proliferating risk practices that defer sovereign decisions – on citizenship, eligibility, deportation, and so on – into a discretionary complex.

The controversy of European extradition of data to the United States – as well as its subsequent ebbing away into apparently extra-legal, even apolitical, procedures (‘this is not profiling’; ‘private companies handle your data this way every day’, and so on) – is but one glimpse of a broader set of difficulties in how we have come to understand the relationship between law and norm. For, as law becomes ever more closely intertwined with a proliferating assemblage of expertise, risk consulting, administration, and discretion, it inhabits an inescapable paradox. On the one hand, law represents established rules, rights, and judgments – the ‘rule of law’, ‘which rests absolutely with every valid legislative act and consists in the production of legal effects’.Footnote 9 For many legal activists, as for civil liberties and human rights lawyers, it is precisely the defence of this rule of law that is most at stake in the challenging of apparent ‘suspensions’ such as those in the US Patriot Act. As lawyer and director of the Electronic Privacy Information Center (EPIC) Marc Rotenberg put it, ‘the EPIC juridical programme is in many ways not radical at all, but profoundly conservative. At least to the extent that it seeks to protect law made in Congress some thirty years ago.’Footnote 10 In this sense one might suggest, even if superficially, that law is engaged in the contestation of ‘counter laws’ that represent its own suspension.Footnote 11 And yet it is precisely law and juridical designation that make possible the authorization of judgments by other means that appear to us as extra-legal: a risk classification that revokes the rights of a citizen, an algorithmic judgement that places a person on a ‘selectee list’ for secondary security checks. Understood this way, it is not strictly the case that, as Foucault says, ‘law recedes’, but that ‘acts that do not have the value of law acquire its force’.Footnote 12 Thus, as in the floating suspension of visa waiver in favour of pre-emptive risk screening, the legal category of citizen is broken down and denuded, replaced with dissected degrees of recognition that are dependent upon ‘risk’, ‘conduct’, or ‘past patterns of travel’. It is not the case that ‘law recedes’ as risk advances, but rather that law itself authorizes a specific and particular mode of risk management.

As a consequence of the particular modern entanglement of law and norm, the legal contestation of risk measures instituted as discretionary administrative authority faces a quite specific set of problematics. Understood as a technique of governing, risk is uncertainty made certain. That is to say, it acts on the uncertain future in a way that undercuts law's conventional reliance on precedent, evidence, and judgement in the present. Risk technologies, as they have emerged in the contemporary war on terror, favour instead the rendering of pre-emptive decisions that do not calculate probability on the basis of past evidence, but rather on the horizon of what may happen in the future.Footnote 13 The risk calculus, then, has distinct and specific implications for international law, making possible an array of interventions – from detention, deportation, or ‘secondary’ security to asset freezing and ‘blacklisting’ – that operate in place of, and in advance of, the legal thresholds of evidence and decision.Footnote 14

In the sections that follow, I begin by mapping the contours of the contemporary risk–norm–law complex, then move to explore the means by which specific contemporary risk practices undercut the grounds for historical legal intervention: evidence, the judgement of the expert witness, and the legal subject as bearer of rights are all reoriented in a risk regime that acts pre-emptively and authorizes with indefinite and indeterminate limits. It is my argument that the law, as it becomes inextricably bound up with novel administrative and discretionary risk practices, is confronted with something of the contingency and fragility of its own essence, its own essential categories. To be clear, it is not the case that the scope for critical international legal intervention is entirely eradicated. After all, there can be little doubt that it is often by means of law and legal intervention that the normalization of risk practices is brought to public attention and made extraordinary. Indeed, in the absence of the legal challenges to practices of pre-emptive data mining, many of which actually revealed previously concealed security practices, would there be space for public engagement with the use of risk techniques in the war on terror? In the process of contesting the growing ubiquity of risk profiling, scoring, and screening, though, the law must confront its own paradox. As Mariana Valverde and Nikolas Rose suggest in their powerful genealogy of law and norm, ‘the codes, instruments and practices of law have functioned to extend the powers of administration over life in the name of reason’, and yet at the same time, ‘the discourse of rights and legality have been deployed as a principal critique of the extension of such rationalized powers over life’.Footnote 15 As risk practices in the war on terror operate on and through a distinctive and novel terrain of the uncertain future, the capacity of juridical intervention to contest the exposure of people to dehumanizing technologies itself faces new potentials and limits.

2. Risk, norm, and law

As a means of making an uncertain and unpredictable world appear amenable to management, the idea of risk has historically had a close and complex relationship with the principles and practices of law. Thus, for example, the early modern practices of life insurance – placing a wager on the merchant voyager's risk of death – were transformed into the rational calculations of actuarial practice by legal judgments and prohibitions on gambling on life.Footnote 16 Similarly, the nineteenth-century risks of work accident, sickness, or fatality were rendered manageable by the juridical devices of labour law, insurance, and benefit.Footnote 17 Understood in these terms, both risk and law have come to be ways of governing human life via calculation, judgment, and decision, as though the very unknowability and uncertainty of the future world could be brought into the purview of manageability in the present. For François Ewald, ‘a technology of risk is a means of disassembling, reconstructing, and organizing certain elements of reality’.Footnote 18 Risk, then, is a ‘manufactured uncertainty’, a way in which ‘we govern and are governed’.Footnote 19 It is a promise that we may ‘confront the world of lived experience (and all of its terrors), with the more neutral and predictable world of risk’.Footnote 20

In the context of post-9/11 worlds, however, the historical fungibility and contingency of risk, and its relation to law, have somewhat fallen away from analyses of the world and our place within it. In Ulrich Beck's reading, for example, the ‘world risk society’ reaches its limit when science and law can no longer respond to the proliferation of ‘potentially catastrophic and uninsurable events’.Footnote 21 For Richard Ericson and Aaron Doyle, a similar risk limit is reached in a ‘terrorism that strikes at the heart of risk society’, rendering past ways of taming chance and uncertainty impossible, and presenting ‘a stark reminder of the limits to risk assessment and management’.Footnote 22 And yet it is not the case that risk society has reached its limits at the threshold of a world depicted in terms of catastrophe (whether terrorism, climate change, landslide, or flooding), but rather that risk is precisely a technology of limits – a means of pushing at what is possible. Because, as Ewald says, ‘there is no risk in reality’, risk exists only as a ‘way we have come to understand the world and its problems’.Footnote 23 To define a world that is ‘post-9/11’ or characterized by the ‘war on terror’, then, is not to reach the limits of risk at all, but to come to understand the world deploying novel risk technologies with new and startling implications for law.Footnote 24

In the incorporation of risk management into contemporary security practice, the sheer unpredictability or incalculability of terrorist attack has not led, then, to the placing of limits on risk. On the contrary, it has made radical uncertainty the very basis for action via risk calculations. Thus, for example, Ron Suskind's account of the post-9/11 Bush administration's ‘one per cent doctrine’ positions the risk of catastrophe as the essence of a security decision: ‘if there is a one per cent chance of an event coming due, act as though it were a certainty’.Footnote 25 In effect, the precautionary principle is brought into security practice, ‘inviting one to anticipate what one does not know’ and making decisions ‘not in a context of certainty, nor even available knowledge, but of doubt, premonition, foreboding, fear and anxiety’.Footnote 26 In contrast to risk models that seek to prevent or to predict, then, those dominating security domains are pre-emptive – they function precisely by allowing movement, permitting life to play out. In his identification of risk as central to the security apparatus, Michel Foucault describes ‘the emergence of a completely different problem’ that orientates itself to ‘allowing circulations to take place, controlling them, sifting the good and the bad, ensuring things are always in movement’.Footnote 27 Integral to the idea of risk is the always already present possibility to govern on the very basis of uncertainty – via ‘differential risks’, ‘risk zones’, ‘different curves of normality’.Footnote 28
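The departure from an expected-cost calculus can be made concrete in a deliberately toy comparison. The sketch below is illustrative only – the figures, thresholds, and function names are assumptions introduced here, not drawn from any actual doctrine or system – but it shows how a precautionary rule of the kind Suskind describes acts on a low probability that a probability-weighted calculation would dismiss.

```python
# Toy contrast between an expected-cost calculus and the 'one per cent
# doctrine' described above. All numbers and names are hypothetical.

def expected_cost_decision(p_attack: float, cost_attack: float,
                           cost_intervention: float) -> bool:
    """Classical risk calculus: intervene only if the probability-weighted
    loss exceeds the cost of intervening."""
    return p_attack * cost_attack > cost_intervention

def one_percent_doctrine(p_attack: float, threshold: float = 0.01) -> bool:
    """Precautionary rule: any probability at or above the threshold is
    treated as a certainty, with no weighing of costs."""
    return p_attack >= threshold

p = 0.01  # a one per cent chance
print(expected_cost_decision(p, cost_attack=1_000, cost_intervention=100))  # False
print(one_percent_doctrine(p))  # True: acted upon 'as though it were a certainty'
```

The contrast, however crude, is the point of the precautionary logic: the decision is detached from any weighing of evidence or expected loss and attaches instead to the bare possibility of the event.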

Just as the designation of emergency or exception does not suspend the deployment of risk as a means of governing, so the space of exception depicted by Agamben is not as ‘empty’, ‘anomic’, or ‘kenomatic’ as he suggests.Footnote 29 As Fleur Johns has argued in relation to Guantánamo Bay, although the security measures and violence may ‘imply an extra-legal status’, in fact the detainees held there have ‘been the focus of painstaking work of legal classification’.Footnote 30 Far from a designation of exception that suspends international law, we see a space that teems with the classifications, categories, and judgements of ‘petty sovereigns’ or ‘experts in unease’, whose actions are authorized in and through law itself.Footnote 31 The ‘elaborate, multi-stage screening and evaluation process’ of Guantánamo, as described by Johns, is in fact precisely mirrored by the risk practices of data mining, screening, algorithmic decision, and scoring at multiple international borders.Footnote 32 In a very real sense the transformations in ideas of state responsibility under international law are echoed by shifts in public–private modes of risk management.Footnote 33 The proliferation of risk norms as modes of governing life, then, itself requires a complex and abundant programme of legislation and legal intervention. Far from being an empty or anomic space, the contemporary assemblage of law, risk, and norm abounds with the drawing of sovereign lines, the designation of citizenship and degrees of citizenship – all of which rely on a mobile form of norm/anomaly in order to differentiate and discriminate.

In pre-emptive modes of risk management, though, and in contradiction to the idea of anomic space, the norm governs in a specific way and with quite particular implications for law. For Ewald, the norm represents a ‘means of producing the common standard, a rule for common judgement that makes law possible in modern societies’.Footnote 34 In the contemporary war on terror, meanwhile, we are seeing the emergence of a more mobile form of norm than that which Ewald identifies with prudential or insurance-led modes of risk. In contrast to a common standard or rule, the norm extends into domains of adjudicating on suspicions, identifying anomaly, verifying the movements and transactions of the ordinary, and designating the threshold of the out of the ordinary. As Foucault explains with regard to risk and the security apparatus, while disciplinary relations of law to norm ‘started from a norm’ such that ‘the normal could be distinguished from the abnormal’, we see instead ‘the plotting of the normal and the abnormal . . . the interplay of differential normalities’.Footnote 35 Thus, for example, in the advancement of risk scoring and data mining of airline passengers and border-crossers, the common standard or rule of immigration norms is overridden by a mobile deployment of norm/anomaly derived from associations made inside integrated databases. In this sense, it is not strictly the norm that insinuates itself inside contemporary legal complexes, but rather processes of differentiated normalization that always fall short of and exceed the norm as common standard. Indeed, a key concern of legal activists seeking to contest risk profiling is that the population targeted cannot know the norm against which they will be judged.Footnote 36 Put simply, pre-emptive risk practices reserve the right always to apply differential ‘common standards’, normalities, and anomalies.

In summary, the intensified entanglement of law with risk practices that are instituted by ‘experts’ in judging the mobile norm (from software corporations to risk consultants) has exposed the difficulty law faces in intervening on the grounds of rights, justice, and the common standards upon which it conventionally relies. The fragile and mutable nature of such categories as rights to ‘a private life’ or ‘human dignity’, in truth always present in legal struggles for recognition, is cast in sharp relief by novel emerging complexes of law, risk, and norm.Footnote 37 The discourse of rights and justice – historically deployed both to extend and to contest legal–normal complexes – now finds itself implicated by its proximity to risk, and yet also uniquely situated to prise open space for public engagement. What, then, are the precise forms and modalities taken by contemporary juridical complicity in, and contestation of, risk practices?

3. Evidence and the expert witness

In December 2006 a data-driven risk-screening programme, the ‘Automated Targeting System’ (ATS), deployed by US authorities to screen travellers at all air, land, and sea borders, became the subject of public and legal debate. Initially developed for the specific purpose of assigning risk scores to cargo and container shipments into US ports of entry, screening imported goods in advance of their arrival at the US border, the ATS had been extended in full to the risk management of people at the border. As the Department of Homeland Security's privacy impact assessment for ATS reveals, the risk-based screening is thought to enable the ‘identification of previously unknown areas of note, concern or pattern’.Footnote 38 ATS analyses an array of data on passengers and border-crossers – including address, financial records, ‘no show’ history, how tickets were purchased, motor vehicle records, past instances of one-way travel, and seating and meal preferences – in order to assign a risk score to each individual. The score is used for a variety of purposes, including determining whether a person is to be placed on a ‘selectee list’ for further scrutiny, stopped for additional questioning, or, indeed, denied entry and deported. The risk-assessment calculation is classified and the results may be kept on file for up to forty years.
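Because the calculation itself is classified, it cannot be reproduced here; the sketch below is a minimal, purely hypothetical illustration of how a rule-based score over passenger attributes of this general kind might be assigned and compared against a ‘selectee’ threshold. Every field name, weight, and cut-off is an invented assumption, included only to make the mechanism of scoring legible.

```python
# Hypothetical sketch of a rule-based passenger risk score. The real ATS
# calculation is classified; fields, weights and thresholds here are invented.

from dataclasses import dataclass

@dataclass
class PassengerRecord:
    one_way_ticket: bool
    cash_purchase: bool
    no_show_history: int       # count of prior 'no show' reservations
    past_travel_flagged: bool  # prior travel to a destination deemed high-risk

# Invented weights attached to each data association.
WEIGHTS = {
    "one_way_ticket": 2.0,
    "cash_purchase": 1.5,
    "no_show_history": 0.5,    # per prior no-show
    "past_travel_flagged": 3.0,
}
SELECTEE_THRESHOLD = 4.0       # invented cut-off for 'secondary' screening

def risk_score(p: PassengerRecord) -> float:
    """Sum the invented weights for each attribute present in the record."""
    score = 0.0
    if p.one_way_ticket:
        score += WEIGHTS["one_way_ticket"]
    if p.cash_purchase:
        score += WEIGHTS["cash_purchase"]
    score += p.no_show_history * WEIGHTS["no_show_history"]
    if p.past_travel_flagged:
        score += WEIGHTS["past_travel_flagged"]
    return score

def selectee(p: PassengerRecord) -> bool:
    # A score at or above the threshold routes the traveller to secondary screening.
    return risk_score(p) >= SELECTEE_THRESHOLD

traveller = PassengerRecord(one_way_ticket=True, cash_purchase=True,
                            no_show_history=1, past_travel_flagged=False)
print(risk_score(traveller), selectee(traveller))  # 4.0 True
```

Even in so crude a form, the political difficulty discussed in this article is visible: the traveller can see neither the weights nor the threshold against which their conduct is normalized, scored, and acted upon.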

ATS represents but one example of an emerging complex of law, risk, and norm that is disturbing juridical conceptions of evidence and expert judgement, as well as international legal conceptions of responsibility and accountability. In effect, ATS deploys evidence before the fact and in advance of a future event that may or may not come to pass. As the precautionary principle migrates to security contexts, civil liberties and human rights lawyers confront the problem of data deployed as evidence of the unknown. Donald Rumsfeld's by now infamous comments on getting inside the ‘unknown unknowns’ led him to conclude that ‘the absence of evidence is not evidence of absence’.Footnote 39 It is precisely this assumption – that the threat of the unknown shifts the threshold of evidence – that has led legal activists to challenge the supposition of suspicion. According to Barry Steinhardt of the American Civil Liberties Union (ACLU), ATS casts suspicion on millions of innocent travellers, while seriously threatening privacy and constitutional liberties. Because the system screens entire populations in the search for ‘unknown terrorists’, deploying a calculation that can never itself be made visible, people can never meaningfully access, challenge, or correct their risk category. ‘Does the government get to scrutinize every address at which you've ever lived?’, asks Steinhardt, ‘quiz you about the fact that you once went 12 months without a job? About your web surfing, your online book purchases, your school transcripts, your associations?’Footnote 40

Similar concerns are raised by the senior counsel of EPIC, Melissa Ngo. ‘You can never really know what is in this risk score, or how to get off a “watch list”’, she argues; ‘citizens ask us this every day, will they ever tell you that you are or are not on a list? It is now seven hundred thousand names’.Footnote 41 For the EPIC lawyers filing complaints on ATS, the mobile and amorphous nature of the norm of evidence deployed is a crucial problem:

The sharp departure after 9/11 was to move from individualised surveillance to mass suspicion. In other words, governments always had tools to pursue individuals who were considered to be criminals or terrorists, but the understanding was that those investigations would be pursued based on targeted leads. The real erosion has been around the legal frameworks – it is a movement away from a targeted approach to establishing guilt, to a broad presumption that everyone may be a little bit guilty or a little bit terrorist inclined. That is the significance of the automated targeting system. They run a series of questions . . . it is a very odd thing for a democratic country to assign to people a probability of committing a terrorist act, even if it is a low probability.Footnote 42

The rise of pre-emptive risk calculations as a means of identifying vulnerable spaces and suspicious populations in the war on terror, then, has redefined by degrees the threshold of evidence required to make a security decision. Thus, for example, as Marieke de Goede has argued, the threshold for the freezing of financial assets or the monitoring of financial transactions is considered to be much lower than that for other forms of detention, and yet such actions have similar effects in terms of limiting life chances and withholding the means to a sustainable way of life.Footnote 43 Likewise, Bernard Harcourt suggests that the racial profiling of young Muslim men in the London Underground and the New York subway, whilst prohibited in established juridical models, is authorized in new forms by a supposition of suspicion in the war on terror. Citing Paul Sperry of Stanford's Hoover Institution, Harcourt illustrates how models of risk expertise from commercial contexts have come to be authorized in and through law: ‘it makes no sense to search old ladies or children; the police should target the high-risk population. Insurance companies profile policyholders based on probability of risk. That's just smart business. Likewise, profiling subway passengers based on proven security risk is just smart law enforcement’.Footnote 44

The diffusion of authority into a legal complex that incorporates private commercial expertise, as well as the assumption of ‘smart security’, is mirrored also in ‘smart borders’ programmes such as ATS. Assistant Secretary of Homeland Security Stewart Baker, speaking on the ATS programme, offers risk-based screening as an objective, neutral, and expert-led system that means that ‘grandmothers and infants move more quickly through security’, while others are screened in for a ‘closer look’.Footnote 45 The racial profiling and identity associations that are made inside the risk calculation are, of course, concealed by the appeal to techno-science.Footnote 46 The evidential basis for the designation of norm and anomaly is link analysis, an algorithmic model that looks for associations and flags risk on the basis of the links between data. In this way, pre-emptive risk screening makes the case for evidence deployed before the fact:

This is a lesson we learned from September 11. After-the-fact reviews of the hijackers travel reservations showed that we might have been able to uncover the plot if we'd had better computer systems and better access to travel data . . . We didn't connect those dots before 9/11, but we should have. We learned that lesson, and now ATS allows us to look for these links.Footnote 47

The connecting of dots celebrated by Baker is exactly the work of link analysis deployed across integrated databases. The 9/11 Commission documents are replete with references to the need to ‘connect the dots’ in order to provide evidence before the event. Indeed, private commercial models for connecting dots – such as those developed by IBM, Microsoft, Oracle, Raytheon, and Accenture, and later deployed in major contracts to manage US and UK borders – were reported by the Commission as a means by which ‘the events of 9/11 could have been anticipated and prevented’. The representation of evidence before the event as the solution to catastrophe risk in the war on terror has cleared substantial space for expert models of calculation. As Valverde and Rose note, in historical context ‘a plurality of different forms of expertise have attached themselves to the institutions and procedures of the law’.Footnote 48 At historical moments when law turns to expertise to authorize legal judgments and decisions, it reconsiders the basis of legal evidence. It becomes possible, then, for expert risk managers to claim that the evidence for the events of 9/11 was already present before the attack – indeed, that if their procedures and risk judgements were to be authorized in decisions to deport, detain, or intercept, future terrorist crimes could be prevented.
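Link analysis is a generic technique rather than a disclosed government algorithm. The following minimal sketch, built on invented records, shows how ‘connecting the dots’ can amount to nothing more than finding a path between records that share an attribute – an address, a payment card – so that association itself comes to stand in for evidence.

```python
# Minimal link-analysis sketch: flag an individual who is connected, within a
# few hops (person-attribute-person links), to a watch-listed entity via
# shared attributes. All records and names are invented.

from collections import defaultdict, deque

# Edges between people and the attributes they share (addresses, cards, etc.).
records = [
    ("person:A", "address:12 High St"),
    ("person:B", "address:12 High St"),
    ("person:B", "card:4929-xxxx"),
    ("person:C", "card:4929-xxxx"),
]
watch_list = {"person:A"}

graph = defaultdict(set)
for person, attribute in records:
    graph[person].add(attribute)
    graph[attribute].add(person)

def flagged(start: str, max_hops: int = 4) -> bool:
    """Breadth-first search: is any watch-listed node reachable within max_hops?"""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, hops = queue.popleft()
        if node in watch_list and node != start:
            return True
        if hops < max_hops:
            for neighbour in graph[node] - seen:
                seen.add(neighbour)
                queue.append((neighbour, hops + 1))
    return False

print(flagged("person:C"))  # True: C shares a card with B, who shares an address with A
```

In this toy form the evidential claim is simply the existence of a path in the data: person C has done nothing, but is rendered an anomaly through two degrees of association.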

If law is to prise open public space to contest the deployment of pre-emptive evidence ‘before the fact’, then it must first confront the contingency and performativity of all forms of evidence.Footnote 49 In many ways there is nothing at all new in the authorization of experts to judge criminality in advance or, as Foucault depicts twentieth-century medico-legal judgement, ‘expert opinion shows how the individual resembles his crime before he has committed it’.Footnote 50 To confront the contingency of legal categories is also to acknowledge law's intrinsic role in determining ‘the way in which we see and are given to the world to be seen’ – what Costas Douzinas refers to as the ‘legal screen’.Footnote 51 If the legal screen is interposed between the world of data, facts, and evidence and the making of social, political, and legal judgements and decisions, then pre-emptive evidence may itself be authenticated. Consider, by way of example, the case for data mining and biometric identifiers made by Michael Chertoff to the European Parliament:

In June 2003, using PNR data and other analytics, one of our inspectors at Chicago's O'Hare airport pulled aside an individual for secondary inspection and questioning. When the secondary officers were not satisfied with his answers they took his fingerprints and denied him entry to the United States. The next time we saw those fingerprints – or at least parts of them – they were on the steering wheel of a suicide vehicle that blew up and killed 132 people in Iraq.Footnote 52

The spectacular nature of the presentation of evidence – its intricate staging and performance – is writ large in Chertoff's appeal to new forms of fingerprint evidence. The conventional scene of crime fingerprint evidence, itself historically authorized by science and expertise, is invoked here to authenticate the biometric used to make a pre-emptive judgement at the border. Evidence at the threshold of unknown futures, although prevalent in novel forms in contemporary security practice, is actually authorized in ways that are not dissimilar to the commonplace and prosaic work of legal representations of what counts as evidence.

4. Digitized dissection and the legal subject

In 2005 the UK Home Office and the US Transportation Security Administration (TSA) began trials of new X-ray devices for the screening of bodies at border crossings. Rapiscan Systems' ‘Backscatter’ scanners appeared in Terminal 4 of London's Heathrow airport and in Phoenix's Sky Harbor airport, producing screened images of passengers' naked bodies as they passed through security checkpoints. In its 2007 budget statement to Congress, the TSA made special mention of the substantial investment in ‘Whole Body Imaging’ or ‘Backscatter X-ray’, stating that ‘the technology produces an image to identify contraband secreted on an individual without subjecting them to an invasive inspection’. Proposed for use in all major US and UK airports, as well as in the London Underground system in the form of millimetre wave technology, Backscatter mirrors and reflects the risk-based data mining of pre-emptive border controls. In effect, the person experiences a doubled fracturing of their sense of identity – a digitized dissection made inside integrated databases and then a visualized projection at the border. So the fractured subject derived from risk-based data mining arrives at the border before the person, who is then further denuded by risk visualization technologies such as Backscatter.

The response to Backscatter by human rights and civil liberties lawyers has centred on the question of rights to privacy and human dignity. EPIC's senior counsel, for example, has argued that Backscatter scans are ‘equivalent to a “virtual strip search” for all air travellers’, and that the machines show ‘extraordinary disregard for the privacy rights’ of passengers. The central critical juridical questions have become how, or whether, images collected and seen on the screen can be made to comply with data protection laws, and whether the treatment of data by public authorities is adequate:

They're designed to capture images, and TSA made some very misleading statements early on about how they were showing operators only chalk lines and all these other ways to obscure the image, which basically has nothing to do with what's being recorded by the device. We pressed that issue with them and now they have given us some assurances that they're in fact not going to be saving these images but of course the devices are intended, designed to save the images. How will they be used, by whom, with what safeguards to privacy?Footnote 53

The claim to rights to privacy, bodily integrity, or human dignity, however, faces particular difficulties in relation to risk technologies that dissect and fragment the person into a series of risk factors. What happens to the legal subject when risk visualizations – extending from imaging devices such as Backscatter to the screened visual displays of risk scores – specifically divide and subclassify the body into differential traits, characteristics, behaviours? What are the limits of the citizen's obligation to reveal the elements, the prosaic daily intimacy of their lives? As Engin Isin has argued compellingly, the ‘neurotic citizen’, once reduced to a ‘species body’, actively strips herself down in order to ‘calibrate itself’ to the anxieties and dangers of the border.Footnote 54 Thus, the very category of legal subject, or indeed of citizen, is exposed in its full fragility. What happens to the body that cannot be verified, that does not calibrate to the mobile norm, or is not recognized or is mis-recognized under international law?

In the contestation of risk technologies such as Backscatter, then, the law once more confronts the making of its own categories. If it recognizes only ‘a non-substantial, a thin personality, a public image that seriously mis-matches people's self-image’,Footnote 55 then legal intervention risks mirroring the stripping-down and denuding strategies of the homeland security state itself:

Human rights break down the body into functions and parts and replace its unity with rights . . . Encountering rights annihilates and dismembers the body: the right to privacy isolates the genital area and creates a ‘zone of privacy’ around it; free speech severs the mouth and protects its communicative but not its eating function, while free movement does the same with legs and feet, but offers no right of abode.Footnote 56

Put simply, the abstractions that are made in the defence of people's rights to privacy, just as in the digitized imaging of the body, risk recognizing only a facsimile of a person. The reduction of a person to their rights recognizes, as Douzinas puts it, only ‘the man of the rights of man’, who appears ‘without differentiation or distinction in his nakedness and simplicity, united with all others in an empty nature deprived of substantive characteristics’.Footnote 57 Certainly the nakedness of the stripped-down man in the rights of man is not recognized in its full political difficulty by the continual redrawing of a legal boundary. It is this redrawing that runs through most of the current legal appeals to privacy: ‘the infringement on privacy must be proportionate to the security threat’; ‘the collection and use of personal data must be transparent’; ‘subjects must be informed if they are on a no-fly list’.Footnote 58 Where someone is left to make a claim for recognition based on corporeal difference – ‘I am pregnant’; ‘I have a prosthetic limb’; ‘I have a mastectomy’ – they are rendered invisible, slipping away from the juridical domain.

Douzinas does offer, though, one possible critical route that may be open to legal intervention and responsibility: that is, to consider the appeal to rights of many kinds – privacy, bodily integrity, a private family life, freedom of expression – to be one specific struggle for recognition among many others, a specifically legal claim to human identity that is, nonetheless, always already also political and social. In practice, such a struggle for recognition is present in the claims, cases, and campaigns of critical lawyers seeking to challenge new risk technologies. As senior counsel Lillie Coney explains in relation to recognizing ‘non-citizens’ in border control and immigration issues,

The idea of the non-citizen. It's very disturbing to me, because I work on civil rights issues, that's my background, that's where I came from. And I'm doing a lot of work to try to bring that community into these discussions along privacy, but a lot of the discussions about non-citizens sound a lot like the discussion that occurred around equal rights for women, equal rights for minorities, within the United States. It can't be defined strictly as civil liberties because we were talking about people who are non-citizens. But we do need to start having discussions about human rights and that Americans can't suspend their care of human rights if the person is a non-, is not a US citizen.Footnote 59

On the one hand, then, liberal forms of legal recognition restrict our ability to talk back in the face of non-recognition, stripping away the specificities of family, culture, experience, race, and leaving only what Douzinas describes as an ‘empty unit’.Footnote 60 And yet, in the struggle for legal recognition there is an oscillation back and forth between claims to the universal, usually framed by ideas of ‘nation’, ‘international’, ‘citizen’, even ‘homeland’, and the specificities of the particular that defy and expose these categories. The legal ‘other’ is but one of the others in front of whom we seek to be recognized. ‘The typical harm of defective recognition’, as depicted by Douzinas, absolutely captures the multiple projected identities and mis-recognitions of risk imaging and screening, where a split occurs ‘between someone's self-image and the image that social institutions or others project upon that person’. If law is to intervene to contest this particular effect of its own suspension, then it cannot avoid the encounter with multiple struggles for recognition. As in Lillie Coney's legal advocacy for the non-recognized, non-American communities, the grounds for legal personhood must also come into critical reappraisal. Rather than intervening to shore up the completeness of legal personhood – via strict rights of privacy, dignity, and bodily integrity – this would imply precisely the opposite: a questioning of how one becomes a legal subject and how one reconciles the exclusions and exceptions that make this possible.

As the assemblages of law and norm become ever more closely bound up with risk practices that model, pre-empt, and visualize, there is a double bind to confront in the discourse of human rights. As Jacques Derrida put the problem in the week following the events of 9/11,

We must more than ever stand on the side of human rights. We need human rights. We are in need of them and they are in need, for there is always a lack, a shortfall, a falling short, an insufficiency; human rights are never sufficient. They are not natural, they have a history – one that is recent, complex and unfinished.Footnote 61

The proliferation of risk practices that make no ethical decisions but only defer decision into calculation must surely demand that we, as Derrida says, stand on the side of human rights. But this necessarily juridical intervention in otherwise denuded risk calculations can only proceed in full knowledge of its own historical context and contingency. Always there will be new claims that have to work from some sense of a unity from which they are excluded, and yet always these claims will come from difference, from a failure of the legal ‘other’ to recognize that difference.

5. Conclusions: law and public engagement

The concept of pure hospitality can have no legal or political status. No state can write it into its laws.Footnote 62

I shall conclude by returning to the dilemma with which I began – when European law intervened to prohibit data transfer for risk scoring, but authorized the multiple risk practices that are set to define the granting of visa waiver. One might say, as the domain of crime and evidence of crime becomes indistinguishable from national security actions, that juridical intervention can only act to redraw marginally the line of what is admissible, what constitutes adequate protection.Footnote 63 Perhaps to submit to the deployment of risk screening and avert the exclusion of entire groups of people from legal categories is the only choice that civil liberties or human rights law can meaningfully make. There may be, as I have suggested here, a new limit, now more visible than before, placed upon the capacity of law to extricate itself from the proliferation of discretionary norms and open public space for contestation. And yet, could it be that the inextricable binding together of legal practice with non-legal forms of expertise, though undeniably taking violent forms and having material and barbaric effects on the life chances of people, also contains the potentiality for its own undoing? Although it is not the case that law is contesting its own suspension – for, as I have suggested, it is by no means suspended – there are instances where it is at least suspending reliance on conventional categories of rights and justice.

If law depends so readily on the mask of representational practices of ‘evidence’, ‘rights’, the ‘legal subject’, then its entanglement with risk technologies that profoundly shake those categories is having material and political effects. Across the different examples of legal contestation of risk technologies I have explored, there is a struggle for recognition that somehow demands a personhood that is more than a risk calculus, that exceeds the categories of safe and dangerous. In pressing the questions of what counts as evidence, who decides, who is the rightful legal subject, what is the threshold for security decisions, law confronts the contingency and frailty of its own grounds for judgement. There is a distinctive ambiguity, then, in the relation between justice and the possibilities for a transformative ethics of the recognition of personhood. As William Connolly captures the ethico-political practices of public engagement, there is potential ‘in a modern world of justice’ for suffering to move from obscurity ‘below the register of justice’ to a visible ‘unmarked place on it’.Footnote 64 Thus the ‘non-citizens’ unrecognized by strictly ‘civil’ liberties haunt the terrain of rights and struggle for an unmarked place upon it. Can law foster a hospitality to the unknown, the unexpected and unanticipated arrival of a legal subject whom it does not recognize? Do the risk practices that categorize personhood at the horizon of unknown futures encounter a limit in modes of law that respond differently to the unknown subject?

References

1 M. Foucault, Security, Territory, Population: Lectures at the Collège de France 1977–1978, trans. G. Burchell (2007), 99.

2 M. Chertoff, ‘Remarks to European Parliament's Committee on Justice, Civil Liberties and Home Affairs’, US Department of Homeland Security, 15 May 2007, available at www.dhs.gov/xnews/speeches/sp_1180627041914.sthm.

3 In May 2006 the European Court of Justice (ECJ) ruled that ‘the Commission Decision of 14 May 2004 on the adequate protection of personal data contained in the Passenger Name Record of air passengers transferred to the US Bureau of Customs and Border Protection be annulled’ (Case C-318/04, European Parliament v. Commission of the European Communities), and that ‘the Council decision of 17 May 2004 on the conclusion of an agreement between the European Community and the United States on the transfer and processing of PNR data be annulled’ (Case C-317/04, European Parliament v. Council of the European Union). The Parliament had sought to challenge the extradition of personal data on European citizens under Art. 8 of the European Convention for the Protection of Human Rights and Fundamental Freedoms (ECHR): ‘everyone has the right to respect for his private and family life, his home and his correspondence. There shall be no interference by a public authority with the exercise of this right, except such as is necessary in a democratic society in the interests of national security.’ The ECJ did not uphold the infringement of Art. 8, basing its judgment instead on the inadequate legal basis in the European Treaty.

4 M. Chertoff, ‘US Seeks Closing of Visa Loophole for Britons’, New York Times, 2 May 2007, 5.

5 M. Chertoff, ‘Remarks by Secretary Michael Chertoff to the Johns Hopkins University Paul H. Nitze School of Advanced International Studies’, 3 May 2007, available at www.dhs.gov/xnews/speeches/sp_1178288606838.shtm (last visited May 2007).

7 Department of Homeland Security, ‘Memorandum of understanding regarding the US Visa Waiver Program and enhanced security measures’ (2008).

8 Ibid., at 3.

9 G. Agamben, State of Exception (2005), 37.

10 Interview with Marc Rotenberg, director of the Electronic Privacy Information Center and senior counsel, Washington, DC, 30 October 2007. Rotenberg has testified worldwide on the effects of national security exemptions to laws governing privacy, surveillance, and the use of electronic data. He testified before the 9/11 Commission on ‘Security and Liberty: Protecting Privacy, Preventing Terrorism’.

11 R. Ericson, Crime in an Insecure World (2007), 24.

12 Agamben, supra note 9, at 38.

13 See L. Amoore and M. de Goede, ‘Transactions after 9/11: The Banal Face of the Preemptive Strike’, (2008) 33 (2) Transactions of the Institute of British Geographers 173; R. Van Munster, ‘The War on Terrorism: When the Exception Becomes the Rule’, (2004) 17 International Journal for the Semiotics of Law 141; B. Massumi, ‘Potential Politics and the Primacy of Preemption’, (2007) 10 (2) Theory & Event, available at http://muse.jhu.edu/login?uri=/journals/theory_and_event/v010/10.2massumi.html.

14 O. Kessler and W. Werner, ‘Extrajudicial Killing as Risk Management’, (2008) 39 Security Dialogue 289.

15 M. Valverde and N. Rose, ‘Governed by Law?’, (1998) 7 Social and Legal Studies 541, at 543.

16 See L. Daston, Classical Probability in the Enlightenment (1988); G. Clark, Betting on Lives: The Culture of Life Insurance in England (1999).

17 F. Ewald, ‘Insurance and Risk’, in G. Burchell, C. Gordon, and P. Miller (eds.), The Foucault Effect: Studies in Governmentality (1991).

18 F. Ewald, ‘Norms, Discipline, and the Law’, (1990) 30 Representations 138, at 142.

19 B. Adam and J. van Loon, ‘Introduction’, in B. Adam and J. van Loon (eds.), The Risk Society and Beyond: Critical Issues for Social Theory (2000), 2; P. O'Malley, ‘Introduction: Configurations of Risk’, (2000) 29 Economy and Society 457, at 458.

20 Ewald, supra note 18, at 142.

21 U. Beck, World Risk Society (1999), 4. See also U. Beck, ‘Risk Society Revisited: Theory, Politics and Research Programmes’, in B. Adam, U. Beck, and J. van Loon (eds.), The Risk Society and Beyond: Critical Issues for Social Theory (2000).

22 R. Ericson and A. Doyle, ‘Catastrophe Risk, Insurance and Terrorism’, (2004) 33 (2) Economy and Society 135, at 141.

23 Ewald, supra note 18, at 199.

24 L. Amoore and M. de Goede, ‘Governing by Risk in the War on Terror’, in L. Amoore and M. de Goede (eds.), Risk and the War on Terror (2008).

25 R. Suskind, The One Percent Doctrine (2006), 14.

26 F. Ewald, ‘The Return of Descartes' Malicious Demon: An Outline of a Philosophy of Precaution’, in T. Baker and J. Simon (eds.), Embracing Risk (2002), 294.

27 Foucault, supra note 1, at 65.

28 Ibid., at 63.

29 Agamben, supra note 9, at 39.

30 F. Johns, ‘Guantánamo Bay and the Annihilation of Exception’, (2005) 16 EJIL 613, at 617.

31 J. Butler, Precarious Life: The Powers of Mourning and Violence (2004); D. Bigo, Illiberal Practices of Liberal Regimes: The Insecurity Games (2006).

32 See L. Amoore, ‘Biometric Borders: Governing Mobilities in the War on Terror’, (2006) 25 (3) Political Geography 336; M. Salter, ‘Passports, Mobility and Security: How Smart Can the Border Be?’, (2004) 5 International Studies Perspectives 69.

33 W. Werner, ‘Responding to the Undesired: State Responsibility, Risk Management and Precaution’, (2005) 36 Netherlands Yearbook of International Law 57.

34 Ewald, supra note 18, at 155.

35 Foucault, supra note 1, at 63.

36 Much of the legal advocacy work of organizations such as the American Civil Liberties Union (ACLU), the Electronic Frontier Foundation (EFF) and the UK's Liberty has coalesced around how citizens can know which law, norm, or rule applies in public spaces. As subjects of new modes of pre-emptive risk management, as Barry Steinhardt of the ACLU has it, ‘even the option to moderate or modify their behaviour is no longer open, they cannot know what is to be judged suspicious and scored accordingly’. The disciplinary deployment of norm, then, geared as it is to the governing of the self, appears here to be displaced by more mobile and less visible modes of norm and anomaly.

37 C. Douzinas, ‘Identity, Recognition, Rights, or What Hegel Can Teach Us about Human Rights’, (2002) 29 Journal of Law and Society 379.

38 Department of Homeland Security, ‘Survey of DHS Data Mining Activities’, (2006), 9.

39 D. Rumsfeld, ‘Press conference by US Secretary of Defence Donald Rumsfeld’, NATO, Brussels, 6–7 June 2002, available at www.nato.int/docu/speech/2002/s020606g.htm (last visited April 2007).

40 B. Steinhardt, ‘The Automated Targeting System – A Violation of American Law, the US–EU PNR Agreement and Basic Human Rights’, presented to European Parliament, Brussels, 27 March 2007.

41 Interview with Melissa Ngo, Director of the Identification and Surveillance Project, Electronic Privacy Information Center (EPIC), Washington, DC, 30 October 2007.

42 Interview with Marc Rotenberg, supra note 10.

43 M. de Goede, ‘The Politics of Preemption and the War on Terror in Europe’, (2008) 14 European Journal of International Relations 161.

44 B. Harcourt, Against Prediction: Profiling, Policing and Punishing in an Actuarial Age (2007), 228.

45 S. Baker, Remarks, Center for Strategic and International Studies, Washington, 19 December 2006, available at www.dhs.gov/xnews/speeches/sp_1166557969765.shtm (last visited 8 October 2007).

46 It is the profiling and targeting potential of integrated databases that is of key concern, for example, to the UK's human rights group Liberty: ‘Of course once you get into individual profiling, what we mean in this political and current environment that we're speaking when we're talking about profiling, we're talking about racial profiling. My concern is that eventually when they've sold these systems on, that there will be the only place there is to go’ (Gareth Crossman, policy director and counsel, Liberty, interviewed 2 September 2007).

47 Baker, supra note 45.

48 Valverde and Rose, supra note 15, at 548.

49 See E. Morgan, ‘New Evidence: The Aesthetics of International Law’, (2005) 18 LJIL 163.

50 M. Foucault, Abnormal: Lectures at the Collège de France 1974–1975, trans. G. Burchell (2003), 23.

51 C. Douzinas, ‘The Legality of the Image’, (2000) 63 Modern Law Review 830.

52 M. Chertoff, ‘Remarks to European Parliament's Committee on Justice, Civil Liberties and Home Affairs’, 15 May 2007.

53 Interview with Rotenberg, supra note 10.

54 E. Isin, ‘The Neurotic Citizen’, (2004) 8 Citizenship Studies 232.

55 Douzinas, supra note 37, at 397.

56 Ibid., at 399.

57 Ibid., at 398.

58 European Parliament, ‘Findings of the Article 29 Data Protection Working Party of the European Parliament's Committee on Civil Liberties, Justice and Home Affairs’, Brussels, 26 March 2007.

59 Interview with Lillie Coney, senior counsel and deputy director, EPIC, Washington, DC, 31 October 2007.

60 Douzinas, supra note 37, at 398.

61 J. Derrida, ‘Autoimmunity: Real and Symbolic Suicides’, in G. Borradori, Philosophy in a Time of Terror: Dialogues with Jürgen Habermas and Jacques Derrida (2003), 132.

62 Ibid., at 129.

63 E. Guild, ‘The Foreigner in the Security Continuum: Judicial Resistance in the United Kingdom’, in P. K. Rajaram and C. Grundy-Warr (eds.), Borderscapes: Hidden Geographies at Territory's Edge (2007).

64 W. Connolly, Why I Am Not a Secularist (1999), 63.