
Precautionary Politics: Principle and Practice in Confronting Environmental Risk. By Kerry Whiteside. Cambridge, MA: MIT Press, 2006. 198p. $50.00 cloth, $21.00 paper.

Published online by Cambridge University Press: 12 February 2009

Scott Barrett, Johns Hopkins University

Type: Critical Dialogues
Copyright © American Political Science Association 2009

Here are two challenges that the world has had to face in 2008: 1) Construction of the Large Hadron Collider was recently completed. Experiments using this machine will yield new knowledge of a fundamental kind. There is also a theoretical risk, believed to be vanishingly small but not zero, that the machine could create a black hole capable of destroying the Earth. Should the machine be turned on? 2) Fertilizing “desert” regions of the oceans with iron is expected to stimulate phytoplankton growth, sucking carbon dioxide into the oceans and thus helping to mitigate climate change. It might also alter vital ocean ecosystems. To know the full consequences of ocean fertilization, large-scale experiments are needed. Should they be allowed?

The “precautionary principle,” the subject of Kerry Whiteside's new book, Precautionary Politics, emerged to help us make decisions like these—decisions involving genuine uncertainty rather than risk, with consequences that are both global in scale and irreversible. I return to these two questions later in this review, but first let me summarize the book.

In Whiteside's own words, his book is a study of the principle's “meaning, rationale, and policy implications along with the controversies it has provoked. It is also an argument in favor of it, particularly in its deliberative form” (p. ix). Sandwiched between an introduction and conclusion are five chapters.

The first chapter presents a case study of genetically modified organisms. The United States pushed ahead with the technology, and promoted exports of it, even though its full effects were unknown, whereas Europe was much more cautious. The dispute is important because in a globalized world, it is hard to keep new technologies out of any market—or, in this case, out of the environment itself. Whiteside persuasively argues that situations like this one require new tools for decision making.

In Chapter 2, Whiteside makes the case for the precautionary principle by showing its superiority to an alternative he calls “science-based risk assessment.” I share his criticisms of the latter approach, as he presents it, but his alternative is a straw man. His description of it ignores important developments in cost–benefit analysis, such as the concept of (quasi-) option value, developed in the mid-1970s to address situations involving uncertainty, learning, and irreversibility. He reproaches critics of the precautionary principle for creating a straw man of their own by asserting that the principle prohibits doing anything when the consequences are uncertain, yet the alternative he constructs is also a distortion.

In Chapter 3, Whiteside explains that Europe has embraced the precautionary principle with more energy than has the United States, but that Europe's behavior has not been consistently more precautionary. He argues that for the United States, “precaution has always been ad hoc and discretionary” (p. 70). He also argues that Europe's fondness for the principle promises policies that are less so. I was not persuaded.

How to ensure that publics adhere to the principle? In Chapter 4, Whiteside contrasts two philosophical theories of governance, one that advocates authoritarian rule by a political elite, advised by scientific experts, and one that seeks to bring “the sciences into democracy” (p. 110). In Chapter 5, he makes a case for a process of “deliberative precaution” that encourages public debate and engagement, thereby conferring legitimacy upon public decisions. I am with him here. The big questions society must answer depend on much more than “objective science” and “quantitative analysis.” They depend on society's values—what people care about. People need to deliberate about the new challenges because they are very different from previous ones, and because the decisions we make about them will be profoundly consequential.

This book will appeal to readers interested in debates about the merits of the precautionary principle. Personally, I am more interested in how to address the kind of challenges that introduced my review. Does the precautionary principle, and Whiteside's advocacy for it, offer anything for readers like me? Let us see.

Should the Large Hadron Collider be switched on? A simplistic interpretation of the precautionary principle would say “no.” Whiteside rejects this interpretation but does not offer his own definition. After reading his book, I do not know how he would answer the question.

Can cost–benefit analysis help? Let us do some calculations. The benefit of turning on the machine consists of the probability that we will survive the experiment, which is approximately equal to one, times the value of the knowledge we expect to gain from the experiment, which is anyone's guess but certainly finite. The cost of turning on the machine is the probability that the experiment will destroy the Earth, which is almost but not quite equal to zero, times the value we assign to the Earth's existence, which is infinite. Infinity times any number, including a number nearly equal to but not quite equal to zero, is itself infinite. Since infinity is bigger than any finite number, cost–benefit analysis advises us not to turn the machine on.
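
This back-of-the-envelope comparison can be put in symbols. The notation is mine, not Whiteside's or CERN's: let p stand for the (unknown but strictly positive) probability of catastrophe, V_K for the finite value of the knowledge gained, and V_E for the value of the Earth's continued existence, taken here to be infinite.

\[
\underbrace{(1-p)\,V_K}_{\text{expected benefit, finite}} \;-\; \underbrace{p\,V_E}_{\text{expected cost}} , \qquad p \approx 0,\; V_K < \infty,\; V_E = \infty .
\]

Because V_E is infinite, the expected cost p·V_E is infinite for any p > 0, however tiny, so the expected net benefit is negative. That is all the arithmetic behind the conclusion that a crude cost–benefit test says "do not switch it on."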

Though Whiteside is critical of cost–benefit analysis, even a crude application of it commends a precautionary approach in this case. He applauds Europe's support for the precautionary principle, but the Large Hadron Collider was turned on at Europe's command. Cost–benefit analysis is not necessarily an enemy of precaution, nor is Europe necessarily its friend.

Was the decision to switch on the Collider wrong? Physicists connected with the European Organization for Nuclear Research (CERN) concluded that the experiments presented no danger and that there was no reason for concern. But should technical experts make this decision for us, for humanity, for the world? Whiteside prefers that a matter such as this “be brought before the public at large for comment, debate, and in some cases, resolution” (p. 118). I agree. But, of course, the public might decide to throw caution to the wind.

If the Large Hadron Collider were not turned on, we would be no worse off than before. Ocean fertilization, however, may help us avoid some climate change. Alternatively, it may help us avoid doing something else to reduce atmospheric concentrations, such as increasing nuclear power generation or capturing and storing carbon dioxide underground—alternatives with their own legacies of long-term risks.

So, what would the precautionary principle have us do? Once again, a simplistic application of the principle would prohibit the experiment. Whiteside, however, understands that there can be risk–risk trade-offs. He says that “[i]f some strict precautionary measure might itself cause great harm (for example, banning a pesticide allows a disease to run rampant), there are strong reasons for trying to find a way to allow the activity to proceed” (p. 57). He also sees the value of research. “Precaution,” he says, “can mean setting up research programs whose purpose is to gather further information about the risk and test successive hypotheses about it” (p. 53). But how would he decide? Again, after reading his book, I do not know.

In May 2008, parties to the Convention on Biological Diversity urged “other Governments, in accordance with the precautionary approach [emphasis added], to ensure that ocean fertilization activities do not take place until there is an adequate scientific basis on which to justify such activities.” A scientific committee established by the Intergovernmental Oceanographic Commission to advise on ocean fertilization disagreed, saying that there were “good scientific reasons to do larger experiments.” So, who is right? Although the Convention's parties invoked the precautionary principle in recommending a prohibition, the response by the scientific committee seems more in line with Whiteside's thinking.

We should not be surprised that proponents could disagree about what the precautionary principle implies. As Whiteside says, “there are many different versions of the precautionary principle” (p. 150). But therein, for me, lies the principle's greatest shortcoming. I have never opposed the precautionary principle, nor, after reading this book, am I inclined to campaign for it.