
Philosophy of the Precautionary Principle, Daniel Steel. Cambridge University Press, 2015, xv + 256 pages.

Published online by Cambridge University Press:  08 January 2016

Charlotte Werndl*
Affiliation:
Department of Philosophy (KGW), University of Salzburg, Franziskanergasse 1, 5020 Salzburg, Austria and Department of Philosophy, Logic and Scientific Method, London School of Economics, Houghton Street, London WC2A 2AE, UK. Email: charlotte.werndl@sbg.ac.at. URL: http://charlottewerndl.net.
Copyright © Cambridge University Press 2016 

This book aims to defend an interpretation of the precautionary principle from the perspective of philosophy of science. Overall, the book is very successful in achieving this and should be of interest to anyone enquiring into the precautionary principle and its role in the theory and practice of public decision-making on issues such as the environment and public health.

The structure of the book is as follows. In Chapter 1, Steel states his version of the precautionary principle. Chapter 2 dismisses one of the main objections to the precautionary principle, namely that it is either trivial or incoherent. Chapter 3 is about the unity of the precautionary principle. Here Steel argues that there is a general precautionary principle from which the different versions of this principle stem. Chapter 4 provides an historical argument for precaution: history contains many prolonged and costly delays in tackling serious harms, while environmental regulations that themselves proved seriously harmful have been rare. Chapter 5 proposes a new definition of uncertainty. Chapter 6 examines the relationship between the precautionary principle and discounting, i.e. how the interests of the present should be balanced against those of the future. It is claimed that intergenerational impartiality is an important component of the precautionary principle. Chapter 7 argues against the idea of value-free science and discusses the relationship between values in science and the precautionary principle. Chapter 8 continues the discussion of values in science and argues that non-epistemic values should not override epistemic values in scientific research. Chapter 9 provides a conclusion and a discussion of further case studies (climate change mitigation, bovine growth hormones and regulation of chemicals). Finally, the Appendix explains how the interpretation of the precautionary principle defended by Steel can be formalized. While all chapters present original contributions, I will concentrate my critical analysis mainly on Chapters 1, 2, 5, 7, 8 and 9. My main criticisms will concern Steel's rejection of cost-benefit analysis, his assessment of the decision-theoretic notion of uncertainty, and his distinction between epistemic and non-epistemic values.

Steel's Precautionary Principle consists of the following three components:

  (i) The Meta-Precautionary Principle, asserting that in the face of serious environmental threats, uncertainty should not be a reason for inaction;

  (ii) The ‘Tripod’, consisting of (ii-a) the knowledge condition; (ii-b) the harm condition; and (ii-c) the recommended precaution; and

  (iii) Proportionality – the idea that the aggressiveness of one's precautions should correspond to how plausible and severe the threat is. Proportionality is defined by two subsidiary principles called consistency and efficiency: consistency requires that a precaution is not at the same time recommended against by the version of the precautionary principle that was used to justify it, and efficiency states that the least costly precaution should be implemented (9–10).

The particulars of each application will determine how to fill in these components and in this way will give rise to versions of the precautionary principle. For instance, one version of the precautionary principle is the claim that ‘If a scientifically plausible mechanism exists whereby an activity can lead to a catastrophe, then that activity should be phased out or significantly reduced’ (28). Steel applies this version to climate policy and argues that it warrants demanding mitigation policies, such as those proposed by Ackerman and Stanton (2013).

In the literature, the precautionary principle is defended in three different roles (10–11): as a decision rule, as a procedural requirement placing general constraints on how decisions should be made, or as an epistemic rule that makes claims about how inferences should proceed in light of risk and the possibility of error. Steel claims that what is distinctive about his proposal is that these three roles are tied together. In particular, the aforementioned elements of the precautionary principle jointly function as a decision rule, and, clearly, this implies that it (more specifically, the Meta-Precautionary Principle) also functions as a procedural requirement. Steel also claims that his precautionary principle is an epistemic rule. Here the discussion could be clearer, for I do not see why this is always the case. For instance, consider the version of the precautionary principle above: ‘If a scientifically plausible mechanism exists whereby an activity can lead to a catastrophe, then that activity should be phased out or significantly reduced’ (28). It is unclear that this also prescribes how scientific inferences should proceed in the light of risk and error. However, Steel might just mean that there is often an epistemic aspect to the principle. This is a valid claim, as the case study of chemicals in Chapter 9 illustrates (there the underlying rationale was that it is better to delay the commercialization of a safe chemical than to discover belatedly that a chemical in use is harmful).

One of the most common objections to the precautionary principle is that in a weak interpretation it is trivial and in a strong interpretation it is incoherent. Chapter 2 discusses and dismisses these objections. Steel identifies the weak interpretation of the precautionary principle with his Meta-Precautionary Principle, i.e. the claim that uncertainty should not be a reason for inaction in the face of serious threats. That this principle is not trivial can be seen, for instance, by comparing it with cost-benefit analysis. In situations where cost-benefit analysis remains silent on what to do because one is unable to make quantitative predictions of the costs and benefits of alternative policy options, the Meta-Precautionary Principle might still constrain decisions (22–4). For instance, even if making precise quantitative estimates of the social cost of carbon is impossible, the Meta-Precautionary Principle demands that this uncertainty should not be a reason for inaction. Another argument given by Steel against the triviality of the Meta-Precautionary Principle is that what exactly this principle asserts depends crucially on what one understands by ‘uncertainty’. Steel (24–6) argues that the triviality objection implicitly assumes the very weak notion of uncertainty on which one merely cannot predict the outcome with probability one. While this is a common notion of risk in decision theory, I agree with Steel that this notion is often not what is of concern in science and ordinary discourse because it is so weak. Of concern instead are stronger notions, such as one's inability to make any probabilistic predictions at all. These stronger notions of uncertainty arise less often, and for this reason, Steel argues, the triviality objection is blocked.

The second objection is that the precautionary principle is incoherent (26–35). The worry here is that the principle forbids all action because regulations that aim to protect against some harm may themselves pose dangers, and hence the principle will at the same time advise against introducing the regulations. Recall that a requirement of Steel's precautionary principle is consistency – that a precaution should not at the same time be recommended against by the version of the precautionary principle that was used to justify it. Thus consistency ensures that incoherence does not arise. Of course, the important remaining question is whether versions of the precautionary principle can be formulated that are consistent. Steel convincingly argues that this is often the case, and hence the incoherence objection is blocked. For example, consider the principle ‘If a scientifically plausible mechanism exists whereby an activity can lead to a catastrophe, then that activity should be phased out or significantly reduced’. There exist scientifically plausible mechanisms whereby continued emissions of greenhouse gases could lead to a catastrophe. The remedy is to significantly restrict greenhouse gas emissions, e.g. with a carbon tax. Consistency is satisfied here because if the tax is not introduced abruptly and is not too high (at least at the beginning), then there is no expectation that it will lead to any catastrophic effects.

Chapter 5 considers how to define scientific uncertainty. Decision theory distinguishes between risk and uncertainty (or ambiguity): there is uncertainty when the possible outcomes of actions are known but not their probabilities, whereas in the case of risk the probabilities are known as well. Steel (97–101) starts by criticizing this decision-theoretic notion of uncertainty. One valid criticism is that it does not correspond to the way uncertainty is understood in the sciences and in ordinary discourse, and that it is also too limited (e.g. because it demands that all the possible outcomes are known).

Another of Steel's criticisms is less on the mark. He argues that if probabilities are interpreted as subjective probabilities, then there would be very little uncertainty, because subjective probabilities can nearly always be determined. Hence, Steel argues, the probabilities in the decision-theoretic definition can only be interpreted as chances. Yet, since chances rarely exist outside casinos, he doubts the significance of the decision-theoretic notion of uncertainty. This assessment seems misguided. It is too strong to claim that personal probabilities can nearly always be determined: ask climate experts for the probability that certain outcomes will occur, and many will not be prepared to assign one because they do not know enough; the Ellsberg paradox likewise illustrates that subjective probabilities cannot always be determined.
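To make the point concrete, here is the standard single-urn Ellsberg illustration (my gloss, with the textbook numbers; the example is not taken from the book). An urn contains 30 red balls and 60 balls that are black or yellow in an unknown proportion, and most people prefer a bet on red to a bet on black, yet prefer a bet on black-or-yellow to a bet on red-or-yellow. If these preferences were represented by a single subjective probability function p with p(red) = 1/3, the first preference would require p(black) < p(red) = 1/3, while the second would require p(black) + p(yellow) > p(red) + p(yellow), i.e. p(black) > 1/3. No precise probability assignment satisfies both, which is exactly the sense in which subjective probabilities cannot always be determined.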

As an alternative, Steel (101) proposes that ‘scientific uncertainty be understood as the lack of a model whose predictive validity for the task in question is empirically well confirmed’. Given this notion of uncertainty, the precautionary principle is compatible with quantitative approaches. That is, while for some versions of this principle the uncertainty involved will be qualitative, for others it will be quantitative. One concern about this definition is the focus on models: it is not clear that it is always models that are used to arrive at predictions (it could also be, for example, physical understanding). Nonetheless, I take Steel's definition to be a convincing general and informal definition of uncertainty.

Chapter 5 also contains a discussion of the relationship between the precautionary principle and cost-benefit analysis. Steel argues that a standard assumption in cost-benefit analysis is value commensurability, i.e. that all costs and benefits can be converted to a common monetary scale. Steel (113–18) argues that value-commensurability can be viewed either as an empirical claim or as a pragmatic assumption for reasoned decisions, and that it is not defensible in either of these interpretations. It is not acceptable as a descriptive thesis because research shows that arbitrary anchors, such as the last four digits of our social security number, influence our willingness to pay (cf. Kahneman et al. 2006). Hence actual preferences are not generated by stable and value-commensurable valuations. Further, it is not acceptable as a pragmatic assumption because it will be far from the truth in all applications.

This criticism is too strong. I agree that cost-benefit analysis is unsuited to many applications (sometimes because value-commensurability is violated, sometimes for other reasons – cf. Bradley and Steele 2015). Yet I think that there are many cases where cost-benefit analysis provides a very valuable tool and where value-commensurability is not problematic. In particular, value-commensurability is often a normative claim, but this case is not discussed. On this reading, the results about arbitrary anchors only question the reliability of the way we trade off values; they do not question that there is a right way to trade them off. In any case, the key point is the relationship between the precautionary principle and cost-benefit analysis. Here I agree with Steel that the precautionary principle is different from cost-benefit analysis because the latter, but not the former, presumes value-commensurability. I also agree that in many important environmental and public health policy decisions it is impossible to validly predict the consequences of decisions, as required by cost-benefit analysis (106–7). Still, I would have liked to see a discussion of the relationship between the precautionary principle and cost-benefit analysis in circumstances where value-commensurability is not problematic and the probabilities of all outcomes can be estimated. One might conjecture that cost-benefit analysis is then a special case of the precautionary principle, but the book does not discuss this issue.
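For readers who want the contrast made explicit, the following is a schematic statement of what standard cost-benefit analysis presumes (my gloss, not the book's formalism; the notation is introduced purely for illustration). Suppose each policy option a has possible outcomes o with estimated probabilities p(o | a) and monetized net benefits v(o). Cost-benefit analysis then recommends the option with the greatest expected net benefit,

a* = argmax_a Σ_o p(o | a) · v(o).

The precautionary principle, on Steel's formulation, requires neither the probability estimates p(o | a) nor the common monetary scale v. The open question noted above is whether, when both are available and value-commensurability is unproblematic, the proportionality requirements of the precautionary principle simply collapse into this maximization, so that cost-benefit analysis is recovered as a special case.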

Chapter 7 is on value-free science. Here the book defends the argument from inductive risk. According to this classic argument put forward by Rudner (1953), decisions about whether to accept or reject a hypothesis are influenced by non-epistemic values (such as ethical or social values). The main premises of this argument are that (i) a central aim of science is to decide whether to accept or reject scientific hypotheses, and that (ii) decisions about the acceptance and rejection of scientific hypotheses can have implications for practical actions. The discussion in the book mainly centres on defending (ii) (149–58). In essence, I agree with Steel's argumentation here. Concerning premise (i), Steel (158–9) discusses Jeffrey's (1956) claim that the argument from inductive risk leads to a contradiction. Jeffrey's argument has two premises: if the job of scientists is to accept or reject hypotheses, then they must make value judgements; and if scientists make value judgements, then this does not involve accepting or rejecting hypotheses. Steel argues that Jeffrey is mistaken to endorse the second premise, which he considers to be weak. He then also goes on to briefly discuss Jeffrey's alternative proposal that one should simply report probabilities and thus need not decide whether to accept or reject hypotheses. A standard reply in the literature (e.g. Douglas 2009), which Steel also accepts, is the second-order argument that Jeffrey's move is of no help because claims about probabilities also have to be accepted. I am not sure about the validity of this reply, but it is very popular.

Rejection of the ideal of value-free science usually goes hand-in-hand with a rejection of the distinction between epistemic and non-epistemic values. However, Steel's position is different: he endorses the distinction between epistemic and non-epistemic values. Non-epistemic values are allowed to influence the design and interpretation of research as long as they do not lead to conflicts with epistemic values. Steel argues that his position is novel precisely because it combines the rejection of value-free science with an endorsement of the distinction between epistemic and non-epistemic values (146).

Two worries arise in relation to this view. The first is about Steel's claim that non-epistemic values can influence the design and interpretation of research as long as they do not lead to conflicts with epistemic values. Given this, one wonders why one could not simply require that scientists appeal only to epistemic values; when this does not suffice for a final decision, it should be left to policymakers to decide by appealing to non-epistemic values. For example, one of the main case studies of the book is the field of toxicology. From a purely epistemic point of view, the choice between expedited and slower risk assessment methods amounts to a trade-off between quicker decisions and a greater chance of error (183–4). One could argue that scientists should simply present this trade-off to policymakers, who can then make the final decision in the light of their non-epistemic values. Presumably, Steel would repeat his argument here that it is the scientists who should make these decisions. Yet in this case it seems reasonable to demand that someone representing the values of society (such as policymakers), rather than the scientists, should make the decisions.

Second, Steel first introduces epistemic values as those that promote the attainment of truth (non-epistemic values are simply the complement of this). As the discussion continues, he makes clear that in many cases epistemic values aim not at truth but at approximate truth, and that ‘it is relevant for the present discussion to observe that decisions about what should count as “close enough” to true may depend on value judgments of an ethical or practical nature’ (161). Hence epistemic values also have social and practical values among their aims, and it becomes unclear whether the distinction between epistemic and non-epistemic values can be upheld. Indeed, one then seems to end up with the view that there is no clear distinction between epistemic and non-epistemic values (Machamer and Osbeck 2004; Douglas 2009). Steel dismisses this objection by arguing that epistemic values are often social only in the sense that they are embedded in the social practices and norms of a scientific community, and so on. While I agree that epistemic values are often social in this sense, the real issue is not whether values are social in this sense but whether the aims they mandate are of a non-epistemic (e.g. social or ethical) nature.

To sum up, this book defends an interpretation of the precautionary principle as a decision rule from the perspective of philosophy of science. Notwithstanding the criticisms noted, overall the book is very successful: the main claims are well defended, the book is clearly written, and it deals with an admirably broad range of arguments and case studies. Hence, I predict it will become a standard reference for future debates about the precautionary principle.

ACKNOWLEDGEMENTS

Research for this book review was supported by the British Arts and Humanities Research Council through grant AH/J006033/1 to the ‘Managing Severe Uncertainty Project’ at LSE and the ESRC Centre for Climate Change Economics and Policy, funded by the Economic and Social Research Council through grant number ES/K006576/1.

REFERENCES

Ackerman, F. and Stanton, E. A. 2013. Climate Economics: The State of the Art. London: Routledge.
Bradley, R. and Steele, K. 2015. Making climate decisions. Philosophy Compass.
Douglas, H. 2009. Science and the Value-Free Ideal. Pittsburgh, PA: University of Pittsburgh Press.
Jeffrey, R. 1956. Valuation and acceptance of scientific hypotheses. Philosophy of Science 23: 237–246.
Kahneman, D., Ritov, I. and Schkade, D. 2006. Economic preferences or attitude expressions? An analysis of dollar responses to public issues. In The Construction of Preference, ed. Lichtenstein, S. and Slovic, P., 565–593. Cambridge: Cambridge University Press.
Machamer, P. and Osbeck, L. 2004. The social and the epistemic. In Science, Values and Objectivity, ed. Machamer, P. and Wolters, G., 78–89. Pittsburgh, PA: University of Pittsburgh Press.
Rudner, R. 1953. The scientist qua scientist makes value judgements. Philosophy of Science 20: 1–6.