1. Introduction.
In recent years the reduction of chemistry has been discussed in a variety of ways. Many studies have concentrated on intertheoretical reduction between theories of chemistry and theories of physics (Bunge 1982). Others have discussed the reduction of chemistry in a naturalistic manner, by examining whether some typically molecular properties can be deduced from quantum mechanics in an ab initio fashion, or whether the periodic system can be deduced from quantum mechanics (Scerri 1994, 2007). More recently a number of authors have turned to discussing the ontological reduction of chemistry (McLaughlin 1992; Le Poidevin 2005). The present article examines the claims regarding emergence and the ontological reduction of chemistry in the last two cited articles.
2. McLaughlin on British Emergentism and the Relationship of Chemistry to Physics.
McLaughlin (1992) has written a frequently cited paper in which he seeks to give an overview of the philosophical school that he dubs ‘British Emergentism’, which includes the work of J. S. Mill, Bain, Morgan, and most recently C. D. Broad. I begin with a brief summary of McLaughlin's characterization of these philosophers, especially of C. D. Broad.
Emergentists held, rather uncontroversially, that the natural kinds at each scientific level are wholly composed of kinds of lower levels, and ultimately of kinds of elementary particles. However, they also maintained that
Some special science kinds from each special science can be wholly composed of the types of structures of material particles that endow the kinds in question with fundamental causal powers. (McLaughlin 1992, 50–51)
These powers were said to ‘emerge’ from the types of structures in question. One example given repeatedly by the British emergentists was that of chemical elements, which have the power to bond to other elements by virtue of their internal microscopic structures. According to the emergentists, when these causal powers operate, they bring about the movement of particles. The striking part of the emergentist claim, as McLaughlin calls it, is that the kinds pertaining to a special science, such as chemistry, are said to have the power to influence the microscopic motions of particles in ways that are not anticipated by the laws governing the microscopic particles. Emergentism is thus committed to the possibility of ‘downward causation’.
For example, emergentists such as Broad believed that chemical bonding represents an example of emergence and the operation of downward causation. Indeed he went as far as to declare that
the situation with which we are faced in chemistry … seems to offer the most plausible example of emergent behaviour. (Broad 1925, 65)
Broad believed that emergent and mechanistic chemistry (non-emergent chemistry) agree in the following respect:
That all the different chemical elements are composed of positive and negative electrified particles in different numbers and arrangements; and that these differences of number and arrangement are the only ultimate difference between them. (Broad 1925, 69)
However, he also stressed that if mechanistic chemistry were true it should be possible to deduce the chemical behavior of any element from the number and arrangement of such particles, without needing to observe a sample of the element in question, which is something that is clearly not the case.
Against this position McLaughlin maintains that the coming of quantum mechanics and the quantum mechanical theory of bonding has rendered these emergentist claims untenable. In fact he is very categorical about the prospects for modern day emergentism:
It is, I contend, no coincidence that the last major work in the British Emergentist tradition coincided with the advent of quantum mechanics. Quantum mechanics and the various scientific advances made possible are arguably what led to British Emergentism's downfall … quantum mechanical explanations of chemical bonding in terms of electromagneticism [sic], and various advances this made possible in molecular biology and genetics—for example the discovery of the structure of DNA—make the main doctrines of British emergentism, so far as the chemical and the biological are concerned at least, seem enormously implausible. Given the advent of quantum mechanics and these other scientific theories, there seems not a scintilla of evidence that there are emergent causal powers or laws in the sense in question … and there seems not a scintilla of evidence that there is downward causation from the psychological, biological and chemical levels. (McLaughlin 1992, 54–55)
These anti-emergentist claims can be criticized on several different fronts. Granted, the quantum mechanical theory of bonding that McLaughlin appeals to does provide a more fundamental account of chemical bonding than the classical theory due to Lewis. Nevertheless, it does not permit one to predict in advance the behavior of elements, or the properties that a compound might have once any two or more elements have combined together. Moreover, it is not as though there was a complete absence of theoretical understanding of chemical bonding before the quantum theory was introduced. Lewis's theory, whereby covalent bonds occur when elements share pairs of electrons, gave a good account of the bonding in most compounds. Lewis arrived at his theory through the crucial realization that most stable molecules have an even number of electrons, while unstable ones such as nitrogen monoxide (NO) possess an odd number of electrons. Lewis thus naturally assumed that bonding to form stable molecules involved the pairing of electrons in bonds or as lone pairs.
Admittedly the quantum mechanical theory, devised by Heitler, London, Pauling, Mulliken, and others, goes beyond this ‘homely picture’ of pairs of electrons mysteriously holding atoms together. However, Lewis's concept of bonds as pairs of electrons is not thereby refuted but rather given a deeper physical mechanism. According to the quantum mechanical account, electrons are regarded as occupying bonding and anti-bonding orbitals. To a first approximation, if the number of bonding electrons exceeds the number of anti-bonding electrons, then the molecule is predicted to be a stable one. Moreover, the electrons occupy these orbitals, two by two, in pairs. The deeper understanding lies in the fact that the electrons are regarded as spinning in opposite directions within all such pairs. Indeed, it is the exchange energy associated with electron spin that accounts quantitatively for the bonding in any compound, and it is in this last respect that the quantum mechanical theory goes beyond Lewis's theory.
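To make the first-approximation criterion concrete, consider the following minimal sketch in Python. The molecules and electron counts are standard textbook examples supplied here purely for illustration: the bond order is half the difference between bonding and anti-bonding electron counts, and a positive bond order signals a molecule predicted to be stable.

    # Illustrative sketch (not a quantum-chemical calculation): the
    # first-approximation molecular-orbital criterion described above.
    # Electron counts follow the standard MO filling for simple diatomics.

    def bond_order(bonding: int, antibonding: int) -> float:
        """Bond order = (bonding electrons - anti-bonding electrons) / 2."""
        return (bonding - antibonding) / 2

    # (bonding, anti-bonding) valence-electron counts
    molecules = {
        "H2": (2, 0),   # bond order 1: stable
        "He2": (2, 2),  # bond order 0: not predicted to be stable
        "N2": (8, 2),   # bond order 3: stable
        "O2": (8, 4),   # bond order 2: stable
    }

    for name, (b, a) in molecules.items():
        order = bond_order(b, a)
        verdict = "stable" if order > 0 else "not stable"
        print(f"{name}: bond order {order}, predicted {verdict}")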
Pauling, one of the chief architects of the quantum mechanical account of chemical bonding, was quick to point out the continuity with Lewis's concept when he wrote:
It may be pointed out that this theory is in simple cases entirely equivalent to G. N. Lewis's successful theory of the shared electron pair, advanced in 1916 on the basis of purely chemical evidence. Lewis's electron pair consists now of two electrons which are in identical states except that their spins are opposed. (Pauling 1928, 359)
There is another aspect of McLaughlin's above-cited passage that is entirely incorrect, namely his claim that the discovery of the structure of DNA owes something to the quantum mechanical theory of bonding. As a matter of fact, there is no connection whatsoever between these two developments. All I can think of to explain McLaughlin's statement is that Pauling was involved in both developments. But of course, Pauling rather famously failed to find the structure of DNA and was beaten to it by Crick and Watson.
The discovery of the structure of DNA was driven almost entirely by the X-ray diffraction evidence that became available to Crick and Watson, courtesy of Wilkins and Franklin. It did not rest on any quantum mechanical calculations or indeed any insights provided by the theory. It involved model building and cardboard cutouts of bases. McLaughlin does not say anything whatsoever about pre-quantum mechanical theories of bonding, except to imply that they were completely inadequate. At the same time, he suggests that the quantum mechanical theory has provided a complete answer to the question of bonding. Neither of these extreme positions is correct.
It is not clear whether it is the superior quantitative nature of the quantum mechanical theory that McLaughlin is so impressed by, since he does not say. The only argument offered is that the quantum mechanical theory led directly to the elucidation of the structure of DNA and so on. If one puts aside these false arguments, as I am urging, the question arises of why McLaughlin believes that quantum mechanics was so overwhelmingly successful in chemistry, to the extent of rendering emergentism about bonding completely untenable. McLaughlin offers us no argument for the superiority of the quantum mechanical account of bonding over the earlier classical theory of Lewis. Is he implying that the quantum mechanical theory provides what the classical theory could not, namely the power to predict how two elements might react together? Or is he suggesting that, using quantum mechanics, we can predict the properties of an element from a knowledge of the number of fundamental particles that its atoms possess?
Unfortunately, as anyone who is aware of the current state of quantum chemistry knows well, neither of these feats is possible. In the case of elements, we can perhaps predict particular properties, such as ionization energies, but not chemical behavior. In the case of compounds, what can be achieved are accurate estimates, and in many cases even predictions, of specific properties of compounds that are known to have formed between the elements in question. Quantum mechanics cannot yet predict which compounds will actually form. Broad's complaint about the inability of mechanistic or classical chemistry to predict the properties of elements, or the outcome of chemical reactions between any two given elements, remains unanswered to this day. Why then should we accept McLaughlin's claim that pioneer quantum chemistry, or even today's version of the theory of bonding, can so decisively deal a death blow to any notions of emergence and downward causation?
In any case, as McLaughlin himself seems to concede, the advent of a quantum mechanical theory of bonding did not in fact eliminate emergentism completely, since some prominent biologists and neurophysiologists, such as Roger Sperry, whom he cites, continued to work in this tradition. Moreover, if one surveys the literature, one cannot fail to be struck by the ‘re-emergence of emergence’, as it has aptly been termed (Cunningham 2001). This is as true of the humanities as it is of the physical sciences. For example, the prominent Harvard chemist George Whitesides has shown increasing support for claims of the emergence of chemical phenomena from physical ones, precisely the example of emergence which McLaughlin wishes to deny so strenuously (Whitesides and Ismagilov 1999). Rather than being ‘killed off’ by the quantum mechanical account of chemical bonding, emergence is alive and well. McLaughlin's attempt to assert the reduction of chemistry by appealing to the nonexistence of emergence of the chemical from the physical, and his associated denial of downward causation, are thus entirely unconvincing, at least to the present author.
Finally, as Kim (1999) has pointed out in another context, the notion of emergence is a perfectly respectable one that bears some striking similarities to the currently popular notion of non-reductive physicalism that prevails in the philosophy of mind. I do not believe that a straightforward appeal to the quantum mechanical account of chemical bonding can be taken as signaling the demise of emergence of chemistry from physics.
3. Another Approach to the Reduction of Chemistry—Le Poidevin.
The second article under consideration also raises the question of the ontology of chemistry. To what extent can we avail ourselves of knowledge obtained through theories such as quantum mechanics? Le Poidevin (2005), contrary to McLaughlin's approach, believes that we need to separate ontology from epistemology rather sharply. He claims to have given an argument in favor of the ontological reduction of chemistry, which does not appeal to the fortunes of any particular physical or chemical theory. He also hopes to bypass the kinds of problems that beset a physicalist approach to ontological reduction. As he explains, these problems apply to the reduction of the mental as much as they do to the reduction of the biological or chemical levels to fundamental physics.
Le Poidevin makes special mention of the periodic system and of Mendeleev's prediction of new elements. He sets out to discover why Mendeleev was so confident that the elements he predicted actually existed. Le Poidevin claims that this is not a question about Mendeleev's confidence in the periodic law but rather about an implicit conceptual move. If one grants that the gaps in the periodic table represented genuine possibilities, elements that could exist, why did Mendeleev assume that the possibilities would actually be realized?
Le Poidevin then draws the following distinction: “Even if some elements in the table are merely possible, there is a genuine difference between the physical possibility of an element between, say, zinc and arsenic (atomic numbers 30 and 33), and the mere logical possibility of an element between potassium and calcium (19 and 20)” (Le Poidevin 2005, 119).
I refer to this passage because the discreteness in the existence of elements goes on to play a pivotal role in Le Poidevin's eventual argument in favor of the ontological reduction of chemistry. Le Poidevin agrees with those who in recent years have claimed that chemistry is not reduced to physics in an epistemological sense but, to repeat, his real goal is to examine the ontological question without appeal to theories: “There is, I think, a strong intuition that ontological reduction is true, whatever the fortunes of epistemological reduction. But what is the source of this intuition? Can ontological reduction be defended independently of epistemological reduction?” (Le Poidevin 2005, 120–121).
Le Poidevin's answer to the last question is that it can. In addition, he is well aware that the appeal to physicalism frequently made, especially in the philosophy of mind, is plagued by some rather serious problems. The author reminds us that the claim that chemical properties supervene on the properties described by the supposedly complete science is just as trivial as the thesis that mental properties do. Second, he brings up the so-called ‘symmetry problem’: even if we suppose a one-to-one correspondence between a given chemical property and one described by physics, that correspondence would not by itself suggest that one is more fundamental than the other.
Le Poidevin considers the relationship between valence and electronic configuration in an effort to cast further light on these issues:
Suppose, for example, valency to supervene on electronic configuration. At first sight, the relation appears to be asymmetric because a valency of 1, for example, can be realized by a number of distinct configurations, but nothing can differ in terms of valency without also differing in terms of electronic configuration. However, the relevant part of the configuration—the part that determines valency—will not vary among elements of the same valency. The determination therefore goes both ways. (Le Poidevin 2005, 123–124)
But is Le Poidevin correct in his assertion that “nothing can differ in terms of valency without also differing in terms of electronic configuration”? In fact this is not the case since, as is well known, most non-metal elements can show variable valences despite possessing a single electronic configuration. Sulfur, to take just one example, has the electronic configuration 1s2, 2s2, 2p6, 3s2, 3p4. Nevertheless, it commonly shows valences of +2, +4, or +6, as in the compounds SCl2, SO2, and SO3, respectively.
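The failure of the supervenience claim can be displayed schematically. The following toy Python sketch, restricted to the sulfur compounds just cited, simply records that a single ground-state configuration is compatible with several valences, so that a difference in valency need not correspond to any difference in electronic configuration.

    # Toy illustration: one ground-state configuration (sulfur's) is
    # compatible with several valences, so valence does not supervene
    # asymmetrically on configuration in the way the quoted passage assumes.
    # The data are limited to the compounds cited in the text.

    sulfur_configuration = "1s2 2s2 2p6 3s2 3p4"

    observed_valences = {
        "SCl2": 2,
        "SO2": 4,
        "SO3": 6,
    }

    distinct_valences = sorted(set(observed_valences.values()))
    print(f"Configuration: {sulfur_configuration}")
    print(f"Valences realized by sulfur: {distinct_valences}")

    # Several valences, one configuration: a difference in valency
    # without any difference in electronic configuration.
    assert len(distinct_valences) > 1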
But Le Poidevin is nevertheless correct in pointing out that in general the symmetry problem is a pressing one. The grounding of reduction requires something more than the physicalist prejudice, or the hope, that physical levels determine chemical levels and not vice versa.
Le Poidevin (2005, 124) proposes to circumvent both this problem and the problem of vacuity, mentioned above, by an approach that he terms combinatorialism:
The central contention of combinatorialism is this: possibilities are just combinations of actually existing simple items (individuals, properties, relations). Let us call this the principle of recombination. To illustrate it, suppose the actual world to contain just two individuals, a and b, and two monadic properties, F and G, such that (Fa & Gb). Assuming F and G to be incompatible properties, and ignoring the possibility of there being nothing at all, then the following is an exhaustive list of the other possibilities:
1. Fa
2. Fb
3. Ga
4. Gb
5. Fa & Fb
6. Ga & Gb
7. Ga & Fb.
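For readers who prefer to see the enumeration spelled out, the following short Python sketch (my own illustration, not Le Poidevin's) generates the same list mechanically: each of the two individuals bears F, G, or neither, and we set aside the actual world (Fa & Gb) and the empty case.

    # A minimal sketch of the principle of recombination in the toy world
    # above: individuals a and b, mutually incompatible monadic properties
    # F and G. Each individual bears F, G, or no property at all. We set
    # aside the actual world (Fa & Gb) and the empty case, as in the quote.

    from itertools import product

    individuals = ["a", "b"]
    options = ["F", "G", None]  # None: the individual bears neither property

    def describe(assignment):
        facts = [f"{prop}{ind}" for ind, prop in zip(individuals, assignment) if prop]
        return " & ".join(facts)

    actual = ("F", "G")  # Fa & Gb

    others = [
        describe(assignment)
        for assignment in product(options, repeat=len(individuals))
        if assignment != actual and any(assignment)
    ]

    for i, possibility in enumerate(others, start=1):
        print(f"{i}. {possibility}")
    # Seven possibilities are printed, matching Le Poidevin's list
    # (up to ordering).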
Le Poidevin explains that combinatorialism is a form of reductionism about possibilia. He claims that the talk of nonexistent possibilia is made true by virtue of actual objects and their properties, just as the inhabitants of his model world are made possible by virtue of a and b and the properties F and G. The idea is that we should consider Mendeleev's predicted elements in this way. According to Le Poidevin's approach, the elements that are as yet nonexistent but physically possible are those that can be regarded as combinations of some undefined basic objects and/or basic properties.
Le Poidevin suggests that this approach provides a means of establishing the required asymmetry in order to ground the reduction of the chemical to the physical or the mental to the physical, and a means of countering the symmetry problem alluded to earlier:
A property-type F is ontologically reducible to a more fundamental property-type G if the possibility of something's being F is constituted by a recombination of actual instances of G, but the possibility of something's being G is not constituted by a recombination of actual instances of F. (Le Poidevin 2005, 129)
I come now to the crucial argument in Le Poidevin's paper:
But since the thesis of ontological reduction is about properties, we do have to have a clear conception of what is to count as a chemical property. I shall take the identity of an element, as defined by its position in a periodic ordering, and its associated macroscopic properties (capacity to form compounds of a given composition with other elements, solubility etc.) to be paradigmatically chemical properties. … The question of the ontological reduction of chemistry (or at least the question I am interested in) is the question of whether these paradigmatically chemical properties reduce to more fundamental properties. (Le Poidevin 2005, 131)
Let me say something about the second sentence, since I think this will turn out to be Le Poidevin's undoing. In his brief list of what he terms ‘paradigmatically chemical properties’, the author has lumped together (a) the identity of elements, (b) their capacity to form compounds of a certain composition, and (c) their solubilities. But there is a long-standing philosophical view whereby elements should be regarded as having a dual nature, consisting of basic substances and of simple substances (Paneth 1962). If one takes this dual view seriously, it casts doubt on Le Poidevin's lumping together of the existence of elements and their properties such as solubilities.
As Mendeleev, and more recently Paneth among others, have stressed, the notion of an element as a basic substance concerns just its identity and its ability to act as the bearer of properties. A basic substance does not, however, itself possess any properties. The ‘properties’ of an element reside in the simple substance and not in the element as a basic substance. According to this view, the identity of an element and its properties are regarded as being quite separate. If we consider Le Poidevin's three examples, namely identity, capacity to form compounds, and solubility, we see a conflation of basic substance aspects (identity) with simple substance aspects (solubility). It is only by failing to distinguish between the identity of elements and their possessing properties, such as solubility, that Le Poidevin is able to give the impression that he has provided an argument for the ontological reduction of chemistry as a whole.
He then adds:
We might just accept it as a brute fact about the world that the series of elements was discrete. But if there were a finite number of properties, combinations of which generate the physical possibilities represented by the periodic table, then variation would necessarily be discrete rather than continuous. … The point is that, given the principle of recombination, unless those more fundamental properties exist, unactualized elements would not be physical possibilities. (Le Poidevin 2005, 131–132)
Let me try to rephrase the argument. We assume that the combination of a finite number of fundamental properties, via a combinatorial approach, leads to a discrete set of macroscopic physical possibilities. We also know empirically that the chemical elements occur in a discrete manner, since there are no intermediate elements between, say, hydrogen and helium. Le Poidevin is thus claiming that his combinatorial approach can be taken as an explanation for the discreteness in the occurrence of elements and furthermore that it justifies the fact that Mendeleev regarded the yet undiscovered elements like gallium as being physical possibilities rather than merely logical ones.
4. Further Comments on Le Poidevin.
One might even grant that Le Poidevin's arguments provide the sought-after justification for the ontological reduction of the chemical elements to fundamental physical properties. But has Le Poidevin provided any grounding for the ontological reduction of chemistry tout court? I think not. For example, the solubilities of elements, which the author included in his list of paradigmatically chemical properties, do not occur in a discrete manner. A particular ionic compound can have a solubility of 5 grams per liter of water; another might have a solubility of 6 grams per liter. But there is nothing discrete about solubility: it is quite possible that other salts will display solubilities falling anywhere between these two values.
Unlike the existence of chemical elements, which does appear to be a discrete phenomenon, solubility or acidity or indeed almost every “paradigmatically chemical property” does not form a discrete set. As a result, one cannot invoke a combinatorial argument of the type suggested by Le Poidevin in order to provide an ontological grounding for these properties.
As to whether Le Poidevin has separated the question of ontological reduction as fully from that of epistemological reduction as he seemed to promise in his article, I have some doubts. Admittedly, the ordering of the chemical elements may not be in any sense theoretical, as he states, but there is no denying that ordering the elements by way of atomic number, or by whatever other means, is dependent on our knowledge of the elements. It is just that this knowledge takes the form of a classification or ordering rather than a theory, as Le Poidevin correctly points out. But surely this does not render the act of classification any less epistemological.
Finally, I would like to raise some specific points concerning Le Poidevin's analysis. Let me return to the question of the discrete manner in which the elements occur. Le Poidevin takes this fact to support a combinatorial argument whereby a finite number of fundamental entities combine together to give a discrete set of composite elements. But what if we consider the combination of quarks (charge = 1/3) instead of protons (charge = 1)? In the former case a finite number of quarks would also produce a discrete set of atoms of the elements, only the discreteness would involve increments of one third instead of integral units. In fact, chemists and physicists have been actively searching for such ‘quark matter’ (Jørgensen 1978).
And if this matter were found, it would then be physically possible for there to be two elements between, say, $Z=19$ and $Z=20$, to use Le Poidevin's example. Let us further suppose that a future theory might hold that the fundamental particles are some form of subquarks, with a charge of 0.1 units. Under these conditions combinatorialism would lead to the existence of nine physical possibilities between elements 19 and 20, and so on. It would appear that Le Poidevin's distinction between a physical possibility and a merely logical one is dependent on the state of knowledge of fundamental particles at any particular epoch in the history of science, which is surely not what he intends. Indeed, the distinction proposed by Le Poidevin would appear to be susceptible to a form of vacuity not altogether unlike that faced by physicalism, which the appeal to combinatorialism was supposed to circumvent.
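The dependence of the physical/logical distinction on the assumed granularity of fundamental charge can be made explicit with a small calculation. The sketch below (a toy of my own, assuming nothing beyond the increments of 1, 1/3, and 0.1 mentioned above) counts how many ‘elements’ would be physically possible strictly between $Z=19$ and $Z=20$ under each assumption.

    # Toy calculation: the number of 'elements' physically possible strictly
    # between Z = 19 and Z = 20 depends entirely on the assumed charge
    # increment of the fundamental particles (1 for protons, 1/3 for quarks,
    # 0.1 for the hypothetical subquarks considered above).

    from fractions import Fraction

    def intermediates(lower: int, upper: int, increment: Fraction) -> int:
        """Count charge values strictly between lower and upper that are
        reachable in whole multiples of the given increment."""
        count, value = 0, Fraction(lower) + increment
        while value < upper:
            count += 1
            value += increment
        return count

    for increment in [Fraction(1), Fraction(1, 3), Fraction(1, 10)]:
        n = intermediates(19, 20, increment)
        print(f"increment {increment}: {n} intermediate possibilities")

    # increment 1    -> 0 (the actual, integral periodic table)
    # increment 1/3  -> 2 (quark matter)
    # increment 1/10 -> 9 (hypothetical subquarks)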
Finally, there is a more general objection to the use of combinatorialism to ground the ontological reduction of chemistry. The assumption that fundamental entities combine together to form macroscopic chemical entities ensures from the start that the hoped-for asymmetry is present. If one assumes that macroscopic chemical entities like elements are composed of subatomic particles, then of course it follows that the reverse is not true. The hoped-for asymmetry, I claim, has been written directly into the account rather than deduced.
5. Conclusion.
After many years during which philosophers of chemistry concentrated on the question of the epistemological reduction of chemistry, and had perhaps dismissed the question of ontological reduction as a foregone conclusion, there has been a recent resurgence of interest in the ontological question. McLaughlin has used the success of the quantum theory of chemical bonding to conclude incorrectly that the emergence of chemistry from physics is entirely ruled out. Le Poidevin claims to have given an ontological argument in favor of the reduction of chemistry which does not appeal to any physical theories and yet it appears to do just that.
My own conclusion is that one should exercise moderation between an extreme Quinean approach of attending mainly to scientific theories and Le Poidevin's approach of dispensing altogether with the findings of scientific theories. Surely a more subtle approach is required in trying to uncover the ontology of chemistry, or any other special science. Of course, one needs to consult the findings of the empirical sciences in question, but there is still scope for philosophical consideration, perhaps along the general lines offered by Le Poidevin. Philosophical positions such as reductionism, atomism, and emergentism cannot be judged only on the basis of some contemporary theory or other. In addition, if one does consult the findings of scientific theories to draw ontological lessons, it is essential for one to do so in an accurate manner and not in the way that these two authors appear to have done. Nevertheless, it is encouraging to see mainstream philosophers now taking an interest in chemistry.