In a pre-scientific era, the Greek philosopher Socrates “demonstrated” that all learning is remembering, by leading an illiterate slave step by step to prove the Pythagorean theorem simply by asking him questions and drawing lines in the sand with a stick (Plato 2006). Of course, such a demonstration reveals more about the mind of Socrates than that of the slave. So it is with contemporary attempts to demonstrate that Homo sapiens are “intuitive Bayesians” (e.g., Gigerenzer & Hoffrage 1995). Researchers can encourage behavior somewhat aligned with Bayes’ theorem by providing participants with natural frequencies, posing questions that facilitate Bayesian computation, organizing statistical information around the reference class, or presenting diagrams that highlight the set structure of problems (see Table 2 in the target article). All of these manipulations are useful in illuminating our understanding of judgment and decision making; none of them demonstrates that people are essentially Bayesian.
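To see why such format manipulations help without implying that the mind is inherently Bayesian, consider a worked example; the diagnostic figures below are hypothetical and chosen only for illustration. Stated as single-event probabilities, the problem requires the full machinery of Bayes’ theorem; restated as natural frequencies, it reduces to a simple ratio of counts:

```latex
% Hypothetical figures: base rate .01, hit rate .80, false-alarm rate .096.
\begin{align*}
  P(D \mid +) &= \frac{P(+ \mid D)\,P(D)}
                      {P(+ \mid D)\,P(D) + P(+ \mid \neg D)\,P(\neg D)} \\
              &= \frac{.80 \times .01}{.80 \times .01 + .096 \times .99}
               \approx .08.
\end{align*}
% Natural-frequency restatement of the same information: of 1,000 people,
% 10 have the disease and 8 of them test positive; of the 990 without it,
% roughly 95 test positive, so the answer is 8/(8 + 95), or about .08.
```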
At its best, evolutionary psychology provides useful constraints on theorizing and more closely aligns the brain and behavioral sciences with modern evolutionary biology. At worst, however, claims about the environment of evolutionary adaptation become “just-so stories” conferring scientific legitimacy on the author's initial assumptions rather than producing falsifiable hypotheses. In the case of judgment under uncertainty, it is obvious that our ancestors did not reason with percentages. However, there is no evidence that the mind “naturally” processes frequencies. Indeed, aesthetically, it may seem more “natural” to imagine our ancestors reasoning about “the chance that this mushroom I just picked is poisonous” rather than “how many out of 40 similar-looking mushrooms picked under similar circumstances are poisonous.” More to the point, there is very good evidence that at least one contemporary hunter-gatherer culture, the Pirahã people of Brazil, has no words for numbers other than “one, two, and many,” and that on numerical cognition tasks their performance with quantities greater than three is “remarkably poor” (Gordon 2004). If hunter-gatherer peoples of our own time can get by without numeric concepts, why should we assume that proto-humans in the ancestral environment developed hardwired mechanisms that “embody aspects of a calculus of probability” (Cosmides & Tooby 1996, p. 17; quoted in the target article, sect. 1.2.2, para. 3) enabling us to automatically solve story problems in a Bayesian fashion?
A more reasonable assertion is that Homo sapiens have evolved a cognitive architecture characterized by adaptive redundancy. In many areas of reasoning – problem solving, judgment, and decision making – people make use of more than one kind of cognitive process operating on more than one type of cognitive representation. The particular substance of these processes and representations is developed through learning in a cultural context, although the cognitive architecture itself may be part of our biological inheritance. Dual-process models are beginning to characterize the nuts and bolts of this adaptive redundancy in human cognition. The portrait emerging from the research is of a human organism that is generally capable and adaptive (the glass is half full) but also prone to neglecting base rates and to other systematic deviations from normative performance (the glass is half empty). Barbey & Sloman's (B&S's) careful review of the literature in the target article clearly suggests that dual-process theories best account for the empirical evidence pertaining to base-rate neglect.
B&S highlight the similarities between several dual process theories, asserting that people reason with two systems they label associative and rule-based. They attribute judgmental errors to associative processes and more accurate performance with base rates to rule-based inferences – provided that problems are presented in formats that cue the representation of nested sets underlying Bayesian inference problems. As the authors note, this is the heart of the Tversky and Kahneman (Reference Tversky and Kahneman1983) nested set hypothesis. It is here where differences among the dual process theories begin to emerge and where the specific details of Fuzzy-Trace Theory (FTT; Reyna & Brainerd Reference Reyna and Brainerd1995) shed light on intuitive probability judgments.
The dual systems of FTT operate on verbatim and gist representations. FTT asserts that vague impressions are encoded along with precise verbatim information. Individual knowledge items are represented along a continuum such that fuzzy and verbatim memory traces coexist. Gist memory traces are not derived from verbatim representations but are formed in parallel using different mechanisms. The result is the creation of multiple traces in memory. Verbatim and gist traces are functionally independent, and people generally prefer to reason with gist representations for a given task.
FTT predicts that people have difficulty with conditional and joint probabilities because it is hard to reason about nested, hierarchical relationships between items and events. Nested or overlapping class-inclusion relations create processing interference and confusion even in educated thinkers who understand probabilities (Reyna & Brainerd 1995). People prefer to reason with simplified gist representations of problems (the fuzzy-processing preference), and one specific simplification predicted by FTT is denominator neglect.
Denominator neglect consists of behaving as if one is ignoring the marginal denominators in a 2×2 table. In such a table, the base rate P(B) is the marginal total P(B and A) + P(B and not-A). Ignoring marginal denominators such as P(B) when estimating P(A and B) or P(A given B) can lead to logical fallacies. The FTT principle of denominator neglect allows for a priori and precise predictions about errors of conjunction and disjunction as well as base-rate neglect. We have found that ignoring marginal denominators can lead to systematic errors in problems involving base rates (Wolfe 1995) and conjunctive and disjunctive probability estimates (Wolfe & Reyna, under review).
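In symbols (a minimal sketch of the relations just described, not a quotation from any of the cited papers), the marginal constraint that denominator neglect removes can be written as:

```latex
% The marginal (base-rate) total and the nested-set constraint it imposes:
\begin{gather*}
  P(B) = P(A \wedge B) + P(\neg A \wedge B), \\
  P(A \mid B) = \frac{P(A \wedge B)}{P(A \wedge B) + P(\neg A \wedge B)},
  \qquad P(A \wedge B) \le P(B).
\end{gather*}
% Neglecting the marginal denominator amounts to dropping P(not-A and B)
% from the sums above, which removes the constraint and licenses
% conjunction, disjunction, and base-rate errors.
```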
Denominator neglect also explains conversion errors in conditional probability judgments, that is, confusing P(A given B) with P(B given A) (Wolfe 1995); the worked comparison below illustrates how the confusion arises. When problems are presented in a format that affords an accurate representation of nested sets, conjunction and disjunction fallacies, as well as base-rate neglect, are generally reduced. Yet improving performance is one thing; proving that we are intuitive Bayesians is another. The adaptive redundancy that gives us flexibility and cognitive frugality can also lead to serious and systematic errors, a fate shared by Socrates and the slave alike.
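A brief worked comparison (the frequencies are hypothetical, chosen purely for illustration) shows why neglecting marginal denominators invites conversion errors:

```latex
% Hypothetical frequencies: of 100 cases, 10 are A, 40 are B, 8 are both.
\begin{align*}
  P(A \mid B) &= \frac{P(A \wedge B)}{P(B)} = \frac{8}{40} = .20, &
  P(B \mid A) &= \frac{P(A \wedge B)}{P(A)} = \frac{8}{10} = .80.
\end{align*}
% The two conditionals share a numerator and differ only in their marginal
% denominators; neglect those denominators and the distinction collapses.
```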