A voluminous literature documents the many shortcomings people exhibit in judging probability. Barbey & Sloman (B&S) focus on a subset of this research that explores people's abilities to aggregate statistical information in order to judge posterior probabilities of the form P(H|D), where D and H represent data and a focal hypothesis, respectively. Some of this literature indicates that people neglect base rates, although some of the findings are consistent with other judgment errors, such as the inverse fallacy (Koehler 1996), which involves confusing P(H|D) with P(D|H). For instance, Villejoubert and Mandel (2002) observed that bias (i.e., systematic inaccuracy) and incoherence (i.e., nonadditivity) in posterior probability judgments were well explained by the inverse fallacy, whereas base-rate neglect could not account for the observed performance decrements. Thus, there is some question regarding exactly how much of what has been called base-rate neglect is in fact base-rate neglect. A safer claim is that performance on such Bayesian inference tasks is often suboptimal and that much of the observed error is systematic.
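To make the contrast concrete, recall that Bayes' theorem gives P(H|D) = P(D|H)P(H) / [P(D|H)P(H) + P(D|¬H)P(¬H)]. With purely illustrative values (not drawn from any of the studies discussed) of P(H) = .01, P(D|H) = .80, and P(D|¬H) = .10, the posterior is .008/(.008 + .099) ≈ .07; a judge committing the inverse fallacy would instead report .80, and one neglecting the base rate altogether would report roughly .89 (i.e., .80/.90).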
B&S challenge a set of theoretical positions oriented around the core notion that humans are better at judging probabilities when the information they are provided with is in the form of natural frequencies (Gigerenzer & Hoffrage 1995). They argue – convincingly, I believe – that variation in frequency versus probability formats neither explains away performance errors nor accounts for those errors as well as variation in the transparency of an inference task's nested set structure does. I shall not repeat their arguments here. Rather, my aim is, first, to sketch out some key propositions of nested sets theory (NST), which have yet to be described as a series of interlocking principles. Second, I will argue that NST would be on even firmer theoretical ground if the dual-systems assumptions that currently pervade B&S's version of it were jettisoned.
At its core, NST consists of a few simple propositions: First, performance on a range of reasoning tasks can be improved by making the partitions between relevant sets of events more transparent. I call this the representation principle. Second, because many reasoning tasks, such as posterior probability judgment, involve nested set relations, transparency often entails making those relations clear as well. I call this the relational principle. Third, holding transparency constant, nested set representations that minimize computational complexity will optimize performance. I call this the complexity principle. Fourth, the manner in which task queries are framed will affect performance by varying the degree to which default or otherwise salient representations minimize task complexity. In effect, this is the flip side of the complexity principle, and I call it the framing principle. Fifth, improvements in the clarity of nested set representations can be brought about through different modalities of expression (e.g., verbal description vs. visual representation). I call this the multi-modal principle. Sixth, within a given modality, there are multiple ways to improve the clarity of representation. I call this the equifinality principle. This list is almost certainly incomplete, yet it provides a starting point for developing a more explicit exposition of NST, which up until now has been more of an assemblage of hypotheses, empirical findings, and rebuttals to theorists proposing some form of the "frequentist mind" perspective. In the future, attempts to develop NST could link up with other recent attempts to develop a comprehensive theory of the representational processes in probability judgment (e.g., Johnson-Laird et al. 1999; Mandel, in press).
Although NST is not intrinsically a dual-systems theory (DST), B&S have tried to make it "DST-compatible." This is unfortunate for two main reasons. First, although DSTs are in vogue (for an overview, see Stanovich & West 2000) – perhaps because they offer a type of Aristotelian explanation long favored by psychologists (Lewin 1931) – they are not particularly coherent theoretical frameworks. Rather, they provide a rough categorization of the processes that guide reasoning and that influence performance through an effort-accuracy tradeoff. The second reason for preferring "pure NST" to an NST-DST hybrid is that the former is not only more parsimonious but also offers a better explanatory account. According to the hybrid theory, when the nested set structure of a reasoning task is unclear, people have difficulty applying the rigorous rule-based system (also called "System 2") and fall back on the more error-prone associative system (also called "System 1"). However, B&S say little about how judgment biases arise from those associative processes, or about how system switching may occur.
Pure NST does not preclude the idea that impoverished representations of nested set relations can shift judgment towards a greater reliance on associative reasoning processes, but neither does it depend on that idea. A viable alternative explanation is that impoverished representations lead to performance decrements because they increase one's chances of failing to access the correct solution to a problem. This does not necessarily mean that reasoners switch to associative processes. It may simply mean that they fail to apply the correct principle or that they select the wrong information aggregation rule. Consider the inverse fallacy: It seems more likely that the error stems from a failure to understand how to combine P(D|¬H) with P(D|H) (and with the base rates of H and ¬H where they are unequal) than that it follows from the use of associative processes.
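As a minimal sketch of what selecting the wrong information aggregation rule might amount to, the following illustrative code (with hypothetical parameter values, not taken from any study discussed here) contrasts the normative Bayesian rule with the inverse-fallacy and base-rate-neglect shortcuts; note that all three are explicit rules, none of which requires an appeal to associative processing:

```python
# Illustrative sketch: three explicit rules a judge might apply to the same inputs.
# Parameter values are hypothetical and chosen only for demonstration.

def bayes_posterior(p_h, p_d_given_h, p_d_given_not_h):
    """Normative rule: combine both likelihoods with the base rate of H."""
    numerator = p_d_given_h * p_h
    denominator = numerator + p_d_given_not_h * (1.0 - p_h)
    return numerator / denominator

def inverse_fallacy(p_h, p_d_given_h, p_d_given_not_h):
    """Erroneous rule: report P(D|H) as though it were P(H|D)."""
    return p_d_given_h

def base_rate_neglect(p_h, p_d_given_h, p_d_given_not_h):
    """Erroneous rule: normalize the two likelihoods, ignoring P(H)."""
    return p_d_given_h / (p_d_given_h + p_d_given_not_h)

p_h, p_d_h, p_d_not_h = 0.01, 0.80, 0.10  # hypothetical task parameters
print(round(bayes_posterior(p_h, p_d_h, p_d_not_h), 3))   # 0.075
print(round(inverse_fallacy(p_h, p_d_h, p_d_not_h), 3))   # 0.8
print(round(base_rate_neglect(p_h, p_d_h, p_d_not_h), 3)) # 0.889
```

On this view, the error is a matter of which rule is retrieved and applied, not of which processing system is engaged.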
Improving the representational quality of nested sets may also influence rule-based processes by simplifying the computations required to implement a normative information aggregation strategy. Indeed, as B&S indicate, the performance decrements on Bayesian judgment tasks that Girotto and Gonzalez (2001) observed when participants were presented with "defective" (but nevertheless transparent) nested sets appear to be attributable to the fact that such representations require at least one additional computational (subtraction) step. That computation itself may not be difficult to perform, but if it is missed, the participant's judgment will surely be wrong.
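To illustrate with purely hypothetical frequencies (not those used by Girotto and Gonzalez): if a respondent is told only that 10 of 100 cases show D and that 8 of those 10 come from ¬H, the count of cases with both H and D must first be recovered by subtraction (10 − 8 = 2) before the ratio 2/10 can be formed; skipping or bungling that single step yields an erroneous posterior even when the nested structure itself is perfectly clear.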
In short, the types of errors that arise from impoverished representations of nested set relations are generally consistent with a rule-based system. NST should remain pure and single, unencumbered by a marriage to dual-systems assumptions.