
The cognitive economy: The probabilistic turn in psychology and human cognition

Published online by Cambridge University Press:  14 May 2013

Petko Kusev
Affiliation:
Department of Psychology, Kingston University London, London KT1 2EE, United Kingdom; and City University London, London EC1V 0HB, United Kingdom. p.kusev@kingston.ac.uk http://kusev.co.uk
Paul van Schaik
Affiliation:
Department of Psychology, Teesside University, Middlesbrough TS1 3BA, United Kingdom. P.Van-Schaik@tees.ac.uk http://sss-studnet.tees.ac.uk/psychology/staff/Paul_vs/index.htm

Abstract

According to the foundations of economic theory, agents have stable and coherent “global” preferences that guide their choices among alternatives. However, people are constrained by information-processing and memory limitations and hence have a propensity to avoid cognitive load. We propose that this in turn will encourage them to respond to “local” preferences and goals influenced by context and memory representations.

Type
Open Peer Commentary
Copyright
Copyright © Cambridge University Press 2013 

One of the most significant current discussions in economics and psychology concerns the (lack of a) link between normative theories of decision making (what we should do, based on probability and logic) and descriptive theories (what we actually do). A challenge for, and an advantage of, good theory in decision making is generality: the ability to account for both normative (rational) and descriptive psychological mechanisms and assumptions (Kusev et al. 2009). According to the foundations of economic theory, people have stable and coherent “global” preferences that guide their choices among alternatives varying in risk and reward. In all their variations and formulations, normative utility theory (von Neumann & Morgenstern 1947) and descriptive prospect theory (Kahneman & Tversky 1979; Tversky & Kahneman 1992) share this assumption (Kusev et al. 2009). However, a consistent claim from behavioral decision researchers is that, contrary to the assumptions of classical economics, preferences are not stable and inherent in individuals but are “locally” constructed “on the fly” and are strongly influenced by context and the available choice options (e.g., Kusev et al. 2009; 2012a; 2012b; Slovic 1995). For example, the preference reversal phenomenon (Lichtenstein & Slovic 1971; 1973) suggests that no stable pattern of preference underlies even basic choices; in other words, consistent trade-offs between lotteries with different probabilities and values are not made. Accordingly, we shall present some evidence for violations of the classical probability framework, inspired by existing research in judgment and decision making, and also review plausible sources for understanding human decision making: specifically, simplicity in human information processing, based on local decision-making goals and strategies.
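To make the preference reversal argument concrete, the sketch below (with hypothetical lottery parameters, not taken from Lichtenstein and Slovic's materials) shows why the classic pattern – choosing the high-probability “P-bet” but assigning a higher monetary price to the high-payoff “$-bet” – cannot be generated by any single expected-utility representation: for a fixed increasing utility function, the option with the higher expected utility also has the higher certainty equivalent (price).

```python
# Minimal sketch: under expected utility with any one increasing utility
# function, choice and pricing (certainty equivalents) must agree, so the
# observed "choose the P-bet, price the $-bet higher" pattern is inconsistent.
# Lottery parameters below are hypothetical and purely illustrative.
import numpy as np

def expected_utility(p, x, u):
    """Expected utility of a lottery paying x with probability p, else 0."""
    return p * u(x)

def certainty_equivalent(p, x, u, grid=np.linspace(0, 100, 100001)):
    """Sure amount whose utility matches the lottery's expected utility."""
    eu = expected_utility(p, x, u)
    return grid[np.argmin(np.abs(u(grid) - eu))]

# Hypothetical P-bet (likely small win) and $-bet (unlikely large win)
p_bet = (0.95, 4.0)    # win $4 with probability .95
d_bet = (0.30, 16.0)   # win $16 with probability .30

# A family of concave (risk-averse) utility functions u(x) = x**alpha
for alpha in (0.3, 0.5, 0.7, 1.0):
    u = lambda x, a=alpha: np.power(x, a)
    eu_p, eu_d = expected_utility(*p_bet, u), expected_utility(*d_bet, u)
    ce_p, ce_d = certainty_equivalent(*p_bet, u), certainty_equivalent(*d_bet, u)
    # Whichever bet has the higher expected utility also has the higher
    # certainty equivalent -- choice and price cannot diverge.
    assert (eu_p > eu_d) == (ce_p > ce_d)
    print(f"alpha={alpha}: prefers {'P-bet' if eu_p > eu_d else '$-bet'}, "
          f"CE(P-bet)={ce_p:.2f}, CE($-bet)={ce_d:.2f}")
```

Because choice and price are tied to the same utility function here, the empirically observed reversal implies that preferences are constructed differently under choice and pricing tasks – the “local”, context-dependent construction of preference described above.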

Theorists in cognitive science derive general theoretical propositions from the following assumptions: (1) cognitive systems solve problems optimally, given environmental and processing constraints (Anderson 1990; 1991); the objective, therefore, is to understand the structure of the problem from the point of view of the cognitive system; and (2) cognitive goals determine choice behavior: when a general cognitive goal is intractable, a more specific cognitive goal, relevant to achieving the general goal, may be tractable (Oaksford & Chater 2007; 2009). For example, all local goals are assumed to be relevant to more general goals, such as maximizing expected utility. The observation that local goals may be optimized as surrogates for the larger aims of the cognitive system raises another important question about the use of rational models of human cognition. Specifically, Oaksford and Chater (2007; 2009) propose that optimality is not the same as rationality. The fact that a model involves optimization does not necessarily make it a rational model; rationality requires that local goals are (1) relevant to general goals and (2) reasonable. Here we make a very simple assumption about how and whether the cognitive system optimizes. We assume that the cognitive system simplifies and adopts “local” goals, and that these goals are influenced by contextual and memory representations. Accordingly, we argue, strengthening such an account could provide a challenge to the classical probability approach.
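As a purely illustrative sketch (the payoff function, cue validities, and “accessibility” signal below are our own hypothetical constructions, not a model proposed in the work we cite), the following fragment contrasts an exhaustive “global” optimization, whose cost grows combinatorially with the number of cues, with a tractable “local” rule that simply attends to whatever is most accessible in memory or context – the kind of surrogate goal described above.

```python
# Toy illustration (our own construction): a "global" goal -- pick the best
# subset of cues under a processing budget -- requires searching a space that
# grows combinatorially, whereas a "local" surrogate that takes the most
# accessible cues is linear-time and often close to the optimum.
from itertools import combinations
import random

random.seed(1)
N_CUES = 16
BUDGET = 4                                            # cues that can be attended
value = [random.random() for _ in range(N_CUES)]      # hypothetical cue validity
accessibility = [v + random.gauss(0, 0.3) for v in value]  # noisy memory/context signal

def payoff(subset):
    """Hypothetical decision quality from attending to a set of cues."""
    return sum(value[i] for i in subset)

# Global goal: exhaustive search over all subsets of size BUDGET; the number of
# candidates explodes as N_CUES grows, so this is intractable for a
# capacity-limited system.
best_global = max(combinations(range(N_CUES), BUDGET), key=payoff)

# Local goal: attend to the most accessible cues -- a tractable surrogate for
# the global goal, not a guarantee of reaching it.
local = sorted(range(N_CUES), key=lambda i: accessibility[i], reverse=True)[:BUDGET]

print(f"global optimum payoff:               {payoff(best_global):.2f}")
print(f"local (accessibility-based) payoff:  {payoff(local):.2f}")
```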

Whereas some phenomena in judgment and decision making systematically violate basic rules of probability – classic examples include the conjunction fallacy (Tversky & Kahneman 1983), the disjunction effect (Tversky & Shafir 1992), subadditivity in probability judgment (e.g., Tversky & Koehler 1994), and the preference reversal phenomenon (Lichtenstein & Slovic 1971; 1973; Slovic 1995) – the simplicity framework suggested in this commentary argues that people are constrained by information-processing and memory limitations and hence have a propensity to avoid cognitive load. Research in judgment and decision making demonstrates that, by focusing on local goals (e.g., via the representativeness heuristic), people may violate principles of classical probability theory (e.g., fallacies in which a specific conjunction of conditions is judged more probable than a single general condition). For example, the independence assumption states that the occurrence of one event makes it neither more nor less probable that another occurs; examples of violation include the conjunction fallacy (Tversky & Kahneman 1983), the disjunction effect (Tversky & Shafir 1992), and the familiarity bias (e.g., Fox & Levav 2000; Tversky & Koehler 1994). One plausible account of these effects, as an alternative to classical logic, classical probability, and the classical information-processing paradigm, is quantum probability theory (Busemeyer & Wang 2007; Busemeyer et al. 2006; 2011; target article, sects. 1 and 2). These authors argue that the “classical” view forces highly restrictive assumptions onto the representation of a complex cognitive system. In particular, they suggest that (1) the brain is a complex information-processing system with a large number of unobservable states, (2) the brain is highly sensitive to context, and (3) the measurements that we obtain from the brain are noisy and subject to uncertainty. Pothos & Busemeyer (P&B) show that quantum probability theory allows the modeling of decision-making phenomena (e.g., the conjunction fallacy and violations of the sure-thing principle), going beyond classical probability theory. This is because quantum probability theory can account for the context- and order-dependence of human behavior.
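A minimal numerical sketch of how non-commuting projections can produce a conjunction-fallacy-like pattern is given below. It follows the spirit of Busemeyer et al.'s (2011) two-dimensional illustrations, but the specific state vector and angles are our own assumptions, chosen only to make the ordering visible.

```python
# Simplified two-dimensional sketch in the spirit of the quantum probability
# account of the conjunction fallacy (Busemeyer et al. 2011); the angles below
# are illustrative choices, not parameters fitted by those authors.
import numpy as np

def ray(theta):
    """Unit vector spanning the one-dimensional subspace at angle theta (radians)."""
    return np.array([np.cos(theta), np.sin(theta)])

def projector(theta):
    """Projector onto the subspace representing a 'yes' answer to a question."""
    v = ray(theta)
    return np.outer(v, v)

# Knowledge state ("Linda" description) and two incompatible questions,
# represented as subspaces at different angles in the same 2-D space.
psi        = ray(np.deg2rad(0))          # state after reading the description
feminist   = projector(np.deg2rad(20))   # subspace close to the state: judged likely
bankteller = projector(np.deg2rad(85))   # subspace nearly orthogonal: judged unlikely

# Single event: probability of "bank teller" is the squared length of the
# projection of the state onto the bank-teller subspace.
p_bank = np.linalg.norm(bankteller @ psi) ** 2

# Conjunction evaluated sequentially (feminist first, then bank teller):
# squared length of the composed projection. Because the projectors do not
# commute, this can exceed the single-event probability.
p_fem_then_bank = np.linalg.norm(bankteller @ (feminist @ psi)) ** 2

print(f"P(bank teller)                = {p_bank:.3f}")
print(f"P(feminist, then bank teller) = {p_fem_then_bank:.3f}")
assert p_fem_then_bank > p_bank   # conjunction-fallacy-like ordering
```

The point of the sketch is the order- and context-dependence: evaluating the representative (“feminist”) question first changes the state from which the unlikely (“bank teller”) question is then assessed – exactly the kind of behavior that classical probability theory forbids.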

We conclude that the cognitive system is likely to respond to “local” goals (which may be tractable), influenced by memory representations and context, and that this influence shapes probability and frequency judgments. Therefore, in contrast to classical probability theory, quantum probability theory has the potential to account for the context- and order-dependent behavior that is indicative of the human propensity to adopt local goals. Moreover, there is mounting evidence for this type of behavior and for simple mechanisms governing human behavior (Kusev et al. 2011). Quantum probability therefore has the potential to account for cognitive economy in many domains of human cognition.

References

Anderson, J. R. (1990) The adaptive character of thought. Erlbaum.
Anderson, J. R. (1991) The adaptive nature of human categorization. Psychological Review 98:409–29.
Busemeyer, J. R., Pothos, E. M., Franco, R. & Trueblood, J. S. (2011) A quantum theoretical explanation for probability judgment errors. Psychological Review 118(2):193–218.
Busemeyer, J. R. & Wang, Z. (2007) Quantum information processing explanation for interactions between inferences and decisions. In: Quantum Interaction, AAAI Spring Symposium, Technical Report SS-07-08, ed. Bruza, P. D., Lawless, W., van Rijsbergen, K. & Sofge, D. A., pp. 91–97. AAAI Press.
Busemeyer, J. R., Wang, Z. & Townsend, J. T. (2006) Quantum dynamics of human decision-making. Journal of Mathematical Psychology 50:220–41.
Fox, C. R. & Levav, J. (2000) Familiarity bias and belief reversal in relative likelihood judgments. Organizational Behavior and Human Decision Processes 82:268–92.
Kahneman, D. & Tversky, A. (1979) Prospect theory: An analysis of decision under risk. Econometrica 47:263–91.
Kusev, P., Ayton, P., van Schaik, P., Tsaneva-Atanasova, K., Stewart, N. & Chater, N. (2011) Judgments relative to patterns: How temporal sequence patterns affect judgments and memory. Journal of Experimental Psychology: Human Perception and Performance 37:1874–86.
Kusev, P., Tsaneva-Atanasova, K., van Schaik, P. & Chater, N. (2012a) Modelling judgement of sequentially presented categories using weighting and sampling without replacement. Behavior Research Methods 44:1129–34.
Kusev, P., van Schaik, P. & Aldrovandi, S. (2012b) Preferences induced by accessibility: Evidence from priming. Journal of Neuroscience, Psychology, and Economics 5:250–58.
Kusev, P., van Schaik, P., Ayton, P., Dent, J. & Chater, N. (2009) Exaggerated risk: Prospect theory and probability weighting in risky choice. Journal of Experimental Psychology: Learning, Memory, and Cognition 35:1487–505.
Lichtenstein, S. & Slovic, P. (1971) Reversals of preference between bids and choices in gambling decisions. Journal of Experimental Psychology 89:46–55.
Lichtenstein, S. & Slovic, P. (1973) Response-induced reversals of preference in gambling: An extended replication in Las Vegas. Journal of Experimental Psychology 101:16–20.
Oaksford, M. & Chater, N. (2007) Bayesian rationality: The probabilistic approach to human reasoning. Oxford University Press.
Oaksford, M. & Chater, N. (2009) Précis of Bayesian rationality: The probabilistic approach to human reasoning. Behavioral and Brain Sciences 32:69–120.
Slovic, P. (1995) The construction of preferences. American Psychologist 50:364–71.
Tversky, A. & Kahneman, D. (1983) Extensional versus intuitive reasoning: The conjunction fallacy in probability judgment. Psychological Review 90(4):293–315.
Tversky, A. & Kahneman, D. (1992) Advances in prospect theory: Cumulative representation of uncertainty. Journal of Risk and Uncertainty 5:297–323.
Tversky, A. & Koehler, D. J. (1994) Support theory: A nonextensional representation of subjective probability. Psychological Review 101:547–67.
Tversky, A. & Shafir, E. (1992) The disjunction effect in choice under uncertainty. Psychological Science 3:305–309.
von Neumann, J. & Morgenstern, O. (1947) Theory of games and economic behavior (2nd ed.). Princeton University Press.