
Quantum modeling of common sense

Published online by Cambridge University Press:  14 May 2013

Hamid R. Noori
Affiliation:
Institute of Psychopharmacology, Central Institute for Mental Health, Medical Faculty Mannheim, University of Heidelberg, J 5, 68159 Mannheim, Germany. hamid.noori@zi-mannheim.de, rainer.spanagel@zi-mannheim.de, http://www.zi-mannheim.de/psychopharmacology.html
Rainer Spanagel
Affiliation:
Institute of Psychopharmacology, Central Institute for Mental Health, Medical Faculty Mannheim, University of Heidelberg, J 5, 68159 Mannheim, Germany. hamid.noori@zi-mannheim.de, rainer.spanagel@zi-mannheim.de, http://www.zi-mannheim.de/psychopharmacology.html

Abstract

Quantum theory is a powerful framework for probabilistic modeling of cognition. Strong empirical evidence suggests that human judgment and decision-making processes are context- and order-dependent, which falls beyond the scope of classical Bayesian probability theory. However, considering behavior as the output of underlying neurobiological processes, a fundamental question remains unanswered: Is cognition a probabilistic process at all?

Type
Open Peer Commentary
Copyright
Copyright © Cambridge University Press 2013 

Using quantum theory for understanding cognitive processes was mainly inspired by the Copenhagen interpretation of quantum mechanics in the 1920s. However, it took scientists almost a century to formalize cognitive models utilizing the unique features of quantum probability (Aerts & Aerts 1995).

Based on the Kolmogorov probability axioms, classical theories rest on intuitive mathematical foundations that have long substantiated their application to modeling psychological phenomena. Nevertheless, a large body of evidence on order- and context-dependent effects and on violations of the law of total probability suggests that human judgment might follow counterintuitive principles.

One of the major achievements of quantum theory was the conceptualization of the superposition principle. Measuring a system that is a linear combination of a set of independent states assigns it to one particular state and excludes all others. As a consequence, the act of observation influences the state of the phenomenon being observed and introduces a general dependency on the order of observations. These are two significant aspects of cognitive processes that dramatically challenge the classical theories.

The lack of reliable mathematical formalisms within the classical frameworks to address such issues, the inclusion of classical assessments as special (trivial) quantum assessments, and the wider application range of quantum probability even suggest the superiority of quantum theory for modeling cognition.

Pothos & Busemeyer (P&B) discuss this matter thoroughly and elegantly. Modeling non-commutative processes is a unique characteristic of quantum theory and provides a clear distinction between compatible and incompatible questions for cognitive systems. From the psychological point of view, incompatibility between questions refers to the inability of a cognitive agent to formulate a single thought for combinations of the corresponding outcomes. Each question influences the human state of mind in a context-dependent manner and therefore affects the consideration of any subsequent question (induced order dependency). Bayesian models simply fail to represent context- and order-dependent scenarios, whereas for non-dynamic compatible questions their predictions converge to those of quantum probability theory.
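The induced order dependency can be illustrated numerically. The sketch below (a minimal toy model; the rotation angle and initial state are illustrative assumptions, not values from the commentary or the target article) represents two incompatible yes/no questions as non-commuting projectors in a two-dimensional state space and shows that the probability of answering "yes" to both depends on the order in which the questions are posed.

```python
import numpy as np

# Two incompatible questions, modeled as projectors in different bases.
# theta is an assumed rotation between the two question bases.
theta = np.pi / 6

P_A = np.array([[1.0, 0.0],
                [0.0, 0.0]])                 # projector: "yes" to question A
b = np.array([np.cos(theta), np.sin(theta)]) # "yes"-state of question B
P_B = np.outer(b, b)                         # projector: "yes" to question B

psi = np.array([np.cos(0.4), np.sin(0.4)])   # illustrative initial state

# Sequential measurement probabilities (Lueders rule): project, then
# take the squared norm of the resulting (unnormalized) state.
p_AB = np.linalg.norm(P_B @ P_A @ psi) ** 2  # "yes" to A, then "yes" to B
p_BA = np.linalg.norm(P_A @ P_B @ psi) ** 2  # "yes" to B, then "yes" to A

print(p_AB, p_BA)
assert not np.isclose(p_AB, p_BA)            # order changes the probability
```

Because `P_A` and `P_B` do not commute, the two question orders yield different joint probabilities, which no single classical joint distribution over the two answers can reproduce.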

In addition to the more general nature of quantum theory in describing static cognitive processes, this formalism can also be viewed as an extension of classical probability to dynamic processes. Whereas time evolution is represented in both theories by linear transformations, dynamic quantum probabilities are nonlinear functions, which in general implies possible violations of the law of total probability.
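The possible violation of the law of total probability can be made concrete with a minimal amplitude calculation (the angles below are arbitrary illustrative values, assumed for this sketch). Classically, P(B) = P(B|A)P(A) + P(B|not A)P(not A); quantum mechanically, the amplitudes of the two paths are added before squaring, which introduces an interference cross-term.

```python
import numpy as np

# Assumed angles defining the state preparation and the measurement.
theta, phi = 0.5, 1.2

# Amplitudes for reaching outcome B via path A and via path not-A:
a1 = np.cos(theta) * np.cos(phi)
a2 = np.sin(theta) * np.sin(phi)

p_classical = a1**2 + a2**2    # law of total probability (probabilities add)
p_quantum = (a1 + a2)**2       # amplitudes add first, then square

interference = p_quantum - p_classical   # equals the cross-term 2*a1*a2
print(interference)                      # nonzero: total probability violated
```

Whenever the cross-term `2*a1*a2` is nonzero, the quantum prediction deviates from the classical decomposition, which is the formal root of the "violations of the law of total probability" discussed above.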

In the long run, the framework of quantum modeling of cognitive processes presents a generalization of the Bayesian theories, with a deeper notion of uncertainty and natural approaches to problems.

Despite the convincing superiority of quantum theories over classical probability models, a fundamental question remains open: Are cognitive processes governed by stochastic principles at all? To address this question, we focus on the compatibility and expediency of stochastic approaches from the point of view of neuroscience research, and refrain from philosophical discussions on the nature of human judgment and decision making.

Following Griffiths and colleagues (2010), the authors characterize probabilistic models of cognition as "top-down" or "function-first." Furthermore, P&B espouse the philosophy that "neuroscience methods and computational bottom-up approaches are typically unable to provide much insight into the fundamental why and how questions of cognitive process" (sect. 1.2) and hence suggest that the right modeling strategy begins with abstract (stochastic) principles, which are then reduced to (deterministic) neural processes.

However, the assumption of a stochastic nature of cognition and behavior has severe consequences both mathematically and biologically.

  1. Convergence of stochastic and deterministic results: Using deterministic dynamical systems models, computational neuroscience and biophysics have improved our understanding of neuronal processes over the last decades. In particular, these models have reproduced and predicted neurochemical and electrophysiological processes that were shown to induce alterations in the behavior of animals and humans (Knowlton et al. 2012; Maia & Frank 2011; Noori & Jäger 2010). Together, these theoretical models and their experimental validations suggest a deterministic relationship between neural processes and behavior. On the other hand, stochastic models of cognition assign to each behavioral output a random variable in a probability space. A given behavior would therefore be characterized simultaneously as a deterministic function of biological variables and as a random variable. This paradoxical duality requires the deterministic and stochastic descriptions to converge to the same behavioral outcome under given conditions. Consequently, the compatibility of top-down stochastic approaches with biological findings, and the possibility of reducing them to lower-level neural processes, depend on the existence of appropriate convergence criteria, which have not been provided to date.

  2. Interactions of neural processes and human behavior: From cellular dynamics to oscillations at the neurocircuitry level, numerous studies have identified biological processes that define or influence behavior (Morrison & Baxter 2012; Noori et al. 2012; Shin & Liberzon 2010). Cognition is therefore a causal consequence of a series of biological events. In light of these investigations, a cognitive process of a stochastic nature would inherit its "random" character from its underlying biology. In other words, the top-down reduction of abstract stochastic principles to neural processes implies probabilistic dynamic behavior of neural systems at different spatiotemporal scales. However, the lack of theoretical or experimental models confirming the probabilistic nature of neural mechanisms challenges the proposed top-down strategy.

In conclusion, although quantum probability theories significantly extend the application range of classical Bayesian theories for modeling cognitive processes, they do not address the general criticisms of top-down modeling approaches.

References

Aerts, D. & Aerts, S. (1995) Applications of quantum statistics in psychological studies of decision processes. Foundations of Science 1:85–97.
Griffiths, T. L., Chater, N., Kemp, C., Perfors, A. & Tenenbaum, J. B. (2010) Probabilistic models of cognition: Exploring representations and inductive biases. Trends in Cognitive Sciences 14:357–64.
Knowlton, B. J., Morrison, R. G., Hummel, J. E. & Holyoak, K. J. (2012) A neurocomputational model for relational reasoning. Trends in Cognitive Sciences 16:373–81.
Maia, T. V. & Frank, M. J. (2011) From reinforcement learning models to psychiatric and neurological disorders. Nature Neuroscience 14:154–62.
Morrison, J. H. & Baxter, M. G. (2012) The aging cortical synapse: Hallmarks and implications for cognitive decline. Nature Reviews Neuroscience 13:240–50.
Noori, H. R. & Jäger, W. (2010) Neurochemical oscillations in the basal ganglia. Bulletin of Mathematical Biology 72:133–47.
Noori, H. R., Spanagel, R. & Hansson, A. C. (2012) Neurocircuitry for modeling drug effects. Addiction Biology 17:827–64.
Shin, L. M. & Liberzon, I. (2010) The neurocircuitry of fear, stress, and anxiety disorders. Neuropsychopharmacology 35:169–91.