
The evolution of priming in cognitive competencies: To what extent is analogical reasoning adaptive?

Published online by Cambridge University Press:  29 July 2008

Paul Bouissac
Affiliation:
Department of French Linguistics, University of Toronto, Victoria College 205, Toronto, Ontario, M5S 1K7, Canada. paul.bouissac@utoronto.ca; http://www.semioticon.com/people/bouissac.htm

Abstract

This commentary questions the general assumptions concerning the cognitive value of analogical reasoning on which the argument developed by Leech et al. appears to rest. In order to better assess the findings of their meta-analysis, it shifts the perspective from development to evolution, and frames their “how” concern within a broader “why” issue.

Type
Open Peer Commentary
Copyright
© Cambridge University Press 2008

This commentary does not bear upon the technicalities of the empirical evidence marshaled by Leech et al. to support their developmental claims concerning analogical reasoning. Nor does it address the validity of the connectionist simulations presented as plausible neurological explanations for this cognitive competency, which is a defining feature of human psychology. It concerns instead the general assumptions regarding analogical reasoning on which the authors' whole argument implicitly rests.

First, the pervasive notion that analogical reasoning can be construed as normative can be questioned. The authors constantly refer to the "correct" analogy in their analysis of the experimental results of the tests based upon the formula: a is to b as c is to what? It is noticeable that the "correct" answers that are expected are all ultimately based on lexical units and their semantic values, even when the tests are apparently purely visual (e.g., a whole pie and a slice of pie). The most convincing part of the general argument is that the "correct" answers rely on the general knowledge available to the children tested. This also holds for adults. But general knowledge is necessarily "cultural knowledge." General or "common sense" knowledge is largely built on stereotyped analogies, which are part of the cultural endowment imparted to children during their earliest languaging and socializing process. It is in this sense normative and culture-dependent. It is socially adaptive, but it does not have any heuristic value.

But analogical thinking is also the source of innovative thinking when it is applied outside the confines of the cultural box. There is anecdotal evidence that children make "wild" analogies. Analogical thinking can be highly innovative and lead to counterintuitive scientific hypotheses or, alternatively, to unsettling artistic visions as in the creative methods promoted by the Surrealist literary movement of the 1920s: what they called the "image" was the perception of a relation – semantic or morphological – between cognitively "distant" realities (Breton 1924/1969, pp. 20–21). In those cases, analogical thinking is unpredictable and has the power to shatter commonsense knowledge. Do infants (and children) start with such "wild" analogical processing of information, and do they need to be "educated" according to their cultural analogical norms? Training by the caretakers is after all very similar to the tests that are described in the target article (e.g., the kitten is the puppy of the cat), but saying that the apple is the kitten of the apple tree probably would not be accepted as a valid answer, although it would be biologically insightful. Then, what about rocks being analogically construed as the puppies of the mountain?

Analogical thinking may be a defining feature of human cognition, but this evolved mental tool for inferring knowledge is decidedly double-edged. It can equally be the source of scientific discoveries or of the perpetuation of erroneous beliefs. A large part of counterintuitive scientific knowledge was built upon the overcoming of entrenched analogical thinking that sustained intuitive evidence, notably in physics and medicine. This is why it is dangerous to uncritically conflate analogical and objective truth, as the authors do. They overlook the fact that their implicit notion of "normative" analogical thinking is culture-dependent, and not necessarily adaptive in the evolutionary sense of the term. Their examples show that their reasoning is based on positivistic "scientific" (commonsense) knowledge, with the corresponding semantic map that is conveyed to infants and children through gesture and language. Other worldviews in other cultures inculcate different analogies, and some of these analogies determine, more often than not, maladaptive behavior.

Without raising the thorny issues of the evo-devo debate, it might be productive to evoke the evolutionary significance of analogical reasoning. It seems that the human brain is particularly adept, for better or worse, at relying on analogical reasoning. Inferring knowledge from incomplete information (and acting upon it) is undoubtedly adaptive in most (or at least some) cases. There seems to be good empirical evidence that all perceptions are based on priming and anticipation (e.g., Glimcher 2003; Gold & Shadlen 2007). But, obviously, like the much-celebrated "theory of mind" (TOM) that evolved as a highly adaptive feature of the human genetic endowment (e.g., Penn et al. 2008), it can extend beyond its adaptive range. If TOM prompts populations to attribute mental states to apparently animated objects such as volcanoes, meteorological events, or virtual entities, and if analogical reasoning construes these objects as bloodthirsty predators that have to be satiated or pacified by sacrificing precious resources, obviously TOM (and analogical thinking upon which TOM ultimately rests) is not unambiguously adaptive.

Finally, it should be pointed out that the two main examples of analogical thinking put forward by Leech et al. are particularly infelicitous – and probably prove my point. Niels Bohr's popular suggestion in 1913 that atoms were to be thought of as miniature planetary systems proved to be seriously misleading and was soon repudiated by Bohr himself, who recanted as early as 1919 (von Baeyer 2003, p. 63). No serious progress could have been achieved in physics if this analogy had become entrenched. As to the Gulf War, the analogy with World War II explored at length by the authors cannot stand political or historical scrutiny. This analogy was created as a political argument developed by the American administration of the time to convince its European allies to join the fight. The image was indeed powerful through the foregrounding of superficial similarities, but very shallow if closely examined. Its purpose was to achieve an emotional impact on the media. Analogies can provide compelling rhetorical arguments through a drastic simplification of situations and problems, but they are very shaky ground for constructing scientific or historical knowledge. At most, they can be heuristic. Priming is an effective shortcut to decision making that is not risk-free and can be very costly. Not all shortcuts lead to survival (e.g., Chittka & Osorio 2007).

References

Breton, A. (1924/1969) Manifestos of surrealism. Trans. Seaver, R. & Lane, H. R. University of Michigan Press.
Chittka, L. & Osorio, D. (2007) Cognitive dimensions of predator responses to imperfect mimicry. PLoS Biology 5(12):e339. Available at: www.plosbiology.org.
Glimcher, P. W. (2003) Decisions, uncertainty, and the brain: The science of neuroeconomics. MIT Press.
Gold, J. I. & Shadlen, M. N. (2007) The neural basis of decision making. Annual Review of Neuroscience 30:535–74.
Penn, D. C., Holyoak, K. J. & Povinelli, D. J. (2008) Darwin's mistake: Explaining the discontinuity between human and nonhuman minds. Behavioral and Brain Sciences 31(2):109–78.
von Baeyer, H. C. (2003) Information: The new language of science. Orion Books.