
What would be pre-modern human cognition?

Published online by Cambridge University Press: 14 January 2025

Nicholas Blurton Jones*
Department of Anthropology, University of California Los Angeles, San Clemente, CA, USA. nickbj@ucla.edu

*Corresponding author.

Abstract

Stibbard-Hawkes's detailed demonstration that in the case of hunter-gatherer artifacts, absence of evidence is not evidence of absence must never be forgotten. The belief that there is a single coherent “human cognitive capacity” difference between modern humans and some unspecified earlier form should be rigorously re-examined.

Open Peer Commentary

Copyright © The Author(s), 2025. Published by Cambridge University Press

The first message of Stibbard-Hawkes's target article is loud and clear and very important: Many complex, personally made objects essential to hunting and gathering would leave no archaeological record. It would be a serious error to interpret the absence of archaeological evidence for such objects as evidence that they were not in common use. Nor can it be taken as evidence for the absence of whatever mental capacities are required to make and use these objects.

The author could have added a fourth example, the Dobe !Kung, based on the extensive publications by Richard Lee and John Yellen. It is surprising that he omitted their foundational work, and the provocation of the field by Washburn and DeVore. It is sad if the younger generation pays so little heed to the founders of a field in which they are themselves rapidly coming to the forefront.

Was there a single unified leap forward?

I am happy to endorse the author's view that even the people with the least elaborate technologies are clearly human. San and Hadza are close to the root of the evolutionary trees sometimes produced by geneticists, but like other field workers I found no difficulty in interacting with either. A caveat: like other field workers before me, I found that although the different ethic concerning ownership and exchange was easily recognizable, it was sometimes stressful for both parties. Later visitors may have benefitted from already established "working relationships" between researchers and their subjects.

The "advanced cognitive capacities" topic descends from Klein's proposal (as cited by Stibbard-Hawkes) that the archaeological record suggested a sharp difference around 50 kya, which among other things offered a quick explanation for why Homo sapiens spread so rapidly out of Africa and around the rest of the world, rapidly replacing previous populations. The idea of a sharp change in "cognitive capacity" was quickly challenged by McBrearty and Brooks (2000) based on data from African archaeology, to my mind successfully. But that leaves the 50 kya expansion lacking an explanation. I suspect an answer may be found in technologies that enabled women to better exploit higher-latitude plant foods. Think acorns. Women had the "advanced cognitive abilities" long before this time.

What would we mean by "advanced modern capacities," and how would we know them when we see them? The author could not resist being dragged into this intellectual quagmire and has made a significant start at draining the swamp. If we think we know what we mean by "fully modern human cognition," we must have some concept of "not yet quite modern," or "pre-modern cognition." If we do not have such a concept, we need to get one and tell the world what it might be like. A common starting point would be comparisons between humans and other apes. There are many, carefully studied by a generation of researchers. But the "fully modern human cognition" literature seems to concern itself with a range of abilities already far out of reach of any surviving apes.

It is easy to see the temptation to believe that there is such a cluster of advanced modern capacities, and that it comprises an inter-dependent set of features, and that they might all have come into being close in time. But can we just assume this and leave it unexplored? Would we all make the same list of indicators of “advanced modern capacities”?

Stibbard-Hawkes does imply that dating the transition, if it was a transition, is unlikely to be easy. The taxonomy of earlier forms of Homo is inevitably based on anatomical features. There is little else to guide the taxonomy. Connections between "cognitive capacity" and anatomy are likely to be remote at best. The relatively small volume of the apparently crucial ventromedial prefrontal cortex (Sapolsky, 2023), which Sapolsky, and Lockwood et al. (2024), link to prosocial behavior, is unlikely to be reflected accurately in skull shapes.

References

Lockwood, P. L., Cutler, J., Drew, D., Abdurahman, A., Jeyaretna, D. S., Apps, M. A. J., … Manohar, S. G. (2024). Human ventromedial prefrontal cortex is necessary for prosocial motivation. Nature Human Behaviour, 8(7). https://doi.org/10.1038/s41562-024-01899-4
McBrearty, S., & Brooks, A. S. (2000). The revolution that wasn't: A new interpretation of the origins of modern human behavior. Journal of Human Evolution, 39, 453–563. https://doi.org/10.1006/jhev.2000.0435
Sapolsky, R. M. (2023). Determined: A science of life without free will. Penguin Press.