
Functional theories can describe many features of conscious phenomenology but cannot account for its existence

Published online by Cambridge University Press: 23 March 2022

Max Velmans*
Affiliation:
Department of Psychology, Goldsmiths, University of London, London SE14 6NW, UK. m.velmans@gold.ac.uk; https://www.gold.ac.uk/psychology/staff/velmans/

Abstract

Merker, Williford, and Rudrauf argue persuasively that integrated information is not identical to or sufficient for consciousness, and that projective geometries more closely formalize the spatial features of conscious phenomenology. However, these too are not identical to or sufficient for consciousness. Although such third-person specifiable functional theories can describe the many forms of consciousness, they cannot account for its existence.

Type
Open Peer Commentary
Copyright

© The Author(s), 2022. Published by Cambridge University Press

Merker, Williford, and Rudrauf have provided a thoughtful and, in my view, decisive critique of integrated information theory (IIT), which claims that “consciousness is one and the same as integrated information” (Oizumi, Albantakis, & Tononi, 2014; Tononi, 2008). Rather, Φ (the formal measure of integrated information within IIT) is one measure of network efficiency that can be applied to network information processing in general. For this reason, information integration efficiency can be doubly dissociated from consciousness. For example, there can be efficient information flows in complex economic, social, and transportation systems that are far removed from those usually thought to have a unified, integrated consciousness, and there is extensive evidence for efficient unconscious integrated information processing in systems that do have consciousness, namely human minds (see e.g., Kihlstrom, 1996; Velmans, 1991). If so, integrated information processing is not a sufficient condition for consciousness.

Given the inability of IIT to clearly demarcate systems (or subsystems) that don't have consciousness from those that do, Merker et al. suggest an alternative measure of what it is like to have a conscious “point of view,” employing a form of projective geometry that specifies the “point-horizon” structure of human consciousness (Rudrauf et al., 2017). As with other theories that focus on the projected, three-dimensional nature of conscious phenomenology (e.g., Lehar, 2003; Pereira, 2018; Revonsuo, 2006; Trehub, 2007; Velmans, 1990, 2008, 2009), this specification of self-location within a three-dimensional phenomenal world captures a central, but often ignored, feature of human conscious phenomenology that has many important consequences for understanding consciousness (cf. Velmans, 2009). However, it does not follow that the ability to generate and implement such a geometry demarcates conscious entities from nonconscious ones. As with integrated information, double dissociations between systems that instantiate projective geometries and consciousness are easy to envisage. For example, one can devise complex, nonconscious guided-missile systems that have sophisticated ways of computing and navigating their own location within a surrounding terrain. Conversely, although projective geometries can be used to formalize self-location and navigational aspects of human consciousness, there are many “qualia” of consciousness that cannot be formalized in this way, such as the taste of coffee or the smell of a rose.

This leads to a deeper question: Is it possible, even in principle, to specify a third-person observable, functional principle that demarcates conscious entities from nonconscious ones? In Velmans (2009, 2012), I have surveyed many suggested criteria for distinguishing entities that are conscious from those that are not. Broadly speaking, theories about the distribution of consciousness divide into continuity and discontinuity theories. Discontinuity theories all claim that consciousness emerged at a particular point in the evolution of the universe. They merely disagree about which point. Most try to define the point of transition in functional terms, although they disagree about the nature of the critical function. Some think consciousness “switched on” only in humans, for example, once they acquired language or a theory of mind. Some believe that consciousness emerged once brains reached a critical size or complexity. Others believe it co-emerged with the ability to learn, or to respond in an adaptive way to the environment. However, it is well recognized in modern philosophy of mind that all such theories face the problem of “brute emergence.” If consciousness was entirely absent before the emergence of a critical function, what is it about that function that suddenly creates consciousness?

In Velmans (2009, 2012), I have argued that such theories confuse the conditions for the existence of consciousness with the added conditions that determine the many forms that it can take. Who can doubt that verbal thoughts require language, or that full human self-consciousness requires a theory of mind? Without internal representations of the world, how could consciousness be of anything? And, without motility and the ability to approach or avoid, what point would there be to rudimentary pleasure or pain? However, none of these theories explains what it is about such biological functions that suddenly switches on consciousness.

Continuity theorists do not face this problem for the simple reason that they do not believe that consciousness suddenly emerged at any stage of evolution. Rather, as Sherrington (1942) suggested, consciousness is a “development of mind from unrecognisable into recognisable.” According to such panpsychist, panexperientialist, and cosmopsychist views (Seager, 2020; Shani, 2015; Skrbina, 2009; Velmans, 2021), consciousness co-emerged with matter and co-evolves with it. As matter became more differentiated and developed in complexity, consciousness became correspondingly differentiated and complex. Carbon-based life forms developed into creatures with sensory systems that had associated sensory “qualia.” The development of representation was accompanied by the development of consciousness that is of something. The development of self-representation was accompanied by the dawn of differentiated self-consciousness, and so on. On this view, functional evolution can, in principle, account for the different forms that consciousness takes. But consciousness, in some primordial form, did not emerge at any particular stage of evolution. Rather, it was there from the beginning. Its emergence, with the birth of the universe, is neither more nor less mysterious than the emergence of matter and energy.

Which view is correct? One must choose for oneself. However, in the absence of anything other than arbitrary criteria for when consciousness suddenly emerged, continuity theory is arguably more elegant. Continuity in the evolution of consciousness favours continuity in the distribution of consciousness, although there may be critical transition points in the forms of consciousness associated with the development of life, representation, self-representation, and so on.

Conflict of interest

None.

References

Kihlstrom, J. (1996). Perception without awareness of what is perceived, learning without awareness of what is learned. In M. Velmans (Ed.), The science of consciousness: Psychological, neuropsychological and clinical reviews (pp. 23–46). Routledge.
Lehar, S. (2003). Gestalt isomorphism and the primacy of subjective conscious experience: A gestalt bubble model. Behavioral & Brain Sciences, 26(4), 375–444.
Oizumi, M., Albantakis, L., & Tononi, G. (2014). From the phenomenology to the mechanisms of consciousness: Integrated information theory 3.0. PLoS Computational Biology, 10(5), e1003588. doi: 10.1371/journal.pcbi.1003588
Pereira, A. Jr. (2018). The projective theory of consciousness: From neuroscience to philosophical psychology. Trans/Form/Ação, Marília, 41, 199–232. https://doi.org/10.1590/0101-3173.2018.v41esp.11.p199
Revonsuo, A. (2006). Inner presence: Consciousness as a biological phenomenon. The MIT Press.
Rudrauf, D., Bennequin, D., Granic, I., Landini, G., Friston, K., & Williford, K. (2017). A mathematical model of embodied consciousness. Journal of Theoretical Biology, 428, 106–131.
Seager, W. (Ed.) (2020). The Routledge handbook of panpsychism. Routledge.
Shani, I. (2015). Cosmopsychism: A holistic approach to the metaphysics of experience. Philosophical Papers, 44(3), 389–437.
Sherrington, C. S. (1942). Man on his nature. Cambridge University Press.
Skrbina, D. (2009). Panpsychism in history: An overview. In D. Skrbina (Ed.), Mind that abides: Panpsychism in the new millennium (pp. 1–32). John Benjamins.
Tononi, G. (2008). Consciousness as integrated information: A provisional manifesto. Biological Bulletin, 215, 216–242.
Trehub, A. (2007). Space, self, and the theater of consciousness. Consciousness and Cognition, 16, 310–330.
Velmans, M. (1990). Consciousness, brain, and the physical world. Philosophical Psychology, 3, 77–99.
Velmans, M. (1991). Is human information processing conscious? Behavioral and Brain Sciences, 14(4), 651–726.
Velmans, M. (2008). Reflexive monism. Journal of Consciousness Studies, 15(2), 5–50.
Velmans, M. (2009). Understanding consciousness (2nd ed.). Routledge.
Velmans, M. (2012). The evolution of consciousness. In D. Canter & D. Turnbull (Eds.), Contemporary Social Science: Special issue on biologising the social sciences: Challenging Darwinian and neuroscience explanations, 7(2), 117–138.
Velmans, M. (2021). Is the universe conscious? Reflexive monism and the ground of being. In E. Kelly & P. Marshall (Eds.), Consciousness unbound (pp. 175–228). Rowman & Littlefield.