
Passive frame theory: A new synthesis

Published online by Cambridge University Press:  24 November 2016

Ezequiel Morsella
Affiliation:
Department of Psychology, San Francisco State University, San Francisco, CA 94132-4168, and Department of Neurology, University of California, San Francisco, San Francisco, CA 94158. morsella@sfsu.edu; http://online.sfsu.edu/morsella/index.html
Christine A. Godwin
Affiliation:
School of Psychology, Georgia Institute of Technology, Atlanta, GA 30318. cgodwin9@gatech.edu; http://control.gatech.edu/people/graduate/cgodwin/
Tiffany K. Jantz
Affiliation:
Department of Psychology, University of Michigan, Ann Arbor, MI 48109-1043. tkjantz@umich.edu; http://prod.lsa.umich.edu/psych/people/graduate-students/tkjantz.html
Stephen C. Krieger
Affiliation:
Department of Neurology, Mount Sinai Medical Center, New York, NY 10029-6574. stephen.krieger@mssm.edu; http://www.mountsinai.org/profiles/stephen-krieger
Adam Gazzaley
Affiliation:
Department of Neurology, University of California, San Francisco, San Francisco, CA 94158, and Departments of Psychiatry and Physiology, University of California, San Francisco, San Francisco, CA 94158. adam.gazzaley@ucsf.edu; http://gazzaleylab.ucsf.edu/people-profiles/adam-gazzaley/

Abstract

Passive frame theory attempts to illuminate what consciousness is, in mechanistic and functional terms; it does not address the “implementation” level of analysis (how neurons instantiate conscious states), an enigma for various disciplines. However, in response to the commentaries, we discuss how our framework provides clues regarding this enigma. In the framework, consciousness is passive albeit essential. Without consciousness, there would not be adaptive skeletomotor action.

Type
Authors' Response
Copyright
Copyright © Cambridge University Press 2016 

R1. The nature of the problem and definitions of consciousness

For decades, the question of what consciousness contributes to the functioning of the nervous system has perplexed theorists and experimentalists, leading some of the greatest scientific minds, including the Nobel Laureates Leon Cooper, Francis Crick, Gerald Edelman, Eric Kandel, and Charles Sherrington, to conclude that answering this question is one of the greatest puzzles in science. As Shallice (1972) asserts, “The problem of consciousness occupies an analogous position for cognitive psychology as the problem of language behavior does for behaviorism, namely, an unsolved anomaly within the domain of the approach” (p. 383). Passive frame theory (PFT), a synthesis of diverse ideas, attempts to answer this question and yield novel insights about consciousness and the brain, all within a conceptual framework that, importantly, is internally coherent (Dux, Gainotti) and comprehensive (Mudrik). PFT attempts to illuminate what consciousness contributes to nervous function, how it serves this role, and what consciousness is, at least in mechanistic, functional terms. In the framework, consciousness is passive albeit essential. Without it, there would not be adaptive skeletomotor function. The theory does not address what Marr (1982) refers to as the “implementation” level of analysis, that is, how neural activities instantiate these conscious states, which is an enigma for various disciplines, including neurobiology. However, in response to the commentaries, we discuss how PFT provides clues regarding this enigma. In addition, we emphasize how the framework is different from established models.

Regarding the commentaries, we are grateful for their collegial, constructive, and thoughtful nature. We will study these commentaries for years to come. Several of them contain insights (e.g., tasting always requires a voluntary act [Gallo], action selection in dreams [Porte], and unique properties of olfaction [Merker]) that will serve as a basis for future research. Many commentaries also contain deep questions that can be answered only after years of further investigation. We were pleased that many commentators viewed PFT as a unique, novel, and internally coherent framework. In the following sections, we respond to the general themes raised in the commentaries.

With our EASE (elemental, action-based, simple, and evolutionary-based) approach, we focused on the most basic form of consciousness (e.g., the experience of a smell, visual afterimage, tooth pain, or urge to scratch an itch) and contrasted it with unconscious processes (e.g., the pupillary reflex, peristalsis). We avoided precise definitions of the phenomenon under investigation because, as noted by Sir Karl Popper, definition is the final stage, and not the beginning, of scientific inquiry. For scientific progress, one needs only identifications and contrasts (e.g., nausea vs. the pupillary reflex). A useful working definition for basic consciousness is provided by Nagel (1974), who proposed that an organism possesses basic consciousness if there is something it is like to be that organism – something it is like, for example, to be human and experience pain, breathlessness, or yellow afterimages. Similarly, Block (1995b) proposes that, “the phenomenally conscious aspect of a state is what it is like to be in that state” (p. 227). All of the contents of which one is conscious compose the conscious field, which changes over time (Fig. R1). The size of the field changes, in a sense, when a new content, which could stem from polysensory configurations of afference (e.g., the McGurk effect), becomes conscious (Fig. R2).

Figure R1. The conscious field, with a different medley of conscious contents at each moment in time. Each of the three conscious fields, representing three different moments in time, possesses its own configuration of conscious contents (the filled shapes). One conscious content (e.g., the triangle) can be a sound; another conscious content (e.g., the square) can be an olfactory stimulus or an action-related urge.

Figure R2. The conscious field, with a different medley of conscious contents (filled shapes) at each moment in time. The conscious field during the third moment includes the percept “da,” induced by the intersensory, McGurk illusion. Another conscious content could be a phonological representation (e.g., /haus/) triggered by seeing a word (e.g., “HOUSE”). The lines feeding into each conscious field represent the unconscious and often polysensory configurations of afference that are involved in the generation of each content.

R2. Explanatory power and novelty of passive frame theory

Blackmore, Lin, Mudrik, and Prinz requested clarifications about the explanatory power of PFT. As a synthesis of six theoretical approaches from diverse fields of study (ideomotor theory, subset consensus, integration consensus, encapsulation, sensorium hypothesis, and PRISM [parallel responses into skeletal muscle]), PFT yields novel insights. For instance, our approach specifies how, when these hypotheses are united, something resembling our proposed “frame” architecture (in which the skeletomotor response to one conscious content is framed by the other contents) would be required for adaptive action. PFT also explains subjective data from (a) intersensory conflicts, (b) smooth muscle conflicts (Morsella et al. 2009a), (c) synchrony blindness (see sect. R7), and (d) skeletomotor conflicts (e.g., holding one's breath). PFT also explains (e) why one is conscious of contents even when those contents are irrelevant to other contents or to ongoing action, and, relevant to Seth's comment regarding the relationship between consciousness and effector systems, (f) why skeletal muscle is the only “voluntary” muscle. Regarding (f), it is important to emphasize that the conscious acts of expressing (or suppressing) inhaling, blinking, swallowing, and micturating all involve, specifically, skeletal muscle. Accordingly, regarding digestion, one is conscious only of those phases of the process that require coordination with skeletomotor plans (e.g., chewing or micturating) and none of those that do not (e.g., peristalsis). Conversely, no skeletomotor plans are involved in the actions of consciously impenetrable processes such as the pupillary reflex, peristalsis, and stomach action, all of which involve smooth muscle.¹ Thus, in response to Seth, the integration achieved through conscious processing is intimately related, not to perceptual processing, smooth muscle control, or motor control, but to skeletomotor action selection. Simply put, PFT explains that consciousness is for voluntary action. Without conscious mediation, adaptive integration fails to occur, as in the case of unintegrated actions (e.g., dropping a hot dish or failing to hold one's breath while underwater).

Many commentators have mentioned that the approach is novel. Others (e.g., Blackmore, Hommel & Wiers) have requested more information regarding its novelty. (As a synthesis, PFT naturally contains ideas that have been presented elsewhere.) First, unlike other approaches, PFT specifies which integrations require consciousness and which do not, which is a current and major problem (cf. Mudrik et al. 2014). Several approaches posit that the integration associated with consciousness is for high-level semantic processes (Mudrik et al. 2014; Thagard & Stewart 2014), which is at odds with our more “low-level,” action-based proposal. There are also accounts in which consciousness is, not for intra-organismic processes, but for high-level, sociocultural interactions (Banks 1995; Carlson 1994; Frith 2010; Macphail 1998; Prinz 2012). Other high-level accounts (e.g., Clark 2002; Koch 2004) propose that conscious processes are functionally unique because they tax semantic memory or top-down processes or are capable of anticipating the future. Moreover, some have hypothesized that consciousness serves no function in action control (Hommel 2013; Koch 2014; Masicampo & Baumeister 2013; see also Jackson 1986; Kinsbourne 1996; 2000; Pinker 1997). (Here we are excluding mention of the many theories in which consciousness is epiphenomenal.) As is evident in the commentaries, some theorists (e.g., Blackmore) disagree with our basic assumptions that there is a difference between conscious and unconscious processing and that consciousness is associated with only a subset of nervous function.

Second, unlike other approaches, we propose that the integration provided by consciousness is associated, not with perceptual processing (e.g., afference binding), efference binding, or smooth muscle binding, but with binding for a peculiar kind of action control. Third, PFT is unique in specifying that conscious contents influence only action systems and not content generators: Conscious contents are sampled only by action systems. As far as we know, PFT, building on Morsella (2005) and Merker (2013c), is the only such account. PFT is also unique in that it focuses on olfaction instead of on vision and is action-based instead of perception-based.

Moreover, PFT is unique in proposing that (a) conscious contents cannot influence each other either at the same time or across time, which counters the everyday notion that one conscious thought can lead to another conscious thought; (b) one conscious content does not “know,” and should not know, of its relevance to ongoing action, to higher-order goals, or to other contents in the field; (c) though consciousness is not epiphenomenal or omnipresent (e.g., as in panpsychism), its role is more passive and less teleological than that of other accounts (e.g., Baars 1988; 2002; Dehaene 2014); and (d) during a frame check, the field functions as a unitary entity in terms of its influence over the skeletomotor output system. The latter, (d), is an important point that renders moot the debate concerning whether the conscious field should be construed as a unitary or componential entity (cf. Searle 2000): Action-related modules in the skeletomotor output system must treat, in terms of functional consequences, the mosaic of contents in the field as one thing. Last, (e) PFT reveals that, during action selection, anticipated action effects, actual action effects, and information about the immediate environment must exist as comparable tokens in a common decision-space. Although consciousness has historically been associated with the highest levels of processing, here it is revealed that consciousness must occur at the level of processing that is shared with that of representations of the immediate external environment (e.g., olfactory stimuli). The conscious field is most concerned, not about the future or past, but about the immediate present (Pahlavan & Arouss), the scenario in which overt action will unfold. Last-minute changes in a course of action might arise from entry of new contents (Merker 2013c).

Hommel & Wiers request that we differentiate PFT from “global workspace” approaches. Consistent with Baars (1988; 2002; see similar models in Anderson 1983; Minsky 1985; Selfridge 1959), PFT proposes that conscious states integrate nervous processes that are otherwise independent. However, PFT is based more on Jamesian ideomotor approaches, which are action-based, than it is on workspace models. In the development of PFT, what rendered it internally coherent was reconciling ideomotor theory with encapsulation: If one adopts the notions of encapsulation and of ideomotor processing, in which percepts must activate response codes (which also occurs in “continuous flow” [McClelland 1979] or “cascade” [Eriksen & Schultz 1979] models), and if one accepts that intuitions about perception-and-action (as in the case of a blackboard diagram in which sensory inputs are connected to motor outputs) are actually computationally impossible (Tsotsos 1995; 2011), then it becomes apparent that something like our proposed frame is needed for adaptive action. Unlike the workspace models (e.g., Baars 1988; Dehaene 2014), which propose that conscious representations are broadcast to modules engaged in both stimulus interpretation and content generation, in PFT (as in Merker 2007), the contents of the conscious field are directed only at the unconscious processes of the skeletomotor output system (Fig. R3). The proposed architecture is consistent with the view (Cisek 2007; Cisek & Kalaska 2010) that, in the nervous system, actional processes cannot be distinguished from decision-making processes (see Hardcastle, White, Kloos, & Hardcastle [Hardcastle et al.]). Finally, unlike in workspace approaches, in which consciousness serves more than a handful of functions (e.g., Baars 1988; Dehaene 2014), we propose that the conscious field serves only one basic, passive role. It performs this same basic function for several kinds of processes, including some high-level functions (e.g., in adult humans). Figuratively speaking, in PFT, the real work is not done in the conscious field (Gur): The conscious field is a workspace without the work (Lashley 1956). In contrast, according to Baars (1988), consciousness serves many functions, including adaptation and learning, decision making, analogy forming, editing and debugging, metacognitive self-monitoring, and autoprogramming.

Figure R3. At one moment in time, the conscious contents are apprehended by the unconscious, action mechanisms of the skeletal muscle output system. Each mechanism (represented by a gray sensor) is associated with a certain kind of (unconsciously mediated) action (e.g., articulating vs. reaching), which is signified by the Rs (for Responses).

R2.1. Field construction and conscious versus unconscious integration

In PFT, both the content generators and response systems are complex, unconscious processes, which is consistent with comments (Lin, Mudrik) about the sophistication of unconscious processes. It is clear that, in the construction of the conscious field, there is the participation of unconscious representations from both bottom-up and top-down sources, including those associated with frontal cortex (Bar et al. 2006; Perlovsky). As noted by Perlovsky, and by LeDoux (1996), some of these pre-conscious representations are “vague” and poorly developed compared to conscious contents. The mechanisms of multiple drafts (Dennett 1991), apperception (Wundt 1902/1904), and re-entrant processing involve unconscious afference from top-down and bottom-up sources (Bar et al. 2006; Basso, Dux, Perlovsky, Porte). These processes illuminate how the contents in the conscious field could satisfy the criteria of multiple modules, a form of multiple-constraint satisfaction (Dennett 1991; Merker 2012) yielding the “global best estimate” of what each content should be (Helmholtz 1856/1961; Merker 2012). Although these mechanisms explain the underpinnings of good field construction, they do not account for the first-person perspective or subjectivity.

Related to Blackmore, Gur, Mudrik, and Lin's comments about conscious versus unconscious integrations, the latter can occur for perceptual (e.g., intersensory) processing, smooth muscle control, motor programming, and stimulus-response reflexes. Unconscious integrations also occur in the perception of the flavor of food, which involves the combining of information from multiple modalities (including haptic, gustatory, and olfactory; Shepherd 2006), and in pain perception, in which there is, for example, interaction between sensory (lateral pain system) and affective (medial pain system) components (Melzack & Casey 1968; Nagasako et al. 2003). Evidence reveals that, even for high-level executive processing, much of what transpires is actually unconscious (Dehaene 2014; Suhler & Churchland 2009), as in the case of determining tendencies (Ach 1905/1951). In response to Mudrik's comment about the conscious penetrability of perception, it is important to reiterate that rivalrous percepts (e.g., the Necker cube or binocular rivalry) or intersensory illusions (e.g., the McGurk effect) are resolved unconsciously: At one moment, consciousness is occupied by only one unambiguous interpretation of the stimulus (Merker 2012). One is conscious only of the outcome of any competitive processes. Thus, for an experimental subject, the McGurk effect² “just happens” in much the same manner as does dream content (Lin, Porte). The only conflicts that one is conscious of are associated with action selection.

We agree with Basso, Franz, Gur, Lin, Mudrik, Pahlavan & Arouss, and Schwartz & Pournaghdali that the conscious field affords a type of processing in which the behavioral response to a given content (e.g., a stimulus eliciting aggression or the McGurk effect) is modulated (or “framed”) by the other contents (e.g., smoke) composing the field (Fig. R2). Accordingly, as Lin notes, responding adaptively to a complex array of stimuli (e.g., a visual scene) requires consciousness. The conscious field permits downstream, motor-related processes to respond to a stimulus in a manner that is contextually sensitive. This kind of sensitivity appears to be unlike anything we find in the robots of today. In short, the conscious field solves the problem, in Behaviorism, of the complex discriminative stimulus.

This sensitivity has led some to conclude that the conscious field is for flexible responding (Basso, Franz, Gur, Lin, Mudrik, Pahlavan & Arouss, Schwartz & Pournaghdali). PFT is in accord with proposals in which consciousness affords what appears to be a flexible, multi-determined response (Crick & Koch 2000; Searle 2000; Sergent & Dehaene 2004). However, it should be clarified that consciousness, over time, seems more flexible than it actually is. PFT reveals that the same contents will always yield the same voluntary action, with the combination of contents wholly and exclusively determining voluntary action selection. Consider that, if one experienced half of what is normally the whole conscious field, though behavior would certainly suffer, one would not notice the absence of information unless a system dedicated to detecting such discrepancies generated a content signaling the absence. In this way, confusion and other forms of metacognition are not givens. (Similarly, the perception of time and spatial “perspective,” whether in the first-person perspective or third-person perspective, are not givens [Einstein & Infeld 1938/1967]. Instead, they are mental creations that are experimentally manipulable; Ehrsson 2007.) These contents are generated by devoted systems that constrain action selection, as Schwartz & Pournaghdali note. In the conscious field, there is often the absence of information but not information about absence (Bridgeman, personal communication, April 16, 2014; Simons & Levin 1997). Accordingly, and related to Hommel & Wiers's insights about addiction, urges in the field can be short-lived but have, in the absence of strong tendencies against them, a strong influence on behavior (Loewenstein 1996).

Faced with the apparent flexibility of consciousness, one might propose that the function of consciousness is one of instantiating stimulus-response relationships that are “arbitrary.” One problem with this hypothesis is that (a) it is difficult to define what constitutes an arbitrary mapping between perception and action; (b) there are countless cases of unconscious processes that seem to involve arbitrary mappings, as in the case of motor programming (Grossberg 1999; Rosenbaum 2002); and (c) some non-arbitrary mappings (e.g., holding one's breath yields a negative subjective state) never become unconscious, despite extensive training and rehearsing of the perception-to-action mappings (Poehlman et al. 2012). Moreover, unlike PFT, this hypothesis fails to explain why smooth muscle actions and intersensory conflicts are always mediated unconsciously.

R3. The focus on simple cases: Low-level versus high-level conscious contents

To investigate the primary function of consciousness, we were influenced by how progress was achieved in the field of physics (Einstein & Infeld 1938/1967). Hence, in the target article, we focused on simple cases and low-level phenomena (e.g., percepts and urges). Regarding urges, in our “creature in the cave” scenario, the noxious stimulus leading to an inborn avoidance tendency could have been, instead of smoke, some other odorant (e.g., hydrogen sulfide eliciting an inborn disposition of disgust; Gallo, Gur, Lathe, Mudrik). Our point was to illustrate that the inborn action tendency toward the smell changed the manner in which the creature responded to perceptual features of the scene (e.g., the opening) that were already represented in the conscious field. In response to Gur's comments about the capabilities of our creature in the cave, we should emphasize that we interpreted as parsimoniously as possible the operations giving rise to the behavior of this creature, a creature that is not as hypothetical as we thought (see D'Souza & Bremner).

Regarding the ubiquitous distinction between high- and low-level conscious contents (de Vries & Ward; Dux; Franz; Gur; Hardcastle et al.; Hommel & Wiers; Jordan & Vinson; Melo, Koscik, Vrantsidis, Hathaway, & Cunningham [Melo et al.]; Mudrik; Perlovsky; Swain, Caluser, Mahmood, Meldrim, & Morelen [Swain et al.]; Vonasch, Masicampo, & Baumeister [Vonasch et al.]), one contribution of our action-based approach is that these distinctions are unnecessary. In PFT, all of these different kinds of contents can be construed as tokens which, through the conscious field, constrain voluntary action. From this standpoint, a percept, urge, nausea, and even a high-level sense, such as the sense of agency (which is experimentally manipulable [Riddle et al. 2015] and was touched upon by Gur, Massaro & Rowe, Mudrik, Pahlavan & Arouss, Prinz, and Schwartz & Pournaghdali), are just contents in the conscious field that constrain action selection. For example, while driving, a tip-of-the-tongue state concerning important information could influence action selection (e.g., one decides to park in order to concentrate on retrieval; cf. Schwartz & Pournaghdali). Hence, a percept, urge, and even a metacognition are similar in terms of their functional consequences. From this standpoint, a primitive form of self is necessary, not for the instantiation of consciousness (as in Prinz 2012; see Prinz's commentary), but rather for adaptive action selection.

In addition, for adaptive action, the highest of conscious contents must be married, in a sense, to the representational format of the lowest and phylogenetically oldest contents: High-level contents must exist in the same decision space and share the representational format of contents as primitive as those of olfaction. Hence, knowledge of the most primitive systems involved in the conscious field will reveal much about the nature of the highest systems, such as those of concern to Dux and Vonasch et al. As de Vries & Ward note, the latter are constrained by the former, which might be the most tractable to investigate. This is consistent with Shepherd's (2007) conclusion that “the basic architecture of the neural basis of consciousness in mammals, including primates, should be sought in the olfactory system, with adaptations for the other sensory pathways reflecting their relative importance in the different species” (p. 93). In humans, skeletomotor action conflicts often include a medley of high- and low-level contents.³ This is evident when, faced with a yellow traffic light, one decides to begin to brake or to speed up (it depends on context; see Woodman & Vogel 2005, p. 111). It is also evident in Merker's “burnt broccoli” example and in the following scenario: “I am hungry and desire steak, but I am Catholic, and it is Good Friday.” Our “Thanksgiving dinner” example was intended to demonstrate how a high-level function – language – must take into account contents from a phylogenetically older function – that is, olfaction. Consistent with Melo et al., Schwartz & Pournaghdali, and Merker, the high-level contents (including metacognitions and emotion) participate with the low-level contents in the constraining of action selection. Accordingly, Freeman (2004) proposes that conscious representations of different sensory origins must at some level be similar in form in order for them to be integrated into a polysensory Gestalt of the world. The format must permit interaction between perceptual and motor systems (Freeman 2004) if there are to be perception-to-action translations (W. Prinz 2003b).

In agreement with Vonasch et al., much can be learned from what has been regarded as high-level contents (e.g., our insights about the McGurk effect and subvocalization, Helmholtz's [1856/1961] insights about automatic reading). The focus on simpler aspects of human behavior reflects only the presently intended scope of PFT. It is hoped that, after empirical and theoretical developments, the model will generalize to more complex phenomena, including aggression (Pahlavan & Arouss), taste (Gallo, Lathe), development (D'Souza & Bremner), emotion (Gainotti, Melo et al., Pahlavan & Arouss), addiction-related behaviors (Hommel & Wiers), dream consciousness (Porte), and parental consciousness (Swain et al.). The commentators have illuminated the under-explored action-related aspects of these phenomena. Emotional phenomena, with their action-related components (Melo et al.; Frijda 1986), provide a rich domain in which to develop PFT (Gallo, Gainotti, Melo et al., Pahlavan & Arouss). (As Gainotti notes, affective neuroscience provides additional evidence for a cortical account of consciousness [see also Dux].) Concerning human development (D'Souza & Bremner, Swain et al.), we were delighted to learn that our creature in the cave may exist not only hypothetically. Such insights from the commentators will help test and extend PFT. In addition, our aim is to eventually unify PFT, already a synthesis, with frameworks concerning ecological perception (de Vries & Ward), attention (e.g., the allocation of attention to action model of Franz), emotion (e.g., the iterative reprocessing model of Melo et al.), and chaos and complexity (Hardcastle et al.).

Regarding the generation of high-level contents, it is important to reiterate that, when discussing unconscious inference, Helmholtz (1856/1961) was referring not only to the unconscious generation of low-level contents, but also to the generation of high-level contents (e.g., automatic word reading; Augustinova & Ferrand 2014). In many cases, high-level contents (e.g., subvocalizations) “just happen,” as do low-level contents (e.g., nausea; see Porte; see also reflex-like activations of high-level contents in Haidt [2001] and Tetlock [2002]). To investigate how such high-level contents can arise in consciousness unintentionally and in a reflex-like manner, we developed the reflexive imagery task (RIT; Allen et al. 2013; see review in Bhangal et al. 2016). In this paradigm, the insuppressible conscious contents are from high-level processes, including involuntary object counting (Merrick et al. 2015), subvocalizations, and even the kind of word transformations used in the childhood game of Pig Latin (Cho et al. 2016). The more we learn from the RIT about the generation of high-level contents, the more this generative process resembles that of low-level contents.

From the standpoint of PFT and in response to Massaro & Rowe, Mudrik, Perlovsky, Schwartz & Pournaghdali, and Seth, the outputs from a “narratorium” or an “interpreter” module that draws coherent (albeit often incorrect) conclusions about the current context (Roser & Gazzaniga 2004) are just another kind of content, produced by dedicated systems that participate in the conscious field. Consciousness, which is passive, does not reason or draw conclusions. Instead, systems devoted to reasoning or to the drawing of certain kinds of conclusions (e.g., about a puzzle or a piece of music) have their outputs represented in the conscious field, just as a modular system can maintain an undesired “earworm” in consciousness.

PFT provides a similar answer regarding memory. We agree that many memorial processes are consciously impenetrable (Schwartz & Pournaghdali) and that the activities of content generators are influenced by past experiences. Memory is essential for the adaptive generation of, and response to, contents (Basso, D'Souza & Bremner, Gur, Hardcastle et al., Hommel & Wiers, Jordan & Vinson, and Lathe). However, these processes stem from dedicated systems operating outside the conscious field, which itself has no memory. (This is evident in neurological conditions.) Similarly, anticipatory processing (Jordan & Vinson, Seth), which is essential for perception and other forms of content generation, occurs outside the conscious field, which is not burdened with the operations of memory, prospection, or deliberation (Hardcastle et al.). (Relevant to the experimental findings by Hardcastle et al., which involved dissociations between perception and action, expressed actions could sometimes influence, not perception, but only the memory of the expressed action; see Cooper et al. 2012.)

R3.1. Metacognition and consciousness across the phyla

Of all the functions that have been proposed as the primary function of conscious processing, the function proposed in PFT is the most basic and primitive. In response to Rosenbaum's point about animal consciousness, PFT, when further developed, might serve as a tractable framework with which to investigate consciousness in animals. However, at this stage of understanding, it is premature to apply our framework to other animals. Across the phyla, the same function (e.g., locating an object) can be carried out by vastly different mechanisms (e.g., vision vs. echolocation). Therefore, one cannot conclude that a given action (e.g., holding one's breath) in species X is carried out by the same mechanisms by which it is carried out in species Y. Even within one species, the same function could be carried out by more than one mechanism (Dawkins 1982). Hence, before making cross-species comparisons, which are problematic, science must explain the perplexing contrasts within self-reporting, adult humans. (This conclusion is relevant also to the idea of extending PFT to infants; D'Souza & Bremner.) At present, in humans alone there are enough contrasts (e.g., the pupillary reflex vs. conscious pain) that are difficult to explain. (In addition, as Swain et al. note, the most basic of human functions – as simple and primitive as they have been deemed to be – nonetheless seem to occupy much of the brain's activity and to dominate much of human existence, especially during the most critical of periods [e.g., early childhood, parenting].)

Nevertheless, it is worth reiterating that, because our approach is simple and evolution-based, it may be the most suitable framework to extend to the study of other animals. Insofar as one would like to study consciousness across species, one should focus on olfaction, for several reasons, including that it is the most phylogenetically preserved modality and a “common denominator” region of the vertebrate brain. Moreover, when conducting experiments on adult humans, it is much easier to induce olfactory percepts than high-level, metacognitive states and background states, such as those mentioned by Basso, de Vries & Ward, Franz, Perlovsky, and Schwartz & Pournaghdali. For instance, regarding tip-of-the-tongue states and feelings of knowing, it is difficult to control exactly when these states arise, and little is known about their neural correlates. At this stage of understanding, it is most productive to focus on obvious, easily reportable forms of conscious content (e.g., detecting a smell) rather than on more nebulous conscious states (e.g., background states). The lack of verbal report following the stimulation of frontal areas may indeed reflect the greater difficulty of communicating about these contents than about perceptual events. Hence, in the current version of PFT, we limit ourselves to the kinds of conscious contents occurring in our “creature in the cave” scenario. Nevertheless, these high-level states, which involve the frontal cortex, may provide unique insights regarding consciousness.

Franz questions whether olfaction is a model system for consciousness research. Justifications for our focus on olfaction can be found in the target article (see Note 1 and sect. 3.5). We are grateful for Merker's listing of additional properties of olfaction (e.g., minimal spatialization, habituability of ordinary odorants, innate preferences, pheromonal signaling) that render it a good modality for investigating consciousness. As Gallo notes, other unique properties of olfaction are that there is no such thing as a “neutral” odorant and that swallowing is almost always a voluntary act. Concerning habituation (Merker), this phenomenon occurs in a special manner for olfaction because of the absence of the possibility of voluntary re-access to an exposed odorant (Merker; see also Stevenson 2009). Regarding the “experiential nothingness” associated with habituation, research indicates that (a) activation in the orbitofrontal cortex does not decrease over odorant exposure (60 seconds; Poellinger et al. 2001), and (b) accurate odor detection persists after activation in the piriform cortex decreases to a baseline (or below-baseline) level (Poellinger et al. 2001; Sobel et al. 2000). When isolating the neural correlates of olfactory consciousness, one should seek regions that are active most during conscious detection but not during habituation or subliminal perception (Merrick et al. 2014).
Merker's forward-looking insights also reveal that much can be learned about olfactory consciousness by examining the activities of cortical layer VI pyramidal cells and the effects on olfactory consciousness of perturbations (e.g., lesions) of the dorsal pulvinar, a multimodal region whose activity is tightly linked with conscious perception (Wilke et al. 2009).

R4. Passive frame theory explains the primary function of consciousness

Prinz astutely questions whether PFT provides a functional account of consciousness. (Prinz also requests a definition of subjective experience that is distinct from our functional account of consciousness. This definition is provided in section R1.) We agree that a constellation of isolated facts (e.g., that skeletal muscle but not smooth muscle can be consciously controlled) does not by itself constitute an explanatory model, just as one can know that the sun rises every morning and that the seasons change without having an explanatory model of the solar system. Thus, it is one thing to know that skeletal muscle can be consciously controlled, but it is an entirely different matter to have an explanatory framework that integrates such a fact into a coherent, causal account. PFT provides such a framework.

Perhaps this issue is better illustrated by analogy. In naively observing the digestive system, one could generate hypotheses concerning the circumstances under which, say, salivary amylase is secreted into the mouth. However, these hypotheses would be fundamentally different from those regarding the primary function of salivary amylase, which would have to propose, in addition, what it is for (e.g., digestion of starch in the mouth) and, by extension, what phenomena would occur to a lesser degree without it. As a theory about function, PFT satisfies these criteria by claiming that (a) consciousness is necessary for collective influence over the skeletomotor output system, (b) no other process performs this role, and (c) without these states, collective influence and integrated actions would be absent. Except for the fact that the actions of salivary amylase upon starch can be observed directly (e.g., in a Petri dish), the functional claims about this enzyme and about consciousness are analogous.

Then, of course, there are questions regarding how salivary amylase breaks down starches and why salivary amylase, and not some other substance, was selected in evolution to carry out this function. With respect to biological systems, how and why questions are fundamentally different from what for questions (Lorenz 1963; Simpson 1949). PFT explains what consciousness is for, that is, the nature of its primary function. How physical processes carry out collective influence is a variant of the “hard problem” and is thus outside the scope of PFT. In addition, one must distinguish the primary role of evolutionary adaptations from their secondary roles (Dawkins 1982; Gould 1977; Lorenz 1963). A scientist could argue, for example, that color perception evolved for selecting fruits and detecting camouflaged prey, but no one doubts that color perception could also be used to appreciate a painting. The color harmony of a painting is perceptible to us because it involves the kinds of stimuli that are of adaptive significance in other contexts. In response to Prinz, we perceive the kinds of things that we evolved to act upon (Dawkins 1982; LeDoux 2012). As humans, we have inherited the conscious field. Like the eye, it has a fixed architecture, one that, though having a net adaptive effect across all of the stages of ontogeny, may not function adaptively in all contexts. Its adaptive value may not be evident in one particular situation (e.g., when experiencing unhealthy urges).

R4.1. The conscious field does not require conflict or skeletal muscle

In response to comments about the liaison between conflict and consciousness (D'Souza & Bremner, de Vries & Ward, Gur, Hardcastle et al., Hommel & Wiers, Lathe, and Massaro & Rowe), in PFT, a content does not require conflict to enter the conscious field, which is a “continuous feed” system. The conscious field is not for conflict monitoring per se, but rather for collective influence, which is especially important under conditions of conflict. (We agree with Jordan & Vinson that people underestimate how many action conflicts occur on a given day.) In PFT, contents, because of the requirement of encapsulation, do not know about the nature of other contents nor about whether there is conflict. In addition, the contents do not know whether they are relevant for ongoing action. The notion that no conscious content can directly influence another conscious content, either at the same time or across time, is consistent with Gestalt theory (e.g., Werner and Kaplan's [1963] notion of dynamic schematization) and with Helmholtz's (1856/1961) “global perception,” in which unconscious configurations of afference interact to generate an unambiguous conscious field, which can be construed as a static mosaic of conscious contents. As Merker (2013c) notes, the kinds of sensorial interactions involved in illusions (e.g., the McGurk effect), color constancy phenomena, and other perceptual phenomena occur, not in the conscious field, but in the (unconscious) processing of afference, occurring before the construction of the conscious field. For instance, though a content such as the McGurk effect is influenced by different kinds of configurations of afference, once the content is conscious, it cannot be modified on the basis of other contents. In PFT, the contents are not there to communicate with each other.

Concerning Rosenbaum's question about animals lacking skeletal muscle, in PFT, there is nothing intrinsically special about skeletal muscle that causes it to be related to consciousness. Skeletal muscle is one of many “multi-determined” effectors in the body. (Consider that the pupillary reflex, involving smooth muscle, too, is multi-determined, as it is influenced by light conditions, emotions, and other variables.) Conscious processes are distinguished from unconscious ones not simply because they “involve” skeletal muscle, but because they involve skeletal muscle in a particular manner, in which encapsulated systems (having different operating principles and phylogenetic origins) vie to express their respective skeletomotor plans.

Pertinent to Rosenbaum's question, one can also ask: If conscious states are primarily for skeletomotor action, then why do they persist even when the skeletomotor system is deactivated because of, for example, neural damage? In response to this criticism, one should consider the following analogy. Many of today's automobiles contain navigational systems whose primary function is to aid navigation. With this in mind, it is conceivable that the navigational system would continue to function despite problems with, say, the transmission of the car. Similarly, conscious processes, whose primary function is serving skeletomotor action, can continue to function after the peripheral structures that they are intended to serve are nonoperational. (Similar decoupling of central conscious processing from peripheral events occurs in phantom limb; Ramachandran 1999.)

R5. Attention, automaticity, and the timescale of consciousness

In response to the concerns of Basso, Dux, Lin, and Franz about the relationship between attention and consciousness (see differing views about this relationship in Koch and Tsuchiya [2007] and Cohen et al. [2012]), we posit that the nature of this relationship depends in large part on one's definition of attention. As noted by Tsotsos (2011), there are more than a handful of definitions of attention. Most theorists construe attention as a cause, something that influences information processing in a certain way, while other theorists, interestingly, construe it as an effect, for example, as a by-product of a value-based selection process centered on the basal ganglia (Krauzlis et al. 2014). In addition, for Oberauer and Hein (2012), there is a low-level form of attention, having certain properties, and a separate, higher-level form of attention, having other properties. It could be argued that one of these forms of attention, but not the other, is somehow necessary for basic consciousness. From our standpoint, it has to be explained how attention, defined one way or another, is necessary for the detection of basic conscious contents such as nausea or a gas leak (see Merker's related comment about olfactory attention).

We next turn to the topic of automaticity, which was raised by Gur, Massaro & Rowe, and Rosenbaum. Regarding the contrast between novel actions (e.g., the first time one ties one's shoes) and automatized actions (e.g., the ten-thousandth time one ties one's shoes), it is difficult to ascertain which aspects of consciousness, if any, are diminished in the latter. In making such a contrast, one could easily conflate changes in consciousness with the well-known changes in attentional processes that stem from automaticity (Baars 1997b; Logan et al. 1999; Puttemans et al. 2005). The question is: What would be no longer consciously accessible when driving home automatically? One benefit of PFT is that when contrasting conscious and unconscious processes, the contrast is between processes that one is never conscious of (e.g., the pupillary reflex and peristalsis) and processes that one is almost always conscious of (e.g., pain and air hunger while holding one's breath).

In response to Hommel & Wiers's thoughtful question about the timescale at which consciousness operates (related to Massaro & Rowe's comments), according to PFT the timescale must be that associated with normal, ongoing voluntary action selection, which is different for different kinds of voluntary acts (e.g., taking a deep breath vs. looking left to right) and is often slower than that of reflexive actions. Whatever the exact timescale in humans (or in other species; de Vries & Ward) may be, it must be slow enough for the frame check to include the relevant contents for the adaptive employment of a given effector and quick enough (on the order of hundreds of milliseconds) to benefit actions occurring at fast rates (e.g., voluntary saccades). Regarding the former, it is interesting to consider that the instruction to a subject to press a button whenever there is a noticeable change in any part of the sensorium requires the action processes associated with that effector to, in a sense, sample a wide variety of contents in a very short time.

R6. Evolution, functionalism, and emotion

Prinz and others (Massaro & Rowe, Mudrik) question why the function attributed to consciousness is not solved unconsciously, as are many other functions. After all, it is easy to imagine integrated actions (e.g., suppressing inhalation) occurring without anything like a conscious field. However, there are many hypothetical solutions to phylogenetic challenges that the human body did not arrive at by way of evolution. In our descriptive account, intuitions regarding how the nervous system should work take a back seat to actual data revealing the manner in which it actually works. Hence, it seems premature to adopt an epiphenomenal stance (e.g., Blackmore) until there is a sufficient scientific understanding about the place of consciousness in nature. We should add two comments. First, our intuitions of how nervous systems should function, in which sensory inputs are connected to motor outputs, are actually computationally impossible (Tsotsos 1995; 2011). (Regarding this issue, Neal Miller [1959] hypothesized that “central states” could provide a solution to the problem; see discussion in Corr & Morsella 2015.) Second, some of the counterintuitive adaptations in evolutionary history (e.g., intra-psychic conflict) are actually good solutions given the hardware at hand (e.g., slow processing units; Livnat & Pippenger 2006).

One could also ask: If consciousness is for action selection, then why is one aware of so many things that do not demand immediate action (e.g., the plot of a story; Basso, de Vries & Ward, Gur, Perlovsky, Porte)? One could certainly imagine more efficient systems – and more falsifiable models – that invoke the conscious field only under conditions when it is needed most (e.g., conflicting action tendencies). The apparent inefficiency of the field is an incontrovertible, first-person property of these states. In the absence of any conflict or obvious demands upon action, one is continuously conscious of, say, a red object standing before one. This counterintuitive property of the field can be likened to the efficiency of the continuously running conveyor belt of the ball-return machine at a bowling alley (Morsella 2005), which is inefficient in the sense that it constantly expends energy, even when there are no bowling balls needing to be returned to players. However, the machine is more efficient than a machine having an additional mechanism that determines whether a ball needs to be returned. Such deceptively “inefficient” solutions can be observed in physiology outside the nervous system, as in biological filters (e.g., the kidneys), which continuously filter a substrate regardless of the status of the substrate. Just as eyes do not turn off when there is nothing interesting to look at, the conscious field does not turn off when its role is unneeded.

Regarding one's consciousness of a story that may never influence behavior, it should be stated that simulacra such as novels have been constructed to incite attentional, emotional, and other kinds of processes for only an infinitesimally recent fraction of human history. Consistent with PFT, the stimuli from simulacra succeed in part because they activate inflexible, encapsulated systems that, at some level, are incapable of “knowing” that what is occurring is not “real.” (Consistent with this view is the idea that emotional systems [e.g., for fear and aggression] evolved independently and are modularized in the brain [LeDoux 2000; 2012; Öhman & Mineka 2001].) Scary movies, for example, are capable of activating to some degree the kinds of affect associated with the natural observations of the portrayed events. For the majority of our natural history, such activation was clearly adaptive. Though such inclinations and imagery operate in a realm shielded from that of expressed action, they are still intimately related to action. (Relevant to Porte, in certain neurological conditions, sleep paralysis fails and patients act out their dreams, revealing the intimate link between conscious content and action. Sleep paralysis is mediated by inhibition only at the latest stages of motor control [e.g., at the spinal level; Glenn & Dement 1981].) Thorndike (1905) concludes, “The function of thoughts and feelings is to influence actions . . . Thought aims at knowledge, but with the final aim of using the knowledge to guide action” (p. 111).

Prinz asks how the design of consciousness, a product of evolution, reflects the needs of adaptive action control. First, we subscribe to the uncommon position that consciousness is best understood by examining the requirements of adaptive efferent action control rather than the needs of perceptual analysis (see Note 4). Accordingly, as noted by Merker (2013c), the conscious field is organized egocentrically (see J. Prinz 2007), around an active agent that reaches, locomotes, and performs other acts that are spatially directed. This organization is found also in the dream world (Porte). (PFT pertains to normal waking consciousness, but as noted by Porte, the kind of action selection occurring in dreams seems to be isomorphic to that of waking.) For action selection, it would be disadvantageous for this first-person arrangement to break down and for an object on the left to be represented as if on the right. According to PFT, the first-person perspective and other properties of the field (e.g., its having varied elements) result from the demands of adaptive action selection. Second, the valence and other properties of a conscious content are in some ways isomorphic to ongoing action. It is not the case, for example, that pleasant states are associated with avoidant behaviors or that unpleasant ones are associated with approach behaviors. (As noted by James [1890], this non-arbitrary relationship between valence and action poses a problem for epiphenomenalism.) Similarly, as Sperry (1952) notes, and consistent with de Vries & Ward's discussion of affordances, perceptual contents (e.g., the shape of a triangle) are often more isomorphic with action plans (e.g., tracing) than with sensory inputs (the proximal stimulus on the retina). In conclusion, much of the way the field is designed, including a primitive sense of self (the first-person perspective), reflects the demands of adaptive action selection.

R7. Integrated (“voluntary”) action and subliminal stimuli

The evidence for PFT is the combined evidence for each of the hypotheses composing the framework. For a synthesis that attempts to account for so many disparate observations, there will naturally be evidence both for and against each of its constituent hypotheses. For example, there are bits of empirical evidence that, at first glance, appear to challenge the hypotheses of encapsulation and ideomotor processing, as well as the hypothesis that consciousness depends on cortical processes. In the coming years, the evidence for and against the tenets composing PFT should be examined carefully. Some of the tenets of PFT will be amended; others will be abandoned.

Nevertheless, today there is substantially more evidence in favor of PFT than there is against it. Moreover, much of the evidence supporting PFT can be obtained, not just in the laboratory, but in everyday scenarios (e.g., the unawareness of peristalsis and the pupillary reflex vs. the subjective experience of holding one's breath). These bits of evidence, realized in everyday life scenarios and not dependent on the technicalities of the laboratory, are often the most compelling forms of evidence. PFT focuses on modal findings (i.e., the most common findings) regarding when conflicts are conscious and when they are not. At this stage of the scientific understanding of a problem as thorny as consciousness in the brain, the mode is very informative.

Ayars mentions the well-known finding in which a prime (e.g., a rightward arrow) is presented subliminally before a target stimulus (e.g., a leftward arrow) to which the subject must respond. In this paradigm, the response to the target is modulated by the prior presentation of the prime. For example, interference (e.g., increased response latencies) is less when the two arrows match (the congruent condition) than when they mismatch (the incongruent condition; Eimer & Schlaghecken 2003; Schlaghecken et al. 2006). Similarly, the subliminal presentation of the image of a spider could, in principle, ramp down an appetitive, behavioral “approach” system. The data mentioned by Ayars are corroborated and complemented by many similar findings (e.g., Hughes et al. 2009; Van Opstal et al. 2010), some of which stem from our own laboratory, in which subliminal stimuli induced behavioral inclinations that gave rise to subjective urges and also influenced behavior, even though subjects were unaware of the source of these urges (discussed further on here and in Morsella et al. 2011). Accordingly, Desender et al. (2014), after reviewing the literature on subliminally induced inhibition in interference paradigms, conclude, “The difference between awareness of a prime and experience of a conflict is of crucial importance … Response conflict might give participants the general feeling that something is wrong, without their knowing why or what is wrong” (p. 681). To Desender et al. (2014), this subjective experience of conflict, regardless of awareness of the prime, is essential for the top-down control of behavior.

The effect described by Ayars could arise from various mechanisms (Logan et al. 2015; Munakata et al. 2011), including negative priming (Tipper 1985) or the residual effects of the prime having activated, through unconscious efference binding, a response code (consistent with section 3.2 of the target article) that does not match the subsequently activated response code. (In section 3.2, we also discuss the relevant fact that motor programs, which are unconscious, are modulated in sophisticated ways by external stimuli.) We agree that past experience can influence the activities of content generators and response systems (discussed subsequently). For example, a response code is likely to function more quickly when it was activated recently (repetition priming) than when it was not.

Second, and more important, we should clarify that collective influence from a frame check pertains to a class of phenomena (e.g., holding one's breath while underwater or suppressing some other prepotent and simultaneously activated action plan) that is fundamentally different from the kind of established, sequential priming effect described by Ayars, in which a prime modulates (but does not fully suppress) a subsequent behavior. As noted in Morsella and Bargh (2011):

The level of activation of the plans involved in integrated action is far beyond that of “sub-threshold” activations. For example, in psycholinguistic research, there is substantial evidence that naming “dog” primes the action plan for naming a member of the same category (e.g., “horse”; Levelt 1989). The level of activation that we are speaking of in our definition of integrated action is far above this threshold—it is at the level of activation at which action plans would not only influence overt action but also trigger action. (p. 341)

They also state that “integrated action occurs when two (or more) action plans that could normally influence behavior on their own (when existing at that level of activation) are simultaneously co-activated and trying to influence the same skeletal muscle effector” (Morsella & Bargh 2011, p. 341). These actions occur when one holds one's breath, refrains from dropping a hot dish, suppresses the urge to scratch an itch, or makes oneself breathe faster for some reward (Morsella 2005; Morsella et al. 2009a). These are discrete, goal-directed actions (e.g., carrying an object, depressing a lever), behaviors Skinner (1953) characterized as operants (pp. 14–15). Based on ideomotor theory (e.g., Harleß 1861; James 1890), PFT proposes that the conscious contents associated with behavioral control are action outcomes in the world (e.g., a button depressed) or the body (e.g., fingers snapping). This addresses the second question by Rosenbaum.

Before returning to the experimental effect mentioned by Ayars, we should add that, through collective influence, the field allows for a massive many-to-one (or, at least, many-to-few) conversion (Merker 2013c), for there are many action-related contents but only a few operants that, at one time, direct behavior. This may reflect the mechanical limitations of the skeletomotor system, in which only one word can be uttered at a time (Wundt 1900) or, more precisely, in which only one (or a few) operant(s) can be expressed at one time. As noted by Rosenbaum, this aspect of PFT should eventually be integrated with research on the psychological refractory period (Pashler 1993; Welford 1952), which, though not directly concerning conscious processing, yields conclusions that are consonant with PFT. As Pashler (1993) concludes:

The limitations in carrying out stimulus-response tasks concurrently are not introduced at the level of stimulus perception, nor in production of the motor response. Those mental operations can work in parallel. Rather, the problem is in deciding what the response will be, and this kind of mental operation seems to be carried out in series – that is, one task at a time. (p. 52)

One of our aims is to first integrate PFT with this important research (along with research on the role of action control in attention; e.g., Allport 1989; Neumann 1987) and then isolate, within the architecture of PFT, the locus of the action-selection bottleneck. For now, PFT is consistent with Jackendoff's (1990) view that consciousness reflects some form of intermediate, action-planning stage in between sensory and motor processing. More specifically, we propose that consciousness is associated with stages that, though clearly subsequent to those of sensory processing (Hochberg 1998; Logothetis & Schall 1989; Marcel 1993), precede those of action selection. It seems that one is unaware of the computational products of action conflicts, resolutions that, should they exist (see Kaufman et al. 2015), determine the general course of observed action (e.g., pressing one button instead of another). Consciousness reflects action options, both those that are selected and those that are unselected.

The co-activation of action plans in integrated action can be indexed by behavioral and neural measures, as several behavioral and neural features distinguish integrated from unintegrated action. For example, like any behavior of low strength, conflicted action (a form of integrated action) is easier to perturb than un-conflicted or unintegrated action (Skinner 1953). Unlike the conscious field, overt behavior is “integrated” in the sense that only one discrete operant is manifest (Lin), even though behavior is influenced by two action plans, as in the Stroop incongruent condition. Unlike in the experimental effect described by Ayars, collective influence permits one operant to be expressed, leading to only one effect in the world (e.g., a button pressed), while another operant is (almost) fully suppressed, leading to no noticeable effects in the world (see Note 5). Interestingly, the suppressed operant in a conflict could be the prepotent plan, as in the case of holding one's breath. (Investigators have begun to examine the behavioral consequences of such unselected plans; Filevich & Haggard 2013.) With this in mind, we can respond to Jordan & Vinson's insightful point challenging the distinction between perception and action. First, in natural selection, it is overt behavior, and not musings and mentations, that is directly selected. The latter, along with mental simulations, are, in a sense, less costly than overt behavior. Importantly, behavioral inclinations can often be suppressed behaviorally but not mentally (Bargh & Morsella 2008). Second, the distinction between perception and action is an informative one because motor control, unlike perception, is largely unconsciously mediated. This is one of many differences between perception and action.

Relevant to the critiques by Blackmore, Lin, Mudrik, and Prinz about the explanatory power of PFT, the framework reveals that, unlike involuntary actions (e.g., dropping a hot dish because of the pain-withdrawal reflex), voluntary actions can be construed as a form of integrated action. Hence, PFT defines voluntary action in ways more informative than the common “homuncular” definition of these acts, namely, that an action is voluntary if the organism intended to do it. As noted by Passingham (1995), voluntary actions are special in that they can be suppressed; from the present standpoint, the act of suppression (e.g., suppressing a cough) is an archetypal integrated action. Again, this act is different from many forms of inhibition in the nervous system, many of which are unconscious, as in the case of lateral inhibition and negative priming. (Relevant to Ayars, PFT does not state that inhibition requires consciousness.) One might then argue that, instead of proposing a framework such as PFT, it is more parsimonious to hypothesize that the role of consciousness is to suppress actions, for holding one's breath or performing response interference tasks (e.g., the Stroop task) involves response suppression. However, this hypothesis fails to account for the role of consciousness in integrated actions such as breathing faster for some reward, which requires collective influence but not suppression.

Regarding subliminal stimuli, we agree that these controversial stimuli can influence subsequent behavior in one way or another (see review in Morsella & Bargh 2011). (Consistent with the conclusions of Lin, some have argued that subjects do perceive these stimuli but that, for some reason [e.g., confabulation or distortions of memory], they fail to report the conscious percept [Block 2007].) Moreover, unlike many, we do believe that these controversial stimuli can be regarded as unconscious. Interestingly, in some cases, the subject can be unconscious of the stimulus but aware of the skeletomotor urges it engenders (Morsella et al. 2011). This is consistent with the idea that one can be aware of skeletomotor inclinations (e.g., urges) while being unaware of the sources of those inclinations, as in many of the cases mentioned by Lathe and Seth, as found in research from our laboratory (Morsella et al. 2011), and as in the classic research by Nisbett and Wilson (1977), in which participants were unaware of the factors influencing their decisions (e.g., to aid a stranger). Thus, people can be conscious of tendencies (e.g., urges and cravings) but not necessarily of the factors engendering those tendencies (Baker et al. 2004; Nisbett & Wilson 1977). For example, the subjects in Nisbett and Wilson's experiments were certainly aware of their “urge” to, say, help a stranger who had collapsed. If the subjects had been physically unable to aid the stranger because the stranger was in a precarious environment, then they would certainly have reported that, though they suppressed the helping behavior, they nonetheless experienced the urge to help.
This inclination would be conscious even though the factors giving rise to it would be unconscious. PFT predicts that the operating principles within content generators (e.g., for urges) can be opaque to awareness, as in the unconscious factors that engender addiction-related urges (Baker et al. 2004). Accordingly, research has shown that people can have inexplicable “gut feelings” (or “somatic markers”; cf. Damasio et al. 1991) reflecting the response tendencies of systems whose inner workings and learning histories are opaque to awareness (LeDoux 2000; Öhman & Mineka 2001; Olsson & Phelps 2004). In short, the source of a response tendency is distinct from the awareness of that inclination.

Thus, in the kind of experiment mentioned by Ayars, subjects might be unaware of the stimuli but aware of the skeletomotor urges triggered by such stimuli. Desender et al. (2014) conclude that such awareness is required to counteract the interference effects from response conflict induced by subliminal or supraliminal stimuli. We have found such effects with subliminal stimuli (discussed in Morsella et al. 2011) but have found it challenging to prove unequivocally that the stimuli were unconscious (cf. Lin). In the kind of experiment described by Ayars, one must establish not only that subjects are unaware of the stimuli (and, should they exist, of the stimulus-elicited urges) but also that the contrast between the two conditions is not driven solely by a facilitatory effect from the congruent condition.

We should add that subliminal stimuli are also problematic because they are stimuli of very weak strength, unlike the kind of stimuli on which unconscious processes usually operate (Bargh & Morsella 2008). Hence, in several studies (e.g., Molapour et al. 2011; Morsella et al. 2009a; 2009c), we induced action-related urges by presenting supraliminal distractor stimuli in paradigms such as the Stroop and flanker tasks. Consistent with Nisbett and Wilson (1977), it is unclear whether subjects were aware of the source of these urges (see discussion in Morsella et al. 2009c). For example, though the urges arising from different flanker conditions were systematic (Morsella et al. 2009c), it seemed that subjects were unaware of why urges differed across conditions. To further investigate how action-related conscious contents can be triggered systematically and unintentionally by supraliminal stimuli, we developed the aforementioned RIT. With this paradigm, one can examine how unconscious processes operate over supraliminal stimuli. According to the traditions of Freud and Helmholtz, this is the usual way in which unconscious processes operate.

The conclusions on which PFT is based do not stem from controversial techniques such as visual masking (see Lin). Again, much of the evidence supporting PFT can be obtained in everyday scenarios. Regarding laboratory data, of the many conditions in interference paradigms, the strongest perturbations in consciousness (e.g., urges to err) are found in conditions involving the activation of incompatible skeletomotor plans (Morsella et al. 2009a; 2009c), such as the incongruent Stroop condition or the response interference (versus perceptual interference) condition of the flanker task (see the quantitative review of evidence in Morsella et al. [2011]). Conversely, when distinct processes lead to harmonious action plans, as when a congruent Stroop stimulus activates harmonious word-reading and color-naming plans (e.g., BLUE in blue font), there are few such perturbations in consciousness, and participants may even be unaware that more than one plan influenced overt action (e.g., uttering “blue”). This phenomenon, called synchrony blindness (Molapour et al. 2011), is perhaps more striking in the congruent (“pro-saccade”) condition of the anti-saccade task (Hallett 1978), in which distinct brain regions/processes indicate that the eyes should move in the same direction (cf. Morsella et al. 2012). Regarding the Stroop congruent condition, MacLeod and MacDonald (2000), after carefully reviewing the behavioral and psychophysiological data, conclude that, “The experimenter (perhaps the participant as well) cannot discriminate which dimension gave rise to the response on a given congruent trial” (p. 386).
Last, as mentioned in Note 8 of the target article, experiments have revealed that, in simpler tasks, incompatible skeletomotor intentions (e.g., to point right and left) do produce systematic intrusions into consciousness, but, as predicted by PFT, no such changes accompany smooth muscle conflicts or conflicts occurring at perceptual stages of processing (e.g., intersensory processing; Morsella et al. 2011).

We agree with Bridgeman that there are many perceptual events that are unconscious, as in the case of backward masking, saccadic suppression, change blindness, and changes in self-generated action that are below the just noticeable difference for proprioception (Jeannerod 2006). (The eye, discussed by Bridgeman, is interesting because it exhibits all of the kinds of actions contrasted in PFT: smooth muscle action [the pupillary reflex], involuntary skeletomotor action [e.g., a blink], and voluntary skeletomotor action [e.g., a wink].) PFT proposes not that all perceptual processes are conscious, but that motor control is unconscious and that the little that is conscious in the perception-to-action cycle is associated with perception. In short, one is trapped in the sensorium, but, even within it, one is not conscious of everything. Consistent with PFT, what one is conscious of regarding eye movements consists of the kinds of things that are important for adaptive action selection. Seeing the world as unstable, or the blurring of the retinal image, would not serve this end.

According to Ayars, D'Souza & Bremner, Keller, and Schwartz & Pournaghdali, PFT is too restrictive in proposing that the integrative role of consciousness is only for the skeletomotor output system. (For example, Seth astutely recommends that we extend PFT to include effects upon the autonomic nervous system.) We believe, first, that our restriction renders the framework more falsifiable and fecund and, second, that most of the strongest evidence corroborates it. Figuratively speaking, and in response to Lin, people tend not to experience conflict-related perturbations in consciousness while experiencing the McGurk effect, the ventriloquist effect, or conflict in the pupillary reflex (Morsella et al. 2009a), but such is not the case while people perform the Stroop task or exert self-control (Baumeister & Vohs 2004; Preston & Wegner 2009), both of which involve skeletomotor conflict. Regarding the related, important insights by Keller, PFT is consistent with the notion that something represented in the conscious field may incidentally, because of the unconscious afference with which the content is necessarily coupled (e.g., the sensory inputs in the McGurk effect), be linked with effects other than those upon skeletomotor action. For example, in indirect cognitive control, which is germane to the sailing-boat analogy by Hommel & Wiers, adult humans often rely on such effects. A process such as salivation, noted by Keller, could be controlled voluntarily only in such a sophisticated manner. Consistent with PFT, because conflicts involving salivation (or the pupillary reflex) do not involve the skeletomotor output system, one is oblivious to their existence.

Regarding the scope of PFT, it is worth noting that, with respect to the role of consciousness, PFT is less restrictive than Godwin et al. (2013), who propose that consciousness is a tool used not by all sub-systems in the skeletomotor output system, but only by what has been construed as the instrumental response system (Bindra 1974; 1978; Morsella 2005; Tolman 1948). Moreover, PFT does not go as far as Morsella (2005) in limiting the abilities of unconscious processes (e.g., in limiting the amount of unconscious “cross talk” between systems). However, PFT portrays consciousness as more passive than does Morsella (2005).

R8. Clues from PFT regarding the neural correlates of consciousness

Even though PFT is not about the neural mechanisms giving rise to consciousness (Marr's implementation level of analysis), the framework provides new clues regarding the matter and does rule out certain possibilities (e.g., panpsychism). Hence, though PFT does not attempt to solve the hard problem, it does restrict the space of candidate explanations.

One benefit of the architecture outlined in PFT is that, for it, the puzzle of the mind-body problem is the same whether (a) the tokens (conscious contents) differ from each other qualitatively or quantitatively, (b) there is one token or many (the former might be more theoretically tractable), or (c) the field is unitary or componential. The first point is important because we know that the brain can implement quantitative codes. For example, it has been proposed that consciousness depends on “precise synchronization of oscillatory neuronal responses in the high frequency range (beta, gamma)” (Singer 2011, p. 43). Uhlhaas et al. (2009) specify that the earliest signature of conscious processing is “the precise phase locking across a widely distributed cortical network” (p. 11). Singer (2011) adds that “brain states compatible with conscious processing should be characterized by a high degree of synchrony” (p. 43). Similar conclusions about the role of high frequencies (e.g., > 30 Hz) in consciousness can be found in other projects (Aru & Bachmann 2009; Crick & Koch 1990; Doesburg et al. 2005; 2009; Engel & Singer 2001; Hameroff 2010; Jung-Beeman et al. 2004; Meador et al. 2002; Panagiotaropoulos et al. 2012; Uhlhaas et al. 2009; Wessel et al. 2012).
More generally, it has been proposed that, to instantiate consciousness of any kind, the mode of interaction among regions (interregional synchrony) is as important as the nature and loci of the regions activated (Buzsáki 2006; Fries 2005; Hummel & Gerloff 2005; Lewis et al. 2012; Ward 2003).

There is less consensus regarding how, to instantiate the conscious field, high-frequency bands such as gamma (in the rat, ranging from 40 Hz to 100 Hz; Adrian 1942; Kay & Beshel 2010) must interact with ongoing, lower-frequency bands. It appears that these interactions between frequency bands are complex and dynamic. In addition, controversy continues regarding which brain regions are primarily responsible for the high-frequency brain rhythms linked to consciousness and whether cortical electroencephalography reflects consciousness (Merker 2012; 2013b).

Olfaction provides a portal for understanding the neural correlates of “additions” to the conscious field. (Olfaction was one of the first systems in which the nature of oscillatory activity in the brain was investigated; e.g., Adrian 1942.) Olfactory information may be encoded through oscillating neural assemblies (Adrian 1942; 1950a; 1950b; Eeckman & Freeman 1990; Freeman 1975; Kim et al. 2006; Laurent & Davidowitz 1994). Different odorants elicit different patterns across spatially distributed neural ensembles of the olfactory bulb (Freeman 1987; Laurent & Davidowitz 1994; Xu et al. 2000).

In our “creature in the cave” example, the smell of smoke is an addition to the conscious field that influences skeletomotor responses toward other conscious contents (e.g., the percept of the opening). Examining the neural correlates of such an addition reveals more evidence for the integration consensus (see Note 6). In olfaction, it is frequencies in the beta range (~15–30 Hz in the rat; Kay & Beshel 2010; Kay et al. 2009) that link olfactory processing to non-sensory, cognitive areas (Vanderwolf & Zibrowski 2001; Zibrowski & Vanderwolf 1997). Specifically, beta oscillations in the olfactory bulb “entrain” both areas of the piriform cortex, suggesting that beta oscillations may serve to transmit olfactory information from the olfactory bulb to higher-order, more cognitive areas, including cortical and subcortical areas. Consistent with this view, research outside of olfaction has found that beta may be involved in large-scale coupling for sensorimotor integration (Freeman 2007; Siegel et al. 2012).

In addition, Kay et al. (2009) propose that “beta oscillations are associated with motor models, favoring this oscillation as a good substrate for long-distance communication” (p. 7). Accordingly, beta coherence between the olfactory bulb and the hippocampus accompanies odor learning in a go/no-go task (Martin et al. 2007). (See Lathe's treatment of olfaction and the hippocampus.) It has been proposed that, though the higher frequency of gamma (in the rat, 40–100 Hz; Kay & Beshel 2010) can be observed in processing at primary sensory areas, when the sensory information becomes part of a wider network that includes activations from other sensory modalities, the frequencies fall in the beta range (Freeman 2007). (Mechanisms engendering gamma during odor perception reside within the olfactory bulb; Freeman 1979.) Based in part on such neural evidence, it has been hypothesized that one becomes conscious of an olfactory percept only when the representation is part of a wider network involving other systems (Cooney & Gazzaniga 2003), such as motor (Mainland & Sobel 2006) or semantic-linguistic (Herz 2003) systems. In line with this view, sensory research outside of olfaction has found evidence that beta may be involved in sensory gating (Hong et al. 2008) or in large-scale coupling for sensorimotor integration (Siegel et al. 2012).

Importantly, unlike gamma oscillations, oscillations in the beta range require the participation of (at least) the piriform cortex (Neville & Haberly 2003). (If the lateral olfactory tract is disrupted, gamma oscillations in the bulb persist; Gray & Skinner 1988.) The higher the task demand (e.g., fine discrimination vs. simple discrimination), the higher the gamma amplitude in early perceptual processing (Beshel et al. 2007; Stopfer et al. 1997). Accordingly, disturbing gamma oscillations in invertebrates impairs the discrimination of similar odors (a high task demand) but does not impair the discrimination of dissimilar odors (a low task demand; Stopfer et al. 1997).

Appreciation of the long-studied oscillatory properties of the olfactory system corroborates what has been observed in other sensory modalities (cf. Fries 2005; Sauseng & Klimesch 2008; Siegel et al. 2012; Singer 2011): (a) synchronization of high frequencies (e.g., gamma) in local (e.g., olfactory bulb) afferent processing (Bruns & Eckhorn 2004; Kay & Beshel 2010; von Stein & Sarnthein 2000), especially when the process is challenging (e.g., fine discrimination vs. simple discrimination; Kay & Beshel 2010); and (b) synchronization at a somewhat slower frequency range (e.g., beta or theta) for integration within a larger-scale cognitive network (Kay & Beshel 2010; Kay et al. 2009). (For a review of the neural correlates of olfactory consciousness, see Merrick et al. [2014].)

The foregoing reveals some conceptual progress regarding the neural correlates of consciousness. As noted by Merker, it is clear that isolating the neural correlates of olfactory consciousness will require further investigation. (It should be reiterated that, for good reasons, some have claimed that cortical electroencephalography does not reflect conscious processing; Merker 2013b; 2013c.) Critical for the study of the neural correlates of consciousness, and for the cortical-subcortical controversy, is Merker's insight about the necessary role of the dorsal pulvinar in olfactory consciousness.

As is evident in the commentaries, we believe that PFT will spur the field to think about the problem of how consciousness arises from nervous function in more theoretically driven and evolutionary-based ways.

R9. Conclusion

In phylogeny, there has been a trend toward the increased compartmentalization of function (Allman 2000). Having different brain circuits devoted to different kinds of tasks introduces the struggle of parts problem (Mayr 2001), which occurs when the introduction of new structures such as organs involves competitive interactions with extant ones. This problem may have increased the pressure for “many-to-one” solutions, including the conscious field. From our EASE perspective, although such a solution could conceivably occur without something like consciousness, such a possibility was not selected in evolutionary history, in which problems are sometimes solved by counterintuitive and suboptimal strategies (Dawkins 1982; Gould 1977; Mayr 2001; Roe & Simpson 1958). (We should reiterate that the intuitive ways of mapping sensations onto responses are actually computationally impossible; see Tsotsos [1995; 2011].)

The commentaries give us confidence that PFT, though based in part on established ideas from diverse fields of study, is a novel synthesis that advances understanding of the role of conscious states in nervous function. PFT attempts to redefine the nature of consciousness. One can propose that, if the heart can be conceptualized as a pump and the kidney as a filter, then consciousness can be conceptualized as a frame composed of tokens (e.g., the color blue, a smell, or pain) that are in a common format, a format that can be sampled only by the action systems of the skeletomotor output system. As an interface of sorts for the action system, the conscious field permits the response to a given content to be framed by the other contents composing the field. The physical basis of the frame associated with consciousness is most likely unlike anything we currently understand.

ACKNOWLEDGMENTS

Ezequiel Morsella dedicates this Response article to Robert M. Krauss, his doctoral advisor, who taught him all of the important lessons. This research was supported by David Matsumoto and the Center for Human Culture and Behavior at San Francisco State University. All authors are grateful for the assistance of Jed Katzel, T. Andrew Poehlman, Lawrence Williams, Pat Miller, Ryan Howell, Lucia Jacobs, Allison Allen, Sabrina Bhangal, Hyein Cho, Donish Cushing, Wei Dou, Reza Ghafur, Jessica McMillin, Pooya Razavi, Zaviera Reyes, and Anthony Velasquez.

Footnotes

1. One might argue that smooth muscle actions (e.g., the pupillary reflex) are not veritable forms of action and hence should not be contrasted with voluntary actions, which, to the conscious actor, feel like “real actions.” However, it is important to appreciate that, to an intelligent nonhuman observer (e.g., an imaginary, extraterrestrial ethologist), events such as the pupillary reflex would be worthy of being “coded” and jotted down as actions on an observation log. To an observer that is agnostic regarding our internal states, the pupillary reflex would appear as action-like as a wink, a blink, or the movements of a finger (Skinner 1953).

2. See neural evidence for this effect in Nath and Beauchamp (2012).

3. Conscious conflicts are often between a high-level system and a low-level system, but they may also be between (a) two low-level systems, as when one is thirsty and must drink painfully cold ice water (Morsella 2005), or (b) two high-level systems, as in the incongruent condition of the Stroop task.

4. In PFT, conscious contents can be construed as action options or, more precisely, as constraining dimensions (Morsella & Bargh 2010b), because that which is conscious reduces the space of possible skeletomotor action selection. Unlike with the traditional (circular) definition of conscious representation (which is defined only in terms of being conscious), here the representations are defined by more than their being conscious: They are also defined by their ability to constrain action selection in the skeletomotor output system. These conscious constraining dimensions are not involved in intersensory conflicts, intrasensory conflicts, or conflicts involving non-skeletal muscle effectors (Morsella et al. 2009a). Akin to a single steering wheel controlled by multiple agentic systems, the skeletomotor output system suffers from a particular kind of multi-determined guidance. Just as simple motor acts suffer from the “degrees of freedom” problem, because there are countless ways to instantiate a motor act such as grasping a handle (Rosenbaum 2002), so does action selection, for there are many action options. For action-goal selection, the challenge is met not by unconscious motor algorithms (as in the case of motor programming; Rosenbaum 2002), but by the involvement of the conscious field. In line with this view, Goodale and Milner (2004) conclude that “the primary role of [conscious] perceptual representations is not in the execution of actions, but rather in helping the person or animal arrive at a decision to act in a particular way” (p. 48).

5. Similarly, at the level of operant behavior, skeletomotor considerations are unaffected by, say, incentive states. For example, the actions of navigating through a maze or drawing a candy cane would be carried out in roughly the same manner regardless of the nature of the reward contingencies (Skinner 1953).

6. Supporting the integration consensus, findings in the field of anesthesiology suggest that anesthetic agents work on consciousness in part by halting the integration of information (Alkire et al. 2008; Lee et al. 2009; Mashour 2004; see related evidence in Boveroux et al. 2010; Långsjö et al. 2012; Lewis et al. 2012; Schröter et al. 2012; Schrouff et al. 2011). Regarding thalamic accounts of consciousness, some anesthetics can cause a reduction in thalamic blood flow and metabolism during the loss of consciousness, whereas other anesthetics result in increases in thalamic metabolism (e.g., ketamine) or in decreases in metabolism while the subject remains conscious (e.g., during sevoflurane sedation; cf. Alkire et al. 2008). Additionally, electroencephalographic studies have shown that, as soon as a subject loses consciousness, there is a marked change in cortical electroencephalography, while thalamic electroencephalography remains relatively unchanged for some minutes afterward. According to Alkire et al. (2008), this suggests that the thalamus may not be the sole locus of consciousness.
Investigations into feed-forward and feed-backward connectivity while under anesthesia suggest that conscious states are associated with fronto-parietal networks (Lee et al. Reference Lee, Kim, Noh, Choi, Hwang and Mashour2009). (See further discussion on anesthesia and consciousness in Poehlman et al. [2012].)

References

Ach, N. (1905/1951) Determining tendencies: Awareness. In: Organization and pathology of thought, ed. Rapaport, D., pp. 15–38. Columbia University Press. (Original work published in 1905).
Adrian, E. D. (1942) Olfactory reactions in the brain of the hedgehog. The Journal of Physiology 100(4):459–73.
Adrian, E. D. (1950a) Sensory discrimination: With some recent evidence from the olfactory organ. British Medical Bulletin 6(4):330–32.
Adrian, E. D. (1950b) The electrical activity of the mammalian olfactory bulb. Electroencephalography and Clinical Neurophysiology 2(1):377–88.
Alkire, M., Hudetz, A. & Tononi, G. (2008) Consciousness and anesthesia. Science 322(5903):876–80.
Allen, A. K., Wilkins, K., Gazzaley, A. & Morsella, E. (2013) Conscious thoughts from reflex-like processes: A new experimental paradigm for consciousness research. Consciousness and Cognition 22:1318–31.
Allman, J. M. (2000) Evolving brains. Scientific American Library.
Allport, D. A. (1989) Visual attention. In: Foundations of cognitive science, vol. 2, ed. Posner, M. I., pp. 631–82. MIT Press.
Anderson, J. (1983) The architecture of cognition. Harvard University Press.
Aru, J. & Bachmann, T. (2009) Occipital EEG correlates of conscious awareness when subjective target shine-through and effective visual masking are compared: Bifocal early increase in gamma power and speed-up of P1. Brain Research 1271:60–73.
Augustinova, M. & Ferrand, L. (2014) Automaticity of word reading: Evidence from the semantic Stroop paradigm. Current Directions in Psychological Science 23:343–48.
Baars, B. (1988) A cognitive theory of consciousness. Cambridge University Press.
Baars, B. J. (1997b) Some essential differences between consciousness and attention, perception, and working memory. Consciousness and Cognition 6:363–71.
Baars, B. J. (2002) The conscious access hypothesis: Origins and recent evidence. Trends in Cognitive Sciences 6(1):47–52.
Baker, T. B., Piper, M. E., McCarthy, D. E., Majeskie, M. R. & Fiore, M. C. (2004) Addiction motivation reformulated: An affective processing model of negative reinforcement. Psychological Review 111:33–51.
Banks, W. P. (1995) Evidence for consciousness. Consciousness and Cognition 4:270–72.
Bar, M., Kassam, K. S., Ghuman, A. S., Boshyan, J., Schmid, A. M., Dale, A. M., Hämäläinen, M. S., Marinkovic, K., Schacter, D. L., Rosen, B. R. & Halgren, E. (2006) Top-down facilitation of visual recognition. Proceedings of the National Academy of Sciences USA 103:449–54.
Bargh, J. A. & Morsella, E. (2008) The unconscious mind. Perspectives on Psychological Science 3:73–79.
Baumeister, R. F. & Vohs, K. D. (2004) Handbook of self-regulation: Research, theory, and applications. Guilford Press.
Beshel, J., Kopell, N. & Kay, L. M. (2007) Olfactory bulb gamma oscillations are enhanced with task demands. Journal of Neuroscience 27:8358–65.
Bhangal, S., Cho, H., Geisler, M. W. & Morsella, E. (2016) The prospective nature of voluntary action: Insights from the reflexive imagery task. Review of General Psychology 20:101–17.
Bindra, D. (1974) A motivational view of learning, performance, and behavior modification. Psychological Review 81:199–213.
Bindra, D. (1978) How adaptive behavior is produced: A perceptual-motivational alternative to response-reinforcement. Behavioral and Brain Sciences 1:41–91.
Block, N. (1995b) On a confusion about a function of consciousness. (Target article) Behavioral and Brain Sciences 18(2):227–47; discussion 247–87.
Block, N. (2007) Consciousness, accessibility, and the mesh between psychology and neuroscience. Behavioral and Brain Sciences 30:481–548.
Boveroux, P., Vanhaudenhuyse, A., Bruno, M. A., Noirhomme, Q., Lauwick, S., Luxen, A., Degueldre, C., Plenevaux, A., Schnakers, C., Phillips, C., Brichant, J. F., Bonhomme, V., Maquet, P., Greicius, M. D., Laureys, S. & Boly, M. (2010) Breakdown of within- and between-network resting state functional magnetic resonance imaging connectivity during propofol-induced loss of consciousness. Anesthesiology 113:1038–53.
Bruns, A. & Eckhorn, R. (2004) Task-related coupling from high- to low-frequency signals among visual cortical areas in human subdural recordings. International Journal of Psychophysiology 51(2):97–116.
Buzsáki, G. (2006) Rhythms of the brain. Oxford University Press.
Carlson, N. R. (1994) Physiology of behavior. Allyn and Bacon.
Cho, H., Zarolia, P., Gazzaley, A. & Morsella, E. (2016) Involuntary symbol manipulation (Pig Latin) from external control: Implications for thought suppression. Acta Psychologica 166:37–41.
Cisek, P. (2007) Cortical mechanisms of action selection: The affordance competition hypothesis. Philosophical Transactions of the Royal Society B 362:1585–99.
Cisek, P. & Kalaska, J. F. (2010) Neural mechanisms for interacting with a world full of action choices. Annual Review of Neuroscience 33:269–98.
Clark, A. (2002) Is seeing all it seems? Action, reason and the grand illusion. Journal of Consciousness Studies 9:181–202.
Cohen, M. A., Cavanagh, P., Chun, M. M. & Nakayama, K. (2012) The attentional requirements of consciousness. Trends in Cognitive Sciences 16:411–17.
Cooney, J. W. & Gazzaniga, M. S. (2003) Neurological disorders and the structure of human consciousness. Trends in Cognitive Sciences 7:161–66.
Cooper, A. D., Sterling, C. P., Bacon, M. P. & Bridgeman, B. (2012) Does action affect perception or memory? Vision Research 62:235–40.
Corr, P. J. & Morsella, E. (2015) The conscious control of behavior: Revisiting Gray's comparator model. In: Personality and control, vol. 4, ed. Corr, P. J., Fajkowska, M., Eysenck, M. W. & Wytykowska, A., pp. 15–42. Eliot Werner.
Crick, F. & Koch, C. (1990) Toward a neurobiological theory of consciousness. Seminars in the Neurosciences 2:263–75.
Crick, F. & Koch, C. (2000) The unconscious homunculus. In: Neural correlates of consciousness, ed. Metzinger, T., pp. 103–10. MIT Press.
Damasio, A. R., Tranel, D. & Damasio, H. C. (1991) Somatic markers and the guidance of behavior: Theory and preliminary testing. In: Frontal lobe function and dysfunction, ed. Levin, H. S., Eisenberg, H. M. & Benton, A. L., pp. 217–29. Oxford University Press.
Dawkins, R. (1982) The extended phenotype: The long reach of the gene. Oxford University Press.
Dehaene, S. (2014) Consciousness and the brain: Deciphering how the brain codes our thoughts. Viking.
Dennett, D. C. (1991) Consciousness explained. Little, Brown.
Desender, K., van Opstal, F. V. & van den Bussche, E. (2014) Feeling the conflict: The crucial role of conflict experience in adaptation. Psychological Science 25:675–83.
Doesburg, S. M., Green, J. L., McDonald, J. J. & Ward, L. M. (2009) Rhythms of consciousness: Binocular rivalry reveals large-scale oscillatory network dynamics mediating visual perception. PLoS ONE 4:e0006142.
Doesburg, S. M., Kitajo, K. & Ward, L. M. (2005) Increased gamma-band synchrony precedes switching of conscious perceptual objects in binocular rivalry. NeuroReport 16:1139–42.
Eeckman, F. H. & Freeman, W. J. (1990) Correlations between unit firing and EEG in the rat olfactory system. Brain Research 528(2):238–44.
Ehrsson, H. H. (2007) The experimental induction of out-of-body experiences. Science 317(5841):1048.
Eimer, M. & Schlaghecken, F. (2003) Response facilitation and inhibition in subliminal priming. Biological Psychology 64(1):7–26.
Einstein, A. & Infeld, L. (1938/1967) The evolution of physics. Cambridge University Press/Touchstone. (Original work published in 1938).
Engel, A. K. & Singer, W. (2001) Temporal binding and the neural correlates of sensory awareness. Trends in Cognitive Sciences 5:16–25.
Eriksen, C. W. & Schultz, D. W. (1979) Information processing in visual search: A continuous flow conception and experimental results. Perception and Psychophysics 25:249–63.
Filevich, E. & Haggard, P. (2013) Persistence of internal representations of alternative voluntary actions. Frontiers in Psychology 4, article 202. (Online journal). doi:10.3389/fpsyg.2013.00202.
Freeman, W. J. (1975) Mass action in the nervous system: Examination of the neurophysiological basis of adaptive behavior through the EEG. Academic Press.
Freeman, W. J. (1979) Nonlinear dynamics of paleocortex manifested in the olfactory EEG. Biological Cybernetics 35(1):21–37.
Freeman, W. J. (1987) Nonlinear neural dynamics in olfaction as a model for cognition. In: Dynamics of sensory and cognitive processing in the brain, ed. Basar, E., pp. 19–29. Springer-Verlag.
Freeman, W. J. (2004) William James on consciousness, revisited. Chaos and Complexity Letters 1:17–42.
Freeman, W. J. (2007) Indirect biological measures of consciousness from field studies of brains as dynamical systems. Neural Networks 20:1021–31.
Fries, P. (2005) A mechanism for cognitive dynamics: Neuronal communication through neuronal coherence. Trends in Cognitive Sciences 9:474–80.
Frijda, N. H. (1986) The emotions. Cambridge University Press.
Frith, C. D. (2010) What is consciousness for? Pragmatics and Cognition 18(3):497–551.
Glenn, L. L. & Dement, W. C. (1981) Membrane potential and input resistance of cat spinal motoneurons in wakefulness and sleep. Behavioural Brain Research 2:231–36.
Godwin, C. A., Gazzaley, A. & Morsella, E. (2013) Homing in on the brain mechanisms linked to consciousness: Buffer of the perception-and-action interface. In: The unity of mind, brain and world: Current perspectives on a science of consciousness, ed. Pereira, A. Jr. & Lehmann, D., pp. 43–76. Cambridge University Press.
Goodale, M. & Milner, D. (2004) Sight unseen: An exploration of conscious and unconscious vision. Oxford University Press.
Gould, S. J. (1977) Ever since Darwin: Reflections in natural history. Norton.
Gray, C. M. & Skinner, J. E. (1988) Centrifugal regulation of neuronal activity in the olfactory bulb of the waking rabbit as revealed by reversible cryogenic blockade. Experimental Brain Research 69:378–86.
Grossberg, S. (1999) The link between brain learning, attention, and consciousness. Consciousness and Cognition 8(1):1–44.
Haidt, J. (2001) The emotional dog and its rational tail: A social intuitionist approach to moral judgment. Psychological Review 108:814–34.
Hallett, P. E. (1978) Primary and secondary saccades to goals defined by instructions. Vision Research 18:1279–96.
Hameroff, S. (2010) The “conscious pilot” – dendritic synchrony moves through the brain to mediate consciousness. Journal of Biological Physics 36:71–93.
Harleß, E. (1861) Der Apparat des Willens [The apparatus of the will]. Zeitschrift für Philosophie und philosophische Kritik 38:499–507.
Helmholtz, H. von (1856/1961) Treatise of physiological optics: Concerning the perceptions in general. In: Classics in psychology, ed. Shipley, T., pp. 79–127. Philosophy Library. (Original work published in 1856).
Herz, R. S. (2003) The effect of verbal context on olfactory perception. Journal of Experimental Psychology: General 132:595–606.
Hochberg, J. (1998) Gestalt theory and its legacy: Organization in eye and brain, in attention and mental representation. In: Perception and cognition at century's end: Handbook of perception and cognition, 2nd edition, ed. Hochberg, J., pp. 253–306. Academic Press.
Hommel, B. (2013) Dancing in the dark: No role for consciousness in action control. Frontiers in Psychology 4, article 380. (Online journal). doi:10.3389/fpsyg.2013.00380.
Hong, L. E., Buchanan, R. W., Thaker, G. K., Shepard, P. D. & Summerfelt, A. (2008) Beta (~16 Hz) frequency neural oscillations mediate auditory sensory gating in humans. Psychophysiology 45(2):197–204.
Hughes, G., Velmans, M. & de Fockert, J. (2009) Unconscious priming of a no-go response. Psychophysiology 46:1258–69.
Hummel, F. & Gerloff, C. (2005) Larger interregional synchrony is associated with greater behavioral success in a complex sensory integration task in humans. Cerebral Cortex 15:670–78.
Jackendoff, R. S. (1990) Consciousness and the computational mind. MIT Press.
Jackson, F. (1986) What Mary didn't know. The Journal of Philosophy 83:291–95.
James, W. (1890) The principles of psychology, vols. 1 & 2. Holt/Dover.
Jeannerod, M. (2006) Motor cognition: What action tells the self. Oxford University Press.
Jung-Beeman, M., Bowden, E. M., Haberman, J., Frymiare, J. L., Arambel-Liu, S., Greenblatt, R., Reber, P. J. & Kounios, J. (2004) Neural activity when people solve verbal problems with insight. PLoS Biology 2:500–10.
Kaufman, M. T., Churchland, M. M., Ryu, S. I. & Shenoy, K. V. (2015) Vacillation, indecision and hesitation in moment-by-moment decoding of monkey cortex. eLife 4:e04677. doi:10.7554/eLife.04677.
Kay, L. M. & Beshel, J. (2010) A beta oscillation network in the rat olfactory system during a 2-alternative choice odor discrimination task. Journal of Neurophysiology 104:829–39.
Kay, L. M., Beshel, J., Brea, J., Martin, C., Rojas-Líbano, D. & Kopell, N. (2009) Olfactory oscillations: The what, how and what for. Trends in Neurosciences 32:207–14.
Kim, S., Singer, B. H. & Zochowski, M. (2006) Changing roles for temporal representation of odorant during the oscillatory response of the olfactory bulb. Neural Computation 18(4):794–816.
Kinsbourne, M. (1996) What qualifies a representation for a role in consciousness? In: Scientific approaches to consciousness, ed. Cohen, J. D. & Schooler, J. W., pp. 335–55. Erlbaum.
Kinsbourne, M. (2000) How is consciousness expressed in the cerebral activation manifold? Brain and Mind 2:265–74.
Koch, C. (2004) The quest for consciousness: A neurobiological approach. Roberts.
Koch, C. & Tsuchiya, N. (2007) Attention and consciousness: Two distinct brain processes. Trends in Cognitive Sciences 11:16–22.
Koch, C. (2014) Consciousness. Colloquium delivered at the Redwood Center for Theoretical Neuroscience, December 2, 2014. University of California.
Krauzlis, R. J., Bollimunta, A., Arcizet, F. & Wang, L. (2014) Attention as an effect not a cause. Trends in Cognitive Sciences 18:457–64.
Långsjö, J. W., Alkire, M. T., Kaskinoro, K., Hayama, H., Maksimow, A., Kaisti, K. K., Aalto, S., Aantaa, R., Jääskeläinen, S. K., Revonsuo, A. & Scheinin, H. (2012) Returning from oblivion: Imaging the neural core of consciousness. The Journal of Neuroscience 32:4935–43.
Lashley, K. S. (1956) Cerebral organization and behavior. Proceedings of the Association for Research in Nervous and Mental Diseases 36:1–18.
Laurent, G. & Davidowitz, H. (1994) Encoding of olfactory information with oscillating neural assemblies. Science 265(5180):1872–75.
LeDoux, J. E. (1996) The emotional brain: The mysterious underpinnings of emotional life. Simon and Schuster.
LeDoux, J. E. (2000) Emotion circuits in the brain. Annual Review of Neuroscience 23:155–84.
LeDoux, J. E. (2012) Rethinking the emotional brain. Neuron 73:653–76.
Lee, U., Kim, S., Noh, G. J., Choi, B. M., Hwang, E. & Mashour, G. (2009) The directionality and functional organization of frontoparietal connectivity during consciousness and anesthesia in humans. Consciousness and Cognition 18:1069–78.
Levelt, W. J. M. (1989) Speaking: From intention to articulation. MIT Press.
Lewis, L. D., Weiner, V. S., Mukamel, E. A., Donoghue, J. A., Eskandar, E. N., Madsen, J. R., Anderson, W., Hochberg, L. R., Cash, S. S., Brown, E. N. & Purdon, P. L. (2012) Rapid fragmentation of neuronal networks at the onset of propofol-induced unconsciousness. Proceedings of the National Academy of Sciences USA 109(49):E3377–86.
Livnat, A. & Pippenger, N. (2006) An optimal brain can be composed of conflicting agents. Proceedings of the National Academy of Sciences USA 103:3198–202.
Loewenstein, G. (1996) Out of control: Visceral influences on behavior. Organizational Behavior and Human Decision Processes 65:272–92.
Logan, G. D., Taylor, S. E. & Etherton, J. L. (1999) Attention and automaticity: Toward a theoretical integration. Psychological Research 62:165–81.
Logan, G. D., Yamaguchi, M., Schall, J. D. & Palmeri, T. J. (2015) Inhibitory control in mind and brain 2.0: Blocked-input models of saccadic countermanding. Psychological Review 122:115–47.
Logothetis, N. K. & Schall, J. D. (1989) Neuronal correlates of subjective visual perception. Science 245:761–62.
Lorenz, K. (1963) On aggression. Harcourt, Brace & World.
MacLeod, C. M. & MacDonald, P. A. (2000) Interdimensional interference in the Stroop effect: Uncovering the cognitive and neural anatomy of attention. Trends in Cognitive Sciences 4:383–91.
Macphail, E. M. (1998) The evolution of consciousness. Oxford University Press.
Mainland, J. D. & Sobel, N. (2006) The sniff is part of the olfactory percept. Chemical Senses 31:181–96.
Marcel, A. J. (1993) Slippage in the unity of consciousness. In: Experimental and theoretical studies of consciousness. Ciba Foundation Symposium 174, ed. Bock, G. R. & Marsh, J., pp. 168–80. Wiley.
Marr, D. (1982) Vision: A computational investigation into the human representation and processing of visual information. W. H. Freeman.
Martin, C., Beshel, J. & Kay, L. M. (2007) An olfacto-hippocampal network is dynamically involved in odor-discrimination learning. Journal of Neurophysiology 98(4):2196–205.
Mashour, G. A. (2004) Consciousness unbound: Toward a paradigm of general anesthesia. Anesthesiology 100:428–33.
Masicampo, E. J. & Baumeister, R. F. (2013) Conscious thought does not guide moment-to-moment actions – it serves social and cultural functions. Frontiers in Psychology 4, article 478. (Online journal). doi:10.3389/fpsyg.2013.00478.
Mayr, E. (2001) What evolution is. Weidenfeld & Nicolson.
McClelland, J. L. (1979) On the time-relations of mental processes: An examination of systems of processes in cascade. Psychological Review 86:287–330.
Meador, K. J., Ray, P. G., Echauz, J. R., Loring, D. W. & Vachtsevanos, G. J. (2002) Gamma coherence and conscious perception. Neurology 59:847–54.
Melzack, R. & Casey, K. L. (1968) Sensory, motivational, and central control determinants of pain: A new conceptual model. In: The skin senses, ed. Kenshalo, D. R., pp. 423–39. Charles C. Thomas.
Merker, B. (2007) Consciousness without a cerebral cortex: A challenge for neuroscience and medicine. Behavioral and Brain Sciences 30(1):63–81; discussion 81–134.
Merker, B. (2012) From probabilities to percepts: A subcortical “global best estimate buffer” as locus of phenomenal experience. In: Being in time: Dynamical models of phenomenal experience, ed. Shimon, E., Tomer, F. & Zach, N., pp. 37–80. John Benjamins.
Merker, B. (2013c) The efference cascade, consciousness, and its self: Naturalizing the first person pivot of action control. Frontiers in Psychology 4, article 501:1–20. (Online journal). doi:10.3389/fpsyg.2013.00501.
Merrick, C., Farnia, M., Jantz, T. K., Gazzaley, A. & Morsella, E. (2015) External control of the stream of consciousness: Stimulus-based effects on involuntary thought sequences. Consciousness and Cognition 33:217–25.
Merrick, M. C., Godwin, C. A., Geisler, M. W. & Morsella, E. (2014) The olfactory system as the gateway to the neural correlates of consciousness. Frontiers in Psychology 4, article 1011. (Online journal). doi:10.3389/fpsyg.2013.01011.
Miller, N. E. (1959) Liberalization of basic S-R concepts: Extensions to conflict behavior, motivation, and social learning. In: Psychology: A study of a science, vol. 2, ed. Koch, S., pp. 196–292. McGraw-Hill.
Minsky, M. (1985) The society of mind. Simon and Schuster.
Molapour, T., Berger, C. C. & Morsella, E. (2011) Did I read or did I name? Process blindness from congruent processing “outputs.” Consciousness and Cognition 20:1776–80.
Morsella, E. (2005) The function of phenomenal states: Supramodular interaction theory. Psychological Review 112:1000–21.
Morsella, E. & Bargh, J. A. (2010b) What is an output? Psychological Inquiry 21:354–70.
Morsella, E. & Bargh, J. A. (2011) Unconscious action tendencies: Sources of “un-integrated” action. In: The handbook of social neuroscience, ed. Cacioppo, J. T. & Decety, J., pp. 335–47. Oxford University Press.
Morsella, E., Berger, C. C. & Krieger, S. C. (2011) Cognitive and neural components of the phenomenology of agency. Neurocase 17:209–30.
Morsella, E., Gray, J. R., Krieger, S. C. & Bargh, J. A. (2009a) The essence of conscious conflict: Subjective effects of sustaining incompatible intentions. Emotion 9:717–28.
Morsella, E., Wilson, L. E., Berger, C. C., Honhongva, M., Gazzaley, A. & Bargh, J. A. (2009c) Subjective aspects of cognitive control at different stages of processing. Attention, Perception and Psychophysics 71:1807–24.
Morsella, E., Zarolia, P. & Gazzaley, A. (2012) Cognitive conflict and consciousness. In: Cognitive consistency: A unifying concept in social psychology, ed. Gawronski, B. & Strack, F., pp. 19–46. Guilford Press.
Mudrik, L., Faivre, N. & Koch, C. (2014) Information integration without awareness. Trends in Cognitive Sciences 18(9):488–96.
Munakata, Y., Herd, S. A., Chatham, C. H., Depue, B. E., Banich, M. T. & O'Reilly, R. C. (2011) A unified framework for inhibitory control. Trends in Cognitive Sciences 15:453–59.
Nagasako, E. M., Oaklander, A. L. & Dworkin, R. H. (2003) Congenital insensitivity to pain: An update. Pain 101:213–19.
Nagel, T. (1974) What is it like to be a bat? Philosophical Review 83:435–50.
Nath, A. R. & Beauchamp, M. S. (2012) A neural basis for interindividual differences in the McGurk effect, a multisensory speech illusion. NeuroImage 59:781–87.
Neumann, O. (1987) Beyond capacity: A functional view of attention. In: Perspectives on perception and action, ed. Heuer, H. & Sanders, A. F., pp. 361–94. Erlbaum.
Neville, K. R. & Haberly, L. B. (2003) Beta and gamma oscillations in the olfactory system of the urethane-anesthetized rat. Journal of Neurophysiology 90(6):3921–30.
Nisbett, R. E. & Wilson, T. D. (1977) Telling more than we can know: Verbal reports on mental processes. Psychological Review 84:231–59.
Oberauer, K. & Hein, L. (2012) Attention to information in working memory. Current Directions in Psychological Science 21:164–69.
Öhman, A. & Mineka, S. (2001) Fears, phobias, and preparedness: Toward an evolved module of fear and fear learning. Psychological Review 108:483–522.
Olsson, A. & Phelps, E. A. (2004) Learned fear of “unseen” faces after Pavlovian, observational, and instructed fear. Psychological Science 15:822–28.
Panagiotaropolous, T. I., Deco, G., Kapoor, V. & Logothetis, N. K. (2012) Neuronal discharges and gamma oscillations explicitly reflect visual consciousness in the lateral prefrontal cortex. Neuron 74:924–35.
Pashler, H. (1993) Doing two things at the same time. American Scientist 81:48–55.
Passingham, R. (1995) The frontal lobes and voluntary action. Oxford University Press.
Pinker, S. (1997) How the mind works. Norton.
Poellinger, A., Thomas, R., Lio, P., Lee, A., Makris, N., Rosen, B. R. & Kwong, K. K. (2001) Activation and habituation in olfaction – an fMRI study. NeuroImage 13(4):547–60.
Poehlman, T. A., Jantz, T. K. & Morsella, E. (2012) Adaptive skeletal muscle action requires anticipation and “conscious broadcasting.” Frontiers in Psychology 3, article 369. (Online journal). doi:10.3389/fpsyg.2012.00369.
Preston, J. & Wegner, D. M. (2009) Elbow grease: The experience of effort in action. In: Oxford handbook of human action, ed. Morsella, E., Bargh, J. A. & Gollwitzer, P. M., pp. 469–86. Oxford University Press.
Prinz, J. (2007) The intermediate level theory of consciousness. In: The Blackwell companion to consciousness, ed. Velmans, M. & Schneider, S., pp. 248–60. Blackwell.
Prinz, W. (2003b) How do we know about our own actions? In: Voluntary action: Brains, minds, and sociality, ed. Maasen, S., Prinz, W. & Roth, G., pp. 21–33. Oxford University Press.
Prinz, W. (2012) Open minds: The social making of agency and intentionality. MIT Press.
Puttemans, V., Wenderoth, N. & Swinnen, S. P. (2005) Changes in brain activation during the acquisition of a multifrequency bimanual coordination task: From the cognitive stage to advanced levels of automaticity. Journal of Neuroscience 25:4270–78. doi:10.1523/JNEUROSCI.3866-04.2005.
Ramachandran, V. S. (1999) Phantoms in the brain: Probing the mysteries of the human mind. Harper Perennial.
Riddle, T. A., Rosen, H. J. & Morsella, E. (2015) Is that me? Sense of agency as a function of intra-psychic conflict. Journal of Mind and Behavior 36:27–46.
Roe, A. & Simpson, G. G. (1958) Behavior and evolution. Yale University Press.
Roser, M. & Gazzaniga, M. S. (2004) Automatic brains—interpretive minds. Current Directions in Psychological Science 13:56–59.
Rosenbaum, D. A. (2002) Motor control. In: Stevens' handbook of experimental psychology: Vol. 1. Sensation and perception, 3rd edition, ed. Yantis, S., pp. 315–39. [Series editor: H. Pashler]. Wiley.
Sauseng, P. & Klimesch, W. (2008) What does phase information of oscillatory brain activity tell us about cognitive processes? Neuroscience and Biobehavioral Reviews 32(5):1001–13.
Schlaghecken, F., Bowman, H. & Eimer, M. (2006) Dissociating local and global levels of perceptuo-motor control in masked priming. Journal of Experimental Psychology: Human Perception and Performance 32(3):618–32.
Schroter, M., Spoormaker, V., Schorer, A., Wohlschlager, A., Czisch, M., Kochs, E., Zimmer, C., Hemmer, B., Schneider, G., Jordan, D. & Ilg, R. (2012) Spatiotemporal reconfiguration of large-scale brain functional networks during propofol-induced loss of consciousness. Journal of Neuroscience 32:12832–40.
Schrouff, J., Perlbarg, V., Boly, M., Marrelec, G., Boveroux, P., Vanhaudenhuyse, A., Bruno, M. A., Laureys, S., Phillips, C., Pélégrini-Isaac, M., Maquet, P. & Benali, H. (2011) Brain functional integration decreases during propofol-induced loss of consciousness. NeuroImage 57:198–205.
Searle, J. R. (2000) Consciousness. Annual Review of Neuroscience 23:557–78.
Selfridge, O. G. (1959) Pandemonium: A paradigm for learning. In: Mechanization of thought processes: Proceedings of a symposium held at the National Physics Laboratory, November 1958, vol. 1, ed. Blake, D. V. & Uttley, A. J., pp. 511–26. Her Majesty's Stationery Office.
Sergent, C. & Dehaene, S. (2004) Is consciousness a gradual phenomenon? Evidence for an all-or-none bifurcation during the attentional blink. Psychological Science 15:720–28.
Shallice, T. (1972) Dual functions of consciousness. Psychological Review 79:383–93.
Shepherd, G. M. (2006) Smell images and the flavour system in the human brain. Nature 444(7117):316–21.
Shepherd, G. M. (2007) Perspectives on olfactory processing, conscious perception, and orbitofrontal cortex. Annals of the New York Academy of Sciences 1121:87–101.
Siegel, M., Donner, T. H. & Engel, A. K. (2012) Spectral fingerprints of large-scale neuronal interactions. Nature Reviews Neuroscience 13:121–34.
Simons, D. J. & Levin, D. T. (1997) Change blindness. Trends in Cognitive Sciences 1:261–67.
Simpson, G. G. (1949) The meaning of evolution. Yale University Press.
Singer, W. (2011) Consciousness and neuronal synchronization. In: The neurology of consciousness, ed. Laureys, S. & Tononi, G., pp. 43–52. Academic Press.
Skinner, B. F. (1953) Science and human behavior. Macmillan.
Sobel, N., Prabhakaran, V., Zhao, Z., Desmond, J. E., Glover, G. H., Sullivan, E. V. & Gabrieli, J. D. (2000) Time course of odorant-induced activation in the human primary olfactory cortex. Journal of Neurophysiology 83(1):537–51.
Sperry, R. W. (1952) Neurology and the mind-brain problem. American Scientist 40:291–312.
Stevenson, R. J. (2009) Phenomenal and access consciousness in olfaction. Consciousness and Cognition 18:1004–17. doi:10.1016/j.concog.2009.09.005.
Stopfer, M., Bhagavan, S., Smith, B. H. & Laurent, G. (1997) Impaired odour discrimination on desynchronization of odour-encoding neural assemblies. Nature 390(6655):70–74.
Suhler, C. L. & Churchland, P. S. (2009) Control: Conscious and otherwise. Trends in Cognitive Sciences 13:341–47.
Tetlock, P. E. (2002) Social functionalist frameworks for judgment and choice: Intuitive politicians, theologians, and prosecutors. Psychological Review 109:451–71.
Thagard, P. & Stewart, T. C. (2014) Two theories of consciousness: Semantic pointer competition vs. information integration. Consciousness and Cognition 30:73–90.
Thorndike, E. L. (1905) The functions of mental states. In: The elements of psychology, ed. Thorndike, E. L., pp. 111–19. A. G. Seiler.
Tipper, S. P. (1985) The negative priming effect: Inhibitory priming by ignored objects. The Quarterly Journal of Experimental Psychology 37A:571–90.
Tolman, E. C. (1948) Cognitive maps in rats and men. Psychological Review 55:189–208.
Tsotsos, J. K. (1995) Behaviorist intelligence and the scaling problem. Artificial Intelligence 75:135–60.
Tsotsos, J. K. (2011) A computational perspective on visual attention. MIT Press.
Uhlhaas, P. J., Pipa, G., Lima, B., Melloni, L., Neuenschwander, S., Nikolic, D. & Singer, W. (2009) Neural synchrony in cortical networks: History, concept and current status. Frontiers in Integrative Neuroscience 3, article 17. (Online journal). doi:10.3389/neuro.07.017.2009.
Van Opstal, F., Gevers, W., Osman, M. & Verguts, T. (2010) Unconscious task application. Consciousness and Cognition 19:999–1006.
Vanderwolf, C. H. & Zibrowski, E. M. (2001) Pyriform cortex beta-waves: Odor-specific sensitization following repeated olfactory stimulation. Brain Research 892:301–308.
von Stein, A. & Sarnthein, J. (2000) Different frequencies for different scales of cortical integration: From local gamma to long range alpha/theta synchronization. International Journal of Psychophysiology 38(3):301–13.
Ward, L. M. (2003) Synchronous neural oscillations and cognitive processes. Trends in Cognitive Sciences 7:553–59.
Welford, A. T. (1952) The “psychological refractory period” and the timing of high-speed performance: A review and a theory. British Journal of Psychology 43:2–19.
Werner, H. & Kaplan, B. (1963) Symbol formation. Wiley.
Wessel, J. R., Haider, H. & Rose, M. (2012) The transition from implicit to explicit representations in incidental learning situations: More evidence from high-frequency EEG coupling. Experimental Brain Research 217:153–62.
Wilke, M., Mueller, K.-M. & Leopold, D. A. (2009) Neural activity in the visual thalamus reflects perceptual suppression. Proceedings of the National Academy of Sciences USA 106:9465–70. doi:10.1073/pnas.0900714106.
Woodman, G. F. & Vogel, E. K. (2005) Fractionating working memory: Consolidation and maintenance are independent. Psychological Science 16:106–13.
Wundt, W. (1902/1904) Principles of physiological psychology, trans. Titchener, E. B. Sonnenschein. (Original work published in 1902; English translation from the 5th German edition [1904]).
Xu, F., Greer, C. A. & Shepherd, G. M. (2000) Odor maps in the olfactory bulb. Journal of Comparative Neurology 422(4):489–95.
Zibrowski, E. M. & Vanderwolf, C. H. (1997) Oscillatory fast wave activity in the rat pyriform cortex: Relations to olfaction and behavior. Brain Research 766:39–49.
Figure R1. The conscious field, with a different medley of conscious contents at each moment in time. Each of the three conscious fields, representing three different moments in time, possesses its own configuration of conscious contents (the filled shapes). One conscious content (e.g., the triangle) can be a sound; another conscious content (e.g., the square) can be an olfactory stimulus or an action-related urge.

Figure R2. The conscious field, with a different medley of conscious contents (filled shapes) at each moment in time. The conscious field during the third moment includes the percept “da,” induced by the intersensory McGurk illusion. Another conscious content could be a phonological representation (e.g., /haus/) triggered by seeing a word (e.g., “HOUSE”). The lines feeding into each conscious field represent the unconscious and often polysensory configurations of afference that are involved in the generation of each content.

Figure R3. At one moment in time, the conscious contents are apprehended by the unconscious action mechanisms of the skeletal muscle output system. Each mechanism (represented by a gray sensor) is associated with a certain kind of (unconsciously mediated) action (e.g., articulating vs. reaching), which is signified by the Rs (for Responses).