Perhaps the best instantiation of a minimalist theory of grammar is to show that there is no grammar to begin with. William O'Grady's Syntactic carpentry follows this strong course and argues for an emergentist approach to language, which differs from most variants of generative grammar. In O'Grady's emergentist approach, syntactic theory is subsumed under a generalized theory of processing, with the ultimate aim of discarding Universal Grammar (UG) as a whole. This grammar-as-processing theory of language leaves no room for an autonomous system of generative rules (cf. also Hawkins 1994); instead, language emerges as a product of efficient processing, reflecting the general, non-language-specific cognitive capacity of human beings. In other words, no constructional blueprint for sentence formation is assumed; native speakers are ‘language carpenters’ who design a sentence by combining lexical items. Sentence building is understood to be a target-oriented task that needs to be completed at the first opportunity, regulated by the limits of our efficiency-driven processing (30). As O'Grady sets out to show, ‘efficiency is the driving force behind the design and operation of the computational system for human language’ (2).
The concept of emergentism is related to connectionism, which models learning and cognition in terms of networks of neuron-like units. Both approaches pursue the strong claim that symbolic representations (including syntactic representations) do not play any role in the computation of language. In (O'Grady's version of) emergentism, the computational routines of linguistic knowledge rest on efficiency-driven processing constraints.
Chapter 1, ‘Language without grammar’, and chapter 2, ‘More on structure building’, provide the background for O'Grady's subsequent analyses and illustrate the basic formalism of his emergentist approach. Here, O'Grady addresses the basic dichotomy of the lexicon and the computational system. The set of properties specified for a given lexical item includes its syntactic category and a selectional grid that lists the arguments subcategorized for by that item. In addition, the lexical entry contains information about the thematic role of each argument and spells out its linear position with respect to the selecting category, similar to the modelling of the functor–argument distinction in Categorial Grammar (Steedman 1996). The example in (1) shows the lexical entries for the verbs carry and hop and the preposition on, respectively.
(1)
While O'Grady seems to assume a minimal computational system, this system, as he repeatedly asserts, does not include an operation like Chomsky's (1995) Merge, which derives complex structures by recursively combining two syntactic objects. The major task for the computational system, according to O'Grady, is to resolve the dependency relations stated in the lexicon, such as verb–argument relations, subject–agreement relations and coindexation, as soon as possible.
The task is achieved by combining two lexical items, where ‘Combine’ is understood as a processing operation that resolves argument dependencies (‘Resolve’) and builds sentences left to right. The following example illustrates the derivation of Mary speaks French (8f.).
(2)
In (2), Mary is the argument, whereas speaks is the functor. Combine applies to Mary and speaks and resolves the verb's first argument dependency, combining the verb with the nominal to its left (N_i). Sentence building continues, and Combine applies to speaks and French, thereby resolving the verb's second argument dependency, combining the verb with the nominal to its right (N_j). Each individual step of Combine creates a new syntactic representation, although in O'Grady's conception, syntactic representations ‘are just a fleeting residual record of how the computational system goes about its work’ (9). In other words, syntactic representations have no computational status at all; rather, they are a list of footprints left by efficient processing.
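The left-to-right Combine-and-Resolve routine can be sketched in a few lines of code. This is my own illustrative encoding, not O'Grady's formalism: lexical items carry pending argument dependencies as (category, side) pairs, and each application of Combine resolves a dependency at the first opportunity.

```python
from dataclasses import dataclass

@dataclass
class Item:
    form: str
    cat: str
    deps: list  # pending argument dependencies as (category, side) pairs

def without(deps, dep):
    """Remove a single occurrence of a resolved dependency."""
    rest = list(deps)
    rest.remove(dep)
    return rest

def combine(a, b):
    """Combine two items left to right, resolving a dependency if possible."""
    if (b.cat, 'right') in a.deps:        # a selects b to its right
        cat, deps = a.cat, without(a.deps, (b.cat, 'right')) + b.deps
    elif (a.cat, 'left') in b.deps:       # b selects a to its left
        cat, deps = b.cat, without(b.deps, (a.cat, 'left')) + a.deps
    else:                                  # nothing resolved; merely group
        cat, deps = b.cat, a.deps + b.deps
    return Item(f'[{a.form} {b.form}]', cat, deps)

mary   = Item('Mary', 'N', [])
speaks = Item('speaks', 'V', [('N', 'left'), ('N', 'right')])
french = Item('French', 'N', [])

step1 = combine(mary, speaks)    # resolves the verb's first dependency (N_i)
step2 = combine(step1, french)   # resolves the second dependency (N_j)
print(step2.form, step2.deps)    # [[Mary speaks] French] []
```

Note that each step leaves behind nothing but a bracketing, in keeping with O'Grady's view of representations as a residual record of the computation.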
Since O'Grady's emergentist theory does not recognize empty categories, the grammatical relation between functor and arguments needs to be explicitly expressed at each step of Combine and Resolve. For instance, in double-object constructions in which a ditransitive verb selects for three arguments, we observe that the verb (e.g. teach) combines with its third argument directly (16f.), cf. (3).
(3)
The claim that a ditransitive verb and its third argument form a discontinuous constituent is supported by the existence of various idiomatic expressions containing the verb and the theme, such as give X a hard time, lend X a hand (18).
Although Combine does not always resolve dependency relations, the efficiency-driven processor has no alternative but to combine two syntactic items for the purpose of minimizing the burden on working memory. In the sentence Jerry quickly succeeded, in which the adverb quickly intervenes between the subject and the verb, no formal relation is established between Jerry and quickly. Yet Combine nonetheless applies to the nominal and the adverb. In a second step, quickly combines with succeeded and resolves the adverb's dependency on a verbal category. What is different from the derivation in (3) is that the verb's argument dependency is passed upward (an operation which O'Grady terms ‘feature passing’) until it can be resolved by the required nominal (Jerry).
(4)
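The role the mother node plays in feature passing can likewise be sketched, again with toy data structures of my own devising: when Combine groups two items without resolving anything, the unresolved dependency is carried upward and later checked against material already inside the grouping.

```python
from dataclasses import dataclass

@dataclass
class Node:
    form: str
    deps: list    # unresolved dependencies, passed upward
    cats: list    # categories of the lexical items dominated by this node

def group(a, b):
    """The mother node is a mere grouping, but it must host passed features."""
    return Node(f'[{a.form} {b.form}]', a.deps + b.deps, a.cats + b.cats)

def resolve_passed(node):
    """Resolve passed dependencies against categories inside the grouping."""
    for dep in list(node.deps):
        cat, _side = dep
        if cat in node.cats:
            node.deps.remove(dep)
    return node

jerry     = Node('Jerry', [], ['N'])
quickly   = Node('quickly', [('V', 'right')], ['Adv'])
succeeded = Node('succeeded', [('N', 'left')], ['V'])

step1 = group(jerry, quickly)       # no dependency resolved yet
step2 = resolve_passed(group(step1, succeeded))
print(step2.form, step2.deps)       # [[Jerry quickly] succeeded] []
```

Even in this toy version, resolution has to consult the contents of the grouping, i.e. the mother node does real computational work.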
In chapter 3, ‘Pronoun interpretation’, O'Grady claims that referential dependencies are resolved in exactly the same manner as argument dependencies and argues that Binding Theory, which defines the licensing of coindexical relations in terms of structural relations, can be dispensed with. In Harvey admires himself, an argument dependency is established between Harvey and admires at the first step of Combine, followed by an argument dependency between admires and himself. The referential dependency introduced by the reflexive pronoun results in coindexation between Harvey and himself on the assumption that admires copies the referential index from Harvey, which is immediately transferred to himself as a result of efficient processing. According to O'Grady, efficiency-driven processing provides a better explanation than Binding Theory for the locality of coindexation: in John_i thinks Jerry_j overestimates himself_*i/j, coindexation between himself and the closer antecedent Jerry minimizes the burden on working memory, and hence the referential dependency between Jerry and himself is preferred over the long-distance relation between John and himself. O'Grady further follows standard binding-theoretic approaches in arguing that plain pronouns should not be treated analogously to anaphors. In his emergentist framework, the distinction between Principle A of the Binding Theory, according to which an anaphor requires a c-commanding antecedent in its minimal domain, and Principle B, which states that a plain pronoun cannot have a c-commanding antecedent within the same domain, is ascribed to the pragmatic realm.
Chapter 4, ‘Control’, and chapter 5, ‘“Raising” structures’, discuss the analysis of control and raising constructions. Dispensing with a movement analysis and the postulation of null elements such as PRO in control structures, O'Grady claims that it is referential dependencies that account for the fact that, in a structure like John decided to leave, John is the thematic subject of leave. More specifically, he proposes that the referential index of John is copied to decided and immediately transferred to to leave. In subject infinitival clauses, such as To leave early embarrasses the hosts, in which no referential dependency can be established for leave, the computational system has to resort to pragmatic information and the subject of the control infinitive is rendered ambiguous (i.e. it is interpreted as either ‘for anyone to leave now’ or ‘for us to leave now’). Raising constructions, on the other hand, such as John seems to tend to win, which are standardly taken to involve successive movement, are derived by successive downward transferal of a referential dependency from the matrix predicate (seems) to the embedded predicates (to tend, to win). As a result, it is the same processing principle, viz. a referential dependency between the infinitival verb and the subject, that accounts for both control and raising constructions. What determines the difference between the two structures is their distinctive lexical properties. Control verbs determine the thematic role for their subject argument, while raising verbs do not.
Chapter 6, ‘Agreement’, introduces agreement dependencies. For instance, in the sentence John runs, the agreement dependency of runs needs to be resolved by John via Combine. Agreement dependencies are secondary to argument dependencies. Since argument dependencies are the driving force for agreement, they provide a clue as to why agreement in questions (e.g. Who are/*is the boys sitting next to?) and expletive constructions (e.g. There are/*is three men in the garden) occurs with the ‘associate’ subject rather than with the wh-phrase or the expletive: the arguments in questions and expletive constructions are to the right of the predicates, not to their left as in most other cases. That agreement dependencies are resolved at the first opportunity is further instantiated in coordination, in which the predicate may agree with the first instead of the second conjunct (e.g. There is water and sand on the floor; 104). Partial agreement cases in Moroccan Arabic and Brazilian Portuguese are claimed to provide strong empirical support for O'Grady's approach.
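First-opportunity agreement admits an equally minimal sketch (a toy encoding of my own, assuming that agreement simply copies the number feature of the first nominal the processor meets to the verb's right):

```python
def agree(post_verbal_number_features):
    """Copy the number of the first nominal encountered, then stop."""
    first = post_verbal_number_features[0]   # first opportunity wins
    return 'is' if first == 'sg' else 'are'

# 'There are three men in the garden': the first nominal is plural
print(agree(['pl']))          # are
# 'There is water and sand on the floor': the first conjunct is singular
print(agree(['sg', 'sg']))    # is
```

On this view, first-conjunct agreement falls out for free: the processor never looks past the first resolvable nominal.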
Chapter 7 focuses on ‘Wh questions’. O'Grady assumes that wh-words contain a wh-dependency feature which needs to be resolved by a neighboring predicate. He argues that a fundamental advantage of his processing approach lies in its treatment of island effects. For O'Grady, island effects arise because wh-dependencies of a given wh-word are not resolved immediately, which increases the burden on working memory in sentence building.
Chapter 8, ‘The syntax of contraction’, claims that contractions (e.g. I've, she's) provide further support for the dependency between a subject and a verb, and that the phonological reduction observed suggests that the two elements are indeed combined at some point in the course of sentence formation. In chapter 9, ‘Syntax and processing’, O'Grady argues that the idea that argument dependencies must be resolved at the first opportunity is supported by findings from numerous psycholinguistic studies, including cross-modal priming tests for reactivation of antecedents, self-paced reading tasks and Event-Related Potential (ERP) experiments.
In chapter 10, ‘Language acquisition’, O'Grady suggests that language acquisition is facilitated by efficient processing that minimizes the burden on working memory, and that linguistic development can be explained by the emergence of ‘computational routines’ defined by the efficient language processor, i.e. operations that are needed to form and/or interpret particular sentences, thus making it possible to dispense with UG. O'Grady differs from other critics of generative grammar in that he also stresses that language acquisition is not achieved by experience alone, given the poverty of the input and the speed with which children master language. The connection between minimization of the demands placed on working memory and language production/comprehension is strengthened by cases of agrammatism, in which agrammatic aphasia results from ‘a reduction in the working memory resources available for sentence processing’ (202). The book ends with ‘Concluding remarks’ in chapter 11.
O'Grady's book argues forcefully that syntax and, what is more, grammar should be entirely dispensed with in favor of task-oriented, efficiency-driven processing. It should be noted that O'Grady's critique of a human language faculty as a generative device echoes ideas voiced since the inception of generative grammar in the 1950s, and this book does not endorse a radically different stance from the one familiar from the literature on language processing and psycholinguistics. Yet the book deserves particular attention from syntacticians who take any considerations of linguistic function to be external to the computational system of grammar. O'Grady's claim that syntacticians are working on something that virtually does not exist merits serious consideration and, possibly, criticism, rather than simply being ignored.
Linguistic emergentism raises a number of questions. The most serious question for me as a syntactician concerns the notion of phrasal categories in this theory. In O'Grady's approach, phrasal categories have no independent status in computation, i.e. the computational system combines and resolves the argument dependencies between two items at the first opportunity, without postulating any intermediate or maximal projections. However, central to O'Grady's theory is the mechanism of feature passing, which forces recourse to a higher level of syntactic organization. Is there any way to resolve the argument dependency between Jerry and succeeded in Jerry quickly succeeded without making use of a mother node as a site for feature passing? Readers should bear in mind that the mother node, as indicated in the book's tree diagrams, is nothing but a grouping of two lexical items, which is conceptually independent of the postulation of phrasal categories. Neither mother nodes nor phrasal categories enjoy any formal status during the computation in emergentism, even though feature passing through the mother node seems an indispensable property of the whole framework. The issue extends to particular problems in pronoun interpretation. In the sentence Mary_j's sister_i overestimates herself_i (34), Mary's sister (instead of Mary or sister) is the antecedent of the anaphor herself. Placing a referential feature at the mother node of Mary's and sister is no different from saying that Mary's sister should be treated as a single constituent. If O'Grady is correct, we would expect feature passing not to be a design feature of the computational system that combines and resolves dependencies. While it would be wrong to conclude on this evidence alone that emergentism as a whole is flawed, we should recognize that the resort to phrase structures (or not) is de facto the central issue of the whole inquiry, i.e. the question of whether a grammar describing the internal structure of a sentence or a phrase is part of our language faculty. O'Grady's claim that there is no grammar fails to convince this particular reader.
A related issue concerns the notion of constituency. Although not impossible, a theory that does not recognize phrasal categories and understands constituency merely as a grouping of lexical items faces considerable problems. Recall English contraction, in which the subject and the verb are combined at the first opportunity, a fact which, as O'Grady claims, supports the left-to-right theory of sentence building. Contraction and coordination phenomena alike suggest that phrasal categories are important as primitives of constituency. For instance, in John bought a book and a car, the two conjuncts must be of the same category, since they are subject to the Coordinate Structure Constraint (*What did John buy a book and?). A theory of constituents without phrasal categories will be unable to account for the ungrammaticality of Coordinate Structure Constraint violations. While this book presents the reader with a host of interesting questions (for example, concerning the incorporation of linear order into the theory of language), a larger body of work that addresses the important generalizations and observations made in generative syntax will be needed to support the emergentist proposal.