Recursion theory is generally taken to concern procedures that determine potentially infinite, non-arbitrary sets, figuring in Hilbert's problem in mathematics and the Turing machine in computer science, and it is also regarded as a research program for cognitive science (Brattico 2010). The idea of recursion in linguistics can be traced back to von Humboldt's famous observation of the 'infinite use of finite means', which characterizes humans' ability to generate discrete infinity within cognitive domains such as language, thought, imagination and social cognition. Though present throughout the development of Generative Grammar since its birth, recursion has become the object of robust inquiry only recently, stirred by the conceptual distinction between the Faculty of Language in the broad sense and the Faculty of Language in the narrow sense (Hauser, Chomsky & Fitch 2002), according to which only the latter includes recursion, the unique component of the Faculty of Language. Because it did not properly distinguish recursive processes from recursive structures, however, this claim has provoked heated debate and controversy. Extensive studies have since emerged concerning theoretical construction, language acquisition and cross-species behavioral comparisons, yet most of these works fail to provide an orderly computational theory in terms of faculties and modules.
In this thought-provoking book, Recursion: A Computational Investigation into the Representation and Processing of Language, David J. Lobina sets out to provide a comprehensive account of recursion, maintaining a distinction between the computational characterization of recursion and its actual implementations. The book begins with 'Putting up barriers', an introduction that lays out its general framework. Chapter 1 provides the necessary technical preliminaries on the formalization of algorithms and computations. Chapter 2 establishes the nature of the algorithm underlying the faculty of language, Chapter 3 examines the operation of the computational system, Chapter 4 takes up the universality and uniqueness of recursion in language, Chapter 5 probes the potentially recursive nature of a sub-operation of the syntactic parser, and Chapter 6 critically assesses alternative methods applied to probe recursion in language. The book concludes with 'Putting it all together'.
Recursion in language should be understood at the level of competence or mental architecture, hence the Universality Claim, which concerns the invariant potential capacity of a generative system rather than the actualities of linguistic behavior. Such a system is unique to language, hence the Uniqueness Claim: it is to date unattested in other species. Previous biological research involving the artificial grammar learning paradigm in fact targets the expressive power of language, rather than the core property of language structure. To probe structural properties, Lobina relates language to general cognition, arguing that domains of general cognition such as vision and Theory of Mind certainly exhibit hierarchically nested or self-embedding structures. However, language and cognition have completely different organizations, and their representations are not isomorphic. More specifically, the discrete infinity or productivity of linguistic performance unavoidably requires postulating a system with recursive generative potential, but hierarchical structure building does not entail recursive structure building.
A given cognitive phenomenon, following Marr's levels of explanation (Marr 1982), can be properly investigated only at the ordered hierarchical levels of implementation (or hardware), mechanism, algorithm, and computation. Along this line, Lobina argues that recursion can be identified and clarified at the levels of mechanisms, representations, and processes. That is, the first establishes what kind of mechanical procedure underlies a cognitive domain, the second works out how the mechanical procedure is implemented, and the third figures out the actual shape of specific mental processes in behavior.
Recursion in formal mathematics and computer science is generally understood as a generative procedure, such as the successor function, or as an inductive definition, such as the famous Fibonacci sequence, in which, after the two seed values $\text{F}_{1}=1$ and $\text{F}_{2}=1$, every number is the sum of the two preceding ones: $\text{F}_{\text{n}}=\text{F}_{\text{n}-1}+\text{F}_{\text{n}-2}$. Such a definition, without distinguishing algorithm, implementation, and process (the last composed of operations and variables, or mental representations in cognition), is inadequate for describing and defining the nature of recursion. More importantly, recursively defined procedures may be implemented recursively or non-recursively (e.g. by iteration), both resulting in an apparently hierarchical or embedded structure. Recursion encompasses different input–output relations, which imposes the necessity of distinguishing the constructs of recursive generation, recursive processing, and recursive data structures. On this basis, Lobina argues that recursion serves to formulate constructs such as functions, production systems and recursors, which share the very property of self-reference (or self-call) that makes them recursive. The denotation of self-reference is furthermore associated with four connotations that apply at different levels of explanation: a definition by induction, recursive generation (a general property of particular computational systems), an architectural attribute of structures, and a property of computational processes. These connotations, if not properly identified and located, may be conflated into a single phenomenon, leading to much confusion.
One of the conflations running through the development of Generative Grammar is that between recursive mechanisms and recursive structures. Chomsky's focus in most of his works is on the recursive specification of the computational operation, largely under the influence of mathematical logic. Recursion was introduced into linguistic theory in the 1950s as a feature that makes a production system an appropriate formalization of a mechanical procedure. The application of recursion within production systems, such as rewriting rules for self-embedded sentences, is not related to recursive generation, the global property of recursion, which has resulted in the illusion of a close connection between recursion and self-embedding. Merge, the syntactic operation recently delineated as a set-theoretic operation, is analogous to the successor function: since its application forms no chains of deferred operations, the process is not recursive. The recursive character of the generated structures is entirely independent of any recursive generation by Merge.
The next question concerns the theory of computation of linguistic knowledge, that is, how a recursor or recursive generation translates into an actual derivation. In line with the distinction between a state description and a process description in complex systems (Simon 1962), Lobina argues that it is necessary to distinguish a derived tree structure (the final geometrical representation) from a derivation structure (the process description codifying the computational operation of Merge). Though both hierarchical representations may be described as recursive in their own terms, it is the derivation structure that should attract the attention of linguists, since the structural descriptions of a given formalism are the result of the operations that derive the tree. In doing so, Lobina decomposes linguistic derivation into atomic components: lexical items, syntactic derivation (the computational process), general computational principles, and interface conditions imposed by the sensorimotor (SM) and conceptual/intentional (C/I) systems. Following the Minimalist Program, Lobina shows that lexical items start and drive syntactic derivations, but the feature-valuation processes contain no deferred operations. Merge operates over lexical items in conjunction with other operations (Select, Minimal Search and Transfer) at each step, but does not permit any kind of self-call during its operations. The shape of a derivation is therefore iterative in nature.
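The claim that hierarchical, recursive-looking structures can emerge from an iterative derivation can be sketched in a few lines of Python. This is a toy illustration by this reviewer under simplifying assumptions, not the book's formalism: `merge` simply forms two-membered sets, and frozensets stand in for unordered syntactic objects.

```python
def merge(a, b):
    # Merge as a set-theoretic operation: Merge(a, b) = {a, b}.
    return frozenset([a, b])

def derive(lexical_items):
    """Iterative derivation: at each step, select the next lexical item
    and merge it with the current syntactic object. There are no
    self-calls and no deferred operations; the loop keeps only the
    current state, yet the output is hierarchically embedded."""
    items = list(lexical_items)
    current = items.pop()          # start with the innermost item
    while items:
        current = merge(items.pop(), current)
    return current

# Builds {eat, {the, apple}} step by step:
tree = derive(["eat", "the", "apple"])
```

The resulting object contains a set within a set, i.e. a self-embedded structure, even though the loop that produced it is flat iteration, which is exactly the independence of recursive structure from recursive generation that the book stresses.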
Finally, Lobina takes up the problem of how to relate linguistic competence and performance, concentrating on the interaction between grammar and parser, and between parser and memory load. Linguistic competence is composed of the computational operation, lexical items and the two interfaces, while linguistic performance involves a grammar (competence), a parser, the SM system, a recognizer and various memories, among other components. Crucially, the parser can be divided into two stages: the preliminary phrase packager (PPP), which itself comprises two building operations, minimal attachment (incorporating a word into a structure) and late closure (attaching new material to the node currently being processed), and the sentence structure supervisor (SSS). The PPP creates local parses, which are then put together by the SSS by recalling some of the properties of the preliminary analysis, indicating a possibly recursive implementation in language processing and understanding.
Unlike standard techniques in computer science and natural language processing, Lobina adopts a bottom-up approach, incorporating perception and lexical retrieval into language processing. The parser connects the perceptual system and the grammar, and analysis-by-synthesis ($\text{A}\times \text{S}$) can therefore be modeled as encompassing two stages: first, the generation of a candidate representation of the input, followed by a comparison of candidate representations; and second, a series of calculations over perceptual candidates until the correct representation is synthesized. Such an experimental design follows the spirit of Turing's methodology in probing the structural properties of an abstract machine. It differs from previous work in cognitive psychology, which is informative only about how the mind conceptualizes the structural properties of an input, not about recursive processing, since mere processing is not a test for recursion.
The perceptual system, parser, and grammar all interact to construct asymmetric Head–Complement phrases, suggesting a recursive solution to a complex problem. Given that the parser builds SHCs (Specifier–Head–Complement) in succession, a recursive process at the edges S or C (the loci of possible deferred operations) would impose a greater memory load than an iterative process, because stacked memory is required for storage and integration until lower-level operations are completed. Lobina employs the tone-monitoring paradigm widely applied in syntactic parsing, designing two Spanish sentence structures: a complex subject with a simple object, [NP–[PP–NP]–[VP–NP]], and a simple subject with a complex object, [NP–[VP–NP–[PP–NP]]]. To track the memory load in structure building and syntactic parsing, Lobina records and correlates participants' reaction times with structural complexity using a click-detection method. One noticeable finding is a decreasing tendency in reaction times across each type, suggesting that the perceptual position effect and processing load gradually diminish towards the end of a clause. The overall result shows that the recursively specified linguistic procedure is implemented through the reduction of recursors to iterators, confirming the general conclusion in computer science that recursively defined algorithms are implemented in a cost-efficient iterative manner.
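The memory-load prediction behind the experiment, namely that a recursive process must stack a deferred operation at each embedding site while an iterative process keeps only the current state, can be sketched as follows. This is a toy illustration by this reviewer, not the book's experimental model: the right-branching bracketing from the review's materials is encoded as nested `[head, complement]` lists.

```python
# A right-branching structure like [NP [VP NP [PP NP]]],
# encoded as nested lists: [head, complement].
sentence = ["NP", ["VP", ["NP", ["PP", "NP"]]]]

def process_recursive(phrase, depth=0):
    """Recursive descent: integration of each head is deferred until
    the embedded complement returns, so one stack frame is held open
    per level of embedding."""
    if isinstance(phrase, str):
        return depth                      # terminal reached
    head, complement = phrase
    return process_recursive(complement, depth + 1)  # deferred frame

def process_iterative(phrase):
    """Iterative descent: each level is integrated as soon as it is
    encountered, so only the current phrase is held in memory,
    regardless of embedding depth."""
    depth = 0
    while isinstance(phrase, list):
        head, phrase = phrase
        depth += 1                        # bookkeeping only
    return depth

# Same structure, same result, different memory profiles:
print(process_recursive(sentence), process_iterative(sentence))  # 4 4
```

Both routines traverse the same four levels of embedding, but only the recursive one accumulates stack frames, which is the cost asymmetry the reaction-time data are meant to detect.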
In summary, 'recursion' can refer to recursive definitions in many disciplines, to the general and central property of algorithms as in production systems, to actual processing operations at the level of performance, and to hierarchically structured representations that are unique to humans. The book also clarifies some misconstruals of recursion as interpreted and employed in cognitive studies, where recursion more or less means self-embedding, whether of structures or of operations. For example, previous experimental studies contrasting humans and tamarin monkeys (Fitch & Hauser 2004) in fact target, according to the book, the expressive power of language rather than any level of recursion, and therefore blur the computational and algorithmic levels. In this respect, the book provides some illuminating directions for further empirical and theoretical research.
There are some other highlights of this book. One is that it provides a novel way to probe the link between linguistic competence and performance, and the relation between recursion and Merge. Another is that the book adheres to the central principles of current biolinguistics by adopting a bottom-up approach to syntactic structure building and by focusing on dynamic syntactic derivations. This may bring about dramatic and interesting changes in how syntactic theory is constructed. Along this line, many previously accepted theoretical constructs such as DP or CP may turn out to be the result of derivational convergence, rather than structures existing prior to derivation, as widely assumed in Generative Grammar.
With regard to biolinguistics, however, there is some confusion in this book. As Boeckx (2015: 5) explicitly points out, 'lexicocentrism may be the biggest internal obstacle that lies on the road towards a level of explanation that Chomsky has referred to as "beyond explanatory adequacy"'. Surprisingly, this book assumes that lexical items are replete with lexical features, which deterministically drive syntactic derivations. This is evolutionarily implausible, because the relatively short evolutionary timescale cannot have provided adequate room to equip lexical items with complex and elaborate features for a given language. Furthermore, the book would benefit from typological evidence from a wider range of languages to support its conclusions. This aside, the book offers very thought-provoking insights into the study of recursion.