[Figure: a Spooner circuit diagram showing Word-level nodes (Juliet, Romeo), Foot-level nodes (Jul, Rom, eo), and the Output “Romiet and Juleo.”]
As figure 9.2 suggests, the lowest unitized elements of phonetic output are
deactivated after performance by cerebellar deperseveration. This deactivation
causes a rebound at the lowest level of cerebral planning, just as closing one’s
eyes can generate a McCollough effect rebound. Above the lowest levels of
phonetic output, it appears that rhythmic dipoles play an increasing role, so
that cerebellar deperseveration and dipole rebounds supply a bottom-up termination signal to motor plans.
Figure 9.2. A Spooner circuit for lought and thanguage.

The Spooner circuit models developed thus far not only explain metathetic “slips of the tongue” like Romiet and Juleo or lought and thanguage but also correctly exclude metathetic faux pas such as 9.5–9.9, which are quite unattested in the speech error literature:
*Romjul and Eoiet (9.5)
*dear our queen old (9.6)
*our dean old queer (9.7)
*langt and thouguage (9.8)
*thoughj and languat (9.9)
It appears that metrical feet are not merely a poetic metaphor. They are
universal to all languages because subdividing an immediate memory span of
four beats into two groups of two (or three or, rarely, four) gives greater stability to the serial plan. Perhaps it is not accidental that, after humans, the animals which we credit with the greatest ability to plan and to speak, sign, or otherwise perform serial acts with anything even approaching the complexity of language are also bipedal.¹
In discussing metrical feet, we must be careful not to confuse basic dipole
rhythm with stress, timing, or other rhythmic epiphenomena. English is often
described as a “stress-timed language,” apparently to distinguish its downbeat-offbeat rhythm from languages like French which do not alternate heavy and light syllables or from putatively monosyllabic languages like Chinese.² Downbeat/offbeat and metronomic timing may make language poetic and dancelike, but they are not necessary. Physiologically, they are quite impossible for one’s feet to maintain when hiking in the woods, and musically, Gregorian chant does without them as well. Only repetitive alternation is needed for dipole rhythm.
Offbeat Morphology
One might reasonably ask how little words like and or of affect spoonerisms
and the rhythm of language. These “little words,” sometimes called functors or
grammatical morphemes, are the grammatical glue that holds sentences together
(and in which language learners always seem to get stuck). So what role do the
grammatical morphemes play in spoonerisms like Romiet and her Juleo or in lought
and thanguage? The answer appears to be that they play no role at all.
These grammatical morphemes are universally unstressed.³ They occur on
the “offbeats” of language, and this in itself may account for a good portion of
the difficulty of learning a language. In the sentence Then the murderer killed the
innocent peasant, the nouns and verb are prominent and dramatic; it is easy to
understand how such nouns and verbs can resonate in the mind and form
“downbeats,” milestones in the motor plan of the sentence. But what about
the time-pronoun then or the or the -ed of kill? Leave them out and our sentences become telegraphic, as in Broca’s aphasia. What kind of resonant subnetworks support these little words? What does “then” mean? The brain can’t smell then with its olfactory bulb or see it with striate cortex or feel it with tactile cortex, so there’s nothing substantive to associate then with in parietal cortex. It seems grammatical morphemes can exist only as auditory images in temporal cortex and motor plans in Broca’s area. In what system can these ethereal morphemes resonate? The answer appears to be in the grammatical system.
As critical as we must be of generative theory, it had many valuable insights,
and one of them was its postulation of an autonomous syntax. In an adaptive
grammar, this autonomous syntax seems to be more morphology than syntax
and word order, but morphology does seem to organize its own networks, on
the offbeat, a mortar to the substantive bricks of meaning.
From our analysis so far, we can distinguish five unitization levels of rhythm and morphophonology: phrase, word, foot, syllable, and phone sets. At the top, as
in figure 9.1, is the phrase. The phrase is a primacy gradient of substantive
words. Each word is organized into feet, each with a downbeat and an offbeat.
Each beat consists of one or several syllables. Each syllable can be subdivided
into two phone sets: consonant(s) and vowel(s) or, more abstractly, onset and
rhyme.
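These five levels can be pictured as nested data. The following sketch is only illustrative: the Python encoding, the field names, and the syllabifications are my assumptions, not the book’s notation.

```python
# Illustrative nesting of the five unitization levels for the phrase
# "Juliet Romeo": phrase -> word -> foot -> syllable -> phone sets.
# The encoding and syllabifications are assumptions made for this sketch.
phrase = [                                   # phrase: primacy gradient of words
    {"word": "Juliet",
     "feet": [{"downbeat": [{"onset": "J", "rhyme": "u"}],    # syllable =
               "offbeat":  [{"onset": "l", "rhyme": "iet"}]}]},  # onset + rhyme
    {"word": "Romeo",
     "feet": [{"downbeat": [{"onset": "R", "rhyme": "om"}],
               "offbeat":  [{"onset": "",  "rhyme": "e"},
                            {"onset": "",  "rhyme": "o"}]}]},
]
```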
One can imagine how the rhythm of language might have evolved through
phylogeny:
Australopithecus africanus: /ta/, /di/
Homo habilis: /tata/, /didi/
Homo erectus: /tarzæn/, /dʒǽn/
Homo sapiens: /mi tarzæn/ /yu dʒǽn/
Homo loquens: I am Tarzan. Thou art Jane.
These paleontological associations are fanciful, of course, but insofar as ontogeny recapitulates phylogeny, the evolutionary scheme is borne out. The child first produces single syllables and then duplicates them into two beats. Eventually, the onsets and rhymes are elaborated, and the beats differentiate themselves, forming true two-syllable words. Then the child enters the two-word stage, producing two words with the same fluency and control—in the same rhythm—as he previously produced one (Branigan 1979). In this stage, grammatical morphemes begin to make their offbeat appearance.
We were led to this rhythmic view of morphology from a more general
consideration of how parallel neurons can encode serial behavior. The details
of this intricate morphophonemic dance go well beyond what we can consider
here, so we will return to the topic again when we consider language learning
in chapter 12. But several more topics are worthy of a passing note.
In addition to free grammatical morphemes like then, and, prepositions, and
the like, there are also two kinds of bound grammatical morphemes: derivational
morphemes and inflectional morphemes.
The derivational morphemes fit easily into the rhythmic framework we have
been developing. In fact, they fairly require a rhythmic analysis. For example,
when an English noun like reciprócity or an adjective like recíprocal is derived
from a stem like recipro-, the downbeat of the derived form shifts according to
the suffix (Dickerson and Finney 1978). We might say that -ity has the underlying form 1-ity, whereas -al has the underlying form 1-2-al, where 1 and 2 are stress levels of the preceding syllables (“downbeat” and “offbeat,” respectively).
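As a toy illustration of how such templates could assign stress, consider the following sketch. The list encoding and the function name are hypothetical; only the 1-ity and 1-2-al templates come from the text.

```python
# Toy sketch of the underlying stress templates for -ity and -al.
# The encoding (stress levels per syllable, counted leftward from the
# suffix) is an assumption made for illustration.

SUFFIX_TEMPLATES = {
    "ity": [1],        # "1-ity": downbeat on the syllable before -ity
    "al":  [1, 2],     # "1-2-al": downbeat two syllables before -al
}

def stress_pattern(stem_syllables, suffix):
    """Return (syllable, stress) pairs for stem + suffix (0 = unmarked)."""
    template = SUFFIX_TEMPLATES[suffix]
    stresses = [0] * len(stem_syllables)
    # Assign template stresses to the final syllables of the stem.
    for offset, level in enumerate(reversed(template), start=1):
        stresses[-offset] = level
    return list(zip(stem_syllables, stresses)) + [(suffix, 0)]

print(stress_pattern(["re", "ci", "proc"], "ity"))  # downbeat on "proc": reciprócity
print(stress_pattern(["re", "cip", "roc"], "al"))   # downbeat on "cip": recíprocal
```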
A recurring research question in morphology is whether accomplished
speakers of a language generate forms like reciprócity from such underlying
representations “by rule” and “on the fly,” even as they speak, or if they first
learn and store reciprócity in its final form and then simply retrieve it, prefabricated, when they speak. This is a bit of a trick question, since it seems clear
that accomplished speakers can do both. Children, however, at first seem only
able to access prefabricated forms, as is shown by wug tests.
Berko (1958) and later Derwing and Baker (1979) assessed children’s language development by measuring their ability to correctly fill in blanks like those
of 9.10b and 9.11b:
This is a picture of one wug. (9.10a)
This is a picture of two wugs. (9.10b)
This is a picture of a pellinator. (9.11a)
A pellinator is used for pellination. (9.11b)
Somewhere around the age of four, as their vocabularies expand, children
become able to generate novel inflectional forms like 9.10b in accordance with
the “rules” in 9.12:⁴
+sibilant# → #Iz, e.g., rose → roses (9.12a)
+voiced, –sibilant# → #z, e.g., road → roads (9.12b)
–voiced, –sibilant# → #s, e.g., rope → ropes (9.12c)
Rule 9.12 says that if a sibilant phoneme (/s/, /z/, etc.) occurs at the end
of a word (#), then the plural form adds -es (subrule 9.12a). Otherwise, the
voiced or voiceless plural ending /z/ or /s/ is added. Since children of three
and even two can use plural forms like cats and dogs but still flunk the wug test,
it appears they do not generate plurals by rule. But why, at age four, should
children suddenly stop saying cats by rote and start generating it “by rule”? An
adaptive grammar would simply say they do both. Whenever an appropriate
plural concept is coactivated with a [–voiced, –sibilant#] word (e.g., cat), a
/-s/ resonance is activated. Along with that /-s/ resonance, the whole-word form /kæts/ may also be activated.⁵
The case of 9.12a is, however, somewhat
different. In this case, the inflectional morpheme is syllabic, and at least in later
learning, it would be learned under the control of the offbeat of the metrical
foot rhythm generator. Such inflections should be primarily accessible “by rule,”
and as we shall see in chapter 12, these inflections are especially vulnerable in
language disorders like aphasia or dysphasia.
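For concreteness, rule 9.12 can be sketched as a simple elsewhere-ordered function. The phoneme spellings and the function name below are illustrative assumptions, not the book’s notation.

```python
# A minimal sketch of "rule" 9.12 (English plural allomorphy), assuming a
# broad-phonemic final segment as input; the phoneme sets are illustrative.

SIBILANTS = {"s", "z", "sh", "zh", "ch", "j"}          # +sibilant finals
VOICED = {"b", "d", "g", "v", "dh", "m", "n", "ng",
          "l", "r", "w", "y"} | set("aeiou")           # +voiced finals

def plural_suffix(final_segment):
    if final_segment in SIBILANTS:        # 9.12a: +sibilant#          -> #Iz
        return "Iz"
    if final_segment in VOICED:           # 9.12b: +voiced, -sibilant# -> #z
        return "z"
    return "s"                            # 9.12c: -voiced, -sibilant# -> #s

print(plural_suffix("z"))   # rose -> roses (Iz)
print(plural_suffix("d"))   # road -> roads (z)
print(plural_suffix("p"))   # rope -> ropes (s)
```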
By adding dipole rhythm generators to the serial mechanisms developed in chapter 8, an adaptive grammar can easily model metathesis and related aspects of phonology and morphology. These models converge with recent theories of metrical phonology to identify rhythm as a central organizing mechanism of language. These rhythms seem to be directly relevant to the cognitive processing of (word) morphology, but in the next chapter, we will see that the model can be extended naturally to account as well for the structures of syntax.
TEN

Null Movement
In chapter 9, we saw how dipole anatomies could explain phonological metathesis in spoonerisms. But as Lashley (1951) noted, metathesis is a much more general phenomenon, one that occurs in virtually all forms of serial behavior. Lashley’s observations were not wasted on Chomsky, who saw that metathesis also occurred in syntax. One of the first problems Chomsky undertook to solve was the problem of relating sentence 10.1 to 10.2:
John kissed Mary. (10.1)
Mary was kissed by John. (10.2)
Although the structures of 10.1 and 10.2 are metathesized, they mean the same thing.¹ In both sentences, Mary gets kissed, and John does the kissing.
Linguists say that 10.1 is in the active voice and 10.2 is in the passive voice.
Chomsky recognized that if we represented 10.1 in a tree structure like figure 10.1, then the derivation of 10.2 from 10.1 could be described by simply swapping the two noun phrase (NP) nodes. He called this a transformation, and the earliest forms of generative syntax were accordingly known as transformational grammar. Of course, along the way one also must attend to myriad grammatical cleanups, like changing kissed to was kissed, but transformations on tree structures promised to describe many sentential relations, if not to solve Lashley’s conundrum for all serial behavior. For example, the metathesis in 10.3–10.4 could also be explained using tree structures:
John is singing a song. (10.3)
Is John singing a song? (10.4)
Figure 10.1. A basic syntactic tree.
It was the 1950s. The popular scientific metaphor was that the human brain is a computer, and Church’s lambda calculus, which proved equivalent to Turing’s machines, showed how computation could be performed on tree structures. All this implied that the brain might use a kind of lambda calculus to generate tree structures. Transformations would then operate on these tree structures to produce sentences. For example, rules 10.5–10.10 could be used to generate 10.11:
S → NP + VP (10.5)
NP → (DET) + N (10.6)
VP → V + (NP) (10.7)
DET → a (10.8)
N → {John, song} (10.9)
V → is + singing (10.10)
(S (NP (N John))
   (VP (V is singing)
       (NP (DET a)
           (N song)))) (10.11)
To a computer scientist, the system of 10.5–10.10 has a number of elegant features. The NP rule (10.6) is like a computer language subroutine, a routine that is “called” by the S rule (10.5). Moreover, all the rules have the same basic structure, so they can all be computed by a single mechanism. The parenthetical syntax of 10.11 shows how the rules build on one another. This syntax may be hard to read, but if one turns the page sideways, 10.11 shows itself to actually have the tree structure of figure 10.2. Indeed, the form of 10.11 is the syntactic form of the recursive computer language LISP, which remains to this day the preferred computer language for artificial intelligence and natural language processing.
Figure 10.2. Tree structure of example 10.11.

Following figure 10.2, we can model the generation of sentence 10.3 as a “top-down” process. In attempting to generate an S by rule 10.5, a recursive computing device finds it must first generate an NP by rule 10.6 (without the optional DETerminer) and a VP by rule 10.7. In attempting to generate the VP, it finds it must first generate a V, and then another NP by rule 10.6. Finally, lexical insertion rules 10.8–10.10 operate to complete the sentence. In this fashion, 10.11 is built by a series of node expansions, or “rewrites”:
(S (NP) (VP)) (10.12)
(S (NP (N John)) (VP (V) (NP))) (10.13)
(S (NP (N John)) (VP (V is singing) (NP))) (10.14)
(S (NP (N John)) (VP (V is singing) (NP (DET a) (N song)))) (10.11)
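For readers who want the node-expansion procedure spelled out, here is a minimal sketch of a top-down rewriter for rules 10.5–10.10. The dictionary encoding is an assumption made for illustration, and the toy grammar overgenerates (nothing stops it from inserting song as subject), which is harmless for the present point.

```python
import random

# Rules 10.5-10.10 encoded as rewrite alternatives. Optional constituents
# appear as two alternatives, with and without the option.
GRAMMAR = {
    "S":   [["NP", "VP"]],            # 10.5  S   -> NP + VP
    "NP":  [["N"], ["DET", "N"]],     # 10.6  NP  -> (DET) + N
    "VP":  [["V"], ["V", "NP"]],      # 10.7  VP  -> V + (NP)
    "DET": [["a"]],                   # 10.8
    "N":   [["John"], ["song"]],      # 10.9  lexical insertion
    "V":   [["is singing"]],          # 10.10
}

def expand(symbol):
    """Top-down node expansion, returning a LISP-style bracketed string."""
    if symbol not in GRAMMAR:                      # a terminal word
        return symbol
    children = random.choice(GRAMMAR[symbol])      # choose one rewrite
    return "(%s %s)" % (symbol, " ".join(expand(c) for c in children))

print(expand("S"))
# One possible output:
# (S (NP (N John)) (VP (V is singing) (NP (DET a) (N song))))
```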
To simplify accounts of movement, such as in the derivation of 10.4 from 10.3, the rules were later changed to move the tense of the verb out of VP (10.7), as in 10.15:

S → NP + TENSE + VP (10.15)
As a result of such changes, figure 10.2 eventually came to look like figure
10.3. Now to form a question and account for 10.4, one need only move the
TENSE node to the top of the tree, as in figure 10.4.
The tree structures of figures 10.1–10.4 explained a range of interesting linguistic phenomena, but perhaps the most compelling capability of the generative analysis was its ability to handle complex sentences with the same facility as simple sentences. For example, by augmenting 10.6 with a recursive call to the S rule (10.16), we enable the grammar to account for the problems first raised in 2.1–2.3 (repeated here as 10.17–10.19). It can account for the relative clauses in 10.17 and 10.18, and it can also block the ungrammatical sentence 10.19:
Figure 10.3. Lengthening the tree.
NP → (DET) + N + (S) (10.16)

The man who is₁ dancing is₂ singing a song. (10.17)
Is₂ the man who is₁ dancing singing a song? (10.18)
*Is₁ the man who dancing is₂ singing a song? (10.19)
If we ask children to make questions out of sentences like 10.17, they always produce sentences like 10.18. Why do they never move is₁ as in *10.19? The classic generative answer was that there are innate, universal rules of syntax, akin to 10.5–10.10, and that children’s responses are governed by these innate rules.
Figure 10.4. Syntactic tree after question transform.

In later years, this position became more abstract, holding only that there are innate “principles and parameters” that shape the rules that shape language, but the core computational metaphor still works remarkably well. For example, the early generative description of the clause who is dancing in 10.17 as an embedded sentence led to the later postulation of a kind of barrier against movement between levels of recursive embedding. It is as if the parenthetic principle of 10.20 existed in the child’s mind to block illegal movement as surely as it defines dynamic scoping² in the LISP computer language:³
(S₁ Is₂ the man (S₂ who is₁ dancing) singing a song?) (10.20)
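A small sketch can make the barrier concrete: if embedded clauses are represented as nested structures, the question transform simply never descends into them. The list representation and function below are illustrative assumptions, not the book’s (or generative theory’s) actual machinery.

```python
# Sketch: the question transform fronts an auxiliary only from the
# top-level clause; an embedded S (a nested list) is a barrier it skips.

def front_aux(clause):
    """Move the first top-level auxiliary to the front of the clause."""
    for i, word in enumerate(clause):
        if isinstance(word, list):          # embedded S: never descend
            continue
        if word in ("is", "am", "are", "was", "were"):
            return [word] + clause[:i] + clause[i + 1:]
    return clause                           # no auxiliary to front

declarative = ["the", "man", ["who", "is", "dancing"],
               "is", "singing", "a", "song"]
print(front_aux(declarative))
# ['is', 'the', 'man', ['who', 'is', 'dancing'], 'singing', 'a', 'song']
# i.e., 10.18; the embedded is1 can never be fronted to yield *10.19.
```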
These generative ideas brought order to a collection of linguistic data which seemed to defy explanation by any other theory. Unfortunately, as generative theory became increasingly refined, it started to look more and more like behaviorism. Unconsciously bound to the presumption that serial language must be the product of a serial processor, Chomsky found himself declaring that “the basic elements of a representation are chains”; and generative grammar’s tree diagrams of the basic structure of the clause came to look more and more like stimulus-response chains and crayfish brains (Chomsky 1995; figure 10.5).
To be sure, Chomsky was not talking of stimulus-response chains, but he was proposing cognitive chains. Generative grammar’s node-swapping insights were on the right track, but then it erred in presuming that a well-engineered human mind would node-swap the same way a computer would. With this assumption, the serial, computational, generative explanation failed to answer Lashley’s criticism almost as completely as did the behavioral explanation.
In the end, generative metaphysics failed to explain what moves when a grammatical node “moves.” We are obviously not expected to believe that when I produce sentence 10.18 in derivation from some structure like 10.4, some is₂ neuron physically changes its place in my brain. Generative linguists have lately taken to defending the notion of movement by claiming that it is only a “metaphor,” but after forty years, science can reasonably ask what it is a metaphor of.

Figure 10.5. A syntactic tree devolves into a chain (Chomsky 1995).
If it is not neurons that are moving, then what exactly is moving? The simple
answer, of course, is that nothing moves. Linguists may “derive” 10.2 from 10.1,
but in normal discourse, normal humans do not. There is no such thing as linguistic movement. But then how do sentence pairs like 10.1 and 10.2 spring from
the brain, and how are they related?
Universal Order
Generative philosophers were right in seeking “universals” of language. Despite the existence of thousands of languages, centuries of research have yielded dozens of linguistic features that are universal or almost universal. Especially noteworthy are certain universal features of syntactic order first explicitly noted by Greenberg (1968). Greenberg classified thirty diverse languages according to the manner in which they ordered subject, object, and verb. There are six possible permutations in which these elements may be arranged: OSV, OVS, VOS, VSO, SOV, SVO. In Greenberg’s study and subsequent research, the first three of these proved to be virtually nonexistent, and from the accumulating evidence, Lehmann (1978a) observed that languages exhibited a profound SO unity, differing only on the relatively minor OV/VO distinction. Furthermore, Greenberg’s Universal 6 stated that: “All languages with dominant VSO order have SVO as an alternative or as the only alternative basic order” (1968, 79).
Bickerton (1981) went beyond Greenberg to claim that there exists a strong, universal preference for the order SVO. In his studies of pidgin languages and creoles,⁴ he posited an even deeper unity. Bickerton’s reasoning was that such freshly invented languages offered better evidence for the “natural” or “universal” structures of language because they were unelaborated by linguistic artifacts of history or tradition.
From the preceding line of research, I conclude that the most basic fact
which needs to be explained about language is why the subject “naturally”
precedes everything else.
Subjects
Unfortunately, the traditional grammatical term “subject” has come to mean
many different things. For example, consider the following defining features
of the term “subject” in three simple English sentences:
John helps Mary. (10.21)
He helped her. (10.22)
Mary is helped by him. (10.23)
The subjects of 10.21–10.23 are John, He, and Mary, respectively, but subjecthood variously implies that the subject
1. agrees with the verb in person and number (10.21, 10.23),
2. is in the nominative case (10.22),
3. is the agent of the sentence (10.21, 10.22),
4. is the topic of the sentence (10.21–10.23).
In the case of English, only assertion 4 holds for all three sentences. Contra assertion 1, he does not agree with the verb helped in any overt way in sentence 10.22. Contra assertion 2, the subject has no overt case marking in 10.21 or 10.23. And as for assertion 3, Mary is not the agent in 10.23; it is him who is doing the helping.
Sense 4 is also the only meaning of “subject” which holds universally across languages. Sense 1 is not universal because some languages, like Chinese, do not overtly mark agreement between subject and verb at all. In many other languages, overtly marking agreement is more the exception than the rule. English, for example, only marks agreement between present-tense indicative verbs and third-person singular subjects. Pidgin languages, almost by definition, lack agreement or grammatical case inflections. In the cases of assertions 2 and 3, we find that nominative marking and agentive marking are often mutually exclusive, as between accusative languages, which mark nominative case subjects, and unaccusative languages, which mark “subjects” in the agentive (or ergative) case.⁵
Sense 4 alone applies to 10.21–10.23, it alone applies to uninflected languages like Chinese, and it alone is the sense in which the term “subject” is relevant to our analysis of universal order. Therefore, what Greenberg’s, Lehmann’s, and Bickerton’s data suggest for adaptive grammar is that universal subject-verb order is more accurately termed universal topic-verb order. Nevertheless, sense 4, in which the subject is regarded as the topic of discourse, has been rather the least-noted meaning of the term “subject.”
Topicality
The topic is what we are talking about. Saussure described a word as a relation between a sound or a graphic signifier and its referent, its significand. The topic’s significand is usually persistently present in our cognitive environment, and its signifier is usually persistent from sentence to sentence. Neurally, this persistence takes the form of a resonance in short-term memory (STM). By the analyses of chapters 8 and 9, this means the topic should have a universal syntactic primacy effect, and according to Greenberg, Lehmann, and Bickerton, so it does.
Some of the significance of topic has been previously recognized (Li 1976). Chafe (1970), in particular, anticipated much of the importance which adaptive grammar will ascribe to it. But like “subject,” the term “topic” has been so variously used as to obscure its relevance to a unified theory of language. For example, Chao (1948, 1968) and Li and Thompson (1981) refer to Chinese as a topic-comment language rather than a subject-predicate language. In so doing, however, they treated topicality as a unique feature of Chinese, obscuring its universal role.
In a different usage, both structural and generative syntacticians have used topic to refer only to unusual, marked⁶ constructions such as 10.24 (from Pinker 1989):

That coat I like. (10.24)
In our view, such “topicalized” sentences do not reflect topicality so much as
change of topicality. They are not so much topicalized as topicalizing.
Linguists, particularly sociolinguists, have also noted that old information (or “given” information) tends to come early in a sentence, and new information tends to come later. In this context, old information corresponds closely to our sense of topic, but for sociolinguists, it is the conversation, not the sentence, which is the basic unit of language. In taking this perspective, they broke away early from generative linguistics and its context-free analysis of sentences. This no-man’s-land between syntax and sociolinguistics was partially bridged by the field of pragmatics, which used topicality relations like pronominal reference to develop an intersentential syntax, but the essential and universal role of topic in the simple sentence was still largely overlooked. Taken together, dissident syntacticians, sociolinguists, and pragmaticians formed a broad school of “functionalism” which perceived itself to be at theoretical odds with generative linguistics. But functionalism generally failed to relate its intersentential, discourse-level conception of topic to intrasentential syntax, and so functional linguists and generative linguists lived side by side for years, each school writing in its separate journals, neither school really understanding what the other was talking about.
Under adaptive grammar, the topic of both sentence and discourse is plainly and simply what one is talking about. At the moment a sentence is about to be spoken, the topic is neurophysiologically instantiated as that word or phrase, that motor plan which is most activated by cerebral resonance. Of course, the significand of the topic usually still exists externally, in the surrounding world, in the surrounding context, or in the surrounding discourse, but adaptive grammar’s definition of topic emphasizes the internal, cognitive, STM resonance. The most active STM resonance becomes the topical signifier, the “head” element of adaptive grammar’s intrasentential and intersentential syntax.
Sociolinguists speak of the tendency of old information to be expressed before new information as a rule of discourse or a property of conversation. In this sense, Grice (1975) suggested that collocutors in a conversation adhere to a tacit contract to “be relevant.” Adaptive grammar sees relevance as a deeper, biological injunction to say (and do) topical things first, a “rule” which applies not just to conversations but to everything brains do. It is a corollary of evolution: the organism that doesn’t do relevant things first simply doesn’t survive.
Under adaptive grammar, what one is talking about, the topic, is the currently most activated referential subnetwork in neocortex. By our account of serial order in chapters 8 and 9, persistent STM activation will drive this topic subnetwork to competitive, primacy-effect prominence among competing sentence elements. Therefore, in 10.21–10.23, as in all the unmarked sentences of all the world’s languages, the topic is the first propositional nominal element of a sentence. But what do we mean by propositional, and how does adaptive grammar describe the remaining, presumably competing elements of the sentence?
Case Grammar
Case grammar is generally said to have originated with Fillmore (1968), but there were several precursors. European languages are generally richer in grammatical cases than English. In these languages, it has always been apparent that there is a correlation between such grammatical categories as nominative (subject), accusative (direct object), and dative (indirect object) and such semantic categories as actor, patient, and donor/recipient. From such relations, Tesnière ([1959] 1969) had developed valency grammar, and Gruber ([1965] 1976) had developed a theory of thematic relations. In these systems, a proposition is a clause consisting of a single verb and its most immediate (or “inner”) case arguments.⁷
Unfortunately, the correlation between grammatical case and semantic case is not always close. Thus, in 10.23 (Mary is helped by him), him is in the accusative case, but him is still the “helper,” and so semantically it fills the actor role. Different case grammarians also use different terms for similar semantic roles. Thus, the term “agent” is often used for “actor”; “patient” and “theme” are often used for “object”; and “source” and “goal” are often used for “donor” and “recipient.” Over the years, this Tower of Babel has become a religious schism. Generative linguists have thematic relations while other linguists have case grammar, and the two churches never cite each other. My usage will loosely follow Cook 1989.
From the perspective of adaptive grammar, Fillmore made two especially significant contributions to case grammar theory. First, he claimed that semantic cases like actor and patient were the actual organizational elements of “deep structure.” These are the “propositional” case arguments of a transitive verb like kill. Give, by contrast, has three such propositional arguments: an agent (the giver), a patient (the gift), and a recipient. In contrast to these “inner” propositional case arguments, most verbs also accept a variety of “outer” or “nonpropositional” case arguments—for example, purpose or location. These case roles are usually optional; they are often left unspecified, presumably because they are often contextually obvious.
Second, Fillmore also claimed that this semantic deep structure was initially unordered. But then, forced to account for syntactic order, Fillmore suggested that for each verb in the lexicon there existed a subject selection hierarchy. By the subject selection hierarchy, the verb kill would normally select the semantic actor, if there were one, for its subject, as in The killer killed the victim. Otherwise, if there were no definite killer in the semantic context (say, in a mystery novel), kill would select a semantic instrument as subject, as in The poison killed the victim. Otherwise, if there were neither agent nor instrument, the object would become the subject, as in the passive sentence, The victim was killed.
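The hierarchy itself is easy to state procedurally. Here is a minimal sketch, assuming the verb’s filled semantic cases arrive as a dictionary; the encoding and names are illustrative.

```python
# Fillmore's subject selection hierarchy for a verb like "kill", sketched
# procedurally. The dict encoding is an assumption made for illustration.

HIERARCHY = ("actor", "instrument", "object")   # first filled case wins

def select_subject(cases):
    """Choose the subject from a clause's filled semantic cases."""
    for role in HIERARCHY:
        if role in cases:
            return cases[role]
    raise ValueError("no case available for subjecthood")

print(select_subject({"actor": "the killer", "object": "the victim"}))
# the killer   -> "The killer killed the victim."
print(select_subject({"instrument": "the poison", "object": "the victim"}))
# the poison   -> "The poison killed the victim."
print(select_subject({"object": "the victim"}))
# the victim   -> "The victim was killed." (passive)
```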
Topicalization and Passives
But what about 10.25, a passive sentence in which the object is selected as subject, even though there is an explicit instrument and an explicit agent?

The victim was killed by Mr. Green with a candlestick. (10.25)
The subject selection hierarchy, by attempting a context-free explanation of word order, fails to explain many such commonplace sentences. If analysis of language and sentence is conducted independent of context, then it is by definition conducted without consideration of topic. But where, outside the ivory tower, are sentences analyzed without regard to topic? If we analyze real language, then topicality replaces the subject selection hierarchy as the principal ordering force in syntax.

Consider, for example, the last sentence of the following extract from Peirce 1877. The first paragraph is provided for context.
Now, there are some people, among whom I must suppose that my reader is to be found, who, when they see that any belief of theirs is determined by any circumstance extraneous to the facts, will from that moment not merely admit in words that that belief is doubtful, but will experience a real doubt of it, so that it ceases in some degree at least to be a belief.
To satisfy our doubts, (10.26a)
therefore, it is necessary (10.26b)
that some method be found by which (10.26c)
our beliefs may be caused by nothing human, (10.26d)
but by some external permanency—by something upon which our thinking has no effect. . . . Such is the method of science (Peirce 1877, 10f–11).
The last sentence is rather remarkable in that it contains two topicalizing clauses and two passive clauses. I call the first purpose clause of 10.26 (to satisfy our doubts) (re)topicalizing because purpose clauses are nonpropositional: they are more often extrasententially expressed, implied, or assumed than intrasententially expressed. In this case, the clause recapitulates the primary topic of the preceding pages: Peirce has argued that doubt nags us to epistemological action. The second clause (10.26b; it is necessary) recapitulates the fact that Peirce is conducting a philosophical argument, and that what follows, follows from logical necessity. I call 10.26b a topicalization because adjectives are not usually topics, but this “cleft” clause “fronts” and “focuses” the proposition of necessity.
In general, adaptive grammar sees all such fronting, focusing, and topicalizing as manifestations of an underlying topic gradient: reflections of how resonant each clause (and each component within each clause) is in the speaker’s STM. The most resonant—and therefore most topical—element reaches threshold first and is expressed first. The topic gradient is a self-similar analogue of the primacy gradient we examined in chapter 8.
The third clause (10.26c) is a passive clause. It recapitulates the secondary topic Peirce has been addressing: the various methods by which people avoid or relieve nagging doubts. Of course, Peirce might have said it in the active voice:

?that everybody find some method by which (10.26c')

But everybody has not been particularly active in Peirce’s STM or the reader’s. As the very title of his paper makes clear, belief and methods of fixing belief are the active topics, and in 10.26 it is their expression which is accorded primacy.
The fourth clause (10.26d) is also a passive clause. Beliefs is topical, and so it becomes the subject of the clause. Once again, this clause would sound rather odd in active voice:

?nothing human may cause our beliefs (10.26d')

Finally, Peirce introduces the new information to which this passage turns: some external permanency . . . the method of science.
Admittedly 10.26c' and 10.26d' are grammatical (albeit only in the most trivial
sense of the term), and there are other legitimate and illegitimate reasons for
using and avoiding passives. It could be argued that Peirce’s use of passives in
the preceding analysis reflects no cognitive principles but is simply “stylistic.”
However, 10.26c and 10.26d are the first passive clauses Peirce has used in three
pages, so one cannot easily argue that passive clauses are a signature of his style.
Nor do other instances of passive “style” invalidate the principle of the topic
gradient. It is perfectly consistent for adaptive grammar to maintain that the
learned norms of scientific writing can inhibit the first-person expression of the
researcher-agent in STM and demote it in the topic gradient.
For another class of passives, the subject selection hierarchy suggested that agentless passives like

Mistakes were made. (10.27)

are in the passive voice because there simply is no agent to assume the subject role. Less-gullible linguists (e.g., Bolinger 1968) take such passives to be linguistic legerdemain, deliberately concealing who did the defective deed. In the latter case, although the agent may be very active in the speaker’s STM, his intent clearly is to inhibit the agent’s resonance in the listener’s STM.
Subtopics and Dative Movement
The topic gradient accounts for many other types of linguistic “movement.” In addition to passive movement, generative theory claimed that 10.29 was derived from 10.28 by dative movement. That is, in 10.28, the “indirect object,” Neil, follows the direct object and is said to be in the dative/recipient case. In 10.29, Neil has “moved” to before the direct object.

The police gave a break to Neil. (10.28)
The police gave Neil a break. (10.29)
In the absence of context, there is no reason to prefer 10.28 over 10.29. But, except in narrow linguistic inquiry, language never occurs independently of context. Sentences 10.28 and 10.29 would normally follow from two different conversations:
The police didn’t give many breaks. (But,) (10.30)
The police gave a break to Neil. (10.28)
*The police gave Neil a break. (10.29)
The police gave Neil a break. (10.31)
In 10.30, the police appears as the topic of discourse, and break is introduced as new information, becoming a secondary topic. In this context, 10.28 is preferred. Sentence *10.29 sounds relatively odd because it gives newer information in the conversation, Neil, primacy over the older information, break(s). Giving Neil additional, contrastive stress resolves the oddity, but the exceptional markedness of 10.31 proves the rule: secondary topics normally have primacy over tertiary topics. Conversely, in the following context *10.28 violates the topical precedence established by 10.32:
The police investigated Neil. (But,) (10.32)
The police gave Neil a break. (10.29)
*The police gave a break to Neil. (10.28)
The police gave a break to Neil. (10.33)
Sentence 10.32 establishes Neil as the secondary topic, but in *10.28, the new information and tertiary topic break is presented as secondary. Once again, contrastive stress in 10.33 can correct this abnormality. Just as chapters 8 and 9 explained phonological seriality in terms of a primacy gradient, adaptive grammar explains syntactic seriality in terms of a topic gradient. No movement is necessary.
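As a rough sketch of how a topic gradient, rather than movement, could order the dative alternation, consider the following; the activation numbers and the encoding are invented for illustration.

```python
# Sketch: order post-verbal nominals by topic-gradient activation.
# Higher activation = more topical = expressed earlier. The activation
# values below stand in for STM resonance and are purely illustrative.

def realize(verb, subject, arguments):
    """arguments: (phrase, activation, marker) triples; the dative marker
    "to" surfaces only when the recipient is outranked."""
    ordered = sorted(arguments, key=lambda a: -a[1])   # the topic gradient
    words = [subject, verb]
    for i, (phrase, _activation, marker) in enumerate(ordered):
        if marker and i > 0:       # a trailing recipient keeps its preposition
            words.append(marker + " " + phrase)
        else:
            words.append(phrase)
    return " ".join(words)

# After 10.30, "break" outranks "Neil" -> 10.28:
print(realize("gave", "The police",
              [("a break", 0.9, None), ("Neil", 0.4, "to")]))
# After 10.32, "Neil" outranks "break" -> 10.29:
print(realize("gave", "The police",
              [("a break", 0.4, None), ("Neil", 0.9, "to")]))
```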
Particle Movement
The same topic gradient effects can be found in the case of so-called particle movement.⁸ If prior topics have not been established, 10.34 and 10.35 are equally felicitous context-free sentences:
John looked up the address. (10.34)
John looked the address up. (10.35)
But given 10.36, a discourse context establishing the prior topic address, 10.35
is preferred:
The address was torn off the package, (so) (10.36)
John looked the address up. (10.35)
*John looked up the address. (10.34)
The Nominal Topic Gradient and the Verbal Relation Gradient
The preceding several sections provide evidence that the nominal elements of a sentence are organized in a primacy-ordered topic gradient. Now we return to the question of where the verb fits into this order. At the end of chapter 9, we observed that morphology tends to organize on the offbeat. In figure 10.6, I extend this offbeat morphology to include the verb. The verb, with its case-marking prepositions (or postpositions in languages like Japanese), exists in a primacy- (or recency-) ordered verbal relation gradient. A rhythmic dipole generates sentences by alternating between these two gradients.
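Before turning to the worked example in figure 10.6, here is a minimal sketch of the alternation itself, assuming each gradient can be approximated by a priority-ordered list; the lists and names are illustrative assumptions.

```python
# Sketch: a T/R dipole alternates between the nominal topic gradient and
# the verbal relation gradient. Both lists are ordered by assumed
# activation (most active first); the contents are illustrative.

topic_gradient = ["Prof. Plum", "Mrs. White", "the hall", "a knife"]
relation_gradient = ["killed", "in", "with"]

def alternate(topics, relations):
    """Output nominals and relations in strict alternation, T pole first."""
    out = []
    relations = [None] + list(relations)   # no relation precedes the first topic
    for nominal, relation in zip(topics, relations):
        if relation:                       # R pole: verb or case-marking word
            out.append(relation)
        out.append(nominal)                # T pole: next-most-topical nominal
    return " ".join(out)

print(alternate(topic_gradient, relation_gradient))
# Prof. Plum killed Mrs. White in the hall with a knife
```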
In figure 10.6, Prof. Plum killed Mrs. White in the hall with a knife is generated by alternately outputting elements from the nominal topic gradient and the verbal relation gradient. The T (topic) pole of the topic/relation dipole is initially activated by S. The T/R (topic/relation) dipole then oscillates in phase with the downbeat/offbeat “foot” dipole described in chapter 9. This does not mean that the T/R dipole rebounds with each and every beat of the foot dipole. Several feet may occur on each pole of the T/R dipole, as, for example,