
Re: To Matt Pearson

From: Matthew Pearson <matthew.pearson@...>
Date: Tuesday, October 23, 2001, 18:17
--- David Peterson wrote:
    I see you're in favor of the "new" transformation idea of which I was
unaware.  But have you heard about the coding idea?  (I forget what exactly
the name of the idea was...)  I found this one a much, much more plausible
solution.  The idea is that every single word has a code that goes with it,
which can predict or delimit what comes next depending on what semantic/pragmatic
context it carries with it.  I don't know the specifics, but an example that
ordinarily would have been explained by transformation would run as follows:

"The hamburger", noun, should be followed by a verb (or an adverb phrase
which would be followed by a verb) if it comes initially.  However, if it's
followed by something other than a verb, the initial noun phrase is emphasized.  (This is
ridiculously simplified, and ignoring relative clauses.)  So:

1.) "The hamburger is good."
2.) "The hamburger I gave him."  (In response to, "What did you give him?"
in, say, Yiddish American English.)

    The point is that every word delimits what can come next and, therefore,
what it means.  (By the by, this obviously doesn't matter for languages like
Latin that can take on literally any word order and convey the same meaning.)
 But, say, if you had an OSV language in which word order is important, then
the first noun of the sentence would just encode the idea of being the object
without having to be near the verb and without having to undergo some sort
of transformation.  If this doesn't make sense, can you at least see the idea?
--- end of quote ---

Well, not to seem obnoxious, but I'll start with a quibble concerning your comment
on Latin: Latin word order is not completely free; there are constraints (a
preposition cannot be at the other end of the sentence from its complement, for
example). But nitpicking aside, if the kind of theory you're talking about has
nothing useful to say about Latin (or 'free word order' languages generally,
insofar as they exist), then it's not a good theory of Human Language. A good
theory of Human Language needs to account for all possible linguistic phenomena
in all languages. (I don't claim that transformational theories can account for
all phenomena in all languages--FAR FAR from it--but they at least have that
ambition...)

But anyway, getting to this coding theory you mention: I can think of two
different theories of sentence structure which you might have in mind. The
first is a 'word-chain' device (or 'Markov model'), in which every word is
encoded with probability information concerning the kind of
word/phrase/category which can follow it (to use your example, a subject noun
phrase includes the information that it will probably be followed by a verb;
this could perhaps be supplemented with information about marked constructions,
as you suggest). The problem with word-chain devices is that words don't just
have syntactic/semantic dependencies with the words or phrases which they are
adjacent to; there are also *long-distance* dependencies. For example,
languages can have pairs of conjunctions like "either" and "or", or "if" and
"then":

  EITHER Marion likes ice-cream, OR James likes ice-cream.
  IF Marion likes ice-cream, THEN James likes ice-cream.

You can't mix and match these, obviously; they need to go together:

  *EITHER Marion likes ice-cream, THEN James likes ice-cream.
  *IF Marion likes ice-cream, OR James likes ice-cream.

But notice that the material which intervenes between "either" and "or", or
between "if" and "then", is arbitrary (it's gotta be a sentence, but it can be
ANY sentence). Since it's arbitrary, it can be arbitrarily long. And yet the
dependency between "either" and "or", or between "if" and "then", remains, in
spite of what (or how much) material intervenes between them. So a word like
"either" will have to be able to 'look ahead' arbitrarily far to ensure that
somewhere down the line there will be an "or" to go with it. Linguists have
argued that it is impossible to construct a word-chain device which has this
'look ahead' property without introducing tons of redundancy. (Steven Pinker
has a nice discussion of this in "The Language Instinct". Some of his other
arguments against word-chain devices are less convincing, but this one seems
pretty sound to me.)
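
To make the limitation concrete, here's a toy sketch in Python of such a
word-chain device (the vocabulary and transitions are invented for
illustration, not drawn from any real grammar). Each word lists only the
words that may immediately follow it, so at the comma the device has
already forgotten whether the sentence opened with "either" or "if":

  import random

  # A toy bigram word-chain device: each word lists the words that may
  # immediately follow it; there is no memory beyond the current word.
  CHAIN = {
      "<s>": ["either", "if"],
      "either": ["Marion"], "if": ["Marion"],
      "Marion": ["likes"], "James": ["likes"],
      "likes": ["ice-cream"],
      "ice-cream": [",", "</s>"],
      ",": ["or", "then"],
      "or": ["James"], "then": ["James"],
  }

  def generate():
      # Walk the chain from <s>, picking any legal successor at random.
      word, words = "<s>", []
      while True:
          word = random.choice(CHAIN[word])
          if word == "</s>":
              return " ".join(words)
          words.append(word)

  for _ in range(4):
      print(generate())
  # Freely produces the starred pattern, e.g.:
  #   either Marion likes ice-cream , then James likes ice-cream

You could patch the device with extra states that remember the "either",
but then (as I understand Pinker's point) you need a separate copy of all
the intervening-sentence machinery for each such pair, and the redundancy
explodes.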

The second kind of thing which you might be referring to is Categorial Grammar. In
categorial grammar, words and phrases are coded with category labels which
describe the kinds of things they can combine with, and there are
straightforward rules for 'merging' the categories of words to give categories
of phrases. These category labels have the form X/Y (or Y\X, depending on word
order), which means something like "I combine with an X to form a Y" (I may
have the slashes wrong, but I think that's the right format). For example, a
subject noun phrase in English would have the category VP/S, meaning "combines
with a verb phrase to its right to give a sentence", while a transitive verb
would have the category NP/TVP, meaning "combines with a noun phrase object
to its right to give a transitive verb phrase". Categorial grammar gets away
without movement transformations by having 'type-lifting' rules, which can
change the category labels of phrases in systematic ways. So to handle
your OSV focus case ("Hamburgers I like"): Object noun phrases in English
normally have the label TVP\TV ("combines with a transitive verb to get a
transitive verb phrase"), but you can type-lift it to give a category that
means something like "combines with a sentence containing a transitive verb
without an object", or some such (that's too clunky; the actual theory is more
principled and elegant than that, but I can't remember the details well
enough).
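
In case it helps to see the merging mechanism concretely, here's a minimal
sketch in Python, under the usual convention that X/Y seeks a Y to its
right and X\Y seeks a Y to its left, both yielding X (my lexical entries
and the type_raise rule are simplified stand-ins for the real theory):

  # Categories: atomic ones are strings ("NP", "S"); complex ones are
  # triples (result, slash, argument), so ("S", "\\", "NP") is S\NP.
  LEXICON = {
      "Marion": "NP",
      "hamburgers": "NP",
      "likes": (("S", "\\", "NP"), "/", "NP"),  # transitive verb
  }

  def combine(left, right):
      # Forward application:   X/Y  Y   =>  X
      if isinstance(left, tuple) and left[1] == "/" and left[2] == right:
          return left[0]
      # Backward application:  Y  X\Y   =>  X
      if isinstance(right, tuple) and right[1] == "\\" and right[2] == left:
          return right[0]
      return None  # the categories don't merge

  def type_raise(cat, result="S"):
      # One simple type-lifting rule: lift an argument category into a
      # functor over the functions that were seeking it, e.g.
      # NP => S/(S\NP).
      return (result, "/", (result, "\\", cat))

  vp = combine(LEXICON["likes"], LEXICON["hamburgers"])  # => S\NP
  print(combine(LEXICON["Marion"], vp))                  # => S
  print(combine(type_raise("NP"), vp))                   # => S again

Deriving the fronted-object case itself ("Hamburgers I like") would also
need a composition rule, so that "I like" can first form an S/NP for the
raised object to combine with; I've left that out to keep the sketch short.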

I have no particular problem with Categorial Grammar per se. Basically, it just
seems like a notational variant of transformational theories, which uses
type-lifting instead of movement transformations to account for long-distance
dependencies involving displacement (like the relation between the
sentence-initial focussed direct object and the gap after the transitive verb
in "Hamburgers I like __"). My major complaint with Categorial Grammar is that
it's not constrained enough vis-à-vis what kinds of categories words can have.
For example, there's no principled reason why some word in some language
couldn't have the category NP/V/V/V/V/V ("combines with a noun phrase to give
something that combines with a verb to give something that combines with a verb
to give... etc."), but I don't think any such cases exist. I may be out of
date, though: I haven't studied Categorial Grammar in years and perhaps the
issue of overgeneration of categories has been adequately addressed in
subsequent research. (I find that most people who criticize Chomskyan frameworks are at
least 20 years out of date, so I need to be sensitive to this when criticizing
other frameworks.)
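
To put the overgeneration worry in the same concrete terms: in a
representation like the one sketched above, the category constructor nests
freely, so nothing in the notation itself rules out a lexical entry with
an arbitrarily stacked label like the hypothetical NP/V/V/V/V/V (slash
directions aside); any such constraint has to be imposed from outside:

  # Build a five-level nested category; the formalism offers no brake.
  weird = "NP"
  for _ in range(5):
      weird = (weird, "/", "V")
  print(weird)  # perfectly well-formed as far as the encoding cares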

Matt.

Matt Pearson
Department of Linguistics
Reed College
3203 SE Woodstock Blvd
Portland, OR 97202 USA
ph:  503-771-1112 (x 7618)