Re: To Matt Pearson
From: laokou <laokou@...>
Date: Thursday, October 25, 2001, 4:39
From: "Matthew Pearson"
Rant coming, hide yourselves....
> Well, human language is rule-governed, pure and simple, so there's got to
> be a set of rules in our heads somewhere.
Says who? I'm no scientist, but statements of the form "X is true, so Y
*must* be true" are a *hypothesis*, and they fail the scientific-method
litmus test unless one can come up with evidence that Y is true. ("Apples
fall, so oranges must fall" is a hypothesis we can test by seeing whether
oranges fall; as opposed to: apples fall, I've seen an orange fall, apples
and oranges are round, so all oranges *must* fall.) When *we* were in
school :) linguistics profs seemed to want to make linguistics legit as a
science (as opposed to a discipline) by using scientific language, but I
frequently found the Y part lacking.
> Generative grammar seeks to find out what those rules are. What we
> actually *do* with those rules when we speak and understand sentences in
> real time is, in principle (and in practice), a different issue.
While I appreciate that generative (transformational?) grammar seeks to
provide us with paradigms to explain grammar rules that we in the here and
now can understand cognitively, that is no proof that that is how the mind
works. (Processing research sounds intriguing.) Freudian models are/were
useful for explaining dreamlife, but trains into tunnels as sexual
intercourse, all the rage in their day, seem not so applicable anymore
(though I don't think one throws Freud out with the bathwater). And was
that scientific? Here again (and I took those notorious Ling101 classes
back in the 80s which you referred to): the premise was that, like a
computer, the human mind must use the simplest rules available for
language, if for no other reason than efficiency (a specious claim, and
far from the scientific method).
I balked, 'cause the "there" transformation made no sense. (I often hear
the speech 'error' from native speakers, "There's books on the table," but
never the corresponding error in the supposed "core" sentence, "Books are
on the table" [the transformation would involve inversion etc.].) Then
there are sentences like "Let's go, shall we?" and "Stay, won't you?"
which require untold mental gymnastics to find a sequence of
transformations that works. (My linguistics prof offered a free "A" to
anyone who solved that dilemma.) If there's a gaping hole in your
theory...
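Just to make concrete the sort of rule I'm grumbling about, here's a toy
sketch (Python, every name and rule in it my own invention, not anyone's
actual analysis) of what a "there"-insertion transformation might look
like if you took the textbook story literally:

    # Toy "there"-insertion: take a "core" sentence, front the copula,
    # and stick in expletive "there". Purely illustrative.
    def there_insert(core):
        subject, copula, *rest = core
        return ["there", copula, subject, *rest]

    print(" ".join(there_insert(["books", "are", "on", "the", "table"])))
    # -> "there are books on the table"

Note the rule can only ever derive "there are books...", never the
"There's books..." I actually hear from native speakers, which is roughly
my gripe: the 'error' shows up only on the derived side.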
The field may have come a long way in twenty years, and I would genuinely
love to read the next wave of research, but if it takes on the same ol'
hubris of "this is the way it is", then I fear "we haven't come a long way,
baby."
> This is a very common-sense attitude,
as is: "the mind must work economically, like a computer." (I hope you
realize I'm not taking you personally to task for what my prof said 20
years ago.)
> least limited 'look ahead' capacity. But predicting too much is costly: If
> your brain starts to analyze a sentence based on a certain prediction about
> what will come next, and that prediction turns out to be incorrect, then it
> will have to start all over again, and you've wasted time and memory.
Where, then, does finishing other people's sentences fit in? There are
many contexts where it really isn't that difficult to do (and I'm not
talking about spouses or significant others who are used to certain speech
patterns). Either I'm ultra intuitive (NOT!), or there's a whole range of
cues that point to even the specific word (beyond set phrases), rather
than just to some word which expresses the concept (which is also a
possibility).
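(For what it's worth, here's how I picture the "predict and pay for wrong
guesses" idea; a toy sketch in Python, and the bigram bookkeeping is
entirely my own strawman, not anyone's model of the brain: guess the next
word from what you've heard before, and count a "reanalysis" every time
the guess is wrong.)

    from collections import defaultdict, Counter

    # Build a tiny table of "word -> words that have followed it".
    def train(sentences):
        table = defaultdict(Counter)
        for s in sentences:
            words = s.split()
            for a, b in zip(words, words[1:]):
                table[a][b] += 1
        return table

    # Walk through a new sentence, predicting each next word;
    # every wrong prediction costs one "reanalysis".
    def listen(table, sentence):
        words = sentence.split()
        reanalyses = 0
        for a, b in zip(words, words[1:]):
            guess = table[a].most_common(1)
            if guess and guess[0][0] != b:
                reanalyses += 1
        return reanalyses

    table = train(["shall we go", "stay with me", "books are on the table"])
    print(listen(table, "books are on the shelf"))  # -> 1 (tripped up at the end)

The stronger the cues, the fewer wrong guesses, which is all I'm really
claiming about finishing other people's sentences.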
End rant, hopefully reasonably sensical,
Kou