
Re: OT: THEORY Fusion Grammar

From:Gary Shannon <fiziwig@...>
Date:Saturday, July 15, 2006, 20:12
--- Alex Fink <a4pq1injbok_0@...> wrote:

> On Fri, 14 Jul 2006 13:22:05 -0700, Gary Shannon > <fiziwig@...> wrote: > [...] > >Hypothesis: For any natural language, related > elements > >are always immediately adjacent and there exists a > >complete fusion grammar for that language. > > > >Comments? Counterexamples? > > From a formal language theory point of view, the set > of languages with > fusion grammars (without transposition) seems to be > equal to the set of > context-free languages. So how about examples of > non-context-free behaviour > in natural language? > > For example, a quick Google turns up > >
http://www.ling.uni-potsdam.de/~michael/esslli2004/flt.pdf
> containing a potential English counterexample:
>
> | * Bar-Hillel and Shamir (1960):
> |   - English contains copy-language
> |   - cannot be context-free
> | * Consider the sentence
> |     John, Mary, David, ... are a widower, a widow, a widower, ...,
> |     respectively.
> | * Claim: the sentence is only grammatical under the condition that
> |   if the nth name is male (female) then the nth phrase after the
> |   copula is a widower (a widow)
>
> How would you handle this, having for instance 'male noun (phrase)' and
> 'female noun (phrase)' tags? On the other hand, the counterargument is
> later offered that "crossing dependencies triggered by _respectively_ are
> semantic rather than syntactic"; so maybe this is a non-issue.
Interesting problem! I'm going to have to mull that one over for a while. Perhaps introducing the notion of paired sets with one-to-one mapping? Set {John, Mary, David} maps onto set {a widower, a widow, a widower}, where each set is defined to be a single element with a tag like .NSET (noun set) and .ATRSET (attribute set)? Since my main concern is more practical than theoretical (i.e. computerized natural language "understanding"), this may require some kind of ad hoc, theoretically impure approach. My concern is more with determining the meaning of a sentence than determining whether the sentence is grammatically correct. ("Them dogs is mean!" should parse correctly for meaning despite its ungrammatical nature.)
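The paired-set idea above could be sketched like this. Everything here (the function name, the representation of .NSET and .ATRSET as plain lists) is my own illustration, not anything from the actual system:

```python
# Sketch of the paired-set proposal: treat the name list (.NSET) and the
# attribute list (.ATRSET) as single elements, then map them one-to-one
# as "respectively" requires.

def pair_respectively(nset, atrset):
    """Map the nth name onto the nth attribute."""
    if len(nset) != len(atrset):
        raise ValueError("respectively-sets must be the same length")
    return dict(zip(nset, atrset))

nset = ["John", "Mary", "David"]                 # .NSET
atrset = ["a widower", "a widow", "a widower"]   # .ATRSET
print(pair_respectively(nset, atrset))
# {'John': 'a widower', 'Mary': 'a widow', 'David': 'a widower'}
```

The length check is where the grammaticality condition would live: a mismatched pairing is exactly the case Bar-Hillel and Shamir's claim rules out.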
> But then there's this Dutch (again!) example:
>
> | dat Jan Marie Pieter Arabisch laat zien schrijven
> | THAT JAN MARIE PIETER ARABIC LET SEE WRITE
> | 'that Jan let Marie see Pieter write Arabic'
>
> How do you handle this?
There's no doubt this sentence would require some kind of transposition. It clearly falsifies my original hypothesis as it stands.
> From another message:
> > In this case (where Mary's might be a possessive or it
> > might be a contraction of "Mary is") both cases are
> > developed in parallel. The 's would be expanded as
> > both "Mary is" and "Mary OWNEROF" (where "OWNEROF"
> > is a sort of internal possessive particle).
>
> Why treat these as contractions, and expand them? Why not just say there
> are two words "Mary's", with tag types 'SV' (for "Mary is") and whatever
> type "Mary OWNEROF" is?
Good idea. It does simplify the processing. Of course, it doubles the number of dictionary entries for each proper name or noun that could take a possessive 's. On the other hand, if it can be handled in a more generalized manner, then perhaps it should be. I was going to treat "(Mary OWNEROF).PDT" the same way I'd treat "my.PDT" or "her.PDT". (PDT is the possessive determiner tag; a determiner because it can stand in for "the" or "this".) Another alternative would be to simply replace "'s" with the ".PDT" tag and dispense with "OWNEROF" completely. Thus, it would only take two dictionary entries: "*'s -> *.PDT" and "*'s -> * is", where "*" is a wildcard match.
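Those two wildcard entries could be realized as a single expansion step that returns both readings, to be developed in parallel as described earlier. This is only a sketch; the function name and the (word, tag) representation are mine:

```python
# Sketch of the two wildcard entries "*'s -> *.PDT" and "*'s -> * is":
# any word ending in 's is expanded into both readings, and downstream
# parsing keeps whichever reading(s) lead to a complete parse.

def expand_possessive(word):
    """Return every tagged reading of a word, expanding clitic 's."""
    if word.endswith("'s"):
        stem = word[:-2]
        return [
            [(stem, "PDT")],             # possessive-determiner reading
            [(stem, "N"), ("is", "V")],  # contraction of "<stem> is"
        ]
    return [[(word, None)]]              # no expansion; tag unresolved

print(expand_possessive("Mary's"))
```

A nice property of the wildcard form is that the dictionary stays at two entries total, rather than two extra entries per noun.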
> On Fri, 14 Jul 2006 21:29:42 -0700, Gary Shannon <fiziwig@...> wrote:
> >--- Herman Miller <hmiller@...> wrote:
> >
> >> Gary Shannon wrote:
> >>
> >> > Hypothesis: For any natural language, related elements
> >> > are always immediately adjacent and there exists a
> >> > complete fusion grammar for that language.
> >> >
> >> > Comments? Counterexamples?
> >> >
> >>
> >> Did you want counterexamples from English?
> >>
> >> (did ... want)
> >
<snip>
> >The fact that the deep meaning of "did" changes
> >radically with that transposition suggests that
> >transposition rules cannot be applied in this case,
> >and that interpreting "did" as a query-marking
> >particle that properly belongs at the front of the
> >query, rather than a past tense marker, is reasonable.
>
> But "did you want counterexamples from English?" has past tense as well.
> Compare "Do you want...?" and its counterpart "you (do) want...", which
> don't. All the varying placement is really changing is the illocutionary
> force, from an assertion to a question.
I guess "did" and "do" would have to be treated as slightly different versions of the query-marking particle.
> If "did" is a query-marking particle, how do you account for the behaviour
> of auxiliary verbs, which appear moved to the same position in questions?
> "Can you come?" "Will you come?"
I suppose that such auxiliaries would have to appear twice in the dictionary: once as an auxiliary and once as a query-marking particle of a specific nature. That doesn't quite solve the problem, however, since "Can you come?" would have to be marked "QUERY: you can come?" But on the other hand, "Did you want this?" could also be marked "QUERY: You did want this?" So perhaps the rule is to change "AUX NOUN VERB..." into "QUERY: NOUN AUX VERB...". This again violates the original adjacency hypothesis, but it could be argued that the noun (or pronoun) was "embedded" in the complete verb (made up of AUX+VERB), and can be un-embedded and still be adjacent to the complete verb. Another example of such embedding might be "...was slowly drifting...", where the complete verb is (was drifting) and the adverb was embedded within the complete verb, and needs to be extracted and made adjacent: "(was drifting) slowly".
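The "AUX NOUN VERB..." to "QUERY: NOUN AUX VERB..." rewrite could be sketched as a single pre-parse transposition pass. The auxiliary list and the QUERY: marker format are just illustrative assumptions:

```python
# Sketch of the proposed transposition rule: an auxiliary-initial
# sentence is rewritten as a marked query, re-embedding the subject so
# the auxiliary is again adjacent to its verb.

AUX = {"did", "do", "can", "will"}  # illustrative, not exhaustive

def mark_query(tokens):
    """Rewrite 'AUX NOUN VERB...' as 'QUERY: NOUN AUX VERB...'."""
    if tokens and tokens[0].lower() in AUX and len(tokens) >= 3:
        return ["QUERY:", tokens[1], tokens[0]] + tokens[2:]
    return tokens  # declaratives pass through unchanged

print(mark_query(["Can", "you", "come"]))
# ['QUERY:', 'you', 'Can', 'come']
```

After this pass, "you can come" is back to plain adjacency and the ordinary fusion rules can apply, with QUERY: supplying the illocutionary force.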
> From the snippage:
> > Then we parse the marked statement:
> >
> > (((you want).SV counterexamples).SVO (from English)).SVO
>
> It's interesting that you consider the verb to bind more tightly to the
> subject than the object. Mainstream syntax would have the V bind more
> tightly to the O: (you (want counterexamples)).
Now that you mention it, tighter binding to the object does feel more natural.

So far I've been working with the list of 1200 progressively graded sentences from the book "Graded Sentences for Analysis" (Rossman and Mills), and the first few hundred sentences have no direct objects, just pure SV with adjectives, adverbs and tense markers. I've extracted 98 formal rules from those first 200 sentences, and a tag dictionary of 195 words. When I get to section Two (of seven sections in the book) I will have to deal with more and more complications. It will be interesting to see how far I can go without having to introduce too many obnoxious epicycles into the system.

FWIW: So far, the average sentence takes anywhere from 1 to 15 rules to parse. For example:

((the.DET (poor.ADJ (old.ADJ man.N).NQ).NQ).DTN ((limped.V along.DIR).VDR painfully.ADV).VQ).SV

uses rules:

(DTN VQ).SV
(ADJ N).NQ
(ADJ NQ).NQ
(DET NQ).DTN
(V DIR).VDR
(VDR ADV).VQ

--gary
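The fusion process in the example above can be sketched as a small bottom-up parser: adjacent tagged elements fuse when a rule licenses the pair, until one element remains. The rule table is taken from the example; the data structures, scan order, and function name are my own assumptions:

```python
# Minimal sketch of fusion-grammar parsing: each rule says that an
# adjacent (tagA, tagB) pair fuses into one element with a new tag.

RULES = {
    ("ADJ", "N"): "NQ",
    ("ADJ", "NQ"): "NQ",
    ("DET", "NQ"): "DTN",
    ("V", "DIR"): "VDR",
    ("VDR", "ADV"): "VQ",
    ("DTN", "VQ"): "SV",
}

def fuse(tagged):
    """tagged: list of (form, tag) pairs; fuse pairs until stable."""
    items = list(tagged)
    changed = True
    while changed and len(items) > 1:
        changed = False
        # scan right-to-left so inner modifiers fuse before outer ones
        for i in range(len(items) - 2, -1, -1):
            (f1, t1), (f2, t2) = items[i], items[i + 1]
            new_tag = RULES.get((t1, t2))
            if new_tag:
                items[i:i + 2] = [("(%s %s)" % (f1, f2), new_tag)]
                changed = True
                break
    return items

sentence = [("the", "DET"), ("poor", "ADJ"), ("old", "ADJ"),
            ("man", "N"), ("limped", "V"), ("along", "DIR"),
            ("painfully", "ADV")]
print(fuse(sentence))
# [('((the (poor (old man))) ((limped along) painfully))', 'SV')]
```

A real implementation would need to try alternative fusion orders (and alternative tags per word) rather than this single greedy scan, but the greedy version already reproduces the bracketing in the example.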
> In any case, whatever you let the verb fuse with, you'll have trouble
> handling at least one of the word orders SOV (if V fuses to S) or
> VSO (if V fuses to O). Does this call for a transposition rule?
>
> Alex
