Re: lexicon
From: The Gray Wizard <dbell@...>
Date: Wednesday, April 4, 2001, 20:02
> From: Tim Judge, Erion Telepalda
>
> Hey, does anyone know where I can find out the minimum number of words
> I need for a "complete" language? English dictionaries are useless for
> this (and much too big), and I don't know any other language well enough
> to determine exactly how useful the words are.
This question gets asked a lot on this list. Personally, I think it's the
wrong way to go. This approach almost guarantees a relex of the language
from which the minimum list of words is derived. A better question is what
is the minimum number of "concepts" needed. Of course, this is a more
difficult question to answer. I have found that WordNet synsets are
somewhat helpful in this regard. A project that has been on my to-do list
for a long time is to map amman iar (my conlang) words to WordNet synsets
and then to disambiguate those mappings that appear to have obvious English
semantic influences. One could go to the extreme, as Mark Line did with his
Classical Yiklamu, and create a separate word for every WordNet concept, but
there is room for both compromise and creativity in this approach.
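For what it's worth, here is a minimal sketch of what such a mapping could
look like, assuming NLTK's WordNet interface; the amman iar entry and its
gloss are invented for illustration:

    from nltk.corpus import wordnet as wn  # pip install nltk; nltk.download('wordnet')

    # Hypothetical conlang entry: form -> English gloss used as the lookup key.
    lexicon = {"mir": "jewel"}

    # List the candidate synsets for each word; the conlanger then picks one
    # (or deliberately none), disambiguating away from the English gloss.
    for form, gloss in lexicon.items():
        for syn in wn.synsets(gloss):
            print(form, "->", syn.name(), ":", syn.definition())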
OTOH, I tend to agree with many others who have responded to this thread
that creating words as they are needed is perhaps a better approach. In
this way, words can be created that fit the concept you are trying to
express, and you have much more freedom to experiment with semantic mappings
that differ from those of your L1.
IMO, a good derivational system is key to lexical productivity. Starting with
a good set of roots representing general concepts, a wealth of new and
related words can be generated. This was my approach with amman iar: I
systematically applied the derivational system to my list of roots and
adjusted for morphophonemic processes. This generated an inventory of some
20,000+ potential lexemes, complete with potential glosses. These were
promoted into the lexicon as they were needed; often I was able to promote
whole networks of semantically related lexemes at a time.
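To make that concrete, here is a rough sketch of the expansion, with invented
roots, affixes, glosses, and a toy morphophonemic rule standing in for amman
iar's actual (far richer) system:

    # Everything below is hypothetical; it only mirrors the shape of the process.
    roots = {"mal": "gold", "sil": "shine"}
    affixes = {"-on": "one who works with", "-ath": "a collection of"}

    def adjust(stem, suffix):
        """Toy morphophonemic rule: simplify a doubled consonant at the seam."""
        s = suffix.lstrip("-")
        return (stem[:-1] if stem.endswith(s[0]) else stem) + s

    # Cross every root with every affix to build the inventory of
    # *potential* lexemes, each carrying a machine-suggested gloss.
    inventory = {}
    for root, root_gloss in roots.items():
        for suffix, template in affixes.items():
            inventory[adjust(root, suffix)] = template + " " + root_gloss

    for form in sorted(inventory):
        print(form, "-", inventory[form])  # e.g. malon - one who works with gold

With a few dozen roots and a full affix set, this cross-product easily reaches
the tens of thousands of candidates mentioned above.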
Occasionally, I rejected a lexeme suggested by the inventory as not quite
right. Fortunately, the derivational system is robust enough to provide
multiple paths to the same semantic result, so a rejection usually just meant
taking a different derivational path. During promotion, I extended the
suggested glosses into "real definitions", adding synonyms culled from
alternative derivations. Of course, the process never went quite as smoothly
as I have just implied. Often I would change the form of an affix, and
sometimes even its semantics; this required a complete regression through my
approved lexicon to ensure consistency (a rough sketch of such a check
follows below). A small sample of the end result can be seen on my site at
http://www.graywizard.net/Conlinguistics/amman_iar/ai_lexicon.htm.
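For completeness, a sketch of that regression, again with hypothetical names:
after an affix changes, regenerate the inventory and flag any approved entry
the system can no longer derive:

    # Hypothetical data: 'approved' is the promoted lexicon; 'inventory' is the
    # freshly regenerated root-x-affix expansion after reshaping -on to -un.
    approved = {"malon": "goldsmith"}
    inventory = {"malun": "one who works with gold"}

    # Approved forms the new system no longer generates need review:
    stale = {form: gloss for form, gloss in approved.items()
             if form not in inventory}
    print(stale)  # {'malon': 'goldsmith'} -> re-derive (perhaps as 'malun') or reject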
David
David E. Bell
The Gray Wizard
www.graywizard.net
Wisdom begins in wonder.