Word formation processes: Ways of creating new words in English
1. Affixation: adding a derivational affix to a word. Examples: abuser, refusal, untie, inspection, pre-cook.
2. Compounding: joining two or more words into one new word. Examples: skateboard, whitewash, cat lover, self-help, red-hot, etc.
3. Zero derivation (also called conversion or functional shift): adding no affixes; simply using a word of one category as a word of another category. Examples (noun to verb): comb, sand, knife, butter, referee, proposition.
4. Stress shift: no affix is added to the base, but the stress is shifted from one syllable to the other. With the stress shift comes a change in category.
Noun         Verb
cómbine      combíne
ímplant      implánt
réwrite      rewríte
tránsport    transpórt
Noun         Adjective
cóncrete     concréte
ábstract     abstráct
5. Clipping: shortening of a polysyllabic word. Examples: bro (< brother), pro (< professional), prof (< professor), math (< mathematics), veg (< 'vegetate', as in veg out in front of the TV), sub (< substitute or submarine).
6. Acronym formation: forming words from the initials of a group of words that designate one concept. Usually, but not always, capitalized. An acronym is pronounced as a word if the consonants and vowels line up in such a way as to make this possible, otherwise it is pronounced as a string of letter names. Examples: NASA (National Aeronautics and Space Administration), NATO (North Atlantic Treaty Organization), AIDS (Acquired Immune Deficiency Syndrome), scuba (self-contained underwater breathing apparatus), radar (radio detecting and ranging), NFL (National Football League), AFL-CIO (American Federation of Labor-Congress of Industrial Organizations).
7. Blending: parts (which are not morphemes!) of two already-existing words are put together to form a new word. Examples: motel (motor & hotel), brunch (breakfast & lunch), smog (smoke & fog), telethon (television & marathon), modem (modulator & demodulator), Spanglish (Spanish & English).
8. Backformation: a suffix identifiable from other words is cut off of a base which has previously not been a word; that base then is used as a root and becomes a word through widespread use. Examples: pronunciate (< pronunciation < pronounce), resurrect (< resurrection), enthuse (< enthusiasm), self-destruct (< self-destruction < destroy), burgle (< burglar), attrit (< attrition), burger (< hamburger). This differs from clipping in that, in clipping, some phonological part of the word which is not interpretable as an affix or word is cut off (e.g. the '-essor' of 'professor' is not a suffix or word, nor is the '-ther' of 'brother'). In backformation, the bit chopped off is a recognizable affix or word ('ham' in 'hamburger', '-ion' in 'self-destruction'). Backformation is the result of a false but plausible morphological analysis of the word; clipping is a strictly phonological process used to make the word shorter. Clipping is based on syllable structure, not morphological analysis. You cannot recognize backformed words or come up with examples from your own knowledge of English unless you already know the history of the word. Most people do not know the history of the words they know; this is normal.
9. Adoption of brand names as common words: a brand name becomes the name for the item or process associated with it. The word ceases to be capitalized and acts as a normal verb or noun (i.e. takes inflections such as plural or past tense). The companies that own the names have usually trademarked them and object to their use in public documents, so they should be avoided in formal writing (or a lawsuit could follow!). Examples: xerox, kleenex, band-aid, kitty litter.
10. Onomatopoeia (pronounced: 'onno-motto-pay-uh'): words are invented which (to native speakers at least) sound like the sound they name or the entity which produces the sound. Examples: hiss, sizzle, cuckoo, cock-a-doodle-doo, buzz, beep, ding-dong.
11. Borrowing: a word is taken from another language. It may be adapted to the borrowing language's phonological system to varying degrees. Examples: skunk, tomato (from indigenous languages of the Americas), sushi, taboo, wok (from Pacific Rim languages), chic, shmuck, macho, spaghetti, dirndl, psychology, telephone, physician, education (from European languages), hummus, chutzpah, cipher, artichoke (from Semitic languages), yam, tote, banana (from African languages).
A perennial problem in semantics is the delineation of its subject
matter. The term meaning can be used in a variety of ways, and only some of
these correspond to the usual understanding of the scope of linguistic or
computational semantics. We shall take the scope of semantics to be restricted to the literal interpretations of sentences in a
context, ignoring phenomena like irony, metaphor, or conversational implicature
.
A standard assumption in computationally oriented semantics is that
knowledge of the meaning of a sentence can be equated with knowledge of its
truth conditions: that is, knowledge of what the world would be
like if the sentence were true. This is not the same as knowing whether a
sentence is true, which is (usually) an empirical matter, but knowledge of
truth conditions is a prerequisite for such verification to be possible.
Meaning as truth conditions needs to be generalized somewhat for the case of imperatives or questions, but it is common ground among all contemporary theories, in one form or another, and has extensive philosophical justification.
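To make this notion concrete, here is a minimal sketch (not from the original text; the sentences, facts, and representation of a 'world' as a set of atomic facts are all invented for illustration) in which the meaning of a sentence is modelled as a function from worlds to truth values:

    # Minimal sketch of "meaning as truth conditions" (illustrative only):
    # the meaning of a sentence is a function from a world (here, just a
    # set of atomic facts) to a truth value. One can know this function
    # without knowing which world is the actual one.

    def meaning_it_is_raining(world):
        # True in exactly those worlds that contain the fact "rain"
        return "rain" in world

    def meaning_it_is_raining_and_cold(world):
        # Truth conditions of a conjunction, built from those of its parts
        return meaning_it_is_raining(world) and ("cold" in world)

    # Verifying a sentence against a particular (hypothetical) actual world
    # is the separate, empirical step:
    actual_world = {"rain"}
    print(meaning_it_is_raining(actual_world))           # True
    print(meaning_it_is_raining_and_cold(actual_world))  # False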
A semantic description of a language is some finitely stated mechanism that allows us to
say, for each sentence of the language, what its truth conditions
are. Just as for grammatical description, a semantic theory
will characterize complex and novel sentences on the basis of their
constituents: their meanings, and the manner in which they are put together.
The basic constituents will ultimately be the meanings of words and morphemes. The modes of combination of constituents are largely determined
by the syntactic structure of the language. In general, to each
syntactic rule combining some sequence of child constituents into a parent
constituent, there will correspond some semantic operation combining the
meanings of the children to produce the meaning of the parent.
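As an illustration of this rule-to-rule correspondence, the following toy sketch (my own construction, with an invented two-entry lexicon and model) pairs the syntactic rule S -> NP VP with a semantic operation that applies the VP's meaning, a predicate, to the NP's meaning, an individual:

    # Toy illustration of the rule-to-rule idea (invented example): each
    # syntactic rule that builds a parent constituent from its children is
    # paired with a semantic operation over the children's meanings.

    # Word meanings: individuals and predicates in a tiny invented model.
    lexicon = {
        "London":    "london",
        "Reykjavik": "reykjavik",
        "is cold":   lambda x: x in {"reykjavik"},   # the set of cold cities
    }

    def combine_S(np_meaning, vp_meaning):
        # Syntactic rule S -> NP VP, paired with the semantic operation
        # "apply the VP predicate to the NP individual".
        return vp_meaning(np_meaning)

    # "Reykjavik is cold" / "London is cold": sentence meanings computed
    # from the meanings of their constituents.
    print(combine_S(lexicon["Reykjavik"], lexicon["is cold"]))  # True
    print(combine_S(lexicon["London"], lexicon["is cold"]))     # False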
Some natural language
processing tasks (e.g., message routing, textual information
retrieval, translation) can be carried out
quite well using statistical or pattern matching techniques
that do not involve semantics in the sense assumed above. However, performance
on some of these tasks improves if semantic processing is involved. (Not enough
progress has been made to see whether this is true for all of the tasks).
Some tasks, however, cannot be carried out at
all without semantic processing of some form. One important example application
is that of database query, of the type chosen for the Air
Travel Information Service (ATIS) task [DAR89].
For example, if a user asks, "Does every flight from London to San Francisco stop over in Reykjavik?", then the system needs to be able to deal with some simple semantic facts. Relational databases do not store propositions of the form 'every X has property P', so a logical inference from the meaning of the sentence is required. In this case, 'every X has property P' is equivalent to 'there is no X that does not have property P', and a system that knows this will therefore also know that the answer to the question is no if a non-stopping flight is found and yes otherwise.
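A rough sketch of that inference step, with invented flight records and field names (the real ATIS data and query machinery are of course far richer), might look like this:

    # Sketch of the inference: the database stores individual flights, not
    # universally quantified propositions, so "every flight stops in
    # Reykjavik" is answered by checking that no flight fails to stop there.
    flights = [
        {"origin": "London", "dest": "San Francisco", "stops": ["Reykjavik"]},
        {"origin": "London", "dest": "San Francisco", "stops": []},  # non-stop
    ]

    def every_flight_stops_in(city, rows):
        # "every X has property P" iff "there is no X lacking property P"
        return not any(city not in row["stops"] for row in rows)

    print("yes" if every_flight_stops_in("Reykjavik", flights) else "no")  # no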
Any kind of generation of natural language output (e.g., summaries of
financial data, traces of KBS system operations) usually requires semantic
processing. Generation requires the construction of an appropriate meaning
representation, and then the production of a sentence or sequence of sentences
which express the same content in a way that is natural for a reader to
comprehend, e.g., [MKS94].
To illustrate, if a database lists a 10 a.m. flight from London to Warsaw on the 1st-14th and 16th-30th of November, then it is more helpful to answer the question "What days does that flight go?" with "Every day except the 15th" rather than by listing the days one by one. But to do this the system needs to know that the semantic representations of the two propositions are equivalent.
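A small sketch of that equivalence check, with invented function names and an arbitrary threshold for choosing the 'except' phrasing, is given below; a real generator would derive this from the semantic representations rather than from raw day lists:

    # Sketch of the reasoning behind "Every day except the 15th": the set of
    # days the flight goes is compared with the full set of days in the month,
    # and the shorter "except" description is preferred when the complement
    # is small. Function names and the threshold are invented.
    def summarise_days(days_flight_goes, days_in_month=30):
        missing = sorted(set(range(1, days_in_month + 1)) - set(days_flight_goes))
        if not missing:
            return "Every day"
        if len(missing) <= 3:  # arbitrary threshold for this illustration
            # naive ordinal suffix; good enough for "15th"
            return "Every day except the " + " and the ".join(f"{d}th" for d in missing)
        return ", ".join(str(d) for d in sorted(days_flight_goes))

    # Flight listed on the 1st-14th and 16th-30th of November:
    november = list(range(1, 15)) + list(range(16, 31))
    print(summarise_days(november))  # Every day except the 15th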