The study of how meaning in language is created by the use and interrelationships
of words, phrases, and sentences. The term is defined in various glossaries as follows:
- The semantics of a programming language describe the relationship between its
syntactic elements and its model of computation. Different languages have different
semantic features (over and above differences in lexical semantics).
- In metadata, the definitions of the meaning of metadata elements, as opposed to the
rules for encoding or representing the values of those elements; see also syntax.
- Word meanings, including patterns of associated words and concepts.
- The names and meanings of metadata elements. Source: NISO, Understanding ...
- A relationship between words, phrases, or any other allowable construct and their
actual meaning. This is in contrast to "syntax." ...
- The branch of linguistics which studies meaning in language.
- The study of meaning and the development of meanings of words.
- The meanings assigned to symbols and sets of symbols in a language.
- The study of meaning (corresponding to all lexical items and to all possible ...)
- The study of meaning in language, including the relationship between language,
thought, and behavior.
- The meaning of a word, phrase, sentence, or text; "a petty argument about ..."
Logical Semantics
Let us introduce a very simple formal language L. It will use three variables, x, y, and z;
one one-place relation, m; two two-place relations, M and P; and the standard logical
operation of negation, ¬. We define the syntax of L by the following rules of term
formation:
m⟨variable⟩, e.g., mx;
⟨variable⟩M⟨variable⟩ (and likewise for P), e.g., xMy;
¬⟨term⟩, e.g., ¬xPy.
If we now interpret the variables as people, m as male, M as married, and P as parent, the
examples above will mean, respectively:
x is male;
x is married to y; and
x is not a parent of y.
L now has semantics. Of sorts.
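The syntax and semantics of L can be made concrete in a short Python sketch (an illustration, not from the original text; the people John, Mary, and Sue and the relation sets are invented examples, and ¬ is written as "~"):

```python
# A toy implementation of the formal language L: variables x, y, z;
# one-place relation m ("male"); two-place relations M ("married to")
# and P ("parent of"); negation written "~" for ASCII convenience.
import re

VARS = "xyz"

def well_formed(term: str) -> bool:
    """Check a string against L's term-formation rules."""
    if term.startswith("~"):                         # ~<term>
        return well_formed(term[1:])
    if re.fullmatch(f"m[{VARS}]", term):             # m<variable>, e.g. mx
        return True
    if re.fullmatch(f"[{VARS}][MP][{VARS}]", term):  # <var>M<var> / <var>P<var>
        return True
    return False

def interpret(term, assignment, male, married, parent) -> bool:
    """Evaluate a well-formed term of L under an interpretation."""
    if term.startswith("~"):
        return not interpret(term[1:], assignment, male, married, parent)
    if term[0] == "m":
        return assignment[term[1]] in male
    a, rel, b = assignment[term[0]], term[1], assignment[term[2]]
    return (a, b) in (married if rel == "M" else parent)

# A hypothetical interpretation: three people and their relations.
assignment = {"x": "John", "y": "Mary", "z": "Sue"}
male = {"John"}
married = {("John", "Mary"), ("Mary", "John")}
parent = {("John", "Sue"), ("Mary", "Sue")}

print(interpret("mx", assignment, male, married, parent))    # x is male
print(interpret("~xPy", assignment, male, married, parent))  # x is not a parent of y
```

Under this interpretation, mx is true (John is male) and ¬xPy is true (John is not a parent of Mary), matching the informal readings given above.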
Philosophical Semantics
This discipline is interested in the truth values of the propositions expressed by sentences,
e.g., the sentence Connecticut is a USA state expresses a proposition that is true (T) while
the sentence Canada is a USA state expresses a proposition that is false (F). It shares with
logical semantics an interest in truth preservation and formulates such rules as T & T = T,
i.e., if two propositions are true then their conjunction is also true.
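The truth-preservation rule T & T = T can be verified mechanically; the sketch below (illustrative, not from the text) tabulates conjunction over all truth-value pairs and applies it to the two example propositions:

```python
# Truth table for conjunction: T & T = T, and F in every other row.
from itertools import product

for p, q in product([True, False], repeat=2):
    print(f"{p!s:5} & {q!s:5} = {p and q}")

# The example propositions from the text:
connecticut_is_usa_state = True   # T
canada_is_usa_state = False       # F
print(connecticut_is_usa_state and canada_is_usa_state)  # T & F = F
```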
Linguistic Semantics
This discipline, of which ontological semantics is the most advanced school,
studies the meaning of sentences and texts as they are understood intuitively by native
speakers. Because native speakers have internalized large lexicons, based presumably on
a large ontology, as well as the rules combining word meanings into sentence meanings,
making inferences, etc., ontological semantics has committed a major effort to the
acquisition of such resources and discovery of such rules. This approach is called
representational. Most linguistic semanticists do not have the know-how or resources to
practice this approach and, intimidated by the computer scientists and engineers
dominating computational/NLP semantics, attempt to take a shortcut into the
nonrepresentational approach by replacing the resources and the rules with logical or
statistical approaches to linguistic meaning. The currently still dominant "formal semantics,"
a combination of logical and philosophical semantics from above, replaces the meaning
of the sentence as the goal of its study with the truth value, thus severely limiting the
scope of linguistic meaning to what can be easily logicized. Thus, they can handle the
meaning of every in every chair by applying the universal quantifier
("all") to it but are
incapable of accounting for the meaning of chair. Similarly, they easily represent John
loves Mary with love (John, Mary) and John hates Mary with hate (John, Mary) but
cannot access the meanings of love or hate. What the formal semanticists cannot account
for, which is most of linguistic semantics, they define out of semantics and out of the
scope of formal methods.
Semantic fields
In studying the lexicon of English (or any language) we may group together lexemes
which inter-relate, in the sense that we need them to define or describe each other. For
example, we can see how such lexemes as cat, feline, moggy, puss, kitten, tom, queen, and
miaow occupy the same semantic field. We can also see that some lexemes will occupy
many fields: noise will appear in semantic fields for acoustics, pain or discomfort and
electronics (noise = “interference”). Although such fields are not clear-cut and coherent,
they are akin to the kind of groupings children make for themselves in learning a
language. An entertaining way to see how we organize the lexicon for ourselves is to play
word-association games.
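The overlap of fields described above can be modeled as sets of lexemes; in this sketch (an illustration, with field members beyond those in the text invented for the example), one lexeme may belong to several fields, as noise does:

```python
# Semantic fields as overlapping sets of lexemes. A lexeme such as
# "noise" occupies several fields at once; "kitten" occupies only one.
semantic_fields = {
    "cat":         {"cat", "feline", "moggy", "puss", "kitten", "tom", "queen", "miaow"},
    "acoustics":   {"noise", "sound", "echo"},          # extra members are invented
    "discomfort":  {"noise", "pain", "ache"},
    "electronics": {"noise", "interference", "signal"},
}

def fields_of(lexeme: str):
    """Return every semantic field a lexeme occupies, alphabetically."""
    return sorted(name for name, field in semantic_fields.items() if lexeme in field)

print(fields_of("noise"))   # ['acoustics', 'discomfort', 'electronics']
print(fields_of("kitten"))  # ['cat']
```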
In linguistics, semantics is the subfield that is devoted to the study of meaning, as
inherent at the levels of words, phrases, sentences, and even larger units of discourse
(referred to as texts). The basic area of study is the meaning of signs, and the study of
relations between different linguistic units: homonymy, synonymy, antonymy, polysemy,
paronyms, hypernymy, hyponymy, meronymy, metonymy, holonymy, exocentricity /
endocentricity, linguistic compounds. A key concern is how meaning attaches to larger
chunks of text, possibly as a result of the composition from smaller units of meaning.
Traditionally, semantics has included the study of connotative sense and denotative
reference, truth conditions, argument structure, thematic roles, discourse analysis, and the
linkage of all of these to syntax.
Formal semanticists are concerned with the modeling of meaning in terms of the
semantics of logic. Thus a sentence such as John loves a bagel can be broken down into
its constituents (signs), of which the unit loves may serve as both syntactic and semantic
head.
In the late 1960s, Richard Montague proposed a system for defining semantic entries in
the lexicon in terms of lambda calculus. Thus, the syntactic parse of the sentence above
would now indicate loves as the head, and its entry in the lexicon would point to the
arguments as the agent, John, and the object, bagel, with a special role for the article "a"
(which Montague called a quantifier). This resulted in the sentence being associated with
the logical predicate loves (John, bagel), thus linking semantics to categorial grammar
models of syntax. The logical predicate thus obtained would be elaborated further, e.g.,
using truth-theory models, which ultimately relate meanings to a set of Tarskian
universals, which may lie outside the logic. The notion of such meaning atoms or
primitives is basic to the language of thought hypothesis from the 1970s.
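The compositional idea behind this treatment can be sketched with Python lambdas (a toy illustration, not Montague's actual fragment; the quantificational role of "a" is deliberately simplified away):

```python
# Montague-style composition with functions: the transitive verb denotes
# a curried function that first takes its object, then its subject.
# loves = λobject. λsubject. loves(subject, object)
loves = lambda obj: lambda subj: f"loves({subj}, {obj})"

# Simplification: "a bagel" is treated as the constant term "bagel",
# ignoring the quantifier Montague assigned to the article "a".
john, bagel = "John", "bagel"

# Parse of "John loves a bagel": build the VP "loves a bagel" first,
# then apply it to the subject.
vp = loves(bagel)     # λsubject. loves(subject, bagel)
sentence = vp(john)
print(sentence)       # loves(John, bagel)
```

The order of application mirrors the syntactic parse: the verb combines with its object to form a verb phrase, which then combines with the subject, yielding the logical predicate loves (John, bagel) described above.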
Despite its elegance, Montague grammar was limited by the context-dependent
variability in word sense, and led to several attempts at incorporating context, such as:
- situation semantics (1980s): truth values are incomplete and get assigned based
on context;
- the generative lexicon (1990s): categories (types) are incomplete and get assigned
based on context.