Class Notes #10c: Semantics
CSI 4106, Winter 2005

NLP — semantics
Points
Semantic analysis
• Semantic markers
Case analysis
• Syntactic patterns
• Case lists
• An algorithm
Quantifier scope
A taste of discourse analysis
A look at pragmatics
Semantic analysis
Semantic analysis may follow parsing: map a parse
tree (a syntactic structure) into a representation of
meaning (a knowledge structure).
Semantics resides on both sides of parsing, and
elements of meaning come from words. Lexical
knowledge lives in dictionaries. It has two forms.
• Morphological and syntactic information about the
word: part-of-speech (class), number, case, gender,
tense, requirements (for verbs), and so on.
• Semantic information about the word, for example,
a semantic marker that locates — in a hierarchy of
concepts — the concept that the word denotes.
Semantic markers
Suppose that a dictionary entry contains both syntactic
and semantic information. (The verb-pattern field is
unused for classes other than verbs.) For example:
lexicon( Word, Class, SyntCategories, Root, VerbPattern, Semantics ).

The word “ball” could have at least these two entries:

lexicon( ball, verb, [inf, pres], ball, trans, [makeBall] ).
lexicon( ball, noun, [sg], ball, _unused, [sportsEquipment, dance] ).
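
For instance, a lookup query could retrieve the noun entry
(a sketch, assuming the facts above are loaded in a Prolog system):

?- lexicon( ball, noun, Cats, Root, _, Sem ).
Cats = [sg], Root = ball, Sem = [sportsEquipment, dance].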
Semantic markers (2)
Here is where the two noun senses of “ball” (the semantic
markers sportsEquipment and dance) fit in a possible hierarchy:
physical object
    artifact
        equipment
            sports equipment
        tool
        ...
    natural object
    ...
event
    social event
        charity
        entertainment
        dance
        ...
    natural event
    ...
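
A minimal sketch of how such a hierarchy could be stored and
searched in Prolog; the isa/2 facts and the below/2 predicate
are illustrative, not part of the course code:

% Each fact links a marker to its parent in the hierarchy.
isa( artifact, physicalObject ).
isa( naturalObject, physicalObject ).
isa( equipment, artifact ).
isa( tool, artifact ).
isa( sportsEquipment, equipment ).
isa( socialEvent, event ).
isa( naturalEvent, event ).
isa( dance, socialEvent ).

% below( M1, M2 ): marker M1 is M2 itself or lies under M2.
below( M, M ).
below( M1, M2 ) :- isa( M1, Parent ), below( Parent, M2 ).

A query such as ?- below( dance, event ). succeeds, which is
exactly the check a selectional restriction needs.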
Case analysis
It is one of many methods of semantic analysis, based on
familiar ideas: recognize a general situation (denoted by a
verb) and the roles in this situation (usually denoted by
noun phrases).
Examples of syntactic verb patterns
intransitive (subject, verb)
Jim laughed.
transitive (subject, verb, object)
Jim found a penny.
bitransitive (subject, verb, indirect object, object)
Jim gave a penny to Jill.
to-inf (subject, verb, infinitive clause)
Jim wanted to laugh.
object + to-inf (subject, verb, object, infinitive clause)
Jim wanted Jill to laugh.
for-object + to-inf (subject, verb, for object, infinitive clause)
Jim waited for Jill to laugh.
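
These patterns could fill the VerbPattern field of the lexicon
entries shown earlier; the facts below are only an illustration
(the pattern names are assumed, not taken from the course code):

verbPattern( laugh, intrans ).      % Jim laughed.
verbPattern( find,  trans ).        % Jim found a penny.
verbPattern( give,  bitrans ).      % Jim gave a penny to Jill.
verbPattern( want,  toInf ).        % Jim wanted to laugh.
verbPattern( want,  objToInf ).     % Jim wanted Jill to laugh.
verbPattern( wait,  forObjToInf ).  % Jim waited for Jill to laugh.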
Case analysis (2)
Examples of semantic verb patterns
Agent
(if the subject is animate then the subject → Agent)
laughed( Jim )
Agent + Object
(if the subject is animate then
the subject → Agent, the object → Object)
found( Jim, penny )
Agent + Object + Beneficiary
(if the subject is animate then the subject → Agent;
if the indirect object is animate then the indirect object → Beneficiary;
the object → Object)
gave( Jim, penny, Jill )
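
A minimal Prolog sketch of such mapping rules, assuming
hypothetical parse-tree accessors subject/2, directObject/2,
indirectObject/2 and an animate/1 test (none of these names
come from the course code):

% An animate subject fills the Agent case.
case( Parse, agent, Subj ) :-
    subject( Parse, Subj ),
    animate( Subj ).
% An animate indirect object fills the Beneficiary case.
case( Parse, beneficiary, IObj ) :-
    indirectObject( Parse, IObj ),
    animate( IObj ).
% The direct object fills the Object case.
case( Parse, object, Obj ) :-
    directObject( Parse, Obj ).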
Case analysis (3)
Another example of semantic verb patterns
Agent + Content
(if subject is animate then subject → Agent;
subordinate sentence → Content)
wanted( Jim,  “Jim laugh” )
wanted( Jim,  “Jill laugh” )
We need some form of pointer to the semantic
structure of the embedded sentence. [Recall the
“boxed” propositions in the conceptual graph notation.]
Case analysis (4)
Lists of cases used in NLP systems usually have more
than a few elements. Here is an example.
Participant cases
Accompaniment, Agent, Beneficiary, Exclusion,
Experiencer, Instrument, Object, Recipient
Causality cases
Cause, Effect, Opposition, Purpose
Spatial cases
Direction, LocationAt, LocationFrom, LocationTo,
LocationThrough, Orientation, Order
Temporal cases
Frequency, TimeAt, TimeFrom, TimeThrough, TimeTo
Quality cases
Content, Manner, Material, Measure
Case analysis (5)
A case marker is a syntactic element that signals the
presence of a case. A preposition (in, at, from, of, for, ...)
may mark cases. The position of a noun phrase (subject,
direct object, indirect object) also marks cases.
Subject
  Agent: Jim hit the ball.
  Experiencer: Jim grew hungry as time passed.
  Instrument: The ball broke the window.
  Cause: The wind broke the window with a branch.
Indirect object
  Recipient: I threw the dog a ball.
  Beneficiary: I wrote her a reference letter to her boss.
Direct object
  Object: John hits the ball.
Case analysis (6)
A few examples of markers that mark exactly one case.
LocationThrough: We walked around the courtyard.
Manner: She acted as my agent last year.
LocationAt: Sit beside me.
Exclusion: Everyone was pleased except her.
Opposition: They persisted despite my warning.
TimeFrom: He has been sick since the accident.
TimeTo: We worked till dawn.
Case analysis (7)
Examples of markers that mark many cases: at, for.

at:
  Direction: The deer ran right at the hunters.
  LocationAt: I stood at the door.
  TimeAt: The case will be heard at noon.
  Manner: The car moves at high speed.
  Content: She is good at arts.
  Measure: It stopped at fifty.
  Cause: She was amazed at his insolence.

for:
  LocationTo: Aim for the heart.
  Direction: Run for the train.
  Content: I stand for social responsibility.
  TimeThrough: They worked for three hours.
  Beneficiary: I'd walk a mile for them.
  Purpose: This drug is for people with the flu.
  Measure: Sell it for fifty dollars.
  Cause: He received a medal for courage.
  Recipient: This mail is for everyone.
  TimeAt: Call him for ten o'clock.
Case analysis (8)
An algorithm for case analysis
1. In the parse tree, identify all case markers.
2. Find the case patterns of the main verb (assume a
knowledge base of patterns!).
3. Apply rules — based on lexical, syntactic and semantic
features — to match case markers with cases.
Examples of rules [see “Case analysis (2)” and “(3)” for more]
active sentence, animate subject: subject → Agent
Jim laughed.
passive sentence, inanimate subject: subject → Object
The window was broken.
passive sentence, animate subject: subject → Experiencer
Jim was detained.
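
The same rules written as a Prolog sketch, again with the
assumed helpers voice/2, subject/2 and animate/1:

% Active sentence, animate subject: the subject is the Agent.
subjectCase( Parse, agent ) :-
    voice( Parse, active ),
    subject( Parse, S ),
    animate( S ).
% Passive sentence, inanimate subject: the subject is the Object.
subjectCase( Parse, object ) :-
    voice( Parse, passive ),
    subject( Parse, S ),
    \+ animate( S ).
% Passive sentence, animate subject: the subject is the Experiencer.
subjectCase( Parse, experiencer ) :-
    voice( Parse, passive ),
    subject( Parse, S ),
    animate( S ).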
Quantifier scoping
Every author wrote a book.
∀a ∃b author(a) ∧ book(b) ∧ wrote(a, b)
skolemize: ∀a author(a) ∧ book(s(a)) ∧ wrote(a, s(a))
∃b ∀a author(a) ∧ book(b) ∧ wrote(a, b)
skolemize: ∀a author(a) ∧ book(B0) ∧ wrote(a, B0)
Only one scoping is correct: which one?
The man picked up all papers.
THE m ∀p man(m) ∧ paper(p) ∧ pickedUp(m, p)
∀p THE m man(m) ∧ paper(p) ∧ pickedUp(m, p)
A simple algorithm: fixed precedence, for example,
the > each > what, who, whom > every, all, some, a
But there is no universally accepted, objective ordering.
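
The fixed-precedence idea can be sketched in a few lines of
Prolog; the precedence values and predicate names below are
assumptions for illustration. Lower numbers take wider scope.

precedence( the,   1 ).
precedence( each,  2 ).
precedence( what,  3 ).
precedence( who,   3 ).
precedence( whom,  3 ).
precedence( every, 4 ).
precedence( all,   4 ).
precedence( some,  4 ).
precedence( a,     4 ).

% scopeOrder( +NPs, -Ordered ): order determiner-tagged noun
% phrases widest scope first, according to the table above.
scopeOrder( NPs, Ordered ) :-
    findall( P-NP,
             ( member( NP, NPs ), NP = np(Det, _), precedence( Det, P ) ),
             Keyed ),
    keysort( Keyed, Sorted ),
    stripKeys( Sorted, Ordered ).

stripKeys( [], [] ).
stripKeys( [_-NP | T], [NP | Rest] ) :- stripKeys( T, Rest ).

For “The man picked up all papers”:
?- scopeOrder( [np(all, papers), np(the, man)], O ).
O = [np(the, man), np(all, papers)]
i.e., “the” outscopes “all”, matching the first formula above.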
A taste of discourse analysis
Text units beyond sentences — examples
• A story (such as a fairy tale, a drama, ...).
• A news item.
• Dialogue.
• Technical text (manual, textbook, documentation).
• A document in a document base (abstract, patent
description, ...).
Links between sentences/phrases in a larger text
• Textual ordering.
• Temporal link (for example, an event precedes
another event).
Jim saw the bus. He ran to catch it.
“saw” precedes “ran”
... discourse analysis (2)
• Causal link (for example, reason, effect, prerequisite).
Jim saw the bus pull away. He waved to the driver.
“waved” could be an effect of “saw”
• Coreference: linking references to the same entity.
Jim bought a book. He liked it a lot.
“he” = Jim, “it” = book (and “bought” precedes “liked”)
Jim bought a book. The price was good.
price is a property of books (and it enables buying)
Jim bought a book. He paid $10.
paying is an element of (is included in) buying
Jim bought a book. The dust-jacket was red.
dust-jackets are parts of books
A look at pragmatics
Focus
Here is one tiny example from a hypothetical NLP
interface to an airline reservation system:
I want to fly to Vancouver tomorrow night.
There is a flight at 6.
When does it arrive?
At 8 local time.
Is it WestJet?
No, Air Canada.
Show me others.   ← shift of focus
Modelling beliefs: who knows what, who believes what.
This can be done formally, in advanced forms of logic,
for example in autoepistemic logic (check it out).
... pragmatics (2)
Plan-based understanding
We can use scripts (see textbook, section 7.1.4).
Jim was hungry. He stopped at Nate’s deli.
A possible line of reasoning:
hungry → needs food → buys food → at a restaurant or at a market
a restaurant can be fast food (a deli, burgers) or formal
Scripts (and other similar representations of plans)
help fill gaps in the story.
... pragmatics (3)
Speech acts
assert—inform—explain;
ask if—ask what;
order—request.
Indirect speech acts
The form disagrees with the intention: a question
(interrogative) or a statement (declarative) really means
something different.
Could you pass the salt? (a request)
Do you know that it’s raining? (information)
Honey, Fido needs a shower. (a command)
... time out — and there is still so much to tell...