An Inference-rules based Categorial Grammar Learner for Simulating Language Acquisition
Xuchen Yao, Jianqiang Ma, Sergio Duarte, Çağrı Çöltekin
University of Groningen
18 May 2009

Outline

Introduction
  Combinatory Categorial Grammar
  Language Acquisition
Learning by Inference Rules
  Grammar Induction by Inference Rules
  The Learning Architecture
Experiment
  Learning an Artificial Grammar
  Learning Auxiliary Verb Fronting
  Learning Correct Word Order
Conclusion

Categorial Grammar

• Basic categories: S (sentence), NP (noun phrase), N (noun)
• Complex categories: NP/N, S\NP and (S\NP)\(S\NP)
• Slash operators: / and \

Figure: Example derivation for the sentence "Peter saw a book".
  Lexicon: Peter := NP, saw := (S\NP)/NP, a := NP/N, book := N
  Derivation: a book → NP (>); saw [a book] → S\NP (>); Peter [saw a book] → S (<)
  (The slide also shows the corresponding conventional phrase-structure tree with S, NP, VP, DT and N nodes.)
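
The derivation above can be reproduced with a few lines of code. The following is a minimal sketch, assuming a small illustrative Cat class and forward/backward application helpers of my own; it is not the paper's implementation.

```python
# Minimal sketch of CG categories and function application (illustrative encoding,
# not the authors' implementation).

class Cat:
    """A category: basic (S, NP, N) or complex (result / arg or result \ arg)."""
    def __init__(self, basic=None, result=None, slash=None, arg=None):
        self.basic, self.result, self.slash, self.arg = basic, result, slash, arg

    def __repr__(self):
        return self.basic if self.basic else f"({self.result}{self.slash}{self.arg})"

    def __eq__(self, other):
        return isinstance(other, Cat) and repr(self) == repr(other)

def forward(left, right):
    """Forward application (>):  A/B  B  ->  A."""
    if left.slash == "/" and left.arg == right:
        return left.result

def backward(left, right):
    """Backward application (<):  B  A\\B  ->  A."""
    if right.slash == "\\" and right.arg == left:
        return right.result

# Lexicon for "Peter saw a book"
S, NP, N = Cat("S"), Cat("NP"), Cat("N")
peter, book = NP, N
a = Cat(result=NP, slash="/", arg=N)                    # a   := NP/N
saw = Cat(result=Cat(result=S, slash="\\", arg=NP),
          slash="/", arg=NP)                            # saw := (S\NP)/NP

np = forward(a, book)        # a book            -> NP    (>)
vp = forward(saw, np)        # saw a book        -> S\NP  (>)
print(backward(peter, vp))   # Peter saw a book  -> S     (<)
```

Running the sketch prints S, mirroring the bottom line of the derivation in the figure.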

Different Operation Rules

• Function application rules (CG)
  Forward:  A/B  B → A        (>)
  Backward: B  A\B → A        (<)
• Function composition rules (CCG)
  Forward:  A/B  B/C → A/C    (>B)
  Backward: B\C  A\B → A\C    (<B)
• Type raising rules (CCG)
  Forward:  A → T/(T\A)       (>T)
  Backward: A → T\(T/A)       (<T)
• Substitution rules (CCG)
  Forward:  (A/B)/C  B/C → A/C   (>S)
  Backward: B\C  (A\B)\C → A\C   (<S)
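
As a rough illustration of these rule schemas, the forward variants can be written over a simple tuple encoding of categories. The encoding and the function names are mine, not anything from the slides; the backward variants are mirror images.

```python
# Sketch of the forward CCG rule schemas over a tuple encoding: a category is either
# a string ("S", "NP") or a triple (result, slash, argument). Illustrative only.

def fapp(x, y):
    """Forward application (>):  A/B  B  ->  A."""
    if isinstance(x, tuple) and x[1] == "/" and x[2] == y:
        return x[0]

def fcomp(x, y):
    """Forward composition (>B):  A/B  B/C  ->  A/C."""
    if (isinstance(x, tuple) and isinstance(y, tuple)
            and x[1] == "/" and y[1] == "/" and x[2] == y[0]):
        return (x[0], "/", y[2])

def traise(a, t):
    """Forward type raising (>T):  A  ->  T/(T\\A)."""
    return (t, "/", (t, "\\", a))

def fsub(x, y):
    """Forward substitution (>S):  (A/B)/C  B/C  ->  A/C."""
    if (isinstance(x, tuple) and isinstance(y, tuple) and isinstance(x[0], tuple)
            and x[1] == y[1] == x[0][1] == "/" and x[2] == y[2] and x[0][2] == y[0]):
        return (x[0][0], "/", y[2])

print(fapp(("NP", "/", "N"), "N"))                  # NP
print(fcomp(("S", "/", "NP"), ("NP", "/", "N")))    # ('S', '/', 'N')
print(traise("NP", "S"))                            # ('S', '/', ('S', '\\', 'NP'))
```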

Nativist vs. Empiricist

• Auxiliary Verb Fronting
  • Peter is awake.
  • Is Peter awake?
  • Peter who is sleepy is awake.
  • Is Peter who is sleepy awake?
  • *Is Peter who sleepy is awake?
• Word Order
  • I should go.
  • I have gone.
  • I am going.
  • I have been going.
  • I should have gone.
  • I should be going.
  • I should have been going.
  • *I have should been going.

Research Questions

1. Can we give a computational simulation of the acquisition of syntactic structures?
   • How do we derive the category of an unknown word in a sentence?
2. Can we give a judgement on the Nativist-Empiricist debate from the perspective of CCG?
   • How important is experience, or is innate ability more important?

Level 0/1 Inference Rules

• Level 0 inference rules
  B/A  X → B  ⇒  X = A    if A ≠ S
  X  B\A → B  ⇒  X = A    if A ≠ S
• Level 1 inference rules
  A  X → B  ⇒  X = B\A    if A ≠ S
  X  A → B  ⇒  X = B/A    if A ≠ S

Figure: Example of level 1 inference rules: "Peter works".
  Peter := NP, works := X; NP X → S, so X = S\NP (<).
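
A sketch of how these two lowest levels can be applied, using a string encoding of categories and helper names of my own choosing (not the paper's code):

```python
# Sketch of the level 0/1 inference rules over category strings (my encoding). A known
# neighbour with category `known` and a target category `target` determine the unknown
# word's category X.

def paren(cat):
    """Parenthesise complex categories when they are embedded in a larger one."""
    return cat if "/" not in cat and "\\" not in cat else f"({cat})"

def level0(known, target):
    """B/A X -> B (or X B\\A -> B)  =>  X = A, provided A != S.
    `known` must be target/A or target\\A; X is simply its argument A."""
    for slash in ("/", "\\"):
        prefix = paren(target) + slash
        if known.startswith(prefix):
            arg = known[len(prefix):]
            if arg != "S":
                return arg.strip("()")

def level1(known, target, known_is_left):
    """A X -> B  =>  X = B\\A   and   X A -> B  =>  X = B/A, provided A != S."""
    if known == "S":
        return None
    slash = "\\" if known_is_left else "/"
    return f"{paren(target)}{slash}{paren(known)}"

# Level 1 example from the figure: "Peter works", with Peter := NP and the sentence := S
print(level1("NP", "S", known_is_left=True))    # S\NP   (works := S\NP)
# Level 0 example: "a book" -> NP with a := NP/N known; the unknown word gets N
print(level0("NP/N", "NP"))                     # N
```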

Level 2 Inference Rules

• Level 2 side inference rules
  X  A  B → C  ⇒  X = (C/B)/A
  A  B  X → C  ⇒  X = (C\A)\B
• Level 2 middle inference rule
  A  X  B → C  ⇒  X = (C\A)/B

Figure: Example of level 2 inference rules: "Peter saw a book".
  Peter := NP, saw := X, a := NP/N, book := N;
  a book → NP (>), so NP X NP → S and X = (S\NP)/NP.
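
The middle rule, for instance, can be sketched as follows (again with my own string encoding); applied to the figure above, it recovers the category of "saw":

```python
# Sketch of the level 2 middle inference rule: A X B -> C  =>  X = (C\A)/B (my encoding).

def paren(cat):
    return cat if "/" not in cat and "\\" not in cat else f"({cat})"

def level2_middle(a, b, c):
    """Category of an unknown word between known categories a and b, spanning target c."""
    return f"({paren(c)}\\{paren(a)})/{paren(b)}"

# "Peter saw a book": 'a book' first reduces to NP, so the unknown 'saw' sits between two NPs.
print(level2_middle("NP", "NP", "S"))    # (S\NP)/NP
```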

Level 3 Inference Rules

• Level 3 side inference rules
  X  A  B  C → D  ⇒  X = ((D/C)/B)/A
  A  B  C  X → D  ⇒  X = ((D\A)\B)\C
• Level 3 middle inference rules
  A  X  B  C → D  ⇒  X = ((D\A)/C)/B
  A  B  X  C → D  ⇒  X = ((D\A)\B)/C
• Inference rules of up to level 3 can derive most categories of common English words.
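
All of the level 1 to 3 schemas above follow one pattern: left neighbours are peeled off with backslashes, right neighbours with forward slashes, with the nearest neighbour as the outermost argument. The following is a small sketch of that unification; it is my own generalisation of the listed rules, not code from the paper.

```python
# Sketch unifying the level 1-3 inference rules: X's category is built from the target
# by adding one argument per neighbouring category, nearest neighbour outermost.

def paren(cat):
    return cat if "/" not in cat and "\\" not in cat else f"({cat})"

def infer_category(left, right, target):
    """left/right: categories of the known words to X's left/right; target: category of the span."""
    x = target
    for cat in left:              # leftmost (farthest) left neighbour is the innermost argument
        x = f"{paren(x)}\\{paren(cat)}"
    for cat in reversed(right):   # rightmost (farthest) right neighbour is the innermost argument
        x = f"{paren(x)}/{paren(cat)}"
    return x

print(infer_category(["NP"], [], "S"))                # level 1:        S\NP
print(infer_category(["NP"], ["NP"], "S"))            # level 2 middle: (S\NP)/NP
print(infer_category([], ["NP", "NP/N", "N"], "S"))   # level 3 side:   ((S/N)/(NP/N))/NP
```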

The Learning Architecture

Figure: Learning process using inference rules.
  The flowchart links a Corpus, an Edge Generator, a Recursive Learner (with the SCP Principle
  and the right combining rule), an Output Selector, and the generation and test rule sets.
  The learner checks in turn whether level 0, level 1, level 2, or level 3 inference rules can
  parse the input, applying level 0 and 1 inference rules recursively; if no level succeeds,
  the sentence cannot be learned.
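
The control flow of the figure can be paraphrased as a cascade over inference-rule levels. The sketch below assumes a helper infer_with_level(k, sentence, lexicon) that returns newly inferred word-to-category entries when level-k rules make the sentence parse; both names and the overall shape are my paraphrase, not the authors' code.

```python
# Sketch of the learning loop in the figure: try inference-rule levels in order;
# if none parses the sentence, it cannot be learned from.

def learn_sentence(sentence, lexicon, infer_with_level):
    """Update `lexicon` with categories inferred for unknown words in `sentence`."""
    for level in (0, 1, 2, 3):              # levels 0 and 1 are applied recursively inside
        new_entries = infer_with_level(level, sentence, lexicon)
        if new_entries is not None:         # this level's inference rules made the sentence parse
            lexicon.update(new_entries)     # add the inferred categories to the rule set
            return new_entries
    return None                             # cannot learn: no level up to 3 parses the sentence
```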

Target Grammar

  Peter     := NP          with      := (N\N)/NP
  Mary      := NP          with      := (NP\NP)/NP
  big       := N/N         with      := ((S\NP)\(S\NP))/NP
  colorless := N/N         sleep     := S\NP
  book      := N           a         := NP/N
  telescope := N           give      := ((S\NP)/NP)/NP
  the       := NP/N        saw       := (S\NP)/NP
  run       := S\NP        read      := (S\NP)/NP
  big       := N/N         furiously := (S\NP)\(S\NP)

Table: Target Grammar Rules

• Recursive & ambiguous
• Assume only NP and N are known to the learner
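
For reference, the same target lexicon written out as plain data (the dictionary encoding and variable names are mine); only the NP and N entries are given to the learner at the start.

```python
# The target lexicon from the table above as a Python dict (my encoding).
TARGET_LEXICON = {
    "Peter":     ["NP"],
    "Mary":      ["NP"],
    "book":      ["N"],
    "telescope": ["N"],
    "big":       ["N/N"],
    "colorless": ["N/N"],
    "the":       ["NP/N"],
    "a":         ["NP/N"],
    "run":       ["S\\NP"],
    "sleep":     ["S\\NP"],
    "furiously": ["(S\\NP)\\(S\\NP)"],
    "saw":       ["(S\\NP)/NP"],
    "read":      ["(S\\NP)/NP"],
    "give":      ["((S\\NP)/NP)/NP"],
    "with":      ["(N\\N)/NP", "(NP\\NP)/NP", "((S\\NP)\\(S\\NP))/NP"],
}

# Only the NP and N entries are assumed known to the learner initially.
INITIAL_LEXICON = {w: cats for w, cats in TARGET_LEXICON.items() if cats in (["NP"], ["N"])}
```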

Result

Figure: Two ambiguous parses of the sentence "Peter saw Mary with a big big telescope".
  Parse 1 (the PP attaches to Mary):
    Peter := NP, saw := (S\NP)/NP, Mary := NP, with := (NP\NP)/NP, a := NP/N, big := N/N, telescope := N
    big big telescope → N, a big big telescope → NP (>), with ... → NP\NP (>),
    Mary with ... → NP (<), saw ... → S\NP (>), Peter ... → S (<)
  Parse 2 (the PP attaches to the verb phrase):
    with := ((S\NP)\(S\NP))/NP, other categories as above
    saw Mary → S\NP (>), with a big big telescope → (S\NP)\(S\NP) (>),
    saw Mary with ... → S\NP (<), Peter ... → S (<)

Learning Auxiliary Verb Fronting 1

Figure: Learning Auxiliary Verb Fronting 1.
  (a) Peter is sleepy:  Peter := NP, is := (S\NP)/(Sadj\NP), sleepy := Sadj\NP;
      is sleepy → S\NP (>), Peter is sleepy → S (<)
  (b) Is Peter awake:  Is := (Sq/(Sadj\NP))/NP, Peter := NP, awake := Sadj\NP;
      Is Peter → Sq/(Sadj\NP) (>), Is Peter awake → Sq (>)
  (c) Peter who is sleepy is awake:  Peter := NP, who := (NP\NP)/(S\NP),
      is := (S\NP)/(Sadj\NP), sleepy/awake := Sadj\NP;
      is sleepy → S\NP (>), who is sleepy → NP\NP (>), Peter who is sleepy → NP (<),
      is awake → S\NP (>), full sentence → S (<)

Learning Auxiliary Verb Fronting 2

Figure: Learning Auxiliary Verb Fronting 2.
  Is Peter who is sleepy awake:  Is := (Sq/(Sadj\NP))/NP, Peter := NP, who := (NP\NP)/(S\NP),
  is := (S\NP)/(Sadj\NP), sleepy/awake := Sadj\NP;
  is sleepy → S\NP (>), who is sleepy → NP\NP (>), Peter who is sleepy → NP (<),
  Is [Peter who is sleepy] → Sq/(Sadj\NP) (>), ... awake → Sq (>)

• is := (S\NP)/(Sadj\NP)
  Is := (Sq/(Sadj\NP))/NP
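
As a quick sanity check of the learned entry for fronted "Is", plain forward application over these category strings reproduces the question derivation. The fapply helper and its string matching are my simplification and only handle these particular categories.

```python
# Tiny check of the learned entry Is := (Sq/(Sadj\NP))/NP (illustrative string hack;
# it relies on the main slash being the rightmost "/" in these particular categories).

def unwrap(cat):
    return cat[1:-1] if cat.startswith("(") and cat.endswith(")") else cat

def fapply(fn, arg):
    """Forward application (>): X/Y  Y  ->  X, or None if the argument doesn't fit."""
    head, sep, expected = fn.rpartition("/")
    if sep and unwrap(expected) == unwrap(arg):
        return unwrap(head)

is_fronted = "(Sq/(Sadj\\NP))/NP"                 # Is := (Sq/(Sadj\NP))/NP
after_subject = fapply(is_fronted, "NP")          # Is [Peter who is sleepy] -> Sq/(Sadj\NP)
question = fapply(after_subject, "Sadj\\NP")      # ... awake                -> Sq
print(after_subject, "|", question)               # Sq/(Sadj\NP) | Sq
```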

Learning Correct Word Order

• I should go.
• I have gone.
• I am going.
• I have been going.
• I should have gone.
• I should be going.

  should := (Ss\NP)/(S\NP)
  should := (Ss\NP)/(Sh\NP)
  should := (Ss\NP)/(Sb\NP)
  have   := (Sh\NP)/(S\NP)
  have   := (Sh\NP)/(Sb\NP)
  be     := (Sb\NP)/(S\NP)

• I should have been going.
• *I have should been going.

Figure: Learning Correct Word Order.
  I should have been going:  I := NP, should := (Ss\NP)/(Sh\NP), have := (Sh\NP)/(Sb\NP),
  been := (Sb\NP)/(S\NP), going := S\NP;
  been going → Sb\NP (>), have been going → Sh\NP (>), should have been going → Ss\NP (>),
  I should have been going → Ss (<)
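
The sketch below is a small check (my illustration, not the authors' implementation) that these feature-marked categories accept the grammatical auxiliary order and reject the starred one; the parse helper only handles this subject plus verb-cluster pattern.

```python
# Check of the learned word-order categories (illustrative; the string matching assumes
# exactly one main "/" per auxiliary category, as in the lexicon above).

LEXICON = {
    "I":      ["NP"],
    "should": ["(Ss\\NP)/(S\\NP)", "(Ss\\NP)/(Sh\\NP)", "(Ss\\NP)/(Sb\\NP)"],
    "have":   ["(Sh\\NP)/(S\\NP)", "(Sh\\NP)/(Sb\\NP)"],
    "been":   ["(Sb\\NP)/(S\\NP)"],
    "going":  ["S\\NP"],
}

def parse(words):
    """Possible categories for 'NP aux* verb' sentences; [] means no parse."""
    strip = lambda c: c.strip("()")
    cats = LEXICON[words[-1]]
    for word in reversed(words[1:-1]):                 # fold the verb cluster right to left
        new = []
        for cat in LEXICON[word]:                      # forward application: X/Y  Y  ->  X
            head, sep, arg = cat.rpartition("/")
            new += [strip(head) for right in cats if sep and strip(arg) == strip(right)]
        cats = new
    # the subject (assumed to be NP) combines by backward application: NP  Sx\NP  ->  Sx
    return [c.rpartition("\\")[0] for c in cats if c.endswith("\\NP")]

print(parse("I should have been going".split()))      # ['Ss']
print(parse("I have should been going".split()))      # []
```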

Conclusion

1. Can we give a computational simulation of the acquisition of syntactic structures?
   • How do we derive the category of an unknown word in a sentence?
   • This paper presents a simple and intuitive method to achieve this.
2. Can we give a judgement on the Nativist-Empiricist debate from the perspective of CCG?
   • How important is experience, or is innate ability more important?
   • Simple and intuitive logical rules can also account for these celebrated linguistic phenomena.
   • Logic offers a third way besides experience and innateness.