Course : T0264 – Artificial Intelligence
Year : 2013
LECTURE 06
Logical Agents
Learning Outcomes
At the end of this session, students will be able to:
» LO 2: Explain various intelligent search algorithms to solve problems
» LO 3: Explain how to use knowledge representation for reasoning purposes
Outline
1. Knowledge-Based Agents
2. Logic in General
3. Propositional Logic (PL): A Very Simple Logic
4. Propositional Theorem Proving (Resolution Algorithm)
5. Forward and Backward Algorithm
6. Summary
Knowledge-Based
Agents
• Knowledge base = set of sentences in a formal language
• Declarative approach to building an agent (or other system):
  – Tell it what it needs to know
• Then it can Ask itself what to do – answers should follow from the KB
• Agents can be viewed at the knowledge level
  – i.e., what they know, regardless of how implemented
• Or at the implementation level
  – i.e., data structures in KB and algorithms that manipulate them
Presentation Assignment I
• Logical Agents & Wumpus World
ANDREAN HARRY RAMADHAN
DIONDY KUSUMA
RIVKY CHANDRA
WIENA MARCELINA
WILSON GUNAWAN
ANDRE IVAN
Knowledge-Based
Agents
A Simple Knowledge-Based Agent
• The agent must be able to:
– Represent states, actions, etc.
– Incorporate new percepts
– Update internal representations of the world
– Deduce hidden properties of the world
– Deduce appropriate actions
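The Tell/Ask cycle described above can be sketched as a small agent loop. The following is a minimal illustration in Python, assuming a plain list as the knowledge base and an externally supplied inference procedure; the helper names (make_percept_sentence, make_action_query, make_action_sentence) are illustrative placeholders, not part of any standard library.

def make_percept_sentence(percept, t):
    return f"Percept({percept}, {t})"

def make_action_query(t):
    return f"BestAction?({t})"

def make_action_sentence(action, t):
    return f"Action({action}, {t})"

class KBAgent:
    def __init__(self, ask_fn):
        self.kb = []          # knowledge base: a list of sentences
        self.ask_fn = ask_fn  # inference procedure: (kb, query) -> action
        self.t = 0            # time step

    def __call__(self, percept):
        self.kb.append(make_percept_sentence(percept, self.t))    # Tell: record the percept
        action = self.ask_fn(self.kb, make_action_query(self.t))  # Ask: query for an action
        self.kb.append(make_action_sentence(action, self.t))      # Tell: record the chosen action
        self.t += 1
        return action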
Knowledge-Based
Agents
Wumpus World PEAS Description
• Performance measure
– gold +1000, death -1000
– -1 per step, -10 for using the arrow
• Environment
– Squares adjacent to wumpus are smelly
– Squares adjacent to pit are breezy
– Glitter if gold is in the same square
– Shooting kills wumpus if you are facing it
– Shooting uses up the only arrow
– Grabbing picks up gold if in same square
– Releasing drops the gold in same square
• Sensors: Stench, Breeze, Glitter, Bump, Scream
• Actuators: Left turn, Right turn, Forward, Grab, Release, Shoot
Knowledge-Based
Agents
Wumpus World Characterization
• Fully Observable? No – only local perception
• Deterministic? Yes – outcomes exactly specified
• Episodic? No – sequential at the level of actions
• Static? Yes – Wumpus and Pits do not move
• Discrete? Yes
• Single-agent? Yes – Wumpus is essentially a natural feature
Knowledge-Based
Agents
Exploring a Wumpus World
A = Agent
B = Breeze
G = Glitter, Gold
OK = Safe square
P = Pit
S = Stench
V = Visited
W = Wumpus
Logic In General
• Logics are formal languages for representing information
such that conclusions can be drawn
• Syntax defines the sentences in the language
• Semantics define the "meaning" of sentences;
– i.e., define truth of a sentence in a world
• E.g., the language of arithmetic
– x+2 ≥ y is a sentence; x2+y > { } is not a sentence
– x+2 ≥ y is true if the number x+2 is no less than the
number y
– x+2 ≥ y is true in a world where x = 7, y = 1
– x+2 ≥ y is false in a world where x = 0, y = 6
Logic In General
Entailment
• Entailment means that one thing follows from another:
KB ╞ α
• Knowledge base KB entails sentence α if and only if α is
true in all worlds where KB is true
– E.g., the KB containing “the Giants won” and “the
Reds won” entails “Either the Giants won or the Reds
won”
– E.g., x+y = 4 entails 4 = x+y
– Entailment is a relationship between sentences (i.e.,
syntax) that is based on semantics
Logic In General
Models
• Logicians typically think in terms of models, which are
formally structured worlds with respect to which truth
can be evaluated
• We say m is a model of a sentence α if α is true in m
• M(α) is the set of all models of α
• Then
KB ╞ α iff M(KB) ⊆ M(α)
E.g.
KB = Giants won and Reds won
α = Giants won
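With the two proposition symbols above (Giants won, Reds won) there are four models. KB is true in exactly one of them (both teams won), and α is true in that model as well, so every model of KB is a model of α: M(KB) ⊆ M(α), and therefore KB ╞ α.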
Logic In General
Wumpus Models
• KB = wumpus-world rules + observations
Logic In General
Wumpus Models
• KB = wumpus-world rules + observations
• α1 = "[1,2] is safe", KB ╞ α1, proved by model checking
Logic In General
Wumpus Models
• KB = wumpus-world rules + observations
• α2 = "[2,2] is safe", KB ⊭ α2 (model checking finds models of KB in which [2,2] is not safe)
Logic In General
Inference
• KB ├i α = sentence α can be derived from KB by
procedure i
• Soundness: i is sound if whenever KB ├i α, it is also true
that KB╞ α
• Completeness: i is complete if whenever KB╞ α, it is also
true that KB ├i α
• Preview: we will define a logic (first-order logic) which is
expressive enough to say almost anything of interest,
and for which there exists a sound and complete
inference procedure.
• That is, the procedure will answer any question whose
answer follows from what is known by the KB.
Logic In General
• Sentences are physical configurations of the agent, and reasoning is a process of constructing new physical configurations from old ones.
• New configurations represent aspects of the world that actually follow from the aspects that the old configurations represent.
Propositional Logic:
A Very Simple Logic
Standard Logic Symbols
∀ = For all; universal quantifier [e.g., everyone, everybody, any time, etc.]
∃ = There exists; existential quantifier [e.g., someone, some time, etc.]
→ = Implication [if … then …]
↔ = Equivalence; biconditional [… if and only if …]
¬ = Not; negation
∨ = OR; disjunction
∧ = AND; conjunction
Propositional Logic:
A Very Simple Logic
Syntax
• Propositional logic is the simplest logic – illustrates basic
ideas
• The proposition symbols S1, S2 etc are sentences
– If S is a sentence, ¬S is a sentence (negation)
– If S1 and S2 are sentences, S1 ∧ S2 is a sentence (conjunction)
– If S1 and S2 are sentences, S1 ∨ S2 is a sentence (disjunction)
– If S1 and S2 are sentences, S1 → S2 is a sentence (implication)
– If S1 and S2 are sentences, S1 ↔ S2 is a sentence (biconditional)
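To make these formation rules concrete, one possible machine representation is sketched below; the tuple encoding is an illustrative assumption, not part of the course material.

S1, S2 = "S1", "S2"                    # proposition symbols are plain strings

negation      = ("not", S1)            # ¬S1
conjunction   = ("and", S1, S2)        # S1 ∧ S2
disjunction   = ("or", S1, S2)         # S1 ∨ S2
implication   = ("implies", S1, S2)    # S1 → S2
biconditional = ("iff", S1, S2)        # S1 ↔ S2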
Propositional Logic:
A Very Simple Logic
Semantics
• True is true in every model and False is false in every model
• The truth value of every other proposition symbol must be specified
directly in the model.
• For complex sentences, there are five rules, for any subsentences P and Q in any model m:
  – ¬P is true iff P is false in m.
  – P ∧ Q is true iff both P and Q are true in m.
  – P ∨ Q is true iff either P or Q is true in m.
  – P → Q is true unless P is true and Q is false in m,
    i.e., it is false iff P is true and Q is false.
  – P ↔ Q is true iff P and Q are both true or both false in m,
    i.e., P → Q is true and Q → P is true.
• A simple recursive process evaluates an arbitrary sentence, e.g.,
  ¬P1,2 ∧ (P2,2 ∨ P3,1) = true ∧ (true ∨ false) = true ∧ true = true
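The five rules translate directly into a recursive evaluator. This is a minimal sketch using the illustrative tuple encoding from the syntax example; a model is a dictionary mapping symbol names to True/False.

def pl_true(sentence, model):
    if isinstance(sentence, str):        # a proposition symbol
        return model[sentence]
    op, *args = sentence
    if op == "not":
        return not pl_true(args[0], model)
    if op == "and":
        return pl_true(args[0], model) and pl_true(args[1], model)
    if op == "or":
        return pl_true(args[0], model) or pl_true(args[1], model)
    if op == "implies":                  # false only when premise true and conclusion false
        return (not pl_true(args[0], model)) or pl_true(args[1], model)
    if op == "iff":
        return pl_true(args[0], model) == pl_true(args[1], model)
    raise ValueError(f"unknown connective: {op}")

# The slide's example, ¬P1,2 ∧ (P2,2 ∨ P3,1), in the model {P1,2=false, P2,2=true, P3,1=false}:
model = {"P12": False, "P22": True, "P31": False}
sentence = ("and", ("not", "P12"), ("or", "P22", "P31"))
print(pl_true(sentence, model))          # True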
Propositional Logic:
A Very Simple Logic
Truth Tables for Connectives
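For reference, the truth table for the five connectives (P and Q ranging over true/false):

P      Q      ¬P     P ∧ Q   P ∨ Q   P → Q   P ↔ Q
false  false  true   false   false   true    true
false  true   true   false   true    true    false
true   false  false  false   true    false   false
true   true   false  true    true    true    true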
Propositional Logic:
A Very Simple Logic
A Truth-Table Enumeration Algorithm
• Depth-first enumeration of all models is sound and
complete
• For n symbols, the time complexity is O(2^n) and the space complexity is O(n)
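The idea can be sketched in a few lines. This illustration reuses the pl_true evaluator and tuple encoding from the earlier sketch, and enumerates models with itertools.product rather than the depth-first recursion mentioned above; it is not the course's official pseudocode.

from itertools import product

def tt_entails(kb, alpha, symbols):
    """Return True if alpha is true in every model in which kb is true."""
    for values in product([False, True], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if pl_true(kb, model) and not pl_true(alpha, model):
            return False                 # a model of KB in which alpha fails
    return True

# The entailment example: "Giants won and Reds won" entails "Giants won".
kb = ("and", "GiantsWon", "RedsWon")
alpha = "GiantsWon"
print(tt_entails(kb, alpha, ["GiantsWon", "RedsWon"]))   # True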
Propositional Logic:
A Very Simple Logic
Logical Equivalences
• Two sentences are logically equivalent iff they are true in the same models: α ≡ β iff α ╞ β and β ╞ α
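Commonly used equivalences, all of which can be verified with truth tables, include:

(α ∧ β) ≡ (β ∧ α)                         commutativity of ∧
(α ∨ β) ≡ (β ∨ α)                         commutativity of ∨
((α ∧ β) ∧ γ) ≡ (α ∧ (β ∧ γ))             associativity of ∧
((α ∨ β) ∨ γ) ≡ (α ∨ (β ∨ γ))             associativity of ∨
¬(¬α) ≡ α                                 double-negation elimination
(α → β) ≡ (¬β → ¬α)                       contraposition
(α → β) ≡ (¬α ∨ β)                        implication elimination
(α ↔ β) ≡ ((α → β) ∧ (β → α))             biconditional elimination
¬(α ∧ β) ≡ (¬α ∨ ¬β)                      De Morgan
¬(α ∨ β) ≡ (¬α ∧ ¬β)                      De Morgan
(α ∧ (β ∨ γ)) ≡ ((α ∧ β) ∨ (α ∧ γ))       distributivity of ∧ over ∨
(α ∨ (β ∧ γ)) ≡ ((α ∨ β) ∧ (α ∨ γ))       distributivity of ∨ over ∧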
Propositional Theorem Proving
(Resolution Algorithm)
• An inference procedure based on resolution works by the principle of proof by contradiction.
• That is, to show that KB ╞ α, we show that (KB ∧ ¬α) is unsatisfiable. Therefore:
  – First: (KB ∧ ¬α) is converted into CNF (clause normal form)
  – Second: new clauses are obtained by resolving pairs of clauses that contain complementary literals, until the empty clause is derived (a contradiction) or no new clauses can be added
Propositional Theorem Proving
(Resolution Algorithm)
Algorithm:
1. Convert all the propositions of the KB to clause normal form (CNF).
2. Negate P (the statement to be proved) and convert the result to clause form. Add it to the set of clauses obtained in step 1.
3. Repeat until either a contradiction is found, no progress can be made, or a predetermined amount of effort has been expended:
   a. Select two clauses. Call these the parent clauses.
Propositional Theorem Proving
(Resolution Algorithm)
Algorithm: (cont’d)
b. Resolve them together. The resolvent will be the disjunction of all of the literals of both parent clauses, with appropriate substitutions performed and with the following exception:
   If there is one pair of literals T1 and ¬T2 such that one of the parent clauses contains T1 and the other contains ¬T2, and if T1 and T2 are unifiable, then neither T1 nor ¬T2 should appear in the resolvent. If there is more than one pair of complementary literals, only one pair should be omitted from the resolvent.
c. If the resolvent is the empty clause, then a contradiction has been found. If it is not, then add it to the set of clauses available to the procedure.
Propositional Theorem Proving
(Resolution Algorithm)
Simple pseudocode for the resolution algorithm for propositional logic
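A minimal Python sketch of this procedure for propositional logic is shown below (an illustration, not the course's official pseudocode). A clause is represented as a frozenset of literals; a literal is a string, with a leading "~" marking negation.

from itertools import combinations

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """Return all resolvents of the two parent clauses."""
    resolvents = []
    for lit in c1:
        if negate(lit) in c2:            # a complementary pair: drop it, keep the rest
            resolvents.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return resolvents

def pl_resolution(kb_clauses, neg_alpha_clauses):
    """Return True if KB entails alpha, given KB and ¬alpha already in clause form."""
    clauses = set(kb_clauses) | set(neg_alpha_clauses)
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for resolvent in resolve(c1, c2):
                if not resolvent:        # empty clause: contradiction found
                    return True
                new.add(resolvent)
        if new <= clauses:               # no progress: KB does not entail alpha
            return False
        clauses |= new

# Example: KB = {P → Q, P}, query alpha = Q; in clause form {~P, Q}, {P}, and ¬alpha = {~Q}.
kb = [frozenset({"~P", "Q"}), frozenset({"P"})]
print(pl_resolution(kb, [frozenset({"~Q"})]))   # True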
Propositional Theorem Proving
(Resolution Algorithm)
Conversion to Normal Clause Form
(CNF Algorithm)
1. Eliminate →, using: a → b ≡ ¬a ∨ b.
2. Reduce the scope of each ¬ to a single term, using de Morgan's laws:
   ¬(¬p) ≡ p
   ¬(a ∧ b) ≡ ¬a ∨ ¬b
   ¬(a ∨ b) ≡ ¬a ∧ ¬b
   ¬∀x : P(x) ≡ ∃x : ¬P(x)
   ¬∃x : P(x) ≡ ∀x : ¬P(x)
3. Standardize variables
   For sentences like (∀x P(x)) ∨ (∃x Q(x)) which use the same variable name twice, change the name of one of the variables. This avoids confusion later when we drop the quantifiers. For example, from ∀x [∃y Animal(y) ∧ ¬Loves(x, y)] ∨ [∃y Loves(y, x)] we obtain: ∀x [∃y Animal(y) ∧ ¬Loves(x, y)] ∨ [∃z Loves(z, x)].
Propositional Theorem Proving
(Resolution Algorithm)
Convert to Normal Clause Form (cont’d)
4. Move all quantifiers to the left of the formula without
changing their relative order.
5. Eliminate existential quantifiers by inserting Skolem functions:
   e.g., ∃x P(x) becomes P(A), where A is a new constant
6. Drop universal quantifiers
7. Convert the matrix into a conjunction of disjuncts, using associativity and distributivity (distribute ORs over ANDs)
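For the purely propositional case (no quantifiers), such a conversion can also be checked mechanically. The sketch below uses the SymPy library's to_cnf as an outside illustration, applied to a typical wumpus-world rule; it is not part of the course material.

from sympy import symbols
from sympy.logic.boolalg import Equivalent, to_cnf

B11, P12, P21 = symbols("B11 P12 P21")

# The rule "B1,1 ↔ (P1,2 ∨ P2,1)" becomes three clauses:
#   ¬B11 ∨ P12 ∨ P21,  B11 ∨ ¬P12,  B11 ∨ ¬P21  (printed order may vary)
print(to_cnf(Equivalent(B11, P12 | P21)))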
Propositional Theorem Proving
(Resolution Algorithm)
Example :
Conversion to Normal Clause Form
Sentence:
"Everybody who knows Hitler either likes Hitler or thinks that anyone who killed someone is crazy"
In predicate logic:
∀x : [body(x) ∧ know(x, Hitler)] → [like(x, Hitler) ∨ (∀y : (∃z : killed(y, z)) → crazy(x, y))]
Propositional Theorem Proving
(Resolution Algorithm)
Applying the Clause Form Algorithm
1. Eliminate →:
   ∀x : ¬[body(x) ∧ know(x, Hitler)] ∨ [like(x, Hitler) ∨ (∀y : ¬(∃z : killed(y, z)) ∨ crazy(x, y))]
2. Reduce the scope of ¬:
   ∀x : [¬body(x) ∨ ¬know(x, Hitler)] ∨ [like(x, Hitler) ∨ (∀y : (∀z : ¬killed(y, z)) ∨ crazy(x, y))]
3. Standardize variables: OK! (each quantifier already uses a distinct variable)
4. Move quantifiers to the left:
   ∀x : ∀y : ∀z : [¬body(x) ∨ ¬know(x, Hitler)] ∨ [like(x, Hitler) ∨ ¬killed(y, z) ∨ crazy(x, y)]
5. Skolemize: OK! (no existential quantifiers remain)
6. Drop universal quantifiers:
   [¬body(x) ∨ ¬know(x, Hitler)] ∨ [like(x, Hitler) ∨ ¬killed(y, z) ∨ crazy(x, y)]
7. Convert to a conjunction of disjuncts (here a single clause):
   ¬body(x) ∨ ¬know(x, Hitler) ∨ like(x, Hitler) ∨ ¬killed(y, z) ∨ crazy(x, y)
Propositional Theorem Proving
(Resolution Algorithm)
Conversion to Normal Form
Forward and Backward
Algorithm
Forward and Backward Chaining
• Idea: fire any rule whose premises are satisfied in the KB,
  – add its conclusion to the KB, until the query is found
Forward and Backward
Algorithm
Forward Chaining Algorithm
• Forward chaining is sound and complete for Horn KB
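A minimal sketch of this procedure over Horn clauses is shown below (an illustrative implementation, not the course's official pseudocode). Each rule is a pair (premises, conclusion); facts are plain symbols. The example KB is a standard textbook-style Horn KB, assumed here for illustration.

from collections import deque

def fc_entails(rules, facts, query):
    """Return True if the Horn KB (rules + facts) entails the query symbol."""
    count = {i: len(prem) for i, (prem, _) in enumerate(rules)}   # unsatisfied premises per rule
    inferred = set()
    agenda = deque(facts)
    while agenda:
        p = agenda.popleft()
        if p == query:
            return True
        if p in inferred:
            continue
        inferred.add(p)
        for i, (premises, conclusion) in enumerate(rules):
            if p in premises:
                count[i] -= 1
                if count[i] == 0:         # all premises proved: fire the rule
                    agenda.append(conclusion)
    return False

# Example Horn KB: P → Q, L ∧ M → P, B ∧ L → M, A ∧ P → L, A ∧ B → L, with facts A and B.
rules = [({"P"}, "Q"), ({"L", "M"}, "P"), ({"B", "L"}, "M"),
         ({"A", "P"}, "L"), ({"A", "B"}, "L")]
print(fc_entails(rules, ["A", "B"], "Q"))    # True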
Forward and Backward
Algorithm
Forward Chaining Example
Forward and Backward
Algorithm
Proof of Completeness
• FC derives every atomic sentence that is entailed by KB:
  1. FC reaches a fixed point where no new atomic sentences are derived
  2. Consider the final state as a model m, assigning true/false to symbols
  3. Every clause in the original KB is true in m
     (if a clause a1 ∧ … ∧ ak → b were false in m, its premises a1 … ak would all be true and b false, contradicting the fixed point)
  4. Hence m is a model of KB
  5. If KB ╞ q, q is true in every model of KB, including m
Forward and Backward
Algorithm
Backward Chaining
Idea: work backwards from the query q:
  to prove q by BC,
  check if q is known already, or
  prove by BC all premises of some rule concluding q
• Avoid loops: check if the new subgoal is already on the goal stack
• Avoid repeated work: check if the new subgoal
  1. has already been proved true, or
  2. has already failed
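A minimal sketch of backward chaining over the same Horn-rule representation used in the forward-chaining sketch is shown below (illustrative only; memoization of already proved or failed subgoals is omitted for brevity).

def bc_entails(rules, facts, query, stack=frozenset()):
    """Return True if the Horn KB (rules + facts) entails the query symbol."""
    if query in facts:
        return True                      # q is known already
    if query in stack:
        return False                     # loop: q is already on the goal stack
    for premises, conclusion in rules:
        if conclusion == query:
            # try to prove, by BC, all premises of a rule concluding q
            if all(bc_entails(rules, facts, p, stack | {query}) for p in premises):
                return True
    return False

# Same example KB as in the forward-chaining sketch; does Q follow from facts A and B?
rules = [({"P"}, "Q"), ({"L", "M"}, "P"), ({"B", "L"}, "M"),
         ({"A", "P"}, "L"), ({"A", "B"}, "L")]
print(bc_entails(rules, {"A", "B"}, "Q"))    # True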
Forward and Backward
Algorithm
Backward Chaining Example
Summary
• Logical agents apply inference to a knowledge base to
derive new information and make decisions
• Basic concepts of logic:
– syntax: formal structure of sentences
– semantics: truth of sentences in the models
– entailment: necessary truth of one sentence given
another
– inference: deriving sentences from other sentences
– soundness: derivations produce only entailed
sentences
– completeness: derivations can produce all entailed
sentences
• Resolution is complete for propositional logic
• Forward and backward chaining are linear-time and complete for Horn clauses
References
• Stuart Russell, Peter Norvig. 2010. Artificial Intelligence: A Modern Approach. PE, New Jersey. ISBN: 9780132071482. Chapter 7.
• Elaine Rich, Kevin Knight, Shivashankar B. Nair. 2010. Artificial Intelligence. MHE, New York. Chapter 5.
• Propositional Logic: http://logic.stanford.edu/classes/cs157/2009/notes/chap02.html
• The Resolution Method: http://www.doc.ic.ac.uk/~sgc/teaching/pre2012/v231/lecture8.html
<< CLOSING >>
End of Session 06
Good Luck