CPSC 433: Artificial Intelligence
Tutorials T01 & T02
Andrew Kuipers
[email protected]
Please include [CPSC433] in the subject line of any emails
regarding this course.
Expert Systems
• Designed to function similarly to a human expert operating
within a specific problem domain
• Used to:
– Provide an answer to a particular problem, or
– Clarify uncertainties where a human expert would normally be
consulted
• Often created to operate in conjunction with humans working
within the given problem domain, rather than as a replacement
for them
Components of an Expert System
• Knowledge Base
– Stores knowledge used by the system, usually represented in a
formal logical manner
• Inference System
– Defines how existing knowledge may be used to derive new
knowledge
• Search Control
– Determines which inference to apply at a given stage of the
deduction
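As a rough illustration of how these three components fit together, here is a
minimal Python sketch. It is not part of the course material; the class and
method names are assumptions made purely for illustration.

class ExpertSystem:
    def __init__(self, knowledge_base):
        # Knowledge Base: facts and rules in some formal representation
        self.kb = set(knowledge_base)

    def applicable_inferences(self):
        # Inference System: every piece of new knowledge the current
        # knowledge base licenses (left abstract; a concrete system fills this in)
        return set()

    def select_inference(self, candidates):
        # Search Control: decide which inference to apply at this stage
        return next(iter(candidates), None)

    def step(self):
        # One stage of the deduction: pick an applicable inference and
        # record the knowledge it produces
        choice = self.select_inference(self.applicable_inferences())
        if choice is not None:
            self.kb.add(choice)
        return choice
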
Knowledge Representation
• For now, we’ll use a simple If … Then … consequence
relation using English semantics
• e.g., If [it is raining] Then [I should wear a coat]
– [it is raining] is the antecedent of the relation
– [I should wear a coat] is the consequent of the relation
• Facts can be understood as consequence relations with
an empty antecedent
– e.g., “If [] Then [it is raining]” is equivalent to the fact that [it is
raining]
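A minimal sketch of how such consequence relations could be written down in
Python; this particular representation is an assumption made here for
illustration, not something prescribed by the slides.

from collections import namedtuple

# A consequence relation: if every antecedent holds, the consequent holds
Rule = namedtuple("Rule", ["antecedents", "consequent"])

k = Rule(antecedents=["it is raining"], consequent="I should wear a coat")

# A fact is a consequence relation with an empty antecedent:
# "If [] Then [it is raining]"
fact = Rule(antecedents=[], consequent="it is raining")
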
Inferring New Knowledge
• New knowledge can be constructed from existing
knowledge using inference rules
• For instance, the inference rule modus ponens can be
used to derive the consequent of a consequence
relation, given that the antecedent is true
• e.g.:
– k1: If [it is raining] Then [I should wear a coat]
– k2: [it is raining]
– result: [I should wear a coat]
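Continuing the sketch above (the Rule representation and the function name are
illustrative assumptions), modus ponens can be written as a small function:

from collections import namedtuple

Rule = namedtuple("Rule", ["antecedents", "consequent"])

def modus_ponens(rule, known_facts):
    # If every antecedent of the rule is already known, we may derive its
    # consequent as new knowledge; otherwise nothing follows from this rule.
    if all(a in known_facts for a in rule.antecedents):
        return rule.consequent
    return None

k1 = Rule(["it is raining"], "I should wear a coat")   # k1
known = {"it is raining"}                              # k2
print(modus_ponens(k1, known))                         # result: I should wear a coat
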
Goal Directed Reasoning
• Inference rules are applied to the knowledge base in order
to achieve a particular goal
• The goal in an expert system is formed as a question, or
query, to which we want the answer
• e.g., [I should wear a coat]?
– note: this would read more naturally in English as “should I wear a
coat?”, but we want to use the same propositional symbol as is in our
knowledge base
• The goal of the search is to determine an answer to the
query, which may be boolean as above or more complex
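A query containing a variable can be answered by matching it against
statements and binding that variable. The sketch below treats single capital
letters as variables; that convention, and the match function itself, are
assumptions made here for illustration:

def match(goal, statement):
    # Word-by-word comparison; single capital letters (X, Y, ...) act as
    # variables and get bound to the corresponding word in the statement.
    bindings = {}
    gwords, swords = goal.split(), statement.split()
    if len(gwords) != len(swords):
        return None
    for g, s in zip(gwords, swords):
        if len(g) == 1 and g.isupper():          # a variable
            if bindings.get(g, s) != s:
                return None
            bindings[g] = s
        elif g != s:                             # constants must agree
            return None
    return bindings

print(match("Fritz is colored Y", "Fritz is colored green"))   # {'Y': 'green'}
print(match("Fritz is colored Y", "Fritz is a frog"))          # None
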
Forward Chaining
• Forward chaining is a data-driven method of deriving a
particular goal from a given knowledge base and set of
inference rules
• Inference rules are applied by matching facts to the
antecedents of consequence relations in the knowledge
base
• The application of inference rules results in new
knowledge (from the consequents of the relations
matched), which is then added to the knowledge base
Forward Chaining
• Inference rules are successively applied to
elements of the knowledge base until the goal is
reached
• A search control method is needed to select
which element(s) of the knowledge base to
apply the inference rule to at any point in the
deduction
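The loop below is a runnable sketch of forward chaining on the Fritz example
from the next slides. It reuses the simple variable-matching convention from
the earlier sketch and applies every applicable rule on each pass, which is
only one possible (and very naive) search control; none of this code is part
of the course material.

RULES = [  # (antecedent, consequent); X is a variable
    ("X croaks and eats flies", "X is a frog"),
    ("X chirps and sings",      "X is a canary"),
    ("X is a frog",             "X is colored green"),
    ("X is a canary",           "X is colored yellow"),
]
FACTS = {"Fritz croaks and eats flies"}

def match(pattern, statement):
    # Word-by-word match; single capital letters in the pattern are variables.
    bindings = {}
    p, s = pattern.split(), statement.split()
    if len(p) != len(s):
        return None
    for pw, sw in zip(p, s):
        if len(pw) == 1 and pw.isupper():
            if bindings.get(pw, sw) != sw:
                return None
            bindings[pw] = sw
        elif pw != sw:
            return None
    return bindings

def substitute(pattern, bindings):
    return " ".join(bindings.get(w, w) for w in pattern.split())

def forward_chain(rules, facts, goal):
    facts = set(facts)
    while True:
        for fact in facts:                       # has the goal been reached?
            answer = match(goal, fact)
            if answer is not None:
                return answer
        new = set()
        for antecedent, consequent in rules:     # match facts to antecedents
            for fact in facts:
                b = match(antecedent, fact)
                if b is not None:
                    new.add(substitute(consequent, b))
        if new <= facts:                         # nothing new can be derived
            return None
        facts |= new                             # add new knowledge to the KB

print(forward_chain(RULES, FACTS, "Fritz is colored Y"))   # {'Y': 'green'}
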
Forward Chaining Example
• Knowledge Base:
– If [X croaks and eats flies] Then [X is a frog]
– If [X chirps and sings] Then [X is a canary]
– If [X is a frog] Then [X is colored green]
– If [X is a canary] Then [X is colored yellow]
– [Fritz croaks and eats flies]
• Goal: Finding the color of Fritz.
– [Fritz is colored Y]?
• Step 1: The fact [Fritz croaks and eats flies] matches the antecedent of
If [X croaks and eats flies] Then [X is a frog], so [Fritz is a frog] is
added to the knowledge base
• Step 2: The new fact [Fritz is a frog] matches the antecedent of
If [X is a frog] Then [X is colored green], so [Fritz is colored green] is
added to the knowledge base
• Step 3: The fact [Fritz is colored green] matches the goal
[Fritz is colored Y]?, so the search succeeds with Y = green
Backward Chaining
• Backward chaining is a goal-driven method of deriving a
particular goal from a given knowledge base and set of
inference rules
• Inference rules are applied by matching the goal of the
search to the consequents of the relations stored in the
knowledge base
• When such a relation is found, the antecedent of the
relation is added to the list of goals (and not into the
knowledge base, as is done in forward chaining)
Backward Chaining
• Search proceeds in this manner until a goal can
be matched against a fact in the knowledge
base
– Remember: facts are simply consequence relations
with empty antecedents, so this is like adding the
‘empty goal’ to the list of goals
• As with forward chaining, a search control
method is needed to select which goals will be
matched against which consequence relations
from the knowledge base
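For comparison, here is a runnable sketch of backward chaining on the same
Fritz example. It uses a two-way version of the simple matching convention
(single capital letters are variables on either side) and does not rename rule
variables between uses, which happens to be harmless in this example; again,
the code is an illustration written for these notes, not the course's
definition of the algorithm.

RULES = [  # (antecedent, consequent); X is a variable
    ("X croaks and eats flies", "X is a frog"),
    ("X chirps and sings",      "X is a canary"),
    ("X is a frog",             "X is colored green"),
    ("X is a canary",           "X is colored yellow"),
]
FACTS = ["Fritz croaks and eats flies"]

def unify(a, b, bindings=None):
    # Word-by-word unification; single capital letters on either side are
    # variables. Returns an extended copy of the bindings, or None.
    bindings = dict(bindings or {})
    aw, bw = a.split(), b.split()
    if len(aw) != len(bw):
        return None
    for x, y in zip(aw, bw):
        x, y = bindings.get(x, x), bindings.get(y, y)
        if x == y:
            continue
        if len(x) == 1 and x.isupper():
            bindings[x] = y
        elif len(y) == 1 and y.isupper():
            bindings[y] = x
        else:
            return None
    return bindings

def backward_chain(goal, rules, facts, bindings=None):
    for fact in facts:                           # goal matches a known fact?
        b = unify(goal, fact, bindings)
        if b is not None:
            return b
    for antecedent, consequent in rules:         # goal matches a consequent?
        b = unify(goal, consequent, bindings)
        if b is None:
            continue
        result = backward_chain(antecedent, rules, facts, b)   # new goal
        if result is not None:
            return result
    return None

print(backward_chain("Fritz is colored Y", RULES, FACTS))
# {'X': 'Fritz', 'Y': 'green'}
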
Backward Chaining Example
• Knowledge Base:
– If [X croaks and eats flies] Then [X is a frog]
– If [X chirps and sings] Then [X is a canary]
– If [X is a frog] Then [X is colored green]
– If [X is a canary] Then [X is colored yellow]
– [Fritz croaks and eats flies]
• Goals:
– [Fritz is colored Y]?
• Step 1: The goal [Fritz is colored Y] matches the consequent of
If [X is a frog] Then [X is colored green], so the antecedent [X is a frog]
is added to the list of goals
• Step 2: [Fritz is colored Y] also matches the consequent of
If [X is a canary] Then [X is colored yellow], so [X is a canary] is added
to the list of goals as well
• Step 3: The goal [X is a frog] matches the consequent of
If [X croaks and eats flies] Then [X is a frog], so [X croaks and eats flies]
is added to the list of goals
• Step 4: The goal [X croaks and eats flies] matches the fact
[Fritz croaks and eats flies] in the knowledge base, so the search succeeds
with X = Fritz, Y = green