Uncertainty
in Artificial Intelligence
Ramtin Raji Kermani, Mehrdad Rashidi, Hadi Yaghoobian
Overview
- Uncertainty
- Probability
- Syntax and Semantics
- Inference
- Independence and Bayes' Rule

Uncertain Agent
[Diagram: an agent interacting with its environment through sensors and actuators; question marks over the agent's internal model indicate uncertainty.]
Uncertainty !!!
Let action At = leave for airport t minutes before flight.
Will At get me there on time?

Problems:
- partial observability (road state, other drivers' plans, etc.)
- noisy sensors (traffic reports)
- uncertainty in action outcomes (flat tire, etc.)
- immense complexity of modeling and predicting traffic

Hence a purely logical approach either
- risks falsehood: "A25 will get me there on time", or
- leads to conclusions that are too weak for decision making:
  "A25 will get me there on time if there's no accident on the bridge …"
Truth & Belief
- Ontological commitment: what exists in the world — TRUTH
- Epistemological commitment: what an agent believes about the facts — BELIEF

Types of Uncertainty

- Uncertainty in prior knowledge
  E.g., some causes of a disease are unknown and are not represented in the
  background knowledge of a medical-assistant agent
- Uncertainty in actions
  E.g., actions are represented with relatively short lists of preconditions,
  while these lists are in fact arbitrarily long
- Uncertainty in perception
  E.g., sensors do not return exact or complete information about the world;
  a robot never knows its exact position
Questions

- How to represent uncertainty in knowledge?
- How to perform inferences with uncertain knowledge?
- Which action to choose under uncertainty?
How do we represent uncertainty?
We need to answer several questions:
- What do we represent & how do we represent it?
  What language do we use to represent our uncertainty? What are the
  semantics of our representation?
- What can we do with the representations?
  What queries can be answered? How do we answer them?
- How do we construct a representation?
  Can we ask an expert? Can we learn from data?
Uncertainty?
What we call uncertainty is a summary of all that is not explicitly taken
into account in the agent's KB.

Sources of uncertainty:
- Ignorance
- Laziness (efficiency?)
Methods for handling uncertainty

Default (nonmonotonic) logic
- Creed: the world is fairly normal; abnormalities are rare.
- So an agent assumes normality until there is evidence to the contrary,
  e.g., assume my car does not have a flat tire; assume A25 works unless
  contradicted by evidence.
- E.g., if an agent sees a bird x, it assumes that x can fly, unless it has
  evidence that x is a penguin, an ostrich, a dead bird, a bird with broken
  wings, …
- Issues: What assumptions are reasonable? How to handle contradiction?

Worst-case reasoning
- Creed: just the opposite! The world is ruled by Murphy's Law.
- Uncertainty is defined by sets, e.g., the set of possible outcomes of an
  action, or the set of possible positions of a robot.
- The agent assumes the worst case and chooses the action that maximizes a
  utility function in that case.
- Example: adversarial search

Rules with fudge factors
- A25 |→ 0.3 get there on time
- Sprinkler |→ 0.99 WetGrass
- WetGrass |→ 0.7 Rain
- Issues: problems with combination, e.g., does Sprinkler cause Rain?

Probabilistic reasoning
- Creed: the world is not divided between "normal" and "abnormal", nor is it
  adversarial; possible situations have various likelihoods (probabilities).
- The agent has probabilistic beliefs – pieces of knowledge with associated
  probabilities (strengths) – and chooses its actions to maximize the expected
  value of some utility function.
How do we deal with uncertainty?
- Implicit:
  - Ignore what you are uncertain of when you can
  - Build procedures that are robust to uncertainty
- Explicit:
  - Build a model of the world that describes uncertainty about its state,
    dynamics, and observations
  - Reason about the effects of actions given the model
Making decisions under uncertainty
Suppose I believe the following:
  P(A25 gets me there on time | …)   = 0.04
  P(A90 gets me there on time | …)   = 0.70
  P(A120 gets me there on time | …)  = 0.95
  P(A1440 gets me there on time | …) = 0.9999
Which action to choose?
Probability
- A well-known and well-understood framework for uncertainty
- Clear semantics
- Provides principled answers for:
  - combining evidence
  - predictive & diagnostic reasoning
  - incorporation of new evidence
- Intuitive (at some level) to human experts
- Can be learned
Probability
Probabilistic assertions summarize the effects of
- laziness: failure to enumerate exceptions, qualifications, etc.
- ignorance: lack of relevant facts, initial conditions, etc.

Subjective probability:
- Probabilities relate propositions to the agent's own state of knowledge,
  e.g., P(A25 | no reported accidents) = 0.06
- These are not assertions about the world.
Decision Theory

- Decision theory develops methods for making optimal decisions in the
  presence of uncertainty.
- Decision theory = utility theory + probability theory
- Utility theory is used to represent and infer preferences: every state has
  a degree of usefulness (utility).
- An agent is rational if and only if it chooses the action that yields the
  highest expected utility, averaged over all possible outcomes of the action
  (see the sketch below).
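As a minimal sketch of this maximum-expected-utility rule, the following Python snippet scores the airport-departure actions from the earlier slide; the on-time probabilities come from that slide, but the utility values and waiting cost are hypothetical, invented for illustration.

# Expected-utility choice among the airport-departure actions.
# On-time probabilities are from the earlier slide; the utilities and the
# per-minute cost of waiting are assumed numbers for illustration only.
p_on_time = {"A25": 0.04, "A90": 0.70, "A120": 0.95, "A1440": 1440 and 0.9999}
p_on_time = {"A25": 0.04, "A90": 0.70, "A120": 0.95, "A1440": 0.9999}
minutes_early = {"A25": 25, "A90": 90, "A120": 120, "A1440": 1440}

U_ON_TIME = 100.0        # assumed utility of arriving on time
U_LATE = -500.0          # assumed utility of missing the flight
WAIT_COST_PER_MIN = 0.2  # assumed cost per minute spent waiting at the airport

def expected_utility(action):
    # EU(action) = sum over outcomes of P(outcome) * U(outcome), minus waiting cost
    p = p_on_time[action]
    return p * U_ON_TIME + (1 - p) * U_LATE - WAIT_COST_PER_MIN * minutes_early[action]

for action in sorted(p_on_time, key=expected_utility, reverse=True):
    print(action, round(expected_utility(action), 1))

print("Rational choice:", max(p_on_time, key=expected_utility))  # A120 under these numbers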
Random variables
- A discrete random variable takes values from a countable domain; a
  probability between 0 and 1 is assigned to each value.
- Example: Weather is a discrete (propositional) random variable with
  domain <sunny, rain, cloudy, snow>.
  - sunny is an abbreviation for Weather = sunny
  - P(Weather = sunny) = 0.72, P(Weather = rain) = 0.1, etc.
  - This can be written: P(sunny) = 0.72, P(rain) = 0.1, etc.
  - Domain values must be exhaustive and mutually exclusive
- Other types of random variables:
  - A Boolean random variable has the domain <true, false>,
    e.g., Cavity (a special case of a discrete random variable)
  - A continuous random variable has the domain of real numbers, e.g., Temp
Propositions

- An elementary proposition is constructed by assigning a value to a random
  variable:
  e.g., Weather = sunny, Cavity = false (abbreviated as ¬cavity)
- Complex propositions are formed from elementary propositions and standard
  logical connectives,
  e.g., Weather = sunny ∨ Cavity = false
Atomic Events

- Atomic event: a complete specification of the state of the world about
  which the agent is uncertain.
- E.g., if the world consists of only two Boolean variables, Cavity and
  Toothache, then there are 4 distinct atomic events:
    Cavity = false ∧ Toothache = false
    Cavity = false ∧ Toothache = true
    Cavity = true ∧ Toothache = false
    Cavity = true ∧ Toothache = true
- Atomic events are mutually exclusive and exhaustive.
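As a small sketch, the four atomic events over Cavity and Toothache can be enumerated mechanically; the same idea works for any set of Boolean variables.

from itertools import product

# An atomic event is one complete assignment of truth values to all variables.
variables = ["Cavity", "Toothache"]
atomic_events = [dict(zip(variables, values))
                 for values in product([False, True], repeat=len(variables))]

for event in atomic_events:
    print(event)   # 4 mutually exclusive, exhaustive assignments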
Atomic Events, Events & the Universe
- The universe consists of atomic events
- An event is a set of atomic events
- P: event → [0, 1]
[Venn diagram: two events A and B inside the universe U, overlapping in A ∧ B.]
Axioms of Probability
- P(true) = 1 = P(U)
- P(false) = 0 = P(∅)
- P(A ∨ B) = P(A) + P(B) - P(A ∧ B)
[Venn diagram: events A and B inside the universe U.]
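A quick numerical check of the third axiom (inclusion-exclusion), treating events as sets of atomic events with a made-up uniform probability; the event names here are hypothetical.

# Events as sets of atomic events, with a uniform probability over a
# hypothetical 4-element universe.
universe = {"ab", "aB", "Ab", "AB"}

def P(event):
    return len(event) / len(universe)

A = {"Ab", "AB"}   # atomic events in which A holds
B = {"aB", "AB"}   # atomic events in which B holds

# P(A ∨ B) = P(A) + P(B) - P(A ∧ B); note '|' and '&' are set union/intersection here.
assert abs(P(A | B) - (P(A) + P(B) - P(A & B))) < 1e-12
print(P(A | B))    # 0.75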
Prior probability

- Prior (unconditional) probability corresponds to belief prior to the
  arrival of any (new) evidence,
  e.g., P(sunny) = 0.72, P(rain) = 0.1, etc.
- A probability distribution gives values for all possible assignments
  (vector notation):
  P(Weather) = <0.72, 0.1, 0.08, 0.1>
- The distribution sums to 1 over the domain.
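A minimal sketch of how the prior distribution P(Weather) from this slide could be stored and checked for normalization:

import math

# Prior distribution P(Weather) over an exhaustive, mutually exclusive domain.
P_weather = {"sunny": 0.72, "rain": 0.10, "cloudy": 0.08, "snow": 0.10}

assert math.isclose(sum(P_weather.values()), 1.0)   # must sum to 1 over the domain
print(P_weather["sunny"])                           # prior belief P(sunny) = 0.72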
Joint probability distribution

- Probability assignment to all combinations of values of the random variables:

                Toothache   ¬Toothache
      Cavity      0.04         0.06
     ¬Cavity      0.01         0.89

- The sum of the entries in this table has to be 1.
- Every question about the domain can be answered by the joint distribution.
- The probability of a proposition is the sum of the probabilities of the
  atomic events in which it holds:
  - P(cavity) = 0.1     [add the elements of the cavity row]
  - P(toothache) = 0.05 [add the elements of the toothache column]
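A minimal sketch of the full joint distribution from the table above; the probability of any proposition is obtained by summing the atomic events in which it holds.

# Full joint distribution over (Cavity, Toothache), keyed by truth values.
joint = {
    (True,  True):  0.04,   # cavity ∧ toothache
    (True,  False): 0.06,   # cavity ∧ ¬toothache
    (False, True):  0.01,   # ¬cavity ∧ toothache
    (False, False): 0.89,   # ¬cavity ∧ ¬toothache
}

def prob(holds):
    # Sum the joint over all atomic events for which the proposition holds.
    return sum(p for event, p in joint.items() if holds(event))

print(prob(lambda e: e[0]))  # P(cavity)    = 0.10 (cavity row)
print(prob(lambda e: e[1]))  # P(toothache) = 0.05 (toothache column)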
Conditional Probability

                Toothache   ¬Toothache
      Cavity      0.04         0.06
     ¬Cavity      0.01         0.89

- P(cavity) = 0.1 and P(cavity ∧ toothache) = 0.04 are both prior
  (unconditional) probabilities.
- Once the agent has new evidence concerning a previously unknown random
  variable, e.g., toothache, we can specify a posterior (conditional)
  probability, e.g., P(cavity | toothache).
- P(A | B) = P(A ∧ B) / P(B)   [probability of A with the universe U limited to B]
- P(cavity | toothache) = 0.04 / 0.05 = 0.8
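Using the same joint table, the posterior P(cavity | toothache) follows directly from the definition P(A | B) = P(A ∧ B) / P(B):

# Joint table over (Cavity, Toothache) from the slide above.
joint = {(True, True): 0.04, (True, False): 0.06,
         (False, True): 0.01, (False, False): 0.89}

p_cavity_and_toothache = joint[(True, True)]              # P(cavity ∧ toothache) = 0.04
p_toothache = joint[(True, True)] + joint[(False, True)]  # P(toothache) = 0.05

print(p_cavity_and_toothache / p_toothache)               # P(cavity | toothache) = 0.8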
Conditional Probability (continued)
- Definition of conditional probability:
  P(A | B) = P(A ∧ B) / P(B)
- The product rule gives an alternative formulation:
  P(A ∧ B) = P(A | B) P(B) = P(B | A) P(A)
- A general version holds for whole distributions:
  P(Weather, Cavity) = P(Weather | Cavity) P(Cavity)
- The chain rule is derived by successive application of the product rule:
  P(X1, …, Xn) = P(X1, …, Xn-1) P(Xn | X1, …, Xn-1)
               = P(X1, …, Xn-2) P(Xn-1 | X1, …, Xn-2) P(Xn | X1, …, Xn-1)
               = …
               = ∏_{i=1}^{n} P(Xi | X1, …, Xi-1)