Uncertainty
• Uncertain Knowledge
• Probability Review
• Bayes’ Theorem
• Summary
Uncertain Knowledge
In many situations we cannot assign a value of
true or false to world statements.
Example:
Symptom(p,Toothache) ⇒ Disease(p,Cavity)
But this rule is not always right. To generalize:
Symptom(p,Toothache) ⇒ Disease(p,Cavity) ∨
Disease(p,GumDisease) ∨ …
Uncertain Knowledge
Solution: deal with degrees of belief.
We will use probability theory. A probability
states a degree of belief based on evidence:
P(x) = 0.80 – based on the evidence, x occurs in
80% of the trials of the experiment.
It summarizes our uncertainty about what causes x.
(Degree of truth, in contrast, is the subject of fuzzy logic.)
Utility Theory
Probability theory combined with utility theory
gives decision theory.
To make a decision (choose an action), an agent
needs to have preferences between plans.
An agent should choose the action with the
highest expected utility, averaged over
all possible outcomes.
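The decision rule above can be sketched in a few lines. The action names, outcome probabilities, and utilities below are hypothetical, chosen only to illustrate the maximum-expected-utility computation:

```python
# Maximum-expected-utility action selection (hypothetical actions,
# outcome probabilities, and utilities for illustration).
def expected_utility(outcomes):
    """outcomes: list of (probability, utility) pairs for one action."""
    return sum(p * u for p, u in outcomes)

actions = {
    # action name: possible outcomes as (probability, utility)
    "go_dentist": [(0.8, 10), (0.2, -5)],
    "wait":       [(0.5, 0), (0.5, -20)],
}

# choose the action whose expected utility is highest
best = max(actions, key=lambda a: expected_utility(actions[a]))
print(best)
```

Here EU(go_dentist) = 0.8·10 + 0.2·(−5) = 7 beats EU(wait) = −10, so the agent picks "go_dentist".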
Uncertainty
• Uncertain Knowledge
• Probability Review
• Bayes’ Theorem
• Summary
Random Variable
Definition: A variable that can take on several
values, each value having a probability of
occurrence.
There are two types of random variables:
Discrete. Take on a countable number of
values.
Continuous. Take on a range of values.
Random Variable
Discrete Variables
• For every discrete variable X there will be a
probability function P(x) = P(X = x).
Random Variable
Continuous Variables:
• For every continuous random variable X we
will associate a probability density function f(x).
It is the area under the density function between
two points that corresponds to the probability of
the variable lying between the two values:
Prob(x1 < X <= x2) = ∫[x1, x2] f(x) dx
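This area can be approximated numerically. A minimal sketch, using the exponential density f(x) = e^(−x) as an assumed example distribution and simple midpoint-rule integration:

```python
import math

# Approximate Prob(x1 < X <= x2) = ∫ f(x) dx for a continuous density.
# Example density (assumed): exponential with rate 1, f(x) = exp(-x).
def f(x):
    return math.exp(-x)

def prob_between(x1, x2, n=100_000):
    # midpoint-rule numerical integration over [x1, x2]
    h = (x2 - x1) / n
    return sum(f(x1 + (i + 0.5) * h) for i in range(n)) * h

approx = prob_between(1.0, 2.0)
exact = math.exp(-1) - math.exp(-2)   # closed form for this density
print(approx, exact)                  # both ≈ 0.2325
```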
The Sample Space
• The space of all possible outcomes of a
given process or situation is called the
sample space S.
S = { red & small, blue & small, red & large, blue & large }
An Event
• An event A is a subset of the sample space.
(In the diagram, A = { red & small, red & large } ⊂ S.)
Atomic Event
An atomic event is a single point in S.
Properties:
• Atomic events are mutually exclusive
• The set of all atomic events is exhaustive
• A proposition is the disjunction of the
atomic events it covers.
The Laws of Probability
The probability of the sample space S is 1,
P(S) = 1
The probability of any event A is such that
0 <= P(A) <= 1.
Law of Addition
If A and B are mutually exclusive events, then
the probability that either one of them will
occur is the sum of the individual probabilities:
P(A or B) = P(A) + P(B)
The Laws of Probability
If A and B are not mutually exclusive:
P(A or B) = P(A) + P(B) – P(A and B)
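The addition law can be checked by enumeration. A small sketch on a single fair die, with two assumed example events, A = "even number" and B = "greater than 3":

```python
from fractions import Fraction

# Check P(A or B) = P(A) + P(B) - P(A and B) on a single fair die.
S = {1, 2, 3, 4, 5, 6}
A = {x for x in S if x % 2 == 0}   # even outcomes
B = {x for x in S if x > 3}        # outcomes greater than 3

def P(event):
    # equally likely outcomes: probability = |event| / |S|
    return Fraction(len(event), len(S))

lhs = P(A | B)                     # P(A or B)
rhs = P(A) + P(B) - P(A & B)       # inclusion-exclusion
print(lhs, rhs)                    # 2/3 2/3
```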
Prior Probability
P(A) is called the unconditional or prior
probability of event A. It reflects our original
degree of belief in A, before any evidence is seen.
Conditional Probabilities
• Given that A and B are events in sample space S,
and P(B) is different from 0, then the conditional
probability of A given B is
P(A|B) = P(A and B) / P(B)
• If A and B are independent then
P(A|B) = P(A)
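The definition can be verified by counting outcomes. Reusing the single-die setting, with assumed example events A = "even" and B = "greater than 3":

```python
from fractions import Fraction

# Conditional probability P(A|B) = P(A and B) / P(B) by enumeration
# on a single fair die (example events assumed for illustration).
S = {1, 2, 3, 4, 5, 6}
A = {x for x in S if x % 2 == 0}   # even
B = {x for x in S if x > 3}        # greater than 3

def P(event):
    return Fraction(len(event), len(S))

p_a_given_b = P(A & B) / P(B)
print(p_a_given_b)   # 2/3: learning the roll is > 3 raises P(even) from 1/2
```

Since P(A|B) = 2/3 ≠ P(A) = 1/2, these two events are not independent.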
The Laws of Probability
• Law of Multiplication
What is the probability that both A and B
occur together?
P(A and B) = P(A) P(B|A)
where P(B|A) is the probability of B conditioned
on A.
The Laws of Probability
If A and B are statistically independent:
P(B|A) = P(B) and then
P(A and B) = P(A) P(B)
Conditional Independence
A and B are conditionally independent given C if
P(A,B|C) = P(A|C) P(B|C)
Equivalently, if A and B are conditionally independent:
P(A|B,C) = P(A|C) and
P(B|A,C) = P(B|C)
Exercises
Find the probability that the sum of the numbers
on two unbiased dice will be even by considering the
probabilities that the individual dice will show an even
number.
Exercises
X1 – first throw
X2 – second throw
Pfinal = P(X1=1 & X2=1) + P(X1=1 & X2=3) + P(X1=1 & X2=5) +
P(X1=2 & X2=2) + P(X1=2 & X2=4) + P(X1=2 & X2=6) +
P(X1=3 & X2=1) + P(X1=3 & X2=3) + P(X1=3 & X2=5) +
…
P(X1=6 & X2=2) + P(X1=6 & X2=4) + P(X1=6 & X2=6).
Pfinal = 18/36 = 1/2
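The enumeration above can be checked by brute force over all 36 outcomes:

```python
from fractions import Fraction
from itertools import product

# Brute-force check: the sum of two fair dice is even with probability 1/2.
even = sum(1 for x1, x2 in product(range(1, 7), repeat=2)
           if (x1 + x2) % 2 == 0)
p = Fraction(even, 36)
print(p)   # 1/2
```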
Exercises
Find the probabilities of throwing a sum of a) 3, b) 4
with three unbiased dice.
X = X1 + X2 + X3
P(X=3)?
P(X1=1 & X2=1 & X3=1) = 1/216
P(X=4)?
P(X1=1 & X2=1 & X3=2) + P(X1=1 & X2=2 & X3=1) + …
P(X=4) = 3/216
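Both answers can be confirmed by enumerating all 216 equally likely rolls:

```python
from fractions import Fraction
from itertools import product

# Brute-force check with three fair dice: P(sum = 3) and P(sum = 4).
rolls = list(product(range(1, 7), repeat=3))   # all 216 outcomes
p3 = Fraction(sum(1 for r in rolls if sum(r) == 3), len(rolls))
p4 = Fraction(sum(1 for r in rolls if sum(r) == 4), len(rolls))
print(p3, p4)   # 1/216 and 1/72 (= 3/216)
```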
Exercises
Three men meet by chance. What are the probabilities
that a) none of them, b) two of them, c) all of them
have the same birthday?
Exercises
None of them have the same birthday
X1 – birthday 1st person
X2 – birthday 2nd person
X3 – birthday 3rd person
a) P(X2 differs from X1, and X3 differs from both X1 and X2)
Pfinal = (364/365)(363/365)
Exercises
Two of them have the same birthday
P(X1 = X2 and X3 differs from them) +
P(X1 = X3 and X2 differs) +
P(X2 = X3 and X1 differs).
P(X1=X2 and X3 differs) = (1/365)(364/365)
Pfinal = 3(1/365)(364/365)
Exercises
All of them have the same birthday
P(X1 = X2 = X3)
Pfinal = (1/365)(1/365)
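As a sanity check, the three cases (none, exactly two, all three share a birthday) are mutually exclusive and exhaustive, so their probabilities must sum to 1. Computing exactly with fractions (ignoring leap years, as the exercise does):

```python
from fractions import Fraction

# Exact values of the three birthday probabilities.
p_none = Fraction(364, 365) * Fraction(363, 365)        # all distinct
p_two  = 3 * Fraction(1, 365) * Fraction(364, 365)      # exactly one pair
p_all  = Fraction(1, 365) * Fraction(1, 365)            # all three equal

print(p_none + p_two + p_all)   # 1 -- the three cases are exhaustive
```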
Multivariate or Joint Distributions
P(x,y) = P( X = x and Y = y).
• P’(x) = Prob( X = x) = ∑y P(x,y)
It is called the marginal distribution of X.
The same can be done on Y to define
the marginal distribution of Y, P”(y).
• If X and Y are independent then
P(x,y) = P’(x) P”(y)
Expectations: The Mean
• Let X be a discrete random variable that takes
the following values: x1, x2, x3, …, xn.
Let P(x1), P(x2), P(x3),…,P(xn) be their
respective probabilities. Then the expected
value of X, E(X), is defined as
E(X) = x1P(x1) + x2P(x2) + x3P(x3) + … + xnP(xn)
E(X) = Σi xi P(xi)
Exercises
Suppose that X is a random variable taking the values
{-1, 0, 1} with equal probabilities, and that Y = X².
Find the joint distribution and the marginal distributions
of X and Y and also the conditional distributions of X
given a) Y = 0 and b) Y = 1.
Exercises
Joint distribution P(x,y), with the marginals in the last row and column:

           X = -1   X = 0   X = 1  |  P(y)
  Y = 0      0       1/3      0    |  1/3
  Y = 1     1/3       0      1/3   |  2/3
  P(x)      1/3      1/3     1/3   |
If Y = 0 then X= 0 with probability 1
If Y = 1 then X is equally likely to be +1 or -1
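The whole exercise can be reproduced mechanically: build the joint distribution from the definition of Y, then sum out each variable for the marginals and renormalize for the conditionals:

```python
from collections import defaultdict
from fractions import Fraction

# X uniform on {-1, 0, 1}, Y = X**2: build the joint distribution.
joint = defaultdict(Fraction)          # missing pairs have probability 0
for x in (-1, 0, 1):
    joint[(x, x * x)] += Fraction(1, 3)

# marginal distributions of X and Y
p_x = defaultdict(Fraction)
p_y = defaultdict(Fraction)
for (x, y), p in list(joint.items()):
    p_x[x] += p
    p_y[y] += p

# conditional distribution of X given Y = 1
cond = {x: joint[(x, 1)] / p_y[1] for x in (-1, 0, 1)}
print(p_y[0], p_y[1], cond[-1], cond[1])   # 1/3 2/3 1/2 1/2
```

The output matches the table: given Y = 1, X is +1 or -1 with probability 1/2 each, and given Y = 0, X must be 0.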
Uncertainty
• Uncertain Knowledge
• Probability Review
• Bayes’ Theorem
• Summary
Bayes’ Theorem
P(A,B) = P(A|B) P(B)
P(B,A) = P(B|A) P(A)
The theorem:
P(B|A) = P(A|B) P(B) / P(A)
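The theorem can be exercised on the toothache/cavity example from the start of the lecture. All numbers below are hypothetical, chosen only to illustrate the computation; P(A) is expanded by the law of total probability:

```python
# Bayes' theorem: P(B|A) = P(A|B) P(B) / P(A), where
# B = "patient has a cavity", A = "patient has a toothache".
# The probabilities are hypothetical, for illustration only.
p_b = 0.10              # prior P(Cavity)
p_a_given_b = 0.80      # P(Toothache | Cavity)
p_a_given_not_b = 0.05  # P(Toothache | no Cavity)

# total probability: P(A) = P(A|B)P(B) + P(A|~B)P(~B)
p_a = p_a_given_b * p_b + p_a_given_not_b * (1 - p_b)
p_b_given_a = p_a_given_b * p_b / p_a
print(round(p_b_given_a, 3))   # posterior P(Cavity | Toothache) ≈ 0.64
```

With these numbers the evidence raises the belief in a cavity from the prior 0.10 to a posterior of 0.64.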
More General Bayes’ Theorem
P(Y|X,e) = P(X|Y,e) P(Y|e) / P(X|e)
Where e: background evidence.
Thomas Bayes
Born in London (1701).
Studied logic and theology (Univ. of Edinburgh).
Fellow of the Royal Society (year 1742).
He posed two complementary questions:
Given the numbers of white and black balls in an urn, what is the
probability of drawing one or the other?
Conversely, having drawn one or more balls, what can be said about
the contents of the urn?
Uncertainty
• Uncertain Knowledge
• Probability Review
• Bayes’ Theorem
• Summary
Summary
• Uncertainty comes from ignorance of the
true state of the world.
• Probabilities indicate our degree of belief in
a certain event.
• Concepts: random variable, prior probabilities,
conditional probabilities, joint distributions,
conditional independence, Bayes’ theorem.
Application: Predicting the Stock Market
Bayesian networks (BNs) have been used to predict
the behavior of the stock market. BNs can be constructed
from daily stock returns over a certain period of time.
Stocks can be analyzed from well-known repositories:
e.g., the S&P 500 index.