Probability: Introduction
1
• Definitions
• Laws of Probability
• Random Variables
• Distributions
Statistical and Inductive Probability
2
• Statistical: relative frequency of occurrence after many trials
• Inductive: degree of belief in a certain event
[Figure: proportion of heads vs. number of flips of a coin, converging to 0.5 (law of large numbers)]
We will be concerned with the statistical view only.
The Sample Space
3
The space of all possible outcomes of a given process or situation is called the sample space S.
Example: cars crossing a check point based on color and size:
S = {red & small, blue & small, red & large, blue & large}
An Event
4
• An event is a subset of the sample space.
Example: Event A: red cars crossing a check point, irrespective of size:
A = {red & small, red & large}, a subset of S
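To make slides 3 and 4 concrete, here is a minimal Python sketch that builds the car sample space and the event A as sets; the assumption that all four outcomes are equally likely is mine, not the slides'.

```python
from fractions import Fraction

# Sample space S: cars crossing a check point, classified by color and size.
S = {("red", "small"), ("blue", "small"), ("red", "large"), ("blue", "large")}

# Event A: red cars, irrespective of size (a subset of S).
A = {outcome for outcome in S if outcome[0] == "red"}

# Assuming all four outcomes are equally likely (an assumption, not stated on
# the slides), P(A) is the fraction of outcomes of S that belong to A.
P_A = Fraction(len(A), len(S))
print(sorted(A))   # [('red', 'large'), ('red', 'small')]
print(P_A)         # 1/2
```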
Probability: Introduction
5
• Definitions
• Laws of Probability
• Random Variables
• Distributions
The Laws of Probability
6
The probability of the sample space S is 1, P(S) = 1
The probability of any event A is such that 0 <= P(A) <= 1.
Law of Addition
If A and B are mutually exclusive events, then the probability that either one of them will occur is the sum of the individual probabilities:
P(A or B) = P(A) + P(B)
If A and B are not mutually exclusive:
P(A or B) = P(A) + P(B) – P(A and B)
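A quick check of the addition law on a single fair die; the events A and B below are illustrative choices, not taken from the slides.

```python
from fractions import Fraction

# One fair die: a finite sample space with equally likely outcomes.
S = {1, 2, 3, 4, 5, 6}

def P(event):
    return Fraction(len(event), len(S))

A = {2, 4, 6}   # "die shows an even number"  (illustrative event)
B = {4, 5, 6}   # "die shows more than 3"     (illustrative event)

# A and B are not mutually exclusive (they share {4, 6}), so the general
# form of the law of addition applies.
print(P(A | B))                   # 2/3
print(P(A) + P(B) - P(A & B))     # 2/3 -- the same value
```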
Conditional Probabilities
7
• Given that A and B are events in sample space S, and P(B) is different from 0, then the conditional probability of A given B is
P(A|B) = P(A and B) / P(B)
• If A and B are independent then P(A|B) = P(A)
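A small sketch of the conditional-probability definition using two dice; the events chosen here (first die shows 6, sum equals 8) are illustrative assumptions, not from the slides.

```python
from fractions import Fraction
from itertools import product

# Two fair dice: the sample space of ordered pairs (first throw, second throw).
S = set(product(range(1, 7), repeat=2))

def P(event):
    return Fraction(len(event), len(S))

A = {(d1, d2) for (d1, d2) in S if d1 == 6}        # "first die shows 6"
B = {(d1, d2) for (d1, d2) in S if d1 + d2 == 8}   # "the two dice sum to 8"

# P(A|B) = P(A and B) / P(B), as defined on the slide.
print(P(A & B) / P(B))   # 1/5
print(P(A))              # 1/6 -- different, so A and B are not independent here
```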
The Laws of Probability
8
• Law of Multiplication
What is the probability that both A and B occur together?
P(A and B) = P(A) P(B|A)
where P(B|A) is the probability of B conditioned on A.
If A and B are statistically independent:
P(B|A) = P(B) and then
P(A and B) = P(A) P(B)
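The same two-dice setup can illustrate the law of multiplication and the independent case; again the events are my own illustrative choices.

```python
from fractions import Fraction
from itertools import product

S = set(product(range(1, 7), repeat=2))            # two fair dice

def P(event):
    return Fraction(len(event), len(S))

A = {(d1, d2) for (d1, d2) in S if d1 % 2 == 0}    # "first throw is even"
B = {(d1, d2) for (d1, d2) in S if d2 % 2 == 0}    # "second throw is even"

# Law of multiplication: P(A and B) = P(A) * P(B|A).
P_B_given_A = P(A & B) / P(A)
print(P(A & B) == P(A) * P_B_given_A)   # True

# The two throws are independent, so P(B|A) = P(B) and P(A and B) = P(A) * P(B).
print(P_B_given_A == P(B))              # True
print(P(A & B) == P(A) * P(B))          # True (1/4 both ways)
```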
Exercises
9
Find the probability that the sum of the numbers
on two unbiased dice will be even by considering the
probabilities that the individual dice will show an even
number.
Exercises
10
X1 – first throw
X2 – second throw
Exercises
11
X1 – first throw
X2 – second throw
Pfinal = P(X1=1 & X2=1) + P(X1=1 & X2=3) + P(X1=1 & X2=5) +
P(X1=2 & X2=2) + P(X1=2 & X2=4) + P(X1=2 & X2=6) +
P(X1=3 & X2=1) + P(X1=3 & X2=3) + P(X1=3 & X2=5) +
…
P(X1=6 & X2=2) + P(X1=6 & X2=4) + P(X1=6 & X2=6).
Pfinal = 18/36 = 1/2
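A brute-force verification of this result, plus the route the exercise hints at (both dice even or both odd); a minimal sketch, not part of the original slides.

```python
from fractions import Fraction
from itertools import product

# Enumerate all 36 equally likely outcomes of two dice and count even sums.
outcomes = list(product(range(1, 7), repeat=2))
even_sum = [(x1, x2) for (x1, x2) in outcomes if (x1 + x2) % 2 == 0]
print(len(even_sum), Fraction(len(even_sum), len(outcomes)))   # 18 1/2

# The route the exercise asks for: the sum is even exactly when both dice are
# even or both are odd, so P = (1/2)(1/2) + (1/2)(1/2) = 1/2.
half = Fraction(1, 2)
print(half * half + half * half)   # 1/2
```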
Exercises
12
Find the probabilities of throwing a sum of a) 3, b) 4
with three unbiased dice.
Exercises
13
Find the probabilities of throwing a sum of a) 3, b) 4
with three unbiased dice.
X = sum of X1 and X2 and X3
P(X=3)?
P(X1=1 & X2=1 & X3=1) = 1/216
P(X=4)?
P(X1=1 & X2=1 & X3=2) + P(X1=1 & X2=2 & X3=1) + …
P(X=4) = 3/216
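The same kind of enumeration confirms the three-dice answers; a small verification sketch, not from the slides.

```python
from fractions import Fraction
from itertools import product

# Enumerate all 6^3 = 216 equally likely outcomes of three unbiased dice.
outcomes = list(product(range(1, 7), repeat=3))

def p_sum(target):
    hits = [o for o in outcomes if sum(o) == target]
    return Fraction(len(hits), len(outcomes))

print(p_sum(3))   # 1/216 -- only (1, 1, 1)
print(p_sum(4))   # 1/72  -- the three orderings of (1, 1, 2), i.e. 3/216
```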
Exercises
14
Three men meet by chance. What are the probabilities
that a) none of them, b) two of them, c) all of them
have the same birthday?
Exercises
15
None of them have the same birthday
X1 – birthday 1st person
X2 – birthday 2nd person
X3 – birthday 3rd person
a) P(X2 is different from X1 & X3 is different from X1 and X2)
Pfinal = (364/365)(363/365)
Exercises
16
Two of them have the same birthday
P(X1 = X2 and X3 is different from X1 and X2) +
P(X1=X3 and X2 differs) +
P(X2=X3 and X1 differs).
P(X1=X2 and X3 differs) = (1/365)(364/365)
Pfinal = 3(1/365)(364/365)
Exercises
17
All of them have the same birthday
P(X1 = X2 = X3)
Pfinal = (1/365)(1/365)
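A short sketch that evaluates the three birthday answers exactly (assuming 365 equally likely birthdays, as the slides implicitly do) and checks that the cases sum to 1.

```python
from fractions import Fraction

p_none = Fraction(364, 365) * Fraction(363, 365)    # all three birthdays differ
p_two  = 3 * Fraction(1, 365) * Fraction(364, 365)  # exactly one shared pair
p_all  = Fraction(1, 365) * Fraction(1, 365)        # all three the same

print(float(p_none))   # ~0.99180
print(float(p_two))    # ~0.00820
print(float(p_all))    # ~0.0000075
print(p_none + p_two + p_all == 1)   # True: the three cases are exhaustive
```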
Probability: Introduction
18
• Definitions
• Laws of Probability
• Random Variables
• Distributions
Random Variable
19
Definition: A variable that can take on several values, each value having a probability of occurrence.
• There are two types of random variables:
Discrete: take on a countable number of values.
Continuous: take on a range of values.
Discrete Variables
• For every discrete variable X there will be a probability function P(x) = P(X = x).
• The cumulative probability function for X is defined as F(x) = P(X <= x).
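A minimal illustration of P(x) and F(x) for a discrete variable; the fair die used here is an illustrative choice, not from the slides.

```python
from fractions import Fraction

# Probability function P(x) = P(X = x) for a fair die.
P = {x: Fraction(1, 6) for x in range(1, 7)}

# Cumulative probability function F(x) = P(X <= x).
def F(x):
    return sum(p for value, p in P.items() if value <= x)

print(F(3))   # 1/2
print(F(6))   # 1 -- the entire sample space
```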
Random Variable
20
Continuous Variables:
• Concept of histogram.
• For every variable X we will associate a probability density function f(x). The probability is the area lying between two values:
Prob(x1 < X <= x2) = ∫ from x1 to x2 of f(x) dx
• The cumulative probability function is defined as
F(x) = Prob(X <= x) = ∫ from –infinity to x of f(u) du
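A numerical sketch of both integrals; the density f(x) = 2x on [0, 1] and the midpoint-rule integrator are illustrative assumptions, not from the slides.

```python
def f(x):
    # Illustrative density: f(x) = 2x on [0, 1], zero elsewhere.
    return 2.0 * x if 0.0 <= x <= 1.0 else 0.0

def integrate(g, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of g from a to b."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

# Prob(x1 < X <= x2) = integral of f(x) from x1 to x2
print(integrate(f, 0.25, 0.75))   # ~0.5  (exact: 0.75^2 - 0.25^2)

# F(x) = Prob(X <= x); here f is zero below 0, so the integral starts at 0.
print(integrate(f, 0.0, 0.5))     # ~0.25
```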
Multivariate Distributions
21
• P(x,y) = P(X = x and Y = y).
• P’(x) = Prob(X = x) = Σy P(x,y)
It is called the marginal distribution of X.
The same can be done on Y to define the marginal distribution of Y, P”(y).
• If X and Y are independent then
P(x,y) = P’(x) P”(y)
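A small sketch of marginals and the independence condition; the joint probabilities assigned to the car example below are purely illustrative assumptions.

```python
from fractions import Fraction

# Illustrative joint distribution P(x, y) over the car example from slide 3.
P = {
    ("red", "small"): Fraction(3, 10), ("red", "large"): Fraction(2, 10),
    ("blue", "small"): Fraction(3, 10), ("blue", "large"): Fraction(2, 10),
}

colors = {"red", "blue"}
sizes = {"small", "large"}

# Marginal distributions, obtained by summing out the other variable.
P_color = {c: sum(p for (x, _), p in P.items() if x == c) for c in colors}
P_size = {s: sum(p for (_, y), p in P.items() if y == s) for s in sizes}
print(P_color)   # red: 1/2, blue: 1/2
print(P_size)    # small: 3/5, large: 2/5

# Independence check: P(x, y) = P'(x) * P''(y) for every cell.
print(all(P[(c, s)] == P_color[c] * P_size[s] for c in colors for s in sizes))  # True
```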
Expectations: The Mean
22
• Let X be a discrete random variable that takes the following
values:
x1, x2, x3, …, xn.
Let P(x1), P(x2), P(x3),…,P(xn) be their respective
probabilities. Then the expected value of X, E(X), is
defined as
E(X) = x1P(x1) + x2P(x2) + x3P(x3) + … + xnP(xn)
E(X) = Σi xi P(xi)
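The definition applied to a fair die (an illustrative choice, not from the slides):

```python
from fractions import Fraction

# E(X) = sum_i x_i * P(x_i), here for a fair die.
P = {x: Fraction(1, 6) for x in range(1, 7)}
E_X = sum(x * p for x, p in P.items())
print(E_X)   # 7/2
```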
Exercises
23
Suppose that X is a random variable taking the values
{-1, 0, and 1} with equal probabilities and that Y = X2 .
Find the joint distribution and the marginal distributions
of X and Y and also the conditional distributions of X
given a) Y = 0 and b) Y = 1.
Exercises
24
Joint distribution of X and Y, with the marginal distributions in the last row and column:

           X = -1   X = 0   X = 1   P(y)
  Y = 0      0       1/3      0     1/3
  Y = 1     1/3       0      1/3    2/3
  P(x)      1/3      1/3     1/3

If Y = 0 then X = 0 with probability 1
If Y = 1 then X is equally likely to be +1 or -1
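The same exercise worked in code, reproducing the joint, marginal, and conditional distributions above; a verification sketch, not part of the slides.

```python
from fractions import Fraction

third = Fraction(1, 3)

# Joint distribution of (X, Y) with X uniform on {-1, 0, 1} and Y = X^2.
joint = {}
for x in (-1, 0, 1):
    y = x * x
    joint[(x, y)] = joint.get((x, y), Fraction(0)) + third

# Marginal distributions P'(x) and P''(y).
P_X = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (-1, 0, 1)}
P_Y = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in (0, 1)}
print(P_X)   # 1/3 for each value of X
print(P_Y)   # {0: Fraction(1, 3), 1: Fraction(2, 3)}

# Conditional distributions of X given Y: P(x|y) = P(x, y) / P''(y).
for y in (0, 1):
    cond = {x: joint.get((x, y), Fraction(0)) / P_Y[y] for x in (-1, 0, 1)}
    print(y, cond)   # Y=0: X=0 with probability 1; Y=1: X = -1 or +1, each 1/2
```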
Probability: Introduction
25
• Definitions
• Laws of Probability
• Random Variables
• Distributions
Properties of Distributions
26
Measures of Location
Mean: Average of observations
Mean = Σi xi / N
Median: Middle observation
Example: 9, 11, 12, 13, 13 Median: 12
Mode: The most frequent observation (value with highest prob.)
Example: 1, 2, 3, 3, 4, 5, 6 Mode: 3
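The slide's examples checked with Python's standard library (the mean of the first data set is added for comparison):

```python
import statistics

print(statistics.median([9, 11, 12, 13, 13]))    # 12, the middle observation
print(statistics.mode([1, 2, 3, 3, 4, 5, 6]))    # 3, the most frequent observation
print(statistics.mean([9, 11, 12, 13, 13]))      # 11.6, the average
```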
Mean
27
The mean is the expected value of X:
E[X] = ∫ x f(x) dx
A distribution is uniform on [0, 1] when f(x) = 1 for x between 0 and 1.
What is the expected value of X if it is uniformly distributed?
[Figure: the uniform density f(x) = 1 on the interval [0, 1]]
Mean
28
What is the expected value of X if it is uniformly distributed?
[Figure: the uniform density f(x) = 1 on the interval [0, 1]]
E[X] = ∫ x dx evaluated from 0 to 1 = ½ x2 evaluated on [0, 1] = 1/2
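A numerical cross-check of this integral using a simple midpoint rule; the integrator is an illustrative sketch, not from the slides.

```python
# E[X] for X uniform on [0, 1]: approximate the integral of x * f(x) with f(x) = 1.
n = 100_000
h = 1.0 / n
approx = sum((i + 0.5) * h * 1.0 for i in range(n)) * h
print(approx)   # ~0.5, matching the exact value 1/2
```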
Properties of Distributions
29
Measures of Location
[Figure: a distribution with the mode, median, and mean marked]
Properties of Distributions
30
Measures of Dispersion
Most popular: Variance
Variance = S2 / N, where S2 = Σi (xi – mean)2
Standard deviation: σ = √variance
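The variance formula applied to the observations from slide 26; a minimal sketch, and note it divides by N as on the slide (the population form).

```python
data = [9, 11, 12, 13, 13]          # the observations from slide 26
N = len(data)
mean = sum(data) / N

# Variance as defined on the slide: S2 / N, with S2 the sum of squared deviations.
S2 = sum((x - mean) ** 2 for x in data)
variance = S2 / N
sigma = variance ** 0.5             # standard deviation
print(variance, sigma)              # ~2.24 ~1.497
```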
Properties of Distributions
31
Skewness: Measure of symmetry.
M3 = Σi (xi – mean)3 / N
Skewness = M3 / σ3
[Figure: distributions skewed to the right, symmetric, and skewed to the left]
Properties of Distributions
32
Kurtosis: Measure of the peakedness (heaviness of the tails) of a distribution.
M4 = Σi (xi – mean)4 / N
Kurtosis = M4 / σ4
[Figure: distributions with low kurtosis and high kurtosis]
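The moment formulas from the skewness and kurtosis slides applied to a small illustrative data set (my own, chosen to have a long right tail):

```python
data = [1, 2, 2, 3, 3, 3, 4, 9]
N = len(data)
mean = sum(data) / N
sigma = (sum((x - mean) ** 2 for x in data) / N) ** 0.5

M3 = sum((x - mean) ** 3 for x in data) / N
M4 = sum((x - mean) ** 4 for x in data) / N

print(M3 / sigma ** 3)   # > 0: the sample is skewed to the right
print(M4 / sigma ** 4)   # > 3 (the normal-distribution value): heavier tails
```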
Correlation Coefficient
33
The correlation coefficient ρ is defined as follows:
ρ = COV[X,Y] / √(V[X] V[Y])
It is a measure of the (linear) relationship between the variables X and Y.
[Figure: scatter plots illustrating ρ = 1 and ρ = –1]
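The correlation coefficient computed directly from this definition on illustrative paired data (the numbers are my own):

```python
# rho = COV[X, Y] / sqrt(V[X] * V[Y])
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.0, 9.8]      # roughly 2 * x, so rho should be near +1

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
cov   = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
var_x = sum((x - mx) ** 2 for x in xs) / n
var_y = sum((y - my) ** 2 for y in ys) / n

print(cov / (var_x * var_y) ** 0.5)   # close to 1
```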
Normal Distribution
34
A continuous random variable is normally distributed if its probability density function is
f(x) = (1 / (σ √(2π))) e^(–(x – μ)2 / (2σ2))
where x goes from –infinity to infinity
E[X] = μ
V[X] = σ2
[Figure: bell-shaped normal density centered at μ, with spread determined by σ2]
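A numerical sanity check of the stated properties, evaluating the density exactly as written above with illustrative parameters μ = 0 and σ = 1:

```python
import math

mu, sigma = 0.0, 1.0

def f(x):
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Midpoint-rule integrals over [-8, 8]; the tails beyond that are negligible.
n = 100_000
h = 16.0 / n
xs = [-8.0 + (i + 0.5) * h for i in range(n)]
fs = [f(x) for x in xs]

print(sum(fs) * h)                                         # ~1.0 : total probability
print(sum(x * v for x, v in zip(xs, fs)) * h)              # ~0.0 : E[X] = mu
print(sum((x - mu) ** 2 * v for x, v in zip(xs, fs)) * h)  # ~1.0 : V[X] = sigma^2
```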
Central Limit Theorem
35
The sum of a large number of independent random variables
will be approximately normally distributed almost regardless
of their individual distributions.
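A small simulation illustrating the theorem: sums of uniform variables (an illustrative choice of starting distribution) behave approximately normally, with the mean, variance, and one-sigma coverage one would expect.

```python
import random
import statistics

random.seed(0)

# Each observation is the sum of 30 independent uniform(0, 1) variables.
n_terms, n_samples = 30, 20_000
sums = [sum(random.random() for _ in range(n_terms)) for _ in range(n_samples)]

# A uniform(0, 1) variable has mean 1/2 and variance 1/12, so the sum should be
# approximately normal with mean 15 and variance 30/12 = 2.5.
print(statistics.mean(sums))        # ~15.0
print(statistics.pvariance(sums))   # ~2.5

# Normality check: roughly 68% of the values should lie within one sigma of the mean.
sigma = 2.5 ** 0.5
print(sum(abs(s - 15.0) <= sigma for s in sums) / n_samples)   # ~0.68
```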