Review of Probability Theory (Examples):
Example 1:
Suppose we are interested in an experiment where we conduct Bernoulli trials until the occurrence of m
1's. When the mth '1' occurs, the experiment ends. Let us define a random variable X as the number of
trials until the mth '1' occurs. What is the distribution of the random variable X?
Solution:
We would like to evaluate P[X=k] for k = m, m+1, m+2, ...
By definition of the random variable, the mth '1' occurs at the kth trial. This means that in the (k-1) trials
before it, (m-1) 1's have occurred. The probability of having (m-1) 1's in (k-1) trials follows a binomial
distribution:
P[(m-1) 1's in (k-1) trials] = C(k-1, m-1) p^(m-1) (1-p)^(k-m)

Now, to calculate P[X=k]: we need (m-1) 1's to occur in the first (k-1) trials AND the kth trial to be a '1':

P[X=k] = C(k-1, m-1) p^(m-1) (1-p)^(k-m) * p

P[X=k] = C(k-1, m-1) p^m (1-p)^(k-m),   k = m, m+1, m+2, ...
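This PMF (the negative binomial distribution) can be sanity-checked numerically. The sketch below, with function names of my own choosing, compares the formula against a direct simulation of Bernoulli trials:

```python
import math
import random

def negbin_pmf(k, m, p):
    """P[X = k]: probability that the m-th '1' occurs on trial k.
    C(k-1, m-1) * p^m * (1-p)^(k-m), for k = m, m+1, ..."""
    return math.comb(k - 1, m - 1) * p**m * (1 - p)**(k - m)

def trials_until_m_ones(m, p, rng):
    """Run Bernoulli(p) trials until the m-th '1'; return the trial count."""
    count = ones = 0
    while ones < m:
        count += 1
        if rng.random() < p:
            ones += 1
    return count

rng = random.Random(0)
m, p, n = 3, 0.4, 100_000
samples = [trials_until_m_ones(m, p, rng) for _ in range(n)]
for k in range(m, m + 5):
    empirical = sum(s == k for s in samples) / n
    print(f"k={k}: formula={negbin_pmf(k, m, p):.4f}  simulated={empirical:.4f}")
```

The simulated frequencies should agree with the formula to within sampling noise.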
Example 2:
Two events A and B can occur, and P[A], P[B] are both nonzero. Which combinations of Independent (I), Not
Independent (NI), Mutually exclusive (M), and Not Mutually exclusive (NM) are permissible? In other
words, which of the four combinations (I,M), (NI,M), (I,NM), and (NI,NM) are permissible? Construct an
example for each combination that is permissible.
Solution:
Notes:
i) (I,M)
This combination is NOT POSSIBLE because, by definition,
mutually exclusive means that upon the occurrence of event A,
the probability of occurrence of event B is zero (i.e.,
P[B|A]=0). This means that even though P[B]≠0, knowledge
of A changes this probability to zero.
Ex: Throwing a die once. Let A={1,2}. Let B={3,4}
Now if event A has occurred, then for sure the outcome of the
experiment does not belong to B ⇒ P[B|A]=0, even though
we know that P[B] = 2/6 = 1/3 ⇒ independence is not
possible.
What is mutually exclusive?
Events A, B in a given experiment are mutually
exclusive if there exists no intersection
between them. In other words, if event A is
true then event B is definitely not true.
What is independent?
Events A, B in a given experiment are
independent if knowing that event A has
occurred does not change the probability
that event B occurs.
ii) (NI,M)
This combination is POSSIBLE because, as part (i) shows, if two events are mutually exclusive, then
they have to be dependent.
Ex: Throwing a die once. Let A={1,2}. Let B={3,4}
iii) (I,NM)
This combination is POSSIBLE. To construct an example, we need to define two events A, B such that:
P[B|A] = P[B] and P[A|B] = P[A], P[A,B] ≠ 0
Ex: Throwing a die once. Let A be the event of the outcome being smaller than or equal to 2. Let B be the
event of the outcome being even.
⇒ A={1,2}, B={2,4,6}
P[A] = 2/6 = 1/3
P[B] = 3/6 = 1/2
P[A,B] = 1/6
To find P[B|A], note that given the occurrence of event A, the sample space has now become {1, 2}.
From this set, one of the two elements would correspond to the occurrence of B.
P[B|A] = 1/2 = P[B]
To find P[A|B], note that given the occurrence of event B, the sample space has now become {2, 4, 6}.
From this set, one of the three elements would correspond to the occurrence of A.
P[A|B] = 1/3 = P[A]
Note also that P[A,B] = 1/6 = P[A]P[B] = (1/3)(1/2), which is also a condition for independence.
iv) (NI,NM)
This combination is POSSIBLE.
Ex: Throwing a die once. Let A={1,2}. Let B={2,3}
P[A] = 2/6 = 1/3
P[B] = 2/6 = 1/3
P[A,B] = 1/6 ≠ 0 ⇒ Not mutually exclusive
P[A|B]= 1/2 ≠ P[A]
P[B|A]= 1/2 ≠ P[B]
⇒ Not independent
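The three permissible combinations can be verified mechanically. The sketch below, with helper names of my own choosing, checks mutual exclusivity and independence for each of the die examples above:

```python
from fractions import Fraction

SPACE = frozenset(range(1, 7))  # one throw of a fair die

def prob(event):
    """Probability of an event under the uniform die model."""
    return Fraction(len(frozenset(event) & SPACE), len(SPACE))

def check(A, B):
    """Classify the pair: mutually exclusive iff P[A,B]=0;
    independent iff P[A,B] = P[A]P[B]."""
    A, B = frozenset(A), frozenset(B)
    pAB = prob(A & B)
    return {"mutually_exclusive": pAB == 0,
            "independent": pAB == prob(A) * prob(B)}

print(check({1, 2}, {3, 4}))     # (NI, M): disjoint, hence dependent
print(check({1, 2}, {2, 4, 6}))  # (I, NM): overlapping yet independent
print(check({1, 2}, {2, 3}))     # (NI, NM)
```

Note that no pair comes out both mutually exclusive and independent, matching part (i).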
Example 3:
8 bowling balls are divided into two sets. Five of them are in box 1, and three are in box 2. A player
picks a ball from one of these boxes but cannot see its color because the room is dark. Assume that
box 1 is closer to the door than box 2, so that a player is more likely to pick a ball from this box. Let the
random variable B denote the number of the chosen box, and the random variable C denote the color
of the chosen ball. Assuming the probability P[B=1] of choosing box 1 is given, find:
i) P[B=2]
Since only two boxes exist,
P[B=2] = 1 - P[B=1]
ii) P[C=red | B=1] and P[C=red | B=2]
If the player chooses the first box, he has 5 balls to choose from. Because the room is dark, all balls are equally likely to be
chosen. Thus,
P[C=red | B=1] = (number of red balls in box 1) / 5
Similarly,
P[C=red | B=2] = (number of red balls in box 2) / 3
iii) P[B=1, C=red] and P[B=2, C=red]
The joint probability P[B=1, C=red] is the probability of choosing the first box AND then, a red ball from this box.
P[B=1, C=red] = P[B=1] P[C=red | B=1]
Similarly,
P[B=2, C=red] = P[B=2] P[C=red | B=2]
iv) P[C=red]
A player can pick a red ball from the first box OR from the second box.
P[C=red] = P[B=1, C=red] + P[B=2, C=red]
Similarly, the probability of any other color follows from the same total-probability decomposition.
v) If the chosen ball is red, what is the probability that this ball was from box 1, i.e., find P[B=1 | C=red]
According to Bayes' rule, the joint probability of multiple events doesn't depend on the order of occurrence of these events.
Thus, the probability of choosing box 1 then a red ball is equivalent in a mathematical sense to the probability of choosing a red
ball then box 1. This equivalence is maintained even if the order of events cannot be reversed in reality, i.e., you cannot choose
the ball unless you choose the box first.
P[B=1, C=red] = P[C=red, B=1]
which can be expressed in terms of the conditional and marginal probabilities as
P[B=1] P[C=red | B=1] = P[C=red] P[B=1 | C=red]
Thus,
P[B=1 | C=red] = P[B=1] P[C=red | B=1] / P[C=red]
Similarly,
P[B=2 | C=red] = P[B=2] P[C=red | B=2] / P[C=red]
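The handout's specific numbers (the red-ball counts per box and the value of P[B=1]) did not survive extraction, so the sketch below uses illustrative values, clearly marked as assumptions, to walk through parts (i)-(v):

```python
from fractions import Fraction

# Illustrative numbers only -- the handout's actual red-ball counts and
# P[B=1] are not recoverable. Assumed here: box 1 holds 5 balls (3 red),
# box 2 holds 3 balls (1 red), and P[B=1] = 2/3 (box 1 is more likely).
p_box1 = Fraction(2, 3)
p_red_given_box1 = Fraction(3, 5)
p_red_given_box2 = Fraction(1, 3)

p_box2 = 1 - p_box1                       # part (i)
p_joint1 = p_box1 * p_red_given_box1      # part (iii): P[B=1, C=red]
p_joint2 = p_box2 * p_red_given_box2      #            P[B=2, C=red]
p_red = p_joint1 + p_joint2               # part (iv): total probability
p_box1_given_red = p_joint1 / p_red       # part (v):  Bayes' rule
print(p_box1_given_red)                   # 18/23 under these assumptions
```

Exact rational arithmetic via `Fraction` keeps every step inspectable; swapping in the handout's true numbers changes only the three assumed values at the top.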
Example 4:
A digital communication system sends two messages M=0 or M=1, with equal probabilities. A receiver
observes a voltage which can be modeled as a Gaussian random variable, X, whose PDFs conditioned on
the transmitted message are given by
f_{X|M=0}(x) = (1/sqrt(2πσ²)) e^(-x²/(2σ²)),   f_{X|M=1}(x) = (1/sqrt(2πσ²)) e^(-(x-1)²/(2σ²))

i) Find P[M=0 | X=x] for σ² = 1
Solution:
P M  0 X  x  
P M  0 
fX M 0  x  
fX M 0  x  P M  0
fX  x 
1
2
1
2πσ2
e

x2
2 σ2
fX  x   PM  0 fX M0  x   PM  1 fX M 1  x 
1
 P M  0 X  x  
2πσ
x2
e
x2
2 σ2
1
 
2
 2

1
1
1
1
e 2σ    
e
 
 2  2πσ 2
 2  2πσ 2
e
 P M  0 X  x  
e


x2
2σ2
x2
2σ
e
2

 x 12
2 σ2
1
 P M  0 X  x  
1e
 P M  0 X  x  
2


 x 12
2 σ2
x2
2
2
e σ
1
12
2
1  e 2σ
2 x 1
 x 12
2 σ2
1
For σ2  1  P M  0 X  x  
 1
 x 
2
1  e
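The closed form for part (i) can be cross-checked against a direct evaluation of Bayes' rule with the two Gaussian likelihoods. The sketch below, with a function name of my own choosing, compares both:

```python
import math

def posterior_m0(x, sigma2=1.0, p0=0.5):
    """P[M=0 | X=x] by direct Bayes' rule: Gaussian likelihoods with
    variance sigma2, centered at 0 (for M=0) and 1 (for M=1).
    The common 1/sqrt(2*pi*sigma2) factor cancels and is omitted."""
    f0 = math.exp(-x**2 / (2 * sigma2))
    f1 = math.exp(-(x - 1)**2 / (2 * sigma2))
    return p0 * f0 / (p0 * f0 + (1 - p0) * f1)

# Closed form derived above: 1 / (1 + e^{(2x-1)/(2 sigma^2)})
for x in (-1.0, 0.5, 2.0):
    closed = 1 / (1 + math.exp((2 * x - 1) / 2))
    print(f"x={x}: bayes={posterior_m0(x):.6f}  closed={closed:.6f}")
```

At x = 1/2 the point is equidistant from both means, so with equal priors the posterior is exactly 1/2, a useful sanity check.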
ii) Repeat part (i) assuming P[M=0] = 1/4, P[M=1] = 3/4
Solution:
P M  0 X  x  
fX M 0  x  P M  0
fX  x 
1
 P M  0 X  x  
2πσ
x2
e
x2
2σ2
1
 
4
 2

1
1
1
3
e 2σ    
e
 
2
2
 4  2πσ
 4  2πσ
e
 P M  0 X  x  
e
 P M  0 X  x  
2



2σ
2σ2
x2
2σ2
x2
2
 x 12
 3e

 x 12
2σ2
1
 1
 x 
 2
1  3e
Note: In general, for P[M=0] = p, P[M=1] = 1 - p, and σ² = 1:

P[M=0 | X=x] = p / (p + (1 - p) e^(x - 1/2))
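The general formula in the note can be evaluated directly and should recover both earlier special cases. A minimal sketch, with a function name of my own choosing:

```python
import math

def posterior_m0_general(x, p):
    """P[M=0 | X=x] for priors P[M=0]=p, P[M=1]=1-p, with sigma^2 = 1:
    p / (p + (1-p) * e^(x - 1/2))."""
    return p / (p + (1 - p) * math.exp(x - 0.5))

# At x = 1/2 the exponential equals 1, so the posterior equals the prior:
print(posterior_m0_general(0.5, 0.5))   # part (i) priors  -> 0.5
print(posterior_m0_general(0.5, 0.25))  # part (ii) priors -> 0.25
```

This also makes the role of the prior visible: as p → 1 the posterior tends to 1 for every x, and the decision threshold shifts accordingly.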