Math 425
Introduction to Probability
Lecture 10

Kenneth Harris
[email protected]
Department of Mathematics
University of Michigan
February 9, 2009


Multiplication Rule
Multiplication Rule for 2 events

Lemma (Multiplication Rule)
For any events E and F (where P(E) > 0),

    P(E ∩ F) = P(E) · P(F | E)
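As a sanity check on the rule, here is a short Python sketch that brute-force enumerates ordered draws from a small urn (the 3-red, 2-blue urn is an illustrative choice, not an example from the lecture):

```python
from itertools import permutations
from fractions import Fraction

# Illustrative urn: 3 red, 2 blue; draw two balls without replacement.
balls = ["R", "R", "R", "B", "B"]
pairs = list(permutations(range(len(balls)), 2))   # ordered draws of two distinct balls

E  = [(i, j) for (i, j) in pairs if balls[i] == "R"]                      # first ball red
EF = [(i, j) for (i, j) in pairs if balls[i] == "R" and balls[j] == "R"]  # both balls red

p_E         = Fraction(len(E), len(pairs))    # P(E)     = 3/5
p_F_given_E = Fraction(len(EF), len(E))       # P(F | E) = 1/2
p_EF        = Fraction(len(EF), len(pairs))   # P(E ∩ F) = 3/10

assert p_EF == p_E * p_F_given_E              # the Multiplication Rule
```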
Multiplication Rule
Multiplication Rule for 3 events

We can extend the Multiplication Rule to three events.

Lemma
For any events E, F, G (provided P(E ∩ F ∩ G) > 0),

    P(E ∩ F ∩ G) = P(E) · P(F | E) · P(G | E ∩ F)

Proof. Use the Multiplication Rule twice:

    P(E ∩ F ∩ G) = P(E ∩ F) · P(G | E ∩ F)
                 = P(E) · P(F | E) · P(G | E ∩ F)

We need P(E ∩ F ∩ G) ≠ 0 to ensure the conditional probabilities exist.


Multiplication Rule
Example: 3 events

Example
An urn is filled with 6 red balls, 5 blue balls, and 4 green balls. Three balls chosen at random are removed from the urn.
What is the probability that the balls are all of the same color?

We are interested in the events (where i = 1, 2, 3)

    Ri: ith ball drawn is red,
    Bi: ith ball drawn is blue,
    Gi: ith ball drawn is green,
    C: three balls are the same color.
Multiplication Rule
Example – continued

Urn: 6 red, 5 blue, and 4 green. Use the Multiplication Rule:

    P(R1 ∩ R2 ∩ R3) = P(R1) · P(R2 | R1) · P(R3 | R1 ∩ R2)
                    = 6/15 · 5/14 · 4/13 = 4/91

    P(B1 ∩ B2 ∩ B3) = P(B1) · P(B2 | B1) · P(B3 | B1 ∩ B2)
                    = 5/15 · 4/14 · 3/13 = 2/91

    P(G1 ∩ G2 ∩ G3) = P(G1) · P(G2 | G1) · P(G3 | G1 ∩ G2)
                    = 4/15 · 3/14 · 2/13 = 4/455

Since these events are mutually exclusive,

    P(C) = 4/91 + 2/91 + 4/455 = 34/455 ≈ 0.0747.


Multiplication Rule
General Multiplication Rule

The Multiplication Rule is the probabilistic version of the product rule for counting.

Theorem (Generalized Multiplication Rule)
Let E1, E2, ..., En be any events such that

    P(E1 ∩ E2 ∩ ... ∩ En) > 0.

Then

    P(E1 ∩ E2 ∩ ... ∩ En) = P(E1) · P(E2 | E1) · P(E3 | E1 ∩ E2) ··· P(En | E1 ∩ E2 ∩ ... ∩ En−1).

(See Ross p. 71.)
Multiplication Rule
Example: 6 events

Example
In the Pick-Six Lottery, a person purchases a ticket and chooses 6 distinct numbers in the set {1, 2, 3, ..., 49}.
Later a Lottery Machine picks 6 distinct numbers at random in the set {1, 2, 3, ..., 49}.
A winning ticket is one which matches the six numbers chosen by the Machine (in any order of selection).

+ What are the odds of winning with one ticket?


Multiplication Rule
Example – solution

Solution. We solved this before (Lecture 5) by counting, using the Product Rule for counting. We conditionalize here.

Let Ei be the event that the ith number picked by the Machine matches one of the six numbers on the ticket. We want to compute

    P(E1 ∩ E2 ∩ ... ∩ E6) = P(E1) · P(E2 | E1) ··· P(E6 | E1 ∩ ... ∩ E5)
                          = 6/49 · 5/48 · 4/47 · 3/46 · 2/45 · 1/44
                          = 1/13,983,816
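A short Python sketch confirms that the chain of conditionals collapses to 1/C(49, 6), matching the counting solution from Lecture 5:

```python
from fractions import Fraction
from math import comb

# kth Machine pick must match one of the 6 - k still-unmatched ticket numbers,
# out of the 49 - k numbers the Machine has not yet drawn.
p_win = Fraction(1)
for k in range(6):
    p_win *= Fraction(6 - k, 49 - k)

assert p_win == Fraction(1, comb(49, 6))
print(p_win)   # 1/13983816
```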
Independence and the Product Rule
Dependence

Sometimes, changes in the conditions of an experiment change the probability of some outcomes.

Example. An urn has 7 red balls and 5 blue balls. The balls are well mixed. A ball is drawn, its color is noted, and it is put aside.
Compare the probability that the second ball is red (R2) given that the first drawn ball is red (R1) versus that it is blue (R1^c):

    P(R2 | R1) = 6/11        P(R2 | R1^c) = 7/11

The probability that the second ball is red is

    P(R2) = P(R2 | R1) · P(R1) + P(R2 | R1^c) · P(R1^c)
          = 6/11 · 7/12 + 7/11 · 5/12 = 7/12


Independence and the Product Rule
Independence

Sometimes, changes in the conditions of an experiment have no effect on the probability of some outcomes.

Example. An urn has 7 red balls and 5 blue balls. The balls are well mixed. A ball is drawn, its color is noted and returned to the urn, which is again well mixed.
Compare the probability that the second ball is red (R2) given that the first drawn ball is red (R1) versus that it is blue (R1^c):

    P(R2 | R1) = 7/12        P(R2 | R1^c) = 7/12

The probability that the second ball is red is

    P(R2) = P(R2 | R1) · P(R1) + P(R2 | R1^c) · P(R1^c)
          = 7/12 · 7/12 + 7/12 · 5/12 = 7/12
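The Partition Rule arithmetic on both slides can be replayed with exact fractions; a short Python sketch:

```python
from fractions import Fraction

p_R1  = Fraction(7, 12)   # first ball red
p_R1c = Fraction(5, 12)   # first ball blue

# Without replacement: the conditionals differ (dependence).
p_R2_set_aside = Fraction(6, 11) * p_R1 + Fraction(7, 11) * p_R1c
# With replacement: the conditionals agree (independence).
p_R2_replaced  = Fraction(7, 12) * p_R1 + Fraction(7, 12) * p_R1c

# Either way the unconditional probability of a red second ball is 7/12.
assert p_R2_set_aside == Fraction(7, 12)
assert p_R2_replaced  == Fraction(7, 12)
```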
Independence and the Product Rule
Independence

Suppose E and F are events with P(E | F) = P(E | F^c). By the Partition Rule,

    P(E) = P(E | F) · P(F) + P(E | F^c) · P(F^c)
         = x · P(F) + x · P(F^c) = x

where x is the common value P(E | F) = P(E | F^c).

So, P(E) = P(E | F) and P(E) = P(E | F^c). By the Multiplication Rule,

    P(E ∩ F) = P(E | F) · P(F) = P(E) · P(F).


Independence and the Product Rule
Independence and the Product Rule

Definition (Product Rule)
Events E and F are independent if and only if

    P(E ∩ F) = P(E) · P(F).

Equivalently, E and F are independent if and only if

    P(E | F) = P(E) = P(E | F^c)

Events which are not independent are said to be dependent.
Independence and the Product Rule
Proof of Equivalence

We have already shown that

    P(E | F) = P(E | F^c)  ⇒  P(E | F) = P(E) and P(E ∩ F) = P(E) · P(F).

Conversely, suppose P(E ∩ F) = P(E) · P(F). By the Conditioning Rule,

    P(E | F) = P(E ∩ F)/P(F) = P(E) · P(F)/P(F) = P(E).

By the Conditioning and Partition Rules,

    P(E | F^c) = P(E ∩ F^c)/P(F^c)
               = (P(E) − P(E ∩ F))/P(F^c)      since P(E) = P(E ∩ F) + P(E ∩ F^c)
               = P(E) · (1 − P(F))/P(F^c) = P(E)


Independence and the Product Rule
Example

Example
A standard 52 card deck is well shuffled. Are the following events independent:

    E: Draw a ♠,
    F: Draw an ace?

Solution. E and F are independent:

    P(E | F) = 1/4        P(E | F^c) = 12/48 = 1/4
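The solution can be confirmed by enumerating the deck; a short Python sketch (the card encoding is an illustrative choice):

```python
from fractions import Fraction

# 13 ranks x 4 suits; "S" is spades, "A" is ace.
deck = [(rank, suit) for rank in "A23456789TJQK" for suit in "SHDC"]

spades = [c for c in deck if c[1] == "S"]                    # event E
aces   = [c for c in deck if c[0] == "A"]                    # event F
both   = [c for c in deck if c[0] == "A" and c[1] == "S"]    # E ∩ F: the ace of spades

P = lambda event: Fraction(len(event), len(deck))
assert P(both) == P(spades) * P(aces)    # 1/52 = 1/4 · 1/13, so E and F are independent
```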
Extended Product Rule
Extended Product Rule

Definition (Extended Product Rule)
The events E1, E2, ... (possibly infinitely many events) are independent if and only if

    P(Ei1 ∩ Ei2 ∩ ... ∩ Ein) = P(Ei1) · P(Ei2) ··· P(Ein)

for any finite subset of indices i1, i2, ..., in.

Equivalently, the events E1, E2, ... are independent if and only if

    P(Ein | Ei1 ∩ Ei2 ∩ ... ∩ Ein−1) = P(Ein)

for any finite (n ≥ 2) subset of indices i1, i2, ..., in.


Extended Product Rule
Example

Example
Three dice are thrown. Are the following events independent:

    E6: Throw a six on at least one die,
    S16: The sum of the dice is 16?

Solution. E6 and S16 are dependent:

    P(S16 | E6) > 0        P(S16 | E6^c) = 0

Challenge: verify P(S16 | E6) = 6/91.
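Both claims, and the challenge value, can be verified by enumerating all 216 outcomes; a short Python sketch:

```python
from fractions import Fraction
from itertools import product

rolls = list(product(range(1, 7), repeat=3))   # all 6^3 = 216 equally likely outcomes

E6  = [r for r in rolls if 6 in r]             # at least one six
S16 = [r for r in rolls if sum(r) == 16]       # sum is 16

# Every sum-16 roll contains a six, so P(S16 | E6^c) = 0.
assert all(6 in r for r in S16)

# Since S16 is a subset of E6, P(S16 | E6) = |S16| / |E6|.
p_S16_given_E6 = Fraction(len(S16), len(E6))
assert p_S16_given_E6 == Fraction(6, 91)       # the challenge value
```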
Extended Product Rule
Example

Example
A fair coin is flipped n times, and each outcome is equiprobable. Let Ei be the event that the ith flip is heads.
Are the events E1, E2, ..., En independent?

Solution. They are independent: Fix any k + 1 events. Then

    P(Eik+1 | Ei1 ∩ ... ∩ Eik) = 2^(n−k−1)/2^(n−k) = 1/2

    P(Eik+1) = 2^(n−1)/2^n = 1/2.


Conditional Independence
Conditional Independence

It is possible that two events E and F are not independent, but they become so on the assumption that a third event G occurs.

Definition
Events E and F are conditionally independent given G if

    P(E ∩ F | G) = P(E | G) · P(F | G).

Equivalently,

    P(E | F ∩ G) = P(E | G).
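For a small fixed n the factorization can be checked by brute force; a short Python sketch with n = 4 (an illustrative choice):

```python
from fractions import Fraction
from itertools import product

n = 4
flips = list(product("HT", repeat=n))   # 2^n equally likely sequences

def P(pred):
    return Fraction(sum(1 for f in flips if pred(f)), len(flips))

# Every finite intersection of the Ei factors as (1/2)^k,
# exactly as the Extended Product Rule requires.
for indices in [(0,), (0, 1), (1, 3), (0, 1, 2, 3)]:
    p = P(lambda f, s=indices: all(f[i] == "H" for i in s))
    assert p == Fraction(1, 2 ** len(indices))
```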
Conditional Independence
Proof of Equivalence

Suppose P(E | F ∩ G) = P(E | G). Then

    P(E ∩ F ∩ G)/P(F ∩ G) = P(E | G)      Conditioning Rule for P(E | F ∩ G)

so

    P(E ∩ F ∩ G) = P(E | G) · P(F ∩ G)

Divide both sides by P(G):

    P(E ∩ F | G) = P(E | G) · P(F | G)

Conversely, suppose P(E ∩ F | G) = P(E | G) · P(F | G). Then

    P(E ∩ F ∩ G)/P(G) = P(E | G) · P(F ∩ G)/P(G)

so

    P(E ∩ F ∩ G)/P(F ∩ G) = P(E | G)

and hence, by the Conditioning Rule,

    P(E | F ∩ G) = P(E | G).


Conditional Independence
Example

Example
Suppose you roll a red and a blue die. Consider the events

    L2: lower score is 2,
    H5: higher score is 5,
    D: one die is greater than 3 and one die is less than 3.

Then,
(a) L2 and H5 are not independent,
(b) L2 and H5 are conditionally independent given D.
Conditional Independence
Example

(a). L2 and H5 are not independent:

    P(L2 ∩ H5) = 2/36 = 1/18        P(L2) · P(H5) = 9/36 · 9/36 = 1/16

(b). L2 and H5 are conditionally independent given D:

    P(L2 | D) = 1/2        P(L2 | H5 ∩ D) = 1/2


Conditional Independence
Example

Example
Suppose you roll a red and a blue die. Consider the events

    R2: a 2 on the red die,
    B2: a 2 on the blue die,
    D: one die is greater than 3 and one die is less than 3.

Then,
(a) R2 and B2 are independent,
(b) R2 and B2 are not conditionally independent given D.
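Both claims about L2 and H5 can be verified by enumerating the 36 outcomes; a short Python sketch:

```python
from fractions import Fraction
from itertools import product

rolls = set(product(range(1, 7), repeat=2))   # (red, blue), 36 outcomes

L2 = {r for r in rolls if min(r) == 2}                 # lower score is 2
H5 = {r for r in rolls if max(r) == 5}                 # higher score is 5
D  = {r for r in rolls if max(r) > 3 and min(r) < 3}   # one die > 3, one die < 3

def P(A, given=None):
    sample = given if given is not None else rolls
    return Fraction(len(A & sample), len(sample))

assert P(L2 & H5) != P(L2) * P(H5)            # (a): 1/18 vs 1/16, not independent
assert P(L2 & H5, D) == P(L2, D) * P(H5, D)   # (b): conditionally independent given D
```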
Conditional Independence
Example

(a). R2 and B2 are independent:

    P(R2 ∩ B2) = 1/36 = P(R2) · P(B2).

(b). R2 and B2 are not conditionally independent given D:

    P(R2 | D) = P(R2 ∩ D)/P(D) = (3/36)/(12/36) = 1/4

    P(R2 | B2 ∩ D) = 0


Example: Baseball
Example: Baseball

Compare to the Problem of Points, Example 3.4j of Ross, p. 95.

Example
The Cubs (!!) and White Sox are playing in the World Series. The Cubs win each game with probability 0.6 (independently of the games played). What is the probability that the Cubs win the Series? (The first team to win four games wins the Series.)
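Before either exact method, the answer can be estimated by simulation; a short Monte Carlo sketch (the seed and trial count are arbitrary choices):

```python
import random

def cubs_win_series(p=0.6, rng=random):
    """Play best-of-seven: the first team to four wins takes the Series."""
    cubs = sox = 0
    while cubs < 4 and sox < 4:
        if rng.random() < p:   # each game is an independent 0.6 Bernoulli trial
            cubs += 1
        else:
            sox += 1
    return cubs == 4

random.seed(0)                 # reproducible run
trials = 200_000
estimate = sum(cubs_win_series() for _ in range(trials)) / trials
# estimate should land near 0.71
```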
Example: Baseball
Example: Baseball

Method 1. (Due to Fermat)
Since the first to four wins takes the Series, there are at most 7 games (4 Cubs wins to 3 Sox wins). Fermat takes the sample space to be sequences of length 7:

    (g1, g2, g3, g4, g5, g6, g7)        where gi = W, L

Any sequence with 4 Ws is a Cubs win; otherwise it is a Sox win (there are at least 4 Ls).

Not all sequences represent actual outcomes, nor are all sequences equally likely.
Why does it not matter to extend a real Series play with phantom games?


Example: Baseball
Example: Baseball

If X is an outcome in the sample space and has k Ws (so, 7 − k Ls), then

    P(X) = (0.6)^k (0.4)^(7−k)

In this sample space, any sequence with at least 4 Ws is a Cubs Series win:

    P(Cubs win) = Σ (n=4 to 7) C(7, n) · (0.6)^n (0.4)^(7−n) ≈ 0.71.

The phantom games do not change the probability: if the Cubs win the Series in four games, they win every extension with phantom games as well.
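Fermat's sum is a one-liner to evaluate; a short Python sketch:

```python
from math import comb

# Sum over the number n of Cubs wins among the 7 (possibly phantom) games.
p_cubs = sum(comb(7, n) * 0.6**n * 0.4**(7 - n) for n in range(4, 8))

assert abs(p_cubs - 0.710208) < 1e-12   # the exact value is 55485/78125 = 0.710208
```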
Example: Baseball
Example: Baseball

Method 2. (Due to Pascal)
This method allows us to treat the sample space as sequences whose length is at most 7, ending with 4 Ws or 4 Ls. Let

    W(n,m) be the event that the Cubs win the Series when they have n wins and the Sox have m wins (where n, m ≤ 4).

At the start of the Series, we want to compute P(W(0,0)), which can be determined from the following 3 conditions:

(a) P(W(4,m)) = 1, where m ≤ 3,
(b) P(W(n,4)) = 0, where n ≤ 3,
(c) When the teams have played n + m games, and both n, m < 4, then

    P(W(n,m)) = 0.6 · P(W(n+1,m)) + 0.4 · P(W(n,m+1)).

The next game is either a Cubs win or a Sox win.

For example,

    P(W(3,3)) = 0.6
    P(W(3,2)) = 0.6 + 0.4 · 0.6 = 0.84
    P(W(2,3)) = 0.6² = 0.36

The method gives the solution P(W(0,0)) ≈ 0.71.
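Pascal's three conditions translate directly into a memoized recursion; a short Python sketch:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def p_series_win(n, m, p=0.6):
    """Probability the Cubs take the Series from a score of n Cubs wins, m Sox wins."""
    if n == 4:
        return 1.0     # condition (a): Cubs already won
    if m == 4:
        return 0.0     # condition (b): Sox already won
    # condition (c): condition on the outcome of the next game
    return p * p_series_win(n + 1, m, p) + (1 - p) * p_series_win(n, m + 1, p)

assert p_series_win(3, 3) == 0.6
assert abs(p_series_win(3, 2) - 0.84) < 1e-12
assert abs(p_series_win(0, 0) - 0.710208) < 1e-9   # agrees with Fermat's method
```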