Math 425 Introduction to Probability
Lecture 12
Kenneth Harris ([email protected])
Department of Mathematics, University of Michigan
February 4, 2009

Infinite Discrete Sample Spaces

Examples of infinite sample spaces

We have been considering finite sample spaces, but some experiments are best modeled by infinite sample spaces.

Experiment A: Toss a fair coin until the first head appears, then stop.
Experiment B: Throw a pair of dice until either a 5 or 7 appears, then stop.
Experiment C: Toss a fair coin until a run of ten heads in a row appears, then stop.

No finite sample space is adequate in these cases, since each experiment has (potentially) infinitely many outcomes.
Axiom 3 – Strong form

We recall the strong form of the Addition Rule (Axiom 3):

Axiom (3 – Addition rule (strong form)). For any sequence of mutually exclusive events E_1, E_2, ... (so, E_i ∩ E_j = ∅ whenever i ≠ j),

    P(∪_{k=1}^∞ E_k) = Σ_{k=1}^∞ P(E_k).

Example of coin tosses

Example A. Toss a fair coin until the first head appears, then stop. The sample space for this experiment is

    S = {H, TH, TTH, TTTH, ..., T^∞ = TTT···}

For each k, a sequence of k tails followed by a single head: T^k H.
An infinite sequence consisting only of tails: T^∞.

We can compute the probability of "most" of these outcomes:

    P(T^k H) = 2^{-k-1}
    P(T^∞) = ???
Example of Coin Tosses

    S = {H, TH, TTH, TTTH, ..., T^∞ = TTT···}

Let H be the event that a head is tossed. Then H^c = {T^∞}.
Let H_k (k ≥ 0) be the event that a head follows k tails, so H_k = {T^k H} and

    H = ∪_{k=0}^∞ H_k = ∪_{k=0}^∞ {T^k H}.

The events H_0, H_1, H_2, ... are mutually exclusive and

    P(H_k) = P(T^k H) = 2^{-k-1}.

Example – continued

Use the strong form of the Addition Rule:

    P(H) = P(∪_{k=0}^∞ H_k) = Σ_{k=0}^∞ P(H_k) = Σ_{k=0}^∞ 2^{-k-1} = 1.

(We compute this infinite series shortly.)

So P(H) = 1, and thus

    P(H^c) = P(T^∞) = 0.
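These outcome probabilities are easy to check empirically. A minimal sketch (the helper name, seed, and trial count are my own choices, not from the notes):

```python
import random

def tails_before_first_head(rng):
    """Count tails before the first head of a fair coin."""
    k = 0
    while rng.random() >= 0.5:  # tail with probability 1/2
        k += 1
    return k  # the outcome T^k H

rng = random.Random(0)
trials = 100_000
counts = {}
for _ in range(trials):
    k = tails_before_first_head(rng)
    counts[k] = counts.get(k, 0) + 1

# Empirical frequencies should approximate P(T^k H) = 2^(-k-1);
# the outcome T^infinity is never observed, matching P(T^inf) = 0.
for k in range(4):
    print(k, counts.get(k, 0) / trials, 2 ** (-k - 1))
```

Every simulated run terminates, which is the simulation-level face of P(H) = 1.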
The Geometric Series
Infinite series

Definition. Let a_1, a_2, ... be an infinite sequence of real numbers. If the partial sums

    s_n = Σ_{k=1}^n a_k

have a finite limit

    lim_{n→∞} s_n = s,

then we say the infinite series Σ_{k=1}^∞ a_k converges with sum s. Otherwise, it diverges.
If the series Σ_{k=1}^∞ |a_k| converges as well, then Σ_{k=1}^∞ a_k converges absolutely.

Geometric series

Theorem. The geometric series converges absolutely:

    Σ_{k=0}^∞ a x^k = a / (1 − x)

for any real number a and any x with |x| < 1.

Example. From the previous section, with a = 1/2 and x = 1/2,

    Σ_{k=0}^∞ 2^{-k-1} = Σ_{k=0}^∞ (1/2)(1/2)^k = (1/2) · 1/(1 − 1/2) = 1

from the Theorem.
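The convergence of the partial sums to a/(1 − x) can be seen numerically; a quick sketch using the coin-example values a = 1/2, x = 1/2:

```python
# Partial sums s_n = sum_{k=0}^{n} a x^k approach a / (1 - x).
a, x = 0.5, 0.5  # the series from the coin-toss example

def partial_sum(a, x, n):
    """s_n for the geometric series, summed directly."""
    return sum(a * x**k for k in range(n + 1))

limit = a / (1 - x)  # = 1 here
for n in (5, 10, 20):
    print(n, partial_sum(a, x, n), limit)
```

Each s_n falls short of the limit by exactly a·x^{n+1}/(1 − x), which shrinks geometrically, matching the closed form derived in the proof below.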
Proof

Proof. The nth partial sum is

    s_n = Σ_{k=0}^n a x^k.

Multiply by x (shifting powers by 1):

    x · s_n = Σ_{k=1}^{n+1} a x^k.

Subtract the previous two results (most terms cancel):

    s_n − x · s_n = a − a x^{n+1}.

Solve for s_n:

    s_n = a(1 − x^{n+1}) / (1 − x).

By definition,

    Σ_{k=0}^∞ a x^k = lim_{n→∞} s_n = lim_{n→∞} a(1 − x^{n+1}) / (1 − x).

Since |x| < 1, lim_{k→∞} x^k = 0, so

    Σ_{k=0}^∞ a x^k = lim_{n→∞} a(1 − x^{n+1}) / (1 − x) = a / (1 − x).
Monkey at the Typewriter: Bernoulli Trials
The monkey and the bard

Example. A monkey is sitting at a typewriter and randomly hits keys in a never-ending sequence. What is the probability the monkey eventually types the complete works of Shakespeare?

A real monkey will eventually tire of pecking at a typewriter and go searching for a banana. However, this problem is really about a probability model, not a real monkey.

Bernoulli Trials

Definition. By a sequence of Bernoulli trials, we mean a sequence of trials (repetitions of an experiment) satisfying the following:
1. There are only two possible mutually exclusive outcomes on each trial, one arbitrarily called success and the other failure.
2. The probability of success is the same for each trial.
3. The trials are independent.
Examples of Bernoulli Trials

Examples. The following are examples of Bernoulli trials:
- Flip a coin (heads, tails).
- Each computer chip in a production line is tested (chip passes test, chip fails test).
- Rolling a pair of dice for "snake-eyes" (double ones, any other value).
- A patient is prescribed a drug treatment (cured, not cured).
- A monkey types the complete works of Shakespeare (success, failure).

Sample space

A typical experiment involving a sequence of Bernoulli trials is as follows.

Example. Suppose you are willing to wait indefinitely for the first success in a sequence of Bernoulli trials. What is the sample space of such an experiment?

Let s be success and f be failure. Then the sample space S consists of

    (s), ..., (f, f, ..., f, s)    (the only success is the last trial),
    (f, f, f, ...)                 (the infinite sequence of failures).
Waiting for success

Problem. Suppose the probability of success is p and of failure is 1 − p for a sequence of Bernoulli trials. What is the probability of each outcome?

Solution. Let E_k be the event that the first success is on the (k+1)st trial. The finite outcomes, success on the (k+1)st trial, are

    E_k = {(f, f, ..., f, s)}    (k failures f, then the first success s).

Since the trials are independent,

    P(E_k) = p(1 − p)^k.

Let E be the event that there is eventually some success, so

    E = ∪_{k=0}^∞ E_k.

The event of no success is E^c = {(f, f, f, ...)}.
Since the events E_0, E_1, ... are mutually exclusive,

    P(E) = P(∪_{k=0}^∞ E_k) = Σ_{k=0}^∞ P(E_k) = Σ_{k=0}^∞ p(1 − p)^k = p / (1 − (1 − p)) = 1.

So,

    P(E^c) = 1 − P(E) = 0.
Back to the Monkey

The complete works of Shakespeare consist of L symbols (spaces, letters, punctuation, etc.). A monkey beating on the keys at random has an exceedingly small, but nonzero, probability p of typing a sequence of L symbols which is exactly the complete works of Shakespeare.

Let each trial consist of L symbols banged out by the monkey. Success in a trial occurs when the monkey has typed the complete works of Shakespeare in that trial. The probability of eventually succeeding is one, since the only other outcome, the infinite sequence of failures (f, f, f, ...), has probability zero.

Craps

Example. Craps is played with a pair of dice. The player (or "shooter") rolls once:
- If 7 or 11 shows, she wins;
- if 2, 3 or 12 shows, she loses;
- if any other number shows (the "gambler's point"), she must keep rolling the dice until either a 7 appears before her point (she loses), or her point appears before the 7 (she wins).

What is the probability of winning at craps?
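Before computing the answer exactly, the rules above can be simulated directly; a sketch (function names, seed, and game count are my own):

```python
import random

def roll(rng):
    """Throw a pair of dice and return the total."""
    return rng.randint(1, 6) + rng.randint(1, 6)

def play_craps(rng):
    """Play one game of craps; return True if the shooter wins."""
    first = roll(rng)
    if first in (7, 11):
        return True
    if first in (2, 3, 12):
        return False
    point = first
    while True:  # keep rolling until the point or a 7 appears
        r = roll(rng)
        if r == point:
            return True
        if r == 7:
            return False

rng = random.Random(42)
games = 200_000
wins = sum(play_craps(rng) for _ in range(games))
print(wins / games)  # should be close to the exact answer derived below
```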
Sample space

The sample space S consists of the following sequences:

    (7), (11), (2), (3), (12)
    (d, a_2, a_3, ..., a_n, d)    where d ≠ 7 and each a_i ≠ d, 7
    (d, a_2, a_3, ..., a_n, 7)    where d ≠ 7 and each a_i ≠ d, 7
    (d, a_2, a_3, ...)            where d ≠ 7 and each a_i ≠ d, 7

Let E_d be the event that the first throw is d and the gambler eventually wins. Let E_{d,n} be the event that the first roll is d and the gambler wins on the nth throw. (So, d = 4, 5, 6, 8, 9, 10.)

Computing the odds

Let d = 4. The probability that the gambler rolls 4 and wins on the nth throw, the outcome (4, a_2, a_3, ..., a_{n−1}, 4), is

    P(E_{4,n}) = (3/36)^2 · (27/36)^{n−2}.

The probability that the gambler wins some time (when she threw a 4 on the first toss) is

    P(E_4) = P(∪_{n=2}^∞ E_{4,n}) = (3/36)^2 · Σ_{k=0}^∞ (27/36)^k.

This is a geometric series,

    P(E_4) = (3/36)^2 · Σ_{k=0}^∞ (27/36)^k = (3/36)^2 · 1/(1 − 27/36) = 1/36.
Computing the odds

The probability of rolling a 4 or a 10 is 3/36, so

    P(E_4) = P(E_10) = (3/36)^2 · Σ_{k=0}^∞ (27/36)^k = 1/36.

The probability of rolling a 5 or a 9 is 4/36, so

    P(E_5) = P(E_9) = (4/36)^2 · Σ_{k=0}^∞ (26/36)^k = 2/45.

The probability of rolling a 6 or an 8 is 5/36, so

    P(E_6) = P(E_8) = (5/36)^2 · Σ_{k=0}^∞ (25/36)^k = 25/396.

The probability of throwing a 7 or 11 on the first throw is 8/36, so the probability of winning at craps is

    8/36 + 2 · (1/36) + 2 · (2/45) + 2 · (25/396) = 244/495 ≈ 0.4929.
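The arithmetic above can be verified exactly with rational arithmetic; a short sketch (the `ways` table is just the standard count of dice combinations for each point):

```python
from fractions import Fraction as F

# Number of ways to roll each point total with two dice (out of 36).
ways = {4: 3, 5: 4, 6: 5, 8: 5, 9: 4, 10: 3}

def p_point(w):
    """(w/36)^2 times the geometric series with ratio (36 - w - 6)/36."""
    return F(w, 36) ** 2 / (1 - F(36 - w - 6, 36))

# 8/36 for an immediate 7 or 11, plus the six point totals.
total = F(8, 36) + sum(p_point(w) for w in ways.values())
print(total, float(total))  # 244/495, about 0.4929
```

Iterating over `ways.values()` covers each of 1/36, 2/45, 25/396 twice, exactly as in the hand computation.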
Variation on a Theme
Example

The ideas of independence and conditioning are remarkably effective at working together to provide neat solutions to a wide range of problems.

Example. A coin shows a head with probability p, or a tail with probability 1 − p. It is flipped repeatedly until the first head appears. What is the probability that a head appears on an even number of tosses?

Example – continued

Let E be the event of a head eventually appearing on an even toss. By the partition rule, conditioning on the outcome of the first toss (H or T):

    P(E) = P(E | H) · P(H) + P(E | T) · P(T)
         = 0 · p + P(E | T) · (1 − p),

since 1 is odd, P(E | H) = 0.

What about P(E | T)? The key is that we now need an odd number of further tosses to succeed.
Example – continued

Let O be the event that the first head is on an odd throw, and N the event that no head is ever tossed. Then

    P(E^c) = P(O) + P(N) = P(O) + 0 = P(O).

But P(O) = P(E | T), so

    P(E | T) = 1 − P(E).

Thus,

    P(E) = P(E | T) · (1 − p) = (1 − P(E)) · (1 − p).

Example – continued

Let q = 1 − p, and solve for P(E):

    P(E) = (1 − P(E)) · q
    P(E) = q / (1 + q) = (1 − p) / (1 + (1 − p)) = (1 − p) / (2 − p).
Example – continued

So

    P(E) = (1 − p) / (2 − p).

Consider a fair coin (p = 1/2). The probability of tossing the first head on an even throw is

    P(E) = 1/3.

The probability of tossing the first head on an odd throw is

    P(O) = 1 − 1/3 = 2/3.

Does this seem right?

Example – Method 2

Method 2. Let E_n (n ≥ 1) be the event that the first head appears on the 2nth toss. Let q = 1 − p. Then

    P(E_n) = p q^{2n−1} = pq (q^2)^{n−1}.

Since the events E_1, E_2, ... are mutually exclusive,

    P(E) = Σ_{n=1}^∞ P(E_n) = pq Σ_{k=0}^∞ (q^2)^k = pq / (1 − q^2) = p(1 − p) / (1 − (1 − p)^2) = (1 − p) / (2 − p).
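Both methods agree, and the series of Method 2 can be checked against the closed form numerically; a quick sketch over a few arbitrary values of p:

```python
# Check that sum_{n>=1} p q^(2n-1) equals (1 - p) / (2 - p).
def p_even_series(p, terms=200):
    """Truncated sum of P(first head on toss 2n), n = 1..terms."""
    q = 1 - p
    return sum(p * q ** (2 * n - 1) for n in range(1, terms + 1))

for p in (0.5, 0.3, 0.9):
    print(p, p_even_series(p), (1 - p) / (2 - p))
```

For p = 1/2 both sides give 1/3, matching the fair-coin value above.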
A problem of Huygens

Example: Huygens' problem

Example. Two players, A and B, take turns at throwing dice; each needs some score to win. If one player does not throw the required score, the play continues with the next person throwing. At each of their attempts A wins with probability α and B wins with probability β.

(a) What is the probability that A wins if A throws first?
(b) What is the probability that A wins if A throws second? (If B throws their score, A loses.)

Solution to Huygens' problem

Let p_1 be the probability A wins when A has the first throw, and p_2 be the probability A wins when B has the first throw.

By conditioning on the outcome of the first throw, when A is first,

    p_1 = α + (1 − α) p_2.

When B is first, conditioning on the first throw gives

    p_2 = (1 − β) p_1.

Solving this pair gives

    p_1 = α / (α + β − αβ),    p_2 = (1 − β)α / (α + β − αβ).
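The solution can be sanity-checked by simulating the alternating game; a sketch (seed and counts are my own, and I use the fair-coin values α = β = 1/2 from the example that follows):

```python
import random

def a_wins(alpha, beta, a_first, rng):
    """Simulate alternating throws until one player scores; True if A wins."""
    a_turn = a_first
    while True:
        if a_turn:
            if rng.random() < alpha:
                return True
        else:
            if rng.random() < beta:
                return False
        a_turn = not a_turn

alpha, beta = 0.5, 0.5
rng = random.Random(7)
n = 100_000
p1_hat = sum(a_wins(alpha, beta, True, rng) for _ in range(n)) / n
p2_hat = sum(a_wins(alpha, beta, False, rng) for _ in range(n)) / n

p1 = alpha / (alpha + beta - alpha * beta)              # = 2/3 here
p2 = (1 - beta) * alpha / (alpha + beta - alpha * beta)  # = 1/3 here
print(p1_hat, p1, p2_hat, p2)
```

Note the game ends with probability one (an infinite run of misses has probability zero, by the waiting-for-success argument), so the loop terminates.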
Example

Suppose A and B are tossing a fair coin, and each needs a head. A throws first. (Here, α = β = 1/2.)

The probability that A wins is

    p_1 = (1/2) / (1/2 + 1/2 − 1/4) = 2/3.

The problem is equivalent to tossing the first head on an odd throw.

The probability that B wins is

    p_2 = (1/2) · p_1 = 1/3.

The problem is equivalent to tossing the first head on an even throw.

Example

Suppose A and B are throwing a pair of dice. A needs a 5 and B a 7. (Here, α = 4/36 = 1/9 and β = 6/36 = 1/6.)

The probability that A wins if A throws first is

    p_1 = (1/9) / (1/9 + 1/6 − 1/54) = 3/7 ≈ 0.43.

The probability that A wins if A throws second is

    p_2 = (5/6) · p_1 = 15/42 ≈ 0.36.
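The dice example can be confirmed with exact rational arithmetic; a short sketch:

```python
from fractions import Fraction as F

# A needs a 5 (4 of 36 combinations), B needs a 7 (6 of 36).
alpha, beta = F(4, 36), F(6, 36)

denom = alpha + beta - alpha * beta
p1 = alpha / denom               # A wins throwing first
p2 = (1 - beta) * alpha / denom  # A wins throwing second
print(p1, p2)  # 3/7 and 5/14 (= 15/42)
```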