FSRM 582, Homework 3 - Solutions
Assigned: September 16, 2014
Due: September 30, 2014
Problem 1. Suppose A1 and A2 are independent events for some probability measure P on Ω.
For j = 1, 2, define
    Ij(ω) = 1 if ω ∈ Aj, and Ij(ω) = 0 otherwise,
to be the indicator function associated to the event Aj. Writing p1 = P(A1), p2 = P(A2), q1 = 1 − p1
and q2 = 1 − p2, show the following:
• I1 is a Bernoulli random variable with success probability p1, and I2 is a Bernoulli random
variable with success probability p2.
• The product I1 I2 is a Bernoulli random variable with success probability p1 p2.
• The random variable Y = max(I1, I2), defined as the maximum of I1 and I2, is a Bernoulli
random variable with success probability 1 − q1 q2.
• Deduce that, in general, for independent events A1, A2, . . . , An, the variable min(I1, . . . , In) is
Bernoulli(p1 p2 · · · pn) and max(I1, . . . , In) is Bernoulli(1 − q1 q2 · · · qn).
Solution. A1, A2 are independent events for the measure P on Ω.
1. I1 is a Bernoulli(p1) RV, where p1 = P(A1). This is obvious:
    P(I1 = 1) = P({ω ∈ Ω : I1(ω) = 1}) = P(A1) = p1,
    P(I1 = 0) = P({ω ∈ Ω : I1(ω) = 0}) = P(A1^c) = 1 − p1.
2. The product satisfies
    (I1 I2)(ω) = 1 if ω ∈ A1 ∩ A2, and 0 otherwise.
Hence,
    P(I1 I2 = 1) = P({I1 = 1} ∩ {I2 = 1}) = P(A1 ∩ A2) = P(A1)P(A2) = p1 p2,
where the third equality uses A1 ⊥ A2.
3. Similarly,
    max(I1, I2)(ω) = 1 if ω ∈ A1 ∪ A2, and 0 otherwise,
so
    P(max(I1, I2) = 1) = P(A1 ∪ A2) = P(A1) + P(A2) − P(A1)P(A2) = 1 − P(A1^c)P(A2^c) = 1 − q1 q2.
4. Induction implies that, in general, for mutually independent events A1, . . . , An,
    min(I1, . . . , In) ∼ Bernoulli(p1 · · · pn)  and  max(I1, . . . , In) ∼ Bernoulli(1 − q1 · · · qn).
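As an illustrative sanity check (not part of the assigned solution), the following short Python sketch simulates two independent events with assumed probabilities p1 = 0.3 and p2 = 0.6 and compares the empirical frequencies with p1 p2 and 1 − q1 q2:

import numpy as np

rng = np.random.default_rng(0)
p1, p2 = 0.3, 0.6            # illustrative choices of P(A1), P(A2)
n = 1_000_000

I1 = rng.random(n) < p1      # indicator of A1
I2 = rng.random(n) < p2      # indicator of A2, independent of I1

print("P(I1*I2 = 1)      ~", (I1 & I2).mean(), " vs p1*p2     =", p1 * p2)
print("P(max(I1,I2) = 1) ~", (I1 | I2).mean(), " vs 1 - q1*q2 =", 1 - (1 - p1) * (1 - p2))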
♣
Problem 2. Show that the following statements are equivalent for a collection A1, A2, . . . , An of
events in Ω.
(i) P(A•1 ∩ A•2 ∩ · · · ∩ A•n) = ∏_{j=1}^n P(A•j), where A•i = Ai or Ai^c for all i = 1, . . . , n;
(ii) For all I ⊂ [n], P(∩_{i∈I} Ai) = ∏_{i∈I} P(Ai).
Solution
1. (i) ⇒ (ii):
Assume (i) holds, and let I ⊂ [n]. Partitioning according to which of the events indexed by [n]\I occur,
    P(∩_{i∈I} Ai) = P( ∪_{J⊂[n]\I} [ (∩_{i∈I} Ai) ∩ (∩_{j∈J} Aj) ∩ (∩_{k∈I^c∩J^c} Ak^c) ] )
                  = Σ_{J⊂[n]\I} P[ (∩_{i∈I} Ai) ∩ (∩_{j∈J} Aj) ∩ (∩_{k∈I^c∩J^c} Ak^c) ]
                  = Σ_{J⊂[n]\I} ∏_{i∈I} P(Ai) ∏_{j∈J} P(Aj) ∏_{k∈I^c∩J^c} P(Ak^c)        (by (i))
                  = ∏_{i∈I} P(Ai) Σ_{J⊂[n]\I} ∏_{j∈J} P(Aj) ∏_{k∈I^c∩J^c} P(Ak^c)
                  = ∏_{i∈I} P(Ai),
since the events in the union are disjoint and the remaining sum factors as ∏_{m∈[n]\I} (P(Am) + P(Am^c)) = 1.
2. (ii) ⇒ (i):
For J ⊂ [n], write AJ = ∩_{j∈J} Aj, A_{J^c} = ∪_{j∈J^c} Aj and A^c_{J^c} = ∩_{j∈J^c} Aj^c. Then
    AJ = (AJ ∩ A^c_{J^c}) ∪ (AJ ∩ A_{J^c}),
and AJ ∩ A^c_{J^c} and AJ ∩ A_{J^c} are disjoint.
Without loss of generality, assume J^c = [k]. By inclusion-exclusion and (ii),
    P[AJ ∩ A_{J^c}] = P[ ∪_{j∈J^c} (AJ ∩ Aj) ]
                    = Σ_{∅≠I⊂J^c} (−1)^{#I+1} P[AJ ∩ AI]
                    = P[AJ] Σ_{∅≠I⊂[k]} (−1)^{#I+1} ∏_{i∈I} P[Ai].
Splitting the last sum into the subsets I that contain k and those that do not,
    Σ_{∅≠I⊂[k]} (−1)^{#I+1} ∏_{i∈I} P[Ai]
        = P[Ak] − P[Ak] Σ_{∅≠I⊂[k−1]} (−1)^{#I+1} ∏_{i∈I} P[Ai] + Σ_{∅≠I⊂[k−1]} (−1)^{#I+1} ∏_{i∈I} P[Ai]
        = 1 − P[Ak^c] ( 1 − Σ_{∅≠I⊂[k−1]} (−1)^{#I+1} ∏_{i∈I} P[Ai] ),
and iterating this recursion down to the empty index set gives
    Σ_{∅≠I⊂[k]} (−1)^{#I+1} ∏_{i∈I} P[Ai] = 1 − P[Ak^c] · · · P[A1^c] = 1 − ∏_{j∈J^c} P[Aj^c].
Therefore
    P[AJ ∩ A_{J^c}] = P[AJ] ( 1 − ∏_{j∈J^c} P[Aj^c] ).
Hence,
    P[AJ ∩ A^c_{J^c}] = P[AJ] − P[AJ ∩ A_{J^c}]
                      = P[AJ] − P[AJ] ( 1 − ∏_{j∈J^c} P[Aj^c] )
                      = P[AJ] ∏_{j∈J^c} P[Aj^c]
                      = ∏_{j∈J} P[Aj] ∏_{j∈J^c} P[Aj^c],
where the last equality applies (ii) to the set J. Since every choice of A•1, . . . , A•n corresponds to some J ⊂ [n] in this way, (i) follows.
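As a side check (not part of the solution), the alternating-sum identity used above can be verified numerically; the following Python sketch compares both sides for randomly chosen values x1, . . . , xk playing the role of P[A1], . . . , P[Ak]:

from itertools import combinations
from math import prod
import random

k = 5
x = [random.random() for _ in range(k)]   # stand-ins for P[A1], ..., P[Ak]

# Left side: sum over nonempty I ⊂ [k] of (-1)^(#I+1) * prod_{i in I} x_i
lhs = sum((-1) ** (len(I) + 1) * prod(x[i] for i in I)
          for r in range(1, k + 1) for I in combinations(range(k), r))
# Right side: 1 - prod_i (1 - x_i)
rhs = 1 - prod(1 - xi for xi in x)

print(abs(lhs - rhs) < 1e-12)   # True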
♣
Problem 3. Suppose a random variable Y takes the values 2^n and −2^n, each with probability
1/2^{n+2}, for n = 0, 1, 2, . . ..
(i) Specify the probability mass function (pmf) of Y and show that it is, indeed, a pmf (i.e.
Σ_y pY(y) = 1).
(ii) Show that the expectation of Y does not exist.
Solution
1. For n = 0, 1, 2, . . .,
    P(Y = 2^n) = P(Y = −2^n) = 1/2^{n+2},
and
    Σ_y P(Y = y) = Σ_{n=0}^∞ 2/2^{n+2} = Σ_{n=0}^∞ 1/2^{n+1} = 1.
2. EY does not exist. Let Y+ = max(Y, 0) and Y− = max(−Y, 0). Then
    EY+ = Σ_{n=0}^∞ 2^n/2^{n+2} = Σ_{n=0}^∞ 1/4 = ∞ = EY−,
so EY = EY+ − EY− has the indeterminate form ∞ − ∞ and the expectation is undefined.
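A small numerical illustration (not part of the solution): truncating both sums at the first N values of n shows the pmf total approaching 1 while the partial sums of E|Y| grow linearly in N. The cutoff N = 60 is an arbitrary illustrative choice:

from fractions import Fraction

N = 60   # illustrative truncation point
pmf_total = sum(2 * Fraction(1, 2 ** (n + 2)) for n in range(N))
abs_mean  = sum(2 * Fraction(2 ** n, 2 ** (n + 2)) for n in range(N))

print(float(pmf_total))   # approaches 1 as N grows
print(float(abs_mean))    # equals N/2, so E|Y| diverges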
♣
Problem 4. Recall the definition of the craps game from the second lecture. In particular, you
roll 2 fair 6-sided dice repeatedly until you win or lose. The rules of the game are as follows: On
the first roll,
• if either a 7 or 11 is rolled, you win;
• if either 2, 3 or 12 is rolled, you lose;
• otherwise, the roll is your point and
you continue rolling until
• you roll your point before you roll a 7, and you win;
• you roll a 7 before making your point, and you lose.
In class, we computed the probability that you win at craps by using the law of cases. In this
exercise, you are asked to compute the conditional probabilities P (W |Ti ), for i = 4, 5, 6, 8, 9, 10,
where Ti = {first roll is i}, by using a first-step conditioning argument.
Solution Let Ti = {1st roll is i}. Then we obtain P (W |Ti ) by noting that, after the first roll, if we
fail to roll either an i or a 7, then our probability of winning remains at P (W |Ti ). In other words,
if we let Si = {2nd roll is not i or 7}, then P (W |Ti , Si ) = P (W |Ti ).
So, we have
    P(W|Ti) = P(Si)P(W|Ti, Si) + P(Si^c)P(W|Ti, Si^c)
            = P(Si)P(W|Ti) + P(Si^c)P(W|Ti, Si^c).
Based on this we have, for i = 4, 5, 6,
    P(W|T4) = (27/36) P(W|T4) + (9/36)(3/9)    ⇒  P(W|T4) = p4 = 3/9 = 1/3,
    P(W|T5) = (26/36) P(W|T5) + (10/36)(4/10)  ⇒  P(W|T5) = p5 = 4/10 = 2/5,
    P(W|T6) = (25/36) P(W|T6) + (11/36)(5/11)  ⇒  P(W|T6) = p6 = 5/11.
By the symmetry of the dice totals, P(W|T8) = p6 = 5/11, P(W|T9) = p5 = 2/5, and P(W|T10) = p4 = 1/3.
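For reference (not part of the solution), the same conditional probabilities follow from noting that, given the point i, only rolls of i or 7 are decisive; a short Python sketch computing P(W|Ti) exactly:

from fractions import Fraction

# Number of ways to roll each total with two fair six-sided dice
ways = {s: sum(1 for a in range(1, 7) for b in range(1, 7) if a + b == s)
        for s in range(2, 13)}

for point in (4, 5, 6, 8, 9, 10):
    # Given the point, the game ends on the first roll of the point or of 7.
    p_win = Fraction(ways[point], ways[point] + ways[7])
    print(point, p_win)   # 1/3, 2/5, 5/11, 5/11, 2/5, 1/3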
♣
Problem 5.
Give an example of a pair of random variables X and Y defined on the same
probability space such that X and Y are not independent but Cov(X, Y )=0.
Solution. Let X take the values −2, −1, 1, 2 with probability 1/4 each, and let Y = X^2. Then
EX = 0 and E[XY] = E[X^3] = 0, so Cov(X, Y) = E[XY] − EX · EY = 0,
but X is not independent of Y, because P(Y = 1 | X = 1) = 1 ≠ P(Y = 1 | X = 2) = 0.
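A quick numerical confirmation (illustrative only) that the covariance vanishes for this construction:

import numpy as np

x = np.array([-2.0, -1.0, 1.0, 2.0])   # equally likely values of X
y = x ** 2                             # Y = X^2, a deterministic function of X

cov = np.mean(x * y) - np.mean(x) * np.mean(y)
print(cov)   # 0.0, even though Y is completely determined by X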
♣
Problem 6.
(i) The following table shows the cumulative distribution function of a discrete random variable.
Find the frequency function.
    k     0    1    2    3    4    5
    F(k)  0   .1   .3   .7   .8   1.0
(ii) If X is an integer-valued random variable, show that the frequency function is related to the
cdf by p(k) = F (k) − F (k − 1).
(iii) A multiple-choice test consists of 20 items, each with four choices. A student is able to
eliminate one of the choices on each question as incorrect and chooses randomly from the
remaining three choices. A passing grade is 12 items or more correct.
a. What is the probability that the student passes?
b. Answer the question in part (a) again, assuming that the student can eliminate two of
the choices on each question.
(iv) Two boys play basketball in the following way. They take turns shooting and stop when a
basket is made. Player A goes first and has probability p1 of making a basket on any throw.
Player B, who shoots second, has probability p2 of making a basket. The outcomes of the
successive trials are assumed to be independent. What is the probability that player A wins?
Solution.
(i)
    k     1    2    3    4    5
    p(k)  0.1  0.2  0.4  0.1  0.2
(ii)
    F(k) = P(X ≤ k) = Σ_{x≤k} p(x),
    F(k − 1) = P(X ≤ k − 1) = Σ_{x≤k−1} p(x),
so
    F(k) − F(k − 1) = p(k).
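A one-line numerical check (illustrative only) of the differencing relation p(k) = F(k) − F(k − 1) for the table above:

import numpy as np

F = np.array([0.0, 0.1, 0.3, 0.7, 0.8, 1.0])   # F(k) for k = 0, 1, ..., 5
p = np.diff(F)                                  # p(k) = F(k) - F(k-1), k = 1, ..., 5
print(p)   # [0.1 0.2 0.4 0.1 0.2]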
(iii)
1. Let X = number of correct answers; then X ∼ Bin(20, 1/3) and
    P(pass) = P(X ≥ 12) = Σ_{k=12}^{20} C(20, k) (1/3)^k (2/3)^{20−k}.
2. Now X ∼ Bin(20, 1/2), and P(pass) = P(X ≥ 12) = Σ_{k=12}^{20} C(20, k) (1/2)^{20}.
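These two tail probabilities can be evaluated exactly; a short Python sketch (illustrative only):

from math import comb

def prob_pass(p, n=20, threshold=12):
    # P(X >= threshold) for X ~ Bin(n, p)
    return sum(comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(threshold, n + 1))

print(prob_pass(1 / 3))   # about 0.013 (three remaining choices)
print(prob_pass(1 / 2))   # about 0.252 (two remaining choices)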
(iv) Let A = {player A wins}; then, conditioning on the first round of shots,
    P(A) = p1 + (1 − p1)(1 − p2)P(A)  ⇒  P(A) = p1 / (1 − q1 q2),
where q1 = 1 − p1 and q2 = 1 − p2.
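A simulation sketch (illustrative only; p1 = 0.4 and p2 = 0.5 are arbitrary values) agrees with the closed form p1/(1 − q1 q2):

import random

def a_wins(p1, p2):
    # Players alternate shots, A first; return True if A makes a basket first.
    while True:
        if random.random() < p1:
            return True
        if random.random() < p2:
            return False

p1, p2 = 0.4, 0.5
trials = 200_000
estimate = sum(a_wins(p1, p2) for _ in range(trials)) / trials
print(estimate, "vs", p1 / (1 - (1 - p1) * (1 - p2)))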
♣
Problem 7.
(i) Show that if X is a discrete random variable taking values on the positive integers, then
E(X) = Σ_{k=1}^∞ P(X ≥ k). Apply this result to find the expected value of a geometric random
variable.
(ii) If n men throw their hats into a pile and each man takes a hat at random, what is the expected
number of matches? (Hint: Express the number as a sum.)
Solution (i) Since X takes values in the positive integers,
    Σ_{k=1}^∞ P(X ≥ k) = Σ_{k=1}^∞ Σ_{j=k}^∞ P(X = j)
                        = Σ_{k=1}^∞ Σ_{j=1}^∞ P(X = j) 1{j ≥ k}
                        = Σ_{j=1}^∞ P(X = j) Σ_{k=1}^∞ 1{k ≤ j}
                        = Σ_{j=1}^∞ j P(X = j)
                        = E(X).
For a geometric random variable X with success probability p (the number of trials up to and
including the first success), P(X ≥ k) = (1 − p)^{k−1}, so E(X) = Σ_{k=1}^∞ (1 − p)^{k−1} = 1/p.
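A quick numerical check of the tail-sum formula in the geometric case (illustrative; p = 0.3 is arbitrary, and the sum is truncated at k = 200 where the tail is negligible):

import numpy as np

p = 0.3
k = np.arange(1, 201)
tail_sum = np.sum((1 - p) ** (k - 1))   # sum of P(X >= k) = (1-p)^(k-1)
print(tail_sum, "vs 1/p =", 1 / p)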
(ii) Let Ij = 1{man j gets his own hat}; then the total number of matches is M = Σ_{j=1}^n Ij.
Marginally, each Ij ∼ Bern(1/n), so E(Ij) = 1/n for all j. Hence
    E(M) = E(Σ_j Ij) = Σ_j E(Ij) = n · (1/n) = 1.
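A short simulation (illustrative only; n = 10 and the number of trials are arbitrary choices) confirming that the expected number of matches is 1 regardless of n:

import numpy as np

rng = np.random.default_rng(1)
n, trials = 10, 100_000

# Each random permutation assigns hats to men; a match is a fixed point.
matches = np.array([(rng.permutation(n) == np.arange(n)).sum() for _ in range(trials)])
print(matches.mean())   # close to 1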
♣