Lahore University of Management Sciences
CMPE 501: Applied Probability (Fall 2010)
Homework 4: Solution
1. Random variables X and Y have joint PMF

   pX,Y (x, y) = c(x + y)   if x ∈ {0, 1, 2} and y ∈ {1, 2},
                 0          otherwise.
(a) What is the value of the constant c?
Solution:
To be a valid PMF, Σx Σy pX,Y (x, y) = 1. Therefore:
   1 = Σx Σy pX,Y (x, y)
     = c(0 + 1) + c(0 + 2) + · · · + c(2 + 2)
     = 15c
This gives c = 1/15.
(b) Find the marginal PMFs pX (x) and pY (y).
Solution:
pX (x) = Σy pX,Y (x, y) = c(x + 1) + c(x + 2) = (2x + 3)/15   for x ∈ {0, 1, 2}
Similarly
pY (y) = Σx pX,Y (x, y) = (3y + 3)/15   for y ∈ {1, 2}
(c) Find the conditional PMFs pX | Y (x | y) and pY | X (y | x).
Solution:
pX | Y (x | y) = pX,Y (x, y) / pY (y) = ((x + y)/15) / ((3y + 3)/15) = (x + y)/(3y + 3)   for x ∈ {0, 1, 2} and y ∈ {1, 2}
Similarly
pY | X (y | x) = pX,Y (x, y) / pX (x) = (x + y)/(2x + 3)   for x ∈ {0, 1, 2} and y ∈ {1, 2}
(d) Find the conditional expectations E[X | Y = y] and E[Y | X = x].
Solution:
E[X | Y = y] = Σx x pX | Y (x | y)
             = 0 · pX | Y (0 | y) + 1 · pX | Y (1 | y) + 2 · pX | Y (2 | y)
             = 0 + 1 · (1 + y)/(3y + 3) + 2 · (2 + y)/(3y + 3)
             = (3y + 5)/(3y + 3)   for y ∈ {1, 2}
Notice that E[X | Y = y] is a function of y alone. Similarly
E[Y | X = x] = Σy y pY | X (y | x) = (3x + 5)/(2x + 3)   for x ∈ {0, 1, 2}
which is a function of x alone.
(e) Find the expectations E[X] and E[Y ] by using
i. the marginal PMF of X (and of Y respectively), and
Solution:
E[X] = Σx x pX (x) = 19/15
Similarly
E[Y ] = Σy y pY (y) = 8/5
ii. the total expectation theorem for E[X] (and E[Y ] respectively).
Solution:
E[X] = Σy E[X | Y = y] pY (y)
     = E[X | Y = 1] pY (1) + E[X | Y = 2] pY (2)
     = (8/6) · (6/15) + (11/9) · (9/15)
     = 19/15
Similarly
E[Y ] = Σx E[Y | X = x] pX (x) = 8/5
(f) Are X and Y independent?
Solution:
No, X and Y are not independent. For X and Y to be independent, pX,Y (x, y) =
pX (x) pY (y) for all (x, y).
pX,Y (x, y) = (x + y)/15 ≠ ((2x + 3)/15) · ((3y + 3)/15) = pX (x) pY (y)
In this case it would have sufficed to show that pX,Y (x, y) ≠ pX (x) pY (y) for any one
value of (x, y).
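As a sanity check (not part of the original solution), the joint PMF and the answers to parts (a), (e), and (f) can be verified in a few lines of Python using exact rational arithmetic:

```python
from fractions import Fraction as F

# Joint PMF p_{X,Y}(x, y) = (x + y)/15 on x in {0,1,2}, y in {1,2}.
xs, ys = [0, 1, 2], [1, 2]
p = {(x, y): F(x + y, 15) for x in xs for y in ys}
assert sum(p.values()) == 1                      # c = 1/15 normalizes the PMF

pX = {x: sum(p[x, y] for y in ys) for x in xs}   # marginal (2x + 3)/15
pY = {y: sum(p[x, y] for x in xs) for y in ys}   # marginal (3y + 3)/15
assert all(pX[x] == F(2 * x + 3, 15) for x in xs)
assert all(pY[y] == F(3 * y + 3, 15) for y in ys)

EX = sum(x * pX[x] for x in xs)
EY = sum(y * pY[y] for y in ys)
print(EX, EY)                                    # 19/15 8/5

# Total expectation theorem gives the same E[X].
EX_alt = sum(sum(x * p[x, y] / pY[y] for x in xs) * pY[y] for y in ys)
assert EX_alt == EX

# Not independent: p(x, y) != pX(x) pY(y), e.g. at (0, 1).
assert p[0, 1] != pX[0] * pY[1]
```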
2. There are two possible causes for a breakdown of a machine. To check the first possibility
would cost C1 dollars, and if that were the cause of the breakdown, the trouble could be
repaired at a cost of R1 dollars. Similarly, there are costs C2 and R2 associated with the
second possibility. Let p and 1 − p denote, respectively, the probabilities that the breakdown
is caused by the first and second possibilities. Under what conditions on p, Ci , Ri , i = 1, 2,
should we check the first possible cause of breakdown and then the second, as opposed to
reversing the checking order, so as to minimize the expected cost involved in returning the
machine to working order? Assume that if the first check is negative, you must still check the
other possibility.
Solution:
If we check the first possible cause of breakdown before the second, then:
• with probability p we will be right and the total cost would be C1 + R1 , and
• with probability 1 − p we will also have to check the second possible cause with a total
cost of C1 + C2 + R2 .
If on the other hand, we check the second possible cause of breakdown before the first, then
• with probability 1 − p we will be right and the total cost would be C2 + R2 , and
• with probability p we will also have to check the first possible cause with a total cost of
C2 + C1 + R1 .
Let T1 be the total cost if we check the first cause before checking the second cause. Similarly
let T2 be the total cost if we check the second cause before the first cause. Because of the
above reasoning, it's clear that T1 and T2 are r.v.s with the following PMFs:
pT1 (k) = p       if k = C1 + R1,
          1 − p   if k = C1 + C2 + R2

pT2 (k) = p       if k = C2 + C1 + R1,
          1 − p   if k = C2 + R2
Now, we should check the first possible cause of breakdown and then the second iff E[T1 ] < E[T2 ].
This means that:
E[T1 ] < E[T2 ]
which expands to
p(C1 + R1 ) + (1 − p)(C1 + C2 + R2 ) < p(C2 + C1 + R1 ) + (1 − p)(C2 + R2 )
After cancellation of common terms, this simplifies to
p/(1 − p) > C1/C2
or
p > C1/(C1 + C2)
So we should check the first possible cause before the second possible cause iff p > C1/(C1 + C2).
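The decision rule can be confirmed numerically. The sketch below uses illustrative cost values (C1 = 30, R1 = 100, C2 = 10, R2 = 250 are arbitrary choices, not from the problem) and checks that E[T1] < E[T2] exactly when p > C1/(C1 + C2):

```python
from fractions import Fraction as F

def expected_costs(p, C1, R1, C2, R2):
    # E[T1]: check cause 1 first; E[T2]: check cause 2 first.
    ET1 = p * (C1 + R1) + (1 - p) * (C1 + C2 + R2)
    ET2 = p * (C2 + C1 + R1) + (1 - p) * (C2 + R2)
    return ET1, ET2

C1, R1, C2, R2 = 30, 100, 10, 250        # illustrative values only
threshold = F(C1, C1 + C2)               # derived rule: p > C1/(C1 + C2)
for k in range(1, 20):                   # sweep p over a grid
    p = F(k, 20)
    ET1, ET2 = expected_costs(p, C1, R1, C2, R2)
    assert (ET1 < ET2) == (p > threshold)
print("rule verified; threshold =", threshold)
```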
3. Professor May B. Right often has her science facts wrong, and answers each of her students’
questions incorrectly with probability 1/4, independently of other questions. In each lecture
Professor Right is asked either 1 or 2 questions with equal probability. Assume that Professor
Right has 20 lectures each semester and each lecture is independent of any other lecture.
(a) What is the probability that Professor Right gives wrong answers to all the questions she
gets in a given lecture?
Solution:
Let X be the number of questions asked and let A be the event that Professor Right gives
wrong answers to all the questions asked. Using the total probability theorem:
P(A) = Σx P(X = x) P(A | X = x)
     = P(X = 1) P(A | X = 1) + P(X = 2) P(A | X = 2)
     = (1/2) · (1/4) + (1/2) · (1/16)
     = 5/32
(b) Given that Professor Right gave wrong answers to all the questions she was asked in a
given lecture, what is the probability that she got two questions?
Solution:
Let the r.v. X and event A be defined as in part (a) above. We are required to find P(X = 2 | A).
P(X = 2 | A) = P({X = 2} ∩ A) / P(A)
             = P(X = 2) P(A | X = 2) / P(A)
             = ((1/2) · (1/16)) / (5/32)
             = 1/5
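Both conditional-probability answers can be checked with exact arithmetic:

```python
from fractions import Fraction as F

# P(wrong answer) = 1/4 per question; 1 or 2 questions w.p. 1/2 each.
pX = {1: F(1, 2), 2: F(1, 2)}
pA_given_X = {x: F(1, 4) ** x for x in pX}      # all answers wrong

PA = sum(pX[x] * pA_given_X[x] for x in pX)     # total probability theorem
assert PA == F(5, 32)

P2_given_A = pX[2] * pA_given_X[2] / PA         # Bayes' rule
assert P2_given_A == F(1, 5)
print(PA, P2_given_A)                           # 5/32 1/5
```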
(c) Let X and Y be the number of questions asked and the number of questions answered
correctly in a lecture, respectively. What is the mean and variance of X and the mean
and variance of Y ?
Solution:
The PMF pX (x) of X is given by:
pX (x) = 1/2   if x ∈ {1, 2},
         0     otherwise
With this PMF, E[X] and E[X²] can be calculated as 3/2 and 5/2 respectively. This gives
E[X] = Σx x pX (x) = 3/2
and
var(X) = E[X²] − E[X]² = 1/4
For Y , it's easy to see that conditioned on X = x, Y is a binomial r.v. with n = x and p = 3/4. So
pY | X (y | x) = C(x, y) (3/4)^y (1/4)^(x−y)   for 0 ≤ y ≤ x ≤ 2
Now using the total probability theorem, the (marginal) PMF of Y can be calculated as:
pY (y) = pX,Y (1, 0) + pX,Y (2, 0)   if y = 0
         pX,Y (1, 1) + pX,Y (2, 1)   if y = 1
         pX,Y (2, 2)                 if y = 2
         0                           otherwise
Using pX,Y (x, y) = pX (x) pY | X (y | x) we get
pY (y) = 5/32   if y = 0
         9/16   if y = 1
         9/32   if y = 2
         0      otherwise
With this PMF, E[Y ] and E[Y ²] can be calculated as 9/8 and 27/16 respectively. This gives
E[Y ] = Σy y pY (y) = 9/8
and
var(Y ) = E[Y ²] − E[Y ]² = 27/64
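The PMF of Y and the four moments above can be verified mechanically:

```python
from fractions import Fraction as F
from math import comb

pX = {1: F(1, 2), 2: F(1, 2)}

def pY_given_X(y, x):
    # Given X = x, Y ~ Binomial(x, 3/4) correct answers.
    return comb(x, y) * F(3, 4) ** y * F(1, 4) ** (x - y)

pY = {y: sum(pX[x] * pY_given_X(y, x) for x in pX if y <= x) for y in range(3)}
assert pY == {0: F(5, 32), 1: F(9, 16), 2: F(9, 32)}

EY = sum(y * pY[y] for y in pY)
EY2 = sum(y * y * pY[y] for y in pY)
assert (EY, EY2 - EY ** 2) == (F(9, 8), F(27, 64))   # mean 9/8, variance 27/64

EX = sum(x * pX[x] for x in pX)
EX2 = sum(x * x * pX[x] for x in pX)
assert (EX, EX2 - EX ** 2) == (F(3, 2), F(1, 4))     # mean 3/2, variance 1/4
```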
(d) Give a neatly labeled table of the joint PMF pX,Y (x, y).
Solution:
pX,Y (x, y)   y = 0   y = 1   y = 2
   x = 1      1/8     3/8     0
   x = 2      1/32    3/16    9/32
(e) Let Z = X + 2Y . What are the expectation and variance of Z?
Solution:
From the joint PMF pX,Y (x, y) above we can calculate the PMF of Z directly as:
pZ (z) = 1/8    if z = 1
         1/32   if z = 2
         3/8    if z = 3
         3/16   if z = 4
         9/32   if z = 6
         0      otherwise
Using this we can easily calculate E[Z] and E[Z²] as 15/4 and 67/4 respectively. This gives
E[Z] = Σz z pZ (z) = 15/4
and
var(Z) = E[Z²] − E[Z]² = 43/16
We could also have found E[Z] using
E[Z] = E[X + 2Y ] = E[X] + 2E[Y ] = 15/4
as before. Notice that this does NOT require X and Y to be independent. We could
not, however, have used var(Z) = var(X) + 4 var(Y ) since X and Y are dependent, and
we had to resort to finding the PMF of Z first.
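A quick check of the PMF, mean, and variance of Z = X + 2Y from the joint table:

```python
from fractions import Fraction as F

# Joint PMF from the table in part (d); keys are (x, y).
pXY = {(1, 0): F(1, 8),  (1, 1): F(3, 8),  (1, 2): F(0),
       (2, 0): F(1, 32), (2, 1): F(3, 16), (2, 2): F(9, 32)}
assert sum(pXY.values()) == 1

pZ = {}
for (x, y), pr in pXY.items():
    z = x + 2 * y
    pZ[z] = pZ.get(z, F(0)) + pr
assert {z: p for z, p in pZ.items() if p} == \
       {1: F(1, 8), 2: F(1, 32), 3: F(3, 8), 4: F(3, 16), 6: F(9, 32)}

EZ = sum(z * p for z, p in pZ.items())
EZ2 = sum(z * z * p for z, p in pZ.items())
assert (EZ, EZ2 - EZ ** 2) == (F(15, 4), F(43, 16))
```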
(f) The university where Professor Right works has a peculiar compensation plan. For each
lecture, she gets paid a base salary of $1,000 plus $40 for each question she answers
and an additional $80 for each of these she answers correctly. In terms of the random
variable Z, she gets paid $1000 + $40Z per lecture. What are the expected value and
variance of her semesterly salary?
Solution:
Let Li (for 1 ≤ i ≤ 20) be the r.v. representing Professor Right’s salary for the ith lecture.
Also let Zi (for 1 ≤ i ≤ 20) be the r.v. so that Li = 1000 + 40Zi . Her semesterly salary
S is then given by:
S = Σ_{i=1}^{20} Li = Σ_{i=1}^{20} (1000 + 40Zi ) = 20000 + 40 Σ_{i=1}^{20} Zi
By linearity of expectations:
E[S] = 20000 + 40 E[Σ_{i=1}^{20} Zi ] = 20000 + 40 · 20 E[Zi ] = 23000
Since all Li are independent:
var(S) = var(Σ_{i=1}^{20} Li ) = Σ_{i=1}^{20} var(Li ) = 20 var(1000 + 40Zi ) = 20 · 40² var(Zi ) = 20 · 1600 · (43/16) = 86000
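The semester totals follow from E[Z] = 15/4 and var(Z) = 43/16, and can be checked as:

```python
from fractions import Fraction as F

EZ, varZ = F(15, 4), F(43, 16)   # per-lecture mean and variance of Z
n = 20                           # lectures per semester

# S is the sum of 20 i.i.d. copies of L = 1000 + 40 Z.
ES = n * (1000 + 40 * EZ)        # linearity of expectation
varS = n * 40 ** 2 * varZ        # variances add for independent lectures
print(ES, varS)                  # 23000 86000
assert ES == 23000
assert varS == 86000
```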
(g) Determined to improve her reputation, Professor Right decides to teach an additional
20-lecture course in her specialty (math), where she answers questions incorrectly with
probability 1/10 rather than 1/4. What is the expected number of questions that she
will answer wrong in a randomly chosen lecture (math or science)?
Solution:
Let Am (and As ) be the event that the randomly chosen lecture is math (and science
respectively). Also, let Y be the number of questions she answers incorrectly. We are
required to find E[Y ]. We first find the conditional PMFs pY | Am (y) and pY | As (y) using
a method similar to the one used in part (c).
pY | As (y) = 21/32   if y = 0
              5/16    if y = 1
              1/32    if y = 2
              0       otherwise
and
pY | Am (y) = 171/200   if y = 0
              7/50      if y = 1
              1/200     if y = 2
              0         otherwise
Using these we can calculate the values for E[Y | As ] and E[Y | Am ] as:
E[Y | As ] = Σy y pY | As (y) = 3/8
and
E[Y | Am ] = Σy y pY | Am (y) = 3/20
E[Y ] can now be calculated using the total expectation theorem.
E[Y ] = P(Am ) E[Y | Am ] + P(As ) E[Y | As ]
      = (1/2) · (3/20) + (1/2) · (3/8)
      = 21/80
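A compact check of part (g), using E[Y | X = x] = x · p_wrong so that each conditional mean is (3/2) · p_wrong:

```python
from fractions import Fraction as F

def E_wrong(p_wrong):
    # Lecture has 1 or 2 questions w.p. 1/2 each; given X = x questions,
    # the expected number of wrong answers is x * p_wrong.
    return sum(F(1, 2) * x * p_wrong for x in (1, 2))

E_sci = E_wrong(F(1, 4))     # science: 3/8
E_math = E_wrong(F(1, 10))   # math: 3/20
EY = F(1, 2) * E_math + F(1, 2) * E_sci   # total expectation theorem
assert (E_sci, E_math, EY) == (F(3, 8), F(3, 20), F(21, 80))
print(EY)                    # 21/80
```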
4. Joe Lucky plays the lottery on any given week with probability p, independently of whether
he played on any other week. Each time he plays, he has a probability q of winning, again
independently of everything else. During a fixed period of n weeks, let X be the number of
weeks that he played the lottery and Y the number of weeks that he won.
(a) What is the probability that he played the lottery any particular week, given that he
did not win anything that week?
Solution:
Let Li be the event that he played the lottery in week i and Wi be the event that he
won something in week i. We are required to find P(Li | Wic ).
P(Li | Wic ) = P(Li ∩ Wic ) / P(Wic )
             = P(Li ∩ Wic ) / (P(Li ∩ Wic ) + P(Lci ∩ Wic ))
             = P(Li ) P(Wic | Li ) / (P(Li ) P(Wic | Li ) + P(Lci ) P(Wic | Lci ))
             = p(1 − q) / (p(1 − q) + (1 − p) · 1)
             = (p − pq)/(1 − pq)
You can save the first two steps if you remember Bayes' rule.
(b) Find the conditional PMF pY | X (y | x).
Solution:
Conditioned on X = x, Y has a binomial PMF:
pY | X (y | x) = C(x, y) q^y (1 − q)^(x−y)   for 0 ≤ y ≤ x
(c) Find the joint PMF pX,Y (x, y).
Solution:
X has a binomial PMF with pX (x) = C(n, x) p^x (1 − p)^(n−x). The joint PMF pX,Y (x, y) is then
given by:
pX,Y (x, y) = pX (x) pY | X (y | x)
            = C(n, x) C(x, y) p^x (1 − p)^(n−x) q^y (1 − q)^(x−y)   for 0 ≤ y ≤ x ≤ n
which simplifies to
pX,Y (x, y) = [n! / ((n − x)! (x − y)! y!)] p^x q^y (1 − p)^(n−x) (1 − q)^(x−y)   for 0 ≤ y ≤ x ≤ n
(d) Find the marginal PMF pY (y). (Hint: One possibility is to start with the answer to
part (c), but the algebra can be messy. But if you think intuitively about the procedure
that generates Y , you may be able to guess the answer.)
Solution:
The probability that in any week Joe will win something is P(Li ) P(Wi | Li ) = pq. So Y
is a binomial r.v. with parameters n and pq:
pY (y) = C(n, y) (pq)^y (1 − pq)^(n−y)   for 0 ≤ y ≤ n
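That the marginal of Y is Binomial(n, pq) can be confirmed by summing the joint PMF from part (c) over x. The parameters below (n = 8, p = 2/5, q = 1/3) are arbitrary test values, not from the problem:

```python
from fractions import Fraction as F
from math import comb

n, p, q = 8, F(2, 5), F(1, 3)   # arbitrary test parameters

def joint(x, y):
    # p_{X,Y}(x, y) = C(n,x) p^x (1-p)^(n-x) * C(x,y) q^y (1-q)^(x-y)
    return (comb(n, x) * p ** x * (1 - p) ** (n - x)
            * comb(x, y) * q ** y * (1 - q) ** (x - y))

for y in range(n + 1):
    marginal = sum(joint(x, y) for x in range(y, n + 1))
    binom = comb(n, y) * (p * q) ** y * (1 - p * q) ** (n - y)
    assert marginal == binom    # Y ~ Binomial(n, pq), exactly
print("marginal of Y matches Binomial(n, pq)")
```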
(e) Find the conditional PMF pX | Y (x | y). Do this algebraically using previous answers.
Solution:
pX | Y (x | y) = pX,Y (x, y) / pY (y)
             = [n! / ((n−x)! (x−y)! y!)] p^x q^y (1 − p)^(n−x) (1 − q)^(x−y) / ([n! / (y! (n−y)!)] (pq)^y (1 − pq)^(n−y))
             = [(n − y)! / ((n − x)! (x − y)!)] · p^(x−y) (1 − q)^(x−y) (1 − p)^(n−x) / (1 − pq)^(n−y)
which simplifies to
pX | Y (x | y) = C(n − y, x − y) ((p − pq)/(1 − pq))^(x−y) ((1 − p)/(1 − pq))^(n−x)   for 0 ≤ y ≤ x ≤ n
(f) (Optional) Rederive the answer to part (e) by thinking as follows: For each one of the
n − Y weeks that he did not win, the answer to part (a) should tell you something.
Solution:
Given that Y = y, Joe did not win anything in n − y weeks. For each of these weeks,
the probability that he played the lottery (given that he did not win anything) is given
by P(Li | Wic ) = (p − pq)/(1 − pq) as calculated in part (a). So, given Y = y, the r.v. X taking the
value x means that Joe played the lottery in x − y of the n − y weeks he did not win
anything. So now given Y = y, X is a binomial r.v. with parameters n − y and
(p − pq)/(1 − pq). Therefore
pX | Y (x | y) = C(n − y, x − y) ((p − pq)/(1 − pq))^(x−y) (1 − (p − pq)/(1 − pq))^((n−y)−(x−y))   for 0 ≤ y ≤ x ≤ n
which is identical to the conditional PMF pX | Y (x | y) calculated in part (e).
5. Let X and Y be random variables that take on values from the set {-1, 0, 1}.
(a) Find a joint PMF assignment for which X and Y are independent, and confirm that X 2
and Y 2 are then also independent.
Solution:
Let Xsq and Ysq be the r.v.s corresponding to X² and Y² respectively. For the PMFs
given below, it can be verified that X and Y are independent and that X² and Y² are then
also independent.

pX,Y (x, y)    x = −1   x = 0   x = 1
   y = 1       1/9      1/9     1/9
   y = 0       1/9      1/9     1/9
   y = −1      1/9      1/9     1/9

pXsq,Ysq (xsq , ysq )   xsq = 0   xsq = 1
   ysq = 1              2/9       4/9
   ysq = 0              1/9       2/9
(b) Find a joint PMF assignment for which X and Y are not independent, but for which
X 2 and Y 2 are independent.
Solution:
For the PMFs given below, it can be verified that X² and Y² are independent but X
and Y are not.

pX,Y (x, y)    x = −1   x = 0   x = 1
   y = 1       0        1/9     2/9
   y = 0       1/9      1/9     1/9
   y = −1      2/9      1/9     0

pXsq,Ysq (xsq , ysq )   xsq = 0   xsq = 1
   ysq = 1              2/9       4/9
   ysq = 0              1/9       2/9
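The claims in part (b) can be verified exhaustively from the table:

```python
from fractions import Fraction as F

# Joint PMF from part (b); keys are (x, y).
p = {(-1, 1): F(0),     (0, 1): F(1, 9),  (1, 1): F(2, 9),
     (-1, 0): F(1, 9),  (0, 0): F(1, 9),  (1, 0): F(1, 9),
     (-1, -1): F(2, 9), (0, -1): F(1, 9), (1, -1): F(0)}
vals = (-1, 0, 1)
pX = {x: sum(p[x, y] for y in vals) for x in vals}
pY = {y: sum(p[x, y] for x in vals) for y in vals}

# X and Y are NOT independent (e.g. p(-1, 1) = 0 != 1/9):
assert any(p[x, y] != pX[x] * pY[y] for x in vals for y in vals)

# ...but X^2 and Y^2 ARE independent:
psq = {}
for (x, y), pr in p.items():
    psq[x * x, y * y] = psq.get((x * x, y * y), F(0)) + pr
pXsq = {a: sum(psq[a, b] for b in (0, 1)) for a in (0, 1)}
pYsq = {b: sum(psq[a, b] for a in (0, 1)) for b in (0, 1)}
assert all(psq[a, b] == pXsq[a] * pYsq[b] for a in (0, 1) for b in (0, 1))
```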
6. Pascal PMF. You toss a coin until exactly k heads appear. Assume that the probability of
a head on any toss is p independent of all other tosses.
(a) Find the probability that the k = 5th head appears on the y = 10th toss.
Solution:
The k = 5th head on the y = 10th toss means that we have exactly k − 1 = 4 heads
on the first y − 1 = 9 tosses and a head on the 10th toss. The probability of exactly 4
heads on the first 9 tosses is given by C(9, 4) p^4 (1 − p)^5, and the probability of the (independent)
10th toss resulting in a head is p. The required probability is therefore given by:
P(5th head on 10th toss) = C(9, 4) p^4 (1 − p)^5 · p = C(9, 4) p^5 (1 − p)^5
(b) Let Y be the toss number that results in the kth head. Find the PMF of Y .
Solution:
By similar reasoning, the PMF of Y is given by:
pY (y) = C(y − 1, k − 1) p^k (1 − p)^(y−k)   for y ≥ k
This is known as the Pascal PMF of order k.
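A small check of part (a) and of the fact that the Pascal PMF sums to 1 (illustrated with p = 1/2, an arbitrary choice):

```python
from fractions import Fraction as F
from math import comb

def pascal_pmf(y, k, p):
    # P(k-th head occurs on toss y) = C(y-1, k-1) p^k (1-p)^(y-k)
    return comb(y - 1, k - 1) * p ** k * (1 - p) ** (y - k)

p = F(1, 2)
# Part (a): 5th head on the 10th toss.
assert pascal_pmf(10, 5, p) == comb(9, 4) * p ** 5 * (1 - p) ** 5

# The PMF sums to 1 (numerically, over a long horizon; the tail beyond
# y = 200 is astronomically small for p = 1/2).
total = sum(pascal_pmf(y, 5, p) for y in range(5, 200))
assert abs(float(total) - 1.0) < 1e-12
```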
7. The Cauchy-Schwarz inequality tells us that for two vectors u and v in an inner product
space,
|⟨u, v⟩| ≤ ‖u‖ · ‖v‖
with equality holding iff one vector is a constant multiple of the other. Prove the
analogue of the Cauchy-Schwarz inequality for random variables:
|E[XY ]| ≤ √E[X²] · √E[Y²]
(Hint: Use the fact that E[(αX + Y )²] ≥ 0 for all real (constants) α.)
Solution:
Starting from the hint:
E[(αX + Y )²] ≥ 0
for all real (constants) α. Expanding the L.H.S.:
E[α²X² + 2αXY + Y²] ≥ 0
By linearity of expectations, this becomes
α² E[X²] + 2α E[XY ] + E[Y²] ≥ 0
The L.H.S. is a quadratic in α which is ≥ 0 for all real α. This means that its discriminant is ≤ 0. Therefore:
(2 E[XY ])² − 4 E[X²] E[Y²] ≤ 0
from which the desired result follows directly.
8. Show the following properties for expectations:
(a) Pull-through property:
E[Y g(X) | X] = g(X)E[Y | X] for any suitable function g
Solution:
E[Y g(X) | X] = Σy y g(X) pY | X (y | X)
Since the summation is over y, g(X) can be pulled out:
E[Y g(X) | X] = g(X) Σy y pY | X (y | X)
The summation on the R.H.S. is just E[Y | X] and so
E[Y g(X) | X] = g(X) E[Y | X]
(b) Tower property:
E[E[Y | X, Z] | X] = E[Y | X]
Solution:
We start by calculating the inner expected value on the L.H.S.:
E[Y | X, Z] = Σy y pY | X,Z (y | X, Z)
            = Σy y P(Y = y, X = x, Z = z) / P(X = x, Z = z)          (1)
Notice that this is a function of X and Z. Let h(X, Z) = E[Y | X, Z]. So now:
E[E[Y | X, Z] | X] = E[h(X, Z) | X]
                   = Σz h(X, z) pZ | X (z | X)
                   = Σz h(X, z) P(X = x, Z = z) / P(X = x)           (2)
Substituting h(X, Z) = E[Y | X, Z] from equation (1) into equation (2) gives:
E[E[Y | X, Z] | X] = Σz Σy y [P(Y = y, X = x, Z = z) / P(X = x, Z = z)] · [P(X = x, Z = z) / P(X = x)]
                   = Σz Σy y P(Y = y, X = x, Z = z) / P(X = x)
The summation over y can be pulled out to get
E[E[Y | X, Z] | X] = Σy y Σz pY,X,Z (y, x, z) / pX (x)
Just as Σx pY,X (y, x) = pY (y), the summation Σz pY,X,Z (y, x, z) = pY,X (y, x). This
means that the above now simplifies to:
E[E[Y | X, Z] | X] = Σy y pY,X (y, x) / pX (x) = Σy y pY | X (y | x) = E[Y | X]
which completes the proof.
(c) E[E[Y | X] | X, Z] = E[Y | X]
Solution:
Since E[Y | X] is a function of X, we let g(X) = E[Y | X]. Now:
E[E[Y | X] | X, Z] = E[g(X) | X, Z]
By the pull-through property from part (a) (applied with conditioning on (X, Z) and with Y = 1), this simplifies to:
E[E[Y | X] | X, Z] = g(X)E[1 | X, Z]
= g(X) · 1
= E[Y | X]
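The tower property can be spot-checked on a small arbitrary joint PMF (the weights below are made up purely for the test):

```python
from fractions import Fraction as F
from itertools import product

# Arbitrary joint PMF over (X, Y, Z) in {0,1}^3; weights sum to 32/32 = 1.
vals = (0, 1)
w = {(x, y, z): F(1 + x + 2 * y + 3 * z, 32) for x, y, z in product(vals, repeat=3)}
assert sum(w.values()) == 1

def E_Y_given_XZ(x, z):
    num = sum(y * w[x, y, z] for y in vals)
    den = sum(w[x, y, z] for y in vals)
    return num / den

for x in vals:
    pX = sum(w[x, y, z] for y in vals for z in vals)
    # LHS: average the inner conditional expectation over Z, given X = x.
    lhs = sum(E_Y_given_XZ(x, z) * sum(w[x, y, z] for y in vals)
              for z in vals) / pX
    # RHS: E[Y | X = x] computed directly.
    rhs = sum(y * w[x, y, z] for y in vals for z in vals) / pX
    assert lhs == rhs    # E[E[Y | X, Z] | X] = E[Y | X]
```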
Optional Questions
Here we provide only the final solution to these optional questions. In some cases, hints are also
provided. In case you are unable to calculate the given answer, please contact the TAs or the
instructor.
1. (Chapter 2 Problem 18): PMF of the minimum of several random variables. On
any given day your golf score is an integer between 101 and 110, each with probability 0.1.
Determined to improve your score, you decide to play on three different days and declare as
your score the minimum X of the scores X1 , X2 and X3 on the three days.
(a) Calculate the PMF of X.
Solution:
See textbook.
(b) By how much has your expected score improved as a result of playing on three days?
Solution:
See textbook.
2. A sample of 3 items is selected at random from a box containing 20 items of which 4 are
defective. Find the expected number of defective items in the sample.
Solution:
pX (k) = C(4, k) C(16, 3 − k) / C(20, 3)   for 0 ≤ k ≤ 3
where X is the number of defective items in the sample. Using this we have:
E[X] = 0.6
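A short check of the hypergeometric PMF and its mean:

```python
from fractions import Fraction as F
from math import comb

# X = number of defectives in a sample of 3 from 20 items, 4 defective.
pX = {k: F(comb(4, k) * comb(16, 3 - k), comb(20, 3)) for k in range(4)}
assert sum(pX.values()) == 1

EX = sum(k * pX[k] for k in pX)
assert EX == F(3, 5)    # 0.6, consistent with n * K/N = 3 * 4/20
```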
3. A newsboy purchases papers at 10 cents and sells them at 15 cents. However, he is not allowed
to return unsold papers. If his daily demand is a binomial random variable with n = 10 and
p = 1/3, approximately how many papers should he purchase so as to maximize his expected
profit?
Solution:
Let X be the demand and Y be the profit. X is a binomial r.v. with n = 10 and p = 1/3.
If he purchases a papers, then
Y = 15X − 10a   if X < a
    5a          if X ≥ a
Using the total expectation theorem, we can find the value of a which maximizes E[Y ] (this
might be done graphically). The value of a that maximizes his expected profit is a = 3.
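The optimal a can be found by brute force over a = 0, …, 10:

```python
from fractions import Fraction as F
from math import comb

n, p = 10, F(1, 3)
pX = {x: comb(n, x) * p ** x * (1 - p) ** (n - x) for x in range(n + 1)}

def expected_profit(a):
    # Profit is 15X - 10a if demand X < a, else 5a (all a papers sold).
    return sum(pX[x] * ((15 * x - 10 * a) if x < a else 5 * a)
               for x in range(n + 1))

best = max(range(n + 1), key=expected_profit)
assert best == 3
print(best, float(expected_profit(best)))
```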
4. A certain airline company, having observed that five percent of the people making reservations
on a flight do not show up for the flight, sells one hundred tickets for a plane that has only
ninety-five seats.
(a) What is the exact probability that there will be a seat available for every person who
shows up for the flight?
Solution:
P(seat for every person)=0.564
(b) Recalculate the above probability using a Poisson approximation.
Solution:
Using the Poisson approximation we get P(seat for every person)=0.56
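Both probabilities can be computed directly. A seat is available for everyone iff at least 5 of the 100 reservation holders fail to show up:

```python
from math import comb, exp, factorial

# X = number of no-shows among 100 reservations, Binomial(100, 0.05).
exact = 1 - sum(comb(100, k) * 0.05 ** k * 0.95 ** (100 - k)
                for k in range(5))

# Poisson approximation with lambda = 100 * 0.05 = 5.
approx = 1 - sum(exp(-5) * 5 ** k / factorial(k) for k in range(5))

print(round(exact, 3), round(approx, 2))   # 0.564 0.56
assert abs(exact - 0.564) < 1e-3
assert abs(approx - 0.56) < 5e-3
```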
5. Suppose two million lottery tickets are issued with 100 winning tickets among them.
(a) If a person purchases 100 tickets, what is the probability he will win the lottery?
Solution:
The probability of winning ≈ 0.005.
(b) How many tickets should you buy so that the probability that you have a winning ticket
is greater than 95%?
Solution:
You need to buy approximately 60,000 tickets to be more than 95% confident of having
a winning ticket.
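Both answers can be reproduced by treating tickets as approximately independent, which the large population (2,000,000 tickets, 100 winners) justifies:

```python
from math import log

p_win = 100 / 2_000_000    # each ticket wins w.p. 1/20000

# (a) P(at least one winner among 100 tickets).
p100 = 1 - (1 - p_win) ** 100
assert abs(p100 - 0.005) < 2e-4
print(round(p100, 4))

# (b) Smallest n with 1 - (1 - p_win)^n >= 0.95.
n = log(0.05) / log(1 - p_win)
print(round(n))            # on the order of 60,000 tickets
assert 59000 < n < 61000
```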