ISyE 6739 — Test 2 Solutions — Summer 2012
This test is 100 minutes long. You are allowed two cheat sheets. Only write final answers!
All parts of all questions are 3 points each.
1. A box contains 2 red, 3 black, and 5 blue sox. Suppose 6 sox are selected one
at a time with replacement. Let X denote the number of blue sox drawn.
(a) Name the distribution of X (include parameter values).
Solution: Bin(6, 0.5)
♦
(b) Find the probability that X = 4.
Solution: P(X = 4) = (6 choose 4) (0.5)^4 (0.5)^2 = 15/64. ♦
(c) Now suppose that you had instead sampled 6 sox without replacement, and
let Y denote the number of blues you get. Find the probability that Y = 4
(you do not need to simplify your solution).
Solution: Now we’re dealing with a hypergeometric distribution. In this case,
P(Y = 4) = (5 choose 4)(5 choose 2) / (10 choose 6). ♦
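As an optional numerical check (not part of the graded answer), both probabilities can be reproduced with SciPy, assuming it is available:

    # Sketch: check Questions 1(b) and 1(c); assumes SciPy is installed.
    from scipy.stats import binom, hypergeom

    print(binom.pmf(4, 6, 0.5))        # (b) Bin(6, 0.5): 15/64 = 0.234375
    # (c) hypergeom.pmf(k, M, n, N): population M = 10, n = 5 blue, sample size N = 6.
    print(hypergeom.pmf(4, 10, 5, 6))  # (5 choose 4)(5 choose 2)/(10 choose 6) ≈ 0.238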
2. TRUE or FALSE? In ISyE 6739, we have
(d/dt) E[e^{tX}] |_{t=0} = E[X].
Solution: TRUE (at least in this class), since E[e^{tX}] is the moment generating
function. ♦
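To illustrate the m.g.f. fact on a concrete case, here is a small symbolic sketch (assuming SymPy is available) using X ∼ Exp(λ), whose m.g.f. is λ/(λ − t) for t < λ:

    # Sketch: d/dt E[e^{tX}] at t = 0 recovers E[X], shown for X ~ Exp(lambda).
    import sympy as sp

    t, lam = sp.symbols('t lambda', positive=True)
    mgf = lam / (lam - t)                 # m.g.f. of Exp(lambda), valid for t < lambda
    print(sp.diff(mgf, t).subs(t, 0))     # 1/lambda, which is E[X]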
3. Suppose X has p.d.f. f (x) = 1/4, −1 ≤ x ≤ 3.
(a) Name the distribution of X (with parameters).
Solution: Unif(−1, 3).
♦
2
(b) Find the p.d.f. of W = |X − 1|.
Solution: The c.d.f. of W is
G(w) ≡ P(W ≤ w) = P(|X − 1| ≤ w) = P(−w ≤ X − 1 ≤ w) = P(1 − w ≤ X ≤ 1 + w)
     = ∫_{1−w}^{1+w} f(x) dx = ∫_{1−w}^{1+w} (1/4) dx = w/2, if 0 ≤ w ≤ 2.
Thus, the p.d.f. of W is g(w) = (d/dw) G(w) = 1/2 for 0 ≤ w ≤ 2; and so
W ∼ Unif(0, 2). You probably could’ve also gotten this answer via an
intuitive argument. ♦
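A quick simulation sketch (assuming NumPy) that backs up the Unif(0, 2) answer:

    # Sketch: simulate W = |X - 1| with X ~ Unif(-1, 3); assumes NumPy.
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 3, size=10**6)
    w = np.abs(x - 1)
    print(w.mean(), w.max())      # ~1.0 and ~2.0, matching Unif(0, 2)
    print(np.mean(w <= 1))        # ~0.5, matching G(1) = 1/2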
4. If U ∼ Unif(0, 1), find the p.d.f. of Y = −(1/4) ln(U).
Solution: From class notes (on the Inverse Transform Theorem), Y ∼ Exp(4).
Thus, the desired p.d.f. is f_Y(y) = 4e^{−4y} for y > 0. ♦
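A minimal simulation sketch (assuming NumPy) of the inverse-transform claim:

    # Sketch: Y = -(1/4) ln(U) should behave like Exp(rate 4); assumes NumPy.
    import numpy as np

    rng = np.random.default_rng(1)
    u = rng.uniform(size=10**6)
    y = -0.25 * np.log(u)
    print(y.mean(), y.var())      # mean ~0.25 and variance ~0.0625, matching Exp(4)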
5. Suppose that X has p.d.f. f (x) = 2x, 0 ≤ x ≤ 1.
(a) Find F (x), the c.d.f. of X.
Solution: F(x) = x^2, for 0 ≤ x ≤ 1.
♦
(b) Find the p.d.f. of F (X).
Solution: By the Inverse Transform Theorem, Y ≡ F (X) ∼ Unif(0, 1).
Thus, the p.d.f. of Y is g(y) = 1 for 0 ≤ y ≤ 1. ♦
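And a companion simulation sketch (again assuming NumPy) for the probability integral transform F(X) ∼ Unif(0, 1):

    # Sketch: if X has c.d.f. F(x) = x^2 on [0, 1], then F(X) should be Unif(0, 1).
    import numpy as np

    rng = np.random.default_rng(2)
    x = np.sqrt(rng.uniform(size=10**6))   # generates X via the inverse transform
    y = x**2                               # Y = F(X)
    print(y.mean(), y.var())               # ~0.5 and ~1/12 ≈ 0.0833, like Unif(0, 1)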
6. Suppose X and Y are discrete random variables with the following joint p.m.f.,
where any letters denote probabilities that you might need to figure out.
f(x, y)      X = −3   X = 0   X = 5   P(Y = y)
Y = 1.6        0.1      0.1     a        0.3
Y = 27          b        c      0.3       d
P(X = x)        e       0.2      f        g
(a) What is the value of e?
Solution: e = 0.4. ♦
(b) Find P(Y ≤ 10).
Solution: P(Y ≤ 10) = P(Y = 1.6) = 0.3, read directly from the table. ♦
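For the curious, a plain-Python sketch that fills in every letter of the table from the row and column constraints (the variable names simply mirror the table):

    # Sketch: solve for the unknown entries of the joint p.m.f. table.
    a = 0.3 - 0.1 - 0.1        # Y = 1.6 row sums to its marginal 0.3
    c = 0.2 - 0.1              # X = 0 column sums to its marginal 0.2
    f = a + 0.3                # X = 5 column marginal
    e = 1 - 0.2 - f            # X marginals sum to 1
    b = e - 0.1                # X = -3 column
    d = b + c + 0.3            # Y = 27 row marginal
    g = 1.0                    # grand total
    print([round(v, 2) for v in (a, b, c, d, e, f, g)])   # [0.1, 0.3, 0.1, 0.7, 0.4, 0.4, 1.0]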
7. Suppose that f (x, y) = 6x, for 0 ≤ x ≤ y ≤ 1.
(a) Find P(X < 1/2 and Y < 1/2).
Solution:
P(X < 1/2 and Y < 1/2) = ∫_0^{1/2} ∫_0^y f(x, y) dx dy = ∫_0^{1/2} ∫_0^y 6x dx dy = 1/8. ♦
(b) Find the marginal p.d.f. of X.
Solution: f_X(x) = ∫_x^1 f(x, y) dy = ∫_x^1 6x dy = 6x(1 − x), for 0 ≤ x ≤ 1.
♦
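Both parts can be checked symbolically; a sketch assuming SymPy is available:

    # Sketch: verify Question 7 with symbolic integration; assumes SymPy.
    import sympy as sp

    x, y = sp.symbols('x y')
    f = 6*x                                   # joint p.d.f. on 0 <= x <= y <= 1

    # (a) inner integral over x from 0 to y, outer over y from 0 to 1/2.
    print(sp.integrate(f, (x, 0, y), (y, 0, sp.Rational(1, 2))))   # 1/8

    # (b) marginal of X: integrate out y from x to 1.
    print(sp.integrate(f, (y, x, 1)))         # equals 6x(1 - x)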
8. Suppose that the marginal p.d.f. of X is f_X(x) = 6x(1 − x), for 0 ≤ x ≤ 1, and the
conditional p.d.f. of Y given X = x is f(y|x) = 1/(1 − x), for 0 ≤ x ≤ y ≤ 1.
(a) Find E[Y |X = x].
Solution:
E[Y |X = x] = ∫_x^1 y f(y|x) dy = ∫_x^1 y/(1 − x) dy = (1 + x)/2, for 0 ≤ x ≤ 1. ♦
(b) Find E[ E[Y |X] ].
Solution: By the Law of the Unconscious Statistician,
E[ E[Y |X] ] = ∫_0^1 E[Y |x] f_X(x) dx = ∫_0^1 [(1 + x)/2] · 6x(1 − x) dx = 3/4. ♦
Note that f(x, y) = f_X(x) f(y|x) = 6x, for 0 ≤ x ≤ y ≤ 1. This is simply
the joint p.d.f. from Question 7! Now we can check our answer for E[ E[Y |X] ]
(because we have so much free time on our hands). First of all, the marginal
p.d.f. of Y is
f_Y(y) = ∫_0^y f(x, y) dx = ∫_0^y 6x dx = 3y^2, for 0 ≤ y ≤ 1.
Then E[Y] = ∫_0^1 y f_Y(y) dy = ∫_0^1 3y^3 dy = 3/4.
Finally, by double expectation, E[ E[Y |X] ] = E[Y] = 3/4, so the check works!
♦
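The same check goes through symbolically as well (a sketch, assuming SymPy):

    # Sketch: E[ E[Y|X] ] = integral of E[Y|x] * f_X(x) over [0, 1]; assumes SymPy.
    import sympy as sp

    x = sp.symbols('x')
    fX = 6*x*(1 - x)                  # marginal p.d.f. of X
    EY_given_x = (1 + x)/2            # conditional mean from part (a)
    print(sp.integrate(EY_given_x * fX, (x, 0, 1)))   # 3/4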
9. Three TRUE/FALSE questions. The RVs X and Y are independent if. . .
(a) f(y|x) = f_X(x) or f(x|y) = f_Y(y) for all x, y.
Solution: FALSE.
♦
(b) E[XY ] = E[X] · E[Y ].
Solution: FALSE. (This is necessary but not sufficient.)
♦
(c) f(x, y) = cxy/(1 + y^2), for 0 ≤ x ≤ 1 and 0 ≤ y ≤ 2, for some appropriate
constant c.
Solution: TRUE, since you can factor f(x, y) = a(x)b(y) for all x, y. ♦
10. Suppose buses show up at the bus stop randomly according to a Poisson process
with a rate of 3 per hour. Let’s suppose that I also show up at the stop randomly.
What is my expected waiting time?
Solution: By the memoryless property of the exponential distribution, my waiting
time will be Exp(3/hr). Thus, my expected wait is 1/3 hour, i.e., 20 minutes. ♦
11. The coefficient of variation of a random variable X is defined as
CV(X) ≡ √Var(X)/E[X]. Find CV(X) when X ∼ Exp(λ).
Solution: √Var(X)/E[X] = (1/λ)/(1/λ) = 1. ♦
12. Suppose that the number of typographical errors in a book is Poisson with a rate
of 0.75 per page. Find the probability that there will be a total of exactly 1 typo
on Pages 221–222 of the book.
Solution: Suppose Y is the number of typos on Pages 221–222. Then Y ∼ Pois(1.5),
since the rate is 0.75/page × 2 pages; and so
P(Y = 1) = e^{−1.5}(1.5)^1 / 1! = 0.335. ♦
13. It’s raining cats and dogs! The number of dogs that drop down from the sky is
Pois(0.5/hr). The number of cats that drop is Pois(1.0/hr) (amazingly, all of the
cats land on their feet). Assuming that cats and dogs are independent, what’s the
probability that exactly one of these fine animals drops from the sky in the next
hour?
Solution: Independent Poissons add up, so the number of animals dropping in the
next hour is Y ∼ Pois(1.5). Then P(Y = 1) = e^{−1.5}(1.5)^1 / 1! = 0.335. ♦
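Questions 12 and 13 both boil down to the same Pois(1.5) probability; a one-line check (assuming SciPy):

    from scipy.stats import poisson
    print(poisson.pmf(1, 1.5))     # e^{-1.5} * 1.5 / 1! ≈ 0.3347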
14. Drivers arrive at a parking lot according to a Poisson process at the rate of
10/hour. What is the probability that the time between the 18th and 19th arrivals
will be less than 5 minutes?
Solution: Let X be the time between the 18th and 19th arrivals. Then by remarks
in class, we know that the times between consecutive arrivals are i.i.d. Exp(10/hr),
so that P(X < 5 minutes) = P(X < (1/12) hour) = 1 − e^{−λt} = 1 − e^{−10/12} = 0.565.
♦
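As a numerical aside (assuming SciPy, which parameterizes the exponential by scale = 1/rate):

    from scipy.stats import expon
    # Interarrival time is Exp(rate 10/hr); 5 minutes = 1/12 hour.
    print(expon.cdf(1/12, scale=1/10))    # 1 - e^{-10/12} ≈ 0.565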
15. The failure rate of a positive random variable X can be regarded as the instantaneous rate of death — that is, the rate of death, given that the person (or light
bulb) has survived until time x. It’s formally defined as f (x)/(1 − F (x)), where
f (x) and F (x) are the p.d.f. and c.d.f. of X. What is the failure rate if X ∼ Exp(λ)?
Solution: λ (constant).
♦
16. Suppose X1 and X2 are i.i.d. Bernoulli(p) random variables, which represent the
functionality of two network components. Think of a signal passing through
a network, where Xi = 1 if the signal can successfully get through component i, for i = 1, 2 (and Xi = 0 if the signal is unsuccessful). Let’s consider
two set-ups: (A) X1 and X2 have p = 0.8 and are hooked up in a series so
that a signal getting through the network has to pass through components
1 AND 2. (B) X1 and X2 have p = 0.5 and are hooked up in parallel so
that a signal getting through the network has to pass through components 1 OR
2. Which set-up is more reliable, i.e., more likely to permit a signal to pass through?
Solution: P(A) = P(X1 = 1 ∩ X2 = 1) = P(X1 = 1)P(X2 = 1) = 0.64.
Meanwhile,
P(B) = P(X1 = 1 ∪ X2 = 1) = P(X1 = 1) + P(X2 = 1) − P(X1 = 1 ∩ X2 = 1)
= P(X1 = 1) + P(X2 = 1) − P(X1 = 1)P(X2 = 1) = 0.5 + 0.5 − (0.5)(0.5) = 0.75.
Therefore, (B) is more reliable.
♦
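The two reliabilities in plain Python, just to make the comparison explicit:

    p_series = 0.8 * 0.8                     # set-up (A): both components must work
    p_parallel = 1 - (1 - 0.5) * (1 - 0.5)   # set-up (B): at least one component works
    print(round(p_series, 2), p_parallel)    # 0.64 0.75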
17. TRUE or FALSE? If X is any normal random variable, then about 99.7% of all
observations from X will fall within three standard deviations of the mean.
Solution: TRUE.
♦
18. TRUE or FALSE? The normal quantile value Φ^{−1}(0.975) = 1.96.
Solution: TRUE.
♦
19. Suppose X ∼ Nor(µ, σ^2). Find P(−1 ≤ (X − µ)/σ ≤ 1).
Solution: P(−1 ≤ Z ≤ 1) = 2Φ(1) − 1 = 0.6826.
♦
20. Suppose X and Y are the scores that an incoming UGA student will receive, respectively, on the verbal and math portions of the SAT test. Further suppose that
X and Y are both Nor(400, 4000) and that Cov(X, Y ) = 1000. Find the probability
that the total score, X +Y , will exceed 900. (You can assume that X +Y is normal.)
Solution: Note that E[X + Y ] = 800 and
Var(X + Y ) = Var(X) + Var(Y ) + 2Cov(X, Y ) = 10000.
Therefore, X + Y ∼ Nor(800, 10000). This implies that
P(X + Y > 900) = P(Z > (900 − 800)/√10000) = P(Z > 1) = 0.1587. ♦
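A quick check of the normal tail probability (assuming SciPy):

    from math import sqrt
    from scipy.stats import norm

    mean = 400 + 400
    var = 4000 + 4000 + 2*1000                            # Var(X) + Var(Y) + 2 Cov(X, Y)
    print(1 - norm.cdf(900, loc=mean, scale=sqrt(var)))   # P(X + Y > 900) ≈ 0.1587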
21. If X1 , . . . , X200 are i.i.d. from some distribution with mean 2 and variance 200,
find the approximate probability that the sample mean X̄ is between 1 and 3.
Solution: By the Central Limit Theorem, we have X̄ ≈ Nor(2, 1). Thus,
P(1 ≤ X̄ ≤ 3) ≈ P(−1 ≤ Z ≤ 1) = 2Φ(1) − 1 = 2(0.8413) − 1 = 0.6826. ♦
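The same normal-table arithmetic via SciPy (an optional check):

    from scipy.stats import norm
    # CLT: the sample mean is approximately Nor(2, 200/200) = Nor(2, 1).
    print(norm.cdf(3, loc=2, scale=1) - norm.cdf(1, loc=2, scale=1))   # ≈ 0.6827; the table rounds this to 0.6826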
22. Find χ^2_{0.05,5}.
Solution: 11.07. ♦
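SciPy's quantile function uses the lower-tail probability, so χ^2_{0.05,5} corresponds to the 0.95 quantile (a check, assuming SciPy):

    from scipy.stats import chi2
    print(chi2.ppf(0.95, 5))    # ≈ 11.07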
23. Suppose T ∼ t(362). What’s P(T < 1.645)?
Solution: Because of the high d.f., P(T < 1.645) ≈ P(Z < 1.645) = 0.95.
♦
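A check of the large-d.f. approximation (assuming SciPy):

    from scipy.stats import norm, t
    print(t.cdf(1.645, df=362), norm.cdf(1.645))   # both ≈ 0.95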
24. Suppose that X1 , X2 , . . . , Xn are i.i.d. Weibull(α, β) and T (X) is an unbiased
estimator for α. What is E[T (X)]?
Solution: α (by definition).
♦
Table 1: Standard normal values

    z          1        1.28     1.5      1.645    1.96     2
    P(Z ≤ z)   0.8413   0.9000   0.9332   0.9500   0.9750   0.9773

Table 2: χ^2_{α,ν} values

    ν\α    0.975   0.95   0.90   0.50   0.10    0.05    0.025
    3      0.22    0.35   0.58   2.37   6.25    7.81    9.35
    4      0.48    0.71   1.06   3.36   7.78    9.49    11.14
    5      0.83    1.15   1.61   4.35   9.24    11.07   12.83
    6      1.24    1.64   2.20   5.35   10.65   12.59   14.45

Table 3: t_{α,ν} values

    ν\α    0.10    0.05    0.025
    7      1.415   1.895   2.365
    8      1.397   1.860   2.306
    9      1.383   1.833   2.262
    10     1.372   1.812   2.228

Table 4: F_{0.025,n,m} values

    m\n    3       4       5
    3      15.44   15.10   14.88
    4      9.98    9.60    9.36
    5      7.76    7.39    7.15