ISyE 6739 — Practice Test #2 Solutions
Summer 2011
1. Suppose that X is a random variable with mean E[X] = −3. If MX (t) is the
moment generating function of X, what is (d/dt)MX (t)|t=0 ?
Solution: −3.
♦
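As a quick numerical check, differentiating an m.g.f. at t = 0 recovers the mean. The sketch below assumes, purely for illustration, that X ∼ Nor(−3, 1), whose m.g.f. is the standard expression exp(−3t + t²/2):

```python
import math

# M.g.f. of a Nor(-3, 1) random variable (an illustrative choice with E[X] = -3):
# M_X(t) = exp(-3t + t^2/2)
def mgf(t):
    return math.exp(-3.0 * t + 0.5 * t * t)

# A central finite difference approximates (d/dt) M_X(t) at t = 0.
h = 1e-6
derivative_at_zero = (mgf(h) - mgf(-h)) / (2.0 * h)
print(derivative_at_zero)  # close to E[X] = -3
```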
2. If X has m.g.f. MX (t) = 0.2et + 0.8, name the distribution of X (with parameter
value(s)).
Solution: Bern(0.2) or Bin(1, 0.2).
♦
3. Consider the joint p.m.f. of (X, Y ) for which f (0, 0) = 0.3, f (0, 1) = 0.1,
f (0, 2) = 0.2, f (1, 0) = 0.2, f (1, 1) = 0.2, and f (1, 2) = 0. Find Pr(Y = 1).
Solution: First of all, the joint p.m.f. and its marginals can be written as follows:

f (x, y)   x = 0   x = 1   fY (y)
y = 0       0.3     0.2     0.5
y = 1       0.1     0.2     0.3
y = 2       0.2     0       0.2
fX (x)      0.6     0.4     1

Then Pr(Y = 1) = fY (1) = 0.3.
♦
4. Again consider the joint p.m.f. from Question 3. Are X and Y independent?
Solution: No. For example, f (1, 2) = 0 ≠ fX (1)fY (2) = 0.08.
♦
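The marginal and independence computations in Questions 3 and 4 can be verified mechanically (the x-values 0 and 1 match the solution to Question 4):

```python
# Joint p.m.f. from Questions 3-4, stored as {(x, y): probability}.
f = {(0, 0): 0.3, (0, 1): 0.1, (0, 2): 0.2,
     (1, 0): 0.2, (1, 1): 0.2, (1, 2): 0.0}

# Marginals: sum the joint p.m.f. over the other variable.
fX = {x: sum(p for (a, b), p in f.items() if a == x) for x in (0, 1)}
fY = {y: sum(p for (a, b), p in f.items() if b == y) for y in (0, 1, 2)}

print(fY[1])  # Pr(Y = 1) = 0.3

# Independence check: f(x, y) must equal fX(x) * fY(y) for every cell.
independent = all(abs(f[(x, y)] - fX[x] * fY[y]) < 1e-12
                  for x in (0, 1) for y in (0, 1, 2))
print(independent)  # False: f(1, 2) = 0 but fX(1) * fY(2) = 0.08
```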
5. True or False? If X and Y are independent, then the conditional p.d.f.
f (y|x) = fX (x) for all x and y.
Solution: False. It should be f (x|y) = fX (x) for all x and y.
♦
6. Suppose that X and Y have joint p.d.f. f (x, y) = 3x, 0 < y < x < 1. Find fX (x),
the marginal p.d.f. of X.
Solution: fX (x) = ∫_{−∞}^{∞} f (x, y) dy = ∫_0^x 3x dy = 3x², 0 < x < 1.
♦
7. Again consider the joint p.d.f. f (x, y) from Question 6. Find Cov(X, Y ).
Solution: By Question 6, we have fX (x) = 3x², 0 < x < 1.
Thus, E[X] = ∫_0^1 x · 3x² dx = 3/4.
Similarly, fY (y) = ∫_y^1 3x dx = (3/2)(1 − y²), 0 < y < 1.
Thus, E[Y ] = ∫_0^1 y · (3/2)(1 − y²) dy = 3/8.
In addition, E[XY ] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy f (x, y) dy dx = ∫_0^1 ∫_0^x xy · 3x dy dx = 3/10.
Finally, we have Cov(X, Y ) = E[XY ] − E[X] E[Y ] = 3/10 − 9/32 = 3/160 = 0.01875.
♦
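A midpoint Riemann sum over the triangular support gives a numerical sanity check of these moments (a rough sketch; the grid size n is an arbitrary choice):

```python
# Numerically check Cov(X, Y) = 3/160 for f(x, y) = 3x on 0 < y < x < 1,
# using a midpoint Riemann sum over the triangular support.
n = 1000
h = 1.0 / n
EX = EY = EXY = 0.0
for i in range(n):
    x = (i + 0.5) * h
    for j in range(n):
        y = (j + 0.5) * h
        if y < x:                      # restrict to the support
            w = 3.0 * x * h * h        # f(x, y) times the cell area
            EX += x * w
            EY += y * w
            EXY += x * y * w

cov = EXY - EX * EY
print(EX, EY, EXY, cov)  # approximately 3/4, 3/8, 3/10, 3/160
```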
8. Yet again consider the joint p.d.f. f (x, y) from Question 6. Find the conditional
p.d.f. f (x|y).
Solution: f (x|y) = f (x, y)/fY (y) = 3x / [(3/2)(1 − y²)] = 2x/(1 − y²), 0 < y < x < 1.
♦
9. Suppose that the joint p.d.f. of X and Y is f (x, y) = c/(1 + x + y), for 0 < x < 1
and 0 < y < 1 and some appropriate constant c. Are X and Y independent?
Solution: No. You can’t factor f (x, y) into a function of x alone times a function of y alone.
♦
10. Suppose that the questions on a test are i.i.d. in the sense that you will be able to
answer any question correctly with probability 0.9. What is the expected number
of questions that you will do until you make your second incorrect answer?
Solution: Let W = number of questions until the second error. Then W ∼ NegBin(2, 0.1). (You can also write W as the sum of two geometric random variables if you don’t recall the Negative Binomial distribution.) Then E[W ] = r/p = 2/0.1 = 20. ♦
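A short Monte Carlo sketch of this expectation (the replication count is arbitrary):

```python
import random

random.seed(42)

# Simulate the number of questions answered until the second error,
# where each question is answered incorrectly with probability 0.1.
def questions_until_second_error(p_error=0.1):
    count = errors = 0
    while errors < 2:
        count += 1
        if random.random() < p_error:
            errors += 1
    return count

n_reps = 100_000
avg = sum(questions_until_second_error() for _ in range(n_reps)) / n_reps
print(avg)  # close to E[W] = r/p = 2/0.1 = 20
```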
11. True or False? E[X²] = E[ E[X² | Y ] ].
Solution: True.
♦
12. Suppose that E[X] = 3, E[Y ] = 4, Var(X) = 4, Var(Y ) = 16, and Cov(X, Y ) = −4.
Find Var(Y − X).
Solution: Var(Y − X) = Var(Y ) + Var(X) − 2Cov(X, Y ) = 28.
♦
13. Again suppose that E[X] = 3, E[Y ] = 4, Var(X) = 4, Var(Y ) = 16, and
Cov(X, Y ) = −4. Find Corr(X, Y ).
Solution: Corr(X, Y ) = Cov(X, Y )/√(Var(X)Var(Y )) = −4/√64 = −1/2.
♦
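The arithmetic in Questions 12 and 13 in a few lines:

```python
import math

# Given moments from Questions 12-13.
var_x, var_y, cov_xy = 4.0, 16.0, -4.0

# Var(Y - X) = Var(Y) + Var(X) - 2 Cov(X, Y): the covariance enters with
# a minus sign even though the two variances add.
var_diff = var_y + var_x - 2.0 * cov_xy
print(var_diff)  # 28.0

# Corr(X, Y) = Cov(X, Y) / sqrt(Var(X) Var(Y))
corr = cov_xy / math.sqrt(var_x * var_y)
print(corr)  # -0.5
```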
14. True or False? If Cov(X, Y ) = 0, then X and Y are independent random variables.
Solution: False. See counterexample in notes.
♦
15. If X and Y are independent Bern(p) random variables, find E[(X − Y )2 ].
Solution: Since X and Y are independent, and all Bern(p) moments are equal to
p, we have
E[(X − Y )²] = E[X² − 2XY + Y ²] = E[X²] − 2E[X]E[Y ] + E[Y ²] = p − 2p² + p = 2p(1 − p). ♦
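Since X and Y take only the values 0 and 1, the expectation can be checked by brute-force enumeration (p = 0.3 is an arbitrary illustration):

```python
# Exhaustive check of E[(X - Y)^2] = 2p(1 - p) for independent Bern(p)
# variables: sum (x - y)^2 * Pr(X = x) * Pr(Y = y) over all four outcomes.
def expected_squared_diff(p):
    pmf = {0: 1.0 - p, 1: p}
    return sum((x - y) ** 2 * pmf[x] * pmf[y]
               for x in (0, 1) for y in (0, 1))

p = 0.3
print(expected_squared_diff(p), 2 * p * (1 - p))  # both equal 2p(1 - p) = 0.42
```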
16. I am a discrete distribution with the memoryless property. Name me.
Solution: Geometric(p).
♦
17. I am the sum of 4 i.i.d. Exp(λ = 3) random variables. Name me (including
parameter(s)).
Solution: Erlang_{k=4}(λ = 3).
♦
18. True or False? A Poisson process is a special kind of counting process.
Solution: True.
♦
19. I’m a continuous distribution used to model stock prices. Name me.
Solution: lognormal.
♦
20. The number of defects on a length of wire is a Poisson process with rate
λ = 0.5/meter. Find the probability that we will find exactly 2 defects on a certain
2-meter section of the wire.
Solution: Let X denote the number of defects on a 2-meter section. Then
X ∼ Pois(1), and Pr(X = 2) = e^{−1}(1)²/2! = 0.1839. ♦
21. Customers arrive at a store according to a Poisson process at the rate of 5/hour.
Name the distribution (with parameter(s)) of the time between the 4th and 5th
arrivals.
Solution: Exp(λ = 5).
♦
22. What kind of random variables does the Box-Muller method generate?
Solution: Independent standard normal, i.e., Nor(0, 1), random variables.
♦
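A minimal Box-Muller sketch, checking empirically that the outputs have mean near 0 and variance near 1:

```python
import math
import random

random.seed(1)

# Box-Muller: two independent U(0,1) draws are transformed into two
# independent standard normal (Nor(0, 1)) draws.
def box_muller():
    u1 = 1.0 - random.random()   # in (0, 1], so log(u1) is defined
    u2 = random.random()
    r = math.sqrt(-2.0 * math.log(u1))
    return r * math.cos(2.0 * math.pi * u2), r * math.sin(2.0 * math.pi * u2)

samples = [z for _ in range(50_000) for z in box_muller()]
mean = sum(samples) / len(samples)
var = sum(z * z for z in samples) / len(samples) - mean ** 2
print(mean, var)  # near 0 and 1, respectively
```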
23. Suppose that Z is a standard normal random variable, having c.d.f. Φ(z). What is
the value of Φ−1 (0.9)? (You may want to look at the following table.)
z       Pr(Z ≤ z)
1       0.8413
1.28    0.9000
1.5     0.9332
1.645   0.9500
1.96    0.9750
2       0.9773
Solution: Φ−1 (0.9) is that value of z such that Pr(Z ≤ z) = 0.9. The table shows
that the desired value is z = 1.28. ♦
24. Suppose that Z ∼ Nor(0, 1). Find Pr(−1.5 ≤ Z ≤ 1.5).
Solution: From the table, we have Pr(−1.5 ≤ Z ≤ 1.5) = 2Φ(1.5) − 1 =
2(0.9332) − 1 = 0.8664. ♦
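Instead of a table lookup, Φ can be computed from the error function via the standard identity Φ(z) = (1 + erf(z/√2))/2 (available as Python's math.erf):

```python
import math

# Standard normal c.d.f. via the error function:
# Phi(z) = (1 + erf(z / sqrt(2))) / 2.
def phi(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Reproduces the table-based computation from Question 24.
prob = 2.0 * phi(1.5) - 1.0
print(prob)  # approximately 0.8664
```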
25. Suppose that the diameter of a ball bearing is D ∼ Nor(4, 1), where the units are
in centimeters. Find the probability that a particular ball bearing will be greater
than 5 cm.
Solution: Pr(D > 5) = Pr(Z > (5 − 4)/√1) = Pr(Z > 1) = 0.1587.
♦
26. The amount of time to perform Job A is Nor(10, 2), and the amount of time to
perform Job B is Nor(12, 2). Assuming that the times A and B are independent,
find Pr(A > B).
Solution: Since A and B are independent, we have A − B ∼ Nor(10 − 12, 2 + 2) = Nor(−2, 4).
Then Pr(A > B) = Pr(A − B > 0) = Pr(Nor(−2, 4) > 0) = Pr(Z > (0 − (−2))/√4) = Pr(Z > 1) = 0.1587. ♦

27. Consider i.i.d. dice tosses X1 , X2 , . . . , X100 , and let the sample mean
X̄ ≡ (1/100) Σ_{i=1}^{100} Xi . What is the variance of X̄?
Solution: First of all,

Var(Xi ) = E[Xi²] − (E[Xi ])² = (1² + 2² + 3² + 4² + 5² + 6²)/6 − (3.5)² = 35/12 = 2.9167.

Then Var(X̄) = Var(Xi )/100 = 0.0292.
♦
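The same computation by direct enumeration over the six faces:

```python
# Variance of a single fair die toss by direct enumeration.
faces = [1, 2, 3, 4, 5, 6]
mean = sum(faces) / 6                             # 3.5
var = sum(f * f for f in faces) / 6 - mean ** 2   # 35/12
print(var)

# Variance of the sample mean of 100 i.i.d. tosses.
print(var / 100)  # approximately 0.0292
```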
28. If X1 , X2 , . . . , X100 are i.i.d. Nor(5, 10), what’s the distribution (with parameter(s))
of the sample mean X̄?
Solution: X̄ ∼ Nor(5, 0.1)
♦
29. Suppose that X1 , X2 , . . . , X400 are i.i.d. Pois(4), and define the sample mean
X̄ ≡ (1/400) Σ_{i=1}^{400} Xi . Find an approximate expression for Pr(3.9 ≤ X̄ ≤ 4.1).
Solution: Note that E[X̄] = 4 and Var(X̄) = 4/400 = 0.01. Then by the CLT,

Pr(3.9 ≤ X̄ ≤ 4.1) = Pr((3.9 − 4.0)/√0.01 ≤ (X̄ − 4.0)/√0.01 ≤ (4.1 − 4.0)/√0.01)
≈ Pr(−1 ≤ Z ≤ 1) = 2Φ(1) − 1 = 0.6826. ♦
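A Monte Carlo check of this approximation, using Knuth's classical Poisson sampler so that only the standard library is needed (the replication count is arbitrary):

```python
import math
import random

random.seed(7)

# Knuth's algorithm for sampling a Poisson(lam) variate (fine for small lam).
def poisson(lam):
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

# Estimate Pr(3.9 <= Xbar <= 4.1) for the mean of 400 i.i.d. Pois(4) draws.
n_reps = 2_000
hits = 0
for _ in range(n_reps):
    xbar = sum(poisson(4) for _ in range(400)) / 400
    if 3.9 <= xbar <= 4.1:
        hits += 1
estimate = hits / n_reps
print(estimate)  # close to the CLT approximation 0.6826
```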
30. What is our most important theorem?
• (a) The central lemon theorem.
• (b) The sentry limit theorem.
• (c) The central limit theorem.
• (d) The central limit serum.
• (e) All of the above.
Solution: (c).
♦
31. Find the sample standard deviation of 7, 12, and 1.
ANSWER: √30.33 = 5.51.
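The same value via Python's statistics module, which uses the n − 1 denominator:

```python
import statistics

# Sample standard deviation (n - 1 denominator) of the data in Question 31.
data = [7, 12, 1]
s = statistics.stdev(data)
print(s)  # approximately 5.51
```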
32. If X1 , . . . , Xn are i.i.d. Bern(p), what is the MLE of p?
ANSWER: X̄.
33. TRUE or FALSE? If X1 , . . . , Xn are i.i.d. N (µ, σ 2 ), where µ and σ 2 are unknown,
then S 2 is unbiased for σ 2 .
ANSWER: True.
34. TRUE or FALSE? If X1 , . . . , Xn are i.i.d. N (µ, σ 2 ), where µ and σ 2 are unknown,
then S 2 is the MLE for σ 2 .
ANSWER: False.
35. If X1 , . . . , Xn are i.i.d. with E[Xi ] = 3 and Var(Xi ) = 10, find E[S 2 ].
ANSWER: 10.
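A simulation sketch of this unbiasedness (the normal distribution and sample size n = 5 are illustrative choices, not part of the question; unbiasedness of S² holds for any distribution with finite variance):

```python
import math
import random
import statistics

random.seed(3)

# Check that S^2 is unbiased for the variance: the average of many sample
# variances from i.i.d. draws with Var(X) = 10 should be close to 10.
n, n_reps = 5, 20_000
avg_s2 = sum(
    statistics.variance([random.gauss(3, math.sqrt(10)) for _ in range(n)])
    for _ in range(n_reps)
) / n_reps
print(avg_s2)  # close to 10
```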
36. TRUE or FALSE? If X1 , . . . , Xn are i.i.d. U (0, θ), then max1≤i≤n {Xi } is unbiased
for θ.
ANSWER: False.
37. TRUE or FALSE? If X1 , . . . , Xn are i.i.d. U (0, θ), then max1≤i≤n {Xi } is the MLE
for θ.
ANSWER: True.
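Questions 36 and 37 concern the same statistic; a quick simulation illustrates the bias, since E[max Xi ] = nθ/(n + 1) < θ (n = 5 and θ = 1 are illustrative choices):

```python
import random

random.seed(11)

# For X1, ..., Xn i.i.d. U(0, theta), E[max Xi] = n * theta / (n + 1) < theta,
# so the MLE max Xi is biased low.
n, theta, n_reps = 5, 1.0, 50_000
avg_max = sum(max(random.uniform(0, theta) for _ in range(n))
              for _ in range(n_reps)) / n_reps
print(avg_max)  # close to n / (n + 1) = 5/6, not to theta = 1
```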
38. What does “MSE” mean?
ANSWER: mean-squared error.
39. Suppose we have 2 estimators, T1 and T2 , for some parameter θ. Further suppose
that Bias(T1 ) = 5, Var(T1 ) = 10, Bias(T2 ) = 0, and Var(T2 ) = 50. Which estimator
would you use?
ANSWER: MSE(T1 ) = 35, MSE(T2 ) = 50, so choose T1 .
40. BONUS: Who is Dr. Paul Arnold?
ANSWER: He’s one of the original Zombies.