Sample Exam 2 Solutions - Math 464 - Fall 10 - Kennedy
1. Let X have a gamma distribution with λ = 2, w = 3. Let Y = 3X. Show
that Y has a gamma distribution and find the values of λ and w for Y . (Hint:
how is the moment generating function of Y related to that of X?)
Solution: You can do this by looking at the cdf of Y, but it is easier to use
moment generating functions. The moment generating function of X is
MX(t) = (2/(2 − t))^3
So
MY(t) = E[e^(tY)] = E[e^(3tX)] = MX(3t) = (2/(2 − 3t))^3 = ((2/3)/((2/3) − t))^3
which is the mgf of a gamma distribution with parameters λ = 2/3 and w = 3.
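As a quick numerical cross-check (not part of the exam solution), here is a short Monte Carlo sketch in Python; it assumes numpy and scipy are available and uses scipy's gamma, whose shape parameter plays the role of w and whose scale is 1/λ:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# X ~ gamma with shape w = 3 and rate lambda = 2, i.e. scale = 1/2
x = rng.gamma(shape=3, scale=0.5, size=200_000)
y = 3 * x

# Claimed distribution of Y: gamma with w = 3, lambda = 2/3, i.e. scale = 3/2
claimed = stats.gamma(a=3, scale=1.5)
print(y.mean(), claimed.mean())        # both should be near 4.5
print(stats.kstest(y, claimed.cdf))    # a large p-value is consistent with the claim
```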
2. Let Z have a standard normal distribution. (So its mean is 0 and its
variance is 1.) Let X = e^(−Z).
(a) Find the mean and variance of X.
Solution: The mean is
E[X] = E[e^(−Z)] = (1/√(2π)) ∫_{−∞}^{∞} e^(−z) e^(−z²/2) dz
You can do this integral by completing the square. Or we can observe that
E[e^(−Z)] = MZ(−1) = e^((−1)²/2) = e^(1/2)
and
E[X²] = E[e^(−2Z)] = MZ(−2) = e^((−2)²/2) = e²
So the variance is e² − (e^(1/2))² = e² − e.
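A short simulation (a sketch, assuming numpy is available) agrees with these values:

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.standard_normal(1_000_000)
x = np.exp(-z)

print(x.mean(), np.exp(0.5))            # both near 1.649
print(x.var(), np.exp(2) - np.exp(1))   # both near 4.671
```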
(b) Find the probability density function (pdf) for X.
Solution: The range of X is (0, ∞). We first compute the cdf of X:
FX(x) = P(X ≤ x) = P(e^(−Z) ≤ x) = P(−Z ≤ ln x) = P(Z ≥ −ln x) = 1 − FZ(−ln x)
Differentiate this with respect to x to get the density of X:
fX(x) = (d/dx)[1 − FZ(−ln x)] = fZ(−ln x) · (1/x)
So
fX(x) = (1/(√(2π) x)) exp(−(ln x)²/2)  if x > 0,  and 0 otherwise.
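Since −Z is also standard normal, X = e^(−Z) has the standard lognormal distribution, so the density above can be checked against scipy's lognorm (a sketch, assuming numpy and scipy are available):

```python
import numpy as np
from scipy import stats

def f_X(x):
    # pdf derived above, valid for x > 0
    return np.exp(-0.5 * np.log(x) ** 2) / (np.sqrt(2 * np.pi) * x)

xs = np.linspace(0.1, 5.0, 50)
# X is standard lognormal, so the two densities should agree
print(np.allclose(f_X(xs), stats.lognorm(s=1).pdf(xs)))   # True
```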
3. Let X and Y be continuous random variables with joint pdf
fX,Y(x, y) = (3/2)(x² + y²),  0 ≤ x ≤ 1, 0 ≤ y ≤ 1
(a) Find the marginal densities of X and Y.
Solution: The marginal density of X for 0 ≤ x ≤ 1 is
fX(x) = ∫_0^1 (3/2)(x² + y²) dy = (3/2)[x² + ∫_0^1 y² dy] = (3/2)[x² + 1/3] = 1/2 + (3/2)x²
So
fX(x) = 1/2 + (3/2)x²  if 0 ≤ x ≤ 1,  and 0 otherwise.
The same calculation shows
fY(y) = 1/2 + (3/2)y²  if 0 ≤ y ≤ 1,  and 0 otherwise.
(b) Are X and Y independent?
Solution: They are not independent since fX,Y(x, y) = (3/2)(x² + y²) does not factor as fX(x)fY(y). For example, fX,Y(0, 0) = 0 while fX(0)fY(0) = 1/4.
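A small numerical sketch (hypothetical helper names, assuming numpy and scipy are available) confirms the marginal and shows that the product of the marginals differs from the joint pdf:

```python
import numpy as np
from scipy import integrate

f = lambda x, y: 1.5 * (x ** 2 + y ** 2)       # joint pdf on the unit square

def marginal(x):
    # integrate out y over [0, 1]
    return integrate.quad(lambda y: f(x, y), 0, 1)[0]

print(marginal(0.4), 0.5 + 1.5 * 0.4 ** 2)          # both 0.74
# Independence would require f(x, y) = fX(x) * fY(y); it fails, e.g.:
print(f(0.4, 0.7), marginal(0.4) * marginal(0.7))   # 0.975 vs about 0.914
```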
4. X and Y are independent continuous random variables. X is uniformly
distributed on [−1, 1]. Y has an exponential distribution with λ = 1.
(a) Let Z = 2X + Y . Find the mean and variance of Z.
Solution: E[Z] = 2E[X] + E[Y] = 2 × 0 + 1 = 1. And since X and Y
are independent, the variance of Z is 4var(X) + var(Y ). We can look these
variances up in the formula sheet and find var(X) = 1/3 and var(Y ) = 1.
So var(Z) = 7/3.
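A Monte Carlo sketch of part (a), assuming numpy is available:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.uniform(-1.0, 1.0, n)      # X uniform on [-1, 1]
y = rng.exponential(1.0, n)        # Y exponential with lambda = 1
z = 2 * x + y

print(z.mean())           # near 1
print(z.var(), 7 / 3)     # near 2.333
```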
(b) Compute the probability density function, fZ (z), of Z for z ≥ 2.
Solution: Compute the cdf of Z. The joint density of X, Y is (1/2)e^(−y) for −1 ≤ x ≤ 1 and y ≥ 0. So, for z ≥ 2,
FZ(z) = P(Z ≤ z) = P(2X + Y ≤ z)
      = ∫_{−1}^{1} ∫_0^{z−2x} (1/2) e^(−y) dy dx
      = (1/2) ∫_{−1}^{1} [1 − e^(−(z−2x))] dx
      = 1 − (1/4)(e² − e^(−2)) e^(−z)
Differentiating, we find that the pdf for z ≥ 2 is
fZ(z) = (1/4)(e² − e^(−2)) e^(−z)
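As a sanity check on the cdf and pdf for z ≥ 2, a Monte Carlo sketch (assuming numpy; z0 = 3 is an arbitrary test point):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2_000_000
z = 2 * rng.uniform(-1.0, 1.0, n) + rng.exponential(1.0, n)

z0 = 3.0                                                        # any z0 >= 2
tail_from_cdf = 0.25 * (np.e ** 2 - np.e ** -2) * np.exp(-z0)   # 1 - F_Z(z0)
print((z > z0).mean(), tail_from_cdf)                           # both near 0.090
```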
5. (a) Let X be a standard normal. The odd moments of X are zero. Use
the mgf to compute E[X 2 ], E[X 4 ] and E[X 6 ].
Solution: The mgf of X is
MX(t) = e^(t²/2) = 1 + (t²/2) + (1/2)(t²/2)² + (1/3!)(t²/2)³ + ···
      = 1 + t²/2 + t⁴/8 + t⁶/48 + ···
In general the nth derivative of MX(t) evaluated at t = 0 will be n! times
the coefficient of the t^n term in the expansion above. So
E[X²] = MX^(2)(0) = 2! · (1/2) = 1
E[X⁴] = MX^(4)(0) = 4! · (1/8) = 3
E[X⁶] = MX^(6)(0) = 6! · (1/48) = 15
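These moments can be reproduced symbolically (a sketch, assuming sympy is available):

```python
import sympy as sp

t = sp.symbols('t')
M = sp.exp(t ** 2 / 2)            # mgf of a standard normal

for n in (2, 4, 6):
    # nth derivative of the mgf at t = 0 gives E[X^n]
    print(n, sp.diff(M, t, n).subs(t, 0))   # 2 -> 1, 4 -> 3, 6 -> 15
```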
(b) Let X, Y be independent random variables, each with the standard
normal distribution. Let Z = X 2 + Y 2 . Find the mean and variance of Z.
Solution:
E[Z] = E[X 2 ] + E[Y 2 ] = 1 + 1 = 2
Since X and Y are independent, X 2 and Y 2 are independent. So
var(Z) = var(X 2 ) + var(Y 2 )
Now var(X 2 ) = E[X 4 ] − (E[X 2 ])2 = 3 − 1 = 2. The variance of Y 2 is the
same, so the variance of Z is 4.
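A quick simulation check (assuming numpy):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
# Z = X^2 + Y^2 for independent standard normals (a chi-square with 2 degrees of freedom)
z = rng.standard_normal(n) ** 2 + rng.standard_normal(n) ** 2

print(z.mean(), z.var())   # near 2 and 4
```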
6. Random variables X and Y have joint cumulative distribution function
(cdf)
FX,Y(x, y) = [(1/π) tan⁻¹(x) + c](1 − e^(−y))  if y ≥ 0,  and 0 if y < 0,
where c is some constant.
(a) Find the value of c.
Solution:
lim_{x,y→∞} F(x, y) = (1/π)(π/2) + c = 1/2 + c
This must equal 1, so c = 1/2.
(b) Find the joint probability density function (pdf) for X, Y .
Solution: We take the second order partial derivative of FX,Y (x, y) with
respect to x and y. This gives
fX,Y(x, y) = (1/π) · 1/(1 + x²) · e^(−y)  if y ≥ 0,  and 0 if y < 0
Note that X, Y are independent. X has the Cauchy distribution, and Y is
exponential with λ = 1.
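Both parts can be verified symbolically (a sketch, assuming sympy is available):

```python
import sympy as sp

x, y = sp.symbols('x y')
F = (sp.atan(x) / sp.pi + sp.Rational(1, 2)) * (1 - sp.exp(-y))   # cdf for y >= 0

# Total mass must be 1, confirming c = 1/2
print(sp.limit(sp.limit(F, x, sp.oo), y, sp.oo))    # 1
# Mixed second-order partial gives the joint pdf on y >= 0
print(sp.simplify(sp.diff(F, x, y)))                # exp(-y)/(pi*(x**2 + 1))
```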
7. The joint pdf of X and Y is
fX,Y(x, y) = 2e^(−x−y)  if x ≥ 0, y ≥ 0, y ≥ x,  and 0 otherwise
Define new random variables by
U = Y − X
V = √X
(a) Are X and Y independent?
Solution: No. Because of the condition y ≥ x, the joint pdf does not factor into a
function of x times a function of y. Another way to see that they are not independent
is to look at P(Y ≤ 1, X ≥ 2). If we compute this we integrate the joint density over
a region where it is zero, so P(Y ≤ 1, X ≥ 2) = 0. But P(Y ≤ 1) and P(X ≥ 2) are both
nonzero, so P(Y ≤ 1, X ≥ 2) ≠ P(Y ≤ 1)P(X ≥ 2).
(b) Find the joint density of U, V . Solution: Solving for the inverse we get
X = V²
Y = U + V²
So the Jacobian is
J = det [[0, 2v], [1, 2v]] = 0 · 2v − 2v · 1 = −2v
As a function of u and v, f(x, y) becomes 2 exp(−u − 2v²). So the joint density of
U, V is |J| · 2 exp(−u − 2v²) = 4v exp(−u − 2v²).
We need to determine the range of U, V . Clearly v ≥ 0. The condition
y ≥ x implies u ≥ 0. So the range is contained in the region u ≥ 0, v ≥ 0.
To see whether all of the upper right quadrant is in the range, we look at the
inverse equations and ask whether every u ≥ 0, v ≥ 0 gives x, y satisfying
x ≥ 0, y ≥ 0, y ≥ x. It does, so the range is all of u ≥ 0, v ≥ 0. So
fU,V(u, v) = 4v exp(−u − 2v²)  if u ≥ 0, v ≥ 0,  and 0 otherwise
(c) Are U and V independent? Solution: Yes, the joint pdf factors into a
function of u times a function of v.
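A Monte Carlo sketch of this result (assuming numpy). To generate samples it uses the factorization that X is exponential with rate 2 and, given X, Y − X is exponential with rate 1; this follows from the joint pdf but is not derived in the text above:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Sample (X, Y) from the joint pdf via X ~ Exp(rate 2) and Y = X + Exp(rate 1)
x = rng.exponential(0.5, n)            # exponential with rate 2, i.e. mean 1/2
y = x + rng.exponential(1.0, n)

u = y - x
v = np.sqrt(x)

print(u.mean(), 1.0)                    # E[U] = 1 if U is exponential with rate 1
print(v.mean(), np.sqrt(np.pi / 8))     # E[V] from the marginal density 4v*exp(-2v^2)
print(np.corrcoef(u, v)[0, 1])          # near 0, consistent with independence
```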