Sample Exam 2 Solutions - Math 464 - Fall 14 - Kennedy
1. Let X and Y be independent random variables. They both have a gamma
distribution with mean 3 and variance 3.
(a) Find the joint probability density function (pdf) of X, Y .
Solution: Since they are independent it is just the product of a gamma
density for X and a gamma density for Y . For the gamma distribution,
µ = w/λ and σ² = w/λ². Since the mean and variance are both 3, λ = 1 and w = 3. So
fX,Y(x, y) = (1/Γ(3)²) x² y² exp(−x − y),  if x ≥ 0, y ≥ 0
fX,Y(x, y) = 0,  otherwise
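As an optional numerical cross-check (not part of the exam solution), the following Python sketch uses scipy to confirm that a gamma distribution with shape w = 3 and rate λ = 1 has mean and variance 3, and that the joint pdf above is the product of the two marginal gamma densities. The variable names and the use of scipy are my own choices.

import numpy as np
from scipy.stats import gamma
from scipy.special import gamma as gamma_fn

w, lam = 3.0, 1.0                      # shape w, rate lambda
dist = gamma(a=w, scale=1.0 / lam)     # scipy's gamma uses shape/scale
print(dist.mean(), dist.var())         # both should print 3.0

def f_joint(x, y):
    # Joint pdf derived above: x^2 y^2 exp(-x-y) / Gamma(3)^2 for x, y >= 0.
    return (x**2 * y**2 * np.exp(-x - y)) / gamma_fn(3.0)**2

# Independence: the joint pdf equals the product of the marginal pdfs.
x, y = 1.7, 0.4
print(f_joint(x, y), dist.pdf(x) * dist.pdf(y))   # should agree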
(b) Express P (3X + Y ≤ 3) as an integral. Do not try to do the integral.
Solution: The region where 3x + y ≤ 3, x ≥ 0, y ≥ 0 is the triangle in the upper right quadrant below the line y = 3 − 3x. So we get
P(3X + Y ≤ 3) = ∫₀¹ ∫₀^(3−3x) (1/Γ(3)²) x² y² exp(−x − y) dy dx
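As a further check (my addition, assuming scipy is available), the double integral can be evaluated numerically with scipy.integrate.dblquad and compared to a Monte Carlo estimate of P(3X + Y ≤ 3):

import numpy as np
from scipy import integrate
from scipy.special import gamma as gamma_fn

# dblquad wants the integrand as f(y, x); x runs over [0, 1], y over [0, 3 - 3x].
integrand = lambda y, x: (x**2 * y**2 * np.exp(-x - y)) / gamma_fn(3.0)**2
val, _ = integrate.dblquad(integrand, 0.0, 1.0, lambda x: 0.0, lambda x: 3.0 - 3.0 * x)

rng = np.random.default_rng(0)
X = rng.gamma(shape=3.0, scale=1.0, size=1_000_000)
Y = rng.gamma(shape=3.0, scale=1.0, size=1_000_000)
print(val, np.mean(3 * X + Y <= 3))   # the two estimates should roughly agree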
2. Let X have an exponential distribution with E[X] = 1. Let Y = X² − 2.
(a) Find the mean and variance of Y .
Solution: First we compute some moments of X for later use. The mgf for
X is m(t) = 1/(1 − t).
m′(t) = 1/(1 − t)²,  E[X] = m′(0) = 1
m^(2)(t) = 2/(1 − t)³,  E[X²] = m^(2)(0) = 2
m^(3)(t) = 6/(1 − t)⁴,  E[X³] = m^(3)(0) = 6
m^(4)(t) = 24/(1 − t)⁵,  E[X⁴] = m^(4)(0) = 24
Now
E[Y] = E[X²] − 2 = 2 − 2 = 0
E[Y²] = E[(X² − 2)²] = E[X⁴ − 4X² + 4] = 24 − 4·2 + 4 = 20
So var(Y) = 20 − 0 = 20.
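A quick way to double-check these moments and var(Y) = 20 (my own sketch, not required by the problem) is to differentiate the mgf symbolically with sympy and to simulate Y:

import numpy as np
import sympy as sp

t = sp.symbols('t')
m = 1 / (1 - t)                                             # mgf of an Exp(1) random variable
print([sp.diff(m, t, k).subs(t, 0) for k in range(1, 5)])   # [1, 2, 6, 24]

rng = np.random.default_rng(0)
Y = rng.exponential(scale=1.0, size=1_000_000) ** 2 - 2
print(Y.mean(), Y.var())                                    # roughly 0 and 20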
(b) Find the probability density function (pdf) for Y .
Solution: We start by finding the cdf for Y .
FY(y) = P(Y ≤ y) = P(X² − 2 ≤ y) = P(X² ≤ y + 2)
= P(X ≤ √(y + 2)) = ∫₀^√(y+2) exp(−x) dx = 1 − exp(−√(y + 2))
Take the derivative of this to get
fY(y) = (1/2)(y + 2)^(−1/2) exp(−√(y + 2)),  y ≥ −2
The range for Y is [−2, ∞).
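As a sanity check on the derived density (an addition of mine), the cdf FY(y) = 1 − exp(−√(y + 2)) can be compared against an empirical estimate from simulated values of Y = X² − 2:

import numpy as np

def F_Y(y):
    # Derived cdf: 1 - exp(-sqrt(y + 2)) for y >= -2.
    return 1.0 - np.exp(-np.sqrt(y + 2.0))

rng = np.random.default_rng(0)
Y = rng.exponential(scale=1.0, size=1_000_000) ** 2 - 2
for y in (-1.5, 0.0, 3.0, 10.0):
    print(y, F_Y(y), np.mean(Y <= y))   # analytic vs empirical, should agree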
3. Let X and Y be continuous random variables with joint pdf
fX,Y(x, y) = (3/2)(x² + y²),  0 ≤ x ≤ 1, 0 ≤ y ≤ 1
Outside of 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, fX,Y (x, y) = 0.
(a) Find the marginal densities of X and Y. The marginal density of X for 0 ≤ x ≤ 1 is
fX(x) = ∫₀¹ (3/2)(x² + y²) dy = (3/2)[x² + ∫₀¹ y² dy] = (3/2)[x² + 1/3] = 1/2 + (3/2)x²
So
fX(x) = 1/2 + (3/2)x²,  if 0 ≤ x ≤ 1
fX(x) = 0,  otherwise
The same calculation shows
fY(y) = 1/2 + (3/2)y²,  if 0 ≤ y ≤ 1
fY(y) = 0,  otherwise
(b) Are X and Y independent?
Solution: They are not independent since fX,Y (x, y) is not equal to fX (x)fY (y).
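The marginal and the non-independence claim can also be checked numerically (my own sketch; the function names here are not from the exam):

import numpy as np
from scipy import integrate

f_joint = lambda x, y: 1.5 * (x**2 + y**2)    # joint pdf on the unit square

def f_X(x):
    # Marginal of X: integrate the joint pdf over y in [0, 1].
    # (By symmetry, the marginal of Y has the same formula.)
    val, _ = integrate.quad(lambda y: f_joint(x, y), 0.0, 1.0)
    return val

x, y = 0.3, 0.8
print(f_X(x), 0.5 + 1.5 * x**2)               # should agree
# Not independent: the joint pdf differs from the product of the marginals.
print(f_joint(x, y), f_X(x) * f_X(y))         # these two numbers differ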
4. Let X, Y be jointly continuous random variables with joint probability
density function (pdf)
fX,Y(x, y) = 4xy,  if 0 ≤ x ≤ 1, 0 ≤ y ≤ 1
fX,Y(x, y) = 0,  otherwise
Let Z = X + Y . Compute fZ (z), the probability density function (pdf) for
Z.
Solution: The range of X, Y is the unit square. The range of Z will be [0, 2]. We need to compute the cdf, P(Z ≤ z) = P(X + Y ≤ z). How the line x + y = z intersects the unit square depends on whether 0 ≤ z ≤ 1 or 1 ≤ z ≤ 2. In the first case
P(X + Y ≤ z) = ∫₀^z ∫₀^(z−x) 4xy dy dx
After some calculation this equals z⁴/6. For 1 ≤ z ≤ 2,
P(X + Y ≤ z) = ∫₀^(z−1) ∫₀¹ 4xy dy dx + ∫_(z−1)^1 ∫₀^(z−x) 4xy dy dx
After an unreasonable amount of calculation this equals −z⁴/6 + 2z² − (8/3)z + 1.
So the pdf is
fZ(z) = (2/3)z³,  if 0 ≤ z ≤ 1
fZ(z) = −(2/3)z³ + 4z − 8/3,  if 1 ≤ z ≤ 2
fZ(z) = 0,  otherwise
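To gain confidence in the piecewise answer (a check I added, not part of the solution), note that X and Y are independent with density 2x on [0, 1], so they can be simulated as square roots of uniforms; the empirical cdf of Z = X + Y should then match the cdf computed above:

import numpy as np

def F_Z(z):
    # cdf derived above: z^4/6 on [0, 1] and -z^4/6 + 2z^2 - 8z/3 + 1 on [1, 2].
    z = np.asarray(z, dtype=float)
    return np.where(z <= 1.0, z**4 / 6.0,
                    -z**4 / 6.0 + 2.0 * z**2 - 8.0 * z / 3.0 + 1.0)

rng = np.random.default_rng(0)
X = np.sqrt(rng.uniform(size=1_000_000))   # inverse-cdf sampling for density 2x
Y = np.sqrt(rng.uniform(size=1_000_000))
Z = X + Y
for z in (0.5, 1.0, 1.5, 1.9):
    print(z, F_Z(z), np.mean(Z <= z))      # analytic vs empirical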
5. Let X and Y be independent random variables, each of which has a
standard normal pdf. Let Z = Y − X + 4.
(a) Find the mean and variance of Z.
Solution: E[Z] = E[Y ] − E[X] + 4 = 4. var(Z) = var(Y ) + var(−X) =
var(Y ) + var(X) = 1 + 1 = 2.
(b) Find the probability density function (pdf) of Z. Hint: this can be done
with very little computation.
Solution: It is easy to show that −X is also a standard normal. The sum
of independent normal random variables is normal, and adding a constant
to a normal random variable gives another normal random variable. So Z is
normal. Part (a) tells us its mean and variance. Another way to see this is
to look at the mgf. Since X and Y are independent, functions of them are
independent. So
MZ(t) = E[exp(t(Y − X + 4))] = exp(4t) E[exp(tY)] E[exp(−tX)]
= exp(4t + (1/2)t² + (1/2)(−t)²) = exp(4t + t²)
which is the mgf of a normal with mean 4 and variance 2. So
fZ(z) = (1/√(4π)) exp(−(z − 4)²/4),  −∞ < z < ∞
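A short simulation (my addition) confirms that Z = Y − X + 4 behaves like a normal with mean 4 and variance 2:

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
Z = rng.standard_normal(1_000_000) - rng.standard_normal(1_000_000) + 4.0
print(Z.mean(), Z.var())                      # roughly 4 and 2
# Kolmogorov-Smirnov test against N(4, sqrt(2)); a large p-value is consistent.
print(stats.kstest(Z[:10_000], stats.norm(loc=4.0, scale=np.sqrt(2.0)).cdf))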
6. Random variables X and Y have joint cumulative distribution function
(cdf)
FX,Y(x, y) = [(1/π) tan⁻¹(x) + c](1 − exp(−y)),  if y ≥ 0
FX,Y(x, y) = 0,  if y < 0
where c is some constant.
(a) Are X and Y independent?
Solution: Yes, the joint cdf factors into a function of x times a function of
y, so they are independent.
(b) Find the value of c.
Solution:
lim_{x,y→∞} F(x, y) = (1/π)(π/2) + c = 1/2 + c
This must equal 1, so c = 1/2.
(c) Find the joint probability density function (pdf) for X, Y .
Solution: We take the second order partial derivative of FX,Y (x, y) with
respect to x and y. This gives
fX,Y(x, y) = (1/π) (1/(1 + x²)) exp(−y),  if y ≥ 0
fX,Y(x, y) = 0,  if y < 0
Note that X, Y are independent. X has the Cauchy distribution, and Y is
exponential with λ = 1.
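These answers can also be verified symbolically (a sketch I added, using sympy):

import sympy as sp

x, y = sp.symbols('x y')
c = sp.Rational(1, 2)
F = (sp.atan(x) / sp.pi + c) * (1 - sp.exp(-y))      # the cdf for y >= 0
print(sp.simplify(sp.diff(F, x, y)))                  # exp(-y)/(pi*(x**2 + 1))
print(sp.limit(sp.limit(F, x, sp.oo), y, sp.oo))      # 1, so c = 1/2 is right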
7. The joint pdf of X and Y is
fX,Y(x, y) = 2 exp(−x − y),  if x ≥ 0, y ≥ 0, y ≥ x
fX,Y(x, y) = 0,  otherwise
Define new random variables by
U = Y − X
V = √X
(a) Are X and Y independent? Solution: No, the condition y ≥ x does not
factor. Another way to see they are not independent is to look at
P(Y ≤ 1, X ≥ 2). If we compute this we will integrate the joint density over a region where it is zero, so P(Y ≤ 1, X ≥ 2) = 0. But P(Y ≤ 1) and P(X ≥ 2) are both nonzero.
(b) Find the joint density of U, V . Solution: Solving for the inverse we get
X = V²
Y = U + V²
So the Jacobian is
J = det [[0, 2v], [1, 2v]] = −2v
As a function of u, v, f(x, y) becomes 2 exp(−u − 2v²). So the joint density of U, V is f(x(u, v), y(u, v)) |J| = 4v exp(−u − 2v²).
We need to determine the range of U, V . Clearly v ≥ 0. The condition
y ≥ x implies u ≥ 0. So the range is contained in the region u ≥ 0, v ≥ 0.
To see if all of the upper right quadrant is in the range we look at the
inverse equations and ask if for any u ≥ 0, v ≥ 0 we get x, y satisfying
x ≥ 0, y ≥ 0, y ≥ x. We do, so the range is all of u ≥ 0, v ≥ 0. So
fU,V(u, v) = 4v exp(−u − 2v²),  if u ≥ 0, v ≥ 0
fU,V(u, v) = 0,  otherwise
(c) Are U and V independent? Solution: Yes, the joint pdf factors into a
function of u times a function of v.
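One way to test this answer numerically (my own sketch; the sampling step below is an extra observation, not part of the solution) is to note that under f(x, y) = 2 exp(−x − y) on y ≥ x ≥ 0, the variables X and Y − X turn out to be independent exponentials with rates 2 and 1, so (X, Y) is easy to simulate; the simulated (U, V) can then be compared with the density just derived:

import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
X = rng.exponential(scale=0.5, size=n)     # rate 2
U = rng.exponential(scale=1.0, size=n)     # rate 1, this is Y - X
V = np.sqrt(X)

# Marginals implied by f_{U,V}(u, v) = 4v exp(-u - 2v^2) on u, v >= 0:
# U is Exp(1) and P(V <= v) = 1 - exp(-2 v^2).
for v in (0.3, 0.7, 1.2):
    print(v, 1.0 - np.exp(-2.0 * v**2), np.mean(V <= v))   # should agree
# Independence: a joint probability should factor into the product.
print(np.mean((U <= 1.0) & (V <= 0.7)), np.mean(U <= 1.0) * np.mean(V <= 0.7))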
8. Let X and Y be independent random variables. X has an exponential
distribution with E[X] = 2. Y has an exponential distribution with E[Y ] =
1. Let Z = X + 2Y .
(a) Find the mean and variance of Z. Solution: For X, the parameter is λX = 1/2 and for Y it is λY = 1. Using the formula sheet, the mean is E[Z] = E[X] + 2E[Y] = 4 and the variance is Var(Z) = Var(X) + 4 Var(Y) = 4 + 4 = 8.
(b) Find the moment generating function (mgf) of Z. Solution: X and 2Y
are independent, so
MZ(t) = MX(t) M2Y(t) = MX(t) MY(2t) = [(1/2)/(1/2 − t)] [1/(1 − 2t)] = 1/(1 − 2t)²
(c) The pdf of Z is in our catalog. What is it? Solution: The mgf is that
of a gamma distribution with λ = 1/2 and w = 2.
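A simulation check I added (not part of the exam): Z = X + 2Y with X exponential of mean 2 and Y exponential of mean 1 should match a gamma distribution with shape w = 2 and rate λ = 1/2.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
Z = (rng.exponential(scale=2.0, size=100_000)
     + 2.0 * rng.exponential(scale=1.0, size=100_000))
print(Z.mean(), Z.var())                              # roughly 4 and 8
# Gamma with shape 2 and scale 1/lambda = 2; a large KS p-value is consistent.
print(stats.kstest(Z[:10_000], stats.gamma(a=2.0, scale=2.0).cdf))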