Mathematics 4255/5255
Final exam with solutions
May 3rd, 2011
Partial credit will be awarded for your answers, so it is to your advantage to explain your
reasoning and what theorems you are using when you write your solutions. Please answer
the questions in the space provided and show your computations.
Good luck!
Score:  I ____  II ____  III ____  IV ____  V ____  VI ____  Total ____

Name: ______________________
I. (20 points) If X and Y are two random variables whose joint density is given by
f(x, y) = 1/2  if x > 0, y > 0, x + y < 2,
f(x, y) = 0    otherwise.
Find the joint density of U = Y − X and Y .
Solution:
Let g(x, y) = y − x and h(x, y) = y. The Jacobian is

Jac = | ∂g/∂x  ∂g/∂y | = | −1  1 | = −1,
      | ∂h/∂x  ∂h/∂y |   |  0  1 |

so |Jac|⁻¹ = 1. On the other hand, if we set u and v such that

u = y − x,
v = y,

then 2v − u = 2y − (y − x) = y + x < 2, v = y > 0, and v − u = x > 0. Hence

f_{U,Y}(u, y) = 1/2  if y > 0, y − u > 0, 2y − u < 2,
f_{U,Y}(u, y) = 0    otherwise.
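As an optional numerical sanity check (not part of the exam, and assuming NumPy is available), we can sample (X, Y) uniformly on the triangle and verify that (U, Y) = (Y − X, Y) has density 1/2 on the image region, by testing a small rectangle that lies inside it:

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample (X, Y) uniformly on the triangle x > 0, y > 0, x + y < 2
# by rejection from the square (0, 2) x (0, 2).
n = 1_000_000
x = rng.uniform(0, 2, n)
y = rng.uniform(0, 2, n)
keep = x + y < 2
x, y = x[keep], y[keep]

u = y - x  # U = Y - X

# If f_{U,Y}(u, y) = 1/2 on {y > 0, y - u > 0, 2y - u < 2}, then the test
# rectangle A = (-0.5, 0) x (0.25, 0.75), which lies inside that region,
# should have probability (1/2) * area(A) = (1/2) * 0.25 = 0.125.
in_A = (u > -0.5) & (u < 0) & (y > 0.25) & (y < 0.75)
p_hat = in_A.mean()
assert abs(p_hat - 0.125) < 0.005
```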
II. (20 points) Given the moment-generating function M_X(t) = e^{3t + 8t²} of a random variable X, find the moment-generating function of the random variable Z = (X − 3)/4 and use it to find the mean and the variance of Z.
Solution:
M_Z(t) := E[e^{t(X−3)/4}] = e^{−3t/4} E[e^{tX/4}] = e^{−3t/4} M_X(t/4) = e^{−3t/4} e^{3(t/4) + 8(t/4)²} = e^{t²/2}.

E(Z) = M_Z′(t)|_{t=0} = t e^{t²/2} |_{t=0} = 0,
E(Z²) = M_Z″(t)|_{t=0} = e^{t²/2}(1 + t²) |_{t=0} = 1,
Var(Z) = E(Z²) − E(Z)² = 1.
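As an optional check (assuming NumPy): e^{3t + 8t²} is the moment-generating function of a Normal variable with mean 3 and variance 16, since the Normal(µ, σ²) MGF is e^{µt + σ²t²/2}. So Z = (X − 3)/4 should be standard normal, with MGF e^{t²/2}:

```python
import numpy as np

rng = np.random.default_rng(0)

# M_X(t) = exp(3t + 8t^2) is the MGF of Normal(mean=3, var=16).
x = rng.normal(loc=3, scale=4, size=1_000_000)
z = (x - 3) / 4

# Z should have mean 0 and variance 1.
assert abs(z.mean()) < 0.01
assert abs(z.var() - 1) < 0.01

# Spot-check the MGF M_Z(t) = E[e^{tZ}] = e^{t^2/2} at t = 0.5.
t = 0.5
assert abs(np.exp(t * z).mean() - np.exp(t**2 / 2)) < 0.01
```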
III. (20 points) Suppose X and Y are independent random variables. For a given x, find
E(X + Y |X = x).
Solution:
E(X + Y |X = x) = E(X|X = x) + E(Y |X = x) = x + E(Y ).
Here E(X|X = x) = x because X may be treated as the constant x given X = x and by
independence E(Y |X = x) = E(Y ).
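The identity is easy to see empirically. In this illustrative simulation (the particular distributions, X uniform on {0, 1, 2} and Y ~ Poisson(2), are our own choice, picked so that conditioning on {X = x} is easy to simulate), the conditional mean of X + Y given X = x matches x + E(Y):

```python
import numpy as np

rng = np.random.default_rng(0)

# X uniform on {0, 1, 2}, Y ~ Poisson(2), independent (illustrative choice).
n = 600_000
x = rng.integers(0, 3, n)
y = rng.poisson(2.0, n)
s = x + y

for xv in range(3):
    sel = x == xv
    # E(X + Y | X = xv) should be xv + E(Y) = xv + 2.
    assert abs(s[sel].mean() - (xv + 2.0)) < 0.02
```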
IV. (20 points) Suppose that N is a counting random variable with values in {0, 1, . . . , n}, and that given {N = k}, for k ≥ 1, there are defined random variables X_1, . . . , X_k such that

E(X_j | N = k) = µ,  1 ≤ j ≤ k.

Define a random variable S_N by

S_N = X_1 + · · · + X_k  if N = k, 1 ≤ k ≤ n,
S_N = 0  if N = 0.

Show that E(S_N) = µE(N).
Hint: Use the formula E(S_N) = E(E(S_N | N)).
Solution:
E(S_N) = E(E(S_N | N)).

On the other hand,

E(S_N | N = k) = E(S_k | N = k) = Σ_{i=1}^{k} E(X_i | N = k) = kµ,

which yields E(S_N | N) = Nµ, and as a consequence

E(S_N) = E(E(S_N | N)) = E(Nµ) = µE(N).
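A quick simulation of the identity (the distributions below, N ~ Binomial(4, 0.5) and X_i i.i.d. Exponential with mean µ = 2 independent of N, are our own illustrative choice; independence makes E(X_i | N = k) = µ hold):

```python
import numpy as np

rng = np.random.default_rng(0)

# N ~ Binomial(4, 0.5) takes values in {0,...,4}; X_i ~ Exponential(mean 2).
m, n_max, mu = 200_000, 4, 2.0
N = rng.binomial(n_max, 0.5, size=m)
X = rng.exponential(scale=mu, size=(m, n_max))

# S_N = X_1 + ... + X_N (and S_N = 0 when N = 0), via a mask on the first N columns.
mask = np.arange(n_max) < N[:, None]
S = (X * mask).sum(axis=1)

# E(S_N) = mu * E(N); both sides are close to 2 * 2 = 4.
assert abs(S.mean() - mu * N.mean()) < 0.05
```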
V. (20 points) Let X and Y be continuous random variables with joint density function
f(x, y) = x/5 + cy  if 0 < x < 1, 1 < y < 5,
f(x, y) = 0  otherwise,
where c is a constant.
1. What is the value of c?
2. Are X and Y independent?
3. Find P [X + Y > 3].
Solution:
1. The constant c must normalize the density:

1 = ∫_0^1 ∫_1^5 (x/5 + cy) dy dx = ∫_0^1 (4x/5 + 12c) dx = 2/5 + 12c.

Hence c = 1/20.
2. No: the joint density does not factor as a product of a function of x alone and a function of y alone. Let us also verify this by computing the marginal densities:

f_X(x) = ∫_1^5 (x/5 + y/20) dy = xy/5 |_1^5 + y²/40 |_1^5 = 4x/5 + 3/5,  0 < x < 1,

f_Y(y) = ∫_0^1 (x/5 + y/20) dx = x²/10 |_0^1 + xy/20 |_0^1 = 1/10 + y/20,  1 < y < 5.

Since f(x, y) ≠ f_X(x) × f_Y(y), X and Y are dependent.
3.

P[X + Y > 3] = ∫_0^1 ∫_{3−x}^5 (x/5 + y/20) dy dx
             = ∫_0^1 [ (2 + x)x/5 + 25/40 − (3 − x)²/40 ] dx
             = 11/15.
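The three answers can be checked numerically with a midpoint rule over the region 0 < x < 1, 1 < y < 5 used in the solution (the grid size below is an arbitrary choice):

```python
import numpy as np

# Midpoint grid on (0, 1) x (1, 5).
nx, ny = 1000, 1000
x = (np.arange(nx) + 0.5) / nx           # midpoints of (0, 1)
y = 1 + 4 * (np.arange(ny) + 0.5) / ny   # midpoints of (1, 5)
dA = (1 / nx) * (4 / ny)
X, Y = np.meshgrid(x, y, indexing="ij")
f = X / 5 + Y / 20                       # density with c = 1/20

# c = 1/20 normalizes the density (midpoint rule is exact for linear integrands).
assert abs(f.sum() * dA - 1) < 1e-6

# Marginal f_X(x) = 4x/5 + 3/5, obtained by integrating out y.
fx = f.sum(axis=1) * (4 / ny)
assert np.allclose(fx, 4 * x / 5 + 3 / 5, atol=1e-9)

# P[X + Y > 3] = 11/15.
p = (f * (X + Y > 3)).sum() * dA
assert abs(p - 11 / 15) < 2e-3
```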
VI. (Bonus 20 points)
Let X and Y be independent random variables, X with uniform distribution on (0,3), Y
with Poisson (λ) distribution.
1. Find a formula in terms of λ for P (X < Y ).
2. Find the conditional density of X given {X < Y}.
3. Find E(X|X < Y ).
Solution:
f_X(x) = 1/3  if x ∈ (0, 3), and 0 otherwise,

and

f_Y(y) = e^{−λ} λ^y / y!,  y ∈ N.

X and Y being independent, the joint density function is given by

f_{X,Y}(x, y) = (1/3) e^{−λ} λ^y / y!  if x ∈ (0, 3), y ∈ N, and 0 otherwise.
1. Using the partition equation:

P(X < Y) = Σ_{y=0}^{∞} P(X < Y | Y = y) P(Y = y)
         = P(X < Y | Y = 0) P(Y = 0) + P(X < Y | Y = 1) P(Y = 1)
           + P(X < Y | Y = 2) P(Y = 2) + P(X < Y | Y ≥ 3) P(Y ≥ 3)
         = P(X < 0) P(Y = 0) + P(X < 1) P(Y = 1)
           + P(X < 2) P(Y = 2) + P(X < Y | Y ≥ 3) P(Y ≥ 3).

On the other hand, P(X < 0) = 0, P(X < 1) = 1/3, P(X < 2) = 2/3, and P(X < Y | Y ≥ 3) = 1 since X < 3 with probability one. Hence we get that

P(X < Y) = (1/3) λe^{−λ} + (2/3)(λ²/2) e^{−λ} + (1 − e^{−λ}(1 + λ + λ²/2))
         = 1 − e^{−λ} (1 + (2/3)λ + (1/6)λ²).
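A Monte Carlo check of this formula (the test value λ = 1.7 is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# Check P(X < Y) = 1 - e^{-lam} * (1 + 2*lam/3 + lam^2/6).
lam = 1.7
n = 1_000_000
x = rng.uniform(0, 3, n)
y = rng.poisson(lam, n)

p_hat = (x < y).mean()
p_formula = 1 - np.exp(-lam) * (1 + 2 * lam / 3 + lam**2 / 6)
assert abs(p_hat - p_formula) < 0.005
```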
2. The conditional probability distribution is given by
F_{X|X<Y}(t) := P(X ≤ t | X < Y) = P(X ≤ t, X < Y) / P(X < Y).
On the other hand, using the partition equation again
P(X ≤ t, X < Y) = Σ_{y=0}^{∞} P(X ≤ t, X < Y | Y = y) P(Y = y)
               = P(X ≤ t, X < Y | Y = 0) P(Y = 0)
                 + P(X ≤ t, X < Y | Y = 1) P(Y = 1)
                 + P(X ≤ t, X < Y | Y = 2) P(Y = 2)
                 + P(X ≤ t, X < Y | Y ≥ 3) P(Y ≥ 3).
We have that P(X ≤ t, X < Y | Y = 0) = 0, and moreover
• if t ≤ 0 then P(X ≤ t, X < Y) = 0;
• if 0 < t ≤ 1 then P(X ≤ t, X < Y | Y = k) = P(X < t) = t/3 for k = 1, 2, and P(X ≤ t, X < Y | Y ≥ 3) = P(X < t) = t/3;
• if 1 < t ≤ 2 then P(X ≤ t, X < Y | Y = 1) = P(X < 1) = 1/3, P(X ≤ t, X < Y | Y = 2) = P(X < t) = t/3, and P(X ≤ t, X < Y | Y ≥ 3) = P(X < t) = t/3;
• if 2 < t ≤ 3 then P(X ≤ t, X < Y | Y = 1) = P(X < 1) = 1/3, P(X ≤ t, X < Y | Y = 2) = P(X < 2) = 2/3, and P(X ≤ t, X < Y | Y ≥ 3) = P(X < t) = t/3;
• if t ≥ 3 then P(X ≤ t, X < Y) = P(X < Y).
Hence, we get the following results:
• if 0 < t ≤ 1 then P(X ≤ t, X < Y) = (t/3)(1 − e^{−λ});
• if 1 < t ≤ 2 then P(X ≤ t, X < Y) = (1/3)λe^{−λ} + (t/3)(1 − λe^{−λ} − e^{−λ});
• if 2 < t ≤ 3 then P(X ≤ t, X < Y) = (1/3)λe^{−λ} + (1/3)λ²e^{−λ} + (t/3)(1 − λe^{−λ} − e^{−λ} − (λ²/2)e^{−λ}).
The conditional probability density function is the derivative with respect to t of the
conditional probability distribution function and we have that
f_{X|X<Y}(t) =
  (1 − e^{−λ}) / [3(1 − e^{−λ}(1 + (2/3)λ + (1/6)λ²))]  if 0 < t ≤ 1,
  (1 − λe^{−λ} − e^{−λ}) / [3(1 − e^{−λ}(1 + (2/3)λ + (1/6)λ²))]  if 1 < t ≤ 2,
  (1 − λe^{−λ} − e^{−λ} − (λ²/2)e^{−λ}) / [3(1 − e^{−λ}(1 + (2/3)λ + (1/6)λ²))]  if 2 < t ≤ 3,
  0  otherwise.
3.

E(X | X < Y) = ∫_{−∞}^{∞} t f_{X|X<Y}(t) dt
             = ∫_0^1 t f_{X|X<Y}(t) dt + ∫_1^2 t f_{X|X<Y}(t) dt + ∫_2^3 t f_{X|X<Y}(t) dt.

Since f_{X|X<Y} is constant on each of these intervals and ∫_0^1 t dt = 1/2, ∫_1^2 t dt = 3/2, ∫_2^3 t dt = 5/2, we get

E(X | X < Y) = [ (1/2)(1 − e^{−λ}) + (3/2)(1 − λe^{−λ} − e^{−λ}) + (5/2)(1 − λe^{−λ} − e^{−λ} − (λ²/2)e^{−λ}) ] / [3(1 − e^{−λ}(1 + (2/3)λ + (1/6)λ²))].
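As a numerical sanity check of parts 2 and 3 (arbitrary test value of λ): the piecewise-constant conditional density should integrate to 1 over (0, 3), and ∫ t f(t) dt should match the empirical mean of X over the samples with X < Y:

```python
import numpy as np

rng = np.random.default_rng(0)

lam = 1.7  # arbitrary test value
n = 2_000_000
x = rng.uniform(0, 3, n)
y = rng.poisson(lam, n)
cond = x < y

# Constant values of the conditional density on (0,1], (1,2], (2,3].
e = np.exp(-lam)
denom = 3 * (1 - e * (1 + 2 * lam / 3 + lam**2 / 6))
c1 = (1 - e) / denom
c2 = (1 - lam * e - e) / denom
c3 = (1 - lam * e - e - lam**2 / 2 * e) / denom

# The density integrates to 1 (each piece has width 1).
assert abs(c1 + c2 + c3 - 1) < 1e-12

# E(X | X < Y) = c1 * 1/2 + c2 * 3/2 + c3 * 5/2, using ∫ t dt over each piece.
e_formula = c1 / 2 + 3 * c2 / 2 + 5 * c3 / 2
assert abs(x[cond].mean() - e_formula) < 0.01
```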