Mathematics 4255
Final with solutions
May 7th 2013
Partial credit will be awarded for your answers, so it is to your advantage to explain your
reasoning and what theorems you are using when you write your solutions. Please answer
the questions in the space provided and show your computations.
Good luck!
[Grading table: I, II, III, IV, V, VI, Total]
Name:
I. (15 points) In a certain college, 25% of the students failed mathematics (M), 15% of
the students failed chemistry (C) and 10% of the students failed both mathematics and
chemistry. A student is selected at random.
1. If he failed chemistry, what is the probability that he failed mathematics?
2. If he failed mathematics, what is the probability that he failed chemistry?
3. What is the probability that he failed mathematics or chemistry?
Solution: Let us denote by
M = {students who failed Mathematics},
C = {students who failed Chemistry},
then P(M) = 0.25, P(C) = 0.15, and P(M ∩ C) = 0.10.
1. The probability that a student failed Mathematics given that he (she) failed Chemistry is
P(M|C) = P(M ∩ C)/P(C) = 0.10/0.15 = 2/3.
2. The probability that a student failed Chemistry given that he (she) failed Mathematics is
P(C|M) = P(C ∩ M)/P(M) = 0.10/0.25 = 2/5.
3. The probability that a student failed Chemistry or Mathematics is
P(C ∪ M) = P(M) + P(C) − P(M ∩ C) = 0.25 + 0.15 − 0.10 = 0.30 = 3/10.
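The three answers above reduce to two identities: the definition of conditional probability and inclusion-exclusion. As a quick sanity check (a small script, not part of the original exam):

```python
# Verify the three answers of Problem I numerically.
p_m, p_c, p_both = 0.25, 0.15, 0.10   # P(M), P(C), P(M ∩ C)

p_m_given_c = p_both / p_c            # P(M|C) = P(M ∩ C) / P(C)
p_c_given_m = p_both / p_m            # P(C|M) = P(C ∩ M) / P(M)
p_m_or_c = p_m + p_c - p_both         # P(M ∪ C) by inclusion-exclusion

print(p_m_given_c, p_c_given_m, p_m_or_c)  # ≈ 2/3, 2/5, 3/10
```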
II. (15 points) Let X be a discrete random variable such that
P (X = a) = p and P (X = b) = 1 − p.
1. Show that (X − b)/(a − b) is a Bernoulli random variable.
2. Find Var(X).
Solution:
1. Let Y = (X − b)/(a − b) and compute its probability mass function:
P(Y = 0) = P((X − b)/(a − b) = 0) = P(X − b = 0) = P(X = b) = 1 − p,
and
P(Y = 1) = P((X − b)/(a − b) = 1) = P(X − b = a − b) = P(X = a) = p.
Hence Y takes the values 0 and 1 with probabilities 1 − p and p respectively, so Y is a Bernoulli random variable.
2. Since Y is a Bernoulli random variable,
Var(Y) = p(1 − p).
Moreover,
Var(Y) = Var((X − b)/(a − b)) = Var(X − b)/(a − b)² = Var(X)/(a − b)².
This implies that
Var(X) = (a − b)² Var(Y) = (a − b)² p(1 − p).
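The identity Var(X) = (a − b)²p(1 − p) is easy to check by simulation; a minimal Monte Carlo sketch (not part of the original exam; the values of a, b, p are arbitrary illustrations):

```python
import random

random.seed(0)
a, b, p = 5.0, 2.0, 0.3          # arbitrary illustrative parameters
n = 200_000

# Sample X, which takes value a with probability p and b with probability 1 - p.
samples = [a if random.random() < p else b for _ in range(n)]
mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / n   # empirical variance

exact = (a - b) ** 2 * p * (1 - p)                # = 1.89 here
print(var, exact)
```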
III. (20 points) Let X and Y be independent random variables taking values in the positive integers and having the same mass function f(n) = 1/2^n for n = 1, 2, . . . . Find
1. P(min{X, Y} ≤ k).
2. P(Y > X).
3. P(X > kY) for a given positive integer k.
Hint: Recall some results from geometric series:
1. ∑_{k=0}^n r^k = (1 − r^{n+1})/(1 − r),
2. ∑_{n=0}^∞ r^n = 1/(1 − r) for |r| < 1.
Solution:
1.
P(min(X, Y) ≤ k) = 1 − P(min(X, Y) > k) = 1 − P(X > k, Y > k)
= 1 − P(X > k)P(Y > k) = 1 − [P(X > k)]² = 1 − [∑_{m=k+1}^∞ P(X = m)]²
= 1 − [∑_{m=k+1}^∞ 1/2^m]² = 1 − [(1/2^{k+1}) ∑_{m=0}^∞ 1/2^m]².
On the other hand, the infinite geometric series gives ∑_{m=0}^∞ 1/2^m = 2. Hence
P(min(X, Y) ≤ k) = 1 − 1/4^k.
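Since the tail P(X > k) = ∑_{m=k+1}^∞ 1/2^m can also be summed by brute force, part 1 can be verified directly; a short check (not part of the original exam):

```python
# Compare 1 - [P(X > k)]^2, with the tail summed term by term, to 1 - 1/4^k.
def p_tail(k, terms=200):
    # P(X > k) = sum_{m = k+1}^∞ 1/2^m, truncated far past float precision
    return sum(2.0 ** -m for m in range(k + 1, terms))

for k in (1, 2, 5):
    lhs = 1 - p_tail(k) ** 2     # 1 - P(X > k) P(Y > k)
    rhs = 1 - 4.0 ** -k          # closed form from the solution
    print(k, lhs, rhs)
```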
2. Using the partition equation, conditioning on X, we have
P(Y > X) = ∑_{m=1}^∞ P(Y > X | X = m) P(X = m)
= ∑_{m=1}^∞ P(Y > m | X = m) P(X = m)
= ∑_{m=1}^∞ P(Y > m) P(X = m)
= ∑_{m=1}^∞ ∑_{l=m+1}^∞ P(Y = l) P(X = m)
= ∑_{m=1}^∞ ∑_{l=m+1}^∞ (1/2^m)(1/2^l)
= ∑_{m=1}^∞ (1/2^m) [∑_{l=0}^∞ 1/2^l − ∑_{l=0}^m 1/2^l].
On the other hand, ∑_{l=0}^∞ 1/2^l = 2 and ∑_{l=0}^m 1/2^l = (1 − (1/2)^{m+1})/(1 − 1/2) = 2 − 1/2^m, so ∑_{l=m+1}^∞ 1/2^l = 1/2^m. Hence
P(Y > X) = ∑_{m=1}^∞ 1/2^{2m} = (1/4)/(1 − 1/4) = 1/3.
3. Using again the partition equation, conditioning on Y, we get
P(X > kY) = ∑_{y=1}^∞ P(X > kY | Y = y) P(Y = y)
= ∑_{y=1}^∞ P(X > ky | Y = y) P(Y = y)
= ∑_{y=1}^∞ P(X > ky) P(Y = y)
= ∑_{y=1}^∞ P(Y = y) ∑_{l=ky+1}^∞ P(X = l).
On the other hand, using an index shift and the infinite geometric series,
∑_{l=ky+1}^∞ P(X = l) = ∑_{l=ky+1}^∞ 1/2^l = 1/2^{ky}.
Hence,
P(X > kY) = ∑_{y=1}^∞ (1/2^y)(1/2^{ky}) = ∑_{y=1}^∞ 1/2^{(k+1)y} = (1/2^{k+1})/(1 − 1/2^{k+1}) = 1/(2^{k+1} − 1).
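Both P(Y > X) = 1/3 and P(X > kY) = 1/(2^{k+1} − 1) can be checked by simulation, since f(n) = 1/2^n is the geometric distribution with parameter 1/2; a Monte Carlo sketch (not part of the original exam; k = 2 is an arbitrary choice):

```python
import random

random.seed(1)

def geom_half():
    # Sample N with P(N = n) = 1/2^n, n = 1, 2, ...: count fair-coin
    # flips until the first "heads".
    n = 1
    while random.random() >= 0.5:
        n += 1
    return n

n_trials = 200_000
xs = [geom_half() for _ in range(n_trials)]
ys = [geom_half() for _ in range(n_trials)]

p_y_gt_x = sum(y > x for x, y in zip(xs, ys)) / n_trials
k = 2
p_x_gt_ky = sum(x > k * y for x, y in zip(xs, ys)) / n_trials

print(p_y_gt_x)   # should be near 1/3
print(p_x_gt_ky)  # should be near 1/(2^(k+1) - 1) = 1/7
```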
IV. (20 points) Let X and Y be two random variables jointly distributed with the density
f_{(X,Y)}(x, y) = x(y − x)e^{−y} for 0 ≤ x ≤ y < ∞, and 0 otherwise.
1. Show that:
(a) f_{X|Y}(x|y) = 6x(y − x)y^{−3}, 0 ≤ x ≤ y;
(b) f_{Y|X}(y|x) = (y − x)e^{x−y}, 0 ≤ x ≤ y < ∞.
2. Deduce that
(a) E(X|Y) = (1/2)Y;
(b) E(Y|X) = X + 2.
Solution:
1. First, let us compute the marginals f_X and f_Y.
(a) For y ≥ 0,
f_Y(y) = ∫_{−∞}^∞ f(x, y) dx = ∫_0^y x(y − x)e^{−y} dx = ∫_0^y (xy − x²)e^{−y} dx
= e^{−y} (yx²/2 − x³/3)|_0^y = e^{−y} y³/6.
Hence, for 0 ≤ x ≤ y,
f_{X|Y}(x|y) = f(x, y)/f_Y(y) = x(y − x)e^{−y} / (e^{−y} y³/6) = 6x(y − x)y^{−3}.
(b) For x > 0,
f_X(x) = ∫_{−∞}^∞ f(x, y) dy = ∫_x^∞ x(y − x)e^{−y} dy
= x [ (y − x)(−e^{−y})|_x^∞ + ∫_x^∞ e^{−y} dy ] (integration by parts)
= x (−e^{−y})|_x^∞ = xe^{−x}.
Hence, for 0 < x ≤ y,
f_{Y|X}(y|x) = f(x, y)/f_X(x) = x(y − x)e^{−y} / (xe^{−x}) = (y − x)e^{x−y}.
2. (a)
E(X|Y = y) = ∫_{−∞}^∞ x f_{X|Y}(x|y) dx = ∫_0^y x [6x(y − x)y^{−3}] dx = (1/2)y.
Hence, we deduce that E(X|Y) = (1/2)Y.
(b)
E(Y|X = x) = ∫_{−∞}^∞ y f_{Y|X}(y|x) dy = ∫_x^∞ y(y − x)e^{x−y} dy = x + 2.
Hence, we deduce that E(Y|X) = X + 2.
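Both conditional expectations can be double-checked by integrating the conditional densities numerically; a midpoint-rule sketch (not part of the original exam; y = 3 and x = 1 are arbitrary test values, and the second integral is truncated since (y − x)e^{x−y} decays fast):

```python
import math

def midpoint(f, lo, hi, n=100_000):
    # Midpoint-rule approximation of the integral of f over [lo, hi].
    h = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * h) for i in range(n)) * h

y = 3.0
# E(X|Y=y) = ∫_0^y x · 6x(y − x)/y^3 dx
e_x_given_y = midpoint(lambda x: x * 6 * x * (y - x) / y ** 3, 0.0, y)
print(e_x_given_y)  # expect y/2 = 1.5

x = 1.0
# E(Y|X=x) = ∫_x^∞ y (y − x) e^{x−y} dy, truncated at x + 60
e_y_given_x = midpoint(lambda t: t * (t - x) * math.exp(x - t), x, x + 60.0)
print(e_y_given_x)  # expect x + 2 = 3.0
```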
V. (20 points) Suppose that X and Y are independent and identically distributed (iid) with a standard normal distribution N(0, 1).
1. Compute the density of Z := √2 Y.
2. Compute the joint density of (X, Z).
3. Compute the moment generating function of X + Z.
Solution:
1. X and Y have the same distribution, which we denote by F, with density
f(x) = (1/√(2π)) e^{−x²/2}, x ∈ R.
Let us compute the distribution of √2 Y:
F_Z(y) := P(√2 Y ≤ y) = P(Y ≤ y/√2) = F(y/√2).
Hence,
f_Z(y) = d/dy F_Z(y) = d/dy F(y/√2) = (1/√2) f(y/√2) = (1/√(4π)) e^{−y²/4}.
Hence, Z ∼ N(0, 2).
√
2. Since X and Y are independent then X and 2Y are also independent. Hence, the
joint density of X and Z is given by the product of their densities that is:
fX,Z (x, y) = fX (x)fZ (y)
1 − x2
1 − y2
√ e 4
= √ e 2
2π
4π
y2
x2
1
= √ e− 2 − 4
2π 2
9
3. X ∼ N(0, 1) and Z ∼ N(0, 2), and X and Z are independent; hence their sum X + Z ∼ N(0, 3), and the MGF of the sum is
M_{X+Z}(t) = e^{(3/2)t²}.
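Parts 1 and 3 can be checked by simulation: the empirical variance of Z should be near 2, and the empirical MGF of X + Z at a fixed t should be near e^{3t²/2}; a Monte Carlo sketch (not part of the original exam; t = 0.5 is an arbitrary evaluation point):

```python
import math
import random

random.seed(2)
n = 200_000

zs = [math.sqrt(2) * random.gauss(0, 1) for _ in range(n)]   # Z = sqrt(2) Y
var_z = sum(z * z for z in zs) / n                           # E[Z] = 0
print(var_z)  # should be near Var(Z) = 2

xs = [random.gauss(0, 1) for _ in range(n)]
t = 0.5
mgf_est = sum(math.exp(t * (x + z)) for x, z in zip(xs, zs)) / n
print(mgf_est, math.exp(1.5 * t * t))  # both near e^{3t^2/2}
```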
VI. (10 points) Let X and Y be two random variables, identically distributed but not necessarily independent. Compute
Cov(X + Y, X − Y).
Solution:
Cov(X + Y, X − Y) = Cov(X, X − Y) + Cov(Y, X − Y)
= Cov(X, X) − Cov(X, Y) + Cov(Y, X) − Cov(Y, Y).
Since Cov(X, Y) = Cov(Y, X), this equals
Cov(X, X) − Cov(X, Y) + Cov(X, Y) − Cov(Y, Y)
= Cov(X, X) − Cov(Y, Y)
= Var(X) − Var(Y).
Now, X and Y are identically distributed, hence they have the same variance: Var(X) = Var(Y). Hence,
Cov(X + Y, X − Y) = 0.
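The point of the problem is that independence is never used, only equal variances. A simulation with strongly correlated but identically distributed X and Y illustrates this (a sketch, not part of the original exam; Y is built from X via the standard bivariate-normal construction with an arbitrary correlation):

```python
import math
import random

random.seed(3)
n = 200_000
rho = 0.7   # arbitrary correlation; X and Y are each N(0, 1) but dependent

xs = [random.gauss(0, 1) for _ in range(n)]
ys = [rho * x + math.sqrt(1 - rho ** 2) * random.gauss(0, 1) for x in xs]

def cov(us, vs):
    # Empirical covariance of two equal-length samples.
    mu, mv = sum(us) / len(us), sum(vs) / len(vs)
    return sum((u - mu) * (v - mv) for u, v in zip(us, vs)) / len(us)

c = cov([x + y for x, y in zip(xs, ys)], [x - y for x, y in zip(xs, ys)])
print(c)  # should be near 0 despite the dependence
```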