Statistics 330 - Assignment 3
(Due date: November 03, 2006)
Total marks: 15 × 5 = 75.
1. Let F(x, y) be the joint distribution function of X and Y. For all real constants a < b, c < d, show that
P(a < X ≤ b, c < Y ≤ d) = F(b, d) − F(b, c) − F(a, d) + F(a, c). (Show it mathematically; you may use a diagram to visualize it.)
Solution: Assuming a < b and c < d,
P(X ≤ b, Y ≤ d) = P(X ≤ b, Y ≤ c) + P(X ≤ b, c < Y ≤ d).
Since P(X ≤ b, c < Y ≤ d) = P(a < X ≤ b, c < Y ≤ d) + P(X ≤ a, c < Y ≤ d), and the last term equals P(X ≤ a, Y ≤ d) − P(X ≤ a, Y ≤ c), we have
P(X ≤ b, c < Y ≤ d) = P(a < X ≤ b, c < Y ≤ d) + P(X ≤ a, Y ≤ d) − P(X ≤ a, Y ≤ c).
Combining the two equations we get
P(X ≤ b, Y ≤ d) = P(X ≤ b, Y ≤ c) + P(a < X ≤ b, c < Y ≤ d) + P(X ≤ a, Y ≤ d) − P(X ≤ a, Y ≤ c).
That is, F(b, d) = F(b, c) + P(a < X ≤ b, c < Y ≤ d) + F(a, d) − F(a, c), and rearranging gives the required identity P(a < X ≤ b, c < Y ≤ d) = F(b, d) − F(b, c) − F(a, d) + F(a, c).
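As a quick sanity check (not part of the assignment), the identity can be verified numerically for any joint CDF. The sketch below uses the CDF of two independent Uniform(0, 1) variables, a choice made purely for illustration, and compares the four-term CDF expression with the direct rectangle probability (b − a)(d − c).

# Numerical check of P(a < X <= b, c < Y <= d)
# = F(b, d) - F(b, c) - F(a, d) + F(a, c),
# illustrated with X, Y independent Uniform(0, 1) (an assumed example).

def F(x, y):
    """Joint CDF of two independent Uniform(0, 1) random variables."""
    def clip(t):
        return max(0.0, min(1.0, t))
    return clip(x) * clip(y)

a, b, c, d = 0.2, 0.7, 0.1, 0.6           # any a < b, c < d inside (0, 1)
lhs = (b - a) * (d - c)                   # direct rectangle probability
rhs = F(b, d) - F(b, c) - F(a, d) + F(a, c)
print(lhs, rhs)                           # both print 0.25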
2. Show that the following function F(x, y) cannot be a distribution function of two random variables:
F(x, y) = 1 if x + 2y ≥ 1, and F(x, y) = 0 if x + 2y < 1.
Hint: Find four numbers a < b, c < d such that F(b, d) − F(b, c) − F(a, d) + F(a, c) < 0.
Solution: Take a = 0, c = 0, b = 2 and d = 2; then F(a, c) = 0 and F(b, d) = F(a, d) = F(b, c) = 1. (Note that there are infinitely many choices for a, b, c and d.) Since, from the previous question, P(a < X ≤ b, c < Y ≤ d) = F(b, d) − F(b, c) − F(a, d) + F(a, c), this choice would give a probability of 1 − 1 − 1 + 0 = −1. A probability must lie between 0 and 1, so the function defined above cannot be a CDF.
3. Let X1 and X2 have the joint pdf f(x1, x2). Find the cdf and pdf of Y = X1X2.
f(x1, x2) = 1 for 0 < x1 < 1, 0 < x2 < 1, and 0 otherwise.
Solution: Let Y1 = X1X2 and Y2 = X2; then the Jacobian satisfies |J| = |1/y2| = 1/y2. The transformed range of (Y1, Y2) is {(y1, y2) : 0 < y1 < y2 < 1}. The joint pdf of Y1 and Y2 is
f(y1, y2) = 1/y2 for 0 < y1 < y2 < 1, and 0 otherwise.
Therefore, the marginal pdf of Y1 is given by
f(y1) = ∫_{y1}^{1} (1/y2) dy2 = −log(y1) for 0 < y1 < 1, and 0 otherwise,
and the cdf of Y1 is given by
F(y1) = 0 for y1 ≤ 0, F(y1) = y1 − y1 log(y1) for 0 < y1 < 1, and F(y1) = 1 for y1 ≥ 1.
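A quick simulation (not part of the original solution) can be used to check this: with X1, X2 independent Uniform(0, 1), the empirical CDF of Y1 = X1X2 should track y − y log(y) on (0, 1). The sketch below assumes NumPy is available.

import numpy as np

# Monte Carlo check that Y = X1*X2 has CDF F(y) = y - y*log(y) on (0, 1),
# for X1, X2 independent Uniform(0, 1).
rng = np.random.default_rng(0)
x1 = rng.uniform(size=200_000)
x2 = rng.uniform(size=200_000)
y = x1 * x2

for t in (0.1, 0.3, 0.5, 0.9):
    empirical = np.mean(y <= t)          # empirical CDF at t
    theoretical = t - t * np.log(t)      # t - t*log(t)
    print(t, round(empirical, 4), round(theoretical, 4))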
4. Let 13 cards be taken, at random and without replacement, from an ordinary deck of playing
cards. If X is the number of spades in these 13 cards, find the pmf of X. If, in addition, Y
is the number of hearts in these 13 cards, find the probability P (X = 2, Y = 5). What is the
joint pmf of (X, Y ) ?
Solution: Since the sampling is done without replacement, the distribution is not binomial; X is hypergeometric. We have to find P(X = x) for x ∈ {0, 1, ..., 13}:
P(X = x) = C(13, x) C(39, 13 − x) / C(52, 13),
where C(n, k) denotes the binomial coefficient "n choose k".
The joint pmf of (X, Y) is given by
P(X = x, Y = y) = C(13, x) C(13, y) C(26, 13 − x − y) / C(52, 13), for x ≥ 0, y ≥ 0, x + y ≤ 13.
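These pmfs can be evaluated with SciPy's hypergeometric distributions. The sketch below (an illustration, not part of the assignment, assuming SciPy ≥ 1.5 for multivariate_hypergeom) computes P(X = 2) and P(X = 2, Y = 5), the latter via the multivariate hypergeometric with groups of 13 spades, 13 hearts, and 26 other cards.

from scipy.stats import hypergeom, multivariate_hypergeom

# P(X = 2): 2 spades among 13 cards drawn from 52 (13 spades, 39 non-spades).
# hypergeom.pmf(k, M, n, N): population M, successes n, draws N.
p_x2 = hypergeom.pmf(2, 52, 13, 13)

# P(X = 2, Y = 5): 2 spades, 5 hearts, 6 other cards in the 13-card hand.
p_x2_y5 = multivariate_hypergeom.pmf(x=[2, 5, 6], m=[13, 13, 26], n=13)

print(p_x2, p_x2_y5)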
5. Let X1 and X2 be two random variables with joint pmf P(x1, x2) = (x1 + x2)/12, for x1 = 1, 2, x2 = 1, 2, zero elsewhere. Compute E(X1), E(X1²), E(X2), E(X2²), E(X1X2), E(X1X2²) and E(X1|X2 = 1), E(X2|X1 = 1). Is E(X1X2) = E(X1)E(X2)? Find E(2X1 − 6X2² + 7X1X2).
Solution: Coming soon.
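While the written solution is pending, the requested quantities can be checked by direct enumeration over the four support points. The sketch below is a generic illustration of that computation, not the official solution.

from itertools import product

# Enumerate the joint pmf P(x1, x2) = (x1 + x2)/12 on {1, 2} x {1, 2}
# and compute moments by direct summation.
pmf = {(x1, x2): (x1 + x2) / 12 for x1, x2 in product((1, 2), repeat=2)}

def E(g):
    """Expectation of g(x1, x2) under the joint pmf."""
    return sum(g(x1, x2) * p for (x1, x2), p in pmf.items())

print(E(lambda x1, x2: x1))                          # E(X1)
print(E(lambda x1, x2: x1 * x2))                     # E(X1 X2)
print(E(lambda x1, x2: 2*x1 - 6*x2**2 + 7*x1*x2))    # E(2X1 - 6X2^2 + 7X1X2)

# Conditional expectation E(X1 | X2 = 1): renormalise the slice x2 = 1.
p_x2_1 = sum(p for (x1, x2), p in pmf.items() if x2 == 1)
print(sum(x1 * p for (x1, x2), p in pmf.items() if x2 == 1) / p_x2_1)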
6. Let X1 and X2 have the joint pdf f(x1, x2) given below. Find the positive constant c and the pdf of Y = λX1.
f(x1, x2) = c e^{−λ(x1 + x2)} for 0 < x1 < x2 < ∞ (with λ > 0), and 0 otherwise.
Solution: Coming soon.
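While the written solution is pending, the normalizing constant can be found by requiring the density to integrate to 1 over 0 < x1 < x2 < ∞. The SymPy sketch below illustrates that computation under the stated support; it is an illustration, not the official solution.

import sympy as sp

# Solve for c from the normalization condition
#   integral_0^inf integral_{x1}^inf c*exp(-lam*(x1 + x2)) dx2 dx1 = 1.
x1, x2, lam, c = sp.symbols("x1 x2 lam c", positive=True)

inner = sp.integrate(sp.exp(-lam * (x1 + x2)), (x2, x1, sp.oo))
total = sp.integrate(inner, (x1, 0, sp.oo))
print(sp.solve(sp.Eq(c * total, 1), c))   # expected: [2*lam**2]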
7. Let X1 and X2 be continuous random variables with the joint probability density function
fX1 ,X2 (x1 , x2 ), −∞ < xi < ∞ for i = 1, 2. Let Y1 = X1 + X2 and Y2 = X2 .
(a) Find the joint pdf of Y = (Y1 , Y2 ).
(b) Show that
f_{Y1}(y1) = ∫_{−∞}^{∞} f_{X1,X2}(y1 − y2, y2) dy2,
which is sometimes called the convolution formula.
Solution: Coming soon.
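While the written solution is pending, the formula in part (b) can be sanity-checked numerically: for X1, X2 independent standard normals (an assumed example), the integral of f(y1 − y2) f(y2) over y2 should reproduce the N(0, 2) density of Y1 = X1 + X2. A sketch assuming NumPy and SciPy:

import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

# Check the convolution formula for X1, X2 iid N(0, 1):
# f_{Y1}(y1) = integral of f_X1(y1 - y2) * f_X2(y2) dy2
# should equal the N(0, 2) density of Y1 = X1 + X2.
y1 = 0.7
conv, _ = quad(lambda y2: norm.pdf(y1 - y2) * norm.pdf(y2), -np.inf, np.inf)
print(conv)                                   # numerical convolution
print(norm.pdf(y1, loc=0, scale=np.sqrt(2)))  # exact N(0, 2) density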
8. Let f(x1, x2) = c x1² x2⁴, 0 < x1 < x2 < 1, zero elsewhere, be the joint pdf of X1 and X2. Find the positive constant c.
(a) Find the conditional mean and variance of X1 , given X2 = x2 , 0 < x2 < 1.
(b) Find the distribution of Y = E(X1 |X2 ).
(c) Determine E(Y ) and var(Y ) and compare these to E(X1 ) and var(X1 ), respectively.
Solution: Since the function f(x1, x2) is a density function, it should integrate to 1. Thus
∫_0^1 ( ∫_{x1}^1 c x1² x2⁴ dx2 ) dx1 = 1,
which simplifies to c = 24. The conditional pdf of X1 given X2 = x2 is
f(x1|x2) = f(x1, x2) / ∫_0^{x2} f(x1, x2) dx1 = 3x1²/x2³ for 0 < x1 < x2, and zero elsewhere.
Now, Y = E(X1|X2 = x2) = ∫_0^{x2} x1 f(x1|x2) dx1, and the conditional variance can be obtained by also finding E(X1²|X2 = x2) = ∫_0^{x2} x1² f(x1|x2) dx1.
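The constant and the conditional moments can be verified symbolically. The SymPy sketch below is an illustrative check, not part of the original write-up; it recovers c = 24 and evaluates the conditional mean and variance of X1 given X2 = x2.

import sympy as sp

x1, x2 = sp.symbols("x1 x2", positive=True)
c = sp.symbols("c", positive=True)

# Normalization over 0 < x1 < x2 < 1 gives c.
total = sp.integrate(sp.integrate(c * x1**2 * x2**4, (x2, x1, 1)), (x1, 0, 1))
c_val = sp.solve(sp.Eq(total, 1), c)[0]
print(c_val)                                   # expected: 24

# Conditional pdf of X1 given X2 = x2 and its first two moments.
joint = c_val * x1**2 * x2**4
marginal_x2 = sp.integrate(joint, (x1, 0, x2))
cond = sp.simplify(joint / marginal_x2)        # expected: 3*x1**2/x2**3
mean = sp.integrate(x1 * cond, (x1, 0, x2))    # E(X1 | X2 = x2)
second = sp.integrate(x1**2 * cond, (x1, 0, x2))
print(cond, mean, sp.simplify(second - mean**2))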
9. Let X1 and X2 be two random variables such that the conditional distributions and means
exist. Show that:
(a) E(u(X1) + v(X2)|X2) = E(u(X1)|X2) + v(X2)
(b) E( u(X1)/v(X2) | X2 ) = E(u(X1)|X2) / v(X2)
Note that these results hold for both continuous and discrete random variables.
Solution: Since v(X2) is a constant after conditioning on X2, E(u(X1) + v(X2)|X2) = E(u(X1)|X2) + E(v(X2)|X2) = E(u(X1)|X2) + v(X2). This can also be shown using integration. By the same argument, E( u(X1)/v(X2) | X2 ) = (1/v(X2)) E(u(X1)|X2) = E(u(X1)|X2) / v(X2).
10. Let f (x) and F (x) denote, respectively, the pdf and the cdf of the random variable X. The
conditional pdf of X, given X > x0, x0 a fixed number, is defined by
f(x|X > x0) = f(x) / (1 − F(x0)) for x > x0, and 0 otherwise.
This kind of conditional pdf finds application in a problem of time until death, given survival
until time x0 .
(a) Show that f (x|X > x0 ) is a pdf.
(b) Let f (x) = λe−λx , 0 < x < ∞ and zero elsewhere. Compute P (X > 2|X > 1).
Solution: It is trivial that f(x|X > x0) ≥ 0 for all x > x0. Now
∫_{x0}^{∞} f(x|X > x0) dx = ∫_{x0}^{∞} f(x)/(1 − F(x0)) dx = ( ∫_{x0}^{∞} f(x) dx ) / (1 − F(x0)) = 1.
For part (b), use the density f(x) = λe^{−λx} to compute
P(X > 2|X > 1) = ( ∫_2^{∞} f(x) dx ) / (1 − F(1)).
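Carrying out the integral for the exponential density (a short completion of the computation above):
P(X > 2|X > 1) = ( ∫_2^{∞} λe^{−λx} dx ) / (1 − F(1)) = e^{−2λ}/e^{−λ} = e^{−λ},
which also follows directly from the memoryless property of the exponential distribution.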
11. Two line segments, each of length 2cm, are placed along the x-axis. The midpoint of the first
is between x = 0 and x = 14 and that of the second is between x = 6 and x = 20. Assuming
independence and uniform distributions for these midpoints, find the probability that the line
segments overlap.
Solution: Let X1 and X2 be the locations of the midpoints of the two line segments; then
X1 ∼ unif(0, 14) and X2 ∼ unif(6, 20).
If we focus on the movement of X1, the two segments can overlap only if 4 < X1 < 14, because X2 > 6. Given X2, the two segments overlap if X2 − 2 < X1 < X2 + 2 (with X1 also restricted to (0, 14)). Since the two random variables are independent, the probability of overlap can be evaluated as
P(overlap) = ∫_6^{16} ( ∫_{x2−2}^{min(x2+2, 14)} (1/14)(1/14) dx1 ) dx2,
where the inner upper limit is capped at 14 because X1 cannot exceed 14, and x2 ranges only up to 16 because beyond that X2 − 2 > 14 and no overlap is possible.
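A Monte Carlo check of this probability (an illustration, not part of the original solution; with the capped limits above, the integral works out to 32/196 = 8/49 ≈ 0.163):

import numpy as np

# Monte Carlo estimate of the probability that two length-2 segments overlap,
# i.e. |X1 - X2| < 2, with X1 ~ Uniform(0, 14) and X2 ~ Uniform(6, 20).
rng = np.random.default_rng(0)
n = 1_000_000
x1 = rng.uniform(0, 14, size=n)
x2 = rng.uniform(6, 20, size=n)
print(np.mean(np.abs(x1 - x2) < 2))   # should be close to 8/49 ~ 0.1633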
12. Let the random variables X1 and X2 have the joint pdf
f(x1, x2) = 1/π for (x1 − 1)² + (x2 + 2)² < 1, and 0 otherwise.
Are X1 and X2 independent ? Explain.
Solution: The marginal pdfs are
f(x1) = (2/π) √(1 − (x1 − 1)²) for x1 ∈ (0, 2), and 0 otherwise,
f(x2) = (2/π) √(1 − (x2 + 2)²) for x2 ∈ (−3, −1), and 0 otherwise.
Since the joint pdf is not the product of the two marginals, X1 and X2 are not independent.
13. Let X1, X2, X3 and X4 be four independent random variables, each with pdf
f(x) = λe^{−λx} for 0 < x < ∞, and 0 otherwise.
If Y is the minimum of these four variables, find the cdf and the pdf of Y .
Solution: You have to find the pdf and cdf of the minimum order statistic X(1).
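As a hint toward that computation (this is the standard route for the minimum of independent variables, sketched here rather than taken from the original solution): for y > 0,
P(Y > y) = P(X1 > y, ..., X4 > y) = [P(X1 > y)]⁴ = e^{−4λy},
so F_Y(y) = 1 − e^{−4λy} and f_Y(y) = 4λ e^{−4λy} for y > 0, with F_Y(y) = 0 for y ≤ 0.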
14. Let f(x1, x2, x3) be the joint pdf of the multivariate random variable X = (X1, X2, X3):
f(x1, x2, x3) = e^{−(x1 + x2 + x3)} for 0 < xi < ∞, i = 1, 2, 3, and 0 otherwise.
(a) Compute P (X1 < X2 < X3 ) and P (X1 = X2 < X3 ).
(b) Compute E[X1 |(X2 , X3 )].
(c) Are these random variables independent ?
Solution: Since the joint pdf is separable, i.e., f(x1, x2, x3) = f(x1)f(x2)f(x3), the Xi's are independent. For part (b), first find the conditional pdf of X1 given (X2, X3), then take its expectation. Note that the Xi's are continuous, which implies that P(X1 = X2 < X3) = 0. The most difficult part of this problem is to figure out the limits of integration:
P(X1 < X2 < X3) = ∫_0^{∞} ( ∫_{x1}^{∞} ( ∫_{x2}^{∞} e^{−(x1+x2+x3)} dx3 ) dx2 ) dx1.
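The triple integral can be evaluated symbolically as a check (an illustration, not part of the original write-up); by the symmetry of the three iid coordinates one also expects P(X1 < X2 < X3) = 1/6. A SymPy sketch:

import sympy as sp

# Evaluate P(X1 < X2 < X3) for the joint pdf exp(-(x1 + x2 + x3)) on (0, inf)^3.
x1, x2, x3 = sp.symbols("x1 x2 x3", positive=True)
pdf = sp.exp(-(x1 + x2 + x3))

inner = sp.integrate(pdf, (x3, x2, sp.oo))
middle = sp.integrate(inner, (x2, x1, sp.oo))
print(sp.integrate(middle, (x1, 0, sp.oo)))   # expected: 1/6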
15. Let X1, X2, X3 be iid (independent and identically distributed) with common pdf f(x) = λe^{−λx}, x > 0, zero elsewhere. Find the joint pdf of Y = (Y1, Y2, Y3), where Y1 = X1/(X1 + X2), Y2 = X1/X2, and Y3 = X2/(X2 + X3). Are Y1, Y2 and Y3 mutually independent?
Solution: (Oops!) The question is wrong. The transformation is not invertible and therefore you cannot find the joint distribution of Y1, Y2 and Y3 this way.
Lesson to learn: Given a set of transformations Yi = gi(X1, X2, ...), the joint distribution of Y = (Y1, ...) can be obtained by the change-of-variable technique only if the transformation is invertible.
Determinant of a 3 × 3 matrix: Let J be the matrix with rows (a, b, c), (c, d, e) and (f, g, h). Then, expanding along the first row,
det(J) = |J| = a · det[(d, e), (g, h)] − b · det[(c, e), (f, h)] + c · det[(c, d), (f, g)],
where det[(p, q), (r, s)] = ps − qr denotes the determinant of the 2 × 2 matrix with rows (p, q) and (r, s).