Solutions to the Practice Problems for the Final Exam
1. Problem: Suppose that X and Y are independent exponential random variables with
parameters λ_X and λ_Y respectively.
(a) Find E(2X - 3Y) and Var(2X - 3Y) in terms of λ_X and λ_Y.
(b) Find E[(2X - 3Y)^2] in terms of λ_X and λ_Y.
(c) Find the correlation of X and X + Y in terms of λ_X and λ_Y.
(d) Assume λ_X = 3 and λ_Y = 1. Find P(X + Y > 2).
Solution: (a) E(2X - 3Y) = 2E(X) - 3E(Y) = 2/λ_X - 3/λ_Y. Because X and Y are
independent, Var(2X - 3Y) = 4Var(X) + 9Var(Y) = 4/λ_X^2 + 9/λ_Y^2.
(b) E[(2X - 3Y)^2] = Var(2X - 3Y) + [E(2X - 3Y)]^2
= 4/λ_X^2 + 9/λ_Y^2 + (2/λ_X - 3/λ_Y)^2
= 8/λ_X^2 - 12/(λ_X λ_Y) + 18/λ_Y^2.
(c) Cov(X, X + Y) = Cov(X, X) + Cov(X, Y) = Var(X) + 0 = 1/λ_X^2 and
Var(X + Y) = Var(X) + Var(Y) + 2Cov(X, Y) = 1/λ_X^2 + 1/λ_Y^2. Thus,
ρ(X, X + Y) = Cov(X, X + Y)/(σ_X σ_{X+Y})
= (1/λ_X^2)/[(1/λ_X)√(1/λ_X^2 + 1/λ_Y^2)]
= λ_Y/√(λ_X^2 + λ_Y^2).
(d) P(X + Y > 2) = 1 - P(X + Y ≤ 2)
= 1 - ∫_0^2 ∫_0^{2-y} 3e^{-3x} e^{-y} dx dy
= 1 - ∫_0^2 e^{-y} (1 - e^{-3(2-y)}) dy
= (3/2)e^{-2} - (1/2)e^{-6}.
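A quick Monte Carlo check of (d) follows; this is an illustrative numpy sketch, not
part of the original solutions.

import numpy as np

rng = np.random.default_rng(0)
n = 10**6
x = rng.exponential(scale=1/3, size=n)    # rate λ_X = 3, so scale 1/3
y = rng.exponential(scale=1.0, size=n)    # rate λ_Y = 1
print((x + y > 2).mean())                 # simulated P(X + Y > 2)
print(1.5*np.exp(-2) - 0.5*np.exp(-6))    # exact answer, ≈ 0.2018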
2. Problem: Suppose Z1, Z2 and Z3 are independent N(0,1) random variables. Find
(a) P(Z1 < Z2 < Z3);
(b) E(Z1 Z2 Z3);
(c) Var(Z1 Z2 Z3);
(d) P(Z1 > Z3);
(e) P(max(Z1, Z2) > Z3);
(f) P(Z1^2 + Z2^2 > 1);
(g) P(Z1 + Z2 + Z3 ≤ 2);
(h) P(Z1/Z2 < 1);
(i) P(3Z1 - 2Z2 + 4Z3 ≤ 1).
Solution:
(a) 1/6: by symmetry, all 3! = 6 orderings of Z1, Z2, Z3 are equally likely.
(b) Since Z1, Z2, Z3 are independent, E(Z1 Z2 Z3) = E(Z1)E(Z2)E(Z3) = 0.
(c) Var(Z1 Z2 Z3) = E(Z1^2 Z2^2 Z3^2) - (E(Z1 Z2 Z3))^2 = E(Z1^2)E(Z2^2)E(Z3^2) - 0 = 1,
where the second equality follows from Z1, Z2, Z3 being independent.
(d) P(Z1 > Z3) = P(Z1 < Z3) = 1/2 since Z1 - Z3 ~ N(0, 2) is symmetric about 0.
(e) P(max(Z1, Z2) > Z3) = 1 - P(max(Z1, Z2) ≤ Z3) = 1 - P(Z3 is the largest)
= 1 - 1/3 = 2/3.
(f) P(Z1^2 + Z2^2 > 1) = e^{-1/2}, since Z1^2 + Z2^2 is exponential with rate 1/2
(see Example 6.3b, page 282).
(g) Z1 + Z2 + Z3 ~ N(0, 3), so
P(Z1 + Z2 + Z3 ≤ 2) = P((Z1 + Z2 + Z3)/√3 ≤ 2/√3) = Φ(2/√3) ≈ 0.8759.
(h) P(Z1/Z2 < 1) = P(Z1 < Z2, Z2 > 0) + P(Z1 > Z2, Z2 < 0) = 3/8 + 3/8 = 3/4, since
the angle of the point (Z1, Z2) is uniform on (0, 2π).
(i) Note that 3Z1 - 2Z2 + 4Z3 ~ N(0, 29), so
P(3Z1 - 2Z2 + 4Z3 ≤ 1) = P((3Z1 - 2Z2 + 4Z3)/√29 ≤ 1/√29) = Φ(1/√29) ≈ 0.5737.
3. Problem: Suppose that U ~ uniform(0, 2π) and independently V ~ exponential(1).
Show that
X = √(2V) cos U,  Y = √(2V) sin U
are independent standard normal random variables.
Solution: We use the method of Section 6.7 for finding the joint probability distribution
of functions of random variables. The transformation from (U, V) to (X, Y) is
g1(u, v) = √(2v) cos u,  g2(u, v) = √(2v) sin u,
and the inverse transformation is

u = h1(x, y) = tan^{-1}(y/x)        if x > 0,
               tan^{-1}(y/x) + π    if x < 0,
               (π/2) sign(y)        if x = 0, y ≠ 0,
               0                    if x = 0, y = 0,

and v = h2(x, y) = (x^2 + y^2)/2.

The Jacobian of the transformation is

J(u, v) = det [ -√(2v) sin u    (2v)^{-1/2} cos u ]
              [  √(2v) cos u    (2v)^{-1/2} sin u ]
        = -sin^2 u - cos^2 u = -1.
Thus, using formula (7.1) on page 300,
f_{X,Y}(x, y) = f_{U,V}(h1(x, y), h2(x, y)) |J(u, v)|^{-1}
= (1/(2π)) exp(-(x^2 + y^2)/2).
This is the joint density of two independent standard normal random variables.
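This is the Box-Muller transform. A minimal simulation sketch (assuming numpy; not
part of the original solution) that generates (X, Y) this way and checks the first two
moments and the correlation:

import numpy as np

rng = np.random.default_rng(0)
u = rng.uniform(0.0, 2*np.pi, size=10**6)   # U ~ uniform(0, 2π)
v = rng.exponential(1.0, size=10**6)        # V ~ exponential(1)
x = np.sqrt(2*v)*np.cos(u)
y = np.sqrt(2*v)*np.sin(u)
print(x.mean(), x.var())          # ≈ 0, ≈ 1
print(y.mean(), y.var())          # ≈ 0, ≈ 1
print(np.corrcoef(x, y)[0, 1])    # ≈ 0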
4. Problem: I toss a coin which lands heads with probability p = 1/3. Let N_H be the
number of tosses until I get a head, N_HH the number of tosses until I get two heads in
a row, and N_HHH the number of tosses until I get three heads in a row. Find
(a) E(N_H);
(b) E(N_HH);
(c) E(N_HHH);
(d) Generalize to find the expected number of tosses to obtain m heads in a row.
Solution:
(a) Note that N_H ~ Geometric(p), so E(N_H) = 1/p = 3.
(b) Condition on whether the first toss was a head or a tail. Let Y be the number of heads
in the first toss, so Y = 1 with probability p and Y = 0 with probability 1 - p. Then
E(N_HH) = E(E(N_HH | Y)) = pE(N_HH | Y = 1) + (1 - p)E(N_HH | Y = 0).
Note that E(N_HH | Y = 0) = 1 + E(N_HH) and E(N_HH | Y = 1) = 2p + (2 + E(N_HH))(1 - p).
Let q = 1 - p and x = E(N_HH). Then it follows that
x = p(2p + (2 + x)(1 - p)) + (1 - p)(1 + x) = q + qx + 2p^2 + 2pq + xpq,
and hence
E(N_HH) = (q + 2pq + 2p^2)/(1 - q - pq) = (1 + p)/p^2 = 1/p + 1/p^2 = 3 + 9 = 12.
(c) Let N_T be the number of tosses to obtain the first tail. Then N_HHH = 3 if N_T ≥ 4,
and N_HHH = k + N, where the random variable N has the same distribution as N_HHH, if
N_T = k, k = 1, 2, 3. Set x = E(N_HHH). Then
x = 3p^3 + (3 + x)p^2 q + (2 + x)pq + (1 + x)q
  = q + 2pq + 3p^2 q + 3p^3 + (q + pq + p^2 q)x,
and so
x = (q + 2pq + 3p^2 q + 3p^3)/(1 - q - pq - p^2 q) = (1 + p + p^2)/p^3
  = 1/p + 1/p^2 + 1/p^3 = 3 + 9 + 27 = 39.
(d) By the same conditioning, x = Σ_{k=1}^m (k + x) p^{k-1} q + m p^m, and so

x = [Σ_{k=1}^m k p^{k-1} q + m p^m] / [1 - Σ_{k=1}^m p^{k-1} q].

The denominator is 1 - q(1 - p^m)/(1 - p) = 1 - (1 - p^m) = p^m, and the numerator
telescopes:
Σ_{k=1}^m k p^{k-1} q + m p^m = Σ_{k=0}^{m-1} p^k - m p^m + m p^m = (1 - p^m)/q.
Hence
x = (1 - p^m)/(q p^m) = Σ_{k=1}^m 1/p^k = 3 + 3^2 + ... + 3^m = (3^{m+1} - 3)/2.
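A simulation sketch (illustrative, not from the original solutions) estimating the
expected number of tosses until m heads in a row with p = 1/3, compared with
(3^{m+1} - 3)/2:

import numpy as np

rng = np.random.default_rng(0)

def mean_tosses(m, p=1/3, trials=50_000):
    # Average the number of tosses needed to see m heads in a row.
    total = 0
    for _ in range(trials):
        run = n = 0
        while run < m:
            n += 1
            run = run + 1 if rng.random() < p else 0
        total += n
    return total / trials

for m in (1, 2, 3):
    print(m, mean_tosses(m), (3**(m + 1) - 3)/2)   # ≈ 3, 12, 39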
5. Problem: Let X1, X2, ... be a sequence of iid continuous random variables. Let N ≥ 2
be such that
X1 ≥ X2 ≥ ... ≥ X_{N-1} < X_N.
That is, N is the point at which the sequence stops decreasing. Show that E(N) = e.
Proof: For n ≥ 2,
P(N = n) = P(X1 ≥ X2 ≥ ... ≥ X_{n-1} < X_n)
= P(X1 ≥ X2 ≥ ... ≥ X_{n-1}) P(X_n > X_{n-1} | X1 ≥ X2 ≥ ... ≥ X_{n-1}).
We have P(X1 ≥ X2 ≥ ... ≥ X_{n-1}) = 1/(n-1)! because X1 ≥ X2 ≥ ... ≥ X_{n-1} is one of
(n-1)! equally likely orderings of X1, ..., X_{n-1}. We also have
P(X_n > X_{n-1} | X1 ≥ X2 ≥ ... ≥ X_{n-1}) = 1 - P(X_n is the minimum of {X1, ..., X_n})
= 1 - 1/n = (n-1)/n,
since by symmetry X_n is equally likely to occupy any rank among X1, ..., X_n.
Thus, P(N = n) = [1/(n-1)!][(n-1)/n] = 1/[(n-2)! n], and
E(N) = Σ_{n=2}^∞ n P(N = n) = Σ_{n=2}^∞ 1/(n-2)! = Σ_{i=0}^∞ 1/i! = e.
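A quick simulation check that E(N) ≈ e (illustrative sketch, not from the original
proof):

import numpy as np

rng = np.random.default_rng(0)

def stopping_index():
    # Return the first n ≥ 2 with X_{n-1} < X_n for iid uniforms.
    prev, n = rng.random(), 1
    while True:
        n += 1
        cur = rng.random()
        if cur > prev:
            return n
        prev = cur

print(np.mean([stopping_index() for _ in range(200_000)]), np.e)   # ≈ 2.718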
6. Problem: Let X and Y have the joint density
f(x, y) = c x(y - x) e^{-y},  0 < x < y < ∞.
(a) Find the constant c.
(b) Find the conditional distribution of X given Y = y.
(c) Find the conditional distribution of Y given X = x.
(d) Find E(X | Y = y) and E(Y | X = x).
Solution:
(a) 1 = ∫∫ f(x, y) dx dy = ∫_0^∞ ∫_0^y c x(y - x)e^{-y} dx dy = (c/6) ∫_0^∞ y^3 e^{-y} dy = c,
thus c = 1.
(b) Note that f_Y(y) = ∫_0^y x(y - x)e^{-y} dx = (1/6) y^3 e^{-y}. Hence,
f_{X|Y}(x | y) = f(x, y)/f_Y(y) = 6x(y - x)y^{-3},  0 < x < y.
(c) f_X(x) = ∫_x^∞ x(y - x)e^{-y} dy = x e^{-x}. So
f_{Y|X}(y | x) = f(x, y)/f_X(x) = (y - x)e^{-(y-x)},  x < y < ∞.
(d) E(X | Y = y) = ∫_0^y x f_{X|Y}(x | y) dx = ∫_0^y 6x^2 (y - x)y^{-3} dx = y/2, and
E(Y | X = x) = ∫_x^∞ y f_{Y|X}(y | x) dy = ∫_x^∞ y(y - x)e^{-(y-x)} dy = x + 2.
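Part (b) implies that, given Y = y, X/y has the Beta(2, 2) density 6b(1 - b), and f_Y is
the Gamma(4, 1) density. A simulation sketch built on these two facts (illustrative, not
part of the original solutions):

import numpy as np

rng = np.random.default_rng(0)
n = 10**6
y = rng.gamma(shape=4.0, scale=1.0, size=n)   # f_Y(y) = y^3 e^{-y}/6
x = y * rng.beta(2.0, 2.0, size=n)            # X | Y = y  ~  y * Beta(2, 2)
print(x.mean())    # E(X) = E(E(X|Y)) = E(Y/2) = 2
print(y.mean())    # E(Y) = E(E(Y|X)) = E(X) + 2 = 4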
7. Problem: An urn contains 10 red balls, 8 blue balls and 12 white balls. From this urn,
12 balls are randomly withdrawn. Let X be the number of red, and Y the number of blue,
balls that are withdrawn. Find Cov(X, Y)
(a) by defining appropriate indicator random variables X_i, Y_j such that
X = Σ_{i=1}^{10} X_i and Y = Σ_{j=1}^{8} Y_j;
(b) by conditioning (on either X or Y) to determine E(XY).
Solution:
(a) Number the red balls and the blue balls, and let X_i = 1 if the ith red ball is selected
and X_i = 0 otherwise. Similarly, let Y_j = 1 if the jth blue ball is selected and Y_j = 0
otherwise. Then Cov(X, Y) = Cov(Σ_{i=1}^{10} X_i, Σ_{j=1}^{8} Y_j)
= Σ_{i=1}^{10} Σ_{j=1}^{8} Cov(X_i, Y_j). Now,
E(X_i) = E(Y_j) = 12/30 = 2/5 and
E(X_i Y_j) = P(red ball i and blue ball j are both selected) = C(28, 10)/C(30, 12) = 22/145.
Thus,
Cov(X, Y) = 80[22/145 - (2/5)^2] = -96/145.
(b) We shall calculate E(XY) by conditioning on X. Note that given X, there are 12 - X
additional balls to be selected from among 8 blue and 12 white balls. Hence,
E(XY | X) = X E(Y | X) = X(12 - X)(8/20).
Now since X is a hypergeometric random variable, it follows that E(X) = 12(10/30) = 4
and E(X^2) = Var(X) + (E(X))^2 = 12(18)(1/3)(2/3)/29 + 4^2 = 512/29. As
E(Y) = 12(8/30) = 16/5, we obtain
E(XY) = (2/5)[12E(X) - E(X^2)] = (2/5)(48 - 512/29) = 352/29 and
Cov(X, Y) = 352/29 - 4(16/5) = -96/145.
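A direct simulation of the urn (illustrative, not from the original solutions) confirming
Cov(X, Y) = -96/145 ≈ -0.662:

import numpy as np

rng = np.random.default_rng(0)
urn = np.array([0]*10 + [1]*8 + [2]*12)    # 0 = red, 1 = blue, 2 = white
draws = np.array([rng.permutation(urn)[:12] for _ in range(200_000)])
x = (draws == 0).sum(axis=1)               # number of red balls drawn
y = (draws == 1).sum(axis=1)               # number of blue balls drawn
print(np.cov(x, y)[0, 1], -96/145)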
8. Problem: Let X = 2U + V and Y = 3U + 2V, where U and V are independent standard
normal random variables.
(a) Find P(X - 2Y ≤ 1).
(b) Find the conditional distribution of U given X = x.
(c) Find the conditional distribution of Y given X = x.
Solution:
(a) P(X - 2Y ≤ 1) = P(2U + V - 6U - 4V ≤ 1) = P(-(4U + 3V) ≤ 1)
= P((4U + 3V)/5 ≤ 1/5) = Φ(0.2) ≈ 0.5793,
since 4U + 3V ~ N(0, 25) is symmetric about 0.
(b) Let X' = U - 2V. Then X' ~ N(0, 5) and Cov(X, X') = 2(1) + 1(-2) = 0. So X and X' are
uncorrelated and hence independent (since X and X' are jointly bivariate normal). Now
U = (2/5)X + (1/5)X'. Given X = x, U = (2/5)x + (1/5)X' ~ N(2x/5, 1/5).
(c) Similarly to part (b), Y = 3U + 2V = (8/5)X - (1/5)X'. Given X = 2U + V = x,
Y = (8/5)x - (1/5)X' ~ N(8x/5, 1/5).
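An empirical check of all three parts (illustrative numpy sketch, not from the original
solutions), conditioning on X falling near a fixed value:

import numpy as np

rng = np.random.default_rng(0)
u, v = rng.standard_normal((2, 10**6))
x, y = 2*u + v, 3*u + 2*v
print((x - 2*y <= 1).mean())           # (a): ≈ Φ(0.2) ≈ 0.5793
sel = np.abs(x - 1.0) < 0.02           # condition on X ≈ 1
print(u[sel].mean(), u[sel].var())     # (b): ≈ 2/5 and 1/5
print(y[sel].mean(), y[sel].var())     # (c): ≈ 8/5 and 1/5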
9. Problem: Let U1, U2, ... be a sequence of independent uniform(0, 1) random variables.
For x ∈ (0, 1), let
N(x) = min{n : Σ_{i=1}^n U_i > x}.
Find E[N(x)] by first showing by induction on n that for 0 ≤ x ≤ 1 and all n ≥ 0,
P(N(x) ≥ n + 1) = x^n/n!.
Solution: We first show by induction on n that for 0 ≤ x ≤ 1 and all n ≥ 0,
P(N(x) ≥ n + 1) = x^n/n!. The result is true when n = 0, so assume that
P(N(x) ≥ n) = x^{n-1}/(n - 1)!.
Now,
P(N(x) ≥ n + 1) = ∫_0^1 P{N(x) ≥ n + 1 | U1 = y} dy
= ∫_0^x P{N(x - y) ≥ n} dy   (if y > x then N(x) = 1 given U1 = y)
= ∫_0^x P{N(u) ≥ n} du
= ∫_0^x u^{n-1}/(n - 1)! du   (by the induction hypothesis)
= x^n/n!,
which completes the proof. Then, using the result in Theoretical Exercise 5.2,
E[N(x)] = Σ_{n=0}^∞ P{N(x) > n} = Σ_{n=0}^∞ P{N(x) ≥ n + 1} = Σ_{n=0}^∞ x^n/n! = e^x.
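A simulation check that E[N(x)] = e^x (illustrative sketch, not from the original
solution):

import numpy as np

rng = np.random.default_rng(0)

def n_of_x(x):
    # Number of uniforms needed for the running sum to exceed x.
    s, n = 0.0, 0
    while s <= x:
        s += rng.random()
        n += 1
    return n

for x in (0.25, 0.5, 0.9):
    print(x, np.mean([n_of_x(x) for _ in range(200_000)]), np.exp(x))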