Practice problems
1. What are the exponential, gamma, binomial, Bernoulli, and negative binomial distributions?
2. Explain the memoryless property of the exponential distribution. Prob(X > b | X > a) = Prob(X > b − a) for b > a. This means that just because you managed to reach 70, that is no evidence that you are exceptionally healthy in this model (but it also means it is a fresh start!).
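The memoryless property can be checked by simulation; a minimal sketch (rate 1 and the thresholds a = 1, b = 2 are arbitrary choices):

```python
import random

# Memoryless property: among exponential samples that exceed a,
# the fraction exceeding b should match Prob(X > b - a) = exp(-(b - a)).
random.seed(0)
a, b = 1.0, 2.0
samples = [random.expovariate(1.0) for _ in range(300_000)]

survivors = [x for x in samples if x > a]
cond = sum(x > b for x in survivors) / len(survivors)   # Prob(X > b | X > a)
fresh = sum(x > b - a for x in samples) / len(samples)  # Prob(X > b - a)

print(cond, fresh)  # both close to exp(-1) ≈ 0.368
```

Both estimates agree up to Monte Carlo noise, which is the "fresh start" in action.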
3. What do you think of when you hear "central limit theorem"? Approximation of binomial coefficients, a universality property of sums of independent random variables, moment generating functions. It means, for example, that

   lim_{n→∞} E e^{(t/√n) Σ_{j=1}^n (X_j − EX_j)/σ_j} = e^{t²/2}

for independent random variables with uniformly bounded moments.
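For fair-coin flips the left-hand side has a closed form: with X_j Bernoulli(1/2) and σ_j = 1/2, each term (X_j − EX_j)/σ_j is ±1 with equal probability, so the expectation equals cosh(t/√n)^n. A quick numerical sketch of the convergence (t = 0.5 is an arbitrary choice):

```python
import math

# For X_j ~ Bernoulli(1/2), (X_j - 1/2)/(1/2) = ±1 with equal probability,
# so E exp((t/sqrt(n)) * sum_j (X_j - EX_j)/sigma_j) = cosh(t/sqrt(n))**n.
def mgf_standardized_sum(t, n):
    return math.cosh(t / math.sqrt(n)) ** n

t = 0.5
limit = math.exp(t ** 2 / 2)  # the CLT limit e^{t^2/2}
for n in (10, 1000, 1_000_000):
    print(n, mgf_standardized_sum(t, n), limit)
```

Since ln cosh(x) ≤ x²/2, the finite-n values approach the limit from below.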
4. Estimate the probability of winning 10 times out of 20, and 100 times out of 200, when you flip a coin with Prob(up) = p. Does a table of the error function for N(0, 1) help? We get

   P_10 = C(20,10) p^10 (1 − p)^10 ≈ 0.176

and

   P_100 = C(200,100) p^100 (1 − p)^100 ≈ 0.056 .

I think C(20,10) a calculator can still manage, and C(200,100) is probably okay. The approximated values are calculated using p = 1/2. Now the central limit theorem tells us that

   Prob(|Σ_{j=1}^n (X_j − EX_j)| > √n σ a) ≈ Prob(|g| > a) ≤ e^{−a²/2}/(√(2π) a)

with g an N(0,1) variable. In our case we need √n σ a = 1 in order to calculate 1 − P_10 and 1 − P_100. For n = 20 and σ = √(1/4) = 1/2 we get a = 2/√20 ≈ 0.45 in the first case, and a = 2/√200 = √2/10 ≈ 0.14 for n = 200. For small numbers a the estimate Prob(|g| > a) ≤ e^{−a²/2}/(√(2π) a) is not helpful. Thus instead, we look at

   Prob(|Σ_{j=1}^n (X_j − EX_j)| ≤ √n σ a) ≈ Prob(|g| ≤ a) ≈ 2a/√(2π) = 4/√(2πn) .

Thus we expect the probability to go like 1/√n for n large. Actually the value for winning 1000 times out of 2000 is 0.01784. (Maybe I forgot 2π.)
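For p = 1/2 these numbers can be checked exactly with integer arithmetic; a sketch comparing C(2n,n)/4^n with the asymptotic 1/√(πn):

```python
from fractions import Fraction
from math import comb, pi, sqrt

# Exact probability of exactly n heads in 2n fair-coin flips,
# compared with the local central-limit approximation 1/sqrt(pi*n).
def p_exact(n):
    return float(Fraction(comb(2 * n, n), 4 ** n))

for n in (10, 100, 1000):
    print(n, p_exact(n), 1 / sqrt(pi * n))
```

The n = 1000 row reproduces the 0.0178 quoted above and shows the 1/√n decay directly.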
5. Find the distribution of Z given that the moment generating function is
i) 1/3 + (2/3) e^t ,
ii) e^{t/3 + t²/3 + e^{t²} − 1 + e^t − 1} ,
iii) e^{1/√(1−t) − 1} .

ad i) Prob(Z = 0) = 1/3, Prob(Z = 1) = 2/3.
ad ii) Z = Z1 + Z2 + Z3 + Z4, all independent. We need

   M_Z1(t) = e^{t/3} ,   M_Z2(t) = e^{t²/3} .

Thus Z1 = 1/3 (a constant). If Z2 is N(0, σ²) distributed, then Z2 = σY with Y N(0,1), and hence

   M_Z2(t) = E e^{tσY} = e^{σ²t²/2} .

Thus σ² = 2/3 does the job. For Z3 we choose Z3 = Σ_{j=1}^N V_j, where N is 1-Poisson and the V_j are N(0, 2) (to kill the 2); then M_Z3(t) = e^{E e^{tV} − 1} = e^{e^{t²} − 1}. Finally Z4 is 1-Poisson, with M_Z4(t) = e^{e^t − 1}.
ad iii) For g² with g N(0,1) we have E e^{tg²} = 1/√(1 − 2t); thus Z = Σ_{j=1}^N Y_j, where N is 1-Poisson and the Y_j are iid (1/2)g².
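The normal building block in ii) can be sanity-checked by numerical integration: for Z2 ~ N(0, 2/3) the integral E e^{tZ2} should equal e^{t²/3}. A sketch (the grid bounds, step count, and t = 0.7 are arbitrary choices):

```python
import math

# Numerically integrate E[e^{t Z}] for Z ~ N(0, sigma2) with the trapezoid
# rule and compare with the closed form exp(sigma2 * t^2 / 2).
def mgf_normal_numeric(t, sigma2, lo=-12.0, hi=12.0, steps=200_000):
    h = (hi - lo) / steps
    total = 0.0
    for i in range(steps + 1):
        x = lo + i * h
        w = 0.5 if i in (0, steps) else 1.0  # trapezoid end weights
        density = math.exp(-x * x / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)
        total += w * math.exp(t * x) * density * h
    return total

t, sigma2 = 0.7, 2.0 / 3.0
print(mgf_normal_numeric(t, sigma2), math.exp(sigma2 * t ** 2 / 2))
```

The two printed numbers agree to many digits, confirming σ² = 2/3 matches the factor e^{t²/3}.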
6. Find the moment generating function for Z = Σ_{j=1}^N X_j, where N is 1-Poisson (as before) and the X_j are iid exponential with EX = 3. For an exponential variable with mean μ = EX we have

   E e^{tX} = ∫_0^∞ e^{tx} e^{−x/μ} dx/μ = (1/μ) · 1/(1/μ − t) = 1/(1 − μt) ,

provided that t < 1/μ. Thus

   M_Z(t) = e^{1/(1−μt) − 1} = e^{1/(1−3t) − 1} .
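The exponential MGF integral can be checked numerically, say for t = 0.1 and μ = 3, where 1/(1 − μt) = 1/0.7 (cutoff and step size are arbitrary choices):

```python
import math

# Numerically evaluate E[e^{tX}] = ∫_0^∞ e^{tx} e^{-x/mu} dx / mu
# and compare with the closed form 1/(1 - mu*t), valid for t < 1/mu.
def mgf_exponential_numeric(t, mu, hi=400.0, steps=400_000):
    h = hi / steps
    total = 0.0
    for i in range(steps + 1):
        x = i * h
        w = 0.5 if i in (0, steps) else 1.0  # trapezoid end weights
        total += w * math.exp(t * x) * math.exp(-x / mu) / mu * h
    return total

t, mu = 0.1, 3.0
print(mgf_exponential_numeric(t, mu), 1 / (1 - mu * t))
```

Plugging this into the compound-Poisson formula then gives M_Z(0.1) = e^{1/0.7 − 1}.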
7. Flip a coin with Prob(up) = p. When you see 'up' you flip another coin with Prob(up) = 1/2. When you see 'down' you flip another coin with Prob(up) = 1/4. What is the expected time to see the second coin come up? We have to condition:

   Prob(X2 = up) = p Prob(X2 = up | X1 = up) + (1 − p) Prob(X2 = up | X1 = down)
                 = p/2 + (1 − p)/4 = 1/4 + p/4 .

Since the rounds are independent, the number of rounds until the second coin comes up is geometric with this success probability, so the expected waiting time is 4/(1 + p).
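A quick simulation of the two-stage flip confirms both the success probability (1 + p)/4 and the expected waiting time 4/(1 + p); the choice p = 0.3 and the seed are arbitrary:

```python
import random

# One round: flip coin 1 (prob p of up), then flip coin 2 whose
# up-probability is 1/2 after 'up' and 1/4 after 'down'.
def round_up(p, rng):
    first_up = rng.random() < p
    return rng.random() < (0.5 if first_up else 0.25)

rng = random.Random(0)
p, trials = 0.3, 100_000

success = sum(round_up(p, rng) for _ in range(trials)) / trials

def waiting_time(p, rng):
    t = 1
    while not round_up(p, rng):
        t += 1
    return t

mean_wait = sum(waiting_time(p, rng) for _ in range(trials)) / trials
print(success, (1 + p) / 4)    # ≈ 0.325
print(mean_wait, 4 / (1 + p))  # ≈ 3.08
```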
8. Calculate E(X|Y) and E(Y|X) in the following examples:
a) f_{X,Y}(x, y) = c(x + 1 + y)², 0 ≤ x, y ≤ 1.
b) f_{X,Y}(x, y) = c((x + 1)² + y²), 0 ≤ x, y ≤ 1.
Why don't we have to calculate c?
Recall that E(X|Y) = g(Y) where

   g(y) = E(X_y) = ∫ x f_{X|y}(x) dx .

In order to find that we need

   f_{X|y}(x) = f_{XY}(x, y)/f_Y(y) .

This means we have to calculate

   f̃_Y(y) = ∫_0^1 f̃_{X,Y}(x, y) dx .

Here f̃ means everything without the annoying c; the constant c cancels in the ratio f_{XY}/f_Y, which is why we never have to compute it. Thus we get

   f̃_Y(y) = ∫_0^1 (x + 1 + y)² dx = ∫_0^1 (x² + 1 + y² + 2x + 2y + 2xy) dx = 7/3 + 3y + y² .

Thus

   f_{X|y}(x) = (x + 1 + y)²/(7/3 + 3y + y²) .

Similarly

   ∫_0^1 x(x + 1 + y)² dx = ∫_0^1 (x³ + x + xy² + 2x² + 2xy + 2x²y) dx
                          = 1/4 + 1/2 + y²/2 + 2/3 + y + 2y/3 = 17/12 + 5y/3 + y²/2 .

Hence

   g(y) = (17/12 + 5y/3 + y²/2)/(7/3 + 3y + y²) .

For the other example, we get

   f̃_Y(y) = ∫_0^1 ((x + 1)² + y²) dx = (8 − 1)/3 + y² = 7/3 + y²

and

   ∫_0^1 x f̃_{X,Y}(x, y) dx = ∫_0^1 (x³ + 2x² + x) dx + y²/2 = 1/4 + 2/3 + 1/2 + y²/2 = 17/12 + y²/2 .

Thus

   g(y) = (17/12 + y²/2)/(7/3 + y²) .
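For example a), the conditional expectation can be cross-checked numerically: c cancels in the ratio of two integrals, and Simpson's rule evaluates polynomial integrands exactly. A sketch (the test point y = 1/2 is arbitrary):

```python
# Check E(X | Y=y) for f(x,y) ∝ (x+1+y)^2 on [0,1]^2 by quadrature:
# the normalizing constant c cancels in the ratio of the two integrals.
def simpson(f, a, b, n=2000):  # n must be even
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + i * h) for i in range(1, n, 2))
    s += 2 * sum(f(a + i * h) for i in range(2, n, 2))
    return s * h / 3

y = 0.5
num = simpson(lambda x: x * (x + 1 + y) ** 2, 0.0, 1.0)
den = simpson(lambda x: (x + 1 + y) ** 2, 0.0, 1.0)

g_quad = num / den
g_formula = (17 / 12 + 5 * y / 3 + y ** 2 / 2) / (7 / 3 + 3 * y + y ** 2)
print(g_quad, g_formula)
```

Both evaluations agree to machine precision, since Simpson's rule is exact for cubics.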
9. When Alice sends a signal x, Bob receives an N(x/2 − 1, 1) distributed random variable. Now Alice sends an N(0, 1) signal X. Calculate E(X|Y) and E(Y|X) as far as you can. First things first: E(Y|X) = g(X), where X is N(0, 1) and g(x) = EY_x = x/2 − 1, because Y_x is N(x/2 − 1, 1) distributed. For the other one we complete the square:

   f_{X|y}(x) = f_{Y|x}(y) f_X(x)/f_Y(y) = f_{X,Y}(x, y)/f_Y(y) .

Thus we have to consider

   (y − (x/2 − 1))² + x² = x²/4 − x + 1 + y² − 2(x/2 − 1)y + x²
                         = (5/4)x² − (1 + y)x + 1 + y² + 2y
                         = (5/4)(x² − 2x (2 + 2y)/5 + ((2 + 2y)/5)² − ((2 + 2y)/5)²) + 1 + y² + 2y
                         = (5/4)(x − (2 + 2y)/5)² − (1/5)(1 + 2y + y²) + 1 + y² + 2y
                         = (5/4)(x − (2 + 2y)/5)² + (4/5)(1 + y)² .

Thus X_y is normal N((2 + 2y)/5, 4/5) distributed, and EX_y = (2 + 2y)/5 = g(y).
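Since (X, Y) is jointly Gaussian, E(X|Y) is the least-squares line, so simulating X ~ N(0,1) and Y = X/2 − 1 + noise should recover slope and intercept 2/5. A sketch (seed and sample size are arbitrary choices):

```python
import random

# Simulate Alice/Bob: X ~ N(0,1), Y = X/2 - 1 + noise, noise ~ N(0,1).
# For jointly Gaussian variables E(X|Y) is linear in Y, so the fitted
# least-squares line should approximate E(X | Y=y) = (2 + 2y)/5.
rng = random.Random(1)
n = 200_000
xs = [rng.gauss(0, 1) for _ in range(n)]
ys = [x / 2 - 1 + rng.gauss(0, 1) for x in xs]

mx, my = sum(xs) / n, sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
var_y = sum((y - my) ** 2 for y in ys) / n

slope = cov / var_y          # should be ≈ (1/2) / (5/4) = 0.4
intercept = mx - slope * my  # should be ≈ 0 - 0.4 * (-1) = 0.4
print(slope, intercept)
```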
10. Let (X, Y ) be a joint gaussian random variable with EX = EY = 0 and
!
1 1
cov =
0 1
4
Find the joint distribution function.
Solution: Let X0 , Y0 be independent N (0, 1). Then
X = X0 + Y0
and Y = Y0 has exactly the same covariance matrix, and we determined the
distribution. As for the joint probability density we recall that for
cjk = E(Yj − EYj )(Yk − EYk )
and EYj = 0 = EYk we have
pY1 ,...,Yk = e−(y,Ay)/2
1
(2π)n/2 | det A|1/2
.
In two dimensions matrices are easy to invert. The inverse of
!
c11 c12
C =
c21 c22
is given by
C −1
1
=
det(C)
c22
−c12
−c21
c11
In our case
C
−1
1 −1
=
0
1
!
.
!
.
Moreover, det(C) = 1.
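A small sketch checks both the 2×2 inversion and the construction X = X0 + Y0, Y = Y0 from the solution (its covariance matrix has entries EX² = 2, EXY = 1, EY² = 1):

```python
import random

# Covariance of (X, Y) = (X0 + Y0, Y0) with X0, Y0 independent N(0,1)
# is C = [[2, 1], [1, 1]]; its inverse is [[1, -1], [-1, 2]], det C = 1.
C = [[2.0, 1.0], [1.0, 1.0]]
det = C[0][0] * C[1][1] - C[0][1] * C[1][0]
C_inv = [[C[1][1] / det, -C[0][1] / det],
         [-C[1][0] / det, C[0][0] / det]]
print(det, C_inv)  # 1.0 [[1.0, -1.0], [-1.0, 2.0]]

# Monte Carlo check of the covariance entries (seed is arbitrary).
rng = random.Random(2)
n = 200_000
pairs = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(n)]
samples = [(x0 + y0, y0) for x0, y0 in pairs]
exx = sum(x * x for x, _ in samples) / n
exy = sum(x * y for x, y in samples) / n
eyy = sum(y * y for _, y in samples) / n
print(exx, exy, eyy)  # ≈ 2, 1, 1
```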
11. Balls of four types are drawn with replacement; the coupon-collecting problem requires getting three of the four types. The probabilities are 1/3 for type 1 and type 2, and 1/6 for type 3 and type 4. What is the expected waiting time? Give two solutions.
Solutions: Let us consider the possible orders of getting three different coupons:

   (1, 2, 3), (1, 2, 4), (1, 3, 2), (1, 3, 4), (1, 4, 2), (1, 4, 3)

are the possible orders starting with 1. There are 24 such orders in total. Assuming the first possibility we get

   EX 1_A(1,2,3) = Σ_{k1,k2 ≥ 1} (k1 + k2) P(1,…,1, 2,…,2, 3)      (k1 ones, then k2 twos, then a three)
                 = Σ_{k1,k2 ≥ 1} (k1 + k2) (1/3)^k1 (1/3)^k2 (1/6)
                 = (1/6) (Σ_{k1} k1 (1/3)^k1)(Σ_{k2} (1/3)^k2) + (1/6) (Σ_{k2} k2 (1/3)^k2)(Σ_{k1} (1/3)^k1) .

We know that Σ_{k=1}^∞ k q^{k−1} = 1/(1 − q)², so

   Σ_{k=1}^∞ k (1/3)^k = (1/3) · 1/(2/3)² = 3/4 ,

and we note that

   Σ_{k=1}^∞ q^k = Σ_{k=0}^∞ q^k − 1 = 1/(1 − q) − 1 = q/(1 − q) ,

so Σ_{k≥1} (1/3)^k = 1/2. Hence

   EX 1_A(1,2,3) = 2 · (1/6)(3/4)(1/2) = 1/8 .

This means EX 1_A(1,2,4) = EX 1_A(1,2,3) = 1/8. We also have

   EX 1_A(1,3,2) = EX 1_A(1,4,2)   and   EX 1_A(1,4,3) = EX 1_A(1,3,4) .

Let us calculate the latter:

   EX 1_A(1,4,3) = Σ_{k1,k2 ≥ 1} (k1 + k2) P(1,…,1, 4,…,4, 3)      (k1 ones, then k2 fours, then a three)
                 = Σ_{k1,k2 ≥ 1} (k1 + k2) (1/3)^k1 (1/6)^k2 (1/6)
                 = (1/6) (Σ_{k1} k1 (1/3)^k1)(Σ_{k2} (1/6)^k2) + (1/6) (Σ_{k2} k2 (1/6)^k2)(Σ_{k1} (1/3)^k1) .

With Σ_{k≥1} k (1/6)^k = (1/6) · 1/(5/6)² = 6/25 and Σ_{k≥1} (1/6)^k = 1/5 this gives

   EX 1_A(1,4,3) = (1/6)(3/4)(1/5) + (1/6)(6/25)(1/2) = 1/40 + 1/50 = 9/200 .

All the other cases are similar.
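A systematic second approach to the waiting time is a recursion over the set of types seen so far: from state S the next draw costs 1 and either stays in S or adds a new type. A sketch with exact rational arithmetic (the routine is an illustration, not from the original notes):

```python
from fractions import Fraction

# Expected number of draws to collect 3 distinct types out of 4, drawing
# with replacement with probabilities p[i]. Recurse over the set S of types
# seen so far: E_S = (1 + sum_{i not in S} p_i * E_{S+i}) / (1 - p(S)),
# with E_S = 0 once |S| = 3.
p = [Fraction(1, 3), Fraction(1, 3), Fraction(1, 6), Fraction(1, 6)]

def expected_time(seen=frozenset()):
    if len(seen) >= 3:
        return Fraction(0)
    stay = sum(p[i] for i in seen)          # probability of drawing a seen type
    acc = Fraction(1)                       # the draw just made
    for i in range(4):
        if i not in seen:
            acc += p[i] * expected_time(seen | {i})
    return acc / (1 - stay)

print(expected_time())  # 47/10
```

For the probabilities (1/3, 1/3, 1/6, 1/6) the recursion gives an expected waiting time of 47/10 = 4.7 draws.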