Statistics 116 - Fall 2004
Theory of Probability
Alternate Final Exam, December 7th, 2004
Solutions
Instructions: Answer Q. 1-6. All questions have equal weight. The bonus is
worth the equivalent of one half question. The exam is open book. In addition, you
are allowed a maximum of 3 pages of handwritten notes.
Q. 1) Let X ∼ Geom(p) be a Geometric random variable. Show that X has the
following memoryless property:
For any integers n and m, with m < n
P (X > n|X > m) = P (X > n − m).
Solution:
P (X > n) = P (first n trials were failures) = (1 − p)^n.
Therefore, for n > m,
        P (X > n | X > m) = P (X > n, X > m) / P (X > m)
                          = P (X > n) / P (X > m)
                          = (1 − p)^n / (1 − p)^m
                          = (1 − p)^(n−m)
                          = P (X > n − m).
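As a quick sanity check on the memoryless identity (not part of the original solution), it can be verified by simulation; the values p = 0.3, n = 7, m = 3 below are illustrative choices, not taken from the exam:

```python
import random

# Monte Carlo check of the memoryless property for X ~ Geom(p).
# Illustrative parameters (not from the exam): p = 0.3, n = 7, m = 3.
random.seed(0)
p, n, m, trials = 0.3, 7, 3, 200_000

def geom(p):
    # Number of trials up to and including the first success.
    k = 1
    while random.random() >= p:
        k += 1
    return k

samples = [geom(p) for _ in range(trials)]
gt_m = [x for x in samples if x > m]
lhs = sum(x > n for x in gt_m) / len(gt_m)       # P(X > n | X > m)
rhs = sum(x > n - m for x in samples) / trials   # P(X > n - m)
exact = (1 - p) ** (n - m)                       # (1 - p)^(n - m)
print(lhs, rhs, exact)  # lhs and rhs both close to 0.7^4 = 0.2401
```

Both empirical probabilities agree with (1 − p)^(n−m), as the algebra above predicts.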
Q. 2) The joint density function of X and Y is given by
        f (x, y) = [e^(−(x−y)^2/2y) / √(2πy)] · [y^2 e^(−y) / 2],   −∞ < x < ∞, y ≥ 0.
(a) Find the conditional density fX|Y (x|y) of X given Y = y.
(b) Compute Var(X|Y ).
(c) Compute E(X|Y ).
(d) Compute Var(X).
Solution:
(a) By inspection, it is not hard to see that
        fY (y) = y^2 e^(−y) / 2,   y ≥ 0
    and
        fX|Y (x|y) = e^(−(x−y)^2/2y) / √(2πy)
    is a Normal density with mean y and variance y. Or,
        X|Y = y ∼ N (y, y).
(b) As the conditional distribution is N (Y, Y ),
        Var(X|Y ) = Y.
(c) Similarly,
        E(X|Y ) = Y.
(d)
        Var(X) = E(Var(X|Y )) + Var(E(X|Y ))
               = E(Y ) + Var(Y )
               = E(Y ) + E(Y^2) − E(Y )^2.
    Now,
        E(Y ) = ∫_0^∞ y^3 e^(−y) / 2 dy = Γ(4)/Γ(3) = 3
        E(Y^2) = ∫_0^∞ y^4 e^(−y) / 2 dy = Γ(5)/Γ(3) = 12.
    Therefore,
        Var(X) = 3 + 12 − 3^2 = 6.
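The answer Var(X) = 6 can also be checked by simulation (not part of the original solution): fY is the Gamma(3, 1) density, since Γ(3) = 2, so one can draw Y from Gamma(3, 1) and then X | Y = y from N(y, y):

```python
import random

# Simulation check that Var(X) = 6 when Y ~ Gamma(3, 1) and X | Y = y ~ N(y, y).
# fY(y) = y^2 e^(-y) / 2 is the Gamma(3, 1) density because Gamma(3) = 2.
random.seed(1)
trials = 200_000
xs = []
for _ in range(trials):
    y = random.gammavariate(3, 1)          # Y ~ Gamma(shape=3, scale=1)
    xs.append(random.gauss(y, y ** 0.5))   # X | Y = y ~ N(mean=y, variance=y)

mean = sum(xs) / trials
var = sum((x - mean) ** 2 for x in xs) / trials
print(round(mean, 2), round(var, 2))  # mean close to E(Y) = 3, variance close to 6
```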
Q. 3) Suppose that X is a continuous random variable with density
        fX (x) = αλe^(−λx) + (1 − α)µe^(−µx),   x ≥ 0
where 0 < α < 1 and 0 < λ < µ (in this case the distribution of X is said
to be a mixture of an Exp(λ) and an Exp(µ) distribution).
(a) Compute hX (t), the hazard rate of X.
(b) Recalling that we have assumed λ < µ, show that
lim hX (t) = λ.
t→∞
Solution:
(a)
        hX (t) = fX (t) / (1 − FX (t)).
    Now,
        1 − FX (t) = ∫_t^∞ fX (s) ds
                   = α ∫_t^∞ λe^(−λs) ds + (1 − α) ∫_t^∞ µe^(−µs) ds
                   = αe^(−λt) + (1 − α)e^(−µt),
    so
        hX (t) = [αλe^(−λt) + (1 − α)µe^(−µt)] / [αe^(−λt) + (1 − α)e^(−µt)].
(b)
        lim_{t→∞} hX (t) = lim_{t→∞} [αλe^(−λt) + (1 − α)µe^(−µt)] / [αe^(−λt) + (1 − α)e^(−µt)]
                         = lim_{t→∞} [αλ + (1 − α)µe^((λ−µ)t)] / [α + (1 − α)e^((λ−µ)t)]
                         = λ,
    because for λ < µ,
        lim_{t→∞} e^((λ−µ)t) = 0.
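The limiting behaviour of the hazard rate is easy to see numerically (not part of the original solution); the parameters α = 0.4, λ = 1.0, µ = 3.0 below are illustrative choices, not from the exam:

```python
import math

# Numerical check that the mixture hazard rate tends to the smaller rate λ.
# Illustrative parameters (not from the exam): α = 0.4, λ = 1.0, µ = 3.0.
alpha, lam, mu = 0.4, 1.0, 3.0

def hazard(t):
    # hX(t) = fX(t) / (1 - FX(t)) for the Exp(λ)/Exp(µ) mixture.
    num = alpha * lam * math.exp(-lam * t) + (1 - alpha) * mu * math.exp(-mu * t)
    den = alpha * math.exp(-lam * t) + (1 - alpha) * math.exp(-mu * t)
    return num / den

for t in (0.0, 1.0, 5.0, 20.0):
    print(t, hazard(t))  # decreases toward λ = 1.0 as t grows
```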
Q. 4) Given a sequence (X1 , . . . , Xn ) of n independent, identically distributed
random variables, we say a record occurred at time i if Xi ≥ Xj for all 1 ≤ j ≤
i − 1. That is, there is a record at time i if, at that time, Xi is the
largest of the first i elements of the sequence.
(a) For 1 ≤ i ≤ n, compute pi = P (Xi is a record)
(b) Let Rn be the total number of records of the sequence (X1 , . . . , Xn ).
Compute E(Rn ).
Solution:
(a)
P (Xi is a record) = P (Xi ≥ X1 , Xi ≥ X2 , . . . , Xi ≥ Xi−1 ).
This just says that if we order the X’s from 1 to i then Xi is the last
one. As there are i! such orderings and (i − 1)! orderings with Xi the
last entry,
        P (Xi is a record) = 1/i.
(b) Let
        Yi = 1 if Xi is a record, 0 otherwise.
    Then,
        Rn = Σ_{i=1}^n Yi
    and
        E(Rn ) = Σ_{i=1}^n E(Yi ) = Σ_{i=1}^n P (Xi is a record) = Σ_{i=1}^n 1/i.
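As an illustration (not part of the original solution), the identity E(Rn) = Σ 1/i, the n-th harmonic number, can be checked by simulating the record-counting process; n = 20 below is an illustrative choice:

```python
import random

# Monte Carlo check that E(Rn) equals the harmonic number H_n = sum of 1/i.
# Illustrative choice (not from the exam): n = 20; continuous Xi, so ties
# occur with probability 0.
random.seed(2)
n, trials = 20, 100_000

def records(xs):
    # Count times i at which Xi >= Xj for all j < i (a "record").
    best = float("-inf")
    count = 0
    for x in xs:
        if x >= best:
            count += 1
            best = x
    return count

avg = sum(records([random.random() for _ in range(n)]) for _ in range(trials)) / trials
harmonic = sum(1 / i for i in range(1, n + 1))
print(round(avg, 3), round(harmonic, 3))  # both close to H_20 = 3.598
```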
Q. 5) A model for the movement of a stock supposes that if the present price
of the stock is s, then after one time period it will be either u × s with
probability p or d × s with probability 1 − p. Assuming that successive
movements are independent, approximate the probability that the stock’s
price will be up at least 30 percent after the next 1000 time periods if
u = 1.012, d = 0.990 and p = 0.51.
(Hint: log(ab) = log(a) + log(b)).
Solution:
Define the random variables
        Xi = 1 if the stock rises at the i-th time period, 0 otherwise.
Then, the Xi ’s are independent with
        P (Xi = 1) = p
and the price of the stock Sn at the n-th time period is
        Sn = S0 u^(Σ_{i=1}^n Xi) d^(n − Σ_{i=1}^n Xi) = S0 d^n (u/d)^(Σ_{i=1}^n Xi).
The event that the stock has risen 30 percent over the next 1000 time
periods is
        {S1000 /S0 ≥ 1.3} = {d^1000 (u/d)^(Σ_{i=1}^1000 Xi) ≥ 1.3}
                          = {log(u/d) Σ_{i=1}^1000 Xi + 1000 log d ≥ log(1.3)}
                          = {Σ_{i=1}^1000 Xi ≥ [−1000 log(d) + log(1.3)] / log(u/d)}
                          = {Σ_{i=1}^1000 (Xi − p) / √(1000 p(1 − p))
                               ≥ [−1000 log(d) − 1000p log(u/d) + log(1.3)] / [log(u/d) √(1000 p(1 − p))]}.
But Σ_{i=1}^1000 Xi ∼ Binom(1000, 0.51), so, by the Central Limit Theorem, or
the Normal approximation to the Binomial,
        P (S1000 /S0 ≥ 1.3) ≈ 1 − Φ([−1000 log(d) − 1000p log(u/d) + log(1.3)] / [log(u/d) √(1000 p(1 − p))])
                            = 1 − Φ(−2.58)
                            = 0.995.
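The numerical value −2.58 inside Φ can be reproduced directly from the exam's data (a small check, not part of the original solution):

```python
import math

# Direct computation of the normal-approximation bound from the exam's data:
# u = 1.012, d = 0.990, p = 0.51, n = 1000 periods, target rise of 30%.
u, d, p, n, target = 1.012, 0.990, 0.51, 1000, 1.3

z = (-n * math.log(d) - n * p * math.log(u / d) + math.log(target)) / (
    math.log(u / d) * math.sqrt(n * p * (1 - p))
)
# Standard Normal CDF via the error function: Phi(z) = (1 + erf(z/sqrt(2))) / 2.
prob = 1 - 0.5 * (1 + math.erf(z / math.sqrt(2)))  # 1 - Phi(z)
print(round(z, 2), round(prob, 3))  # -2.58 and 0.995
```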
Q. 6) Let (X1 , . . . , Xn ) be n independent random variables all with distribution
Exp(1). Let
        m̂n (X1 , . . . , Xn ) = min_{1≤i≤n} Xi .
(a) What is the distribution of m̂n ? (i.e. find the density or distribution
    function of m̂n and identify it, if possible)
(b) Compute E(m̂n ).
(c) Compute Var(m̂n ).
Solution:
(a)
        P (m̂n (X1 , . . . , Xn ) > x) = P (∩_{i=1}^n {Xi > x})
                                      = Π_{i=1}^n P (Xi > x)
                                      = (e^(−x))^n
                                      = e^(−nx).
    Therefore,
        m̂n ∼ Exp(n).
(b) As m̂n ∼ Exp(n),
        E(m̂n ) = 1/n.
(c) As m̂n ∼ Exp(n),
        Var(m̂n ) = 1/n^2.
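As a final illustration (not part of the original solution), the conclusion that the minimum of n i.i.d. Exp(1) variables is Exp(n), with mean 1/n and variance 1/n², can be checked by simulation; n = 5 below is an illustrative choice:

```python
import random

# Simulation check that min of n i.i.d. Exp(1) variables is Exp(n),
# so E(min) = 1/n and Var(min) = 1/n^2. Illustrative n = 5 (not from the exam).
random.seed(3)
n, trials = 5, 300_000
mins = [min(random.expovariate(1.0) for _ in range(n)) for _ in range(trials)]

mean = sum(mins) / trials
var = sum((m - mean) ** 2 for m in mins) / trials
print(round(mean, 3), round(var, 4))  # close to 1/5 = 0.2 and 1/25 = 0.04
```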