MSFI HEC Paris, T. Tomala
EXERCISES OF PROBABILITY
1 Exercises
Exercise 1 Let X be a rv with exponential distribution E(λ). Compute G(x) := P(X > x) and
P(X > x + y | X > y). Show that G(x + y) = G(x)G(y).
Let X be a continuous rv on R+ and assume that its distribution satisfies G(x+y) = G(x)G(y).
Show that it is an exponential distribution.
Exercise 2 Let X, Y be independent rvs distributed respectively E(λ), E(µ) and let Z = min{X, Y }.
Compute P(Z > z) and deduce the distribution of Z.
Exercise 3 Let X be a rv with geometric distribution G(p). Compute G(x) := P(X > x) and
P(X > x + y | X > y). Show that G(x + y) = G(x)G(y).
Let X be an integer valued rv and assume that its distribution satisfies G(x + y) = G(x)G(y).
Show that it is a geometric distribution.
Exercise 4 Let X be an integer-valued rv. Show that
E(X) = Σ_{n≥0} P(X > n).
Use this formula to compute the expectation of a Geometric distribution.
Exercise 5 Let X be a continuous non-negative rv with density f. Show that
E(X) = ∫_0^{+∞} P(X > t) dt.
Use this formula to compute the expectation of an Exponential distribution.
Exercise 6 Let X ∼ N(0, 1) be normally distributed. Let W be independent of X, taking the values −1 and +1 with equal probability. Let Y = WX.
Compute the c.d.f. of Y and show that Y is normally distributed. Compute E(XY | W = +1) and E(XY | W = −1). Deduce that Cov(X, Y) = 0. Are X, Y independent?
Moment generating function. The MGF of X is the function t ↦ E(e^{tX}).
Exercise 7 Compute the MGF of the Bernoulli distribution B(1, p), and deduce the MGF of the Binomial distribution B(n, p).
Exercise 8 Let X be exponentially distributed with parameter 1, X ∼ E(1). Compute the MGF of X. Show that E(X^n) = n! for each n.
Let Y be exponentially distributed with parameter λ > 0, Y ∼ E(λ). Find the probability distribution (c.d.f.) of λY. Compute E(Y^n) for each n.
Exercise 9 A random variable Y has a log-Normal distribution ℓN(m, σ^2) if Y = e^X with X ∼ N(m, σ^2). That is, log Y ∼ N(m, σ^2).
Compute E(Y^n) for each n (use the MGF of X).
Prove that the MGF of Y does not exist!
Exercise 10 The Laplace distribution L(b, µ) has the following density
f (x) = (1/2b) exp(−|x − µ|/b), x ∈ R.
Verify that this defines a probability density function. Show that the moment generating function is
ϕ(α) = e^{αµ}/(1 − b^2 α^2). Compute the mean and the variance.
From Past Exams
Exercise 11 Let X ∼ U[0,1] be uniformly distributed over [0, 1].
1. Write down the probability density function of X and the cumulative distribution function
(cdf) of X. Compute E(X), V(X) and E(e^{tX}).
2. Let X_1, . . . , X_n, . . . be an iid sequence with distribution U[0,1]. Define Z_n = max_{i=1,...,n} X_i.
What is the cdf of Z_n? Compute E(Z_n).
3. Show that P(|Z_n − 1| > ε) → 0 as n → ∞, for any ε > 0.
Exercise 12 Let Y be a positive random variable such that ln Y ∼ N(0, 1). Compute E(Y^{10}).
2 Solutions
Exercise 1. The cdf is F(x) = P(X ≤ x) = ∫_0^x λe^{−λt} dt = 1 − e^{−λx}. Thus, G(x) = e^{−λx}. Then,
P(X > x + y | X > y) = G(x + y)/G(y) = G(x).
If X is such that G(x + y) = G(x)G(y), then taking the derivative with respect to x gives G′(x + y) = G′(x)G(y). For x = 0, G′(y) = G′(0)G(y). It follows that G(y) = ae^{by} with b = G′(0). Since G(0) = P(X > 0) = 1, a = 1. Thus, this is an exponential distribution with parameter λ = −G′(0).
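As an optional numerical check (not part of the original solution), the short Python sketch below estimates G(x), G(y) and G(x + y) by simulation and compares G(x + y) with G(x)G(y); the values of λ, x and y are arbitrary illustrative choices.

import numpy as np

# Illustrative parameters (arbitrary choices, not from the exercise).
lam, x, y = 1.5, 0.7, 1.2
rng = np.random.default_rng(0)
X = rng.exponential(scale=1 / lam, size=1_000_000)

G = lambda t: np.mean(X > t)          # empirical G(t) = P(X > t)
print(G(x + y), G(x) * G(y))          # both close to exp(-lam*(x+y))
print(np.exp(-lam * (x + y)))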
Exercise 2. P(Z > z) = P(X > z and Y > z) = P(X > z)P(Y > z) = e^{−λz} e^{−µz} = e^{−(λ+µ)z}.
Thus Z is an exponential with parameter λ + µ.
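A similar simulation sketch (illustrative parameters only) compares the empirical tail of Z = min{X, Y} with that of E(λ + µ).

import numpy as np

# Min of independent exponentials: compare the empirical tail of Z = min(X, Y)
# with that of an E(lam + mu) variable (parameter values are arbitrary).
lam, mu, z = 2.0, 3.0, 0.4
rng = np.random.default_rng(1)
X = rng.exponential(1 / lam, 1_000_000)
Y = rng.exponential(1 / mu, 1_000_000)
Z = np.minimum(X, Y)
print(np.mean(Z > z), np.exp(-(lam + mu) * z))   # should agree closely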
Exercise 3. P(X > x) = Σ_{k>x} p(1 − p)^{k−1} = (1 − p)^x Σ_{i≥0} p(1 − p)^i = (1 − p)^x. Then,
P(X > x + y | X > y) = G(x + y)/G(y) = G(x).
If G(x + y) = G(x)G(y) for all integers x, y, then for every integer n, G(n) = G(1 + · · · + 1) = G(1)^n. Thus G(n) = (1 − p)^n with p = 1 − G(1), so this is a geometric distribution with parameter p = 1 − G(1).
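The memoryless property of the geometric distribution can also be checked by simulation; the sketch below assumes the convention P(X = k) = p(1 − p)^{k−1} on {1, 2, . . .}, which matches numpy's geometric sampler, and uses arbitrary values of p, x, y.

import numpy as np

# Geometric tails: with P(X = k) = p(1-p)^(k-1) on {1, 2, ...},
# P(X > x) should equal (1-p)^x, so G(x+y) = G(x)G(y).
p, x, y = 0.3, 2, 4
rng = np.random.default_rng(2)
X = rng.geometric(p, 1_000_000)
G = lambda t: np.mean(X > t)
print(G(x + y), G(x) * G(y), (1 - p) ** (x + y))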
Exercise 4.
P (X > 0) = P (X = 1) + P (X = 2) + P (X = 3) + · · ·
P (X > 1) = P (X = 2) + P (X = 3) + · · ·
P (X > 2) = P (X = 3) + · · ·
Summing these equalities, P(X = k) appears exactly k times, so Σ_{n≥0} P(X > n) = Σ_{k≥1} k P(X = k) = E(X).
For the Geometric distribution, P(X > n) = (1 − p)^n. Thus, E(X) = Σ_{n≥0} (1 − p)^n = 1/p.
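A quick simulation check of the tail-sum formula on a Geometric(p) sample (p chosen arbitrarily):

import numpy as np

# Tail-sum formula E(X) = sum_{n>=0} P(X > n), checked on a Geometric(p) sample.
p = 0.25
rng = np.random.default_rng(3)
X = rng.geometric(p, 1_000_000)
tail_sum = sum(np.mean(X > n) for n in range(X.max()))
print(tail_sum, X.mean(), 1 / p)   # all three close to 4.0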
Exercise 5. The right-hand side is ∫_0^{+∞} (∫_t^{+∞} f(x) dx) dt. Switching the order of integration (Fubini, over the region 0 ≤ t ≤ x), this is
∫_0^{+∞} (∫_0^x f(x) dt) dx = ∫_0^{+∞} f(x) (∫_0^x dt) dx = ∫_0^{+∞} x f(x) dx
as desired.
For the Exponential distribution, P(X > t) = exp(−λt) and ∫_0^{+∞} exp(−λt) dt = [(−1/λ) exp(−λt)]_0^{+∞} = 1/λ.
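As an optional check, the tail-integral formula for the exponential distribution can be evaluated numerically (λ is an arbitrary choice):

import numpy as np
from scipy.integrate import quad

# E(X) = integral of P(X > t) dt; for X ~ E(lam) this is the integral of exp(-lam*t) over [0, inf).
lam = 2.0
value, _ = quad(lambda t: np.exp(-lam * t), 0, np.inf)
print(value, 1 / lam)   # both 0.5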
Exercise 6. P(Y ≤ y) = (1/2) P(Y ≤ y | W = 1) + (1/2) P(Y ≤ y | W = −1) = (1/2) P(X ≤ y) + (1/2) P(−X ≤ y). When X ∼ N(0, 1), then −X ∼ N(0, 1). Thus P(X ≤ y) = P(−X ≤ y) = F(y) where F is the cdf of N(0, 1). Thus, Y ∼ N(0, 1).
|X| = |Y|, so they are not independent. Yet, E(XY | W = +1) = E(X^2) = 1 and E(XY | W = −1) = E(−X^2) = −1, thus E(XY) = (1/2) E(XY | W = +1) + (1/2) E(XY | W = −1) = 0. Finally,
Cov(X, Y) = E(XY) − E(X)E(Y) = 0.
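An illustrative simulation (not part of the original solution) of Y = WX confirms that Y looks standard normal, that the sample covariance is near 0, and that |X| = |Y|:

import numpy as np

# Y = W*X with X ~ N(0,1) and W = +/-1: Y is standard normal, Cov(X, Y) ~ 0,
# yet |X| = |Y| shows the pair is not independent.
rng = np.random.default_rng(4)
n = 1_000_000
X = rng.standard_normal(n)
W = rng.choice([-1.0, 1.0], size=n)
Y = W * X
print(Y.mean(), Y.var())                   # close to 0 and 1
print(np.cov(X, Y)[0, 1])                  # close to 0
print(np.allclose(np.abs(X), np.abs(Y)))   # True: |X| = |Y|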
Exercise 7. Let X ∼ B(1, p). Then E(e^{tX}) = (1 − p) + pe^t.
Let X ∼ B(n, p). Then X = X_1 + · · · + X_n with X_i iid B(1, p). Then, ϕ_X(t) = Π_i ϕ_{X_i}(t) = ((1 − p) + pe^t)^n.
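A short numerical check of the binomial MGF against the closed form (n, p, t chosen arbitrarily):

import numpy as np

# Empirical MGF of B(n, p) versus the closed form ((1-p) + p*e^t)^n.
n, p, t = 10, 0.3, 0.5
rng = np.random.default_rng(5)
X = rng.binomial(n, p, 1_000_000)
print(np.mean(np.exp(t * X)), ((1 - p) + p * np.exp(t)) ** n)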
Exercise 8. Let X ∼ E(1). Then, for t < 1, E(e^{tX}) = ∫_0^{∞} e^{tx} e^{−x} dx = ∫_0^{∞} e^{(t−1)x} dx = 1/(1 − t).
Since for −1 < t < 1, 1/(1 − t) = Σ_k t^k and ϕ_X(t) = Σ_k (t^k/k!) E(X^k), we get E(X^k) = k! for each k.
Let Y ∼ E(λ). P(λY ≤ x) = P(Y ≤ x/λ) = 1 − exp(−λ(x/λ)) = 1 − e^{−x}. Thus,
λY ∼ E(1). Then, E((λY)^k) = k! and thus E(Y^k) = k!/λ^k.
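The moment formula E(Y^k) = k!/λ^k can be checked by simulation (λ is an arbitrary choice):

import numpy as np
from math import factorial

# Moments of Y ~ E(lam): E(Y^k) should be k!/lam^k.
lam = 2.0
rng = np.random.default_rng(6)
Y = rng.exponential(1 / lam, 2_000_000)
for k in range(1, 5):
    print(k, np.mean(Y ** k), factorial(k) / lam ** k)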
Exercise 9. If Y = e^X, then E(Y^n) = E(e^{nX}) = exp(nm + n^2 σ^2/2) from the MGF of the Normal.
E(e^{tY}) = E(exp(te^X)) = (1/√(2πσ^2)) ∫_{−∞}^{+∞} exp(te^x − (x − m)^2/(2σ^2)) dx.
Since for t > 0, lim_{x→+∞} exp(te^x − (x − m)^2/(2σ^2)) = +∞, this integral is not defined.
This shows that the MGF may not be defined even when all moments E(Y^n) are.
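A simulation sketch illustrating this contrast (parameters are arbitrary): the sample moments of a log-normal match exp(nm + n^2σ^2/2), while sample averages of exp(tY) keep growing with the sample size, reflecting the divergent MGF.

import numpy as np

# Log-normal Y = exp(X), X ~ N(m, sigma^2): every moment E(Y^n) = exp(n*m + n^2*sigma^2/2)
# is finite, but E(exp(t*Y)) is infinite for any t > 0.
m, sigma = 0.0, 1.0
rng = np.random.default_rng(7)
Y = np.exp(m + sigma * rng.standard_normal(1_000_000))
for n in (1, 2, 3):
    print(n, np.mean(Y ** n), np.exp(n * m + n ** 2 * sigma ** 2 / 2))
# Attempted MGF at t = 1: dominated by the largest samples; the average keeps
# growing as the sample size increases instead of stabilising.
print(np.mean(np.exp(Y[:10_000])), np.mean(np.exp(Y)))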
Exercise 10. Let us compute ∫_{−∞}^{+∞} f(x) dx. By symmetry around µ, this is 2∫_µ^{+∞} f(x) dx and
2∫_µ^{+∞} f(x) dx = (1/b) ∫_µ^{+∞} exp(−(x − µ)/b) dx = (1/b) ∫_0^{+∞} exp(−x/b) dx = (1/b)[−b exp(−x/b)]_0^{+∞} = (1/b)[0 + b] = 1.
Let us now compute the MGF:
ϕ(α) = (1/2b) ∫_{−∞}^{+∞} exp(αx − |x − µ|/b) dx
 = (1/2b) ∫_{−∞}^{µ} exp(αx + (x − µ)/b) dx + (1/2b) ∫_µ^{+∞} exp(αx − (x − µ)/b) dx
 = (1/2b) exp(−µ/b) ∫_{−∞}^{µ} exp((α + 1/b)x) dx + (1/2b) exp(µ/b) ∫_µ^{+∞} exp((α − 1/b)x) dx.
For both integrals to be defined, one must assume |α| < 1/b. Under this condition, we get
ϕ(α) = (1/2b) exp(−µ/b) exp((α + 1/b)µ)/(α + 1/b) − (1/2b) exp(µ/b) exp((α − 1/b)µ)/(α − 1/b)
 = (1/2b) e^{αµ} [1/(α + 1/b) + 1/(1/b − α)],
which gives the result ϕ(α) = e^{αµ}/(1 − b^2 α^2) after reordering. Differentiating, ϕ′(0) = µ and ϕ″(0) = µ^2 + 2b^2, so the mean is µ and the variance is 2b^2.
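As an optional check (parameters chosen arbitrarily, with |α| < 1/b), the empirical MGF, mean and variance of a Laplace sample can be compared with e^{αµ}/(1 − b^2α^2), µ and 2b^2:

import numpy as np

# Laplace L(b, mu): empirical MGF versus exp(alpha*mu) / (1 - b^2*alpha^2),
# plus the mean mu and variance 2*b^2 (parameter values are arbitrary).
b, mu, alpha = 0.5, 1.0, 0.8          # need |alpha| < 1/b
rng = np.random.default_rng(8)
Y = rng.laplace(loc=mu, scale=b, size=2_000_000)
print(np.mean(np.exp(alpha * Y)), np.exp(alpha * mu) / (1 - b ** 2 * alpha ** 2))
print(Y.mean(), mu, Y.var(), 2 * b ** 2)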