Lecture 20: Sums of Independent Random Variables
1.) Convolutions
Definition: If f, g : R → R are two integrable real-valued functions, then the convolution of
f and g is the real-valued function f ∗ g : R → R defined as
(f ∗ g)(z) = ∫_{−∞}^{∞} f (x) g(z − x) dx
           = ∫_{−∞}^{∞} f (z − x) g(x) dx = (g ∗ f )(z).
The identity between the first and second line follows from a simple change of variables and
shows that convolution is a commutative operation: f ∗ g = g ∗ f .
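A short numerical sketch can illustrate the definition: approximating the integral by a Riemann sum (the integration window [−10, 10], the grid size, and the choice of Gaussian test functions are arbitrary choices here) shows that f ∗ g and g ∗ f agree pointwise.

```python
import math

def convolve(f, g, z, lo=-10.0, hi=10.0, n=2000):
    """Approximate (f * g)(z) = ∫ f(x) g(z - x) dx by a Riemann sum.

    Assumes f and g are negligible outside [lo, hi]; the window and
    grid size are arbitrary choices for this illustration."""
    dx = (hi - lo) / n
    return sum(f(lo + i * dx) * g(z - (lo + i * dx)) for i in range(n)) * dx

# Two centered Gaussian densities with variances 1 and 4; their
# convolution should be the N(0, 5) density.
f = lambda x: math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
g = lambda x: math.exp(-x * x / 8) / math.sqrt(8 * math.pi)

# Commutativity: f * g and g * f agree at each point checked.
for z in (-1.0, 0.0, 2.5):
    assert abs(convolve(f, g, z) - convolve(g, f, z)) < 1e-6
```

At z = 0 the approximation is close to 1/√(10π), the N(0, 5) density at 0, previewing the result on sums of normal RVs later in these notes.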
Application: In probability theory, convolutions arise when we consider the distribution of
sums of independent random variables. To see this, suppose that X and Y are independent,
continuous random variables with densities pX and pY . Then X + Y is a continuous random
variable with cumulative distribution function
FX+Y (z) = P{X + Y ≤ z}
         = ∫∫_{x+y≤z} pX (x) pY (y) dx dy
         = ∫_{−∞}^{∞} ∫_{−∞}^{z−y} pX (x) pY (y) dx dy
         = ∫_{−∞}^{∞} FX (z − y) pY (y) dy
         = ∫_{−∞}^{∞} ∫_{−∞}^{z−x} pY (y) pX (x) dy dx
         = ∫_{−∞}^{∞} FY (z − x) pX (x) dx,
where the expression in the fourth line is the convolution FX ∗ pY and the expression in the sixth
line is the convolution FY ∗ pX . (Note: On p. 252, Ross refers to this as the convolution of FX
and FY . This is wrong!) The density of X + Y can then be found by differentiating the CDF,
giving
pX+Y (z) = (d/dz) FX+Y (z)
         = ∫_{−∞}^{∞} pX (z − y) pY (y) dy
         = ∫_{−∞}^{∞} pY (z − x) pX (x) dx,
which is equal to the convolution of the density functions of X and Y :
pX+Y (z) = (pX ∗ pY )(z) = (pY ∗ pX )(z).
2.) Sums of uniform RVs
Example (Ross, 3a): If X, Y are independent U (0, 1)-distributed random variables, then the
density of X + Y is
pX+Y (z) =  z       if 0 ≤ z ≤ 1
            2 − z   if 1 < z ≤ 2
            0       otherwise.
This is known as the triangular distribution.
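A quick Monte Carlo sketch can check the triangular shape (the sample size, seed, and bandwidth below are arbitrary choices): the fraction of samples of X + Y landing in a small window around z, divided by the window width, approximates pX+Y (z).

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def triangular_density(z):
    """Density of X + Y for independent U(0, 1) variables (Ross, Example 3a)."""
    if 0 <= z <= 1:
        return z
    if 1 < z <= 2:
        return 2 - z
    return 0.0

# The density should integrate to 1 (left Riemann sum over [0, 2]).
dx = 0.001
total = sum(triangular_density(i * dx) for i in range(2000)) * dx

# Monte Carlo estimate of the density near the peak z = 1.
N, z, h = 200_000, 1.0, 0.05
samples = [random.random() + random.random() for _ in range(N)]
estimate = sum(z - h < s < z + h for s in samples) / (N * 2 * h)
```

The window average slightly undershoots the peak value pX+Y (1) = 1, since the density is below 1 on either side of z = 1.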
3.) Sums of gamma RVs
Proposition 3.1 If X and Y are independent gamma-distributed RVs with parameters (s, λ)
and (t, λ), then X + Y is also a gamma-distributed random variable with parameters (s + t, λ).
Proof: The result follows from the calculation
pX+Y (z) = (pX ∗ pY )(z)
         = (1/(Γ(s)Γ(t))) ∫_0^z λe^{−λ(z−x)} (λ(z − x))^{s−1} λe^{−λx} (λx)^{t−1} dx
         = C e^{−λz} z^{s+t−1},
where C is a constant. However, since pX+Y is a density, we know that it integrates to 1 and so
we can calculate
C = ( ∫_0^∞ e^{−λz} z^{s+t−1} dz )^{−1} = λ^{s+t}/Γ(s + t).
This shows that
pX+Y (z) = λe^{−λz} (λz)^{s+t−1} / Γ(s + t),
as claimed.
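The convolution step in the proof is easy to check numerically; the parameter values, grid size, and use of the midpoint rule below are arbitrary choices for this sketch.

```python
import math

def gamma_pdf(z, t, lam):
    """Gamma(t, λ) density: λ e^{-λz} (λz)^{t-1} / Γ(t)."""
    return lam * math.exp(-lam * z) * (lam * z) ** (t - 1) / math.gamma(t)

# Convolve Gamma(s, λ) with Gamma(t, λ) at a point z by the midpoint rule
# and compare with the Gamma(s + t, λ) density at the same point.
s, t, lam, z = 2.0, 3.0, 1.5, 2.0
n = 100_000
dx = z / n
conv = sum(gamma_pdf(x, s, lam) * gamma_pdf(z - x, t, lam)
           for x in (dx * (i + 0.5) for i in range(n))) * dx

assert abs(conv - gamma_pdf(z, s + t, lam)) < 1e-6
```

The midpoint rule avoids evaluating the integrand at the endpoints x = 0 and x = z, where the factors (λx)^{t−1} and (λ(z − x))^{s−1} vanish for s, t > 1.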
General result: By induction, it follows that if X1 , · · · , Xn are independent gamma-distributed
random variables with parameters (ti , λ), then the sum X = X1 + · · · + Xn is a gamma-distributed
RV with parameters (∑_{i=1}^{n} ti , λ).
Example (Ross, 3b): Because the exponential distribution with parameter λ is the same as
the gamma distribution with parameters (1, λ), it follows that if X1 , · · · , Xn are independent
exponential RVs all with parameter λ, then the sum X = X1 + · · · + Xn is a gamma RV with
parameters (n, λ).
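Ross's example can be tested by simulation; the values n = 5, λ = 2, the sample size, and the seed below are arbitrary choices. A Gamma(n, λ) variable has mean n/λ and variance n/λ², so the empirical moments of the simulated sums should match those values.

```python
import random
from statistics import fmean, variance

random.seed(1)  # fixed seed for reproducibility

# Sum of n independent Exponential(λ) RVs: should be Gamma(n, λ).
n, lam, N = 5, 2.0, 100_000
sums = [sum(random.expovariate(lam) for _ in range(n)) for _ in range(N)]

# Compare empirical mean and variance with n/λ = 2.5 and n/λ² = 1.25.
assert abs(fmean(sums) - n / lam) < 0.02
assert abs(variance(sums) - n / lam ** 2) < 0.05
```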
4.) Sums of normal RVs
Proposition 3.2: If X1 , · · · , Xn are independent normal RVs with parameters (µi , σi²),
i = 1, · · · , n, then their sum X = X1 + · · · + Xn is a normal RV with parameters ∑_{i=1}^{n} µi
and ∑_{i=1}^{n} σi².
Proof: It suffices to consider the case n = 2, since the full result then follows by induction on
n. Also, because Xi − µi is normally distributed with parameters (0, σi²), we may assume that
µ1 = µ2 = 0.
Then, using the convolution formula, we see that the density of X + Y is
pX+Y (z) = ∫_{−∞}^{∞} pX (z − x) pY (x) dx
         = ∫_{−∞}^{∞} (1/(√(2π)σ1)) exp(−(z − x)²/(2σ1²)) · (1/(√(2π)σ2)) exp(−x²/(2σ2²)) dx
         = (1/(2πσ1σ2)) exp(−z²/(2σ1²)) ∫_{−∞}^{∞} exp(−(1/(2σ1²) + 1/(2σ2²)) x² + (z/σ1²) x) dx
         = (1/(2πσ1σ2)) exp(−z²/(2σ1²)) exp((z²/(4σ1⁴)) (1/(2σ1²) + 1/(2σ2²))^{−1})
               × ∫_{−∞}^{∞} exp(−(1/(2σ1²) + 1/(2σ2²)) (x − (z/(2σ1²)) (1/(2σ1²) + 1/(2σ2²))^{−1})²) dx
         = (1/(2πσ1σ2)) (2π)^{1/2} (1/σ1² + 1/σ2²)^{−1/2} exp(−z²/(2(σ1² + σ2²)))
         = (1/√(2π(σ1² + σ2²))) exp(−z²/(2(σ1² + σ2²))),
where the fourth line follows by completing the square in x and the fifth by evaluating the
Gaussian integral ∫_{−∞}^{∞} e^{−a(x−c)²} dx = √(π/a). This shows that X1 + X2 ∼ N (0, σ1² + σ2²).
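Proposition 3.2 with n = 2 can likewise be checked by simulation; the parameter values, sample size, and seed below are arbitrary choices for this sketch.

```python
import random
from statistics import fmean, variance

random.seed(2)  # fixed seed for reproducibility

# X1 ~ N(1, 4) and X2 ~ N(-2, 9); by Proposition 3.2 the sum should be
# N(1 + (-2), 4 + 9) = N(-1, 13).  (random.gauss takes the std. deviation.)
N = 100_000
sums = [random.gauss(1, 2) + random.gauss(-2, 3) for _ in range(N)]

assert abs(fmean(sums) - (-1)) < 0.05
assert abs(variance(sums) - 13) < 0.3
```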
5.) Discrete Convolutions
If X and Y are independent integer-valued random variables with probability mass functions pX
and pY , then X + Y is also an integer-valued random variable with probability mass function
pX+Y (n) = P{X + Y = n}
         = ∑_k P{X = k, Y = n − k}
         = ∑_k P{X = k} P{Y = n − k}
         = ∑_k pX (k) pY (n − k).
The expression in the last line of this series of equations can be seen as a discrete convolution.
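The discrete convolution is straightforward to compute directly. In the sketch below (representing a pmf as a dict is just a convenient choice here), summing two fair dice gives the familiar pmf supported on 2 through 12 and peaking at 7.

```python
def pmf_convolve(pX, pY):
    """pX+Y(n) = Σ_k pX(k) pY(n - k), for pmfs given as {value: prob} dicts."""
    out = {}
    for k, pk in pX.items():
        for m, pm in pY.items():
            out[k + m] = out.get(k + m, 0.0) + pk * pm
    return out

# Two independent fair dice: the sum's pmf is supported on 2..12,
# peaking at P{X + Y = 7} = 6/36 = 1/6.
die = {k: 1 / 6 for k in range(1, 7)}
total = pmf_convolve(die, die)
```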
Example (Ross, 3e): If X and Y are independent Poisson RVs with parameters λ1 and λ2 ,
then X + Y is a Poisson RV with parameter λ1 + λ2 .
Proof: Using the discrete convolution formula (and noting that X and Y are both non-negative),
the probability mass function of X + Y is
pX+Y (n) = ∑_{k=0}^{n} pX (k) pY (n − k)
         = ∑_{k=0}^{n} e^{−λ1} (λ1^k / k!) · e^{−λ2} (λ2^{n−k} / (n − k)!)
         = e^{−(λ1+λ2)} (1/n!) ∑_{k=0}^{n} (n choose k) λ1^k λ2^{n−k}
         = e^{−(λ1+λ2)} (λ1 + λ2)^n / n!   (by the binomial theorem),
which is also the probability mass function for the Poisson distribution with parameter λ1 + λ2 .
General result: Using induction, we can show that if X1 , · · · , Xn are independent Poisson
RVs with parameters λ1 , · · · , λn , respectively, then the sum X = X1 + · · · + Xn is a Poisson RV
with parameter λ1 + · · · + λn .
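The two-variable Poisson identity can be verified term by term from the discrete convolution formula; the values λ1 = 1.5 and λ2 = 2.5 and the range of n checked are arbitrary choices.

```python
import math

def poisson_pmf(n, lam):
    """Poisson(λ) pmf: e^{-λ} λ^n / n!."""
    return math.exp(-lam) * lam ** n / math.factorial(n)

lam1, lam2 = 1.5, 2.5
# pX+Y(n) = Σ_{k=0}^{n} pX(k) pY(n - k) should equal the Poisson(λ1 + λ2) pmf.
for n in range(10):
    conv = sum(poisson_pmf(k, lam1) * poisson_pmf(n - k, lam2)
               for k in range(n + 1))
    assert abs(conv - poisson_pmf(n, lam1 + lam2)) < 1e-12
```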