Chapter 2.3 – Moments and moment generating functions
• In this section we will define a summary of a distribution called the moment generating
function (MGF). This is useful for
1. Computing the mean, variance, skewness, etc.
2. Proving two random variables have the same distribution. For example, the central
limit theorem proof relies on MGFs.
• For n ∈ {1, 2, ...}, the nth moment of X is
  µ'_n = E(X^n).
• For n ∈ {2, 3, ...}, the nth central moment of X is
  µ_n = E[(X − µ)^n],
where µ = E(X) = µ'_1 is the mean of X.
ST521
Chapter 2.3
Page 1
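The raw and central moments above are straightforward to estimate by Monte Carlo. A minimal sketch (the function names `raw_moment` and `central_moment` are illustrative, not from the notes), using Exp(1), whose mean and variance are both 1:

```python
import random

random.seed(0)
xs = [random.expovariate(1.0) for _ in range(200_000)]  # Exp(1) sample

def raw_moment(xs, n):
    # mu'_n = E(X^n), approximated by the sample average of X^n
    return sum(x ** n for x in xs) / len(xs)

def central_moment(xs, n):
    # mu_n = E[(X - mu)^n], with mu estimated by the sample mean
    mu = raw_moment(xs, 1)
    return sum((x - mu) ** n for x in xs) / len(xs)
```

With 200,000 draws, `raw_moment(xs, 1)` and `central_moment(xs, 2)` should both land close to 1.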
• The first moment is the mean and measures the center of the distribution.
• The second central moment is the variance,
  µ_2 = Var(X) = V(X) = E[(X − µ)^2],
and measures spread.
• The variance can be written in terms of the first two moments:
  Var(X) = E(X^2) − [E(X)]^2 = µ'_2 − (µ'_1)^2.
• The standard deviation is the square root of the variance,
  SD(X) = √(E[(X − µ)^2]).
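The shortcut identity Var(X) = E(X^2) − [E(X)]^2 can be checked exactly on a small discrete distribution; here a fair die, with all names illustrative:

```python
xs = [1, 2, 3, 4, 5, 6]  # faces of a fair die, each with probability 1/6

ex  = sum(xs) / len(xs)                  # E(X)   = 3.5
ex2 = sum(x * x for x in xs) / len(xs)   # E(X^2) = 91/6

var_direct   = sum((x - ex) ** 2 for x in xs) / len(xs)  # E[(X - mu)^2]
var_shortcut = ex2 - ex ** 2                             # mu'_2 - (mu'_1)^2
```

Both routes give 35/12 ≈ 2.9167 for the die.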
• The skewness is
  µ_3 / (√µ_2)^3 = E[(X − µ)^3] / SD(X)^3.
This measures asymmetry.
• The kurtosis is
  µ_4 / µ_2^2 = E[(X − µ)^4] / V(X)^2.
This measures the peakedness/heaviness of the tails for symmetric distributions.
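Both ratios can be estimated from a sample. A sketch (names illustrative) using Uniform(0, 1), which is symmetric (skewness 0) and has kurtosis µ_4/µ_2^2 = (1/80)/(1/144) = 9/5:

```python
import random

random.seed(1)
xs = [random.random() for _ in range(200_000)]  # Uniform(0, 1) sample
mu = sum(xs) / len(xs)

def cm(n):
    # n-th sample central moment
    return sum((x - mu) ** n for x in xs) / len(xs)

skewness = cm(3) / cm(2) ** 1.5  # ~0 for any symmetric distribution
kurtosis = cm(4) / cm(2) ** 2    # ~9/5 for the uniform; 3 for the normal
```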
• Example: Find the mean and variance of X if it has PDF (for some σ > 0)
  f_X(x) = (1/(2σ)) exp(−|x|/σ).
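This is the double-exponential (Laplace) density; by symmetry its mean is 0, and its variance works out to 2σ². A Monte Carlo sketch of that answer (names illustrative), using the standard representation of a Laplace draw as the difference of two independent exponentials with mean σ:

```python
import random

random.seed(2)
sigma = 1.5
# Difference of two independent Exp(mean sigma) draws is Laplace(0, sigma)
xs = [random.expovariate(1 / sigma) - random.expovariate(1 / sigma)
      for _ in range(200_000)]

mean = sum(xs) / len(xs)                         # should be near 0
var  = sum((x - mean) ** 2 for x in xs) / len(xs)  # should be near 2*sigma**2
```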
• Example: Find the mean and variance of X if it has PMF (for some N ∈ {1, 2, ...})
  f_X(x) = 1/(2N + 1) for x ∈ {−N, −(N − 1), ..., N − 1, N}, and f_X(x) = 0 otherwise.
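For this discrete uniform, the moments can be computed exactly from the PMF: the mean is 0 by symmetry, and the variance is N(N + 1)/3. A sketch (names illustrative):

```python
N = 10
support = range(-N, N + 1)   # {-N, ..., N}
p = 1 / (2 * N + 1)          # each point has the same probability

mean = sum(x * p for x in support)                # 0 by symmetry
var  = sum((x - mean) ** 2 * p for x in support)  # equals N(N + 1)/3
```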
• Fact: For any constants a and b, Var(aX + b) = a^2 Var(X).
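The fact is easy to confirm numerically: the shift b drops out and the scale a enters squared. A sketch on a fair-die distribution (names illustrative):

```python
xs = [1, 2, 3, 4, 5, 6]   # fair die, equal probabilities
a, b = -3.0, 7.0
ys = [a * x + b for x in xs]

def var(vs):
    # variance of an equally weighted discrete distribution
    m = sum(vs) / len(vs)
    return sum((v - m) ** 2 for v in vs) / len(vs)

# var(ys) should equal a**2 * var(xs), regardless of b
```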
• The moment generating function (MGF) of X is
  M_X(t) = E[exp(tX)],
provided this expectation is finite for all t in a neighborhood of zero.
• Using the following theorem, we can obtain (generate) all the moments of X using the MGF.
• Theorem: E(X^n) = M_X^(n)(0) = (d^n/dt^n) M_X(t) |_{t=0}.
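The theorem can be illustrated with finite-difference derivatives at t = 0. A sketch (names illustrative) using the exponential MGF M(t) = 1/(1 − λt), whose derivatives at zero should recover E(X) = λ and E(X^2) = 2λ²:

```python
lam = 2.0

def M(t):
    # MGF of the exponential with mean lam: 1/(1 - lam*t), valid for t < 1/lam
    return 1.0 / (1.0 - lam * t)

h = 1e-5
m1 = (M(h) - M(-h)) / (2 * h)          # central difference ~ M'(0)  = E(X)
m2 = (M(h) - 2 * M(0) + M(-h)) / h**2  # second difference  ~ M''(0) = E(X^2)
# expect m1 near lam = 2 and m2 near 2*lam**2 = 8
```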
• Example: Find the MGF, mean, and variance of X if f_X(x) = (1/λ) exp(−x/λ) I(x > 0).
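For this exponential density the MGF is 1/(1 − λt) for t < 1/λ, which a Monte Carlo estimate of E[exp(tX)] should reproduce. A sketch (names illustrative):

```python
import math
import random

random.seed(3)
lam = 0.5  # the mean; expovariate takes the rate 1/lam
xs = [random.expovariate(1 / lam) for _ in range(200_000)]

t = 0.7  # must satisfy t < 1/lam for the MGF to be finite
mgf_mc = sum(math.exp(t * x) for x in xs) / len(xs)  # E[exp(tX)] by simulation
mgf_closed = 1 / (1 - lam * t)                       # closed form 1/(1 - lam*t)
```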
• If Y = a + bX is a linear transformation of X, then its MGF is
  M_Y(t) = exp(at) M_X(bt).
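A quick check of M_Y(t) = exp(at) M_X(bt) on a standard normal sample, where the closed-form target is M_X(t) = exp(t²/2); all names here are illustrative:

```python
import math
import random

random.seed(4)
xs = [random.gauss(0, 1) for _ in range(200_000)]  # X ~ N(0, 1)
a, b, t = 1.0, 0.5, 0.8
ys = [a + b * x for x in xs]                       # Y = a + bX

def mgf(vs, t):
    # Monte Carlo estimate of E[exp(tV)]
    return sum(math.exp(t * v) for v in vs) / len(vs)

lhs = mgf(ys, t)                       # M_Y(t) estimated directly
rhs = math.exp(a * t) * mgf(xs, b * t) # exp(at) * M_X(bt)
# both should also be near exp(a*t + (b*t)**2 / 2), the N(a, b^2) MGF
```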
• We’ve shown how to use MGFs to compute the mean, variance, etc. Now let’s explore using MGFs to show that two random variables have the same distribution.
• Clearly, if F_X(x) = F_Y(x) for all x, then E(X^k) = E(Y^k) for all k, so having the same CDF implies having the same moments.
• However, it is possible that E(X^k) = E(Y^k) for all k but F_X(x) ≠ F_Y(x) for some x, so having the same moments does not always imply having the same CDF.
• Example: Casella & Berger, pages 64-65.
• However, the MGF has more information than the moments!
• Theorem: If X and Y have bounded support (S_X = S_Y = (a, b) for finite a and b), then F_X(u) = F_Y(u) for all u if and only if E(X^k) = E(Y^k) for all k ∈ {1, 2, ...}.
• Theorem: If M_X and M_Y exist in a neighborhood of zero and M_X(t) = M_Y(t) for all t in a neighborhood of zero, then F_X(u) = F_Y(u) for all u.
• Therefore, to prove X and Y have the same distribution, you can either show they are bounded and have the same moments, or that they have the same MGF near zero.
• In Chapter 5, we’ll use the following result to prove the central limit theorem. It states that convergence of MGFs implies convergence in distribution.
• Say X_1, X_2, ... is a sequence of random variables with X_i having CDF F_i and MGF M_i, and
  lim_{i→∞} M_i(t) = M_X(t) for all t in a neighborhood of 0,
where M_X(t) is an MGF. Then there exists a CDF F_X with MGF M_X so that
  lim_{i→∞} F_i(x) = F_X(x) at every x where F_X is continuous.
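The CLT-flavored use of this result can be previewed numerically: the MGF of a standardized sum of n Uniform(0, 1) draws can be written in closed form via the uniform MGF (e^t − 1)/t, and it approaches the N(0, 1) MGF exp(t²/2) as n grows. A sketch (names illustrative; for very large n the power should be computed in log space to avoid overflow):

```python
import math

def mgf_unif(t):
    # MGF of Uniform(0, 1): (e^t - 1)/t, with the t = 0 limit equal to 1
    return (math.exp(t) - 1.0) / t if t != 0 else 1.0

def mgf_std_sum(n, t):
    # MGF of (X_1 + ... + X_n - n/2) / sqrt(n/12) for iid Uniform(0, 1) X_i
    s = math.sqrt(n / 12.0)  # SD of the sum of n uniforms
    return math.exp(-t * (n / 2.0) / s) * mgf_unif(t / s) ** n

t = 0.9
target = math.exp(t * t / 2.0)  # MGF of N(0, 1) at t
# mgf_std_sum(n, t) should move toward target as n increases
```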