Central Limit Theorem
The proof given here is for collections of i.i.d. (independent, identically distributed) random variables.¹ We assume that the distribution from which the sample is selected has a moment generating function. A more general proof uses the characteristic function of the distribution.² The characteristic function differs from the moment generating function in that $tX$ in the exponent of the formula for the m.g.f. is replaced by $itX$. While some distributions do not have m.g.f.'s, every distribution has a characteristic function. Note: The proof implicitly uses a basic principle from analysis. Which one is it? (Hint: Think about ancient Greek mathematicians.)
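
For reference (it is used in the last line of the proof), the m.g.f. of a standard normal random variable $N$ follows from a standard completion of the square in the exponent:
$$M_N(t) = E\!\left[e^{tN}\right] = \int_{-\infty}^{\infty} e^{tx}\,\frac{1}{\sqrt{2\pi}}\,e^{-x^2/2}\,dx = e^{t^2/2}\int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi}}\,e^{-(x-t)^2/2}\,dx = e^{t^2/2},$$
since the remaining integrand is the density of a normal distribution with mean $t$ and variance $1$.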
Theorem: Let $X_1, X_2, X_3, \ldots$ be a sequence of i.i.d. r.v.'s with mean $\mu$ and variance $\sigma^2 < \infty$. Suppose that the distribution from which the sample is selected has an m.g.f. $M_X(t)$, defined for all $t$ in a neighborhood of $0$. Suppose that $N$ is a random variable with a standard normal distribution. If $S_n = X_1 + X_2 + \cdots + X_n$, then
$$\frac{S_n - n\mu}{\sigma\sqrt{n}} \;\Rightarrow\; N,$$
where the symbol $\Rightarrow$ is used to denote convergence in distribution.
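
Spelled out in terms of distribution functions, with $\Phi$ denoting the standard normal c.d.f., the conclusion says that
$$\lim_{n\to\infty} P\!\left(\frac{S_n - n\mu}{\sigma\sqrt{n}} \le x\right) = \Phi(x) = \int_{-\infty}^{x}\frac{1}{\sqrt{2\pi}}\,e^{-u^2/2}\,du \qquad \text{for every real } x.$$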
Proof: Without loss of generality let $E[X_n] = 0$ and $E\!\left[X_n^2\right] = 1$ (otherwise, prove the result for $X_i^* = \dfrac{X_i - \mu}{\sigma}$, $i = 1, 2, \ldots, n$).
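
To spell out the reduction: with $X_i^*$ as above,
$$\frac{S_n - n\mu}{\sigma\sqrt{n}} = \frac{1}{\sqrt{n}}\sum_{i=1}^{n}\frac{X_i - \mu}{\sigma} = \frac{X_1^* + \cdots + X_n^*}{\sqrt{n}}, \qquad E[X_i^*] = 0, \quad \mathrm{Var}(X_i^*) = 1,$$
so the general statement follows from the case $\mu = 0$, $\sigma^2 = 1$ applied to the $X_i^*$.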

Let $M_{S_n}\!\left(\dfrac{t}{\sqrt{n}}\right) = E\!\left[e^{tS_n/\sqrt{n}}\right]$ be the m.g.f. of $S_n$ evaluated at $t/\sqrt{n}$, and let $M_X(t) = E\!\left[e^{tX_i}\right]$ be the m.g.f. of $X_i$, for $i = 1, \ldots, n$. Due to independence, we have
$$M_{S_n}\!\left(\frac{t}{\sqrt{n}}\right) = \left[M_X\!\left(\frac{t}{\sqrt{n}}\right)\right]^{n}.$$
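
In detail, independence lets the expectation of the product factor, and the identical distribution of the $X_i$ collapses the product to a power:
$$E\!\left[e^{tS_n/\sqrt{n}}\right] = E\!\left[\prod_{i=1}^{n} e^{tX_i/\sqrt{n}}\right] = \prod_{i=1}^{n} E\!\left[e^{tX_i/\sqrt{n}}\right] = \left[M_X\!\left(\frac{t}{\sqrt{n}}\right)\right]^{n}.$$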

Let $L_X(t) = \ln\!\left[M_X(t)\right]$, and note that
$$L_X(0) = 0, \qquad L_X'(0) = \frac{M_X'(0)}{M_X(0)} = \mu = 0,$$
and that
$$L_X''(0) = \frac{M_X(0)\,M_X''(0) - \left[M_X'(0)\right]^2}{\left[M_X(0)\right]^2} = 1.$$
To prove the theorem, we must show that
$$\lim_{n\to\infty}\left[M_X\!\left(\frac{t}{\sqrt{n}}\right)\right]^{n} = e^{\frac{1}{2}t^2},$$
or equivalently (since $\left[M_X(t/\sqrt{n})\right]^{n} = e^{\,n L_X(t/\sqrt{n})}$ and the exponential function is continuous and strictly increasing), that
$$\lim_{n\to\infty} n\,L_X\!\left(\frac{t}{\sqrt{n}}\right) = \frac{1}{2}t^2.$$
To show this, note that, for $z > 0$,
$$\begin{aligned}
\lim_{z\to\infty}\frac{L_X\!\left(t\,z^{-1/2}\right)}{z^{-1}}
&= \lim_{z\to\infty}\frac{-L_X'\!\left(t\,z^{-1/2}\right)\,t\,z^{-3/2}}{-2\,z^{-2}}
&&\text{by L'Hôpital's rule}\\
&= \lim_{z\to\infty}\frac{L_X'\!\left(t\,z^{-1/2}\right)\,t}{2\,z^{-1/2}}\\
&= \lim_{z\to\infty}\frac{-L_X''\!\left(t\,z^{-1/2}\right)\,t^2\,z^{-3/2}}{-2\,z^{-3/2}}
&&\text{by L'Hôpital's rule}\\
&= \lim_{z\to\infty} L_X''\!\left(t\,z^{-1/2}\right)\,\frac{t^2}{2}
= \frac{t^2}{2},
\end{aligned}$$
where the last equality holds because $t\,z^{-1/2}\to 0$ as $z\to\infty$ and $L_X''$ is continuous at $0$ with $L_X''(0) = 1$.
Thus
$$\lim_{n\to\infty} M_{S_n}\!\left(\frac{t}{\sqrt{n}}\right) = e^{\frac{1}{2}t^2},$$
the m.g.f. of the standard normal distribution. Since the m.g.f.s of $S_n/\sqrt{n}$ converge to the m.g.f. of $N$ for all $t$ in a neighborhood of $0$, it follows that $S_n/\sqrt{n} \Rightarrow N$, which proves the theorem.

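As an illustrative numerical check (not part of the proof), the short Python sketch below simulates standardized sums of i.i.d. Exponential(1) variables and compares their empirical quantiles with standard normal quantiles; the choice of distribution, sample sizes, and quantile levels is arbitrary.

import numpy as np

rng = np.random.default_rng(0)

# X_i ~ Exponential(1): mean mu = 1, standard deviation sigma = 1 (a skewed
# distribution, so the normal shape is not built in from the start).
mu, sigma = 1.0, 1.0
levels = [0.25, 0.50, 0.75, 0.90]   # quantile levels to compare
# Standard normal quantiles at these levels: about -0.674, 0.000, 0.674, 1.282.

for n in (5, 50, 500):
    # 100,000 replications of S_n = X_1 + ... + X_n.
    sums = rng.exponential(scale=1.0, size=(100_000, n)).sum(axis=1)
    standardized = (sums - n * mu) / (sigma * np.sqrt(n))
    print(f"n = {n:3d}  empirical quantiles:",
          np.round(np.quantile(standardized, levels), 3))

As n grows, the printed quantiles should approach the standard normal values, consistent with the theorem.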
¹ See Ross, S. (1984), A First Course in Probability.
² See Resnick, S. I. (1999), A Probability Path.