Lecture XVI

The characteristic function of a random
variable X is defined as

\phi_X(t) = E\left[e^{itX}\right] = E[\cos(tX) + i\sin(tX)] = E[\cos(tX)] + i\,E[\sin(tX)]

Note that this definition parallels the
definition of the moment-generating function

M_X(t) = E\left[e^{tX}\right]


Like the moment-generating function, there is
a one-to-one correspondence between the
characteristic function and the distribution of
a random variable: two random variables with
the same characteristic function have the
same distribution.

The characteristic function of the uniform
distribution on [0, 1] is

\phi_X(t) = \frac{e^{it} - 1}{it}

The characteristic function of the normal
distribution is

\phi_X(t) = e^{it\mu - \sigma^2 t^2 / 2}

The gamma distribution function

f_X(x) = \frac{\lambda^r x^{r-1} e^{-\lambda x}}{\Gamma(r)}, \quad x \in (0, \infty)

implies the characteristic function

\phi_X(t) = \left(\frac{1}{1 - it/\lambda}\right)^r
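Each of these closed forms can be checked against a Monte Carlo estimate of E[e^{itX}]. This is a sketch only: the test point t, the sample size, and the parameter values (μ, σ, r, λ) are arbitrary illustrations, and it assumes the uniform distribution is on [0, 1] and the gamma is parameterized by shape r and rate λ as above.

```python
import cmath
import random

def empirical_cf(samples, t):
    # Monte Carlo estimate of E[e^{itX}]
    return sum(cmath.exp(1j * t * x) for x in samples) / len(samples)

random.seed(1)
n, t = 200_000, 0.7

# Uniform(0, 1): phi(t) = (e^{it} - 1) / (it)
u = [random.random() for _ in range(n)]
phi_u = (cmath.exp(1j * t) - 1) / (1j * t)

# Normal(mu, sigma^2): phi(t) = exp(i t mu - sigma^2 t^2 / 2)
mu, sigma = 2.0, 1.5
g = [random.gauss(mu, sigma) for _ in range(n)]
phi_g = cmath.exp(1j * t * mu - sigma ** 2 * t ** 2 / 2)

# Gamma(shape r, rate lam): phi(t) = (1 - it/lam)^{-r}
# random.gammavariate takes (shape, scale), so scale = 1/lam
r, lam = 3.0, 2.0
ga = [random.gammavariate(r, 1 / lam) for _ in range(n)]
phi_ga = (1 - 1j * t / lam) ** (-r)

for label, draws, exact in [("uniform", u, phi_u),
                            ("normal", g, phi_g),
                            ("gamma", ga, phi_ga)]:
    print(label, abs(empirical_cf(draws, t) - exact) < 0.02)
```

Because |e^{itX}| = 1, the averages have bounded variance and the 0.02 tolerance is generous for this sample size.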

Taking a Taylor series expansion of \phi_Z around
the point t = 0, where Z = (X - \mu)/\sigma is the
standardized random variable, yields

\phi_Z(t) = \phi_Z(0) + \frac{1}{1!}\phi_Z'(0)\,t + \frac{1}{2!}\phi_Z''(0)\,t^2 + o(t^2)

To work with this expression we note that

\phi_X(0) = 1

for any random variable X, and

\phi_X^{(k)}(0) = i^k\,E[X^k]


Putting these two results into the second-order
Taylor series expansion,

\phi_Z(t) = 1 + i\,E[Z]\,t + \frac{i^2}{2}\,E[Z^2]\,t^2 + o(t^2) = 1 - \frac{t^2}{2} + o(t^2)

since E[Z] = 0 and E[Z^2] = 1.

Thus

\phi_Z(t) = \phi_Z(0) + \frac{1}{1!}\phi_Z'(0)\,t + \frac{1}{2!}\phi_Z''(0)\,t^2 + o(t^2)
          = 1 + i\,E[Z]\,t + \frac{i^2}{2}\,E[Z^2]\,t^2 + o(t^2)
          = 1 - \frac{t^2}{2} + o(t^2)

so that, for the standardized mean,

\left[\phi_Z\!\left(\frac{t}{\sqrt{n}}\right)\right]^n = \left[1 - \frac{t^2}{2n} + o\!\left(\frac{t^2}{n}\right)\right]^n \to e^{-t^2/2} = E\left[e^{itY}\right], \quad Y \sim N(0, 1)
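This convergence can be watched numerically. The sketch below uses the standardized Exponential(1) variable Z = X − 1, chosen only because its characteristic function e^{−it}/(1 − it) has a simple closed form, and shows that [φ_Z(t/√n)]^n approaches e^{−t²/2} as n grows.

```python
import cmath
import math

def phi_Z(t):
    # CF of Z = X - 1 with X ~ Exponential(1): E[e^{itZ}] = e^{-it} / (1 - it)
    return cmath.exp(-1j * t) / (1 - 1j * t)

t = 1.3
limit = math.exp(-t ** 2 / 2)  # characteristic function of N(0, 1) at t
gaps = {}
for n in (10, 100, 10_000):
    gaps[n] = abs(phi_Z(t / math.sqrt(n)) ** n - limit)
    print(n, gaps[n])
```

The gap shrinks at roughly the 1/√n rate dictated by the neglected third-moment term in the expansion.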

Application of Hölder's Inequality.
◦ Hölder's Inequality:

|E[XY]| \le E|XY| \le \left(E|X|^p\right)^{1/p}\left(E|Y|^q\right)^{1/q}

where 1/p + 1/q = 1.
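As a sanity check, the inequality can be verified on simulated data. The exponents p = 3, q = 3/2 and the two distributions below are arbitrary illustrations; note the inequality holds exactly for the sample averages, since they are expectations under the empirical distribution.

```python
import random

random.seed(2)
n = 50_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [random.expovariate(1.0) for _ in range(n)]

p, q = 3.0, 1.5  # conjugate exponents: 1/p + 1/q = 1
e_xy = abs(sum(x * y for x, y in zip(xs, ys)) / n)       # |E[XY]|
e_absxy = sum(abs(x * y) for x, y in zip(xs, ys)) / n    # E|XY|
rhs = (sum(abs(x) ** p for x in xs) / n) ** (1 / p) * \
      (sum(abs(y) ** q for y in ys) / n) ** (1 / q)      # Holder bound
print(e_xy <= e_absxy <= rhs)
```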
◦ Example 4.7.1: If X and Y have means \mu_X, \mu_Y and
variances \sigma_X^2, \sigma_Y^2, respectively, we can apply the
Cauchy-Schwarz Inequality (Hölder's inequality with
p = q = 2) to get

|E[(X - \mu_X)(Y - \mu_Y)]| \le \left\{E(X - \mu_X)^2\right\}^{1/2}\left\{E(Y - \mu_Y)^2\right\}^{1/2}
◦ Squaring both sides and substituting for variances
and covariances yields

[\mathrm{Cov}(X, Y)]^2 \le \sigma_X^2\,\sigma_Y^2

which implies that the absolute value of the
correlation coefficient is less than or equal to one.
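The covariance bound and the correlation bound can likewise be checked on simulated data; the slope 0.8 and noise standard deviation 0.5 below are arbitrary choices used only to manufacture a correlated pair.

```python
import math
import random

random.seed(3)
n = 20_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [0.8 * x + random.gauss(0, 0.5) for x in xs]  # correlated with xs

mx, my = sum(xs) / n, sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
vx = sum((x - mx) ** 2 for x in xs) / n
vy = sum((y - my) ** 2 for y in ys) / n

rho = cov / math.sqrt(vx * vy)   # sample correlation coefficient
print(cov ** 2 <= vx * vy, abs(rho) <= 1)
```

Both comparisons hold exactly, not just approximately, because Cauchy-Schwarz applies to the empirical moments as well.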

Chebychev’s Inequality: Let X be a random
variable and let g(x) be a nonnegative
function. Then, for any r>0
Eg  X 
Pg  X   r  
r
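A quick simulation illustrates the general form of the inequality; the choices g(x) = x⁴, r = 5, and normal draws are arbitrary. The empirical frequency also satisfies the bound exactly, because each indicator 1{g(X) ≥ r} is at most g(X)/r.

```python
import random

random.seed(4)
draws = [random.gauss(0.0, 1.0) for _ in range(100_000)]

def g(x):
    return x ** 4  # any nonnegative function works

r = 5.0
bound = sum(g(x) for x in draws) / len(draws) / r  # E[g(X)] / r
freq = sum(g(x) >= r for x in draws) / len(draws)  # P[g(X) >= r]
print(freq <= bound)
```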

Example 4.7.3: The most widespread use of
Chebychev's Inequality involves means and
variances. Let g(x) = (x - \mu)^2/\sigma^2, where \mu = E[X]
and \sigma^2 = V(X), and let r = t^2. Then

P\!\left[\frac{(X - \mu)^2}{\sigma^2} \ge t^2\right] \le \frac{1}{t^2}\,E\!\left[\frac{(X - \mu)^2}{\sigma^2}\right] = \frac{1}{t^2}

Since (X - \mu)^2 \ge t^2\sigma^2 is equivalent to |X - \mu| \ge t\sigma, this gives

P[|X - \mu| \ge t\sigma] \le \frac{1}{t^2}
◦ Letting t = 2, this becomes

P[|X - \mu| \ge 2\sigma] \le \frac{1}{4} = .25
◦ However, this inequality may not say much, since
for the normal distribution

P[|X - \mu| \ge 2\sigma] = 2\int_{\mu + 2\sigma}^{\infty} \frac{1}{\sqrt{2\pi\sigma^2}}\,\exp\!\left(-\frac{(x - \mu)^2}{2\sigma^2}\right) dx = 2(.02275) = .0455
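The two numbers can be reproduced with the standard normal CDF written via `math.erf`; `norm_cdf` is a helper defined here, not a library call.

```python
import math

def norm_cdf(z):
    # standard normal CDF via the error function
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

exact = 2 * norm_cdf(-2.0)  # P[|X - mu| >= 2 sigma] for a normal
chebychev = 1 / 2 ** 2      # Chebychev bound at t = 2
print(round(exact, 4), chebychev)
```

The exact two-sided tail (.0455) is far below the distribution-free Chebychev bound (.25), which is the point of the example.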


Casella and Berger offer a proof of the Central
Limit Theorem (Theorem 5.3.3) based on the
moment generating function instead of the
characteristic function. However, they note
that the proof using the characteristic
function yields a stronger result.

Starting from the Binomial distribution
function:
b(n, r, p) = \frac{n!}{(n - r)!\,r!}\,p^r (1 - p)^{n - r}

First, assume that n = 10 and p = .5. The
probability of r \le 3 is:

P[r \le 3] = b(10, 0, .5) + b(10, 1, .5) + b(10, 2, .5) + b(10, 3, .5) = .1719

Note that this distribution has a mean of 5
and a variance of 2.5. Given this we can
compute

z^* = \frac{3 - 5}{\sqrt{2.5}} = -1.265

Integrating the standard normal distribution
function from negative infinity to –1.265
yields


P[z^* \le -1.265] = \int_{-\infty}^{-1.265} \frac{1}{\sqrt{2\pi}}\,\exp\!\left(-\frac{z^2}{2}\right) dz = .1030
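The z-score and the normal tail area can both be reproduced with `math.erf`; `norm_cdf` below is a local helper, not a library function.

```python
import math

def norm_cdf(z):
    # standard normal CDF via the error function
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

mean, var = 5.0, 2.5                  # Binomial(10, .5): np and np(1 - p)
z_star = (3 - mean) / math.sqrt(var)  # about -1.265
approx = norm_cdf(z_star)             # normal approximation to P[r <= 3]
print(round(z_star, 3), round(approx, 4))
```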

Expanding the sample size to 20 and
examining the probability that r \le 6 yields:

P[r \le 6] = \sum_{i=0}^{6} b(20, i, .5) = .0577

This time the mean of the distribution is 10
and the variance is 5. The resulting z^* = (6 - 10)/\sqrt{5} = -1.7889, and P[z \le -1.7889] = .0368.
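The n = 20 figures check out the same way; this sketch combines the exact binomial tail with the normal approximation.

```python
import math
from math import comb

def norm_cdf(z):
    # standard normal CDF via the error function
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

n, p = 20, 0.5
exact = sum(comb(n, r) * p ** r * (1 - p) ** (n - r)
            for r in range(7))                        # P[r <= 6]
z_star = (6 - n * p) / math.sqrt(n * p * (1 - p))     # mean 10, variance 5
approx = norm_cdf(z_star)                             # normal approximation
print(round(exact, 4), round(z_star, 4), round(approx, 4))
```

The gap between the exact and approximate tails (.0577 vs .0368) is already smaller than it was at n = 10 (.1719 vs .1030).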

As the sample size increases, the binomial
probability approaches the normal
probability.