Characteristic Functions of Random Variables
Just as discrete-time and continuous-time signals can be characterised in the frequency domain, the probability mass function and the probability density function of a random variable can also be characterised in the frequency domain by means of the characteristic function. These functions are particularly important in

- calculating the moments of a random variable, and
- evaluating the PDF of combinations of multiple random variables.
Characteristic function
Consider a random variable X with probability density function f_X(x). The characteristic function of X, denoted by \phi_X(\omega), is defined as

    \phi_X(\omega) = E e^{j\omega X} = \int_{-\infty}^{\infty} e^{j\omega x} f_X(x)\, dx

where j = \sqrt{-1}.
Note the following:

- \phi_X(\omega) is a complex quantity, representing the Fourier transform of f_X(x), with e^{j\omega x} used instead of the traditional e^{-j\omega x}. This implies that the properties of the Fourier transform apply to the characteristic function.
- The interpretation of \phi_X(\omega) as the expectation of e^{j\omega X} helps in calculating moments with the help of the characteristic function.
- Since f_X(x) is always non-negative and \int_{-\infty}^{\infty} f_X(x)\, dx = 1, \phi_X(\omega) always exists. [Recall that the Fourier transform of a function f(t) exists if \int_{-\infty}^{\infty} |f(t)|\, dt < \infty, i.e., f(t) is absolutely integrable.]
We can get f_X(x) from \phi_X(\omega) by the inverse transform

    f_X(x) = \frac{1}{2\pi} \int_{-\infty}^{\infty} \phi_X(\omega) e^{-j\omega x}\, d\omega
Example 1 Consider the random variable X with pdf f_X(x) given by

    f_X(x) = \frac{1}{b-a},   a \le x \le b
           = 0   otherwise.

Solution: The characteristic function is given by

    \phi_X(\omega) = \frac{1}{b-a} \int_a^b e^{j\omega x}\, dx
                   = \frac{1}{b-a} \left[ \frac{e^{j\omega x}}{j\omega} \right]_a^b
                   = \frac{e^{j\omega b} - e^{j\omega a}}{j\omega (b-a)}
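The closed form above can be checked numerically, comparing it against direct integration of e^{j\omega x} f_X(x) over [a, b]. This is only a sketch; the values of a, b, and \omega are arbitrary illustrative choices.

```python
import cmath

# Check of the uniform(a, b) characteristic function
# phi_X(w) = (e^{jwb} - e^{jwa}) / (jw (b - a));  a, b, w are illustrative.
a, b, w = 2.0, 5.0, 1.3

# Closed form from the derivation
closed = (cmath.exp(1j * w * b) - cmath.exp(1j * w * a)) / (1j * w * (b - a))

# Direct numerical integration of e^{jwx} f_X(x) over [a, b] (trapezoidal rule)
n = 100_000
h = (b - a) / n
f = 1.0 / (b - a)                      # uniform density on [a, b]
vals = [cmath.exp(1j * w * (a + k * h)) * f for k in range(n + 1)]
numeric = h * (vals[0] / 2 + sum(vals[1:-1]) + vals[-1] / 2)

assert abs(closed - numeric) < 1e-8
```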
Example 2 The characteristic function of the random variable X with f_X(x) = \lambda e^{-\lambda x}, \lambda > 0, x \ge 0 is

    \phi_X(\omega) = \int_0^{\infty} e^{j\omega x} \lambda e^{-\lambda x}\, dx
                   = \lambda \int_0^{\infty} e^{-(\lambda - j\omega) x}\, dx
                   = \frac{\lambda}{\lambda - j\omega}
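The exponential result can likewise be verified by truncated numerical integration; \lambda and \omega below are illustrative, and the integral is cut off where e^{-\lambda x} is negligible.

```python
import cmath
import math

# Check of the exponential(lam) characteristic function
# phi_X(w) = lam / (lam - j*w);  lam and w are illustrative.
lam, w = 2.0, 0.7

closed = lam / (lam - 1j * w)

# Truncate the integral at x = 40/lam, where e^{-lam x} is negligible
n, upper = 200_000, 40.0 / lam
h = upper / n
vals = [cmath.exp(1j * w * (k * h)) * lam * math.exp(-lam * k * h)
        for k in range(n + 1)]
numeric = h * (vals[0] / 2 + sum(vals[1:-1]) + vals[-1] / 2)

assert abs(closed - numeric) < 1e-6
```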
Characteristic function of a discrete random variable:
Suppose X is a random variable taking values from the discrete set R_X = \{x_1, x_2, \ldots\} with probability mass function p_X(x_i) for the value x_i. Then

    \phi_X(\omega) = E e^{j\omega X} = \sum_{x_i \in R_X} p_X(x_i) e^{j\omega x_i}

Note that \phi_X(\omega) can be interpreted as the discrete-time Fourier transform, with e^{j\omega x_i} substituting e^{-j\omega x_i} in the original discrete-time Fourier transform. The inverse relation is

    p_X(x_i) = \frac{1}{2\pi} \int_{2\pi} e^{-j\omega x_i} \phi_X(\omega)\, d\omega
Example 3 Suppose X is a random variable with the probability mass function

    p_X(k) = \binom{n}{k} p^k (1-p)^{n-k},   k = 0, 1, \ldots, n

Then

    \phi_X(\omega) = \sum_{k=0}^{n} \binom{n}{k} p^k (1-p)^{n-k} e^{j\omega k}
                   = \sum_{k=0}^{n} \binom{n}{k} \left( p e^{j\omega} \right)^k (1-p)^{n-k}
                   = \left( p e^{j\omega} + 1 - p \right)^n   (using the binomial theorem)
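The binomial-theorem step can be confirmed by summing the pmf directly and comparing with the closed form; n, p, and \omega below are illustrative.

```python
import cmath
from math import comb

# Check of the binomial(n, p) characteristic function
# phi_X(w) = (p e^{jw} + 1 - p)^n;  n, p, w are illustrative.
n, p, w = 7, 0.3, 0.9

closed = (p * cmath.exp(1j * w) + 1 - p) ** n

# Direct sum over the pmf
direct = sum(comb(n, k) * p**k * (1 - p)**(n - k) * cmath.exp(1j * w * k)
             for k in range(n + 1))

assert abs(closed - direct) < 1e-12
```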
Example 4 The characteristic function of the discrete random variable X with p_X(k) = p(1-p)^k, k = 0, 1, \ldots is

    \phi_X(\omega) = \sum_{k=0}^{\infty} e^{j\omega k} p (1-p)^k
                   = p \sum_{k=0}^{\infty} \left( (1-p) e^{j\omega} \right)^k
                   = \frac{p}{1 - (1-p) e^{j\omega}}
Moments and the characteristic function
Given the characteristic function \phi_X(\omega), the kth moment is given by

    E X^k = \frac{1}{j^k} \left. \frac{d^k}{d\omega^k} \phi_X(\omega) \right|_{\omega = 0}

To prove this, consider the power series expansion of e^{j\omega X}:

    e^{j\omega X} = 1 + j\omega X + \frac{(j\omega)^2 X^2}{2!} + \cdots + \frac{(j\omega)^n X^n}{n!} + \cdots

Taking the expectation of both sides and assuming E X, E X^2, \ldots, E X^n to exist, we get

    \phi_X(\omega) = 1 + j\omega E X + \frac{(j\omega)^2 E X^2}{2!} + \cdots + \frac{(j\omega)^n E X^n}{n!} + \cdots

Taking the first derivative of \phi_X(\omega) with respect to \omega at \omega = 0, we get

    \left. \frac{d \phi_X(\omega)}{d\omega} \right|_{\omega = 0} = j E X

Similarly, taking the nth derivative of \phi_X(\omega) with respect to \omega at \omega = 0, we get

    \left. \frac{d^n \phi_X(\omega)}{d\omega^n} \right|_{\omega = 0} = j^n E X^n

Thus

    E X = \frac{1}{j} \left. \frac{d \phi_X(\omega)}{d\omega} \right|_{\omega = 0},   E X^n = \frac{1}{j^n} \left. \frac{d^n \phi_X(\omega)}{d\omega^n} \right|_{\omega = 0}
Example 5 First two moments of the random variable in Example 2:

    \phi_X(\omega) = \frac{\lambda}{\lambda - j\omega}

    \frac{d}{d\omega} \phi_X(\omega) = \frac{j\lambda}{(\lambda - j\omega)^2}

    \frac{d^2}{d\omega^2} \phi_X(\omega) = \frac{2 j^2 \lambda}{(\lambda - j\omega)^3}

    E X = \left. \frac{1}{j} \cdot \frac{j\lambda}{(\lambda - j\omega)^2} \right|_{\omega = 0} = \frac{1}{\lambda}

    E X^2 = \left. \frac{1}{j^2} \cdot \frac{2 j^2 \lambda}{(\lambda - j\omega)^3} \right|_{\omega = 0} = \frac{2}{\lambda^2}
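The moment formula can be checked without symbolic differentiation by applying finite-difference derivatives to \phi_X(\omega) = \lambda/(\lambda - j\omega) at \omega = 0; \lambda and the step size below are illustrative.

```python
# Check of E[X^k] = (1/j^k) d^k phi/dw^k at w = 0 for the exponential(lam)
# characteristic function phi(w) = lam / (lam - j*w).
# lam and the step size h are illustrative choices.
lam, h = 1.5, 1e-4

def phi(w):
    return lam / (lam - 1j * w)

# Central finite differences for the first and second derivatives at 0
d1 = (phi(h) - phi(-h)) / (2 * h)
d2 = (phi(h) - 2 * phi(0) + phi(-h)) / h**2

ex1 = (d1 / 1j).real         # E[X]   = (1/j)   * phi'(0)  = 1/lam
ex2 = (d2 / 1j**2).real      # E[X^2] = (1/j^2) * phi''(0) = 2/lam^2

assert abs(ex1 - 1 / lam) < 1e-6
assert abs(ex2 - 2 / lam**2) < 1e-4
```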
Probability generating function:
If the random variable under consideration takes non-negative integer values only, it is convenient to characterise the random variable in terms of the probability generating function G_X(z), defined by

    G_X(z) = E z^X = \sum_{k=0}^{\infty} p_X(k) z^k

Note that

- G_X(z) is related to the z-transform; in the actual z-transform, z^{-k} is used instead of z^k.
- G_X(1) = \sum_{k=0}^{\infty} p_X(k) = 1.
- The characteristic function of X is given by \phi_X(\omega) = G_X(e^{j\omega}).
- G_X'(z) = \sum_{k=0}^{\infty} k p_X(k) z^{k-1}, so that G_X'(1) = \sum_{k=0}^{\infty} k p_X(k) = E X.
- G_X''(z) = \sum_{k=0}^{\infty} k(k-1) p_X(k) z^{k-2} = \sum_{k=0}^{\infty} k^2 p_X(k) z^{k-2} - \sum_{k=0}^{\infty} k p_X(k) z^{k-2}, so that G_X''(1) = \sum_{k=0}^{\infty} k^2 p_X(k) - \sum_{k=0}^{\infty} k p_X(k) = E X^2 - E X.
- \sigma_X^2 = E X^2 - (E X)^2 = G_X''(1) + G_X'(1) - \left( G_X'(1) \right)^2.
Example: Binomial distribution

    p_X(x) = \binom{n}{x} p^x (1-p)^{n-x}

    G_X(z) = \sum_x \binom{n}{x} p^x (1-p)^{n-x} z^x
           = \sum_x \binom{n}{x} (pz)^x (1-p)^{n-x}
           = (1 - p + pz)^n

    G_X'(1) = E X = np
    G_X''(1) = E X^2 - E X = n(n-1) p^2

    E X^2 = G_X''(1) + E X = n(n-1) p^2 + np = n^2 p^2 + npq   (with q = 1 - p)
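The PGF moment identities for the binomial case can be checked numerically by differentiating G(z) = (1 - p + pz)^n at z = 1 with finite differences; n, p, and the step size below are illustrative.

```python
# Check of binomial PGF moments: G(z) = (1 - p + p z)^n,
# G'(1) = np, G''(1) = n(n-1)p^2.  n, p, h are illustrative.
n, p, h = 6, 0.35, 1e-5

def G(z):
    return (1 - p + p * z) ** n

g1 = (G(1 + h) - G(1 - h)) / (2 * h)            # ~ G'(1) = E[X]
g2 = (G(1 + h) - 2 * G(1) + G(1 - h)) / h**2    # ~ G''(1) = E[X^2] - E[X]

assert abs(g1 - n * p) < 1e-6
assert abs(g2 - n * (n - 1) * p**2) < 1e-3

# Variance = G''(1) + G'(1) - (G'(1))^2 = npq
var = g2 + g1 - g1**2
assert abs(var - n * p * (1 - p)) < 1e-3
```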
Example: Geometric distribution

    p_X(x) = p (1-p)^x

    G_X(z) = \sum_x p (1-p)^x z^x = p \sum_x \left( (1-p) z \right)^x = \frac{p}{1 - (1-p) z}

    G_X'(z) = \frac{p (1-p)}{\left( 1 - (1-p) z \right)^2}

    G_X'(1) = \frac{p (1-p)}{\left( 1 - (1-p) \right)^2} = \frac{p(1-p)}{p^2} = \frac{q}{p}   (with q = 1 - p)

    G_X''(z) = \frac{2 p (1-p)^2}{\left( 1 - (1-p) z \right)^3}

    G_X''(1) = \frac{2 p q^2}{p^3} = \frac{2 q^2}{p^2}

    E X^2 = G_X''(1) + G_X'(1) = \frac{2 q^2}{p^2} + \frac{q}{p}

    \mathrm{Var}(X) = E X^2 - (E X)^2 = \frac{2 q^2}{p^2} + \frac{q}{p} - \left( \frac{q}{p} \right)^2 = \frac{q^2}{p^2} + \frac{q}{p} = \frac{q}{p^2}
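The same finite-difference check applies to the geometric PGF; p and the step size below are illustrative.

```python
# Check of geometric PGF moments: G(z) = p / (1 - (1-p) z),
# E[X] = q/p and Var(X) = q/p^2 with q = 1 - p.  p and h are illustrative.
p, h = 0.3, 1e-5
q = 1 - p

def G(z):
    return p / (1 - q * z)

g1 = (G(1 + h) - G(1 - h)) / (2 * h)            # ~ G'(1) = E[X]
g2 = (G(1 + h) - 2 * G(1) + G(1 - h)) / h**2    # ~ G''(1) = E[X^2] - E[X]

assert abs(g1 - q / p) < 1e-4

var = g2 + g1 - g1**2
assert abs(var - q / p**2) < 1e-2
```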
Moment Generating Function:
Sometimes it is convenient to work with a function similar to the Laplace transform, known as the moment generating function. For a random variable X, the moment generating function M_X(s) is defined by

    M_X(s) = E e^{sX} = \int_{R_X} f_X(x) e^{sx}\, dx

where R_X is the range of the random variable X. If X is a non-negative continuous random variable, we can write

    M_X(s) = \int_0^{\infty} f_X(x) e^{sx}\, dx

Note the following:

    M_X'(s) = \int_0^{\infty} x f_X(x) e^{sx}\, dx,   so that M_X'(0) = E X

    \frac{d^k}{ds^k} M_X(s) = \int_0^{\infty} x^k f_X(x) e^{sx}\, dx,   which at s = 0 equals E X^k
Example Let X be a continuous random variable with

    f_X(x) = \frac{\lambda}{\pi \left( x^2 + \lambda^2 \right)},   -\infty < x < \infty,   \lambda > 0

Then E X = \int_{-\infty}^{\infty} x f_X(x)\, dx exists only if \int_{-\infty}^{\infty} |x| f_X(x)\, dx is finite. But

    \int_{-\infty}^{\infty} |x| f_X(x)\, dx = \int_0^{\infty} \frac{\lambda}{\pi} \cdot \frac{2x}{x^2 + \lambda^2}\, dx = \frac{\lambda}{\pi} \left[ \ln \left( x^2 + \lambda^2 \right) \right]_0^{\infty} = \infty

Hence E X does not exist. This density function is known as the Cauchy density function.
The joint characteristic function of two random variables X and Y is defined by

    \phi_{X,Y}(\omega_1, \omega_2) = E e^{j\omega_1 X + j\omega_2 Y} = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{X,Y}(x, y) e^{j\omega_1 x + j\omega_2 y}\, dy\, dx

and the joint moment generating function of X and Y is defined by

    M_{X,Y}(s_1, s_2) = E e^{s_1 X + s_2 Y} = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{X,Y}(x, y) e^{s_1 x + s_2 y}\, dx\, dy

If Z = aX + bY, then

    M_Z(s) = E e^{Zs} = E e^{(aX + bY)s} = M_{X,Y}(as, bs)

Suppose X and Y are independent. Then f_{X,Y}(x, y) = f_X(x) f_Y(y), so

    M_{X,Y}(s_1, s_2) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} e^{s_1 x + s_2 y} f_{X,Y}(x, y)\, dy\, dx
                      = \int_{-\infty}^{\infty} e^{s_1 x} f_X(x)\, dx \int_{-\infty}^{\infty} e^{s_2 y} f_Y(y)\, dy
                      = M_X(s_1) M_Y(s_2)

In particular, if Z = X + Y and X and Y are independent, then

    M_Z(s) = M_{X,Y}(s, s) = M_X(s) M_Y(s)

Using the convolution property of the Laplace transform, we get

    f_Z(z) = f_X(z) * f_Y(z)
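The convolution result can be illustrated numerically: for independent X, Y uniform on [0, 1], the convolution of the densities is the triangular density on [0, 2]. The grid size below is an illustrative choice.

```python
# Sketch of f_Z = f_X * f_Y for Z = X + Y with X, Y independent Uniform(0, 1):
# the convolution is the triangular density on [0, 2].  n is illustrative.
n = 1000
h = 1.0 / n

def f_uniform(x):
    """Density of Uniform(0, 1)."""
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def f_Z(z):
    """Riemann-sum approximation of (f_X * f_Y)(z) = int f_X(t) f_Y(z - t) dt."""
    return sum(f_uniform(k * h) * f_uniform(z - k * h) for k in range(n + 1)) * h

# Exact triangular density: f_Z(z) = z on [0, 1] and 2 - z on [1, 2]
assert abs(f_Z(0.5) - 0.5) < 5e-3
assert abs(f_Z(1.5) - 0.5) < 5e-3
```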
Let us recall the MGF of a Gaussian random variable X \sim N(\mu_X, \sigma_X^2):

    M_X(s) = E e^{Xs}
           = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\, \sigma_X} e^{-\frac{(x - \mu_X)^2}{2\sigma_X^2}} e^{xs}\, dx
           = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\, \sigma_X} e^{-\frac{x^2 - 2(\mu_X + \sigma_X^2 s) x + (\mu_X + \sigma_X^2 s)^2}{2\sigma_X^2}} e^{\frac{\sigma_X^4 s^2 + 2 \mu_X \sigma_X^2 s}{2\sigma_X^2}}\, dx   (completing the square)
           = e^{\mu_X s + \frac{\sigma_X^2 s^2}{2}} \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\, \sigma_X} e^{-\frac{(x - \mu_X - \sigma_X^2 s)^2}{2\sigma_X^2}}\, dx
           = e^{\mu_X s + \frac{\sigma_X^2 s^2}{2}}
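The Gaussian MGF can be verified by numerically integrating f_X(x) e^{sx} and comparing against the closed form; \mu, \sigma, s and the integration range below are illustrative choices.

```python
import math

# Check of the Gaussian MGF M_X(s) = exp(mu*s + sigma^2 s^2 / 2).
# mu, sigma, s and the integration range are illustrative.
mu, sigma, s = 0.5, 1.2, 0.8

closed = math.exp(mu * s + sigma**2 * s**2 / 2)

# Trapezoidal integration of f_X(x) e^{sx} over mu +/- 12 sigma
n = 200_000
lo, hi = mu - 12 * sigma, mu + 12 * sigma
h = (hi - lo) / n

def integrand(x):
    pdf = math.exp(-(x - mu) ** 2 / (2 * sigma**2)) / (math.sqrt(2 * math.pi) * sigma)
    return pdf * math.exp(s * x)

vals = [integrand(lo + k * h) for k in range(n + 1)]
numeric = h * (vals[0] / 2 + sum(vals[1:-1]) + vals[-1] / 2)

assert abs(closed - numeric) / closed < 1e-6
```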
We have

    M_{X,Y}(s_1, s_2) = E e^{Xs_1 + Ys_2}
                      = E\left( 1 + (Xs_1 + Ys_2) + \frac{(Xs_1 + Ys_2)^2}{2!} + \cdots \right)
                      = 1 + s_1 E X + s_2 E Y + \frac{s_1^2 E X^2}{2} + \frac{s_2^2 E Y^2}{2} + s_1 s_2 E XY + \cdots

Hence

    E X = \left. \frac{\partial}{\partial s_1} M_{X,Y}(s_1, s_2) \right|_{s_1 = 0,\, s_2 = 0}

    E Y = \left. \frac{\partial}{\partial s_2} M_{X,Y}(s_1, s_2) \right|_{s_1 = 0,\, s_2 = 0}

    E XY = \left. \frac{\partial^2}{\partial s_1 \partial s_2} M_{X,Y}(s_1, s_2) \right|_{s_1 = 0,\, s_2 = 0}

We can generate the joint moments of the RVs from the moment generating function.
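As a sketch, the mixed-partial identity for E[XY] can be checked on a small discrete joint pmf, approximating the mixed partial at (0, 0) by central differences; the pmf values and step size below are illustrative.

```python
import math

# Check of E[XY] = d^2 M / (ds1 ds2) at (0, 0) for a small discrete joint pmf
# (the values and probabilities are illustrative; they sum to 1).
pmf = {(0, 1): 0.2, (1, 0): 0.3, (1, 2): 0.1, (2, 1): 0.4}

def M(s1, s2):
    """Joint MGF as a finite sum over the pmf."""
    return sum(p * math.exp(s1 * x + s2 * y) for (x, y), p in pmf.items())

# Central-difference approximation of the mixed partial at (0, 0)
h = 1e-4
mixed = (M(h, h) - M(h, -h) - M(-h, h) + M(-h, -h)) / (4 * h * h)

exy = sum(p * x * y for (x, y), p in pmf.items())   # E[XY] computed directly
assert abs(mixed - exy) < 1e-6
```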