Transformation Techniques
In probability theory, various transformation techniques are used to simplify moment calculations. We will discuss four of these functions here:
1. Probability Generating Function
2. Moment Generating Function
3. Characteristic Function
4. Laplace Transform of the probability density function
Probability Generating Function
A tool that simplifies computations for integer-valued discrete random variables.
Let X be a non-negative integer-valued random variable with $P(X = k) = p_k$. The Probability Generating Function (PGF) of X is defined by
$G_X(z) = E[z^X] = \sum_{k=0}^{\infty} p_k z^k = p_0 + p_1 z + p_2 z^2 + \cdots + p_k z^k + \cdots$
where z is a complex number with $|z| \le 1$.
$G_X(z)$ is nothing more than the z-transform of the sequence $p_k$.
$G_X(1) = \sum_k p_k = 1$
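As an illustration (ours, not from the slides), the following sketch builds the PGF of an assumed example distribution, a fair six-sided die, with sympy and checks that $G_X(1) = 1$:

    import sympy as sp

    z = sp.symbols('z')
    # Assumed example: fair six-sided die, P(X = k) = 1/6 for k = 1..6
    p = {k: sp.Rational(1, 6) for k in range(1, 7)}
    G = sum(pk * z**k for k, pk in p.items())   # G_X(z) = sum_k p_k z^k

    print(sp.expand(G))      # z/6 + z**2/6 + ... + z**6/6
    print(G.subs(z, 1))      # 1, since the p_k sum to one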
Generating Functions
Let K be a non-negative integer-valued random variable with probability distribution
$p_j = P[K = j]$ for $j = 0, 1, 2, \ldots$
The power series in which the probability $p_j$ is the coefficient of $z^j$,
$g(z) = p_0 + p_1 z + p_2 z^2 + p_3 z^3 + \cdots$,
is the probability generating function of the random variable K.
A few properties:
$g(1) = \sum_j p_j = 1$, where z is a complex number and the series converges at least for $|z| \le 1$.
Expected value: $E[K] = \sum_{j=0}^{\infty} j p_j$. Since
$\frac{d}{dz} g(z) = \sum_{j=1}^{\infty} j p_j z^{j-1}$,
setting z = 1 gives
$E[K] = g^{(1)}(1)$
Similarly, $V[K] = g^{(2)}(1) + g^{(1)}(1) - [g^{(1)}(1)]^2$
Reference: Robert B. Cooper, Introduction to Queueing Theory.
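A minimal sketch (our addition) checking these derivative identities on the same die PGF used above; the values 7/2 and 35/12 are the familiar mean and variance of a fair die:

    import sympy as sp

    z = sp.symbols('z')
    g = sum(sp.Rational(1, 6) * z**k for k in range(1, 7))   # die PGF again

    mean = sp.diff(g, z).subs(z, 1)                      # E[K] = g'(1)
    var = sp.diff(g, z, 2).subs(z, 1) + mean - mean**2   # g''(1) + g'(1) - g'(1)^2
    print(mean, var)                                     # 7/2, 35/12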
Moment Generating Function
$m_g(t)$: the moment generating function is the expected value of the function $e^{tX}$, where t is a real variable and X is the random variable:
$m_g(t) = E[e^{tX}] = \sum_{x_i \in R_X} p(x_i) \, e^{t x_i}$ (X discrete)
$\qquad = \int_{R_X} f(x) \, e^{tx} \, dx$ (X continuous)
If $m_g(t)$ exists for all real values of t in some small interval $(-d, d)$, $d > 0$, about the origin, it can be shown that the probability distribution function can be obtained from $m_g(t)$. We assume $m_g(t)$ exists in a small region of t about the origin.
Moment Generating Function-2
$e^{tX} = 1 + tX + \frac{t^2 X^2}{2!} + \frac{t^3 X^3}{3!} + \cdots$
Assume X is a continuous random variable. Then
$m_g(t) = E[e^{tX}] = \int_{R_X} f(x) \, e^{tx} \, dx = \int_{R_X} \sum_{i=0}^{\infty} \frac{t^i x^i}{i!} \, f(x) \, dx$
$= \sum_{i=0}^{\infty} \frac{t^i}{i!} \int_{R_X} x^i f(x) \, dx = \sum_{i=0}^{\infty} \frac{t^i}{i!} E[X^i]$
$= E[X^0] + t E[X^1] + \frac{t^2}{2!} E[X^2] + \cdots$
Moment Generating Function-3
$m_g(t) = E[X^0] + t E[X^1] + \frac{t^2}{2!} E[X^2] + \cdots$
$m_g^{(1)}(t) = E[X^1] + t E[X^2] + \frac{t^2}{2!} E[X^3] + \cdots$
$m_g^{(2)}(t) = E[X^2] + t E[X^3] + \frac{t^2}{2!} E[X^4] + \cdots$
At t = 0:
$m_g^{(1)}(0) = E[X^1]$
$m_g^{(2)}(0) = E[X^2]$
$Var[X] = E[X^2] - [E[X]]^2 = m_g^{(2)}(0) - [m_g^{(1)}(0)]^2$
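A sketch (ours) verifying these identities symbolically. The MGF $\lambda/(\lambda - t)$ is taken from the exponential example on a later slide; its Taylor series around t = 0 exhibits the moments, and the derivatives at t = 0 return them directly:

    import sympy as sp

    t = sp.symbols('t')
    lam = sp.symbols('lam', positive=True)

    # MGF of the exponential distribution (from the example slide: lam/(lam - v))
    m = lam / (lam - t)

    # coefficient of t**n is E[X^n]/n!, so E[X^n] = n!/lam**n
    print(sp.series(m, t, 0, 4))            # 1 + t/lam + t**2/lam**2 + t**3/lam**3 + O(t**4)
    print(sp.diff(m, t, 1).subs(t, 0))      # E[X]   = 1/lam
    print(sp.simplify(sp.diff(m, t, 2).subs(t, 0)))  # E[X^2] = 2/lam**2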
Characteristic Function
The characteristic function of a random variable X is
$\phi_X(u) = E[e^{juX}] = \int_{-\infty}^{\infty} e^{jux} f_X(x) \, dx$
where $j = \sqrt{-1}$ and u is an arbitrary real variable.
Note: except for the sign of the exponent, the characteristic function is the Fourier transform of the pdf of X.
$\phi_X(u) = \int_{-\infty}^{\infty} f_X(x) \left[1 + jux + \frac{(jux)^2}{2!} + \frac{(jux)^3}{3!} + \cdots\right] dx = 1 + juE[X] + \frac{(ju)^2}{2!} E[X^2] + \frac{(ju)^3}{3!} E[X^3] + \cdots$
Let u = 0. Then
$\phi_X(0) = 1$
$\phi_X^{(1)}(0) = \left.\frac{d\phi_X(u)}{du}\right|_{u=0} = jE[X]$
$\phi_X^{(2)}(0) = \left.\frac{d^2\phi_X(u)}{du^2}\right|_{u=0} = j^2 E[X^2]$
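A short check (our addition) using the exponential characteristic function from the later example slide, $\phi_X(u) = \lambda/(\lambda - ju)$; sympy's I stands in for j, and the derivatives at u = 0 recover $jE[X]$ and $j^2 E[X^2]$:

    import sympy as sp

    u = sp.symbols('u', real=True)
    lam = sp.symbols('lam', positive=True)

    phi = lam / (lam - sp.I * u)            # characteristic function of the exponential

    d1 = sp.diff(phi, u, 1).subs(u, 0)      # = j * E[X]
    d2 = sp.diff(phi, u, 2).subs(u, 0)      # = j^2 * E[X^2]
    print(sp.simplify(d1 / sp.I))           # E[X]   = 1/lam
    print(sp.simplify(d2 / sp.I**2))        # E[X^2] = 2/lam**2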
Laplace Transform
Let the CDF of the traffic arrival process be A(x), where X is the random variable for the interarrival time between two customers:
$A(x) = P[X \le x]$
The pdf (probability density function) is denoted by a(x). The Laplace transform of a(x) is denoted by $A^*(s)$ and is given by
$A^*(s) = E[e^{-sX}] = \int_{-\infty}^{\infty} e^{-sx} a(x) \, dx$
Since such random variables take only non-negative values, we can write the transform as
$A^*(s) = \int_0^{\infty} e^{-sx} a(x) \, dx$
Techniques similar to those used for the moment generating function or the characteristic function show that
$A^{*(n)}(0) = (-1)^n E[X^n]$
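The sketch below (ours) checks $A^{*(n)}(0) = (-1)^n E[X^n]$ on the exponential transform $A^*(s) = \lambda/(\lambda + s)$ used on the next slide; each moment comes out to $n!/\lambda^n$:

    import sympy as sp

    s = sp.symbols('s')
    lam = sp.symbols('lam', positive=True)

    A = lam / (lam + s)                     # Laplace transform of a(x) = lam*e^(-lam*x)
    for n in (1, 2, 3):
        moment = (-1)**n * sp.diff(A, s, n).subs(s, 0)
        print(n, sp.simplify(moment))       # 1/lam, 2/lam**2, 6/lam**3 = n!/lam**n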
Example
For a continuous random variable, the pdf is given by
$f_X(x) = \begin{cases} \lambda e^{-\lambda x} & x > 0 \\ 0 & x \le 0 \end{cases}$
Laplace transform: $A^*(s) = \dfrac{\lambda}{\lambda + s}$
Characteristic function: $\phi_X(u) = \dfrac{\lambda}{\lambda - ju}$
Moment generating function: $m_g(v) = \dfrac{\lambda}{\lambda - v}$
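As a one-line derivation of the first of these (our addition, using the Laplace transform definition from the previous slide):

$A^*(s) = \int_0^{\infty} e^{-sx} \, \lambda e^{-\lambda x} \, dx = \lambda \int_0^{\infty} e^{-(\lambda + s)x} \, dx = \frac{\lambda}{\lambda + s}, \qquad s > -\lambda$

The characteristic and moment generating functions follow the same way with $-s$ replaced by $ju$ and $v$ respectively.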
Expected Value
Laplace transform:
$E[X] = (-1) A^{*(1)}(0) = -\left[\frac{d}{ds}\frac{\lambda}{\lambda+s}\right]_{s=0} = -\left[\frac{-\lambda}{(\lambda+s)^2}\right]_{s=0} = \frac{\lambda}{\lambda^2} = \frac{1}{\lambda}$
Characteristic function:
$E[X] = j^{-1} \phi_X^{(1)}(0) = j^{-1}\left[\frac{d}{du}\frac{\lambda}{\lambda-ju}\right]_{u=0} = j^{-1}\left[\frac{\lambda j}{(\lambda-ju)^2}\right]_{u=0} = \frac{\lambda}{\lambda^2} = \frac{1}{\lambda}$
Moment generating function:
$E[X] = m_g^{(1)}(0) = \left[\frac{d}{dv}\frac{\lambda}{\lambda-v}\right]_{v=0} = \left[\frac{\lambda}{(\lambda-v)^2}\right]_{v=0} = \frac{\lambda}{\lambda^2} = \frac{1}{\lambda}$
Variance
Laplace transform:
$E[X^2] = (-1)^2 A^{*(2)}(0) = \left[\frac{d^2}{ds^2}\frac{\lambda}{\lambda+s}\right]_{s=0} = \left[\frac{2\lambda}{(\lambda+s)^3}\right]_{s=0} = \frac{2\lambda}{\lambda^3} = \frac{2}{\lambda^2}$
$Var[X] = E[X^2] - [E[X]]^2 = \frac{2}{\lambda^2} - \left[\frac{1}{\lambda}\right]^2 = \frac{1}{\lambda^2}$
Characteristic function:
$E[X^2] = j^{-2} \phi_X^{(2)}(0) = j^{-2}\left[\frac{d^2}{du^2}\frac{\lambda}{\lambda-ju}\right]_{u=0} = j^{-2}\left[\frac{2\lambda j^2}{(\lambda-ju)^3}\right]_{u=0} = \frac{2\lambda}{\lambda^3} = \frac{2}{\lambda^2}$
Moment generating function:
$E[X^2] = m_g^{(2)}(0) = \left[\frac{d^2}{dv^2}\frac{\lambda}{\lambda-v}\right]_{v=0} = \left[\frac{2\lambda}{(\lambda-v)^3}\right]_{v=0} = \frac{2\lambda}{\lambda^3} = \frac{2}{\lambda^2}$
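A quick numerical cross-check (ours, not on the slides): sampling the exponential distribution and comparing the empirical mean and variance with $1/\lambda$ and $1/\lambda^2$; the rate $\lambda = 2$ is an arbitrary choice.

    import numpy as np

    lam = 2.0                                # assumed rate for the illustration
    rng = np.random.default_rng(0)
    x = rng.exponential(scale=1.0 / lam, size=1_000_000)

    print(x.mean(), 1 / lam)                 # both ~0.5
    print(x.var(), 1 / lam**2)               # both ~0.25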
Sum of Random Variables
Let K1 and K2 be two independent random variables with GFs $g_1(z)$ and $g_2(z)$. Find the probability distribution P{K = k}, where K = K1 + K2:
$P\{K = k\} = \sum_{j=0}^{k} P\{K_1 = j\} \, P\{K_2 = k - j\}$
$g_1(z) = \sum_{j=0}^{\infty} P\{K_1 = j\} z^j$
$g_2(z) = \sum_{j=0}^{\infty} P\{K_2 = j\} z^j$
$g_1(z) g_2(z) = \sum_{k=0}^{\infty} \left[\sum_{j=0}^{k} P\{K_1 = j\} \, P\{K_2 = k - j\}\right] z^k$
If K has generating function g(z), then
$g(z) = \sum_{k=0}^{\infty} P\{K = k\} z^k = \sum_{k=0}^{\infty} \left[\sum_{j=0}^{k} P\{K_1 = j\} \, P\{K_2 = k - j\}\right] z^k$
so
$g(z) = g_1(z) g_2(z)$
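Since multiplying two PGFs multiplies power series, the coefficient vector of $g_1(z) g_2(z)$ is the convolution of the two probability vectors. A minimal sketch of this (with made-up distributions) using numpy.convolve:

    import numpy as np

    p1 = np.array([0.5, 0.5])            # K1 on {0, 1}: a fair coin (assumed)
    p2 = np.array([0.2, 0.3, 0.5])       # K2 on {0, 1, 2}: made-up distribution

    pk = np.convolve(p1, p2)             # coefficients of g1(z) * g2(z)
    print(pk, pk.sum())                  # P{K = k} for k = 0..3; sums to 1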
Example: Bernoulli Distribution
Bernoulli distribution:
X = 0 with probability q
X = 1 with probability p
p + q = 1
$g(z) = q + pz$
$g'(1) = p$
$g''(1) = 0$
$E[X] = g'(1) = p$
$V[X] = g''(1) + g'(1) - [g'(1)]^2 = p - p^2 = p(1 - p) = pq$
A coin is tossed n times, with $X_j = 0$ for a tail and $X_j = 1$ for a head. What is the probability of k heads in n tosses?
$S_n$ is the sum of n independent Bernoulli random variables:
$S_n = X_1 + X_2 + \cdots + X_n$
g(z) = GF of a single toss = q + pz
GF of $S_n$: $\sum_{k=0}^{n} P\{S_n = k\} z^k = g(z) \cdot g(z) \cdots g(z) = [g(z)]^n = (q + pz)^n = \sum_{k=0}^{n} \binom{n}{k} (pz)^k q^{n-k}$
$P\{S_n = k\} = \binom{n}{k} p^k q^{n-k}$ for k = 0, …, n
$= 0$ for k > n
This is the Binomial distribution.
Example: Poisson Distribution
Poisson distribution: $P[N(t) = j] = \frac{(\lambda t)^j}{j!} e^{-\lambda t}$ for j = 0, 1, 2, …
Generating function:
$g(z) = \sum_{j=0}^{\infty} \frac{(\lambda t)^j}{j!} e^{-\lambda t} z^j = e^{-\lambda t} \sum_{j=0}^{\infty} \frac{(\lambda t z)^j}{j!} = e^{-\lambda t} e^{\lambda t z} = e^{-\lambda t (1 - z)}$
Expectation:
$g'(z) = \lambda t \, e^{-\lambda t (1 - z)}$
$E[N(t)] = g'(1) = \lambda t$
Variance:
$g''(z) = (\lambda t)^2 e^{-\lambda t (1 - z)}$
$g''(1) = (\lambda t)^2$
$V[N(t)] = g''(1) + g'(1) - [g'(1)]^2 = \lambda t$
Sum of Poisson random variables with rates $\lambda_1$ and $\lambda_2$:
$g(z) = e^{-\lambda_1 t (1 - z)} \, e^{-\lambda_2 t (1 - z)} = e^{-(\lambda_1 + \lambda_2) t (1 - z)}$
so the sum is Poisson with $\lambda = \lambda_1 + \lambda_2$.
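A sketch (our addition) confirming the Poisson GF moments symbolically; the single symbol lt stands for the product $\lambda t$:

    import sympy as sp

    z = sp.symbols('z')
    lt = sp.symbols('lt', positive=True)     # lt stands for lam*t

    g = sp.exp(-lt * (1 - z))                # Poisson GF from this slide

    mean = sp.diff(g, z).subs(z, 1)                       # g'(1) = lam*t
    var = sp.diff(g, z, 2).subs(z, 1) + mean - mean**2    # variance
    print(mean, sp.simplify(var))                         # lt, lt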
Use of GF for Probability
M/M/1 System Birth and Death Equation
$0 = -(\lambda + \mu) p_n + \mu p_{n+1} + \lambda p_{n-1}$ (n ≥ 1)
$0 = -\lambda p_0 + \mu p_1$
$p_{n+1} = \frac{\lambda + \mu}{\mu} p_n - \frac{\lambda}{\mu} p_{n-1}$
$p_1 = \frac{\lambda}{\mu} p_0$
With $\rho = \lambda / \mu$:
$p_{n+1} = (\rho + 1) p_n - \rho p_{n-1}$ (n ≥ 1)
$p_1 = \rho p_0$
Use the GF to solve this equation. Multiply by $z^n$:
$z^n p_{n+1} = (\rho + 1) z^n p_n - \rho z^n p_{n-1}$ (n ≥ 1)
$z^{-1} p_{n+1} z^{n+1} = (\rho + 1) p_n z^n - \rho z \, p_{n-1} z^{n-1}$
Summing over n = 1, 2, …:
$z^{-1} \sum_{n=1}^{\infty} p_{n+1} z^{n+1} = (\rho + 1) \sum_{n=1}^{\infty} p_n z^n - \rho z \sum_{n=1}^{\infty} p_{n-1} z^{n-1}$
Use of GF for Probability (continued)
With $P(z) = \sum_{n=0}^{\infty} p_n z^n$, the three sums can be rewritten as
$\sum_{n=1}^{\infty} p_{n+1} z^{n+1} = P(z) - p_1 z - p_0$
$\sum_{n=1}^{\infty} p_n z^n = P(z) - p_0$
$\sum_{n=1}^{\infty} p_{n-1} z^{n-1} = P(z)$
so the summed equation becomes
$z^{-1}[P(z) - p_1 z - p_0] = (\rho + 1)[P(z) - p_0] - \rho z P(z)$
Substituting $p_1 = \rho p_0$:
$z^{-1}[P(z) - \rho p_0 z - p_0] = (\rho + 1)[P(z) - p_0] - \rho z P(z)$
$z^{-1} P(z) - \rho p_0 - z^{-1} p_0 = \rho P(z) - \rho p_0 + P(z) - p_0 - \rho z P(z)$
$P(z) = \frac{p_0}{1 - \rho z}$
To find $p_0$ we use the boundary condition P(1) = 1:
$P(1) = \frac{p_0}{1 - \rho} = 1 \implies p_0 = 1 - \rho$
$P(z) = \frac{1 - \rho}{1 - \rho z}$
Since $\frac{1}{1 - \rho z} = 1 + \rho z + \rho^2 z^2 + \cdots$,
$P(z) = \sum_{n=0}^{\infty} (1 - \rho) \rho^n z^n$
$p_n = (1 - \rho) \rho^n$
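A numerical sanity check (ours): the closed form $p_n = (1 - \rho)\rho^n$ should satisfy both the boundary equation $p_1 = \rho p_0$ and the recurrence $p_{n+1} = (\rho + 1) p_n - \rho p_{n-1}$; the utilization $\rho = 0.7$ is an arbitrary choice.

    # p_n = (1 - rho) * rho**n for an assumed rho = 0.7
    rho = 0.7
    p = [(1 - rho) * rho**n for n in range(50)]

    print(abs(p[1] - rho * p[0]))            # boundary: p1 = rho*p0 -> 0
    print(max(abs(p[n + 1] - (rho + 1) * p[n] + rho * p[n - 1])
              for n in range(1, 49)))        # recurrence residual -> ~0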