STAT253/317 Winter 2013, Lecture 8
Yibi Huang
January 25, 2013

Generating Functions

For a non-negative-integer-valued random variable $T$, the generating function of $T$ is the expected value of $s^T$:
\[
G(s) = E[s^T] = \sum_{k=0}^{\infty} s^k P(T = k),
\]
in which $s^T$ is defined as $0$ if $T = \infty$.

- $G(s)$ is a power series converging absolutely for all $-1 < s < 1$.
- $G(1) = P(T < \infty)$, which is $1$ if $T$ is finite with probability 1, and $< 1$ otherwise.
- $P(T = k) = \dfrac{G^{(k)}(0)}{k!}$, so knowing $G(s)$ is equivalent to knowing $P(T = k)$ for all $k = 0, 1, 2, \ldots$
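As a quick numerical illustration of the inversion formula $P(T = k) = G^{(k)}(0)/k!$, the sketch below encodes a small made-up pmf as a polynomial and recovers it by repeated differentiation (NumPy's polynomial class is assumed available; the pmf values are illustrative, not from the lecture):

```python
import numpy as np
from math import factorial

pmf = [0.1, 0.2, 0.3, 0.4]               # made-up P(T = 0), ..., P(T = 3)
G = np.polynomial.Polynomial(pmf)        # G(s) = sum_k P(T = k) s^k

# Recover each P(T = k) as G^{(k)}(0) / k!
for k in range(len(pmf)):
    val = G(0.0) if k == 0 else G.deriv(k)(0.0)
    assert abs(val / factorial(k) - pmf[k]) < 1e-12
```

Since $T$ here is finite with probability 1, $G(1) = 1$ as well.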
More Properties of Generating Functions

\[
G(s) = E[s^T] = \sum_{k=0}^{\infty} s^k P(T = k)
\]

- $E[T] = \lim_{s \to 1^-} G'(s)$ if it exists, because
  \[
  G'(s) = \frac{d}{ds} E[s^T] = E[T s^{T-1}] = \sum_{k=1}^{\infty} s^{k-1} k\, P(T = k).
  \]
- $E[T(T-1)] = \lim_{s \to 1^-} G''(s)$ if it exists, because
  \[
  G''(s) = E[T(T-1) s^{T-2}] = \sum_{k=2}^{\infty} s^{k-2} k(k-1)\, P(T = k).
  \]
- If $T$ and $U$ are independent non-negative-integer-valued random variables, with generating functions $G_T(s)$ and $G_U(s)$ respectively, then the generating function of $T + U$ is
  \[
  G_{T+U}(s) = E[s^{T+U}] = E[s^T]\, E[s^U] = G_T(s)\, G_U(s).
  \]

4.5.3 Random Walk with Reflective Boundary at 0

- State space $= \{0, 1, 2, \ldots\}$
- $P_{01} = 1$, $P_{i,i+1} = p$, $P_{i,i-1} = 1 - p = q$, for $i = 1, 2, 3, \ldots$
- Only one class, so the chain is irreducible.
- For $i < j$, define
  \[
  N_{ij} = \min\{m > 0 : X_m = j \mid X_0 = i\} = \text{time to reach state } j \text{ starting in state } i.
  \]
- Observe that $N_{0n} = N_{01} + N_{12} + \cdots + N_{n-1,n}$.
- By the Markov property, $N_{01}, N_{12}, \ldots, N_{n-1,n}$ are independent.
- Given $X_0 = i$,
  \[
  N_{i,i+1} =
  \begin{cases}
  1 & \text{if } X_1 = i+1 \\
  1 + N^*_{i-1,i} + N^*_{i,i+1} & \text{if } X_1 = i-1
  \end{cases}
  \tag{1}
  \]
  where $N^*_{i-1,i} \sim N_{i-1,i}$, $N^*_{i,i+1} \sim N_{i,i+1}$, and $N^*_{i-1,i}$, $N^*_{i,i+1}$ are independent.
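These generating-function facts (moments from derivatives, and multiplication under independent sums) can be checked numerically. The sketch below uses small made-up pmfs; multiplying the two power series is exactly convolution of the pmfs:

```python
import numpy as np

pmf_T = np.array([0.2, 0.5, 0.3])   # made-up P(T = 0, 1, 2)
pmf_U = np.array([0.6, 0.4])        # made-up P(U = 0, 1)

# pmf of T + U: convolution <=> product of generating functions
pmf_sum = np.convolve(pmf_T, pmf_U)

# Check G_{T+U}(s) = G_T(s) G_U(s) at a few points (polyval wants
# coefficients highest-power first, hence the [::-1]).
for s in (0.1, 0.5, 0.9):
    G_T = np.polyval(pmf_T[::-1], s)
    G_U = np.polyval(pmf_U[::-1], s)
    G_sum = np.polyval(pmf_sum[::-1], s)
    assert abs(G_sum - G_T * G_U) < 1e-12

# E[T] = G_T'(1): differentiate the polynomial, evaluate at 1
mean_T = np.polyval(np.polyder(pmf_T[::-1]), 1.0)
assert abs(mean_T - (0*0.2 + 1*0.5 + 2*0.3)) < 1e-12
```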
Generating Function of N_{i,i+1}

Let $G_i(s)$ be the generating function of $N_{i,i+1}$. From (1), and by the independence of $N^*_{i-1,i}$ and $N^*_{i,i+1}$, we get that
\[
G_i(s) = ps + q\, E[s^{1 + N^*_{i-1,i} + N^*_{i,i+1}}] = ps + qs\, G_{i-1}(s)\, G_i(s).
\]
So
\[
G_i(s) = \frac{ps}{1 - qs\, G_{i-1}(s)}.
\tag{2}
\]
Since $N_{01}$ is always 1, we have $G_0(s) = s$. Using the iterating relation (2), we can find
\[
G_1(s) = \frac{ps}{1 - qs\, G_0(s)} = \frac{ps}{1 - qs^2} = ps \sum_{k=0}^{\infty} (qs^2)^k = \sum_{k=0}^{\infty} p q^k s^{2k+1}.
\]
So
\[
P(N_{12} = n) =
\begin{cases}
p q^k & \text{if } n = 2k+1 \text{ for } k = 0, 1, 2, \ldots \\
0 & \text{if } n \text{ is even.}
\end{cases}
\]

Similarly,
\[
\begin{aligned}
G_2(s) &= \frac{ps}{1 - qs\, G_1(s)} = \frac{ps(1 - qs^2)}{1 - q(1+p)s^2}
= \frac{ps}{1 - q(1+p)s^2} - \frac{pqs^3}{1 - q(1+p)s^2} \\
&= ps \sum_{k=0}^{\infty} \big(q(1+p)s^2\big)^k - pqs^3 \sum_{k=0}^{\infty} \big(q(1+p)s^2\big)^k \\
&= \sum_{k=0}^{\infty} p q^k (1+p)^k s^{2k+1} - \sum_{k=0}^{\infty} p q^{k+1} (1+p)^k s^{2k+3} \\
&= ps + \sum_{k=1}^{\infty} p q^k \big[(1+p)^k - (1+p)^{k-1}\big] s^{2k+1}.
\end{aligned}
\]
So
\[
P(N_{23} = n) =
\begin{cases}
p & \text{if } n = 1 \\
p q^k \big[(1+p)^k - (1+p)^{k-1}\big] & \text{if } n = 2k+1 \text{ for } k = 1, 2, \ldots \\
0 & \text{if } n \text{ is even.}
\end{cases}
\]
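The distributions of $N_{12}$ and $N_{23}$ derived above can be verified exactly by propagating the walk's distribution over the not-yet-absorbed states; `first_passage` is a helper written for this sketch, and $p = 0.6$ is an arbitrary choice:

```python
def first_passage(start, target, p, n_steps):
    """P(N_{start,target} = n) for n = 1..n_steps, reflected walk on {0,1,...}."""
    q = 1 - p
    probs = {start: 1.0}            # distribution over states, before absorption
    hit = {}
    for n in range(1, n_steps + 1):
        new = {}
        for x, pr in probs.items():
            # state 0 reflects to 1 with probability 1; otherwise step +/-1
            moves = [(1, 1.0)] if x == 0 else [(x + 1, p), (x - 1, q)]
            for y, w in moves:
                if y == target:
                    hit[n] = hit.get(n, 0.0) + pr * w
                else:
                    new[y] = new.get(y, 0.0) + pr * w
        probs = new
    return hit

p, q = 0.6, 0.4
h12 = first_passage(1, 2, p, 11)
h23 = first_passage(2, 3, p, 11)

for k in range(0, 5):               # P(N_12 = 2k+1) = p q^k
    assert abs(h12[2*k + 1] - p * q**k) < 1e-12
assert abs(h23[1] - p) < 1e-12
for k in range(1, 5):               # P(N_23 = 2k+1) = p q^k [(1+p)^k - (1+p)^(k-1)]
    predicted = p * q**k * ((1 + p)**k - (1 + p)**(k - 1))
    assert abs(h23[2*k + 1] - predicted) < 1e-12
```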
Mean of N_{i,i+1}

Recall that $G_i(s) = \dfrac{ps}{1 - qs\, G_{i-1}(s)}$. Let $m_i = E(N_{i,i+1}) = G_i'(1)$.
\[
G_i'(s) = \frac{p\big(1 - qs\, G_{i-1}(s)\big) + ps\big(q G_{i-1}(s) + qs\, G_{i-1}'(s)\big)}{\big(1 - qs\, G_{i-1}(s)\big)^2}
= \frac{p + pqs^2\, G_{i-1}'(s)}{\big(1 - qs\, G_{i-1}(s)\big)^2}.
\]
Since $N_{i,i+1} < \infty$, $G_i(1) = 1$ for all $i = 0, 1, \ldots, n-1$. We have
\[
\begin{aligned}
m_i = G_i'(1) &= \frac{p + pq\, G_{i-1}'(1)}{(1 - q)^2} = \frac{1 + q\, G_{i-1}'(1)}{p}
= \frac{1}{p} + \frac{q}{p}\, m_{i-1}
= \frac{1}{p} + \frac{q}{p}\Big(\frac{1}{p} + \frac{q}{p}\, m_{i-2}\Big) \\
&= \cdots
= \frac{1}{p}\Big[1 + \frac{q}{p} + \Big(\frac{q}{p}\Big)^2 + \cdots + \Big(\frac{q}{p}\Big)^{i-1}\Big] + \Big(\frac{q}{p}\Big)^i m_0.
\end{aligned}
\]
Since $N_{01} = 1$, we have $m_0 = 1$, and hence
\[
m_i =
\begin{cases}
\dfrac{1 - (q/p)^i}{p - q} + \Big(\dfrac{q}{p}\Big)^i & \text{if } p \ne 0.5 \\
2i + 1 & \text{if } p = 0.5.
\end{cases}
\]

Mean of N_{0,n}

Recall that $N_{0n} = N_{01} + N_{12} + \cdots + N_{n-1,n}$, so
\[
E[N_{0n}] = m_0 + m_1 + \cdots + m_{n-1}
=
\begin{cases}
\dfrac{n}{p - q} - \dfrac{2pq}{(p-q)^2}\Big[1 - \Big(\dfrac{q}{p}\Big)^n\Big] & \text{if } p \ne 0.5 \\
n^2 & \text{if } p = 0.5.
\end{cases}
\]
When
- $p > 0.5$: $E[N_{0n}] \approx \dfrac{n}{p-q} - \dfrac{2pq}{(p-q)^2}$, linear in $n$;
- $p = 0.5$: $E[N_{0n}] = n^2$, quadratic in $n$;
- $p < 0.5$: $E[N_{0n}] = O\Big(\dfrac{2pq}{(p-q)^2}\big(\dfrac{q}{p}\big)^n\Big)$, exponential in $n$.
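A numeric check of the recursion $m_i = 1/p + (q/p)\, m_{i-1}$ against the closed forms above, again for an arbitrary $p$:

```python
p, q = 0.6, 0.4
n = 10

# iterate the recursion starting from m_0 = 1
m = [1.0]
for i in range(1, n):
    m.append(1/p + (q/p) * m[-1])

# compare with the closed form for m_i
for i, mi in enumerate(m):
    closed = (1 - (q/p)**i) / (p - q) + (q/p)**i
    assert abs(mi - closed) < 1e-9

# E[N_0n] = m_0 + ... + m_{n-1} matches its closed form
E_N0n = sum(m)
closed_sum = n/(p - q) - 2*p*q/(p - q)**2 * (1 - (q/p)**n)
assert abs(E_N0n - closed_sum) < 1e-9
```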
Example: Symmetric Random Walk on (−∞, ∞)

- State space $= \{\ldots, -2, -1, 0, 1, 2, \ldots\}$
- $P_{i,i+1} = P_{i,i-1} = 1/2$ for all integers $i$
- One class, recurrent — but null recurrent or positive recurrent?
- For $i, j$, define $N_{ij} = \min\{m > 0 : X_m = j \mid X_0 = i\}$.
- Note $N_{00} = 1 + N_{10}$.
- Given $X_0 = 1$,
  \[
  N_{10} =
  \begin{cases}
  1 & \text{if } X_1 = 0 \\
  1 + N_{21} + N_{10}' & \text{if } X_1 = 2
  \end{cases}
  \]
- Note $N_{21}$ and $N_{10}'$ are independent and have the same distribution as $N_{10}$. (Why?)

Generating Function of N_{10}

Let $G(s)$ be the generating function of $N_{10}$. From the recursion above, we know that
\[
G(s) = \frac{1}{2} s + \frac{1}{2} E[s^{1 + N_{21} + N_{10}'}] = \frac{1}{2} s + \frac{1}{2} s\, G(s)^2,
\]
which is a quadratic equation in $G(s)$. The two roots are
\[
G(s) = \frac{1 \pm \sqrt{1 - s^2}}{s}.
\]
Since $G(s)$ must lie between 0 and 1 when $0 < s < 1$,
\[
G(s) = \frac{1 - \sqrt{1 - s^2}}{s}.
\tag{3}
\]
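One can confirm numerically that the minus root satisfies the quadratic relation and stays in $(0, 1)$, which is why it (and not the plus root) is the generating function:

```python
from math import sqrt

def G(s):
    # the minus root of G = s/2 + s G^2 / 2
    return (1 - sqrt(1 - s*s)) / s

for s in (0.1, 0.3, 0.5, 0.7, 0.9):
    # satisfies the quadratic recursion
    assert abs(G(s) - (0.5*s + 0.5*s*G(s)**2)) < 1e-12
    # lies strictly between 0 and 1 (the plus root exceeds 1)
    assert 0 < G(s) < 1
    assert (1 + sqrt(1 - s*s)) / s > 1
```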
The power series expansion of $G(s) = \dfrac{1 - \sqrt{1 - s^2}}{s}$ can be found via Newton's binomial formula
\[
(1 - s^2)^{1/2} = \sum_{k=0}^{\infty} \binom{1/2}{k} (-s^2)^k,
\quad\text{where } \binom{\alpha}{0} = 1 \text{ and } \binom{\alpha}{k} = \prod_{i=0}^{k-1} (\alpha - i)\big/ k! \text{ for } k \ge 1.
\]
Note that
\[
\binom{1/2}{k}
= \frac{\frac{1}{2}\big(\frac{1}{2} - 1\big)\big(\frac{1}{2} - 2\big)\big(\frac{1}{2} - 3\big) \cdots \big(\frac{1}{2} - k + 1\big)}{k!}
= \frac{(-1)^{k-1}\, 1 \cdot 1 \cdot 3 \cdot 5 \cdots (2k-3)}{2^k\, k!}
= \frac{(-1)^{k-1}\, 1 \cdot 1 \cdot 3 \cdot 5 \cdots (2k-3)(2k-1)}{2^k\, k!\, (2k-1)}
= \frac{(-1)^{k-1}\, \frac{(2k-1)!}{2^{k-1}(k-1)!}}{2^k\, k!\, (2k-1)}
= \frac{(-1)^{k-1}}{2^{2k-1}(2k-1)} \binom{2k-1}{k}.
\]
From all the above we have
\[
G(s) = \frac{1 - \sum_{k=0}^{\infty} \binom{1/2}{k} (-s^2)^k}{s}
= -\sum_{k=1}^{\infty} (-1)^k \binom{1/2}{k} s^{2k-1}
= \sum_{k=1}^{\infty} \frac{1}{2^{2k-1}(2k-1)} \binom{2k-1}{k}\, s^{2k-1}.
\]
So the distribution of $N_{10}$ is
\[
P(N_{10} = 2k-1) = \frac{1}{2^{2k-1}(2k-1)} \binom{2k-1}{k} \quad\text{for } k = 1, 2, \ldots, \qquad
P(N_{10} = 2k) = 0 \quad\text{for } k = 0, 1, 2, \ldots
\]
Moreover,
\[
G'(s) = \frac{1}{\sqrt{1 - s^2}} - \frac{1 - \sqrt{1 - s^2}}{s^2},
\]
so
\[
E[N_{10}] = \lim_{s \to 1^-} G'(s) = \infty,
\]
which implies $E[N_{00}] = 1 + E[N_{10}] = \infty$.
⇒ The symmetric random walk is null recurrent.
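The formula for $P(N_{10} = 2k-1)$ and the series form of (3) can both be checked against an exact step-by-step computation of the first-passage distribution (a sketch; `max_steps` truncates the horizon, so the series comparison uses a loose tolerance):

```python
from math import comb, sqrt

# probs[x] = P(walk is at x and has not yet hit 0), starting from X_0 = 1
probs = {1: 1.0}
first_hit = {}
max_steps = 15
for n in range(1, max_steps + 1):
    new = {}
    for x, pr in probs.items():
        for y in (x - 1, x + 1):        # symmetric: each step +/-1 w.p. 1/2
            if y == 0:
                first_hit[n] = first_hit.get(n, 0.0) + pr / 2
            else:
                new[y] = new.get(y, 0.0) + pr / 2
    probs = new

# P(N_10 = 2k-1) = C(2k-1, k) / (2^(2k-1) (2k-1)); zero at even times
for k in range(1, 8):
    predicted = comb(2*k - 1, k) / (2**(2*k - 1) * (2*k - 1))
    assert abs(first_hit[2*k - 1] - predicted) < 1e-12
    assert first_hit.get(2*k, 0.0) == 0.0

# partial sums of the series approach G(s) = (1 - sqrt(1 - s^2)) / s
s = 0.5
partial = sum(p * s**n for n, p in first_hit.items())
exact = (1 - sqrt(1 - s**2)) / s
assert abs(partial - exact) < 1e-3
```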
4.7 Branching Processes

Recall that a branching process is a population of individuals in which
- all individuals have the same lifespan, and
- each individual produces a random number of offspring at the end of its life.
Let $X_n$ = size of the $n$th generation, $n = 0, 1, 2, \ldots$, and let $Z_{n,i}$ = number of offspring produced by the $i$th individual in the $n$th generation. Then
\[
X_{n+1} = \sum_{i=1}^{X_n} Z_{n,i}.
\tag{4}
\]
Suppose the $Z_{n,i}$'s are i.i.d. with probability mass function
\[
P(Z_{n,i} = j) = P_j, \quad j \ge 0.
\]
We suppose the non-trivial case that $P_j < 1$ for all $j \ge 0$. Then $\{X_n\}$ is a Markov chain with state space $\{0, 1, 2, \ldots\}$.

Mean of a Branching Process

Let $\mu = E[Z_{n,i}] = \sum_{j=0}^{\infty} j P_j$. Since $X_n = \sum_{i=1}^{X_{n-1}} Z_{n-1,i}$, we have
\[
E[X_n \mid X_{n-1}] = E\Big[\sum_{i=1}^{X_{n-1}} Z_{n-1,i} \,\Big|\, X_{n-1}\Big] = X_{n-1}\, E[Z_{n-1,i}] = X_{n-1}\, \mu,
\]
and hence
\[
E[X_n] = E\big[E[X_n \mid X_{n-1}]\big] = E[X_{n-1}\, \mu] = \mu\, E[X_{n-1}] = \mu^2 E[X_{n-2}] = \cdots = \mu^n E[X_0].
\]
If $X_0 = 1$, then $E[X_n] = \mu^n$.
- If $\mu < 1$, then $E[X_n] \to 0$ as $n \to \infty$, so $\lim_{n\to\infty} P(X_n \ge 1) = 0$: the branching process will eventually die out.
- What if $\mu = 1$ or $\mu > 1$?

Variance of a Branching Process

Let $\sigma^2 = \operatorname{Var}(Z_{n,i})$. Again from the fact that $X_n = \sum_{i=1}^{X_{n-1}} Z_{n-1,i}$, we have
\[
E[X_n \mid X_{n-1}] = X_{n-1}\, \mu, \qquad \operatorname{Var}(X_n \mid X_{n-1}) = X_{n-1}\, \sigma^2.
\]
If $X_0 = 1$, then
\[
E\big[\operatorname{Var}(X_n \mid X_{n-1})\big] = \sigma^2 E[X_{n-1}] = \sigma^2 \mu^{n-1}, \qquad
\operatorname{Var}\big(E[X_n \mid X_{n-1}]\big) = \operatorname{Var}(X_{n-1}\, \mu) = \mu^2 \operatorname{Var}(X_{n-1}).
\]
So
\[
\operatorname{Var}(X_n) = E\big[\operatorname{Var}(X_n \mid X_{n-1})\big] + \operatorname{Var}\big(E[X_n \mid X_{n-1}]\big)
= \sigma^2 \mu^{n-1} + \mu^2 \operatorname{Var}(X_{n-1})
= \cdots
= \sigma^2 \big(\mu^{n-1} + \mu^n + \cdots + \mu^{2n-2}\big) + \mu^{2n} \operatorname{Var}(X_0)
\]
\[
= \mu^{2n} \operatorname{Var}(X_0) +
\begin{cases}
\sigma^2 \mu^{n-1} \dfrac{1 - \mu^n}{1 - \mu} & \text{if } \mu \ne 1 \\
n \sigma^2 & \text{if } \mu = 1.
\end{cases}
\]

Generating Functions of Branching Processes

Let $g(s) = E[s^{Z_{n,i}}] = \sum_{k=0}^{\infty} P_k s^k$ be the generating function of $Z_{n,i}$, and let $G_n(s)$ be the generating function of $X_n$, $n = 0, 1, 2, \ldots$. Then $\{G_n(s)\}$ satisfies the following two iterative equations:

(i) $G_{n+1}(s) = G_n(g(s))$ for $n = 0, 1, 2, \ldots$
(ii) $G_{n+1}(s) = g(G_n(s))$ if $X_0 = 1$, for $n = 0, 1, 2, \ldots$

Proof of (i).
\[
E[s^{X_{n+1}} \mid X_n] = E\Big[s^{\sum_{i=1}^{X_n} Z_{n,i}} \,\Big|\, X_n\Big]
= E\Big[\prod_{i=1}^{X_n} s^{Z_{n,i}} \,\Big|\, X_n\Big]
= \prod_{i=1}^{X_n} E[s^{Z_{n,i}}] \quad \text{(by indep. of the } Z_{n,i})
= \prod_{i=1}^{X_n} g(s) = g(s)^{X_n},
\]
and hence
\[
G_{n+1}(s) = E[s^{X_{n+1}}] = E\big[E[s^{X_{n+1}} \mid X_n]\big] = E[g(s)^{X_n}] = G_n(g(s)),
\]
since $G_n(s) = E[s^{X_n}]$.

Proof of (ii): HW today.
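The mean and variance recursions above can be iterated numerically and compared with the closed forms; $\mu$ and $\sigma^2$ below are arbitrary illustrative values, with $X_0 = 1$ so $\operatorname{Var}(X_0) = 0$:

```python
mu, sigma2 = 1.5, 0.75   # hypothetical offspring mean and variance
E, V = 1.0, 0.0          # E[X_0] = 1, Var(X_0) = 0

for n in range(1, 11):
    # Var(X_n) = E[Var(X_n|X_{n-1})] + Var(E[X_n|X_{n-1}])
    #          = sigma^2 E[X_{n-1}] + mu^2 Var(X_{n-1})
    V = sigma2 * E + mu**2 * V
    # E[X_n] = mu E[X_{n-1}]
    E = mu * E
    assert abs(E - mu**n) < 1e-9
    closed_V = sigma2 * mu**(n - 1) * (1 - mu**n) / (1 - mu)
    assert abs(V - closed_V) < 1e-6
```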
Example

Suppose $X_0 = 1$, and $P_0 = 1/4$, $P_1 = 1/2$, $P_2 = 1/4$. Find the distribution of $X_2$.

Sol. 
\[
g(s) = \frac{1}{4} s^0 + \frac{1}{2} s^1 + \frac{1}{4} s^2 = (1 + s)^2 / 4.
\]
Since $X_0 = 1$, $G_0(s) = E[s^{X_0}] = E[s^1] = s$. From (i) we have
\[
G_1(s) = G_0(g(s)) = g(s) = (1 + s)^2 / 4,
\]
\[
G_2(s) = G_1(g(s)) = \frac{1}{4}\Big(1 + \frac{(1+s)^2}{4}\Big)^2 = \frac{1}{64}\big(5 + 2s + s^2\big)^2
= \frac{1}{64}\big(25 + 20s + 14s^2 + 4s^3 + s^4\big) = \sum_{k=0}^{\infty} P(X_2 = k)\, s^k.
\]
The coefficient of $s^k$ in the polynomial $G_2(s)$ is the chance that $X_2 = k$:

  k          | 0     | 1     | 2     | 3    | 4
  P(X_2 = k) | 25/64 | 20/64 | 14/64 | 4/64 | 1/64

and $P(X_2 = k) = 0$ for $k \ge 5$.

Extinction Probability

Let
\[
\pi_0 = \lim_{n \to \infty} P(X_n = 0 \mid X_0 = 1) = P(\text{the population will eventually die out} \mid X_0 = 1).
\]

Proposition
The extinction probability $\pi_0$ is a root of the equation
\[
g(s) = s,
\tag{5}
\]
where $g(s) = \sum_{k=0}^{\infty} P_k s^k$ is the generating function of $Z_{n,i}$.

Proof.
\[
\pi_0 = P(\text{population dies out})
= \sum_{j=0}^{\infty} P(\text{population dies out} \mid X_1 = j)\, P_j
= \sum_{j=0}^{\infty} \pi_0^j\, P_j = g(\pi_0).
\]
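The distribution of $X_2$ computed in the example above can be reproduced mechanically by composing coefficient arrays, i.e., computing $G_2 = g \circ g$; `compose` is a helper written for this sketch:

```python
import numpy as np

def compose(outer, inner):
    """Coefficients (ascending powers) of outer(inner(s)), via Horner's scheme."""
    result = np.array([0.0])
    for c in outer[::-1]:
        result = np.convolve(result, inner)   # multiply by inner(s)
        result[0] += c                        # add the next coefficient
    return result

g = np.array([0.25, 0.5, 0.25])      # P_0, P_1, P_2, i.e. g(s) = (1+s)^2/4

G1 = g.copy()                        # G_1(s) = g(s), since G_0(s) = s
G2 = compose(G1, g)                  # G_2(s) = G_1(g(s))
G2 = np.trim_zeros(G2, trim='b')     # drop padding zeros above degree 4

expected = np.array([25, 20, 14, 4, 1]) / 64
assert np.allclose(G2, expected)
```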
Proposition I

Let $\mu = E[Z_{n,i}] = \sum_{j=0}^{\infty} j P_j$. If $\mu \le 1$, the extinction probability $\pi_0$ is 1 unless $P_1 = 1$.

Proof. Let $h(s) = g(s) - s$. Since $g(1) = 1$ and $g'(1) = \mu$, we have $h(1) = g(1) - 1 = 0$ and
\[
h'(s) = \sum_{j=1}^{\infty} j P_j s^{j-1} - 1 \le \sum_{j=1}^{\infty} j P_j - 1 = \mu - 1 \quad \text{for } 0 \le s < 1.
\]
Thus $\mu \le 1$
⇒ $h'(s) \le 0$ for $0 \le s < 1$
⇒ $h(s)$ is non-increasing in $[0, 1)$
⇒ $h(s) > h(1) = 0$ for $0 \le s < 1$
⇒ $g(s) > s$ for $0 \le s < 1$
⇒ there is no root in $[0, 1)$. Since $\pi_0$ is a root of (5), $\pi_0 = 1$.

Proposition II

If $\mu > 1$, there is a unique root of the equation $g(s) = s$ in the domain $[0, 1)$, and that is the extinction probability.

Proof. Let $h(s) = g(s) - s$. Observe that
\[
h(0) = g(0) = P_0 > 0, \qquad h'(0) = g'(0) - 1 = P_1 - 1 < 0.
\]
Then $\mu > 1$
⇒ $h'(1) = \mu - 1 > 0$
⇒ $h(s)$ is increasing near 1
⇒ $h(1 - \delta) < h(1) = 0$ for $\delta > 0$ small enough.
Since $h(s)$ is continuous in $[0, 1)$, there must be a root to $h(s) = 0$. The root is unique since
\[
h''(s) = g''(s) = \sum_{j=2}^{\infty} j(j-1) P_j s^{j-2} \ge 0 \quad \text{for } 0 \le s < 1,
\]
i.e., $h(s)$ is convex in $[0, 1)$.
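For a supercritical example, $\pi_0$ can be found by iterating $s \mapsto g(s)$ from 0: by (ii), $P(X_n = 0) = G_n(0)$ satisfies $G_{n+1}(0) = g(G_n(0))$ and increases to $\pi_0$. The offspring distribution below is made up, chosen so the smallest root of $g(s) = s$ works out nicely:

```python
# Hypothetical offspring distribution: P_0 = 1/4, P_1 = 1/4, P_2 = 1/2,
# so mu = 1/4 + 2 * 1/2 = 1.25 > 1 (supercritical).
def g(s):
    return 0.25 + 0.25 * s + 0.5 * s**2

# P(X_n = 0) = G_n(0) = g applied n times to 0, increasing to pi_0,
# the smallest root of g(s) = s (Proposition II).
s = 0.0
for _ in range(100):
    s = g(s)

# g(s) = s  <=>  2s^2 - 3s + 1 = 0, with roots 1/2 and 1
assert abs(s - 0.5) < 1e-9       # converges to the smaller root
assert abs(g(s) - s) < 1e-9      # and it is a fixed point of g
```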