STAT253/317 Winter 2014 Lecture 8
Yibi Huang
January 27, 2014

Generating Functions

For a non-negative-integer-valued random variable T, the generating function of T is the expected value of s^T:

    G(s) = E[s^T] = \sum_{k=0}^\infty s^k P(T = k),

in which s^T is defined as 0 if T = \infty.

◮ G(s) is a power series converging absolutely for all -1 < s < 1.
◮ G(1) = P(T < \infty) \begin{cases} = 1 & \text{if } T \text{ is finite w/ prob. 1} \\ < 1 & \text{otherwise} \end{cases}
◮ P(T = k) = \frac{G^{(k)}(0)}{k!}
◮ Knowing G(s) ⇔ knowing P(T = k) for all k = 0, 1, 2, ...
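These identities are easy to check numerically. Below is a minimal sketch (not from the lecture) using an assumed toy distribution, T = number of heads in two fair coin tosses, with exact rational arithmetic:

```python
from fractions import Fraction
from math import factorial

# assumed toy example: T = number of heads in two fair coin tosses
pmf = [Fraction(1, 4), Fraction(1, 2), Fraction(1, 4)]

def G(s, coeffs):
    # generating function G(s) = sum_k s^k P(T = k)
    return sum(c * s**k for k, c in enumerate(coeffs))

def deriv(coeffs):
    # coefficient list of the derivative of a polynomial
    return [k * c for k, c in enumerate(coeffs)][1:]

# G(1) = P(T < infinity) = 1, since T is finite with probability 1
assert G(1, pmf) == 1

# E[T] = G'(1)
assert G(1, deriv(pmf)) == sum(k * c for k, c in enumerate(pmf))

# P(T = k) = G^(k)(0) / k!: G^(k)(0) is the constant coefficient
# of the k-th derivative of the polynomial G
d = list(pmf)
for k, p_k in enumerate(pmf):
    assert d[0] == factorial(k) * p_k
    d = deriv(d)
print("generating-function identities verified")
```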
More Properties of Generating Functions

    G(s) = E[s^T] = \sum_{k=0}^\infty s^k P(T = k)

◮ E[T] = \lim_{s\to 1^-} G'(s) if it exists, because

    G'(s) = \frac{d}{ds} E[s^T] = E[T s^{T-1}] = \sum_{k=1}^\infty s^{k-1} k P(T = k).

◮ E[T(T-1)] = \lim_{s\to 1^-} G''(s) if it exists, because

    G''(s) = E[T(T-1) s^{T-2}] = \sum_{k=2}^\infty s^{k-2} k(k-1) P(T = k).

◮ If T and U are independent non-negative-integer-valued random variables, with generating functions G_T(s) and G_U(s) respectively, then the generating function of T + U is

    G_{T+U}(s) = E[s^{T+U}] = E[s^T] E[s^U] = G_T(s) G_U(s).

4.5.3 Random Walk w/ Reflective Boundary at 0

◮ State space = \{0, 1, 2, \ldots\}
◮ P_{01} = 1,  P_{i,i+1} = p,  P_{i,i-1} = 1 - p = q,  for i = 1, 2, 3, \ldots
◮ Only one class: the chain is irreducible.
◮ For i < j, define

    N_{ij} = \min\{m > 0 : X_m = j \mid X_0 = i\} = time to reach state j starting in state i.

◮ Observe that N_{0n} = N_{01} + N_{12} + \cdots + N_{n-1,n}.
◮ By the Markov property, N_{01}, N_{12}, \ldots, N_{n-1,n} are indep.
◮ Given X_0 = i,

    N_{i,i+1} = \begin{cases} 1 & \text{if } X_1 = i+1 \\ 1 + N^*_{i-1,i} + N^*_{i,i+1} & \text{if } X_1 = i-1 \end{cases}    (1)

  where N^*_{i-1,i} \sim N_{i-1,i}, N^*_{i,i+1} \sim N_{i,i+1}, and N^*_{i-1,i}, N^*_{i,i+1} are indep.
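The product rule G_{T+U}(s) = G_T(s) G_U(s) says that the pmf of T + U is the convolution of the two pmfs, i.e. the coefficient list of the product polynomial. A small sketch with assumed toy distributions:

```python
from fractions import Fraction

def poly_mul(a, b):
    # coefficient list of the product of two polynomials
    out = [Fraction(0)] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

# assumed toy distributions: T uniform on {0,1}, U uniform on {0,1,2}
pT = [Fraction(1, 2), Fraction(1, 2)]
pU = [Fraction(1, 3)] * 3

# the coefficients of G_T(s) G_U(s) give the pmf of T + U ...
p_sum = poly_mul(pT, pU)

# ... which must agree with direct enumeration of P(T + U = k)
direct = [Fraction(0)] * 4
for t, pt in enumerate(pT):
    for u, pu in enumerate(pU):
        direct[t + u] += pt * pu
assert p_sum == direct
print("pmf of T + U:", p_sum)
```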
Generating Function of N_{i,i+1}

Let G_i(s) be the generating function of N_{i,i+1}. From (1), and by the independence of N^*_{i-1,i} and N^*_{i,i+1}, we get that

    G_i(s) = ps + q E[s^{1 + N^*_{i-1,i} + N^*_{i,i+1}}] = ps + qs\, G_{i-1}(s)\, G_i(s).

So

    G_i(s) = \frac{ps}{1 - qs\, G_{i-1}(s)}.    (2)

Since N_{01} is always 1, we have G_0(s) = s. Using the iterating relation (2), we can find

    G_1(s) = \frac{ps}{1 - qs\, G_0(s)} = \frac{ps}{1 - qs^2} = ps \sum_{k=0}^\infty (qs^2)^k = \sum_{k=0}^\infty p q^k s^{2k+1}.

So

    P(N_{12} = n) = \begin{cases} p q^k & \text{if } n = 2k+1 \text{ for } k = 0, 1, 2, \ldots \\ 0 & \text{if } n \text{ is even} \end{cases}

Similarly,

    G_2(s) = \frac{ps}{1 - qs\, G_1(s)} = \frac{ps(1 - qs^2)}{1 - q(1+p)s^2}
           = \frac{ps}{1 - q(1+p)s^2} - \frac{pqs^3}{1 - q(1+p)s^2}
           = ps \sum_{k=0}^\infty (q(1+p)s^2)^k - pqs^3 \sum_{k=0}^\infty (q(1+p)s^2)^k
           = \sum_{k=0}^\infty p q^k (1+p)^k s^{2k+1} - \sum_{k=0}^\infty p q^{k+1} (1+p)^k s^{2k+3}
           = ps + \sum_{k=1}^\infty p q^k [(1+p)^k - (1+p)^{k-1}] s^{2k+1}
           = ps + \sum_{k=1}^\infty p^2 q^k (1+p)^{k-1} s^{2k+1}.

So

    P(N_{23} = n) = \begin{cases} p & \text{if } n = 1 \\ p^2 q^k (1+p)^{k-1} & \text{if } n = 2k+1 \text{ for } k = 1, 2, \ldots \\ 0 & \text{if } n \text{ is even} \end{cases}
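The closed forms for G_1(s) and G_2(s) can be cross-checked by iterating relation (2) directly with truncated power series. The code below is an illustrative sketch (the truncation order N and the value p = 1/3 are arbitrary choices, not from the lecture):

```python
from fractions import Fraction

N = 12  # truncation order: keep coefficients of s^0, ..., s^(N-1)
p = Fraction(1, 3)  # arbitrary choice of p
q = 1 - p

def series_inv(a):
    # power-series inverse of a (with a[0] != 0), truncated to length N
    inv = [Fraction(0)] * N
    inv[0] = 1 / a[0]
    for n in range(1, N):
        inv[n] = -sum(a[k] * inv[n - k] for k in range(1, n + 1)) / a[0]
    return inv

def next_G(G_prev):
    # relation (2): G_i(s) = p s / (1 - q s G_{i-1}(s))
    denom = [Fraction(1)] + [-q * c for c in G_prev[:N - 1]]
    return [Fraction(0)] + [p * c for c in series_inv(denom)[:N - 1]]

G0 = [Fraction(0)] * N
G0[1] = Fraction(1)  # G_0(s) = s, since N_01 = 1
G1 = next_G(G0)
G2 = next_G(G1)

# compare with the closed-form coefficients derived above
for n in range(N):
    if n % 2 == 0:
        assert G1[n] == 0 and G2[n] == 0  # even powers vanish
    else:
        k = (n - 1) // 2
        assert G1[n] == p * q**k  # P(N_12 = 2k+1) = p q^k
        expected = p if k == 0 else p**2 * q**k * (1 + p)**(k - 1)
        assert G2[n] == expected  # P(N_23 = 2k+1)
print("series coefficients of G_1 and G_2 match the closed forms")
```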
Mean of N_{i,i+1}

Recall that G_i'(1) = E(N_{i,i+1}). Let m_i = E(N_{i,i+1}) = G_i'(1). Differentiating (2),

    G_i'(s) = \frac{p(1 - qs G_{i-1}(s)) + ps(q G_{i-1}(s) + qs G_{i-1}'(s))}{(1 - qs G_{i-1}(s))^2} = \frac{p + pqs^2 G_{i-1}'(s)}{(1 - qs G_{i-1}(s))^2}.

Since N_{i,i+1} < \infty, G_i(1) = 1 for all i = 0, 1, \ldots, n-1. We have

    m_i = G_i'(1) = \frac{p + pq G_{i-1}'(1)}{(1 - q)^2} = \frac{1 + q G_{i-1}'(1)}{p} = \frac{1}{p} + \frac{q}{p} m_{i-1}
        = \frac{1}{p} + \frac{q}{p}\Big(\frac{1}{p} + \frac{q}{p} m_{i-2}\Big)
        = \cdots
        = \frac{1}{p}\Big[1 + \frac{q}{p} + \Big(\frac{q}{p}\Big)^2 + \cdots + \Big(\frac{q}{p}\Big)^{i-1}\Big] + \Big(\frac{q}{p}\Big)^i m_0.

Since N_{01} = 1, we have m_0 = 1. Hence

    m_i = \begin{cases} \frac{1 - (q/p)^i}{p - q} + (q/p)^i & \text{if } p \ne 0.5 \\ 2i + 1 & \text{if } p = 0.5 \end{cases}
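The recursion m_i = 1/p + (q/p) m_{i-1} and the two closed forms can be checked in a few lines (the values of p used below are arbitrary):

```python
# check the recursion m_i = 1/p + (q/p) m_{i-1}, m_0 = 1,
# against the closed form (arbitrary p != 0.5 first)
p, q = 0.3, 0.7
m = 1.0
for i in range(1, 6):
    m = 1 / p + (q / p) * m
    closed = (1 - (q / p) ** i) / (p - q) + (q / p) ** i
    assert abs(m - closed) < 1e-9

# symmetric case p = 0.5: the closed form degenerates to m_i = 2i + 1
p = q = 0.5
m = 1.0
for i in range(1, 6):
    m = 1 / p + (q / p) * m
    assert m == 2 * i + 1
print("means match the closed forms")
```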
Example: Symmetric Random Walk on (-\infty, \infty)

State space = \{\ldots, -2, -1, 0, 1, 2, \ldots\}

    P_{i,i+1} = P_{i,i-1} = 1/2  for all integers i.

◮ One class, recurrent. Null-recurrent or positive-recurrent?
◮ For i, j, define N_{ij} = \min\{m > 0 : X_m = j \mid X_0 = i\}.
◮ Note N_{00} = 1 + N_{10}.
◮ Given X_0 = 1,

    N_{10} = \begin{cases} 1 & \text{if } X_1 = 0 \\ 1 + N_{21} + N_{10}' & \text{if } X_1 = 2 \end{cases}    (3)

◮ Note N_{21} and N_{10}' are independent and have the same distribution as N_{10}. (Why?)

Generating Function of N_{10}

Let G(s) be the generating function of N_{10}. From (3), we know that

    G(s) = \frac{1}{2} s + \frac{1}{2} E[s^{1 + N_{21} + N_{10}'}] = \frac{1}{2} s + \frac{1}{2} s\, G(s)^2,

which is a quadratic equation in G(s). The two roots are

    G(s) = \frac{1 \pm \sqrt{1 - s^2}}{s}.
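One can verify numerically that the root with the minus sign satisfies the quadratic G = s/2 + s G^2/2 and stays in (0, 1), while the root with the plus sign exceeds 1 and so cannot be a generating function value. A small sketch:

```python
from math import isclose, sqrt

def G(s):
    # the root with the minus sign
    return (1 - sqrt(1 - s * s)) / s

def G_plus(s):
    # the root with the plus sign
    return (1 + sqrt(1 - s * s)) / s

for s in (0.1, 0.5, 0.9, 0.99):
    g = G(s)
    # both roots satisfy the quadratic G = s/2 + s G^2 / 2 ...
    assert isclose(g, 0.5 * s + 0.5 * s * g * g, rel_tol=1e-12)
    # ... but only the minus root lies in (0, 1)
    assert 0 < g < 1
    assert G_plus(s) > 1
print("the minus root is the generating function of N_10")
```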
Since G(s) must lie between 0 and 1 when 0 < s < 1, we take

    G(s) = \frac{1 - \sqrt{1 - s^2}}{s}.

Note that

    G'(s) = \frac{1}{\sqrt{1 - s^2} + 1 - s^2},  so  E[N_{10}] = \lim_{s\to 1^-} G'(s) = \infty,

which implies E[N_{00}] = 1 + E[N_{10}] = \infty.
⇒ The symmetric random walk is null recurrent.

The power series expansion of G(s) = \frac{1 - \sqrt{1 - s^2}}{s} can be found via Newton's binomial formula

    (1 - s^2)^\alpha = \sum_{k=0}^\infty \binom{\alpha}{k} (-s^2)^k,

where \binom{\alpha}{0} = 1 and, for k \ge 1, \binom{\alpha}{k} = \prod_{i=0}^{k-1} (\alpha - i) \big/ k!. For \alpha = 1/2 and k \ge 1,

    \binom{1/2}{k} = \frac{\frac{1}{2}(\frac{1}{2} - 1)(\frac{1}{2} - 2)(\frac{1}{2} - 3) \cdots (\frac{1}{2} - k + 1)}{k!}
                   = \frac{(-1)^{k-1}\, 1 \cdot 1 \cdot 3 \cdot 5 \cdots (2k-3)}{2^k\, k!}
                   = \frac{(-1)^{k-1}\, 1 \cdot 1 \cdot 3 \cdot 5 \cdots (2k-3)(2k-1)}{2^k\, k!\,(2k-1)}
                   = \frac{(-1)^{k-1}}{2^k\, k!\,(2k-1)} \cdot \frac{(2k-1)!}{2^{k-1}(k-1)!}
                   = \frac{(-1)^{k-1}}{2^{2k-1}(2k-1)} \binom{2k-1}{k}.
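The closed form for \binom{1/2}{k} can be verified with exact arithmetic by comparing the defining product with the final expression (a sketch; the range of k is an arbitrary choice):

```python
from fractions import Fraction
from math import comb, factorial

def half_binom(k):
    # defining product: C(1/2, k) = [prod_{i=0}^{k-1} (1/2 - i)] / k!
    num = Fraction(1)
    for i in range(k):
        num *= Fraction(1, 2) - i
    return num / factorial(k)

# closed form derived above: C(1/2, k) = (-1)^(k-1) C(2k-1, k) / (2^(2k-1) (2k-1))
for k in range(1, 12):
    closed = Fraction((-1) ** (k - 1) * comb(2 * k - 1, k),
                      2 ** (2 * k - 1) * (2 * k - 1))
    assert half_binom(k) == closed
print("C(1/2, k) matches the closed form")
```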
From all the above we have

    G(s) = \frac{1 - \sum_{k=0}^\infty \binom{1/2}{k} (-s^2)^k}{s}
         = -\sum_{k=1}^\infty \binom{1/2}{k} (-1)^k s^{2k-1}
         = \sum_{k=1}^\infty \frac{1}{2^{2k-1}(2k-1)} \binom{2k-1}{k} s^{2k-1}.

So the distribution of N_{10} is

    P(N_{10} = 2k-1) = \frac{1}{2^{2k-1}(2k-1)} \binom{2k-1}{k},  P(N_{10} = 2k) = 0,  for k = 1, 2, 3, \ldots

4.7 Branching Processes Revisit

Recall that a Branching Process is a population of individuals in which
◮ all individuals have the same lifespan, and
◮ each individual produces a random number of offspring at the end of its life.

Let X_n = size of the nth generation, n = 0, 1, 2, \ldots. Let Z_{n,i} = # of offspring produced by the ith individual in the nth generation. Then

    X_{n+1} = \sum_{i=1}^{X_n} Z_{n,i}.    (4)

Suppose the Z_{n,i}'s are i.i.d. with probability mass function P(Z_{n,i} = j) = P_j, j \ge 0. We consider the non-trivial case that P_j < 1 for all j \ge 0. Then \{X_n\} is a Markov chain with state space \{0, 1, 2, \ldots\}.
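Relation (4) translates directly into a simulation. The sketch below (with an assumed toy offspring law with \mu = 1, not from the lecture) simulates generation sizes and checks that the empirical mean of X_3 is close to \mu^3 = 1:

```python
import random

random.seed(0)  # fixed seed so the check below is reproducible

# assumed toy offspring law (P_0, P_1, P_2) = (1/4, 1/2, 1/4), so mu = 1
def offspring():
    return random.choices((0, 1, 2), weights=(1, 2, 1))[0]

def generation_sizes(n_gen, x0=1):
    # simulate X_0, ..., X_{n_gen} using X_{n+1} = sum_{i=1}^{X_n} Z_{n,i}
    sizes = [x0]
    for _ in range(n_gen):
        sizes.append(sum(offspring() for _ in range(sizes[-1])))
    return sizes

trials = 100_000
mean_x3 = sum(generation_sizes(3)[-1] for _ in range(trials)) / trials
assert abs(mean_x3 - 1) < 0.05  # E[X_n] = mu^n = 1 here
print("empirical E[X_3] =", mean_x3)
```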
Generating Functions of the Branching Processes

Let g(s) = E[s^{Z_{n,i}}] = \sum_{k=0}^\infty P_k s^k be the generating function of Z_{n,i}, and let G_n(s) be the generating function of X_n, n = 0, 1, 2, \ldots. Then \{G_n(s)\} satisfies the following two iterative equations:

(i) G_{n+1}(s) = G_n(g(s))  for n = 0, 1, 2, \ldots
(ii) G_{n+1}(s) = g(G_n(s))  if X_0 = 1, for n = 0, 1, 2, \ldots

Proof of (i).

    E[s^{X_{n+1}} \mid X_n] = E\Big[s^{\sum_{i=1}^{X_n} Z_{n,i}}\Big] = E\Big[\prod_{i=1}^{X_n} s^{Z_{n,i}}\Big]
                            = \prod_{i=1}^{X_n} E[s^{Z_{n,i}}]  (by indep. of the Z_{n,i})
                            = \prod_{i=1}^{X_n} g(s) = g(s)^{X_n}.

From this we have

    G_{n+1}(s) = E[s^{X_{n+1}}] = E\big[E[s^{X_{n+1}} \mid X_n]\big] = E[g(s)^{X_n}] = G_n(g(s)),

since G_n(s) = E[s^{X_n}].

Proof of (ii): HW today.

Example

Suppose X_0 = 1, and (P_0, P_1, P_2) = (1/4, 1/2, 1/4). Find the distribution of X_2.

Sol.

    g(s) = \frac{1}{4} s^0 + \frac{1}{2} s^1 + \frac{1}{4} s^2 = (1 + s)^2/4.

Since X_0 = 1, G_0(s) = E[s^{X_0}] = E[s^1] = s. From (i) we have

    G_1(s) = G_0(g(s)) = g(s) = (1 + s)^2/4,
    G_2(s) = G_1(g(s)) = \frac{1}{4}\Big(1 + \frac{1}{4}(1 + s)^2\Big)^2 = \frac{1}{64}(5 + 2s + s^2)^2
           = \frac{1}{64}(25 + 20s + 14s^2 + 4s^3 + s^4) = \sum_{k=0}^\infty P(X_2 = k) s^k.

The coefficient of s^k in the polynomial G_2(s) is the chance that X_2 = k:

    k           0      1      2      3     4
    P(X_2 = k)  25/64  20/64  14/64  4/64  1/64

and P(X_2 = k) = 0 for k \ge 5.

Extinction Probability

Let

    \pi_0 = \lim_{n\to\infty} P(X_n = 0 \mid X_0 = 1) = P(\text{the population will eventually die out} \mid X_0 = 1).

Proposition
The extinction probability \pi_0 is a root of the equation

    g(s) = s,    (5)

where g(s) = \sum_{k=0}^\infty P_k s^k is the generating function of Z_{n,i}.
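For the example above, the iteration G_{n+1}(s) = g(G_n(s)) is just polynomial composition, so the distribution of X_2 can be computed mechanically by composing g with itself. A sketch with exact rational coefficients:

```python
from fractions import Fraction

def poly_mul(a, b):
    out = [Fraction(0)] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

def poly_compose(f, g):
    # f(g(s)) for polynomial coefficient lists f and g (Horner's scheme)
    out = [f[-1]]
    for c in reversed(f[:-1]):
        out = poly_mul(out, g)
        out[0] += c
    return out

# offspring generating function g(s) = (1 + s)^2 / 4 from the example
g = [Fraction(1, 4), Fraction(1, 2), Fraction(1, 4)]

# with X_0 = 1: G_1(s) = g(s) and G_2(s) = g(g(s))
G2 = poly_compose(g, g)
assert G2 == [Fraction(c, 64) for c in (25, 20, 14, 4, 1)]
print("P(X_2 = k), k = 0..4:", G2)
```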
Proof.

    \pi_0 = P(\text{population dies out}) = \sum_{j=0}^\infty P(\text{population dies out} \mid X_1 = j) P_j = \sum_{j=0}^\infty \pi_0^j P_j = g(\pi_0).

Proposition I
Let \mu = E[Z_{n,i}] = \sum_{j=0}^\infty j P_j. If \mu \le 1, the extinction probability \pi_0 is 1 unless P_1 = 1.

Proof. Let h(s) = g(s) - s. Since g(1) = 1 and g'(1) = \mu,

    h(1) = g(1) - 1 = 0,
    h'(s) = \sum_{j=1}^\infty j P_j s^{j-1} - 1 \le \sum_{j=1}^\infty j P_j - 1 = \mu - 1  for 0 \le s < 1.

Thus \mu \le 1 ⇒ h'(s) \le 0 for 0 \le s < 1 (and h'(s) < 0 strictly unless P_1 = 1)
⇒ h(s) is non-increasing in [0, 1)
⇒ h(s) > h(1) = 0 for 0 \le s < 1
⇒ g(s) > s for 0 \le s < 1
⇒ there is no root of g(s) = s in [0, 1), so \pi_0 = 1.

Proposition II
If \mu > 1, there is a unique root of the equation g(s) = s in the domain [0, 1), and that root is the extinction probability.

Proof. Let h(s) = g(s) - s. Observe that

    h(0) = g(0) = P_0 > 0,
    h'(0) = g'(0) - 1 = P_1 - 1 < 0.

Then \mu > 1 ⇒ h'(1) = \mu - 1 > 0
⇒ h(s) is increasing near 1
⇒ h(1 - \delta) < h(1) = 0 for \delta > 0 small enough.

Since h(s) is continuous in [0, 1) with h(0) > 0 and h(1 - \delta) < 0, there must be a root of h(s) = 0 in [0, 1). The root is unique since

    h''(s) = g''(s) = \sum_{j=2}^\infty j(j-1) P_j s^{j-2} \ge 0  for 0 \le s < 1,

i.e., h(s) is convex in [0, 1).
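Propositions I and II suggest a simple numerical procedure: iterate s ← g(s) starting from 0; the n-th iterate is exactly G_n(0) = P(X_n = 0 | X_0 = 1), which increases to \pi_0. A sketch with an assumed supercritical offspring law (not from the lecture):

```python
# assumed supercritical offspring law: (P_0, P_1, P_2) = (1/4, 1/4, 1/2), mu = 5/4 > 1
P = (0.25, 0.25, 0.5)

def g(s):
    # offspring generating function g(s) = sum_j P_j s^j
    return sum(p_j * s**j for j, p_j in enumerate(P))

mu = sum(j * p_j for j, p_j in enumerate(P))
assert mu > 1

# iterate s <- g(s) from 0; the n-th iterate is G_n(0) = P(X_n = 0 | X_0 = 1),
# which increases to the extinction probability pi_0.
# here g(s) = s reads 2s^2 - 3s + 1 = 0, with roots 1/2 and 1, so pi_0 = 1/2.
s = 0.0
for _ in range(200):
    s = g(s)
assert abs(s - 0.5) < 1e-9
print("extinction probability ~", s)
```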