Statistics 253/317
Introduction to Probability Models
Winter 2013 - Midterm Exam Solutions
Problem 1. [10 points] A Markov chain with state space {0, 1, 2, 3, 4} has transition matrix

            0     1     2     3     4
      0  [  0    1/2    0     0    1/2 ]
      1  [  0     0     1     0     0  ]
P =   2  [  0    1/2    0    1/2    0  ]
      3  [  0     0     1     0     0  ]
      4  [ 1/2    0     0    1/2    0  ]
Find all communicating classes. For each state, determine its period and whether it is recurrent
or transient. Explain your reasoning.
Answer: The communicating classes are {0, 4} and {1, 2, 3}. To see the former, note that 0 and
4 communicate in one step, but that 0 is not accessible from any state other than 4, and vice
versa. To see that {1, 2, 3} is a class, note that the circuit 1 → 2 → 3 → 2 → 1 is possible.
Both classes have period d = 2. To see this, observe that in {0, 4} the chain alternates between
0 and 4 (until it leaves the class), and in {1, 2, 3} it alternates between being in state 2 and
being in state 1 or 3.
The class {1, 2, 3} is recurrent since the chain can never leave it. The class {0, 4} is transient
since there is a positive chance of leaving it, and once the chain leaves, it can never return to
{0, 4}.
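As a quick sanity check, the communicating classes can be recovered mechanically from the positive entries of P. A minimal Python sketch (the successor sets below transcribe the transition matrix above; the class computation is just mutual reachability):

```python
# Positive-probability moves of the Problem 1 chain (states 0..4),
# read off the rows of the transition matrix P.
succ = {0: {1, 4}, 1: {2}, 2: {1, 3}, 3: {2}, 4: {0, 3}}

def reachable(i):
    """All states accessible from i (including i itself), via a graph search."""
    seen, frontier = {i}, [i]
    while frontier:
        x = frontier.pop()
        for y in succ[x]:
            if y not in seen:
                seen.add(y)
                frontier.append(y)
    return seen

reach = {i: reachable(i) for i in range(5)}
# i and j communicate iff each is accessible from the other.
classes = {frozenset(j for j in range(5) if j in reach[i] and i in reach[j])
           for i in range(5)}
print(sorted(map(sorted, classes)))  # [[0, 4], [1, 2, 3]]
```

The search also confirms transience of {0, 4}: every state reaches the closed class {1, 2, 3}, but 0 and 4 are reachable only from each other.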
Problem 2. [20 points] Let i and j be two states of a Markov chain. Let fij be the probability
that, starting in i, the chain ever reaches j, and let fji be the probability that, starting in j,
the chain ever reaches i.
(a) [10 pts] If i and j are both transient and i communicates with j, argue that fij and fji
cannot both be equal to one.
Answer: Suppose fij and fji are both equal to 1. Then starting in i, with probability 1
the chain will visit j and once there with probability 1 the chain will go back to i. So with
probability 1 the chain, starting in i, will return to i, contradicting the assumption that i is
transient.
(b) [10 pts] If i and j are both recurrent, is it possible for i to be accessible from j but j not
accessible from i? Justify your answer.
Answer: No. If i and j are both recurrent and they do not communicate then they are
in different classes, and recurrent classes are closed, so neither could be accessible from the
other.
Problem 3. [35 points] [Nearly Symmetric Random Walk]
Let {Xn : n ≥ 0} be a (non-simple) random walk on {0, 1, 2, . . .} with transition probabilities

Pi,i+1 = ai+1/(ai + ai+1) for i ≥ 0,   Pi,i−1 = ai/(ai + ai+1) for i ≥ 1,   P0,0 = a0/(a0 + a1).

Here {ai > 0 : i = 0, 1, 2, . . .} is a positive sequence such that limi→∞ ai+1/ai = 1. Observe that
Pi,i+1 → 1/2 when i is large. Thus the process behaves like a symmetric random walk when it is
far from 0. Since ai > 0 for all i = 0, 1, 2, . . . and P0,0 > 0, {Xn} is irreducible and aperiodic. We
know that a symmetric random walk is null recurrent; will a nearly symmetric random walk be null
recurrent or positive recurrent?
(a) [10 pts] Show that {Xn} has a limiting distribution (and hence is positive recurrent) if

∑i≥0 ai < ∞.

Express the limiting distribution of {Xn} in terms of {a0, a1, a2, . . .}.
(Hint: Solve the detailed balance equations for the limiting distribution.)
Answer: Solving the detailed balance equation πi Pi,i+1 = πi+1 Pi+1,i, we get

πi · ai+1/(ai + ai+1) = πi+1 · ai+1/(ai+1 + ai+2)
⇒ πi/(ai + ai+1) = πi+1/(ai+1 + ai+2),   i = 0, 1, 2, . . .

We can see that πi = C(ai + ai+1) is a solution to the set of equations above. To make
∑i≥0 πi = 1, we need

C = [2(∑i≥0 ai) − a0]^(−1),

which is positive and finite if ∑i≥0 ai < ∞. So the limiting distribution exists in this case,
and it is

πi = (ai + ai+1) / [2(∑j≥0 aj) − a0].
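This solution can be spot-checked numerically. The sketch below uses ai = 2^(−i), an arbitrary summable sequence chosen only for illustration, and verifies on a truncated range that πi = C(ai + ai+1) satisfies detailed balance and sums to 1:

```python
# Spot-check of part (a) with the illustrative summable sequence a_i = 2**(-i).
a = [2.0 ** -i for i in range(200)]
C = 1 / (2 * sum(a) - a[0])                      # normalizing constant
pi = [C * (a[i] + a[i + 1]) for i in range(199)]

# Detailed balance: pi_i P_{i,i+1} == pi_{i+1} P_{i+1,i}.
for i in range(198):
    lhs = pi[i] * a[i + 1] / (a[i] + a[i + 1])
    rhs = pi[i + 1] * a[i + 1] / (a[i + 1] + a[i + 2])
    assert abs(lhs - rhs) <= 1e-12 * max(lhs, rhs)

print(round(sum(pi), 6))  # ≈ 1.0 (the truncated tail is negligible)
```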
(b) [5 pts] For ai = 1/(i+1)^2, i = 0, 1, 2, . . ., find the limiting distribution of {Xn}. You may
use the identity ∑n≥1 1/n^2 = π^2/6.
Answer: Using the given identity, ∑i≥0 ai = ∑i≥0 1/(i+1)^2 = π^2/6, we get

C = (2 × π^2/6 − a0)^(−1) = (π^2/3 − 1)^(−1) = 3/(π^2 − 3),

so

πi = [3/(π^2 − 3)] · [1/(i+1)^2 + 1/(i+2)^2],   i = 0, 1, 2, . . .
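A quick numerical check that this distribution is properly normalized (a sketch; the series is simply truncated at a large index):

```python
import math

# Check that the part (b) limiting distribution sums to 1.
C = 3 / (math.pi ** 2 - 3)
total = sum(C * (1 / (i + 1) ** 2 + 1 / (i + 2) ** 2) for i in range(200_000))
print(round(total, 4))  # ≈ 1 (truncation error is of order 1/200000)
```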
(c) [10 pts] Let Mi,j be the mean time to reach state j starting in state i. First show that for a
general random walk on {0, 1, 2, . . .},

Mi,i+1 = 1 + Pi,i−1 Mi−1,i + Pi,i−1 Mi,i+1,   for i ≥ 1,

and M0,1 = 1/P0,1. Then show that for the nearly symmetric random walk above,

ai+1 Mi,i+1 = ai + ai+1 + ai Mi−1,i.
Answer: For i < j, define

Ni,j = min{m > 0 : Xm = j | X0 = i} = time to reach state j starting in state i.

Note that E[Ni,j] is exactly Mi,j in the problem. Given X0 = i,

Ni,i+1 = 1                          if X1 = i + 1,
Ni,i+1 = 1 + N*i−1,i + N*i,i+1      if X1 = i − 1,

where N*i−1,i ∼ Ni−1,i, N*i,i+1 ∼ Ni,i+1, and N*i−1,i, N*i,i+1 are independent. Thus, for i ≥ 1,

Mi,i+1 = E[Ni,i+1] = 1 × Pi,i+1 + E[1 + N*i−1,i + N*i,i+1] Pi,i−1
       = Pi,i+1 + Pi,i−1 (1 + Mi−1,i + Mi,i+1)
       = 1 + Pi,i−1 (Mi−1,i + Mi,i+1).

For M0,1, observe that given X0 = 0,

N0,1 = 1              if X1 = 1,
N0,1 = 1 + N*0,1      if X1 = 0,

where N*0,1 ∼ N0,1. Thus

M0,1 = 1 + E[N*0,1] P0,0 = 1 + M0,1 P0,0   ⇒   M0,1 = 1/P0,1.

For the nearly symmetric random walk, plugging in Pi,i+1 = ai+1/(ai + ai+1) and
Pi,i−1 = ai/(ai + ai+1), we get

Mi,i+1 = 1 + [ai/(ai + ai+1)] Mi−1,i + [ai/(ai + ai+1)] Mi,i+1
⇒ ai+1 Mi,i+1 = ai + ai+1 + ai Mi−1,i.
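The recursion can be checked numerically. The sketch below iterates it for the sequence ak = 1/((k+1)(k+2)) used in part (d), and compares one value against a Monte Carlo estimate of the mean passage time (sample size and seed are arbitrary choices):

```python
import random

# Check the part (c) recursion a_{i+1} M_{i,i+1} = a_i + a_{i+1} + a_i M_{i-1,i}
# for a_k = 1/((k+1)(k+2)), against a direct simulation of the walk.
random.seed(0)

def a(k):
    return 1 / ((k + 1) * (k + 2))

def step(i):
    """One transition of the nearly symmetric walk (with a self-loop at 0)."""
    if random.random() < a(i + 1) / (a(i) + a(i + 1)):
        return i + 1
    return max(i - 1, 0)

def passage_time(i):
    """Number of steps to go from state i to state i + 1."""
    x, n = i, 0
    while x != i + 1:
        x, n = step(x), n + 1
    return n

# Exact values from the recursion, starting at M_{0,1} = 1/P_{0,1}.
M = [(a(0) + a(1)) / a(1)]
for i in range(1, 4):
    M.append((a(i) + a(i + 1) + a(i) * M[-1]) / a(i + 1))
print([round(m, 6) for m in M])  # [4.0, 11.0, 21.0, 34.0]

est = sum(passage_time(1) for _ in range(20000)) / 20000
print(round(est, 2))  # Monte Carlo estimate of M_{1,2}; should be near 11
```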
(d) [10 pts] Use the equation in part (c) to find an expression for Mi,i+1 when

ak = 1/[(k + 1)(k + 2)]   for k ≥ 0.

The expression for Mi,i+1 cannot involve an unevaluated summation. Then argue that

M0,n = M0,1 + M1,2 + M2,3 + · · · + Mn−1,n

to find M0,n. Hint: 1/[(k+1)(k+2)] = 1/(k+1) − 1/(k+2).
Answer: Let us define Hi = ai Mi−1,i. Then H1 = a1 M0,1 = a1/P0,1 = a0 + a1, and from the
recursion in part (c) we obtain Hi+1 − Hi = ai + ai+1. So

Hi = (ai + ai−1) + Hi−1
   = (ai + ai−1) + (ai−1 + ai−2) + Hi−2
   = (ai + ai−1) + (ai−1 + ai−2) + · · · + (a2 + a1) + H1
   = (ai + ai−1) + (ai−1 + ai−2) + · · · + (a2 + a1) + (a1 + a0)
   = 2(∑j=0..i aj) − a0 − ai.

Thus in general, we see that

Mi−1,i = Hi/ai = [2(∑j=0..i aj) − a0 − ai]/ai.

For ak = 1/[(k+1)(k+2)], the hint gives a telescoping sum:

∑j=0..i aj = ∑j=0..i [1/(j+1) − 1/(j+2)] = 1 − 1/1 + 1/2 cancellations aside
           = 1 − 1/(i+2).

Plugging into the equation above (with a0 = 1/2 and ai = 1/(i+1) − 1/(i+2)), we get

Hi = 2(1 − 1/(i+2)) − 1/2 − [1/(i+1) − 1/(i+2)] = 3/2 − 1/(i+1) − 1/(i+2).

Thus

Mi−1,i = Hi/ai = (i+1)(i+2) [3/2 − 1/(i+1) − 1/(i+2)]
       = (3/2)(i+1)(i+2) − (i+2) − (i+1) = (3i^2 + 5i)/2,

and

M0,n = ∑i=1..n Mi−1,i = ∑i=1..n (3i^2 + 5i)/2
     = (3/2) · n(n+1)(2n+1)/6 + (5/2) · n(n+1)/2
     = n(n+1)(2n+1)/4 + 5n(n+1)/4 = n(n+1)(n+3)/2.
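The closed form for M0,n can be cross-checked against the term-by-term sum of Mi−1,i = (3i^2 + 5i)/2 (a deterministic check, no simulation needed):

```python
# Cross-check the closed form M_{0,n} = n(n+1)(n+3)/2 against the sum
# of the individual mean passage times M_{i-1,i} = (3i^2 + 5i)/2.
for n in range(1, 50):
    total = sum((3 * i ** 2 + 5 * i) / 2 for i in range(1, n + 1))
    assert total == n * (n + 1) * (n + 3) / 2
print("closed form agrees for n = 1..49")
```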
Problem 4. [35 points] Suppose that people arrive at a bus stop in accordance with a Poisson
process with rate λ. The bus departs at time t.
(a) [5 pts] Suppose everyone who arrives waits until the bus comes, i.e., everyone who arrives
during [0, t] gets on the bus. What is the probability that the bus departs with n people aboard?
Answer: Let N(t) be the number of people who have arrived by time t (and hence are on the
bus). {N(t), t ≥ 0} is a Poisson process, so N(t) ∼ Poisson(λt) and

P(N(t) = n) = e^(−λt) (λt)^n / n!.
(b) [10 pts] Let X be the total amount of waiting time of all those who get on the bus at time
t. Find E[X]. (Hint: condition on the number of people on the bus.)
Answer: Let Si be the arrival time of the ith person. Then the total waiting time of all people
aboard is

X = ∑i=1..N(t) (t − Si).

Now we know that (S1, S2, . . . , Sn) | N(t) = n ∼ (U(1), U(2), . . . , U(n)), where U(1), U(2), . . . , U(n)
are the order statistics of n i.i.d. Uniform(0, t] random variables U1, U2, . . . , Un. So

E(X | N(t) = n) = E[∑i=1..n (t − Si)]
                = E[∑i=1..n (t − U(i))]
                = E[∑i=1..n (t − Ui)]     (summing over the order statistics
                                           equals summing over the sample)
                = ∑i=1..n [t − E(Ui)]     (E(Ui) = t/2 since Ui ∼ Uniform(0, t])
                = n(t − t/2) = nt/2.

Thus E(X | N(t)) = tN(t)/2, and

E(X) = E[E(X | N(t))] = (t/2) E(N(t)) = (t/2)(λt) = λt^2/2.
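The answer E[X] = λt^2/2 is easy to confirm by simulation. A minimal sketch; the rate λ = 2 and horizon t = 3 below are arbitrary illustrative values:

```python
import random

# Monte Carlo check of E[X] = lam * t**2 / 2 from part (b).
random.seed(1)
lam, t = 2.0, 3.0

def total_wait():
    """Total waiting time of all Poisson arrivals on [0, t]."""
    s, waits = random.expovariate(lam), 0.0
    while s < t:
        waits += t - s                 # this person waits t - s
        s += random.expovariate(lam)   # next inter-arrival time
    return waits

n = 100_000
est = sum(total_wait() for _ in range(n)) / n
print(round(est, 2))  # theory: lam * t**2 / 2 = 9.0
```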
Suppose now that each person arriving at the bus stop will independently wait an amount of
time that has an exponential distribution with rate µ. If the bus has not arrived by then, he/she
will leave the bus stop.
(c) [10 pts] What is the probability that the bus departs with n people aboard?
Answer: We categorize each person arriving at the bus stop as Type 1 if he/she gets on the
bus, and Type 2 if not. Let Ti be the time the ith person is willing to wait; Ti ∼ Exp(µ). A
person who arrives at time x will get on the bus if Ti > t − x, so the chance that he/she gets
on the bus is p(x) = P(Ti > t − x) = e^(−µ(t−x)). Let N1(t) be the number of Type 1 people.
Then N1(t) ∼ Poisson(λ ∫0..t p(x) dx), and

∫0..t p(x) dx = ∫0..t e^(−µ(t−x)) dx = (1 − e^(−µt))/µ.

So N1(t) ∼ Poisson((λ/µ)(1 − e^(−µt))) and

P(N1(t) = n) = exp(−(λ/µ)(1 − e^(−µt))) · [(λ/µ)(1 − e^(−µt))]^n / n!.
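The thinning result can be checked by simulating arrivals and patience times directly. A sketch with illustrative parameter values λ = 3, µ = 1, t = 2:

```python
import math
import random

# Simulation check of part (c): the number who board is Poisson with
# mean (lam/mu) * (1 - exp(-mu * t)).
random.seed(2)
lam, mu, t = 3.0, 1.0, 2.0

def boarders():
    count, s = 0, random.expovariate(lam)
    while s < t:
        # A person arriving at time s boards iff their patience exceeds t - s.
        if random.expovariate(mu) > t - s:
            count += 1
        s += random.expovariate(lam)
    return count

n = 100_000
est = sum(boarders() for _ in range(n)) / n
theory = lam / mu * (1 - math.exp(-mu * t))
print(round(est, 2), round(theory, 2))  # the two should agree closely
```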
(d) [10 pts] Suppose that at time s (s < t) there are k people waiting at the bus stop. What
is the expected number of customers who will get on the bus at time t? (Note that some people
may leave the bus stop and some may arrive.)
Answer: Let N1 and N2 be, respectively, the number of people aboard who arrived at the
bus stop before and after time s. Using the thinning argument from part (c), we can show that

N2 ∼ Poisson((λ/µ)(1 − e^(−µ(t−s)))),   and hence   E[N2] = (λ/µ)(1 − e^(−µ(t−s))).

By the memoryless property of the exponential distribution, each of the k people waiting at
the bus stop at time s will independently wait for another Exp(µ) amount of time. Thus each
of them will get on the bus with probability P(Ti > t − s) = e^(−µ(t−s)), so E[N1] = k e^(−µ(t−s)).
So the expected number of people who get on the bus at time t is

E[N1] + E[N2] = k e^(−µ(t−s)) + (λ/µ)(1 − e^(−µ(t−s))).
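This decomposition can also be verified by simulation. A sketch; the values λ = 2, µ = 0.5, s = 1, t = 4, k = 5 are arbitrary illustrative choices:

```python
import math
import random

# Simulation check of part (d): k people already waiting at time s (with
# fresh Exp(mu) patience, by memorylessness) plus new arrivals on (s, t].
random.seed(3)
lam, mu, s, t, k = 2.0, 0.5, 1.0, 4.0, 5

def boarders():
    # Each of the k people present at time s boards iff they outlast t - s.
    count = sum(1 for _ in range(k) if random.expovariate(mu) > t - s)
    u = s + random.expovariate(lam)
    while u < t:
        if random.expovariate(mu) > t - u:
            count += 1
        u += random.expovariate(lam)
    return count

n = 100_000
est = sum(boarders() for _ in range(n)) / n
theory = k * math.exp(-mu * (t - s)) + lam / mu * (1 - math.exp(-mu * (t - s)))
print(round(est, 2), round(theory, 2))
```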