Homework 1 - Math 468 (Applied Stochastic Processes), Spring 15
1. (from Lawler) The Smiths receive the newspaper every morning and put
it in a pile after reading it. In the afternoon someone moves all the papers
in the pile to the recycling with probability 1/3. Also in the afternoon, if the
pile has five papers in it, then Mr. Smith moves all the papers to the recycling
(with probability 1). The number of papers in the pile in the evening can be
modelled as a Markov chain.
(a) Give the state space and the transition matrix.
(b) Suppose that on the first evening there is one paper in the pile (so X0 = 1).
Find the probabilities that 5 evenings later there are 0, 1, 2, 3, 4 papers in the
pile.
(c) Again start with X0 = 1. After a long time, find the probabilities of there
being 0, 1, 2, 3, 4 papers in the pile.
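Once the matrix from part (a) is written down, parts (b) and (c) reduce to a matrix power and a left eigenvector, which are easy to check numerically. The sketch below assumes one reading of the dynamics — evening pile sizes {0, 1, 2, 3, 4}, with P(k, 0) = 1/3 and P(k, k+1) = 2/3 for k < 4, and P(4, 0) = 1 — so treat the matrix as an assumption to compare against your own answer, not as the official one.

```python
import numpy as np

# Assumed transition matrix for the evening pile size on {0,1,2,3,4}:
# one paper arrives each morning; the pile is recycled with prob 1/3,
# or with certainty once it holds five papers (our reading, not the answer key).
P = np.zeros((5, 5))
for k in range(4):
    P[k, 0] = 1 / 3      # pile moved to recycling in the afternoon
    P[k, k + 1] = 2 / 3  # pile kept
P[4, 0] = 1.0            # five papers are always recycled

# (b) distribution 5 evenings after X0 = 1
mu0 = np.zeros(5)
mu0[1] = 1.0
mu5 = mu0 @ np.linalg.matrix_power(P, 5)
print("after 5 evenings:", mu5.round(4))

# (c) long-run behaviour: left eigenvector of P for eigenvalue 1, normalized
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1))])
pi /= pi.sum()
print("stationary distribution:", pi.round(4))
```

Under the assumption above, the stationary weights come out proportional to (81, 54, 36, 24, 16).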
2. Let ξn , n = 0, 1, 2, · · · be a sequence of i.i.d. (independent, identically
distributed) random variables taking values in {0, 1, 2, · · · , N − 1}. Let
X0 = ξ0,
Xn = (ξn + ξn−1) mod N for n ≥ 1,
Yn = max{ξ0, ξ1, · · · , ξn},
Zn = (ξ0 + ξ1 + · · · + ξn) mod N.
For each process Xn , Yn , Zn , determine if it is a Markov process. Explain
your reasoning.
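A quick simulation can make the question concrete, though it proves nothing: if a process is Markov, the empirical law of the next step should depend only on the current state, not on the earlier history. The probe below does this for Xn with N = 3 and a deliberately skewed (non-uniform) law for ξ — both arbitrary illustrative choices; with uniform ξ the history-dependence of Xn would be invisible.

```python
import random
from collections import defaultdict

# Empirical probe (not a proof): estimate P(X3 = j | X2, X1) for
# X_n = (xi_n + xi_{n-1}) mod N and see whether it varies with X1.
# N = 3 and the skewed law for xi are arbitrary illustrative choices.
random.seed(0)
N, trials = 3, 200_000
values, weights = [0, 1, 2], [0.7, 0.15, 0.15]

counts = defaultdict(lambda: defaultdict(int))
for _ in range(trials):
    xi = random.choices(values, weights, k=4)        # xi_0 .. xi_3
    x1, x2, x3 = ((xi[n] + xi[n - 1]) % N for n in (1, 2, 3))
    counts[(x1, x2)][x3] += 1                        # history (X1, X2) -> X3

# If X were Markov, histories sharing the same current state X2 would
# give (up to sampling noise) identical rows below.
for hist in sorted(counts):
    tot = sum(counts[hist].values())
    print(hist, [round(counts[hist][j] / tot, 3) for j in range(N)])
```

Comparing the rows for histories (0, 0) and (1, 0) — same current state, different past — shows clearly different next-step frequencies; the same probe can be repeated for Yn and Zn.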
3. Suppose you have a system that is not Markovian because the probability
of jumping to state j at time n + 1 depends not just on the state at time n
but also on the state at time n − 1. This still can be described by a Markov
process by using the following idea. We use a larger state space. In the new
state space a state will consist not just of the state of the system at time n,
but also its state at time n − 1. This problem uses this idea.
Suppose that the probability it rains today depends on whether or not it
rained yesterday and the day before. If it rained both yesterday and the day
before, then it will rain today with probability 0.8. If it did not rain
either yesterday or the day before, then it will rain today with probability
0.2. In all other cases the weather today will be the same as it was yesterday
with probability 0.6. Set up a Markov chain to describe this and find its
transition matrix P.
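The pair-state construction can be made concrete in a few lines. In the sketch below a state is the pair (weather yesterday, weather today), with R for rain and D for dry; the state ordering and the labels are our own choices, while the rain probabilities are the ones given in the problem.

```python
import numpy as np

# A state records (weather yesterday, weather today); ordering is our choice.
states = [("R", "R"), ("R", "D"), ("D", "R"), ("D", "D")]

def p_rain(yesterday, today):
    # chance of rain tomorrow, read off from the problem statement
    if yesterday == "R" and today == "R":
        return 0.8
    if yesterday == "D" and today == "D":
        return 0.2
    # mixed history: tomorrow repeats today's weather with probability 0.6
    return 0.6 if today == "R" else 0.4

P = np.zeros((4, 4))
for i, (a, b) in enumerate(states):
    for j, (c, d) in enumerate(states):
        if c == b:  # tomorrow's pair must start with today's weather
            P[i, j] = p_rain(a, b) if d == "R" else 1 - p_rain(a, b)

print(P)
```

Each row has exactly two nonzero entries, since from (a, b) the chain can only move to (b, R) or (b, D).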
4. Find the communication classes of the following chains and determine if
they are transient or recurrent.




P1 =
 0    0.5  0.5
 0.5  0    0.5
 0.5  0.5  0

P2 =
 0    0    0    1
 0    0    0    1
 0.5  0.5  0    0
 0    0    1    0

P3 =
 0.5  0    0.5  0    0
 0.2  0.5  0.3  0    0
 0    0    1    0    0
 0    0    0    0.5  0.5
 0    0    0    0.5  0.5

P4 =
 0.2  0.8  0    0    0
 0.5  0.5  0    0    0
 0.5  0    0.5  0    0
 0    0.3  0.7  0    0
 1    0    0    0    0
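For finite chains the whole of problem 4 can be mechanized: the communication classes are the strongly connected components of the directed graph with an edge i → j whenever P(i, j) > 0, and a class is recurrent exactly when it is closed (no edge leaves it). The sketch below runs this recipe on a 4-state matrix matching our reading of P2 above; since the printed matrices are hard to make out, treat the entries as an assumption.

```python
import numpy as np

# One reading of P2 from the problem sheet (an assumption).
P = np.array([[0.0, 0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0, 1.0],
              [0.5, 0.5, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])
n = len(P)

# Transitive closure (Warshall): reach[i, j] iff j is reachable from i.
reach = (P > 0) | np.eye(n, dtype=bool)
for k in range(n):
    reach |= reach[:, k:k + 1] & reach[k:k + 1, :]

# Communication classes: i ~ j iff each state reaches the other.
classes = []
for i in range(n):
    cls = frozenset(j for j in range(n) if reach[i, j] and reach[j, i])
    if cls not in classes:
        classes.append(cls)

# A communication class of a finite chain is recurrent iff it is closed.
results = []
for cls in classes:
    closed = all(j in cls for i in cls for j in range(n) if P[i, j] > 0)
    results.append((sorted(cls), "recurrent" if closed else "transient"))
    print(sorted(cls), "recurrent" if closed else "transient")
```

For this reading of P2 every state reaches every other, so there is a single closed, hence recurrent, class.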
************** Do one of problems 5 and 6 **************
5. (from Lawler) Let Xn be an irreducible Markov chain on the state space
{1, 2, · · · , N }. Prove that there is a finite constant C and a constant ρ < 1
such that for any states i, j
P(Xm ≠ j for m = 0, 1, 2, · · · , n | X0 = i) ≤ Cρ^n
Prove that this implies that E[T ] < ∞ where T is the first time that the
chain reaches the state j. (Hint: show there is δ > 0 such that for all i the
probability of reaching j sometime in the first N steps, starting from i, is at
least δ.)
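The bound in problem 5 can be watched numerically before it is proved. For fixed i and j, P(Xm ≠ j for m = 1, · · · , n | X0 = i) is the i-th row sum of Q^n, where Q is P with the j-th row and column deleted; the successive ratios of these probabilities stay below 1 and settle at a constant ρ. The 3-state matrix and the choice j = 2 below are arbitrary illustrative assumptions.

```python
import numpy as np

# Illustration (not the proof): avoidance probabilities decay geometrically.
# The chain and the taboo state j are arbitrary choices.
P = np.array([[0.2, 0.3, 0.5],
              [0.4, 0.1, 0.5],
              [0.25, 0.25, 0.5]])
j = 2  # taboo state
Q = np.delete(np.delete(P, j, axis=0), j, axis=1)

# P(X_m != j for m = 1,...,n | X_0 = 0) = row-0 sum of Q^n
probs = [np.linalg.matrix_power(Q, n).sum(axis=1)[0] for n in range(1, 16)]
ratios = [b / a for a, b in zip(probs, probs[1:])]
print([round(p, 6) for p in probs])
print([round(r, 4) for r in ratios])  # the geometric rate rho
```

Here Q happens to have constant row sums, so the ratio is exactly 0.5 from the start; generically the ratios converge to the largest eigenvalue of Q, which plays the role of ρ.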
6. (from Lawler) Let Xn be an irreducible, aperiodic Markov chain with state
space S starting at state i with transition matrix P . Define
T = min{n > 0 : Xn = i}
So T is the first time the chain returns to i. For a state j let
"T −1
#
X
r(j) = E
I(Xn = j)
n=0
(a) Let r be the vector (r(1), r(2), · · · , r(N)). Show that rP = r. Conclude
that r is equal to some constant times the stationary distribution π.
(b) Show that E[T] = Σ_{j∈S} r(j).
(c) Conclude that E[T ] = 1/π(i) where π is the invariant probability distribution. Hint: what is r(i)?
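The identity in (c) is easy to sanity-check by simulation (this illustrates it, it does not prove it): pick a small irreducible aperiodic chain, compute π as the left eigenvector for eigenvalue 1, and compare 1/π(i) with the average simulated return time. The 3-state matrix below is an arbitrary example.

```python
import random
import numpy as np

# Arbitrary small irreducible aperiodic chain for the check.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.2, 0.6],
              [0.3, 0.3, 0.4]])

# Stationary distribution: left eigenvector of P for eigenvalue 1, normalized.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1))])
pi /= pi.sum()

# Simulate many return times to state i and average them.
random.seed(1)
i, total_steps, trips = 0, 0, 20_000
for _ in range(trips):
    state = i
    while True:
        state = random.choices(range(3), weights=P[state].tolist(), k=1)[0]
        total_steps += 1
        if state == i:
            break

print("simulated E[T]:", total_steps / trips)
print("1/pi(i):      ", 1 / pi[i])
```

For this matrix π = (15, 12, 17)/44, so 1/π(0) = 44/15 ≈ 2.93, and the simulated mean return time lands close to it.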