4.2 Markov Processes.
Let's summarize the assumptions from section 4.1 that we will be making in the rest of this chapter. We
have a system that at any non-negative time t can be in any of N states numbered i = 1, 2, ..., N. Let
X(t) = state at time t
(1)    Ti = time until the next transition after the system enters state i
(2)    Zi = the state the system enters at the next transition after it enters state i
We assume
(3)    Ti = an exponential random variable with mean 1/vi
(4)    vi = rate at which the system leaves state i = 1/(mean of Ti)
(5)    rij = probability that the next state is j after we make a transition out of state i
           = Pr{Zi = j}
(6)    qij = rij vi = rate at which the system makes transitions from state i to state j if i ≠ j
(7)    qii = -vi = -(rate at which the system makes transitions out of state i)
(8)    Q =  [ q11   q12   ...  q1N ]     [ -v1     r12v1   ...  r1Nv1 ]
            [ q21   q22   ...  q2N ]  =  [ r21v2   -v2     ...  r2Nv2 ]
            [ ...                  ]     [ ...                        ]
            [ qN1   qN2   ...  qNN ]     [ rN1vN   rN2vN   ...  -vN   ]
       = generator matrix

(9)    R =  [ 0     r12   ...  r1N ]
            [ r21   0     ...  r2N ]
            [ ...                  ]
            [ rN1   rN2   ...  0   ]
       = transition matrix for the embedded Markov chain
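To make (6) – (9) concrete, here is a small sketch in Python for a hypothetical 3-state system (the rates and probabilities below are made up for illustration). It builds Q from the vi and rij and checks two structural facts implicit in (8) and (9): every row of Q sums to 0 (since the rij in each row of R sum to 1), and every row of R is a probability distribution with 0 on the diagonal.

```python
# Hypothetical 3-state system: made-up rates, for illustration only.
v = [2.0, 1.0, 0.5]              # v[i] = rate of leaving state i, see (3)-(4)
R = [[0.0, 0.7, 0.3],            # R[i][j] = rij = Pr{next state is j}, see (5)
     [0.4, 0.0, 0.6],            # note rii = 0: a transition always changes state
     [0.5, 0.5, 0.0]]

N = len(v)
# Generator matrix (8): qij = rij * vi off the diagonal, qii = -vi on it.
Q = [[R[i][j] * v[i] if i != j else -v[i] for j in range(N)] for i in range(N)]

row_sums_Q = [sum(row) for row in Q]   # each should be 0
row_sums_R = [sum(row) for row in R]   # each should be 1
```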
We assume the Ti and Zi are all independent.
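These assumptions also say exactly how to simulate the process: after entering state i, wait an exponential time with rate vi, then jump to state j with probability rij, all holding times and jumps independent. A minimal sketch (hypothetical rates; states are numbered 0, 1, 2 here rather than 1, 2, 3):

```python
import random

def simulate(v, R, i0, t_end, rng):
    """One trajectory up to time t_end; returns the list of (jump time, state).

    v[i] = rate of leaving state i; R[i][j] = rij = Pr{next state is j}.
    """
    t, i = 0.0, i0
    path = [(0.0, i0)]
    while True:
        t += rng.expovariate(v[i])                        # holding time Ti ~ Exp(vi)
        if t >= t_end:
            return path
        i = rng.choices(range(len(v)), weights=R[i])[0]   # next state Zi
        path.append((t, i))

rng = random.Random(0)                                    # fixed seed for repeatability
v = [2.0, 1.0, 0.5]                                       # made-up rates
R = [[0.0, 0.7, 0.3], [0.4, 0.0, 0.6], [0.5, 0.5, 0.0]]
path = simulate(v, R, 0, 10.0, rng)
```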
Under these assumptions the X(t) have two important properties that are the continuous time analogues of
properties of Markov chains in discrete time. The first one is called the Markov property, which says that
the probabilities of future events don't depend on the past once the present is given. This condition can be
stated precisely as follows.
(10)
Pr{X(t) = j | X(s) = i, X(r1) = k1, X(r2) = k2, ... , X(rn) = kn} = Pr{X(t) = j | X(s) = i}
whenever r1 < r2 < ... < rn < s < t. Stochastic processes satisfying (10) are called Markov processes.
The second property says that the system is homogeneous in time, which means that the transition
probabilities depend only on the elapsed time and not on the starting time, i.e.
(11)
Pr{X(s+t) = j | X(s) = i} = Pr{X(t) = j | X(0) = i}
for all i, j, s, t. Let
pij(t) = Pr{X(t) = j | X(0) = i}
= probability of going from state i to state j in time t
P(t) =  [ p11(t)   p12(t)   ...  p1N(t) ]
        [ p21(t)   p22(t)   ...  p2N(t) ]
        [ ...                           ]
        [ pN1(t)   pN2(t)   ...  pNN(t) ]
     = transition matrix
Note that (10) and (11) combined say
(12)
Pr{X(s + t) = j | X(s) = i, X(r1) = k1, X(r2) = k2, ... , X(rn) = kn} = pij(t)
Example 1. In the context of the office copier in Example 1 of section 4.1, express the probability that the
copier is broken Thursday at 4 p.m. given that it is in good condition Wednesday at 12 noon and in poor
condition Monday at 10 a.m. in terms of the pij(t).
Solution. If we assume t = 0 is Monday at 8 a.m. then
Pr{ Broken Thursday at 4 p.m. | Good condition Wednesday at 12 noon, Broken Monday at 10 a.m.}
= Pr{X(3.8) = 3 | X(2.4) = 1, X(0.2) = 2} = p13(1.4)
where we have used (12).
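The time values here are consistent with time being measured in 10-hour working days starting at 8 a.m. (so Monday at 10 a.m. is t = 2/10 = 0.2); that convention is inferred from the numbers, not stated here. Under that assumption, the conversion can be sketched as:

```python
# Convert (day, clock hour) to t in working days, assuming a 10-hour working
# day starting at 8 a.m., with t = 0 on Monday at 8 a.m.  (This convention is
# inferred from the values 0.2, 2.4, 3.8 used in the example.)
DAY = {"Mon": 0, "Tue": 1, "Wed": 2, "Thu": 3, "Fri": 4}

def t_of(day, hour):          # hour on the 24-hour clock
    return DAY[day] + (hour - 8) / 10

t_broken = t_of("Thu", 16)    # Thursday 4 p.m.  -> 3.8
t_good   = t_of("Wed", 12)    # Wednesday noon   -> 2.4
t_poor   = t_of("Mon", 10)    # Monday 10 a.m.   -> 0.2
elapsed  = t_broken - t_good  # 1.4, the argument of p13 in the answer
```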
By using the Markov property repeatedly one can prove the following.
Proposition 1. If t0 < t1 < … < tn then
Pr{X(t1) = j1, X(t2) = j2, …, X(tn) = jn | X(t0) = j0} = pj0j1(t1 – t0) pj1j2(t2 – t1) … pjn-1jn(tn – tn-1)
Proof. For brevity, we only prove the case n = 2; the general case is similar. For n = 2, we need to show
Pr{X(t1) = j1, X(t2) = j2 | X(t0) = j0} = pj0j1(t1 – t0) pj1j2(t2 – t1)
One has
Pr{X(t1) = j1, X(t2) = j2 | X(t0) = j0} = Pr{X(t2) = j2 | X(t1) = j1, X(t0) = j0} Pr{X(t1) = j1 | X(t0) = j0}
By the Markov property Pr{X(t2) = j2 | X(t1) = j1, X(t0) = j0} = Pr{X(t2) = j2 | X(t1) = j1}, so
Pr{X(t1) = j1, X(t2) = j2 | X(t0) = j0} = Pr{X(t2) = j2 | X(t1) = j1} Pr{X(t1) = j1 | X(t0) = j0}
Using time invariance (11) this becomes
Pr{X(t1) = j1, X(t2) = j2 | X(t0) = j0} = pj0j1(t1 – t0) pj1j2(t2 – t1)
which is what we wanted to show. //
Example 2. In the context of the office copier in Example 1 of section 4.1, express the probability that the
copier is broken Tuesday at 10 a.m. and in poor condition Thursday at 4 p.m. given that it is in good
condition Monday at 12 noon in terms of the pij(t).
Solution. If we assume t = 0 is Monday at 8 a.m. then using Proposition 1 one has
Pr{Broken Tues at 10 a.m. & in poor condition Thursday at 4 p.m. | Good condition Monday at noon }
= Pr{X(1.2) = 3, X(3.8) = 2 | X(0.4) = 1} = p13(1.2 – 0.4) p32(3.8 – 1.2) = p13(0.8) p32(2.6)
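Proposition 1 can be sanity-checked by Monte Carlo simulation using only the assumptions of this section: generate trajectories with exponential holding times and embedded-chain jumps, estimate the joint probability on the left side, and compare it with the product of separately estimated factors on the right. A sketch with hypothetical rates (states numbered 0, 1, 2; by time homogeneity we may take t0 = 0):

```python
import random

rng = random.Random(1)                                # fixed seed for repeatability
v = [2.0, 1.0, 0.5]                                   # made-up leaving rates
R = [[0.0, 0.7, 0.3], [0.4, 0.0, 0.6], [0.5, 0.5, 0.0]]

def states_at(times, i0):
    """Simulate one trajectory from state i0; return its states at the
    given (sorted) times."""
    t, i, out, k = 0.0, i0, [], 0
    while k < len(times):
        t_next = t + rng.expovariate(v[i])            # holding time in state i
        while k < len(times) and times[k] < t_next:   # state i covers these times
            out.append(i)
            k += 1
        i = rng.choices(range(len(v)), weights=R[i])[0]
        t = t_next
    return out

n = 20000
# Left side with t0 = 0, t1 = 0.8, t2 = 3.4 and states j0, j1, j2 = 0, 2, 1:
joint = sum(states_at([0.8, 3.4], 0) == [2, 1] for _ in range(n)) / n
# Right side: p02(0.8) * p21(3.4 - 0.8), each factor estimated separately.
p02 = sum(states_at([0.8], 0) == [2] for _ in range(n)) / n
p21 = sum(states_at([2.6], 2) == [1] for _ in range(n)) / n
# joint and p02 * p21 agree up to Monte Carlo error.
```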
Proposition 2. P(0) = I
Proof. This follows from the fact that pii(0) = Pr{X(0) = i | X(0) = i} = 1 and
pij(0) = Pr{X(0) = j | X(0) = i} = 0 if j ≠ i. //
In sections 4.4 and 4.5 we shall discuss how to compute P(t) from the generator matrix Q. Before doing
this we look at an important property of the transition matrices, the Chapman-Kolmogorov equation, in the
next section.