2.5 Hitting Times
2.5.1 Hitting Times
Sometimes we want to know things like
What is the probability that the system will have been in a certain state by a certain time?
or
What is the probability that the system will be in a certain state for the first time at a certain time?
Here is an example of the first type of question.
Example 1. (see Example 1 of section 2.1) At the start of each day an office copier is checked and its condition is classified as either good (1), poor (2) or broken (3). Suppose the transition matrix from one state to another in the course of a day is

(1)

P = [ 0.8   0.06  0.14 ]
    [ 0     0.4   0.6  ]
    [ 0.32  0.08  0.6  ]
We might want to know the following:
Suppose the copier was in good condition at the start of the day today. What is the probability that
the copier will be broken at the start of the day in at least one of the next five days?
There are two ways one can represent this probability symbolically. One way is with the random variables
Xn that define the process. Then the probability that the copier will be broken at the start of the day in at
least one of the next five days given that it is in good condition today is
Pr{X1 = 3 or X2 = 3 or X3 = 3 or X4 = 3 or X5 = 3 | X0 = 1}
Another way to represent this probability symbolically is with what is called the hitting time (or first
passage time) for a state. If j is a state then
(2)
T(j) = first time (greater than or equal to one) that the system is in state j
= hitting time for state j
= time to reach j (or return to j if you start there)
= first passage time for state j
Note that T(j) is a random variable: it depends on the values of X0, X1, … If the system never returns to state j then we define T(j) = ∞. In terms of hitting times, the probability that the copier will be broken at the start of the day in at least one of the next five days given that it is in good condition today is

Pr{T(3) ≤ 5 | X0 = 1}
Questions regarding hitting times occur frequently, so notation has been developed for their probabilities.
We let
(3)
Fij(n) = Pr{T(j) ≤ n | X0 = i}
If we fix i and j and let n vary then Fij(n) is the cumulative distribution of T(j) given that we start in state i. In
terms of Fij(n) the probability that the copier will be broken at the start of the day in at least one of the next
five days given that it is in good condition today is F13(5). There are various ways one can compute the Fij(n).
One way is to create another Markov chain which is the same as the original one except that once we reach state j we stay there. In Example 1, if we want to compute the probability that the copier will be broken at the start of the day in at least one of the next five days, then we would create another Markov chain that is the same as the original one except that once the copier is broken it stays broken. This does not mean that the copier is never fixed in reality; rather, the modified state means the copier is broken now or has been broken at some time prior to the present. The states of the modified chain are "good condition now and never broken" (1), "poor condition now and never broken" (2), and "broken now or at some prior time" (3). [Transition diagram omitted: state 3 is absorbing, with a self-loop of probability 1.] The transition matrix is

(4)

Q = [ 0.8  0.06  0.14 ]
    [ 0    0.4   0.6  ]
    [ 0    0     1    ]
If we call this new Markov chain Yn, then Pr{T(3) ≤ 5 | X0 = 1} = Pr{Y5 = 3 | Y0 = 1} = (Q5)13. In section 2.5.2 we compute Q5, which is
Q5 = [ 0.328  0.048  0.625 ]
     [ 0      0.010  0.990 ]
     [ 0      0      1     ]
So the probability that the copier will be broken at the start of the day in at least one of the next five days
given that it is in good condition today is (Q5)13 = 0.625.
Using the eigenvalue technique discussed in section 2.2.1, one can find a formula for Qn. This is done in section 2.5.2. Using that, one obtains Pr{T(3) ≤ n | X0 = 1} = F13(n) = (Qn)13 = 1 + 0.15(0.4)^n – 1.15(0.8)^n. As expected, Pr{T(3) ≤ n | X0 = 1} = F13(n) → 1 as n → ∞.
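These computations are easy to check numerically. Here is a quick NumPy sketch (the variable names are illustrative) that reproduces (Q5)13 and the closed-form expression for F13(n):

```python
import numpy as np

# Modified chain from (4): once the copier is broken (state 3) it stays broken.
Q = np.array([[0.8, 0.06, 0.14],
              [0.0, 0.40, 0.60],
              [0.0, 0.00, 1.00]])

Q5 = np.linalg.matrix_power(Q, 5)
F13_5 = Q5[0, 2]          # (Q5)13 = Pr{T(3) <= 5 | X0 = 1}
print(round(F13_5, 3))    # 0.625

# Closed form from the eigenvalue technique: F13(n) = 1 + 0.15(0.4)^n - 1.15(0.8)^n
f = lambda n: 1 + 0.15 * 0.4**n - 1.15 * 0.8**n
assert abs(f(5) - F13_5) < 1e-9
```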
There is one case where we have to make a slight modification in the above procedure. This is when the
starting state and ending state are the same.
Example 2. Consider the office copier in Example 1. Suppose the copier was broken at the start of the day
today. What is the probability that the copier will be broken again at the start of the day in at least one of
the next five days? In terms of the hitting time T(3) we want to know Pr{T(3) ≤ 5 | X0 = 3}.
It is not (Q5)33 with Q as in (4). The reason is that (Q5)33 = 1, since one is stuck in state 3 once one gets there. There are several ways to determine Pr{T(3) ≤ 5 | X0 = 3}.
One way is to use the conditional additivity formula (Proposition 1 in section 1.3.6) to divide Pr{T(3) ≤ 5 | X0 = 3} into three pieces depending on the first transition, i.e.

Pr{T(3) ≤ 5 | X0 = 3} = Pr{T(3) ≤ 5, X1 = 1 | X0 = 3} + Pr{T(3) ≤ 5, X1 = 2 | X0 = 3} + Pr{T(3) ≤ 5, X1 = 3 | X0 = 3}

Consider the first term on the right of the equal sign. Using the conditional intersection formula (Proposition 2 in section 1.3.6) one has

Pr{T(3) ≤ 5, X1 = 1 | X0 = 3} = Pr{T(3) ≤ 5 | X1 = 1, X0 = 3} Pr{X1 = 1 | X0 = 3}
                              = Pr{T(3) ≤ 5 | X1 = 1, X0 = 3} p31
Note that {T(3) ≤ 5} = ∪_{(i1,...,i5) ∈ E} {X1 = i1, ..., X5 = i5}, where E = {(i1, ..., i5): at least one of the ij is 3}. In the same fashion {T(3) ≤ 5, X1 = 1, X0 = 3} = ∪_{(i2,...,i5) ∈ F} {X2 = i2, ..., X5 = i5, X1 = 1, X0 = 3}, where F = {(i2, ..., i5): at least one of the ij is 3}. So

Pr{T(3) ≤ 5 | X1 = 1, X0 = 3} = Pr{∪_{(i2,...,i5) ∈ F} {X2 = i2, ..., X5 = i5} | X1 = 1, X0 = 3}

Using the extended Markov property ((12) in Proposition 1 in section 2.1.3), this can be written as

Pr{T(3) ≤ 5 | X1 = 1, X0 = 3} = Pr{∪_{(i2,...,i5) ∈ F} {X1 = i2, ..., X4 = i5} | X0 = 1}
                              = Pr{∪_{(i1,...,i4) ∈ F} {X1 = i1, ..., X4 = i4} | X0 = 1}

However, by the same argument one has

Pr{T(3) ≤ 4 | X0 = 1} = Pr{∪_{(i1,...,i4) ∈ F} {X1 = i1, ..., X4 = i4} | X0 = 1}

So

Pr{T(3) ≤ 5 | X1 = 1, X0 = 3} = Pr{T(3) ≤ 4 | X0 = 1} = (Q4)13
and
Pr{T(3) ≤ 5, X1 = 1 | X0 = 3} = p31(Q4)13

By the same argument

Pr{T(3) ≤ 5, X1 = 2 | X0 = 3} = p32(Q4)23

For Pr{T(3) ≤ 5, X1 = 3 | X0 = 3} one has

Pr{T(3) ≤ 5, X1 = 3 | X0 = 3} = Pr{T(3) ≤ 5 | X1 = 3, X0 = 3} Pr{X1 = 3 | X0 = 3}
                              = Pr{T(3) ≤ 5 | X1 = 3, X0 = 3} p33 = p33 = p33(Q4)33

This is because Pr{T(3) ≤ 5 | X1 = 3, X0 = 3} = 1, since T(3) ≤ 5 if X1 = 3, and (Q4)33 = 1 since state 3 is absorbing in Q. Putting this all together gives

Pr{T(3) ≤ 5 | X0 = 3} = p31(Q4)13 + p32(Q4)23 + p33(Q4)33 = (PQ4)33
When one computes PQ4 one obtains

PQ4 = [ 0.328  0.048  0.625 ]
      [ 0      0.010  0.990 ]
      [ 0.131  0.020  0.848 ]
So the probability that the copier will be broken at the start of the day in at least one of the next five days
given that it is broken today is (PQ4)33 = 0.848.
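The first-transition decomposition above reduces to a single matrix product, which can be sketched numerically with NumPy (variable names are illustrative):

```python
import numpy as np

# Original chain from (1) and modified chain from (4).
P = np.array([[0.8, 0.06, 0.14],
              [0.0, 0.40, 0.60],
              [0.32, 0.08, 0.60]])
Q = np.array([[0.8, 0.06, 0.14],
              [0.0, 0.40, 0.60],
              [0.0, 0.00, 1.00]])

# Pr{T(3) <= 5 | X0 = 3} = (P Q^4)_{33}
PQ4 = P @ np.linalg.matrix_power(Q, 4)
print(round(PQ4[2, 2], 3))   # 0.848
```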
Another way to look at this computation is as follows. We make an additional modification to the Markov chain above by adding a special state (let's number it state 4) corresponding to being broken to begin with. The states are "good condition now and never broken" = 1, "poor condition now and never broken" = 2, "broken now or at some prior time" = 3, and "start broken" = 4. [Transition diagram omitted: state 4 has the same outgoing transition probabilities as the broken state in the original chain.] The transition matrix is

U = [ 0.8   0.06  0.14  0 ]
    [ 0     0.4   0.6   0 ]
    [ 0     0     1     0 ]
    [ 0.32  0.08  0.6   0 ]

In terms of U the probability that the copier will be broken at the start of the day in at least one of the next five days given that it is broken today is (U5)43. If one computes U5 one obtains

U5 = [ 0.328  0.048  0.625  0 ]
     [ 0      0.010  0.990  0 ]
     [ 0      0      1      0 ]
     [ 0.131  0.020  0.848  0 ]
So the probability that the copier will be broken at the start of the day in at least one of the next five days
given that it is broken today is (U5)43 = 0.848.
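The 4-state construction can also be sketched in NumPy, and it gives the same answer as the (PQ4)33 computation:

```python
import numpy as np

# Four-state chain: 1, 2 as before, 3 = "broken now or at some prior time",
# 4 = "start broken" (same outgoing probabilities as the original broken state).
U = np.array([[0.8,  0.06, 0.14, 0.0],
              [0.0,  0.40, 0.60, 0.0],
              [0.0,  0.00, 1.00, 0.0],
              [0.32, 0.08, 0.60, 0.0]])

U5 = np.linalg.matrix_power(U, 5)
print(round(U5[3, 2], 3))   # 0.848, agreeing with (P Q^4)_{33}
```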
Here is an example of finding the probability that the system will be in a certain state for the first time at a
certain time.
Example 3. Consider the office copier in Example 1. What is the probability that the copier will be broken for the first time five days from now, given that it is in good condition today?
In symbols this is
Pr{X1 ≠ 3, X2 ≠ 3, X3 ≠ 3, X4 ≠ 3, X5 = 3 | X0 = 1} = Pr{T(3) = 5 | X0 = 1} = f13(5)
where
(5)

fij(n) = Pr{T(j) = n | X0 = i}
If we fix i and j and let n vary then fij(n) is the pmf of T(j) given that we start in state i. There are various
ways one can compute the fij(n). One way is simply

fij(n) = Fij(n) - Fij(n-1)   if n ≥ 2
fij(n) = Fij(1)              if n = 1

which is the general relation between the pmf and the cumulative distribution function for a random variable.
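This pmf/cdf relation can be sketched for the copier chain, using F13(n) = (Qn)13 from the absorbing-state chain in (4) (a NumPy sketch with illustrative names):

```python
import numpy as np

# Modified chain from (4): state 3 absorbing.
Q = np.array([[0.8, 0.06, 0.14],
              [0.0, 0.40, 0.60],
              [0.0, 0.00, 1.00]])

def F13(n):
    """Cdf of the hitting time: F13(n) = (Q^n)_{13}."""
    return np.linalg.matrix_power(Q, n)[0, 2]

def f13(n):
    """Pmf of the hitting time via f13(n) = F13(n) - F13(n-1)."""
    return F13(n) if n == 1 else F13(n) - F13(n - 1)

print(round(f13(5), 4))   # 0.0919
```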
So, in Example 3, one has

Pr{T(3) = 5 | X0 = 1} = f13(5) = F13(5) - F13(4) = (Q5)13 - (Q4)13 = 0.0919

Another way to answer this is to make another modification of the Markov chain in Example 1. We add another state which we label "broken for the first time on some previous day"; let's call this state 4. We go to this state after we hit the broken state, state 3, which now means "broken for the first time now". The states are "good condition now and never broken" = 1, "poor condition now and never broken" = 2, "broken for the first time now" = 3, and "broken for the first time on some previous day" = 4. [Transition diagram omitted: state 3 goes to state 4 with probability 1, and state 4 is absorbing.] The transition matrix is

S = [ 0.8  0.06  0.14  0 ]
    [ 0    0.4   0.6   0 ]
    [ 0    0     0     1 ]
    [ 0    0     0     1 ]
If we call this new Markov chain Zn, then Pr{T(3) = 5 | X0 = 1} = Pr{Z5 = 3 | Z0 = 1} = (S5)13 = 0.0919. In fact, one has

S5 = [ 0.328  0.0476  0.0919  0.533 ]
     [ 0      0.0102  0.0154  0.974 ]
     [ 0      0       0       1     ]
     [ 0      0       0       1     ]
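One can check (S5)13 numerically as well; here is a short NumPy sketch:

```python
import numpy as np

# Four-state chain: 3 = "broken for the first time now" leads to the
# absorbing state 4 = "broken for the first time on some previous day".
S = np.array([[0.8, 0.06, 0.14, 0.0],
              [0.0, 0.40, 0.60, 0.0],
              [0.0, 0.00, 0.00, 1.0],
              [0.0, 0.00, 0.00, 1.0]])

S5 = np.linalg.matrix_power(S, 5)
print(round(S5[0, 2], 4))   # 0.0919, matching f13(5) = F13(5) - F13(4)
```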