Markov Chains
The overall picture …
• Markov Process
• Discrete Time Markov Chains
– Homogeneous and non-homogeneous Markov chains
– Transient and steady-state Markov chains
• Continuous Time Markov Chains
– Homogeneous and non-homogeneous Markov chains
– Transient and steady-state Markov chains
• Stochastic Process
• Markov Property
MARKOV PROCESS
What is “Discrete Time”?
[Figure: a timeline marked at the discrete instants t = 0, 1, 2, 3, 4]
Events occur at specific points in time.
What is a “Stochastic Process”?
State Space = {SUNNY, RAINY}
X(day i) = "S" or "R": a RANDOM VARIABLE that varies with the DAY

Day:        Day 1  Day 2  Day 3  Day 4  Day 5  Day 6  Day 7
            (THU)  (FRI)  (SAT)  (SUN)  (MON)  (TUE)  (WED)
X(day i):   "S"    "S"    "R"    "S"    "R"    "S"    "S"

{X(day i)} IS A STOCHASTIC PROCESS
X(day i): status of the weather observed each DAY
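A minimal Python sketch of this idea (the 50/50 draw per day is an assumption for illustration; the slides specify no distribution here): each X(day i) is a random variable, and the family of them indexed by the day is the stochastic process.

import random

# State space of the weather process
STATES = ["S", "R"]  # "S" = SUNNY, "R" = RAINY
days = ["THU", "FRI", "SAT", "SUN", "MON", "TUE", "WED"]

# X(day i): one random variable per day; the indexed collection is the process.
# Assumption: each day is drawn 50/50, purely to illustrate "a random variable
# that varies with the day".
X = {day: random.choice(STATES) for day in days}

for i, day in enumerate(days, start=1):
    print(f'X(day {i}) = "{X[day]}"  ({day})')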
Markov Processes
• Stochastic Process: X(t) is a random variable that varies with time.
A state of the process is a possible value of X(t).
• Markov Process: the future of a process does not depend on its past, only on its present.
• A Markov process is a stochastic (random) process in which the probability distribution of the next value is conditionally independent of the series of past values, given the present value, a characteristic called the Markov property.
• Markov property: the conditional probability distribution of future states of the process, given the present state and all past states, depends only upon the present state and not on any past states.
• Markov Chain: a discrete-time stochastic process with the Markov property.
What is “Markov Property”?
Pr X DAY 6  "S " | X DAY 5  " R ", X DAY 4  "S ",..., X DAY 1  "S " 
Pr X DAY 6  "S " | X DAY 5  " R "
PAST EVENTS
NOW
X day 2  "S "
X day 1  "S "
FUTURE EVENTS
X day 4  "S "
X day 3  " R "
X day 5  " R "
?
Probability of “R” in DAY6 given all previous states
Probability of “S” in DAY6 given all previous states
Day
Day 1
THU
Day 2
FRI
Day 3
SAT
Day 4
SUN
Day 5
MON
Day 6
TUE
Day 7
WED
Markov Property: the probability that it will be SUNNY on DAY 6 (FUTURE),
given that it is RAINY on DAY 5 (NOW), is independent of the PAST EVENTS.
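A small sketch of what the property buys us computationally (the transition probabilities are borrowed from the weather example later in this deck; using them here is our assumption): to predict day 6 we only need the state on day 5, never the full history.

# Pr{tomorrow | today}; values taken from the two-state weather example below.
P = {
    "S": {"S": 0.7, "R": 0.3},
    "R": {"S": 0.6, "R": 0.4},
}

history = ["S", "S", "R", "S", "R"]  # X(day 1) ... X(day 5), as in the figure

# Markov property: the prediction for day 6 uses only history[-1].
prob_sunny_day6 = P[history[-1]]["S"]
print(prob_sunny_day6)  # 0.6, the same for ANY history ending in "R"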
Notation
• t_k or k: discrete time
• X(t_k) or X_k: the stochastic process at time t_k or k
• X(t_k) = x_k or X_k = x_k: value of the stochastic process at instant t_k or k
Discrete Time Markov Chains (DTMC)
MARKOV CHAIN
Markov Processes
• Markov Process: the future of a process does not depend on its past, only on its present:
Pr{X(t_{k+1}) = x_{k+1} | X(t_k) = x_k, ..., X(t_0) = x_0}
= Pr{X(t_{k+1}) = x_{k+1} | X(t_k) = x_k}
• Since we are dealing with “chains”, X(t_i) = X_i can take discrete values from a finite or a countably infinite set.
• The possible values of X_i form a countable set S called the state space of the chain.
• For a Discrete-Time Markov Chain (DTMC), the notation is also simplified to
Pr{X_{k+1} = x_{k+1} | X_k = x_k, ..., X_0 = x_0} = Pr{X_{k+1} = x_{k+1} | X_k = x_k}
where X_k is the value of the state at the k-th step.
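A minimal DTMC simulator under this definition (a sketch; the state names and probabilities in the example chain are placeholders, not from the slides): the next state is sampled from a distribution that depends only on the current state, which is exactly the simplified condition above.

import random

def simulate_dtmc(P, x0, steps):
    """Simulate X_0, X_1, ..., X_steps of a DTMC.

    P[i][j] = Pr{X_{k+1} = j | X_k = i}; the draw at each step uses only
    the current state x, never the earlier path (the Markov property).
    """
    x, path = x0, [x0]
    for _ in range(steps):
        states, probs = zip(*P[x].items())
        x = random.choices(states, weights=probs)[0]
        path.append(x)
    return path

# Placeholder two-state chain (probabilities assumed for illustration).
P = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.5, 1: 0.5}}
print(simulate_dtmc(P, x0=0, steps=10))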
General Model of a Markov Chain
[Figure: a three-state transition diagram with states S0, S1, S2 and arcs
labelled p00, p01, p10, p11, p12, p21, p22, p20.]
• S = {S0, S1, S2}: State Space
• i or S_i: State i
• Discrete Time (Slotted Time): time ∈ {t_0, t_1, t_2, ..., t_k} = {0, 1, 2, ..., k}
• p_ij: Transition Probability from State S_i to State S_j
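One way to write this general model down in code (a sketch; the numeric values are made up, only the arc structure p_ij comes from the diagram): a row-stochastic matrix whose entry (i, j) is p_ij.

import numpy as np

# P[i][j] = p_ij, the transition probability from state S_i to state S_j.
# The numbers are illustrative; the diagram only names the arcs.
P = np.array([
    [0.7, 0.3, 0.0],   # p00, p01, p02 (the diagram draws no S0 -> S2 arc)
    [0.1, 0.6, 0.3],   # p10, p11, p12
    [0.4, 0.2, 0.4],   # p20, p21, p22
])

# Each row is a probability distribution over the next state.
assert np.allclose(P.sum(axis=1), 1.0)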
Example of a Markov Process
A very simple weather model
[Figure: two-state transition diagram.
SUNNY → SUNNY: p_SS = 0.7; SUNNY → RAINY: p_SR = 0.3;
RAINY → RAINY: p_RR = 0.4; RAINY → SUNNY: p_RS = 0.6]
State Space: S = {SUNNY, RAINY}
• If today is sunny, what is the probability of SUNNY weather after 1 week?
• If today is rainy, what is the probability of staying rainy for 3 days?
Problem: determine the transition probabilities from one state to another after n steps.
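A sketch of how both questions can be answered numerically for this chain (using NumPy; the matrix entries come straight from the diagram above): the n-step transition probabilities are the entries of the n-th matrix power, and "stay rainy" has only one path, so it multiplies p_RR along the way.

import numpy as np

# Transition matrix from the diagram; state order: [SUNNY, RAINY].
P = np.array([
    [0.7, 0.3],   # p_SS, p_SR
    [0.6, 0.4],   # p_RS, p_RR
])

# Q1: sunny today -> probability of SUNNY after 1 week (7 steps).
# The n-step transition probabilities are the entries of P^n.
P7 = np.linalg.matrix_power(P, 7)
print(f"Pr(sunny in 7 days | sunny today) = {P7[0, 0]:.4f}")

# Q2: rainy today -> probability of staying rainy for the next 3 days.
# Only one path stays in RAINY the whole time, so this is p_RR ** 3.
# (Reading "for 3 days" as the next three days is an assumption.)
print(f"Pr(rainy for next 3 days | rainy today) = {0.4 ** 3:.4f}")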