Lecture 11 – Stochastic Processes
Topics
• Definitions
• Review of probability
• Realization of a stochastic process
• Continuous vs. discrete systems
• Examples
• Classification scheme
8/14/04
J. Bard and J. W. Barnes
Operations Research Models and Methods
Copyright 2004 - All rights reserved
Basic Definitions
Stochastic process: System that changes over time in
an uncertain manner
State: Snapshot of the system at some fixed point in
time
Transition: Movement from one state to another
Examples
• Automated teller machine (ATM)
• Printed circuit board assembly operation
• Runway activity at airport
Elements of Probability Theory
Experiment: Any situation where the outcome is uncertain.
Sample Space, S: All possible outcomes of an experiment (we
will call it “state space”).
Event: Any collection of outcomes (points) in the sample
space. A collection of events E1, E2,…,En is said to be
mutually exclusive if Ei ∩ Ej = ∅ for all i ≠ j = 1,…,n.
Random Variable: Function or procedure that assigns
a real number to each outcome in the sample space.
Cumulative Distribution Function (CDF), F(·): Probability
distribution function for the random variable X such that
F(a) = Pr{X ≤ a}.
Model Components (continued)
Time: Either continuous or discrete parameter.
[Timeline figure: event epochs t0, t1, t2, t3, t4]
State: Describes the attributes of a system at some point in time.
s = (s1, s2, . . . , sv); for ATM example s = (n)
Convenient to assign a unique nonnegative integer index to each possible
value of the state vector. We call this index X, so each s corresponds to a
unique value of X.
For ATM example, X = n.
In general, Xt is a random variable.
Activity: Takes some amount of time – duration.
Culminates in an event.
For ATM example → service completion.
Transition: Caused by an event and results in movement
from one state to another. For ATM example,
[State-transition network for the ATM: states 0, 1, 2, 3 connected by arrival (a) and departure (d) transitions]
Stochastic Process: A collection of random variables {Xt},
where t ∈ T = {0, 1, 2, . . .}.
Markovian Property
Given that the present state is known, the conditional probability of
the next state is independent of the states prior to the present state.
Present state at time t is i: Xt = i
Next state at time t + 1 is j: Xt+1 = j
Conditional Probability Statement of Markovian Property:
Pr{Xt+1 = j | X0 = k0, X1 = k1, …, Xt–1 = kt–1, Xt = i} = Pr{Xt+1 = j | Xt = i}
for t = 0, 1,…, and all possible sequences i, j, k0, k1, . . . , kt–1.
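To make the Markovian property concrete, here is a minimal Python sketch (not part of the original slides) that simulates a chain whose next state is drawn using only the current state's row of a transition matrix; the matrix is the 3-state example that appears later in this lecture, with states relabeled 0, 1, 2.

```python
import numpy as np

# 3-state transition matrix from the lecture's "simple Markov chain" example.
P = np.array([[0.6, 0.3, 0.1],
              [0.8, 0.2, 0.0],
              [1.0, 0.0, 0.0]])

def simulate(P, x0, steps, seed=0):
    """Return one realization X_0, X_1, ..., X_steps of the chain."""
    rng = np.random.default_rng(seed)
    path = [x0]
    for _ in range(steps):
        i = path[-1]                                   # the present state X_t = i
        path.append(int(rng.choice(len(P), p=P[i])))   # prior states are ignored
    return path

print(simulate(P, x0=0, steps=10))
```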
Realization of the Process
Deterministic Process
Time between arrivals: Pr{ta ≤ τ} = 0, τ < 1 min; = 1, τ ≥ 1 min
(arrivals occur every minute).
Time for servicing customer: Pr{ts ≤ τ} = 0, τ < 0.75 min; = 1, τ ≥ 0.75 min
(processing takes exactly 0.75 minutes).
[Plot: number in system, n, versus time from 0 to 10 minutes (no transient response)]
Realization of the Process (continued)
Stochastic Process
Time for servicing customer: Pr{ts ≤ τ} = 0, τ < 0.75 min;
= 0.6, 0.75 ≤ τ < 1.5 min; = 1, τ ≥ 1.5 min
[Plot: number in system, n, versus time from 0 to 12 minutes, with arrival (a) and departure (d) events marked]
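A minimal simulation sketch of the realization above, assuming the first arrival occurs at time 0, deterministic interarrival times of 1 minute, and the two-point service-time distribution implied by the CDF (0.75 min with probability 0.6, 1.5 min with probability 0.4); the function name and horizon are ours.

```python
import random

def realize(horizon=12.0, seed=1):
    """One realization of the single-server queue: arrivals every 1 min,
    service time 0.75 min w.p. 0.6 and 1.5 min w.p. 0.4."""
    random.seed(seed)
    t, n = 0.0, 0                                   # clock and number in system
    next_arrival, next_departure = 0.0, float("inf")
    history = []
    while t < horizon:
        if next_arrival <= next_departure:          # next event is an arrival
            t, n = next_arrival, n + 1
            next_arrival += 1.0
            if n == 1:                              # server was idle: start service
                next_departure = t + (0.75 if random.random() < 0.6 else 1.5)
        else:                                       # next event is a departure
            t, n = next_departure, n - 1
            next_departure = (t + (0.75 if random.random() < 0.6 else 1.5)
                              if n > 0 else float("inf"))
        history.append((round(t, 2), n))
    return history

for t, n in realize():
    print(f"t = {t:5.2f}  n = {n}")
```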
Birth and Death Processes
Pure Birth Process; e.g., Hurricanes
[Network: 0 → 1 → 2 → 3 → 4 → … with birth transitions a0, a1, a2, a3, …]
Pure Death Process; e.g., Delivery of a truckload of parcels
[Network: … → 4 → 3 → 2 → 1 → 0 with death transitions d4, d3, d2, d1]
Birth-Death Process; e.g., Repair shop for taxi company
[Network: states 0, 1, 2, 3, 4, … with birth transitions a0, a1, a2, a3, … and death transitions d1, d2, d3, d4, …]
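A short sketch of how a birth-death realization can be generated: from state n, the next birth and the next death are competing exponential clocks and the earlier one fires. The specific rate functions below are made-up placeholders, not values from the lecture.

```python
import random

def birth_death_path(birth_rate, death_rate, max_state, horizon=20.0, seed=2):
    """Simulate a birth-death process with state-dependent rates."""
    random.seed(seed)
    t, n, path = 0.0, 0, [(0.0, 0)]
    while t < horizon:
        b = birth_rate(n) if n < max_state else 0.0
        d = death_rate(n) if n > 0 else 0.0
        if b + d == 0.0:
            break
        t_birth = random.expovariate(b) if b > 0 else float("inf")
        t_death = random.expovariate(d) if d > 0 else float("inf")
        if t_birth < t_death:
            t, n = t + t_birth, n + 1            # birth: move up one state
        else:
            t, n = t + t_death, n - 1            # death: move down one state
        path.append((round(t, 2), n))
    return path

# Example: constant birth rate, deaths proportional to the state
# (a taxi-repair-shop flavor, with made-up numbers).
print(birth_death_path(lambda n: 1.0, lambda n: 0.5 * n, max_state=10))
```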
Queueing Systems
Queue Discipline: Order in which customers are served; FIFO,
LIFO, Random, Priority
Five Field Notation:
Arrival distribution / Service distribution / Number of servers /
Maximum number in the system / Number in the calling population
Queueing Notation
Distributions (interarrival and service times)
M = Exponential
D = Constant time
Ek = Erlang
GI = General independent (arrivals only)
G = General
Parameters
s = number of servers
K = Maximum number in system
N = Size of calling population
Characteristics of Queues
Infinite queue: e.g., Mail order company (GI/G/s)
[Network: states 0, 1, 2, …, s–1, s, s+1, … with arrival transitions a and service transitions d, 2d, …, sd, sd, …]
Finite queue: e.g., Airline reservation system (M/M/s/K)
[Network near state K for the finite queue: (a) customer arrives at K but then leaves; (b) no more arrivals after K]
Characteristics of Queues (continued)
Finite input source: e.g., Repair shop for trucking firm (N vehicles)
with s service bays and limited capacity parking lot (K – s spaces).
Each repair takes 1 day (GI/D/s/K/N).
[Network: states 0, 1, 2, …, s–1, s, s+1, …, K–1, K with arrival transitions Ka, (K–1)a, …, (K–s+1)a, (K–s)a, …, a and service transitions d, 2d, …, sd]
In this diagram N = K, so we have a GI/D/s/K/K system.
Examples of Stochastic Processes
Service Completion Triggers an Arrival: e.g., multistage
assembly process with single worker, no queue.
[State-transition network: states 0, 1, 2, 3, 4, 5 with arrival transition a and operation completions d1, d2, d3, d4, d5]
state = 0, worker is idle
state = k, worker is performing operation k = 1, . . . , 5
Examples (continued)
Multistage assembly process with single worker with queue.
(Assume 3 stages only)
s = (s1, s2), where s1 = number of parts in system and
s2 = current operation being performed.
[State-transition network: states (0,0), (1,1), (1,2), (1,3), (2,1), (2,2), (2,3), (3,1), (3,2), (3,3), … with arrival transitions a and operation completions dk, k = 1, 2, 3]
Queueing Model with Two Servers, One Operation
s = (s1, s2, s3), where si = 0 if server i is idle, 1 if server i is busy
(i = 1, 2), and s3 = number in queue.
[System diagram: arrivals feed a queue served by two servers with completions d1 and d2.
State-transition network: states (0,0,0), (1,0,0), (0,1,0), (1,1,0), (1,1,1), (1,1,2), … with arrival transitions a and service completions d1, d2]
Series System with No Queues
[System diagram: Arrivals → server 1 → Transfer → server 2 → Transfer → server 3 → Finished]

Component    Notation               Definition
State        s = (s1, s2, s3)       si = 0 if server i is idle, 1 if server i is busy, for i = 1, 2, 3
Activities   Y = {a, d1, d2, d3}    a = arrival at operation 1; dj = completion of operation j for j = 1, 2, 3

State space: S = {(0,0,0), (1,0,0), . . . , (0,1,1), (1,1,1)}. The state space
consists of all possible binary vectors of 3 components.
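Since the state space is all binary vectors with 3 components, a two-line enumeration (illustrative only) is:

```python
from itertools import product

# All binary vectors (s1, s2, s3): one component per server, 0 = idle, 1 = busy.
S = list(product((0, 1), repeat=3))
print(S)   # [(0,0,0), (0,0,1), ..., (1,1,1)] -- 2**3 = 8 states
```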
Transitions for Markov Processes
Exponential interarrival and service times (M/M/s)
State space: S = {1, 2, . . .}
Probability of going from state i to state j in one move: pij
State-transition matrix
P = [ p11  p12  …  p1m
      p21  p22  …  p2m
       ⋮    ⋮        ⋮
      pm1  pm2  …  pmm ]

(rows and columns indexed by the states 1, 2, …, m)
Theoretical requirements: 0 ≤ pij ≤ 1, Σj pij = 1, i = 1,…,m
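A small helper (ours, not from the lecture) that checks these two requirements on a candidate state-transition matrix:

```python
def is_stochastic(P, tol=1e-9):
    """Check 0 <= p_ij <= 1 and that every row sums to 1."""
    return all(
        all(0.0 - tol <= p <= 1.0 + tol for p in row)
        and abs(sum(row) - 1.0) <= tol
        for row in P
    )

P = [[0.6, 0.3, 0.1],      # the lecture's 3-state example matrix
     [0.8, 0.2, 0.0],
     [1.0, 0.0, 0.0]]
print(is_stochastic(P))    # True
```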
Single Channel Queue – Two Kinds of Service
Bank teller: normal service (d), travelers checks (c), idle (i)
Let p = portion of customers who buy checks after normal service
s1 = number in system
s2 = status of teller, where s2 ∈ {i, d, c}
[State-transition network: states (0,i), (1,d), (2,d), (3,d), …, and (1,c), (2,c), (3,c), … with arrival transitions a, normal-service completions d (probability p to the check state, 1–p otherwise), and check-service completions c]
Part Processing with Rework
Consider a machining operation in which there is a 0.4 probability
that upon completion, a processed part will not be within tolerance.
Machine is in one of three states:
0 = idle, 1 = working on part for first time, 2 = reworking part.
[State-transition network: a takes (0) to (1); s1 takes (1) to (2) with probability 0.4 and (1) to (0) with probability 0.6; s2 takes (2) to (0)]
a = arrival; s1 = service completion from state 1; s2 = service completion from state 2
Markov Chains
• A discrete state space
• Markovian property for transitions
• One-step transition probabilities, pij, remain constant over
time (stationary)
Example: Game of Craps
Roll 2 dice: Win = 7 or 11; Lose = 2, 3, 12; otherwise the total (4, 5, 6, 8, 9,
or 10) is called the point and you roll again → win if you roll the point,
lose if you roll a 7; otherwise roll again, and so on.
(There are other possible bets not included here.)
State-Transition Network for Craps
[State-transition network: from Start, go to Win on (7, 11), to Lose on (2, 3, 12), or to point state P4, P5, P6, P8, P9, or P10 on the corresponding total. From point state Pk, go to Win on k, to Lose on 7, and stay at Pk on any other total ("not (k, 7)")]
Transition Matrix for Game of Craps
P (rows = current state, columns = next state):

        Start   Win    Lose   P4     P5     P6     P8     P9     P10
Start     0    0.222  0.111  0.083  0.111  0.139  0.139  0.111  0.083
Win       0    1      0      0      0      0      0      0      0
Lose      0    0      1      0      0      0      0      0      0
P4        0    0.083  0.167  0.75   0      0      0      0      0
P5        0    0.111  0.167  0      0.722  0      0      0      0
P6        0    0.139  0.167  0      0      0.694  0      0      0
P8        0    0.139  0.167  0      0      0      0.694  0      0
P9        0    0.111  0.167  0      0      0      0      0.722  0
P10       0    0.083  0.167  0      0      0      0      0      0.75
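The entries above follow directly from the dice probabilities. This sketch (ours) rebuilds the matrix exactly and can be used to verify the rounded values on the slide:

```python
from fractions import Fraction
from itertools import product

# Probability of each total when rolling two fair dice.
roll = {s: Fraction(sum(1 for a, b in product(range(1, 7), repeat=2) if a + b == s), 36)
        for s in range(2, 13)}

states = ["Start", "Win", "Lose", "P4", "P5", "P6", "P8", "P9", "P10"]
P = {s: {t: Fraction(0) for t in states} for s in states}

P["Win"]["Win"] = P["Lose"]["Lose"] = Fraction(1)      # absorbing states
P["Start"]["Win"] = roll[7] + roll[11]                 # 8/36 ~ 0.222
P["Start"]["Lose"] = roll[2] + roll[3] + roll[12]      # 4/36 ~ 0.111
for k in (4, 5, 6, 8, 9, 10):
    P["Start"][f"P{k}"] = roll[k]                      # establish the point
    P[f"P{k}"]["Win"] = roll[k]                        # make the point
    P[f"P{k}"]["Lose"] = roll[7]                       # seven out
    P[f"P{k}"][f"P{k}"] = 1 - roll[k] - roll[7]        # anything else: roll again

for s in states:
    print(s, [round(float(P[s][t]), 3) for t in states])
```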
State-Transition Network for
Simple Markov Chain
[State-transition network: states 1, 2, 3 with arc probabilities shown in parentheses, matching the matrix below]

P =
        1     2     3
  1   0.6   0.3   0.1
  2   0.8   0.2   0
  3   1     0     0
Classification of States
Accessible: Possible to go from state i to state j (path exists in
the network from i to j).
[Two state-transition networks over states 0, 1, 2, 3, 4, … with transitions a0, a1, a2, a3 and d1, d2, d3, d4, illustrating which states are accessible from which]
Two states communicate if both are accessible from each other. A
system is irreducible if all states communicate.
State i is recurrent if the system will return to it at some time in the
future after leaving it.
If a state is not recurrent, it is transient.
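Accessibility and communication can be checked mechanically with graph reachability. The sketch below (ours) finds communicating classes; the adjacency list is chosen to be consistent with the classification stated in Example 2 near the end of the lecture (states 0 and 1 communicate, 2 is absorbing, 3 and 4 are transient).

```python
def reachable(adj, i):
    """Set of states accessible from i (including i) by following arcs."""
    seen, stack = {i}, [i]
    while stack:
        u = stack.pop()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                stack.append(v)
    return seen

def communicating_classes(adj):
    """Group states that are accessible from each other."""
    reach = {i: reachable(adj, i) for i in adj}
    classes = []
    for i in adj:
        cls = {j for j in reach[i] if i in reach[j]}   # i and j communicate
        if cls not in classes:
            classes.append(cls)
    return classes

adj = {0: [0, 1], 1: [0, 1], 2: [2], 3: [2, 3], 4: [0]}
print(communicating_classes(adj))   # [{0, 1}, {2}, {3}, {4}]
```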
Classification of States (continued)
A state is periodic if it can only return to itself after a
fixed number of transitions greater than 1 (or multiple
of a fixed number).
A state that is not periodic is aperiodic.
[Figure: two state-transition networks; (a) each state visited every 3 iterations; (b) each state visited in multiples of 3 iterations]
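The period of a state can be computed by brute force for a small chain: take the gcd of the step counts n for which a return in exactly n steps has positive probability. A sketch (ours), applied to a 3-cycle like case (a):

```python
from math import gcd

def period(P, i, max_steps=50):
    """Period of state i: gcd of the step counts n at which return to i
    has positive probability (brute force over P^n for small chains)."""
    m = len(P)
    power = [row[:] for row in P]                    # power = P^1
    g = 0
    for n in range(1, max_steps + 1):
        if power[i][i] > 0:
            g = gcd(g, n)
        power = [[sum(power[r][k] * P[k][c] for k in range(m))
                  for c in range(m)] for r in range(m)]   # next power of P
    return g

# A 3-cycle 0 -> 1 -> 2 -> 0, so each state has period 3.
cycle = [[0, 1, 0],
         [0, 0, 1],
         [1, 0, 0]]
print(period(cycle, 0))    # 3
```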
Classification of States (continued)
An absorbing state is one that the system never leaves once it enters.
[State-transition network: states 0, 1, 2, 3, 4 with winning transitions a1, a2, a3 and losing transitions d1, d2, d3; states 0 and 4 have no outgoing arcs]
This diagram might represent the wealth of a gambler who
begins with $2 and makes a series of wagers for $1 each.
Let ai be the event of winning in state i and di the event of
losing in state i.
There are two absorbing states: 0 and 4.
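A minimal sketch (ours) of one realization of this gambler's walk; the win probability of 0.5 per wager is an assumption, since the slide does not specify it.

```python
import random

def gamble(wealth=2, goal=4, p_win=0.5, seed=3):
    """$1 bets until the wealth hits an absorbing state (0 = ruin, goal = target)."""
    random.seed(seed)
    path = [wealth]
    while 0 < wealth < goal:
        wealth += 1 if random.random() < p_win else -1
        path.append(wealth)
    return path

print(gamble())   # e.g., [2, 3, 2, 1, 0] -- absorbed at 0
```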
Classification of States (continued)
Class: set of states that communicate with each other.
A class is either all recurrent or all transient and may be either
all periodic or all aperiodic.
States in a transient class communicate only with each other so
no arcs enter any of the corresponding nodes in the network
diagram from outside the class. Arcs may leave, though,
passing from a node in the class to one outside.
[State-transition network over states 0, 1, 2, 3, 4, 5, 6 illustrating classes]
Illustration of Concepts
Example 1

State   0   1   2   3
  0     0   X   0   X
  1     X   0   0   0
  2     X   0   0   0
  3     0   0   X   X

[State-transition network over states 0, 1, 2, 3 corresponding to the table]
Every pair of states communicates, forming a single
recurrent class; however, the states are not periodic.
Thus the stochastic process is aperiodic and irreducible.
Illustration of Concepts
Example 2
State   0   1   2   3   4
  0     X   X   0   0   X
  1     X   X   0   0   0
  2     0   0   X   X   0
  3     0   0   0   X   0
  4     0   0   0   0   0

[State-transition network over states 0, 1, 2, 3, 4 corresponding to the table]
States 0 and 1 communicate and form a recurrent class.
States 3 and 4 form separate transient classes.
State 2 is an absorbing state and forms a recurrent class.
Illustration of Concepts
Example 3
State   0   1   2   3
  0     0   0   0   X
  1     X   0   0   0
  2     X   0   0   0
  3     0   X   X   0

[State-transition network over states 0, 1, 2, 3 corresponding to the table]
Every state communicates with every other state, so we have an irreducible
stochastic process.
Periodic? Yes, so the Markov chain is irreducible and periodic.
What You Should Know
about Stochastic Processes
• What a state is
• What a realization is (stationary vs.
transient)
• What the difference is between a
continuous and discrete-time system
• What the common applications are
• What a state-transition matrix is
• How systems are classified