Generalized Jackson Networks
(Approximate Decomposition Methods)
• The arrival and departure processes are approximated by
  renewal processes, and each station is analyzed individually
  as a GI/G/m queue.
• The performance measures (E[N], E[W]) are approximated
  from 4 parameters per station, {λ_0j, Ca²_0j, μ_j, Cs²_j}. (Don’t need
  the exact distributions.)
• References
  – Whitt, W. (1983a), “The Queueing Network Analyzer,” Bell System
    Technical Journal, 62, 2779–2815.
  – Whitt, W. (1983b), “Performance of the Queueing Network Analyzer,”
    Bell System Technical Journal, 62, 2817–2843.
  – Bitran, G. and R. Morabito (1996), “Open queueing networks:
    optimization and performance evaluation models for discrete
    manufacturing systems,” Production and Operations Management,
    5(2), 163–193.
Single-class GI/G/1 OQNS
Notation:
• n = number of internal stations in the network.
For each j, j = 1,…,n:
• λ_0j = expected external arrival rate at station j
  [λ_0j = 1/E(A_0j), where A_0j is the external interarrival time],
• Ca²_0j = squared coefficient of variation (scv), or variability, of the
  external interarrival time at station j
  [Ca²_0j = Var(A_0j)/E(A_0j)²],
• μ_j = expected service rate at station j [μ_j = 1/E(S_j)],
• Cs²_j = scv or variability of the service time at station j
  [Cs²_j = Var(S_j)/E(S_j)²].
Notation (cont.)
- For each pair (i,j), i = 1,…,n, j = 1,…,n:
pij = probability of a job going to station j after
completing service at station i.
*We assume no immediate feedback: p_ii = 0.
- For n nodes, the input consists of n² + 4n numbers.
• The complete decomposition is essentially
described in 3 steps
o Step 1. Analysis of interaction between stations of the
networks,
o Step 2. Evaluation of performance measures at each
station,
o Step 3. Evaluation of performance measures for the
whole network.
Step 1:
• Determine two parameters for each station j:
  (i) the expected arrival rate λ_j = 1/E[A_j], where A_j
      is the interarrival time at station j;
  (ii) the squared coefficient of variation (scv) or
      variability of the interarrival time, Ca²_j = Var(A_j)/E(A_j)².
• Consists of 3 sub-steps:
  (i) Superposition of arrival processes
  (ii) Overall departure process
  (iii) Departure splitting according to destinations
Step 1(i): Superposed arrival rate
• λ_j can be obtained from the traffic rate equations:

    λ_j = λ_0j + Σ_{i=1..n} λ_ij,   for j = 1,…,n,

  where λ_ij = p_ij λ_i is the expected arrival rate at
  station j from station i. We also get ρ_j = λ_j/μ_j,
  0 ≤ ρ_j < 1 (the utilization or offered load).
• The expected (external) departure rate to station 0
  from station j is given by λ_j0 = λ_j (1 − Σ_{i=1..n} p_ji).
• Throughput λ_0 = Σ_{j=1..n} λ_0j = Σ_{j=1..n} λ_j0.
• The expected number of visits v_j = E(K_j) = λ_j/λ_0.
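The traffic rate equations above are a small linear system and can be solved directly. The sketch below illustrates this for a hypothetical 3-station network; the routing matrix `P`, external rates `lam0`, and service rates `mu` are made-up numbers, not from the slides.

```python
import numpy as np

# Illustrative 3-station open network (all numbers are made up).
P = np.array([[0.0, 0.5, 0.3],     # p_ij: routing probabilities, p_ii = 0
              [0.2, 0.0, 0.4],
              [0.0, 0.1, 0.0]])
lam0 = np.array([1.0, 0.5, 0.0])   # external arrival rates lam_0j
mu = np.array([4.0, 3.0, 2.0])     # service rates mu_j

# Traffic rate equations: lam_j = lam_0j + sum_i p_ij lam_i,
# i.e. (I - P^T) lam = lam0.
lam = np.linalg.solve(np.eye(3) - P.T, lam0)
rho = lam / mu                          # utilizations, must satisfy rho_j < 1
lam_out = lam * (1.0 - P.sum(axis=1))   # external departure rates lam_j0
lam_total = lam0.sum()                  # throughput lam_0
v = lam / lam_total                     # expected number of visits v_j
```

Flow conservation provides a built-in sanity check: the total external departure rate must equal the total external arrival rate.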
Step 1(i): (cont.)
• The scv of the interarrival time can be approximated by the
  traffic variability equation (Whitt (1983b)):

    Ca²_j = w_j Σ_{i=0..n} (λ_ij/λ_j) Ca²_ij + (1 − w_j)        (1)

  where

    w_j = 1 / [1 + 4(1 − ρ_j)²(x_j − 1)]

  and

    x_j = 1 / Σ_{i=0..n} (λ_ij/λ_j)².
Step 1(ii): Departure process
Cd²_j is the interdeparture time variability from station j:

    Cd²_j = ρ_j² Cs²_j + (1 − ρ_j²) Ca²_j        (2)

Note that if the arrival and service processes are Poisson
(i.e., Ca²_j = Cs²_j = 1), then (2) is exact and gives Cd²_j = 1.
Note also that if ρ_j → 1, then we obtain Cd²_j → Cs²_j.
On the other hand, if ρ_j → 0, then we obtain Cd²_j → Ca²_j.
Step 1(iii): Departure splitting
Ca²_ji = the interarrival time variability at station i from
station j.
Cd²_ji = the departure time variability from station j to i.

    Ca²_ji = Cd²_ji = p_ji Cd²_j + 1 − p_ji        (3)

Equations (1), (2), and (3) form a linear
system – the traffic variability equations –
in {Ca²_j, Cd²_j, Ca²_ij, Cd²_ij}.
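As a minimal sketch of how equations (1)–(3) fit together, the code below iterates them to a fixed point for a hypothetical 2-station network (a direct linear solve would also work, since the system is linear in the Ca²). All network data here are made-up illustrative numbers.

```python
import numpy as np

# Made-up 2-station network data.
P = np.array([[0.0, 0.6], [0.2, 0.0]])   # routing probabilities p_ij
lam0 = np.array([1.0, 0.3])              # external arrival rates lam_0j
mu = np.array([3.0, 2.5])                # service rates mu_j
Ca0 = np.array([1.2, 0.8])               # external interarrival scv Ca^2_0j
Cs2 = np.array([0.5, 2.0])               # service time scv Cs^2_j

n = len(mu)
lam = np.linalg.solve(np.eye(n) - P.T, lam0)   # traffic rate equations
rho = lam / mu
lam_ij = P * lam[:, None]                # lam_ij = p_ij * lam_i

# frac[i, j] = lam_ij / lam_j, with row 0 the external stream (i = 0).
frac = np.vstack([lam0, lam_ij]) / lam
x = 1.0 / (frac ** 2).sum(axis=0)        # x_j = 1 / sum_i (lam_ij/lam_j)^2
w = 1.0 / (1.0 + 4.0 * (1.0 - rho) ** 2 * (x - 1.0))

Ca2 = np.ones(n)                         # initial guess for Ca^2_j
for _ in range(100):
    Cd2 = rho**2 * Cs2 + (1 - rho**2) * Ca2          # eq. (2)
    Ca2_split = P * Cd2[:, None] + 1 - P             # eq. (3): Ca^2_ij
    mix = frac[0] * Ca0 + (frac[1:] * Ca2_split).sum(axis=0)
    Ca2 = w * mix + (1 - w)                          # eq. (1)
```

The loop contracts quickly because every coefficient multiplying Ca² is strictly less than one.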
Step 2:
• Find the expected waiting time by using the Kraemer &
  Langenbach-Belz formula [modified by Whitt (1983a)]:

    E(W_j) = ρ_j (Ca²_j + Cs²_j) g(ρ_j, Ca²_j, Cs²_j) / [2 μ_j (1 − ρ_j)]
           = [(Ca²_j + Cs²_j) g(ρ_j, Ca²_j, Cs²_j) / 2] · E[W]_{M/M/1}

  where

    g(ρ_j, Ca²_j, Cs²_j) = exp[ −2(1 − ρ_j)(1 − Ca²_j)² / (3ρ_j (Ca²_j + Cs²_j)) ]   if Ca²_j < 1
                         = 1                                                          if Ca²_j ≥ 1

Step 3:
• Expected lead time for an arbitrary job (waiting times +
  service times):

    E(T) = Σ_{j=1..n} v_j (E(W_j) + E(S_j))
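The Step 2 formula above can be sketched as a small function; the function names `g` and `ew_gg1` are just illustrative, not from the slides.

```python
import math

def g(rho, ca2, cs2):
    """Kraemer & Langenbach-Belz correction term (as modified by Whitt)."""
    if ca2 < 1.0:
        return math.exp(-2.0 * (1.0 - rho) * (1.0 - ca2) ** 2
                        / (3.0 * rho * (ca2 + cs2)))
    return 1.0

def ew_gg1(lam, mu, ca2, cs2):
    """Approximate GI/G/1 mean wait: scv-scaled M/M/1 wait."""
    rho = lam / mu
    ew_mm1 = rho / (mu * (1.0 - rho))    # exact M/M/1 mean wait
    return (ca2 + cs2) / 2.0 * g(rho, ca2, cs2) * ew_gg1_scale(ew_mm1)

def ew_gg1_scale(w):
    return w
```

With Ca² = Cs² = 1 the correction term is 1 and the formula reduces exactly to the M/M/1 mean wait, which makes a convenient sanity check.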
Single-class GI/G/m OQNS
with probabilistic routing
More general case where there are m_i identical machines at
each station i:

    Ca²_j = a_j + Σ_{i=1..n} Ca²_i b_ij,   for j = 1,…,n,

where

    a_j = 1 + w_j { (q_0j Ca²_0j − 1) + Σ_{i=1..n} q_ij [(1 − p_ij) + p_ij ρ_i² x_i] },
    b_ij = w_j q_ij p_ij (1 − ρ_i²),
    w_j = 1 / [1 + 4(1 − ρ_j)²(x_j − 1)],
    q_ij = λ_ij/λ_j = p_ij λ_i/λ_j   (the proportion of arrivals to station j
                                      coming from station i, with q_0j = λ_0j/λ_j),

and

    x_i = 1 + (1/√m_i) (max{Cs²_i, 0.2} − 1).
Single-class GI/G/m OQNS
with probabilistic routing (cont.)
• Then the approximation of the expected waiting time
  is similar to the GI/G/1 case:

    E(W_j) ≈ [(Ca²_j + Cs²_j)/2] · E(W_j(M/M/m_j))
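The M/M/m mean wait needed above can be computed from the Erlang C (probability-of-delay) formula. A minimal sketch, with illustrative function names of my choosing:

```python
import math

def erlang_c(m, a):
    """Probability of delay in M/M/m with offered load a = lam/mu < m."""
    terms = sum(a**k / math.factorial(k) for k in range(m))
    last = a**m / (math.factorial(m) * (1.0 - a / m))
    return last / (terms + last)

def ew_mmm(lam, mu, m):
    """Exact mean wait in queue for M/M/m."""
    a = lam / mu
    return erlang_c(m, a) / (m * mu - lam)

def ew_ggm(lam, mu, m, ca2, cs2):
    """GI/G/m approximation: E[W] ~ (Ca^2 + Cs^2)/2 * E[W(M/M/m)]."""
    return (ca2 + cs2) / 2.0 * ew_mmm(lam, mu, m)
```

For m = 1 and Ca² = Cs² = 1 this collapses to the exact M/M/1 wait, which is a quick consistency check.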
Symmetric Job Shop
A job leaving m/c center i is equally likely to go to any other
m/c next:

    p_ij = 1/m, j ≠ i;   p_i0 = 1/m (leaves system)

Then

    λ_i = λ, i = 1,…,m;   Ca²_i → 1 as m → ∞

So we can approximate each m/c center as M/G/1, and
Ŵ(λ_i, μ_i, Ca²_i, Cs²_i) can be replaced by the mean waiting
time in an M/G/1 queue:

    E[W_i] = λ_i E[S_i²] / [2(1 − ρ_i)] = ρ_i (1 + Cs²_i) / [2 μ_i (1 − ρ_i)]
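The M/G/1 mean wait above is the Pollaczek-Khinchine formula; a one-function sketch (the name `ew_mg1` is illustrative):

```python
def ew_mg1(lam, mu, cs2):
    """Pollaczek-Khinchine mean wait: rho(1 + Cs^2) / (2 mu (1 - rho))."""
    rho = lam / mu
    return rho * (1.0 + cs2) / (2.0 * mu * (1.0 - rho))
```

Setting Cs² = 1 recovers the M/M/1 wait, and Cs² = 0 (deterministic service) gives exactly half of it.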
[Figure: symmetric job shop routing]
Uniform Flow Job Shop
All jobs visit the same sequence of machine centers. Then

    λ_i = λ, i = 1,…,m;

but Ca²_i is unaffected by m.
In this case, M/G/1 is a bad approximation, but GI/G/1 works
fine.
The waiting time of the GI/G/1 queue can be approximated by
using equation 3.142, 3.143, or 3.144 above, and the lower
and upper bounds are

    [ρ_i (Ca²_i − 1 + ρ_i) + ρ_i² Cs²_i] / [2 λ_i (1 − ρ_i)]
        ≤ Ŵ(λ_i, μ_i, Ca²_i, Cs²_i) ≤
    [Ca²_i + ρ_i² Cs²_i] / [2 λ_i (1 − ρ_i)]
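A sketch of the bounds quoted above as a function (the name `gg1_wait_bounds` is my own); for M/M/1 inputs the lower bound coincides with the exact mean wait:

```python
def gg1_wait_bounds(lam, mu, ca2, cs2):
    """Lower/upper bounds on the GI/G/1 mean waiting time."""
    rho = lam / mu
    denom = 2.0 * lam * (1.0 - rho)
    lower = (rho * (ca2 - 1.0 + rho) + rho**2 * cs2) / denom
    upper = (ca2 + rho**2 * cs2) / denom
    return max(lower, 0.0), upper
```

The upper bound is the classical Kingman-style heavy-traffic bound, so the interval tightens as ρ → 1.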
[Figure: uniform-flow shop routing]
Job Routing Diversity
Assume a high work load: ρ_i ≈ 1, i = 1,…,m,
and machine centers each with the same number of machines,
same service time distribution, and same utilization. Also,

    v_i = 1, i = 1,…,m

Q: What job routing minimizes mean flow time?
A1: If Ca² ≤ Cs² and Cs² ≤ 1, then uniform-flow job routing
    minimizes mean flow time;
A2: If Ca² ≥ Cs² and Cs² ≥ 1, then symmetric job routing
    minimizes mean flow time.
(Intuition: at high load, uniform-flow routing propagates the
service scv Cs² downstream, while symmetric routing superposes
many thin streams and drives the arrival scv toward 1.)
Variance of Flow Time
The mean flow time is useful information but not sufficient for
setting due dates. The two distributions below have the
same mean, but if the due date is set as shown, tardy jobs
are a lot more likely under distribution B!
[Figure: two flow-time densities A and B with the same mean E[T];
the due date falls far into B’s right tail]
Flow Time of an Aggregate Job
Pretend we are following the progress of an arbitrary tagged
job. If Ki is the number of times it visits m/c center i, then
its flow time is (Ki is the r.v. for which vi is the mean)
    T = Σ_{i=1..m} Σ_{j=1..K_i} T_ij,   where T_ij is the time spent in i on the j-th visit.

Given the values of each K_i, the expected flow time is

    E[T | K_i, i = 1,…,m] = Σ_{i=1..m} K_i E[T_i]

and then, “unconditioning”,

    E[T] = Σ_{i=1..m} E[K_i] E[T_i] = Σ_{i=1..m} v_i E[T_i]
Variance
In a similar way, we can find the variance of T conditional
upon Ki and then uncondition.
Assume {Tij, i = 1,…,m; j = 1, …} and {Ki, i = 1,…,m} are all
independent and that, for each i, {Tij, j = 1, …} are
identically distributed.
Similar to conditional expectation, there is a relation for
conditional variance:
    Var[X] = E_Y[ Var(X | Y) ] + Var_Y[ E(X | Y) ]
Using Conditional Variance
Since T depends on {K_1, K_2, …, K_m}, we can say

    Var[T] = E_{K_1,…,K_m}[ Var(T | K_1,…,K_m) ] + Var_{K_1,…,K_m}[ E(T | K_1,…,K_m) ]

The first term equals

    E_{K_1,…,K_m}[ Var( Σ_{i=1..m} Σ_{j=1..K_i} T_ij | K_1,…,K_m ) ]
        = E_{K_1,…,K_m}[ Σ_{i=1..m} K_i Var(T_i) ]
        = Σ_{i=1..m} E[K_i] Var(T_i) = Σ_{i=1..m} v_i Var(T_i)
Using Conditional Variance
The second term is

    Var_{K_1,…,K_m}[ E( Σ_{i=1..m} Σ_{j=1..K_i} T_ij | K_1,…,K_m ) ]
        = Var_{K_1,…,K_m}[ Σ_{i=1..m} K_i E[T_i] ]
        = Σ_{i=1..m} E[T_i]² Var(K_i) + 2 Σ_{1≤i<j≤m} Cov( K_i E[T_i], K_j E[T_j] )
        = Σ_{i=1..m} E[T_i]² Var(K_i) + 2 Σ_{1≤i<j≤m} E[T_i] E[T_j] Cov( K_i, K_j )
        = Σ_{i=1..m} Σ_{j=1..m} Cov( K_i, K_j ) E[T_i] E[T_j]
Variance
The resulting formula for the variance is:

    Var[T] = Σ_{i=1..m} v_i Var(T_i) + Σ_{i=1..m} Σ_{j=1..m} Cov(K_i, K_j) E[T_i] E[T_j]

If arrivals to each m/c center are approximately Poisson,
we can find Var[T_i] from the M/G/1 transform equation
(3.73), p. 61:

    E[T_i] = − dF_{T_i}(s)/ds |_{s=0},   E[T_i²] = d²F_{T_i}(s)/ds² |_{s=0},
    Var[T_i] = E[T_i²] − E[T_i]²

But we still need Cov(K_i, K_j).
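Once the per-center moments and the visit-count covariances are available, assembling E[T] and Var[T] is a pair of dot products. A minimal sketch with made-up per-center numbers (the zero covariance matrix corresponds to the uniform-flow case discussed later):

```python
import numpy as np

# Made-up per-center data for a 3-center shop.
v = np.array([1.0, 1.0, 1.0])          # expected visit counts v_i = E[K_i]
ET = np.array([2.0, 3.0, 1.5])         # E[T_i], mean time per visit to center i
VarT_i = np.array([4.0, 9.0, 2.0])     # Var[T_i]
CovK = np.zeros((3, 3))                # Cov(K_i, K_j); zero here for illustration

# Var[T] = sum_i v_i Var(T_i) + sum_ij Cov(K_i, K_j) E[T_i] E[T_j]
var_T = v @ VarT_i + ET @ CovK @ ET
mean_T = v @ ET
```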
Markov Chain Model
Think of the tagged job as following a Markov chain with
states 1,…,m for the machine centers and absorbing state 0
representing having left the job shop.
The transition matrix is

    P* = [ 1                   0     0     …  0
           1 − Σ_{l=1..m} p_1l  0     p_12  …  p_1m
           1 − Σ_{l=1..m} p_2l  p_21  0     …  p_2m
           …
           1 − Σ_{l=1..m} p_ml  p_m1  p_m2  …  0    ]

K_i is the number of times this M.C. visits state i
before being absorbed; let K_ji be the number of
visits given X_0 = j.
    K_ji = { δ_ji,                            with probability 1 − Σ_{l=1..m} p_jl
           { K_li + δ_ji,                     with probability p_jl, l = 1,…,m        (7.83)

where δ_ji = 1 if j = i and 0 otherwise.

    K_ji K_jr = { δ_ji δ_jr,                      with probability 1 − Σ_{l=1..m} p_jl
                { (K_li + δ_ji)(K_lr + δ_jr),     with probability p_jl, l = 1,…,m    (7.84)
Expectations
Taking expectations in (7.83), we get

    [ E[K_ji] ]_{j,i=1,…,m} = (I − P)⁻¹

and taking expectations in (7.84),

    E[K_ji K_jr] = E[K_jr] E[K_ri] + E[K_ji] E[K_ir],   i ≠ r

and

    E[K_ji²] = E[K_ji] (2 E[K_ii] − 1)
Because

    E[K_i K_r] = Σ_{j=1..m} P(X_0 = j) E[K_ji K_jr]   and   E[K_i²] = Σ_{j=1..m} P(X_0 = j) E[K_ji²]

(weighting by the probability P(X_0 = j) = λ_0j/λ_0 that the job
enters the shop at station j), we obtain

    E[K_i K_r] = E[K_i] E[K_ir] + E[K_r] E[K_ri],   i ≠ r

and

    E[K_i²] = E[K_i] (2 E[K_ii] − 1)

where

    E[K_i] = Σ_{j=1..m} P(X_0 = j) E[K_ji].

Therefore

    Cov[K_i, K_j] = { E[K_i] E[K_ij] + E[K_j] E[K_ji] − E[K_i] E[K_j],   i ≠ j
                    { E[K_i] (2 E[K_ii] − E[K_i] − 1),                   i = j
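The visit-count moments above reduce to one matrix inverse. A sketch for a hypothetical 2-center shop (routing matrix `P` and entry distribution `q0` are made-up numbers):

```python
import numpy as np

# Made-up routing matrix among machine centers and entry distribution.
P = np.array([[0.0, 0.5],
              [0.3, 0.0]])
q0 = np.array([0.7, 0.3])              # P(X_0 = j)

N = np.linalg.inv(np.eye(2) - P)       # N[j, i] = E[K_ji]
EK = q0 @ N                            # E[K_i] = sum_j P(X_0 = j) E[K_ji]

m = 2
CovK = np.empty((m, m))
for i in range(m):
    for j in range(m):
        if i == j:
            # Var[K_i] = E[K_i](2 E[K_ii] - E[K_i] - 1)
            CovK[i, i] = EK[i] * (2.0 * N[i, i] - EK[i] - 1.0)
        else:
            # Cov[K_i,K_j] = E[K_i]E[K_ij] + E[K_j]E[K_ji] - E[K_i]E[K_j]
            CovK[i, j] = EK[i] * N[i, j] + EK[j] * N[j, i] - EK[i] * EK[j]
```

The resulting covariance matrix is symmetric by construction, with nonnegative diagonal entries.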
Job Routing Extremes
• Symmetric job shop

    E[K_i] = 1,   Cov(K_i, K_j) = { 2(m − 1)/(m + 1),   i = j
                                  { (m − 1)/(m + 1),    i ≠ j

If all m/c centers are identical and m is large, Var[T] ≈ E[T]²,
and we can use an exponential dist’n to approximate T.
• Uniform-flow job shop

    E[K_i] = 1,   Cov(K_i, K_j) = 0

Then

    E[T] = Σ_{i=1..m} E[T_i],   Var[T] = Σ_{i=1..m} Var[T_i]
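The symmetric-job-shop covariances can be checked numerically against the Markov-chain computation: build the symmetric routing matrix (p_ij = 1/m for j ≠ i, exit probability 1/m) and assume, as a sketch, that jobs enter each of the m centers with equal probability.

```python
import numpy as np

m = 5
# Symmetric routing: from i, go to each j != i (or exit) w.p. 1/m.
P = (np.ones((m, m)) - np.eye(m)) / m
N = np.linalg.inv(np.eye(m) - P)       # N[j, i] = E[K_ji]
q0 = np.full(m, 1.0 / m)               # uniform entry distribution
EK = q0 @ N                            # expected visit counts, should all be 1

var_diag = EK * (2.0 * np.diag(N) - EK - 1.0)                # i = j case
cov_off = EK[0] * N[0, 1] + EK[1] * N[1, 0] - EK[0] * EK[1]  # i != j case
```

The computed values match the closed forms: the diagonal equals 2(m − 1)/(m + 1) and the off-diagonal equals (m − 1)/(m + 1).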
Setting Due Dates
Use the mean and variance of T to obtain an approximate distribution
F_T(t) = P[T ≤ t], e.g.,
if Var[T] < E[T]², fit an Erlang distribution;
if Var[T] = E[T]², use an exponential dist’n with mean E[T];
if Var[T] > E[T]², use a hyperexponential distribution.
If t_d is the due date, what matters is

    F̄_T(t_d) = 1 − F_T(t_d) = P[T > t_d].
Costs Related to Due Dates
• Setting the due date too far away: c1 (t_d − A)⁺
• Completing a job after its due date: c2 (T − t_d)⁺
• Completing a job before its due date: c3 (t_d − T)⁺

Total expected cost

    C(t_d) = c1 (t_d − A)⁺ + c2 E[(T − t_d)⁺] + c3 E[(t_d − T)⁺]

has derivative

    dC(t_d)/dt_d = { c1 − c2 F̄_T(t_d) + c3 F_T(t_d),   t_d > A
                   { − c2 F̄_T(t_d) + c3 F_T(t_d),      t_d < A
Optimal Due Date

           { F̄_T⁻¹( (c1 + c3)/(c2 + c3) ),   if this > A
    t_d* = { F̄_T⁻¹( c3/(c2 + c3) ),          if this < A
           { A,                               otherwise
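As a minimal sketch, assume T is approximated as exponential with mean ET (the Var[T] ≈ E[T]² case), so F̄_T(t) = exp(−t/ET) and F̄_T⁻¹(p) = −ET·ln p. The function name and the cost rates in the test are made-up illustrations.

```python
import math

def optimal_due_date(ET, A, c1, c2, c3):
    """Optimal due date t_d* for exponentially distributed flow time T."""
    # Interior optimum on t_d > A: Fbar(t_d) = (c1 + c3)/(c2 + c3).
    cand_hi = -ET * math.log((c1 + c3) / (c2 + c3))
    if cand_hi > A:
        return cand_hi
    # Interior optimum on t_d < A: Fbar(t_d) = c3/(c2 + c3).
    cand_lo = -ET * math.log(c3 / (c2 + c3))
    if cand_lo < A:
        return cand_lo
    return A  # otherwise the kink at A is optimal
```

Note the candidates are real only when the target tail probabilities lie in (0, 1), i.e. when c1 < c2; raising the tardiness cost c2 pushes the optimal due date out.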