Queuing Analysis:
Introduction to Queuing Analysis
Hongwei Zhang
http://www.cs.wayne.edu/~hzhang
Acknowledgement: this lecture is partially based on the slides of Dr. Yannis A. Korilis.
Outline
Delay in packet networks
Introduction to queuing theory
Exponential and Poisson distributions
Poisson process
Little’s Theorem
Sources of Network Delay?
Processing Delay
Time between receiving a packet and assigning the packet to an outgoing link queue
Queueing Delay
Time buffered waiting for transmission
Transmission Delay
Time between transmitting the first and the last bit of the packet
Propagation Delay
Time spent on the link – propagation of the electrical signal
Independent of the traffic carried by the link
Focus: Queueing & Transmission Delay
Outline
Delay in packet networks
Introduction to queuing theory
Exponential and Poisson distributions
Poisson process
Little’s Theorem
Basic Queueing Model
[Figure: arrivals enter a buffer; queued customers wait, one is in service at the server(s), and departures leave the system]
A queue models any service station with:
One or multiple servers
A waiting area or buffer
Customers arrive to receive service
A customer that upon arrival does not find a free server waits in the buffer
Characteristics of a Queue
[Figure: queue with buffer size b and m servers]
Number of servers m: one, multiple, infinite
Buffer size b
Service discipline (scheduling): FCFS, LCFS, Processor Sharing (PS), etc.
Arrival process
Service statistics
Arrival Process
[Figure: arrival timeline – customers n−1, n, n+1 arrive at times t_{n−1}, t_n, t_{n+1}; τ_n is the gap between arrivals n and n+1]
τ_n: interarrival time between customers n and n+1
τ_n is a random variable
{τ_n, n ≥ 1} is a stochastic process
Interarrival times are identically distributed and have a common mean E[τ_n] = E[τ] = 1/λ, where λ is called the arrival rate
Service-Time Process
[Figure: service timeline – customer n occupies the server for time s_n]
s_n: service time of customer n at the server
{s_n, n ≥ 1} is a stochastic process
Service times are identically distributed with common mean E[s_n] = E[s] = 1/µ
µ is called the service rate
For packets, are the service times really random?
Queue Descriptors
Generic descriptor: A/S/m/k
A denotes the arrival process
For Poisson arrivals we use M (for Markovian)
S denotes the service-time distribution
M: exponential distribution
D: deterministic service times
G: general distribution
m is the number of servers
k is the max number of customers allowed in the system – either in the buffer or in service
k is omitted when the buffer size is infinite
Queue Descriptors: Examples
M/M/1: Poisson arrivals, exponentially distributed service times, one server, infinite buffer
M/M/m: same as previous with m servers
M/M/m/m: Poisson arrivals, exponentially distributed service times, m servers, no buffering
M/G/1: Poisson arrivals, identically distributed service times following a general distribution, one server, infinite buffer
*/D/∞: a constant-delay system
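To make the M/M/1 descriptor concrete, here is a minimal discrete-event sketch in Python (numpy assumed; the rates are hypothetical) that draws exponential interarrival and service times and measures the average time each customer spends in the system:

import numpy as np

# Minimal M/M/1 sketch: Poisson arrivals (rate lam), exponential
# service (rate mu), one server, infinite FCFS buffer.
rng = np.random.default_rng(0)
lam, mu, n = 0.8, 1.0, 100_000            # hypothetical rates, lam < mu

arrivals = np.cumsum(rng.exponential(1 / lam, n))   # arrival times
services = rng.exponential(1 / mu, n)               # service times

departures = np.empty(n)
departures[0] = arrivals[0] + services[0]
for i in range(1, n):
    # service starts when the customer arrives or the server frees up
    start = max(arrivals[i], departures[i - 1])
    departures[i] = start + services[i]

print((departures - arrivals).mean())   # M/M/1 theory: 1/(mu - lam) = 5.0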
Outline
Delay in packet networks
Introduction to queuing theory
Exponential and Poisson distributions
Poisson process
Little’s Theorem
Some probability distributions and random processes
Exponential Distribution
Memoryless Property
Poisson Distribution
Poisson Process
Definition and Properties
Interarrival Time Distribution
Modeling Arrival Statistics
Exponential Distribution
A continuous R.V. X follows the exponential distribution with parameter µ if its pdf is:
f_X(x) = µe^(−µx) if x ≥ 0, and 0 if x < 0
=> Probability distribution function:
F_X(x) = P{X ≤ x} = 1 − e^(−µx) if x ≥ 0, and 0 if x < 0
Usually used for modeling service time
Exponential Distribution (contd.)
Mean and Variance:
E[X] = 1/µ, Var(X) = 1/µ²
Proof:
E[X] = ∫₀^∞ x f_X(x) dx = ∫₀^∞ x µe^(−µx) dx = [−x e^(−µx)]₀^∞ + ∫₀^∞ e^(−µx) dx = 1/µ
E[X²] = ∫₀^∞ x² µe^(−µx) dx = [−x² e^(−µx)]₀^∞ + 2∫₀^∞ x e^(−µx) dx = (2/µ) E[X] = 2/µ²
Var(X) = E[X²] − (E[X])² = 2/µ² − 1/µ² = 1/µ²
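A quick empirical check of these two formulas (a sketch; µ is arbitrary and numpy is assumed):

import numpy as np

# Sample mean and variance of exponential(mu) vs 1/mu and 1/mu^2.
rng = np.random.default_rng(1)
mu = 2.5                                   # arbitrary rate
x = rng.exponential(1 / mu, 1_000_000)     # numpy parameterizes by scale = 1/mu
print(x.mean(), 1 / mu)                    # both ~0.4
print(x.var(), 1 / mu**2)                  # both ~0.16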
Memoryless Property
Past history has no influence on the future:
P{X > x + t | X > t} = P{X > x}
Proof:
P{X > x + t | X > t} = P{X > x + t, X > t} / P{X > t} = P{X > x + t} / P{X > t} = e^(−µ(x+t)) / e^(−µt) = e^(−µx) = P{X > x}
Exponential: the only continuous distribution with the memoryless property
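The property is easy to verify numerically; a Monte Carlo sketch (arbitrary µ, x, t; numpy assumed):

import numpy as np

# Check P{X > x+t | X > t} = P{X > x} = e^(-mu*x) for exponential X.
rng = np.random.default_rng(2)
mu, x, t = 1.0, 0.7, 1.3
s = rng.exponential(1 / mu, 1_000_000)
cond = (s[s > t] > x + t).mean()               # conditional tail probability
print(cond, (s > x).mean(), np.exp(-mu * x))   # all ~0.497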
Poisson Distribution
A discrete R.V. X follows the Poisson distribution with parameter λ if its probability mass function is:
P{X = k} = e^(−λ) λ^k / k!, k = 0, 1, 2, ...
Wide applicability in modeling the number of random events that occur during a given time interval (=> Poisson process):
Customers that arrive at a post office during a day
Wrong phone calls received during a week
Students that go to the instructor’s office during office hours
Packets that arrive at a network switch
etc.
Poisson Distribution (contd.)
Mean and Variance:
E[X] = λ, Var(X) = λ
Proof:
E[X] = ∑_{k=0}^∞ k P{X = k} = e^(−λ) ∑_{k=0}^∞ k λ^k / k! = e^(−λ) λ ∑_{k=1}^∞ λ^(k−1) / (k−1)! = e^(−λ) λ ∑_{j=0}^∞ λ^j / j! = e^(−λ) λ e^λ = λ
E[X²] = ∑_{k=0}^∞ k² P{X = k} = e^(−λ) ∑_{k=0}^∞ k² λ^k / k! = e^(−λ) λ ∑_{k=1}^∞ k λ^(k−1) / (k−1)! = e^(−λ) λ ∑_{j=0}^∞ (j+1) λ^j / j! = λ ∑_{j=0}^∞ j e^(−λ) λ^j / j! + λ e^(−λ) ∑_{j=0}^∞ λ^j / j! = λ² + λ
Var(X) = E[X²] − (E[X])² = λ² + λ − λ² = λ
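The same two identities can be checked by sampling (a sketch; λ arbitrary, numpy assumed):

import numpy as np

# A Poisson(lam) sample should have mean ~ lam and variance ~ lam.
rng = np.random.default_rng(3)
lam = 4.0
k = rng.poisson(lam, 1_000_000)
print(k.mean(), k.var())   # both ~4.0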
Sum of Poisson Random Variables
X_i, i = 1, 2, …, n, are independent R.V.s
X_i follows Poisson distribution with parameter λ_i
Sum S_n = X_1 + X_2 + ... + X_n
follows Poisson distribution with parameter λ = λ_1 + λ_2 + ... + λ_n
Sum of Poisson Random Variables (cont.)
Proof: For n = 2; generalization by induction. The pmf of S = X₁ + X₂ is
P{S = m} = ∑_{k=0}^m P{X₁ = k, X₂ = m−k}
= ∑_{k=0}^m P{X₁ = k} P{X₂ = m−k}
= ∑_{k=0}^m [e^(−λ₁) λ₁^k / k!] · [e^(−λ₂) λ₂^(m−k) / (m−k)!]
= e^(−(λ₁+λ₂)) (1/m!) ∑_{k=0}^m [m! / (k!(m−k)!)] λ₁^k λ₂^(m−k)
= e^(−(λ₁+λ₂)) (λ₁ + λ₂)^m / m!
i.e., Poisson with parameter λ = λ₁ + λ₂.
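A sampling check of this result (hypothetical rates; numpy assumed):

import math
import numpy as np

# Empirical pmf of X1 + X2 vs the Poisson(lam1 + lam2) pmf.
rng = np.random.default_rng(4)
lam1, lam2, n = 2.0, 3.0, 1_000_000
s = rng.poisson(lam1, n) + rng.poisson(lam2, n)
lam = lam1 + lam2
for m in range(4):
    print(m, (s == m).mean(), math.exp(-lam) * lam**m / math.factorial(m))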
Sampling a Poisson Variable
X follows Poisson distribution with parameter λ
Each of the X arrivals is of type i with probability p_i, i = 1, 2, …, n, independent of other arrivals; p₁ + p₂ + … + p_n = 1
If X_i denotes the number of type-i arrivals, then:
X₁, X₂, …, X_n are independent
X_i follows Poisson distribution with parameter λ_i = λp_i
=> Splitting of Poisson process (later)
Sampling a Poisson Variable (contd.)
Proof: For n = 2; generalize by induction. Joint pmf:
P{X₁ = k₁, X₂ = k₂}
= P{X₁ = k₁, X₂ = k₂ | X = k₁ + k₂} P{X = k₁ + k₂}
= [(k₁+k₂)! / (k₁! k₂!)] p₁^(k₁) p₂^(k₂) · e^(−λ) λ^(k₁+k₂) / (k₁+k₂)!
= (1 / (k₁! k₂!)) (λp₁)^(k₁) (λp₂)^(k₂) · e^(−λ(p₁+p₂))
= e^(−λp₁) (λp₁)^(k₁) / k₁! · e^(−λp₂) (λp₂)^(k₂) / k₂!
• X₁ and X₂ are independent
• P{X₁ = k₁} = e^(−λp₁) (λp₁)^(k₁) / k₁!,  P{X₂ = k₂} = e^(−λp₂) (λp₂)^(k₂) / k₂!
=> X_i follows Poisson distribution with parameter λp_i.
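A quick thinning experiment along these lines (hypothetical λ and p; numpy assumed):

import numpy as np

# Mark each of X ~ Poisson(lam) arrivals as type 1 with probability p;
# the two type counts should be Poisson(lam*p) and Poisson(lam*(1-p)).
rng = np.random.default_rng(5)
lam, p, n = 5.0, 0.3, 1_000_000
x = rng.poisson(lam, n)
x1 = rng.binomial(x, p)            # type-1 arrivals out of each X
x2 = x - x1
print(x1.mean(), x1.var())         # both ~ lam*p = 1.5
print(x2.mean(), x2.var())         # both ~ lam*(1-p) = 3.5
print(np.corrcoef(x1, x2)[0, 1])   # ~0, consistent with independence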
Poisson Approximation to Binomial
Binomial distribution with parameters (n, p):
P{X = k} = C(n, k) p^k (1 − p)^(n−k)
As n → ∞ and p → 0, with np = λ moderate, the binomial distribution converges to Poisson with parameter λ
Proof:
P{X = k} = C(n, k) p^k (1 − p)^(n−k) = [(n−k+1)⋯(n−1)n / k!] (λ/n)^k (1 − λ/n)^(n−k)
As n → ∞:
(n−k+1)⋯(n−1)n / n^k → 1
(1 − λ/n)^n → e^(−λ)
(1 − λ/n)^(−k) → 1
=> P{X = k} → e^(−λ) λ^k / k!
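The convergence is visible even for moderate n; a short check (hypothetical λ and k):

import math

# Binomial(n, lam/n) pmf vs the Poisson(lam) pmf at a fixed k, as n grows.
lam, k = 2.0, 3
poisson = math.exp(-lam) * lam**k / math.factorial(k)
for n in (10, 100, 1000, 10_000):
    p = lam / n
    binom = math.comb(n, k) * p**k * (1 - p) ** (n - k)
    print(n, binom, poisson)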
Outline
Delay in packet networks
Introduction to queuing theory
Exponential and Poisson distributions
Poisson process
Little’s Theorem
Poisson Process with Rate λ
{A(t): t ≥ 0} counting process
A(t) is the number of events (arrivals) that have occurred from time 0 to time t, where A(0) = 0
A(t) − A(s): number of arrivals in interval (s, t]
Numbers of arrivals in disjoint intervals are independent
Number of arrivals in any interval (t, t+τ] of length τ:
Depends only on its length τ
Follows Poisson distribution with parameter λτ:
P{A(t + τ) − A(t) = n} = e^(−λτ) (λτ)^n / n!, n = 0, 1, ...
=> Average number of arrivals is λτ; λ is the arrival rate
Interarrival-Time Statistics
Interarrival times for a Poisson process are independent and follow the exponential distribution with parameter λ:
t_n: time of the nth arrival; τ_n = t_(n+1) − t_n: nth interarrival time
P{τ_n ≤ s} = 1 − e^(−λs), s ≥ 0
Proof:
Probability distribution function: P{τ_n ≤ s} = 1 − P{τ_n > s} = 1 − P{A(t_n + s) − A(t_n) = 0} = 1 − e^(−λs)
Independence follows from the independence of the numbers of arrivals in disjoint intervals
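This gives the standard way to simulate a Poisson process: cumulative sums of exponential interarrival times. A sketch (hypothetical rate and window; numpy assumed):

import numpy as np

# Build arrival times from exponential(lam) gaps and check that the
# count in a window of length tau is ~ Poisson(lam*tau).
rng = np.random.default_rng(6)
lam, tau, trials = 3.0, 2.0, 100_000
counts = np.empty(trials, dtype=int)
for i in range(trials):
    t = np.cumsum(rng.exponential(1 / lam, 50))  # 50 arrivals suffice here
    counts[i] = np.searchsorted(t, tau)          # arrivals in (0, tau]
print(counts.mean(), counts.var())               # both ~ lam*tau = 6.0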
Small Interval Probabilities
Interval (t, t + δ] of length δ:
P{A(t + δ) − A(t) = 0} = 1 − λδ + o(δ)
P{A(t + δ) − A(t) = 1} = λδ + o(δ)
P{A(t + δ) − A(t) ≥ 2} = o(δ)
Proof: expand the Poisson probabilities for an interval of length δ: e^(−λδ) = 1 − λδ + o(δ) gives the first case, λδ e^(−λδ) = λδ + o(δ) the second, and the remaining probability mass is o(δ).
Merging & Splitting Poisson Processes
λ1
λp
p
λ1 + λ2
λ
1-p
λ2
A1,…, Ak independent Poisson
processes with rates λ1,…, λk
Merged in a single process
A= A1+…+ Ak
A is Poisson process with rate
λ= λ1+…+ λk
λ(1-p)
A: Poisson processes with rate λ
Split into processes A1 and A2
independently, with probabilities p and
1-p respectively
A1 is Poisson with rate λ1= λp
A2 is Poisson with rate λ2= λ(1-p)
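Merging can be checked by pooling event times from two simulated streams; the merged gaps should be exponential with the summed rate (hypothetical rates; numpy assumed):

import numpy as np

# Pool the event times of two Poisson processes and inspect the
# interarrival times of the merged stream.
rng = np.random.default_rng(7)
lam1, lam2, n = 1.0, 2.0, 200_000
t1 = np.cumsum(rng.exponential(1 / lam1, n))
t2 = np.cumsum(rng.exponential(1 / lam2, n))
merged = np.sort(np.concatenate([t1, t2]))
gaps = np.diff(merged[: n // 2])        # stay well inside both streams
print(gaps.mean(), 1 / (lam1 + lam2))   # both ~1/3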
Modeling Arrival Statistics
Poisson process widely used to model packet arrivals in numerous networking problems
Justification: provides a good model for aggregate traffic of a large number of “independent” users
n traffic streams, with independent identically distributed (iid) interarrival times with PDF F(s) – not necessarily exponential
Arrival rate of each stream: λ/n
As n → ∞, the combined stream can be approximated by Poisson under mild conditions on F(s) – e.g., F(0) = 0, F′(0) > 0
Most important reason for the Poisson assumption: analytic tractability of queueing models
Outline
Delay in packet networks
Introduction to queuing theory
Exponential and Poisson distributions
Poisson process
Little’s Theorem
Little’s Theorem
[Figure: black-box system with arrival rate λ, N customers inside, and delay T per customer]
λ: customer arrival rate
N: average number of customers in system
T: average delay per customer in system
Little’s Theorem: for a system in steady-state,
N = λT
Counting Processes of a Queue
[Figure: α(t) and β(t) staircase curves over time; N(t) is the vertical gap between them]
N(t): number of customers in system at time t
α(t): number of customer arrivals till time t
β(t): number of customer departures till time t
T_i: time spent in system by the ith customer
Time Averages
Time averages over interval [0, t]:
N_t = (1/t) ∫₀^t N(s) ds
λ_t = α(t) / t
T_t = (1/α(t)) ∑_{i=1}^{α(t)} T_i
δ_t = β(t) / t
Steady-state time averages:
N = lim_{t→∞} N_t,  λ = lim_{t→∞} λ_t,  T = lim_{t→∞} T_t,  δ = lim_{t→∞} δ_t
Little’s theorem: N = λT
Applies to any queueing system provided that:
the limits T, λ, and δ exist, and
λ = δ
We give a simple graphical proof under a set of more restrictive assumptions
Proof of Little’s Theorem for FCFS
FCFS system, N(0) = 0
α(t) and β(t): staircase graphs
N(t) = α(t) − β(t)
[Figure: shaded area between the α(t) and β(t) staircases; the ith horizontal strip has length T_i]
Shaded area between the graphs: S(t) = ∫₀^t N(s) ds
Assumption: N(t) = 0 infinitely often. For any such t:
∫₀^t N(s) ds = ∑_{i=1}^{α(t)} T_i  ⇒  (1/t) ∫₀^t N(s) ds = (α(t)/t) · (1/α(t)) ∑_{i=1}^{α(t)} T_i  ⇒  N_t = λ_t T_t
If the limits N_t → N, T_t → T, λ_t → λ exist, Little’s formula follows
We will relax the last assumption (i.e., that N(t) = 0 infinitely often)
Proof of Little’s for FCFS (contd.)
[Figure: same staircase construction, now at a time t when the system need not be empty]
In general – even if the queue is not empty infinitely often:
∑_{i=1}^{β(t)} T_i ≤ ∫₀^t N(s) ds ≤ ∑_{i=1}^{α(t)} T_i
⇒ (β(t)/t) · (1/β(t)) ∑_{i=1}^{β(t)} T_i ≤ (1/t) ∫₀^t N(s) ds ≤ (α(t)/t) · (1/α(t)) ∑_{i=1}^{α(t)} T_i
⇒ δ_t T_t ≤ N_t ≤ λ_t T_t
The result follows assuming the limits T_t → T, λ_t → λ, and δ_t → δ exist, and λ = δ
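The identity at an emptying epoch is easy to see numerically: at the last departure time of a simulated FCFS queue the system is empty, and N_t equals λ_t T_t exactly. A sketch (hypothetical rates; numpy assumed):

import numpy as np

# Compute the time averages N_t, lam_t, T_t on an M/M/1 sample path,
# evaluated at t = the last departure epoch (system empty there).
rng = np.random.default_rng(8)
lam, mu, n = 0.5, 1.0, 200_000
arr = np.cumsum(rng.exponential(1 / lam, n))
svc = rng.exponential(1 / mu, n)
dep = np.empty(n)
dep[0] = arr[0] + svc[0]
for i in range(1, n):
    dep[i] = max(arr[i], dep[i - 1]) + svc[i]

t = dep[-1]
N_t = np.sum(dep - arr) / t     # (1/t) * integral of N(s) ds = sum(T_i)/t
lam_t = n / t                   # alpha(t)/t
T_t = np.mean(dep - arr)        # (1/alpha(t)) * sum(T_i)
print(N_t, lam_t * T_t)         # equal; both ~1.0 = rho/(1-rho) for rho = 0.5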
Probabilistic Form of Little’s Theorem
Have considered a single sample function of a stochastic process
Now focus on the probabilities of the various sample functions of a stochastic process
Probability of n customers in system at time t:
p_n(t) = P{N(t) = n}
Expected number of customers in system at t:
E[N(t)] = ∑_{n=0}^∞ n P{N(t) = n} = ∑_{n=0}^∞ n p_n(t)
Probabilistic Form of Little (contd.)
p_n(t) and E[N(t)] depend on t and on the initial distribution at t = 0
We will consider systems that converge to steady-state: there exist p_n, independent of the initial distribution, with
lim_{t→∞} p_n(t) = p_n, n = 0, 1, ...
Expected number of customers in steady-state [stochastic average]:
EN = ∑_{n=0}^∞ n p_n = lim_{t→∞} E[N(t)]
For an ergodic process, the time average of a sample function is equal to the steady-state expectation, with probability 1:
N = lim_{t→∞} N_t = lim_{t→∞} E[N(t)] = EN
Probabilistic Form of Little (contd.)
In principle, we can find the probability distribution of the delay T_i for customer i, and from that the expected value E[T_i], which converges to steady-state:
ET = lim_{i→∞} E[T_i]
For an ergodic system:
T = lim_{i→∞} (1/i) ∑_{j=1}^i T_j = lim_{i→∞} E[T_i] = ET
Probabilistic form of Little’s formula:
EN = λ · ET
where the arrival rate is defined as
λ = lim_{t→∞} E[α(t)] / t
Time vs. Stochastic Averages
“Time averages = stochastic averages” for all systems of interest in this course
It holds if a single sample function of the stochastic process contains all possible realizations of the process as t → ∞
Can be justified on the basis of general properties of Markov chains
Example 0: a single line
For a transmission line:
λ: packet arrival rate
N_Q: average number of packets waiting in queue (i.e., not under transmission)
W: average time spent by a packet waiting in queue (i.e., not including transmission time)
=> N_Q = λW
Similarly, if X is the average transmission time, then the average number of packets under transmission is
ρ = λX
ρ is also called the utilization factor
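A small worked instance with hypothetical numbers:

# Hypothetical line: 50 packets/s arrive; average transmission time
# X = 0.01 s; average queueing wait W = 0.02 s.
lam, X, W = 50.0, 0.01, 0.02
print(lam * W)   # N_Q = 1.0 packet waiting on average
print(lam * X)   # rho = 0.5: the link is transmitting half the time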
Example 1: a network
Given:
A network with packets arriving at n different nodes, with arrival rates λ1, ..., λn respectively
N: average number of packets inside the network
Then:
Average delay per packet (regardless of packet length distribution and routing algorithms) is
T = N / (λ1 + ... + λn)
Applying Little’s theorem to each node i separately gives N_i = λ_i T_i
Example 2: data transport (congestion control)
Consider:
a window flow-control system with a window of size W for each session
λ: per-session packet arrival rate
T: average packet delay in the network
Then W ≥ λT
=> if congestion builds up (i.e., T increases), λ must eventually decrease
Now suppose:
the network is congested but capable of maintaining a delivery rate of λ; then
W ≈ λT
=> increasing W only increases the delay T
Summary
Delay in packet networks
Introduction to queuing theory
A few more points about probability theory
The Poisson process
Little’s Theorem
Homework #7
Problems 3.1, 3.4, and 3.6 of R1
Grading:
Overall points 130
20 points for Prob. 3.1
50 points for Prob. 3.4
60 points for Prob. 3.6