TCOM 501: Networking Theory & Fundamentals
Lecture 2
January 22, 2003
Prof. Yannis A. Korilis
2-2 Topics

- Delay in Packet Networks
- Introduction to Queueing Theory
- Review of Probability Theory
- The Poisson Process
- Little's Theorem
  - Proof and Intuitive Explanation
  - Applications
2-3 Sources of Network Delay

- Processing Delay: assume processing power is not a constraint
- Queueing Delay: time spent buffered waiting for transmission
- Transmission Delay: time to transmit the packet's bits onto the link
- Propagation Delay: time spent on the link (propagation of the electrical signal); independent of the traffic carried by the link

Focus: Queueing & Transmission Delay
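To put rough numbers on these components, here is a small worked example; the packet size, link rate, and distance are our own illustrative choices, not values from the slides:

```python
# Worked example: transmission vs. propagation delay.
# The packet size, link rate, and distance are illustrative choices,
# not values from the lecture.
packet_bits = 1000 * 8          # 1000-byte packet
link_rate = 1e6                 # 1 Mb/s link
distance = 2_000_000.0          # 2000 km, in meters
signal_speed = 2e8              # ~2/3 the speed of light, in m/s

transmission = packet_bits / link_rate   # 8 ms: grows with packet size
propagation = distance / signal_speed    # 10 ms: independent of traffic
print(f"transmission = {transmission*1e3:.1f} ms, propagation = {propagation*1e3:.1f} ms")
```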
2-4 Basic Queueing Model

[Diagram: Arrivals → Buffer (customers queued) → Server(s) (customers in service) → Departures]

A queue models any service station with:
- One or multiple servers
- A waiting area or buffer

Customers arrive to receive service. A customer that upon arrival does not find a free server waits in the buffer.
2-5 Characteristics of a Queue

[Diagram: queue with buffer size b and m servers]

- Number of servers m: one, multiple, infinite
- Buffer size b
- Service discipline (scheduling): FCFS, LCFS, Processor Sharing (PS), etc.
- Arrival process
- Service statistics
2-6 Arrival Process

[Diagram: time axis with customers n and n+1 arriving at times t_n and t_{n+1}, separated by the interarrival time τ_n]

- τ_n: interarrival time between customers n and n+1
- τ_n is a random variable
- {τ_n, n ≥ 1} is a stochastic process
- Interarrival times are identically distributed and have a common mean: E[τ_n] = E[τ] = 1/λ
- λ is called the arrival rate
2-7 Service-Time Process

[Diagram: time axis showing customer n occupying the server for service time s_n]

- s_n: service time of customer n at the server
- {s_n, n ≥ 1} is a stochastic process
- Service times are identically distributed with common mean: E[s_n] = E[s] = 1/μ
- μ is called the service rate
- For packets, are the service times really random?
2-8 Queue Descriptors

Generic descriptor: A/S/m/k

- A denotes the arrival process
  - For Poisson arrivals we use M (for Markovian)
- S denotes the service-time distribution
  - M: exponential distribution
  - D: deterministic service times
  - G: general distribution
- m is the number of servers
- k is the maximum number of customers allowed in the system, either in the buffer or in service
  - k is omitted when the buffer size is infinite
2-9
Queue Descriptors: Examples
M/M/1: Poisson arrivals, exponentially distributed
service times, one server, infinite buffer
 M/M/m: same as previous with m servers
 M/M/m/m: Poisson arrivals, exponentially distributed
service times, m server, no buffering
 M/G/1: Poisson arrivals, identically distributed service
times follows a general distribution, one server, infinite
buffer
 */D/∞ : A constant delay system

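The sketch below (Python, assuming numpy is available; all constants and variable names are our own) simulates an M/M/1 queue via Lindley's recursion for the FCFS waiting time and compares the simulated mean system time with the known M/M/1 value 1/(μ - λ):

```python
# Minimal M/M/1 simulation sketch using Lindley's recursion for the FCFS
# waiting time: W_n = max(0, W_{n-1} + S_{n-1} - I_n), where S is service
# time and I is interarrival time. All names and constants are our own.
import numpy as np

rng = np.random.default_rng(0)
lam, mu, n = 0.8, 1.0, 200_000           # arrival rate, service rate, customers

inter = rng.exponential(1 / lam, n)      # i.i.d. Exp(lam) interarrival times
serv = rng.exponential(1 / mu, n)        # i.i.d. Exp(mu) service times

wait = np.zeros(n)
for i in range(1, n):
    wait[i] = max(0.0, wait[i - 1] + serv[i - 1] - inter[i])

system_time = wait + serv                # delay T = waiting + service
print("simulated E[T]:", system_time.mean())
print("M/M/1 formula :", 1 / (mu - lam))  # E[T] = 1/(mu - lam) for lam < mu
```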
2-10 Probability Fundamentals

- Exponential Distribution
- Memoryless Property
- Poisson Distribution
- Poisson Process
  - Definition and Properties
  - Interarrival Time Distribution
  - Modeling Arrival and Service Statistics
2-11 The Exponential Distribution

A continuous RV X follows the exponential distribution with parameter μ if its probability density function is:

f_X(x) = μ e^{-μx} if x ≥ 0,  and 0 if x < 0

Probability distribution function:

F_X(x) = P{X ≤ x} = 1 - e^{-μx} if x ≥ 0,  and 0 if x < 0
2-12 Exponential Distribution (cont.)

Mean and Variance:

E[X] = 1/μ,  Var(X) = 1/μ²

Proof:

E[X] = ∫_0^∞ x f_X(x) dx = ∫_0^∞ x μ e^{-μx} dx = [-x e^{-μx}]_0^∞ + ∫_0^∞ e^{-μx} dx = 1/μ

E[X²] = ∫_0^∞ x² μ e^{-μx} dx = [-x² e^{-μx}]_0^∞ + 2 ∫_0^∞ x e^{-μx} dx = (2/μ) E[X] = 2/μ²

Var(X) = E[X²] - (E[X])² = 2/μ² - 1/μ² = 1/μ²
2-13 Memoryless Property

Past history has no influence on the future:

P{X > x + t | X > t} = P{X > x}

Proof:

P{X > x + t | X > t} = P{X > x + t, X > t} / P{X > t} = P{X > x + t} / P{X > t}
                     = e^{-μ(x+t)} / e^{-μt} = e^{-μx} = P{X > x}

The exponential is the only continuous distribution with the memoryless property.
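A quick empirical sanity check of the memoryless property (a sketch assuming numpy; the values of μ, x, and t are arbitrary):

```python
# Empirical check of the memoryless property for X ~ Exp(mu):
# P{X > x + t | X > t} should match P{X > x} = e^{-mu x}.
import numpy as np

rng = np.random.default_rng(1)
mu, x, t = 2.0, 0.5, 1.3
X = rng.exponential(1 / mu, 1_000_000)

cond = ((X[X > t] - t) > x).mean()    # conditional survival past t
uncond = (X > x).mean()               # unconditional survival
print(cond, uncond, np.exp(-mu * x))  # all three should be close
```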
2-14 Poisson Distribution

A discrete RV X follows the Poisson distribution with parameter λ if its probability mass function is:

P{X = k} = e^{-λ} λ^k / k!,  k = 0, 1, 2, ...

Wide applicability in modeling the number of random events that occur during a given time interval – the Poisson process:
- Customers that arrive at a post office during a day
- Wrong phone calls received during a week
- Students that go to the instructor's office during office hours
- ... and packets that arrive at a network switch
2-15 Poisson Distribution (cont.)

Mean and Variance:

E[X] = λ,  Var(X) = λ

Proof:

E[X] = ∑_{k=0}^∞ k P{X = k} = e^{-λ} ∑_{k=1}^∞ k λ^k / k! = λ e^{-λ} ∑_{k=1}^∞ λ^{k-1} / (k-1)!
     = λ e^{-λ} ∑_{j=0}^∞ λ^j / j! = λ e^{-λ} e^λ = λ

E[X²] = ∑_{k=0}^∞ k² P{X = k} = e^{-λ} ∑_{k=1}^∞ k λ^k / (k-1)! = λ e^{-λ} ∑_{j=0}^∞ (j+1) λ^j / j!
      = λ e^{-λ} ( ∑_{j=0}^∞ j λ^j / j! + ∑_{j=0}^∞ λ^j / j! ) = λ e^{-λ} (λ e^λ + e^λ) = λ² + λ

Var(X) = E[X²] - (E[X])² = λ² + λ - λ² = λ
2-16 Sum of Poisson Random Variables

- X_i, i = 1, 2, ..., n, are independent RVs
- X_i follows the Poisson distribution with parameter λ_i
- The partial sum is defined as: S_n = X_1 + X_2 + ... + X_n
- S_n follows the Poisson distribution with parameter λ = λ_1 + λ_2 + ... + λ_n
2-17 Sum of Poisson Random Variables (cont.)

Proof: For n = 2; the generalization follows by induction. The pmf of S = X_1 + X_2 is

P{S = m} = ∑_{k=0}^m P{X_1 = k, X_2 = m-k}
         = ∑_{k=0}^m P{X_1 = k} P{X_2 = m-k}
         = ∑_{k=0}^m e^{-λ_1} (λ_1^k / k!) · e^{-λ_2} (λ_2^{m-k} / (m-k)!)
         = e^{-(λ_1+λ_2)} (1/m!) ∑_{k=0}^m (m! / (k!(m-k)!)) λ_1^k λ_2^{m-k}
         = e^{-(λ_1+λ_2)} (λ_1 + λ_2)^m / m!

i.e., Poisson with parameter λ = λ_1 + λ_2.
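The result is easy to check numerically; the sketch below (assuming numpy; the parameters are arbitrary) compares the empirical pmf of X_1 + X_2 with the Poisson(λ_1 + λ_2) pmf:

```python
# Sketch: the sum of independent Poisson variables is Poisson.
# Compare the empirical pmf of X1 + X2 with Poisson(lam1 + lam2).
from math import exp, factorial
import numpy as np

rng = np.random.default_rng(2)
lam1, lam2, n = 1.5, 2.5, 1_000_000
s = rng.poisson(lam1, n) + rng.poisson(lam2, n)

lam = lam1 + lam2
print("mean, var:", s.mean(), s.var())            # both ≈ lam = 4.0
for k in range(5):                                # a few pmf values
    print(k, (s == k).mean(), exp(-lam) * lam**k / factorial(k))
```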
2-18 Sampling a Poisson Variable

- X follows the Poisson distribution with parameter λ
- Each of the X arrivals is of type i with probability p_i, i = 1, 2, ..., n, independently of the other arrivals; p_1 + p_2 + ... + p_n = 1
- X_i denotes the number of type i arrivals
- X_1, X_2, ..., X_n are independent
- X_i follows the Poisson distribution with parameter λ_i = λ p_i
2-19 Sampling a Poisson Variable (cont.)

Proof: For n = 2; generalize by induction. Joint pmf:

P{X_1 = k_1, X_2 = k_2} = P{X_1 = k_1, X_2 = k_2 | X = k_1 + k_2} P{X = k_1 + k_2}
  = C(k_1 + k_2, k_1) p_1^{k_1} p_2^{k_2} · e^{-λ} λ^{k_1+k_2} / (k_1 + k_2)!
  = (1 / (k_1! k_2!)) (λ p_1)^{k_1} (λ p_2)^{k_2} · e^{-λ(p_1+p_2)}
  = e^{-λ p_1} ((λ p_1)^{k_1} / k_1!) · e^{-λ p_2} ((λ p_2)^{k_2} / k_2!)

- X_1 and X_2 are independent
- P{X_1 = k_1} = e^{-λ p_1} (λ p_1)^{k_1} / k_1!,  P{X_2 = k_2} = e^{-λ p_2} (λ p_2)^{k_2} / k_2!

X_i follows the Poisson distribution with parameter λ p_i.
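This "thinning" property can also be checked by simulation (a sketch assuming numpy; λ and p are arbitrary):

```python
# Sketch of Poisson thinning: draw X ~ Poisson(lam), mark each arrival
# type 1 with probability p (type 2 otherwise), and check the type counts.
import numpy as np

rng = np.random.default_rng(3)
lam, p, n = 5.0, 0.3, 500_000
x = rng.poisson(lam, n)
x1 = rng.binomial(x, p)      # type-1 count, Binomial(X, p) given X
x2 = x - x1

print("X1 mean/var:", x1.mean(), x1.var())          # both ≈ lam*p = 1.5
print("X2 mean/var:", x2.mean(), x2.var())          # both ≈ lam*(1-p) = 3.5
print("corr(X1,X2):", np.corrcoef(x1, x2)[0, 1])    # ≈ 0, consistent with independence
```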
2-20 Poisson Approximation to Binomial

Binomial distribution with parameters (n, p):

P{X = k} = C(n,k) p^k (1-p)^{n-k}

As n → ∞ and p → 0, with np = λ moderate, the binomial distribution converges to the Poisson distribution with parameter λ.

Proof:

P{X = k} = C(n,k) p^k (1-p)^{n-k}
         = (n(n-1)···(n-k+1) / k!) (λ/n)^k (1 - λ/n)^{n-k}

Using

n(n-1)···(n-k+1) / n^k → 1,  (1 - λ/n)^n → e^{-λ},  (1 - λ/n)^{-k} → 1,

we obtain

P{X = k} → e^{-λ} λ^k / k!
2-21 Poisson Process with Rate λ

{A(t): t ≥ 0} is a counting process:
- A(t) is the number of events (arrivals) that have occurred from time 0 – when A(0) = 0 – up to time t
- A(t) - A(s) is the number of arrivals in the interval (s, t]
- The numbers of arrivals in disjoint intervals are independent
- The number of arrivals in any interval (t, t+τ] of length τ depends only on its length τ and follows the Poisson distribution with parameter λτ:

P{A(t+τ) - A(t) = n} = e^{-λτ} (λτ)^n / n!,  n = 0, 1, ...

The average number of arrivals in such an interval is λτ; λ is the arrival rate.
2-22 Interarrival-Time Statistics

Interarrival times of a Poisson process are independent and follow the exponential distribution with parameter λ. With t_n the time of the nth arrival and τ_n = t_{n+1} - t_n the nth interarrival time:

P{τ_n ≤ s} = 1 - e^{-λs},  s ≥ 0

Proof: The probability distribution function of τ_n is

P{τ_n ≤ s} = 1 - P{τ_n > s} = 1 - P{A(t_n + s) - A(t_n) = 0} = 1 - e^{-λs}

Independence follows from the independence of the numbers of arrivals in disjoint intervals.
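This gives a standard way to generate a Poisson process: accumulate i.i.d. Exp(λ) gaps. A sketch (assuming numpy; constants arbitrary) that also checks the counts per unit interval:

```python
# Sketch: build a Poisson process by accumulating i.i.d. Exp(lam) gaps, then
# check that counts in unit-length intervals look Poisson(lam).
import numpy as np

rng = np.random.default_rng(4)
lam, horizon = 3.0, 100_000.0
arrivals = np.cumsum(rng.exponential(1 / lam, int(lam * horizon * 1.2)))
arrivals = arrivals[arrivals < horizon]

counts, _ = np.histogram(arrivals, bins=int(horizon), range=(0.0, horizon))
print("mean/var of per-interval counts:", counts.mean(), counts.var())  # both ≈ lam
```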
2-23 Small Interval Probabilities

For an interval (t, t+δ] of length δ:

P{A(t+δ) - A(t) = 0} = 1 - λδ + o(δ)
P{A(t+δ) - A(t) = 1} = λδ + o(δ)
P{A(t+δ) - A(t) ≥ 2} = o(δ)

Proof:

P{A(t+δ) - A(t) = 0} = e^{-λδ} = 1 - λδ + (λδ)²/2 - ··· = 1 - λδ + o(δ)

P{A(t+δ) - A(t) = 1} = e^{-λδ} λδ = λδ (1 - λδ + (λδ)²/2 - ···) = λδ + o(δ)

P{A(t+δ) - A(t) ≥ 2} = 1 - ∑_{k=0}^1 P{A(t+δ) - A(t) = k}
                     = 1 - (1 - λδ + o(δ)) - (λδ + o(δ)) = o(δ)
2-24 Merging & Splitting Poisson Processes

[Diagram, left: streams with rates λ_1 and λ_2 merging into one stream with rate λ_1 + λ_2. Right: a rate-λ stream split with probabilities p and 1-p into streams with rates λp and λ(1-p).]

Merging:
- A_1, ..., A_k independent Poisson processes with rates λ_1, ..., λ_k
- Merged into a single process A = A_1 + ... + A_k
- A is a Poisson process with rate λ = λ_1 + ... + λ_k

Splitting:
- A: Poisson process with rate λ
- Split into processes A_1 and A_2 independently, with probabilities p and 1-p respectively
- A_1 is Poisson with rate λ_1 = λp
- A_2 is Poisson with rate λ_2 = λ(1-p)

A simulation sketch of the merging case follows.
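The sketch below (assuming numpy; constants arbitrary) merges two independent Poisson streams and checks that the merged interarrival gaps look Exp(λ_1 + λ_2):

```python
# Sketch: merge two independent Poisson streams; merged gaps should be
# exponential with rate lam1 + lam2.
import numpy as np

rng = np.random.default_rng(5)
lam1, lam2, n = 1.0, 2.0, 200_000
t1 = np.cumsum(rng.exponential(1 / lam1, n))
t2 = np.cumsum(rng.exponential(1 / lam2, n))

merged = np.sort(np.concatenate([t1, t2]))
merged = merged[merged < min(t1[-1], t2[-1])]   # drop the unbalanced tail
gaps = np.diff(merged)

print("mean gap:", gaps.mean(), "expected:", 1 / (lam1 + lam2))
print("P{gap > 0.5}:", (gaps > 0.5).mean(), "vs", np.exp(-(lam1 + lam2) * 0.5))
```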
2-25 Modeling Arrival Statistics

- The Poisson process is widely used to model packet arrivals in numerous networking problems
- Justification: it provides a good model for the aggregate traffic of a large number of "independent" users
  - n traffic streams, with independent identically distributed (iid) interarrival times with distribution F(s) – not necessarily exponential
  - Arrival rate of each stream: λ/n
  - As n → ∞, the combined stream can be approximated by a Poisson process under mild conditions on F(s) – e.g., F(0) = 0, F'(0) > 0
- Most important reason for the Poisson assumption: analytic tractability of queueing models
2-26 Little's Theorem

[Diagram: system containing N customers on average, arrival rate λ, average delay T]

- λ: customer arrival rate
- N: average number of customers in system
- T: average delay per customer in system

Little's Theorem: for a system in steady-state,

N = λT
2-27 Counting Processes of a Queue

[Diagram: α(t) and β(t) as staircase curves over time, with N(t) the vertical gap between them]

- N(t): number of customers in system at time t
- α(t): number of customer arrivals up to time t
- β(t): number of customer departures up to time t
- T_i: time spent in system by the ith customer
Time Averages
2-28


Time average over interval [0,t]
Steady state time averages
Nt
lt
Tt
dt
1 t
  N ( s )ds
t 0
a (t )

t
1 a (t )

Ti

a (t ) i 1
b (t )

t
N  lim N t
t 
l  lim lt
t 
T  lim Tt
t 
d  lim d t
t 


Little’s theorem N=λT
Applies to any queueing system
provided that:
Limits T, λ, and d exist, and
λ= d
We give a simple graphical proof
under a set of more restrictive
assumptions
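The sketch below (assuming numpy; constants arbitrary) checks N_t ≈ λ_t T_t on a simulated M/M/1 path, reusing the Lindley recursion from the earlier sketch:

```python
# Check Little's law in time-average form on one M/M/1 sample path.
import numpy as np

rng = np.random.default_rng(6)
lam, mu, n = 0.7, 1.0, 200_000
inter = rng.exponential(1 / lam, n)
serv = rng.exponential(1 / mu, n)

wait = np.zeros(n)
for i in range(1, n):
    wait[i] = max(0.0, wait[i - 1] + serv[i - 1] - inter[i])

T = wait + serv                             # per-customer system times
arrive = np.cumsum(inter)
t_end = (arrive + T).max()                  # time of last departure

lambda_t = n / t_end                        # alpha(t)/t at t = t_end
T_t = T.mean()                              # (1/alpha(t)) * sum of T_i
N_t = T.sum() / t_end                       # (1/t) * integral of N(s);
                                            # the area under N equals sum of T_i
print("N_t =", N_t, " lambda_t * T_t =", lambda_t * T_t)
```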
2-29 Proof of Little's Theorem for FCFS

[Diagram: staircase curves α(t) above β(t); the shaded area between them consists of horizontal strips of lengths T_1, T_2, ...]

- FCFS system, N(0) = 0
- α(t) and β(t): staircase graphs
- N(t) = α(t) - β(t)
- Shaded area between the graphs: S(t) = ∫_0^t N(s) ds

Assumption: N(t) = 0 infinitely often. For any such t,

∫_0^t N(s) ds = ∑_{i=1}^{α(t)} T_i
⟹ (1/t) ∫_0^t N(s) ds = (α(t)/t) · (1/α(t)) ∑_{i=1}^{α(t)} T_i
⟹ N_t = λ_t T_t

If the limits N_t → N, T_t → T, λ_t → λ exist, Little's formula follows. We will relax the last assumption.
2-30 Proof of Little's for FCFS (cont.)

[Diagram: same staircase construction as on the previous slide]

In general – even if the queue is not empty infinitely often:

∑_{i=1}^{β(t)} T_i ≤ ∫_0^t N(s) ds ≤ ∑_{i=1}^{α(t)} T_i
⟹ (β(t)/t) · (1/β(t)) ∑_{i=1}^{β(t)} T_i ≤ (1/t) ∫_0^t N(s) ds ≤ (α(t)/t) · (1/α(t)) ∑_{i=1}^{α(t)} T_i
⟹ δ_t T_t ≤ N_t ≤ λ_t T_t  (with the average on the left taken over the β(t) departed customers)

The result follows assuming the limits T_t → T, λ_t → λ, and δ_t → δ exist, and λ = δ.
2-31 Probabilistic Form of Little's Theorem

- So far we have considered a single sample function of a stochastic process
- Now we focus on the probabilities of the various sample functions of a stochastic process
- Probability of n customers in system at time t:

p_n(t) = P{N(t) = n}

- Expected number of customers in system at time t:

E[N(t)] = ∑_{n=0}^∞ n P{N(t) = n} = ∑_{n=0}^∞ n p_n(t)
2-32 Probabilistic Form of Little (cont.)

- p_n(t) and E[N(t)] depend on t and on the initial distribution at t = 0
- We will consider systems that converge to steady-state: there exist p_n, independent of the initial distribution, such that

lim_{t→∞} p_n(t) = p_n,  n = 0, 1, ...

- Expected number of customers in steady-state (stochastic average):

EN = ∑_{n=0}^∞ n p_n = lim_{t→∞} E[N(t)]

- For an ergodic process, the time average of a sample function is equal to the steady-state expectation, with probability 1:

N = lim_{t→∞} N_t = lim_{t→∞} E[N(t)] = EN
2-33 Probabilistic Form of Little (cont.)

- In principle, we can find the probability distribution of the delay T_i for customer i, and from that the expected value E[T_i], which converges to steady-state:

ET = lim_{i→∞} E[T_i]

- For an ergodic system:

T = lim_{i→∞} (1/i) ∑_{j=1}^i T_j = lim_{i→∞} E[T_i] = ET

- Probabilistic form of Little's formula:

EN = λ · ET

with the arrival rate defined as

λ = lim_{t→∞} E[α(t)] / t
2-34 Time vs. Stochastic Averages

- "Time averages = stochastic averages" for all systems of interest in this course
- This holds if a single sample function of the stochastic process contains all possible realizations of the process as t → ∞
- It can be justified on the basis of general properties of Markov chains
2-35 Moment Generating Function

1. Definition: for any t ∈ ℝ:

M_X(t) = E[e^{tX}] = ∫_{-∞}^∞ e^{tx} f_X(x) dx  (X continuous)
M_X(t) = E[e^{tX}] = ∑_j e^{t x_j} P{X = x_j}  (X discrete)

2. If the moment generating function M_X(t) of X exists and is finite in some neighborhood of t = 0, it determines the distribution of X uniquely.

3. Fundamental properties: for any n ∈ ℕ:

(i)  (d^n/dt^n) M_X(t) = E[X^n e^{tX}]
(ii) (d^n/dt^n) M_X(t) |_{t=0} = E[X^n]

4. Moment generating functions and independence:

X, Y independent ⟹ M_{X+Y}(t) = M_X(t) M_Y(t)

The converse is not true.
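Property 3(ii) is easy to exercise symbolically; the sketch below (assuming sympy is installed) recovers the mean and variance of the exponential distribution from its MGF λ/(λ - t):

```python
# Recover moments from an MGF by differentiating at t = 0 (property 3(ii)),
# for the exponential MGF M(t) = lam/(lam - t).
import sympy as sp

t, lam = sp.symbols("t lam", positive=True)
M = lam / (lam - t)                       # MGF of Exp(lam), valid for t < lam

m1 = sp.simplify(sp.diff(M, t, 1).subs(t, 0))   # E[X]   -> 1/lam
m2 = sp.simplify(sp.diff(M, t, 2).subs(t, 0))   # E[X^2] -> 2/lam^2
print(m1, m2, sp.simplify(m2 - m1**2))          # variance -> 1/lam^2
```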
2-36 Discrete Random Variables

Distribution (parameters) | PMF P{X = k}                                  | MGF M_X(t)                  | Mean E[X] | Variance Var(X)
--------------------------|-----------------------------------------------|-----------------------------|-----------|----------------
Binomial (n, p)           | C(n,k) p^k (1-p)^{n-k},  k = 0, 1, ..., n     | (p e^t + 1 - p)^n           | np        | np(1-p)
Geometric (p)             | (1-p)^{k-1} p,  k = 1, 2, ...                 | p e^t / (1 - (1-p) e^t)     | 1/p       | (1-p)/p^2
Negative Bin. (r, p)      | C(k-1, r-1) p^r (1-p)^{k-r},  k = r, r+1, ... | [p e^t / (1 - (1-p) e^t)]^r | r/p       | r(1-p)/p^2
Poisson (λ)               | e^{-λ} λ^k / k!,  k = 0, 1, ...               | e^{λ(e^t - 1)}              | λ         | λ
2-37 Continuous Random Variables

Distribution (parameters) | PDF f_X(x)                               | MGF M_X(t)                 | Mean E[X] | Variance Var(X)
--------------------------|------------------------------------------|----------------------------|-----------|----------------
Uniform (a, b)            | 1/(b-a),  a < x < b                      | (e^{tb} - e^{ta})/(t(b-a)) | (a+b)/2   | (b-a)^2/12
Exponential (λ)           | λ e^{-λx},  x ≥ 0                        | λ/(λ-t)                    | 1/λ       | 1/λ^2
Normal (μ, σ^2)           | (1/√(2πσ^2)) e^{-(x-μ)^2/(2σ^2)},  x ∈ ℝ | e^{μt + σ^2 t^2/2}         | μ         | σ^2