Probability Review
Thinh Nguyen
Probability Theory Review
• Sample space
• Bayes' Rule
• Independence
• Expectation
• Distributions
Sample Space - Events
• Sample point: the outcome of a random experiment
• Sample space S: the set of all possible outcomes (discrete or continuous)
• Events: a set of outcomes, thus a subset of S
  - Certain, impossible, and elementary events
Set Operations
• Union: A ∪ B
• Intersection: A ∩ B
• Complement: Aᶜ
• Properties
  - Commutation: A ∪ B = B ∪ A
  - Associativity: A ∪ (B ∪ C) = (A ∪ B) ∪ C
  - Distribution: A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C)
  - De Morgan's Rule: (A ∪ B)ᶜ = Aᶜ ∩ Bᶜ
[Venn diagrams of A, B, and Aᶜ within S]
Axioms and Corollaries
• Axioms
  - 0 ≤ P(A)
  - P(S) = 1
  - If A ∩ B = ∅, then P(A ∪ B) = P(A) + P(B)
  - If A1, A2, … are pairwise exclusive,
    P(∪_{k=1}^∞ Ak) = Σ_{k=1}^∞ P(Ak)
• Corollaries
  - P(Aᶜ) = 1 - P(A)
  - P(A) ≤ 1
  - P(∅) = 0
  - P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
Conditional Probability
• Conditional probability of event A given that event B has occurred:
  P(A | B) = P(A ∩ B) / P(B)
• If B1, B2, …, Bn is a partition of S, then
  P(A) = P(A | B1) P(B1) + … + P(A | Bn) P(Bn)
  (Law of Total Probability)
[Venn diagrams: A ∩ B within S; partition B1, B2, B3 of S intersecting A]
Bayes' Rule
• If B1, …, Bn is a partition of S, then
  P(Bj | A) = P(A ∩ Bj) / P(A) = P(A | Bj) P(Bj) / Σ_{k=1}^{n} P(A | Bk) P(Bk)
• posterior = (likelihood × prior) / evidence
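As a quick numerical illustration of the rule (a minimal Python sketch; the two-event partition and its probabilities are made-up numbers, not from the slides):

```python
# Bayes' rule with a two-event partition B1, B2 of S.
# Hypothetical numbers: P(B1)=0.01, P(B2)=0.99, P(A|B1)=0.95, P(A|B2)=0.05.
priors = {"B1": 0.01, "B2": 0.99}
likelihoods = {"B1": 0.95, "B2": 0.05}          # P(A | Bk)

# Law of total probability: P(A) = sum_k P(A|Bk) P(Bk)  (the "evidence")
evidence = sum(likelihoods[k] * priors[k] for k in priors)

# Posterior: P(Bk | A) = P(A|Bk) P(Bk) / P(A)
posteriors = {k: likelihoods[k] * priors[k] / evidence for k in priors}
print(evidence)     # 0.059
print(posteriors)   # {'B1': ~0.161, 'B2': ~0.839}
```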
Event Independence
• Events A and B are independent if P(A ∩ B) = P(A) P(B)
• If two events have non-zero probability and are mutually exclusive, then they cannot be independent
Random Variables
Random Variables
• The notion of a random variable
  - The outcome is not always a number
  - Assign a numerical value to the outcome of the experiment
• Definition: a function X which assigns a real number X(ζ) to each outcome ζ in the sample space of a random experiment
[Diagram: ζ in S mapped to X(ζ) = x on the real line Sx]
Cumulative Distribution Function
• Defined as the probability of the event {X ≤ x}:
  F_X(x) = P(X ≤ x)
• Properties
  - 0 ≤ F_X(x) ≤ 1
  - lim_{x→∞} F_X(x) = 1
  - lim_{x→-∞} F_X(x) = 0
  - If a ≤ b, then F_X(a) ≤ F_X(b)
  - P(a < X ≤ b) = F_X(b) - F_X(a)
  - P(X > x) = 1 - F_X(x)
[Plot: staircase CDF F_X(x) rising through ¼, ½, ¾, 1 at x = 0, 1, 2, 3]
Types of Random Variables
• Continuous: probability density function
  f_X(x) = dF_X(x)/dx
  F_X(x) = ∫_{-∞}^{x} f_X(t) dt
• Discrete: probability mass function
  P_X(x_k) = P(X = x_k)
  F_X(x) = Σ_k P_X(x_k) u(x - x_k)
Probability Density Function
• The pdf is computed from
  f_X(x) = dF_X(x)/dx
• Properties
  - P(a ≤ X ≤ b) = ∫_a^b f_X(x) dx
  - F_X(x) = ∫_{-∞}^{x} f_X(t) dt
  - 1 = ∫_{-∞}^{∞} f_X(t) dt
• For a discrete r.v.:
  f_X(x) = Σ_k P_X(x_k) δ(x - x_k)
• P(x < X ≤ x + dx) ≈ f_X(x) dx
[Plot: pdf f_X(x) with a strip of width dx at x]
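The pdf/CDF relationship above can be checked numerically. The sketch below assumes an exponential density with a made-up rate λ = 2 and integrates it with a simple trapezoidal rule:

```python
import math

lam = 2.0                                   # rate parameter of an exponential r.v. (assumed)
f = lambda t: lam * math.exp(-lam * t)      # pdf f_X(t)
F = lambda x: 1 - math.exp(-lam * x)        # closed-form CDF F_X(x)

def cdf_numeric(x, steps=100000):
    """F_X(x) = integral of f_X(t) dt from 0 to x (trapezoidal rule)."""
    dt = x / steps
    return sum(0.5 * (f(i * dt) + f((i + 1) * dt)) * dt for i in range(steps))

print(cdf_numeric(1.0), F(1.0))             # both ~0.8647
print(F(1.0) - F(0.5))                      # P(0.5 <= X <= 1.0) ~0.2325
```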
Expected Value and Variance
• The expected value or mean of X is
  E[X] = ∫_{-∞}^{∞} t f_X(t) dt   (continuous)
  E[X] = Σ_k x_k P_X(x_k)          (discrete)
• The variance of X is
  Var(X) = σ² = E[(X - E[X])²]
• The standard deviation of X is
  Std(X) = σ = √Var(X)
• Properties
  - E[c] = c, E[cX] = c E[X], E[X + c] = E[X] + c
  - Var(c) = 0, Var(cX) = c² Var(X), Var(X + c) = Var(X)
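A small Python check of these properties on a hypothetical discrete pmf (the values and probabilities below are illustrative only):

```python
# Discrete r.v. X with a made-up pmf; verify the expectation/variance properties.
xs = [0, 1, 2, 3]
px = [0.1, 0.2, 0.3, 0.4]

E = lambda vals: sum(v * p for v, p in zip(vals, px))        # E[g(X)] for values g(xk)
mean = E(xs)                                                 # E[X]   = 2.0
var  = E([(x - mean) ** 2 for x in xs])                      # Var(X) = 1.0

c = 5.0
print(E([c * x for x in xs]), c * mean)                      # E[cX] = c E[X]
print(E([x + c for x in xs]), mean + c)                      # E[X+c] = E[X] + c
print(E([(c * x - c * mean) ** 2 for x in xs]), c**2 * var)  # Var(cX) = c^2 Var(X)
```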
Queuing Theory
Example
• Send a file over the Internet: packets from the modem card wait in a buffer before transmission over a fixed-rate link of capacity C.
[Figure: packet path from one place to another over the link]
Delay Models
[Figure: delay components between A and B over time: computation (queuing), transmission, and propagation]
Queue Model
Practical Example
• Multiserver queue
• Multiple single-server queues
• Standard deviation impact on queueing time
Queuing Theory
• The theoretical study of waiting lines, expressed in mathematical terms
[Figure: input → queue → server → output]
• Delay = queue time + service time
The Problem
• Given
  - One or more servers that render the service
  - A (possibly infinite) pool of customers
  - Some description of the arrival and service processes
• Describe the dynamics of the system
• Evaluate its performance
• If there is more than one queue for the server(s), there may also be some policy regarding queue changes for the customers.
Common Assumptions
• The queue is FCFS (FIFO).
• We look at steady state: after the system has started up and things have settled down.
• State = a vector indicating the total number of customers in each queue at a particular time instant (all the information necessary to completely describe the system)
Notation for queuing systems
A/B/c/d
• A = the interarrival time distribution
• B = the service time distribution
• c = the number of servers
• d = the queue size limit (omitted if infinite)
• Where A and B can be:
  - D for Deterministic distribution
  - M for Markovian (exponential) distribution
  - G for General (arbitrary) distribution
The M/M/1 System
[Figure: Poisson process → queue → exponential server → output]
Arrivals follow a Poisson process
• Readily amenable to analysis
• Reasonable for a wide variety of situations
• a(t) = number of arrivals in the time interval [0, t]
• λ = mean arrival rate
• Divide [0, t] into slots of width δ: t = kδ, k = 0, 1, …, with δ → 0
  - Pr(exactly 1 arrival in [t, t+δ]) = λδ
  - Pr(no arrivals in [t, t+δ]) = 1 - λδ
  - Pr(more than 1 arrival in [t, t+δ]) ≈ 0
• Pr(a(t) = n) = e^{-λt} (λt)^n / n!
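The slotted model above can be simulated directly and compared with the Poisson formula. This is a rough sketch with arbitrary λ and t, using only the standard library:

```python
import math, random

lam, t = 3.0, 2.0                        # assumed mean arrival rate and observation window

def poisson_pmf(n):                      # Pr(a(t) = n) = e^{-lam t} (lam t)^n / n!
    return math.exp(-lam * t) * (lam * t) ** n / math.factorial(n)

def simulate(delta=0.005):
    # Split [0, t] into slots of width delta; each slot holds one arrival
    # with probability lam*delta, as in the slotted model on the slide.
    slots = int(t / delta)
    return sum(random.random() < lam * delta for _ in range(slots))

samples = [simulate() for _ in range(5000)]
for n in range(4):                       # the two columns should roughly agree
    print(n, round(poisson_pmf(n), 4), samples.count(n) / len(samples))
```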
Model for Interarrivals and Service times
• Customers arrive at times t0 < t1 < … (Poisson distributed)
• The differences between consecutive arrivals are the interarrival times: τn = tn - t(n-1)
• In a Poisson process with mean arrival rate λ, the τn are exponentially distributed:
  Pr(τn ≤ t) = 1 - e^{-λt}
• Service times are exponentially distributed, with mean service rate μ:
  Pr(Sn ≤ s) = 1 - e^{-μs}
System Features
• Service times are independent, and independent of the arrivals
• Both interarrival and service times are memoryless:
  Pr(Tn > t0 + t | Tn > t0) = Pr(Tn > t)
• Future events depend only on the present state → this is a Markovian system
Exponential Distribution
Given an arrival at time x:
P((τ - x) ≤ t | τ > x) = P(x < τ ≤ x + t) / P(τ > x)
                       = [P(τ ≤ x + t) - P(τ ≤ x)] / [1 - P(τ ≤ x)]
                       = [(1 - e^{-λ(x+t)}) - (1 - e^{-λx})] / e^{-λx}
                       = e^{-λx}(1 - e^{-λt}) / e^{-λx}
                       = 1 - e^{-λt}
Same as the probability starting at time 0.
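A Monte Carlo check of the memorylessness property, with arbitrary λ, x, and t (assumes Python's random.expovariate for exponential samples):

```python
import math, random

lam, x, t = 1.5, 2.0, 0.7                                 # assumed rate and offsets
samples = [random.expovariate(lam) for _ in range(200000)]

survived = [s for s in samples if s > x]                  # condition on tau > x
cond = sum(s <= x + t for s in survived) / len(survived)  # P(tau <= x+t | tau > x)
print(cond, 1 - math.exp(-lam * t))                       # both ~0.65
```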
Markov Models
[Figure: buffer occupancy moves from state n to n+1 on an arrival and to n-1 on a departure during Δt]
Probability of being in state n:
P_n(t + Δt) = P_n(t)[(1 - λΔt)(1 - μΔt) + λΔt·μΔt]
            + P_{n+1}(t)[μΔt(1 - λΔt)]
            + P_{n-1}(t)[λΔt(1 - μΔt)]
As Δt → 0, by Taylor series:
P_n(t + Δt) ≈ P_n(t) + (dP_n(t)/dt)Δt
Steady State Analysis
Substituting for P_n(t + Δt):
(λ + μ)P_n = λP_{n-1} + μP_{n+1}
Steady state: λP_0 = μP_1
Markov Chains
[Figure: birth-death chain 0 ↔ 1 ↔ … ↔ n-1 ↔ n ↔ n+1, with arrival rate λ and departure rate μ between neighboring states]
• Rate leaving state n = P_n(λ + μ)
• Rate arriving at state n = λP_{n-1} + μP_{n+1}
• Steady state: P_n(λ + μ) = λP_{n-1} + μP_{n+1}
• State 0: λP_0 = μP_1
Substituting Utilization ρ = λ/μ
P_1 = (λ/μ)P_0 = ρP_0
P_2 = ((λ + μ)/μ)P_1 - (λ/μ)P_0
    = ρP_1 + P_1 - ρP_0 = P_1(ρ + 1) - ρP_0
Substituting P_1:
P_2 = ρP_0(ρ + 1) - ρP_0 = ρ²P_0 + ρP_0 - ρP_0 = ρ²P_0
P_n = ρ^n P_0
• Higher states have decreasing probability
• Higher utilization causes higher probability of higher states
What about P_0?
Σ_{n=0}^∞ P_n = 1  →  P_0 Σ_{n=0}^∞ ρ^n = P_0/(1 - ρ) = 1  →  P_0 = 1 - ρ
P_n = (1 - ρ)ρ^n
E(n), Average Queue Size
• The queue is determined by ρ:
  q = E(n) = Σ_{n=0}^∞ n P_n = Σ_{n=0}^∞ n(1 - ρ)ρ^n = ρ/(1 - ρ)
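Both results can be verified numerically by truncating the infinite sums; the utilizations below match the buffer-selection table that follows:

```python
# M/M/1 steady-state occupancy: Pn = (1 - rho) * rho**n
def p(n, rho):
    return (1 - rho) * rho ** n

for rho in (0.25, 0.5, 0.75, 0.9):
    # E(n) = sum n*Pn, truncated far enough out to converge; equals rho/(1 - rho)
    avg = sum(n * p(n, rho) for n in range(10000))
    print(rho, round(avg, 3), rho / (1 - rho))   # 1/3, 1, 3, 9
```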
Selecting Buffers
ρ:    0.25   0.5   0.75   0.9
E(N): 1/3    1     3      9
For large utilization, the average queue size E(N) = ρ/(1 - ρ) grows rapidly.
Throughput
• Throughput = utilization / service time = ρ/Ts
• For ρ = 0.5 and Ts = 1 ms, throughput is 500 packets/sec
Intuition on Little's Law
• If a typical customer spends T time units, on average, in the system, then the number of customers left behind by that typical customer is equal to λT.
• q = λTq
Applying Little's Law
M/M/1 Average Delay
E(n) = λE(T), or w = λTw, or q = λTq
E(T) = E(n)/λ = ρ/(λ(1 - ρ)) = 1/(μ(1 - ρ)) = 1/(μ - λ)
Ts = 1/μ, so E(T) = (λ/μ)/(λ(1 - ρ)) = Ts/(1 - ρ)
Probability of Overflow
P(n > N) = Σ_{n=N+1}^∞ p_n = (1 - ρ) Σ_{n=N+1}^∞ ρ^n = ρ^{N+1}
Buffer with N Packets
Σ_{n=0}^N p_n = 1  →  p_0 Σ_{n=0}^N ρ^n = p_0 (1 - ρ^{N+1})/(1 - ρ) = 1
p_0 = (1 - ρ)/(1 - ρ^{N+1})  and  p_n = (1 - ρ)ρ^n/(1 - ρ^{N+1})
p_N = (1 - ρ)ρ^N/(1 - ρ^{N+1})  →  1/(N + 1)  as ρ → 1
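A short sketch of the finite-buffer distribution, with arbitrary ρ and N, checking that the probabilities sum to 1 and comparing p_N with its ρ → 1 limit:

```python
# Finite M/M/1/N queue: pn = (1 - rho) * rho**n / (1 - rho**(N + 1))
def p_finite(n, rho, N):
    return (1 - rho) * rho ** n / (1 - rho ** (N + 1))

rho, N = 0.9, 10                                          # assumed utilization and buffer size
print(sum(p_finite(n, rho, N) for n in range(N + 1)))     # sums to 1
print(p_finite(N, rho, N))                                # probability the buffer is full, ~0.051
print(1 / (N + 1))                                        # for comparison: the rho -> 1 limit
```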
Example
• Given
  - Arrival rate of 1000 packets/sec
  - Service rate of 1100 packets/sec
• Find
  - Utilization: ρ = λ/μ = 1000/1100 ≈ 0.91
  - Probability of having 4 packets in the queue: P_4 = (1 - ρ)ρ^4 ≈ 0.062
    (P_1 ≈ 0.082, P_2 ≈ 0.075, P_3 ≈ 0.068, P_5 ≈ 0.056)
Example (continued)
• E(n) = ρ/(1 - ρ) = 10
• With infinite buffers: P(n > 12) = ρ^13 ≈ 0.29
• With 12 fixed buffers, the cell loss probability is
  P_12 = (1 - ρ)ρ^12/(1 - ρ^13) ≈ 0.04
• P_n ≈ 0.11, 0.10, 0.09, 0.09, 0.08, 0.07, 0.07, 0.06, 0.05, 0.05, 0.04
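The numbers in the two example slides can be reproduced in a few lines:

```python
lam, mu = 1000.0, 1100.0
rho = lam / mu                                       # utilization ~0.909

pn = lambda n: (1 - rho) * rho ** n                  # infinite-buffer occupancy
print([round(pn(n), 3) for n in range(1, 6)])        # P1..P5, compare with the slide
print(rho / (1 - rho))                               # E(n) = 10
print(rho ** 13)                                     # P(n > 12) ~0.29
N = 12
print((1 - rho) * rho ** N / (1 - rho ** (N + 1)))   # loss probability with 12 buffers, ~0.04
```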
Application to Statistical Multiplexing
• Consider one transmission line with rate R shared by N traffic streams.
• Time-division multiplexing: divide the capacity of the transmitter into N channels, each with rate R/N. Each channel then behaves as an M/M/1 queue with arrival rate λ/N and service rate μ/N, so
  T = 1/(μ/N - λ/N) = N/(μ - λ)
• Statistical multiplexing: buffer the packets coming from the N streams into a single buffer and transmit them one at a time over the full rate R:
  T′ = 1/(μ - λ) = T/N
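A numerical comparison of the two schemes, with made-up total rates λ, μ and N streams:

```python
lam, mu, N = 800.0, 1000.0, 10        # assumed total arrival rate, total service rate, streams

T_tdm  = 1 / (mu / N - lam / N)       # each TDM channel is M/M/1 with rates lam/N, mu/N
T_stat = 1 / (mu - lam)               # one shared M/M/1 queue at the full rate
print(T_tdm, T_stat, T_tdm / T_stat)  # TDM delay is N times larger
```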
Network of M/M/1 Queues
[Figure: flows 1, 2, 3 routed through a network of queues]
λ̄_1 = λ_1 + λ_2
λ̄_2 = λ_1 + λ_2 + λ_3
λ̄_3 = λ_1 + λ_3
λ = λ_1 + λ_2 + λ_3
L_i = λ̄_i/(μ_i - λ̄_i)
L = L_1 + L_2 + L_3
T = (1/λ) Σ_{i=1}^J λ̄_i/(μ_i - λ̄_i)
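Applying the formula to hypothetical per-queue rates (the aggregate rates and service rates below are illustrative, not taken from the figure):

```python
# Hypothetical per-queue aggregate arrival rates (lambda_bar_i) and service rates (mu_i).
lam_bar = [30.0, 45.0, 20.0]
mu      = [50.0, 60.0, 40.0]
lam_tot = 45.0                                    # total external arrival rate lambda

L = [l / (m - l) for l, m in zip(lam_bar, mu)]    # Li = mean number in queue i
T = sum(L) / lam_tot                              # T = (1/lambda) * sum_i Li
print(L, sum(L), T)
```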
M/G/1 Queue
• Assume that every customer in the queue pays at rate R when his or her remaining service time is equal to R.
• At a given time t, the customers then pay at a rate equal to the sum of the remaining service times of all the customers in the queue. The queue being first-come-first-served, this sum is equal to the queueing time of a customer who would enter the queue at time t.
• The customers pay at rate λC, since each customer pays C on the average and λ customers go through the queue per unit time.
• Total cost paid by a customer (S: service time, Q: queuing time):
  SQ + ∫_0^S t dt = SQ + S²/2
• Expected cost paid by each customer:
  C = E[Q]E[S] + E[S²]/2
• Equating the two expressions for the payment rate, E[Q] = λC, gives
  E[Q] = λ(E[Q]E[S] + E[S²]/2)  →  E[Q] = λE[S²]/(2(1 - ρ))
• Average delay:
  T = E[Q] + 1/μ
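The Pollaczek-Khinchine result above can be evaluated directly; the sketch below compares exponential service (E[S²] = 2/μ²) with deterministic service (E[S²] = 1/μ²):

```python
def pk_queueing_delay(lam, ES, ES2):
    """Pollaczek-Khinchine: E[Q] = lam * E[S^2] / (2 * (1 - rho)), with rho = lam * E[S]."""
    rho = lam * ES
    return lam * ES2 / (2 * (1 - rho))

lam, mu = 0.8, 1.0                                # assumed arrival and service rates
ES = 1 / mu
EQ_mm1 = pk_queueing_delay(lam, ES, 2 / mu**2)    # exponential service: E[S^2] = 2/mu^2
EQ_md1 = pk_queueing_delay(lam, ES, 1 / mu**2)    # deterministic service: E[S^2] = 1/mu^2
print(EQ_mm1 + ES, EQ_md1 + ES)                   # T = E[Q] + 1/mu; deterministic service waits less
```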