MATH 3033 based on
Dekking et al. A Modern Introduction to Probability and Statistics. 2007
Slides by Gautam Shankar
Format by Tim Birbeck
Instructor Longin Jan Latecki
C12: The Poisson process
12.1 – Random Points

Poisson process model: often applies in situations where there is a very
large population, and each member of the population has a very small
probability of producing a point of the process.

Examples of random points: arrival times of email messages at a server,
the times at which asteroids hit the earth, arrival times of radioactive particles at a
Geiger counter, times at which your computer crashes, the times at which
electronic components fail, and arrival times of people at a pump in an oasis.
12.2 – Taking a closer look at random arrivals
Example: telephone call arrival times
Calls arrive at random times X1, X2, X3, …
Homogeneity (also known as weak stationarity): the rate λ at which arrivals
occur is constant over time; in a subinterval of length u the expected number
of telephone calls is λ · u.
Independence: the numbers of arrivals in disjoint time intervals are
independent random variables.

N(I) = total number of calls in an interval I
N([0, t]) is written as Nt
E[Nt] = λ t

Divide the interval [0, t] into n intervals, each of length t/n
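Not part of the original slides: a minimal numerical sketch (assuming numpy is available; all parameter values are illustrative) of the two assumptions, using the "large population, small probability" model of Section 12.1. It checks that the mean number of arrivals in a subinterval of length u is close to λ · u, and that counts in disjoint intervals are essentially uncorrelated.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (not from the slides): a large population of M members,
# each producing a point with small probability q at a uniform random time in [0, T].
M, q, T = 10_000, 0.01, 10.0
lam = M * q / T                 # effective arrival rate per unit time (here 10.0)
u = 2.0                         # length of the first subinterval

counts_a, counts_b = [], []
for _ in range(2000):
    times = rng.uniform(0.0, T, size=M)        # potential arrival times
    points = times[rng.random(M) < q]          # the points actually produced
    counts_a.append(np.sum(points <= u))                        # arrivals in [0, u]
    counts_b.append(np.sum((points > u) & (points <= 2 * u)))   # arrivals in (u, 2u]

print("mean count in [0, u]:", np.mean(counts_a), " lambda * u =", lam * u)
print("correlation of counts in the two disjoint intervals:",
      np.corrcoef(counts_a, counts_b)[0, 1])   # close to 0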

12.2 – Taking a closer look at random arrivals
When n is large enough, every interval Ij,n = ((j−1)t/n, jt/n] will contain
either 0 or 1 arrival.
For such a large n (n > λt), let
Rj = number of arrivals in the time interval Ij,n;
then Rj has a Ber(pj) distribution for some pj.
Recall that for a Bernoulli random variable
E[Rj] = 0 · (1 − pj) + 1 · pj = pj.
By the homogeneity assumption (see previous slide), for each j
pj = λ · length of Ij,n = λt/n.
Total number of calls:
Nt = R1 + R2 + … + Rn.
By the independence assumption (see previous slide), the Rj are independent
random variables, so Nt has a Bin(n, p) distribution with p = λt/n.
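Not in the slides: letting n → ∞ here, so that p = λt/n → 0 while n · p = λt stays fixed, the Bin(n, p) distribution converges to the Poisson distribution with parameter λt, which motivates the definition on the next slide. A small sketch (assuming scipy is available; λ and t are illustrative choices) comparing the two probability mass functions as n grows:

import numpy as np
from scipy.stats import binom, poisson

lam, t = 2.0, 3.0                # illustrative rate and time horizon (assumptions)
mu = lam * t                     # expected number of calls in [0, t]
k = np.arange(0, 16)

for n in (10, 100, 10_000):
    p = mu / n                   # p = lambda * t / n
    gap = np.max(np.abs(binom.pmf(k, n, p) - poisson.pmf(k, mu)))
    print(f"n = {n:6d}: max difference between Bin(n, p) and Pois(mu) pmfs = {gap:.5f}")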

12.2 – Taking a closer look at random arrivals
Definition: A discrete random variable X has a Poisson distribution with
parameter µ, where µ > 0, if its probability mass function p is given by
p(k) = P(X = k) = e^(−µ) µ^k / k!   for k = 0, 1, 2, …
We denote this distribution by Pois(µ)
The expectation and variance of a Poisson distribution:
Let X have a Poisson distribution with parameter µ; then
E[X] = µ and Var(X) = µ
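Not part of the slides: a quick check of the definition using only the Python standard library (the value of µ is an arbitrary choice), computing the pmf directly from the formula and summing it to confirm numerically that both the mean and the variance equal µ.

from math import exp, factorial

mu = 4.0                                         # illustrative parameter (assumption)

def pmf(k):
    # p(k) = P(X = k) = e^(-mu) * mu^k / k!
    return exp(-mu) * mu**k / factorial(k)

ks = range(0, 60)                                # the tail beyond k = 60 is negligible for mu = 4
total = sum(pmf(k) for k in ks)                  # ~ 1
mean = sum(k * pmf(k) for k in ks)               # ~ mu
var = sum((k - mean) ** 2 * pmf(k) for k in ks)  # ~ mu

print(total, mean, var)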
12.3 – The one-dimensional Poisson process
Interarrival Times
The differences Ti = Xi − Xi−1 are called interarrival times.
Since the event {T1 > t} occurs precisely when there are no arrivals in [0, t],
and Nt has a Pois(λt) distribution, this implies that
P(T1 ≤ t) = 1 − P(T1 > t) = 1 − P(Nt = 0) = 1 − e^(−λt)
Therefore T1 has an exponential distribution with parameter λ
P(T2 > t | T1 = s) = P(no arrivals in (s, s + t] | T1 = s)
                   = P(no arrivals in (s, s + t])
                   = P(N((s, s + t]) = 0) = e^(−λt)
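Not in the slides: an empirical check (assuming numpy; all parameters are illustrative) that approximates the process by the n Bernoulli slots of Section 12.2, records the first arrival time T1 in each simulated run, and compares the fraction of runs with T1 ≤ t against 1 − e^(−λt).

import numpy as np

rng = np.random.default_rng(1)

lam, horizon, n = 1.5, 20.0, 5_000      # rate, time window, number of tiny slots (assumptions)
p = lam * horizon / n                   # per-slot arrival probability, lambda * t / n

first_times = []
for _ in range(4000):
    arrivals = rng.random(n) < p        # one Bernoulli(p) slot per subinterval I_{j,n}
    j = int(np.argmax(arrivals))        # index of the first slot containing an arrival
    if arrivals[j]:                     # skip the (very rare) runs with no arrival at all
        first_times.append((j + 1) * horizon / n)

first_times = np.array(first_times)
t = 2.0
print("empirical P(T1 <= t):", np.mean(first_times <= t))
print("1 - exp(-lam * t)   :", 1 - np.exp(-lam * t))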
12.3 – The one-dimensional Poisson process
T1 and T2 are independent, and
P(T2 > t) = e^(−λt)
The one-dimensional Poisson process with intensity λ is a
sequence X1, X2, X3, … of random variables having the
property that the interarrival times X1, X2 − X1, X3 − X2, …
are independent random variables, each with an Exp(λ)
distribution.
Nt is equal to the number of Xi that are smaller than
(or equal to) t.
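Not part of the slides: a short simulation (assuming numpy; the intensity and t are illustrative) that constructs the process exactly as in this definition, as partial sums of independent Exp(λ) interarrival times, and checks that N_t has mean and variance close to λt, as a Pois(λt) random variable should.

import numpy as np

rng = np.random.default_rng(2)

lam, t = 3.0, 5.0                        # illustrative intensity and time (assumptions)
runs = 10_000

counts = np.empty(runs, dtype=int)
for r in range(runs):
    # 64 interarrivals is comfortably more than the ~lam*t = 15 expected arrivals in [0, t].
    inter = rng.exponential(scale=1.0 / lam, size=64)   # interarrival times T_i ~ Exp(lam)
    arrivals = np.cumsum(inter)                         # arrival times X_1, X_2, X_3, ...
    counts[r] = np.sum(arrivals <= t)                   # N_t = number of X_i with X_i <= t

print("mean of N_t:", counts.mean(), "  variance of N_t:", counts.var(), "  lambda * t =", lam * t)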