More on the exponential distribution and Poisson processes.
Covered by 5.6, 6.6, 6.7 in Walpole, this note,
and parts of 12.2 in Ghahramani (it is not
necessary to go into all the technical details in 12.2).
The exponential distribution
So far we have learnt that there are two ways
to parametrise the exponential distribution.
Either we use

    f(x) = (1/β) e^(−x/β),  x ≥ 0,

which gives E(X) = β and Var(X) = β², or

    f(x) = λ e^(−λx),  x ≥ 0,

which gives E(X) = 1/λ and Var(X) = 1/λ². I.e. β = 1/λ.
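A minimal simulation sketch of the two parametrisations (the values of β and the sample size are arbitrary assumptions): drawing with rate λ = 1/β should give sample mean close to β and sample variance close to β².

```python
import random

# Sketch: random.expovariate takes the rate lambda = 1/beta, so the
# sample mean should be near beta and the sample variance near beta**2.
random.seed(0)
beta = 2.0                    # assumed value for illustration
lam = 1.0 / beta              # rate parametrisation
n = 200_000
xs = [random.expovariate(lam) for _ in range(n)]
mean = sum(xs) / n
var = sum((x - mean) ** 2 for x in xs) / n
print(mean, var)              # close to beta = 2 and beta**2 = 4
```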
We have also seen that if X1, ..., Xk are indep.
and exponentially distributed with expectation
β, then U = min(X1, ..., Xk) is exponentially
distributed with expectation β/k. (While the max
is not exponentially distributed!)
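A quick empirical check of the minimum property (β, k and the sample size are assumed values):

```python
import random

# Sketch: the minimum of k independent Exp(beta) variables is again
# exponentially distributed, with expectation beta / k.
random.seed(1)
beta, k, n = 3.0, 5, 100_000
mins = [min(random.expovariate(1 / beta) for _ in range(k)) for _ in range(n)]
mean_min = sum(mins) / n
print(mean_min)               # close to beta / k = 0.6
```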
Further we have learnt that a sum of independent, identically exponentially distributed
variables has a gamma distribution. If
X1, ..., Xk are indep. exponentially distributed
with expectation β, then Y = X1 + · · · + Xk is gamma
distributed with parameters α = k and β.
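The sum property can be checked empirically; a Gamma(α = k, β) variable has mean kβ, so the sample mean of simulated sums should land there (β, k and the sample size are assumptions):

```python
import random

# Sketch: a sum of k iid Exp(beta) variables is Gamma(alpha=k, beta),
# so its expectation is k * beta.
random.seed(2)
beta, k, n = 2.0, 4, 100_000
sums = [sum(random.expovariate(1 / beta) for _ in range(k)) for _ in range(n)]
mean_sum = sum(sums) / n
print(mean_sum)               # close to k * beta = 8
```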
We shall now see that the exponential distribution is memoryless!
Example: You have a light bulb which you
so far have used for 300 hours, and it is still
functioning. What is the probability that it will
be functioning for 500 more hours? Let T be the
lifetime of the light bulb and assume that T has
an exponential distribution with expectation β.
We then have that

    P(T > t) = ∫_t^∞ (1/β) e^(−u/β) du = [−e^(−u/β)]_t^∞ = e^(−t/β).
We shall find P(T > 500 + 300 | T > 300) = P(T > 800 | T > 300):

    P(T > 800 | T > 300) = P((T > 800) ∩ (T > 300)) / P(T > 300)
                         = P(T > 800) / P(T > 300)
                         = e^(−800/β) / e^(−300/β)
                         = e^(−500/β)
                         = P(T > 500)
I.e. the probability that a light bulb which is
functioning after 300 hours will be functioning
for another 500 hours is the same as the
probability that a new bulb will be functioning
for 500 hours!! (if the lifetime is exponentially
distributed)
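A numeric check of the light bulb example (the expectation β = 400 hours is an arbitrary assumed value; the identity holds for any β):

```python
import math

# Sketch: the conditional survival probability equals the unconditional
# one, which is the memoryless property.
beta = 400.0                                             # assumed value
p_cond = math.exp(-800 / beta) / math.exp(-300 / beta)   # P(T > 800 | T > 300)
p_fresh = math.exp(-500 / beta)                          # P(T > 500)
print(p_cond, p_fresh)        # the two values are identical
```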
The memoryless property
Generally we have for the exponential distribution that:
    P(T > s + u | T > u) = P((T > s + u) ∩ (T > u)) / P(T > u)
                         = P(T > s + u) / P(T > u)
                         = e^(−(s+u)/β) / e^(−u/β)
                         = e^(−s/β)
                         = P(T > s)
We describe this property by saying that the
exponential distribution is memoryless.
If T is the time until failure of a system or
component, the exponential model implies that
the system/component is neither improving nor
deteriorating over time.
It can be shown that the exponential distribution is the only continuous distribution which is
memoryless. Among the discrete distributions
the geometric distribution is memoryless.
Hazard rate
Another illustration of the memoryless property
of the exponential distribution is seen by
considering the hazard rate of the exponential
distribution.
Hazard rate (failure rate) generally:
    r(t) = lim_{Δt→0} (1/Δt) P(X ∈ (t, t + Δt) | X > t)
         = lim_{Δt→0} (1/Δt) · P((X ∈ (t, t + Δt)) ∩ (X > t)) / P(X > t)
         = lim_{Δt→0} (1/Δt) · P(X ∈ (t, t + Δt)) / P(X > t)
         = lim_{Δt→0} (1/Δt) · (F(t + Δt) − F(t)) / P(X > t)
         = f(t) / (1 − F(t))
We can think of this as the conditional failure
rate for a unit which is still functioning at time
t. I.e. a measure of how likely it is that a unit
functioning at time t will fail in the near future.
For the exponential distribution:

    r(t) = f(t) / (1 − F(t)) = λ e^(−λt) / e^(−λt) = λ
I.e. the exponential distribution has constant
hazard rate (and is the only continuous distribution
with constant hazard rate). This means that a unit
with exponentially distributed lifetime has the
same chance of failing regardless of the age of the
unit. Consequently, the exponential distribution
is best suited as a model for phenomena where
an event/failure happens “spontaneously” (no
ageing effects, fatigue, or similar).
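The constant hazard rate can also be seen empirically: estimate P(X ∈ (t, t + Δt) | X > t)/Δt from simulated lifetimes at several ages t (λ, Δt, the sample size and the t values are all assumptions of this sketch):

```python
import random

# Sketch: for Exp(lam) lifetimes the estimated hazard rate should be
# roughly flat at lam, regardless of the age t we condition on.
random.seed(3)
lam, n, dt = 0.5, 500_000, 0.05
xs = [random.expovariate(lam) for _ in range(n)]
rates = []
for t in (0.5, 2.0, 5.0):
    alive = [x for x in xs if x > t]                  # units surviving to t
    failed = sum(1 for x in alive if x <= t + dt)     # failures just after t
    rates.append(failed / len(alive) / dt)
print(rates)                  # each entry close to lam = 0.5
```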
Poisson processes
Notice: Chapter 12.2 in Ghahramani covers
Poisson processes, but is quite technical (12.3
covering the next topic is far better!). It is not
required to read all proofs and other technical
details in 12.2. Read the theorems and the part
on queueing theory, and otherwise use this note and
Walpole as your reference for Poisson processes.
Counting process:
Let N (t) = the number of events in [0, t].
{N (t), t ≥ 0} is then called a counting process.
Note in particular that “the number of events
which occur in an interval [a, b]” can now simply
be written: N (b) − N (a).
Definition of Poisson process
A counting process is a Poisson process with
intensity λ if:
1. N (0) = 0
2. The numbers of events that occur in disjoint
intervals are independent. I.e. N (b) − N (a)
indep. of N (d) − N (c) if [a, b] and [c, d] are
disjoint. (Independent increments).
3. P (N (s + Δs) − N (s) = 1) ≈ λΔs.
(Stationary increments).
4. P (N (s + Δs) − N (s) > 1) ≈ 0.
It can be shown that N(s + t) − N(s) has
a Poisson distribution with expectation λt (see
Ghahramani, but do not be too concerned about
all the technical details).
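A minimal simulation sketch of the definition (λ, t and the number of replications are assumed values): generate the process by accumulating exponential interarrival times and check that E[N(t)] is close to λt.

```python
import random

# Sketch: build a Poisson process from Exp(lam) interarrival times and
# check the mean count over [0, t] against lam * t.
random.seed(4)
lam, t, reps = 2.0, 3.0, 50_000

def count_events(rate, horizon):
    """Return N(horizon): the number of arrivals in [0, horizon]."""
    clock, events = 0.0, 0
    while True:
        clock += random.expovariate(rate)
        if clock > horizon:
            return events
        events += 1

mean_count = sum(count_events(lam, t) for _ in range(reps)) / reps
print(mean_count)             # close to lam * t = 6
```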
Properties
We have previously shown that the time T1 until
the first event in a Poisson process has an
exponential distribution with expectation 1/λ:

    P(T1 > t) = P(N(t) = 0) = ((λt)^0 / 0!) · e^(−λt) = e^(−λt)

The time between any two subsequent events
also has an exponential distribution with
expectation 1/λ.
Since the exponential distribution is memoryless
it follows that the Poisson process also is
memoryless. We have already shown this in the
bus example in the notes for chap. 6. Whenever
we enter a Poisson process, the time until the
next event has an exponential distribution
with expectation 1/λ, independent of what has
happened previously. I.e. what has happened
in the past does not influence the future.
The fact that the process is memoryless follows directly from the independent increment
property (point 2 in the definition).
It has been stated earlier that a sum of k independent, identically exponentially distributed
variables with expectation β has a gamma
distribution with parameters α = k and β.
From this it follows directly that the time until
event number k in a Poisson process has a
gamma distribution with α = k and β = 1/λ.
Further properties
Poisson processes have several other properties;
we shall briefly mention two.
• If we know the number of events n in an
interval of length t, the number of those
events occurring in a sub-interval of length
u is described by a binomial distribution
with parameters n and p = u/t. I.e.,
N(u) | (N(t) = n) ∼ B(n, u/t).
Intuitively reasonable since events have the
same probability of occurring anywhere in
an interval. See theorem 12.2 in Ghahramani for a formal proof (this proof should
be readable).
• If we know the number of events n in an
interval of length t, the event times are uniformly distributed over the interval. This is
basically what theorem 12.4 in Ghahramani
says, and is useful for simulations.
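A minimal sketch of the simulation recipe suggested by theorem 12.4: draw the n event times as iid Uniform(0, t) points; the count in a sub-interval of length u is then Binomial(n, u/t) with mean n·u/t (all parameter values here are assumptions).

```python
import random

# Sketch: given N(t) = n, event times are iid Uniform(0, t), so the
# count falling in [0, u] is Binomial(n, u/t).
random.seed(5)
t, u, n_events, reps = 3.0, 1.0, 36, 50_000
counts = [sum(1 for _ in range(n_events) if random.uniform(0, t) <= u)
          for _ in range(reps)]
mean_count = sum(counts) / reps
print(mean_count)             # close to n_events * u / t = 12
```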
Example:
Assume that visitors to a web page arrive as
a Poisson process with intensity λ = 10 per
hour. We know that during the last three
hours there have been 36 visitors, and during the
first of those hours there was a mistake on the
page. What is the expected number of visitors
during the period with the mistake? What is
the probability that fewer than 10 persons visited
during that period?
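One way to compute the example's answers, using the binomial property above: given 36 visitors in 3 hours, the count during the first hour is Binomial(n = 36, p = 1/3).

```python
from math import comb

# Sketch: expected visitors in the mistake hour is n * p = 12; the
# probability of fewer than 10 visitors is the Binomial(36, 1/3) cdf at 9.
n, p = 36, 1 / 3
expected = n * p
p_fewer_than_10 = sum(comb(n, k) * p**k * (1 - p) ** (n - k)
                      for k in range(10))
print(expected, p_fewer_than_10)
```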
Queueing systems
Queueing systems appear in many different
areas: customers waiting to be served in a bank,
post office, or at a counter; patients waiting to
be served by a doctor or hospital; boats waiting
to be served by a harbour; computer programs
waiting for server time; failed equipment waiting
to be repaired; and so on.
Queueing systems are characterised by the
arrival process (how “customers” arrive), the
serving process (how customers are served) and
the number of servers.
A particular notation has evolved in the queueing theory literature for describing these mechanisms for “first come first serve” queues:

    arrival process / service time distribution / number of servers
Some examples:
• M/M/c: Arrival as Poisson process, exponentially distributed service times, c
servers. (E.g. a bank)
• M/M/∞: Arrival as Poisson process,
exponentially distributed service times,
infinitely many servers. (E.g. an internet
bank)
• GI/M/c: General independent interarrival times, exponentially distributed
service times, c servers.
• M/G/1: Arrival as Poisson process, general
distribution of service times, one server.
(E.g. an office)
• D/M/c: Deterministic interarrival times,
exponentially distributed service times, c
servers.
• M/D/c: Arrival as Poisson process, deterministic service times, c servers.
We generally assume that the interarrival and
service times are independent, that there is one
waiting line, and that the c servers operate in parallel.
Let T1 , T2 , . . . be the interarrival times and
C1 , C2 , . . . be the service times. Let E(Ti ) = 1/λ
and E(Ci) = 1/γ. The queue is stable if:

    E(Ti) > E(Ci)/c  ⟺  1/λ > 1/(cγ)  ⟺  λ < cγ
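The stability condition λ < cγ as a minimal helper, checked here on the post office numbers from the example below (λ = 0.5 per minute, 1/γ = 3 minutes, c = 3 servers):

```python
# Sketch: a queue is stable when the arrival rate is below the total
# service rate of the c parallel servers.
def is_stable(lam, gamma, c):
    """Return True iff lam < c * gamma."""
    return lam < c * gamma

print(is_stable(0.5, 1 / 3, 3))   # True: 0.5 < 3 * (1/3) = 1.0
```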
Example:
The queue at a post office is an M/M/3 queue
with arrival rate λ = 0.5 per minute and
expected service time of 1/γ = 3 minutes.
When you arrive three persons are being served
and 8 are waiting in line.
What is the expected waiting time until you are
served?
What is the probability that you are finished
before the person in front of you in the queue?