Exponential Distribution
I love memoryless distributions, and the exponential is the unique one in continuous
time to have this property. Even if we have waited a time t for a certain "success" to occur, the
probability that from now on we'll still have to wait a time s until this success is the same
as if we were starting fresh now, forgetting about the time already waited. This property makes the
exponential distribution special and meaningful. It also tells us that it is unrealistic to use it for
approximating human lifetimes: if you have already lived 80 years, the probability that you will
live for another 60 is not the same as the probability of living 60 years if you are born today…
Wonderful connections exist between the exponential distribution and other discrete
or continuous distributions.
From the point of view of its inner definition, the exponential is the continuous-time
analogue of the geometric distribution. Both measure how long we have to wait until the
first success: the geometric in discrete time, the exponential in continuous time. Also, the pmf of
the geometric distribution has the same exponentially decaying form as the density of the exponential
distribution, only the first one is defined for discrete values of x and the second one for continuous
values of x. These two are the unique distributions to have the memoryless property (among
discrete and continuous distributions, respectively).
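The discrete-continuous analogy above can be made precise: if X ~ Exp(λ), then rounding X up to the next integer gives a geometric random variable with success probability p = 1-e^(-λ). A minimal simulation sketch of this fact (the rate λ = 0.7 is an illustrative choice):

```python
import math
import random

# If X ~ Exp(lam), then ceil(X) is Geometric(p) with p = 1 - e^(-lam):
# P(ceil(X) = k) = P(k-1 < X <= k) = e^(-lam(k-1)) - e^(-lam k)
#               = (e^(-lam))^(k-1) * (1 - e^(-lam)).
lam = 0.7                      # illustrative rate parameter
p = 1 - math.exp(-lam)

random.seed(0)
n = 200_000
samples = [math.ceil(random.expovariate(lam)) for _ in range(n)]

# Compare the empirical pmf of ceil(X) with the geometric pmf for k = 1, 2, 3.
for k in (1, 2, 3):
    empirical = sum(1 for s in samples if s == k) / n
    geometric = (1 - p) ** (k - 1) * p
    assert abs(empirical - geometric) < 0.01
```

With 200,000 samples the empirical frequencies match the geometric pmf to within about 0.01, which is well inside Monte Carlo error.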
The exponential distribution is related to the Poisson one by being exactly the time
between two successive Poisson occurrences. This could be one possible definition of the
exponential distribution. As Poisson-type occurrences are very irregular, the waiting time
between two of them is also a very irregular random variable, which is another
inviting aspect of the exponential distribution.
Finally, the exponential distribution is related to the Gamma distribution by being a
particular case of it. You see, the Gamma distribution is just
more general: it quantifies the time between any two Poisson occurrences, not necessarily
consecutive.
But let’s rush to the formal definition.
Definition: We’ll call exponential random variable of parameter
of X of density:
( >0) the random variable
f(x)= e - x , for x 0
0, otherwise.
Notation: X ~ Exp( ).
λ is called the "rate parameter" and it represents the "rate" at which some phenomenon occurs.
Generally this means the average # of associated Poisson occurrences over a unit time interval.
Frequently met examples of exponential random variables:
• X = the time between two successive earthquakes in California
• X = the time between two successive worldwide airplane crashes
• X = the time between two successive telephone calls at a public phone booth
• X = the time between two successive car arrivals at a gas station
• X = the lifetime of an electronic device
• X = the time you have to wait in line until you arrive at the cashier
• X = the time a teller in a bank spends with a client
• X = the time you have to wait, when entering a bank, until a teller is available for you
The Analysis of an Exponential Random Variable of parameter λ:
• Im(X) = [0, ∞).
This represents exactly the interval on which the density is nonzero (the "support" of the
distribution).
• Graph of f: a decreasing exponential curve, starting at height λ at x=0 and decaying toward 0.
Now let’s check that f is a valid density:
∫R f(x)dx=∫[0,∞) Λe-Λxdx=Λ(-1/Λ)e-Λx│0∞ = 0-(-1)=1
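The integral above can also be checked numerically. A small sketch using a midpoint Riemann sum (λ = 2 is an illustrative choice; the tail beyond x = 20 is negligible for this rate):

```python
import math

# Midpoint Riemann-sum check that f(x) = lam * exp(-lam * x)
# integrates to 1 over [0, infinity).
lam = 2.0      # illustrative rate parameter
dx = 1e-4
total = sum(lam * math.exp(-lam * (i + 0.5) * dx) * dx
            for i in range(int(20 / dx)))
print(total)   # very close to 1
```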
• The distribution function of X:
For x<0, F(x)=P(X≤x)=∫(-∞,x] 0 dt = 0.
For x≥0, F(x)=P(X≤x)=∫(-∞,x] f(t)dt = ∫[0,x] λe^(-λt)dt = λ(-1/λ)e^(-λt)│0x = 1-e^(-λx).
Putting these together we get that
F(x) = 1-e^(-λx), for x≥0
       0, otherwise.
In particular, for the exponential random variable of parameter 1 we'll have the following
distribution function:
F(x) = 1-e^(-x), for x≥0
       0, otherwise.
• Connection between the standard exponential and the general
exponential:
If X ~ Exp(1), then Y = (1/λ)X ~ Exp(λ).
(And, of course, conversely, if Y ~ Exp(λ), then λY ~ Exp(1). This looks very much like the
standardization of the normal distribution.)
Proof:
We need to prove that
FY(x) = 1-e^(-λx), for x≥0
        0, otherwise.
For x≥0, we have: FY(x)=P((1/λ)X≤x)=P(X≤λx)=FX(λx)=1-e^(-λx).
For x<0, we have: FY(x)=P(Y≤x)=0, as Y is a positive random variable and x is negative.
Putting these two together, we get exactly what we wanted:
FY(x) = 1-e^(-λx), for x≥0
        0, otherwise.
So Y is indeed an Exp(λ) random variable.
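This scaling relation is easy to verify by simulation: generate standard exponentials, divide by λ, and compare the empirical CDF with 1-e^(-λx) at a few points (λ = 3 is an illustrative choice):

```python
import math
import random

# If X ~ Exp(1), then Y = X / lam should be Exp(lam):
# check the empirical CDF of Y against 1 - exp(-lam * x).
lam = 3.0      # illustrative rate parameter
random.seed(1)
n = 200_000
ys = [random.expovariate(1.0) / lam for _ in range(n)]

for x in (0.1, 0.5, 1.0):
    empirical = sum(1 for y in ys if y <= x) / n
    theoretical = 1 - math.exp(-lam * x)
    assert abs(empirical - theoretical) < 0.01
```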
• Expectation:
For X ~ Exp(1), E[X] = ∫[0,∞) t e^(-t) dt = 1 (by integration by parts).
For Y ~ Exp(λ), E[Y] = E[(1/λ)X] = (1/λ)E[X] = (1/λ)·1 = 1/λ.
• Variance:
For X ~ Exp(1), Var(X) = E[X²]-E[X]² = ∫[0,∞) t² e^(-t) dt - 1² = 2-1 = 1 (by integration by parts).
For Y ~ Exp(λ), Var(Y) = Var((1/λ)X) = (1/λ)² Var(X) = (1/λ)²·1 = 1/λ².
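A quick Monte Carlo sanity check of these two formulas (λ = 4 is an illustrative choice, so E[Y] should be 1/4 and Var(Y) should be 1/16):

```python
import random

# Monte Carlo check of E[Y] = 1/lam and Var(Y) = 1/lam^2 for Y ~ Exp(lam).
lam = 4.0      # illustrative rate parameter
random.seed(2)
n = 500_000
ys = [random.expovariate(lam) for _ in range(n)]

mean = sum(ys) / n
var = sum((y - mean) ** 2 for y in ys) / n
print(mean, var)   # near 1/4 and 1/16
```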
• Moment Generating Function:
For X ~ Exp(1), MX(t)=∫[0,∞) e^(tx)e^(-x)dx=∫[0,∞) e^((t-1)x)dx=[1/(t-1)]e^((t-1)x)│0∞ = 1/(1-t), for t<1
(only for t<1 is the above integral finite; so Domain(MX)=(-∞,1)).
For Y ~ Exp(λ), MY(t)=M(1/λ)X(t)=MX(t/λ)=1/[1-(t/λ)]=λ/(λ-t),
and its domain will be obtained from the condition t/λ Є Domain(MX), which means t<λ. So
Domain(MY)=(-∞,λ).
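The MGF is just an expectation, so it too can be estimated by averaging e^(tY) over simulated samples and compared with the closed form λ/(λ-t). A sketch with illustrative values λ = 2, t = 0.5 (any t < λ works; for t ≥ λ the expectation is infinite):

```python
import math
import random

# Monte Carlo estimate of M_Y(t) = E[e^(tY)] for Y ~ Exp(lam),
# compared with the closed form lam / (lam - t), valid only for t < lam.
lam = 2.0      # illustrative rate parameter
t = 0.5        # must satisfy t < lam
random.seed(3)
n = 500_000
estimate = sum(math.exp(t * random.expovariate(lam)) for _ in range(n)) / n
exact = lam / (lam - t)
print(estimate, exact)   # both near 4/3
```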
Special Properties of the Exponential Distribution
• Memorylessness:
P(X>s+t | X>t) = P(X>s) for any s,t>0.
Intuition behind this: even if we have already waited a time t, the probability that we will wait an
additional time s is the same as the probability of waiting a time s if we had just started
now.
Proof:
P(X>s+t | X>t) = P(X>s+t, X>t)/P(X>t) = P(X>s+t)/P(X>t) = [1-F(s+t)]/[1-F(t)] =
= [1-(1-e^(-λ(s+t)))]/[1-(1-e^(-λt))] = e^(-λ(s+t))/e^(-λt) = e^(-λs) = 1-(1-e^(-λs)) = 1-F(s) = P(X>s).
Conversely, it can be proved that if a continuous distribution is memoryless, then this
distribution is the exponential one (the same style of proof as for the fact that if a
discrete distribution is memoryless, then it must be the geometric one). So the property of
memorylessness characterizes the exponential distributions among all continuous distributions.
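The memoryless identity can be observed directly in simulation: among samples that already exceed t, the fraction exceeding s+t should match the unconditional fraction exceeding s. A sketch with illustrative values λ = 1.5, s = 0.4, t = 0.9:

```python
import random

# Empirical check of P(X > s+t | X > t) = P(X > s) for X ~ Exp(lam).
lam, s, t = 1.5, 0.4, 0.9    # illustrative values
random.seed(4)
n = 500_000
xs = [random.expovariate(lam) for _ in range(n)]

past_t = [x for x in xs if x > t]
conditional = sum(1 for x in past_t if x > s + t) / len(past_t)
unconditional = sum(1 for x in xs if x > s) / n
print(conditional, unconditional)   # both near e^(-lam*s) = e^(-0.6)
```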
• Renewal property:
If X ~ Exp(λ) and c is a strictly positive constant, then conditional on X>c, the overshoot
X-c is again ~ Exp(λ).
(It's as if after time c the exponential starts fresh with the same parameter: it
renews itself.)
Proof:
Let's denote by Y the variable X-c, conditioned on the event X>c, and prove that Y ~ Exp(λ) by
proving that its distribution function FY looks like this:
FY(x) = 1-e^(-λx), x≥0
        0, otherwise.
If x≥0, then
FY(x) = P(X-c≤x | X>c) = 1-P(X>c+x | X>c) = 1-P(X>x)   (by memorylessness)
      = P(X≤x) = FX(x) = 1-e^(-λx).
If x<0, then
FY(x) = P(X-c≤x | X>c) = 0,
as, given X>c, X-c is a strictly positive random variable and x is a negative number. Putting the
above results together, we get:
FY(x) = 1-e^(-λx), x≥0
        0, otherwise,
which is exactly what we needed to prove.
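The renewal property can be checked empirically by keeping only the samples with X > c and looking at the overshoot X-c, whose mean should again be 1/λ. A sketch with illustrative values λ = 2, c = 0.5:

```python
import random

# Empirical check of the renewal property: conditional on X > c,
# the overshoot X - c is again Exp(lam), so its mean should be 1/lam.
lam, c = 2.0, 0.5      # illustrative values
random.seed(5)
n = 500_000
overshoots = [x - c for x in (random.expovariate(lam) for _ in range(n))
              if x > c]

mean_overshoot = sum(overshoots) / len(overshoots)
print(mean_overshoot)   # near 1/lam = 0.5
```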
• The Close Connection with the Poisson Distribution:
The waiting time between two successive Poisson occurrences in a Poisson(λ) setup
(called a Poisson process) will be an Exp(λ) random variable.
Proof:
Here is a picture to illustrate our Poisson(Λ) set of occurrences:
X=The waiting time
------I--------I---I----I(-------------)--------------I-----------I-------------------I---I----t
Let t≥0. Then
P(X>t) = P(0 occurrences over the interval of length t) =
= P(# of occurrences over this interval = 0).
As
# of occurrences over this interval ~ P(λt),
the above probability is
e^(-λt)(λt)^0/0! = e^(-λt).
Now for t<0,
P(X>t) = 1,
as X is strictly positive and t is strictly negative. Combining these facts,
P(X>t) = e^(-λt), for t≥0
         1, for t<0,
hence its complementary is
P(X≤t) = 1-e^(-λt), for t≥0
         0, otherwise.
This means exactly that the distribution function of X is
F(x) = 1-e^(-λx), for x≥0
       0, otherwise,
so indeed X is an Exp(λ) random variable.
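The connection also runs in the other direction: if you lay down i.i.d. Exp(λ) inter-arrival times, the number of arrivals landing in a unit time interval is Poisson(λ). A simulation sketch (λ = 3 is an illustrative rate), checking the mean count and the probability of zero arrivals:

```python
import math
import random

# If inter-arrival times are i.i.d. Exp(lam), the number of arrivals in a
# unit time interval should be Poisson(lam): mean lam, P(0 arrivals) = e^(-lam).
lam = 3.0      # illustrative rate parameter
random.seed(6)
n = 200_000
counts = []
for _ in range(n):
    total, k = 0.0, 0
    while True:
        total += random.expovariate(lam)   # next exponential gap
        if total > 1.0:
            break
        k += 1
    counts.append(k)

mean_count = sum(counts) / n
p_zero = sum(1 for k in counts if k == 0) / n
print(mean_count, p_zero)   # near lam = 3 and e^(-3) ≈ 0.0498
```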
Application 1:
The average # of red cars arriving at an intersection per hour is 4. Supposing that a red car
has just arrived, compute the probability that we have to wait more than 15 minutes until the next
red car arrives.
Solution:
Let X be the # of red cars arriving at the intersection per hour. By hypothesis
X ~ P(4).
Then, if we let Y be the waiting time between two successive red cars, we'll have
Y ~ Exp(4),
so, as the unit of time is one hour, the required probability is
P(Y>1/4) = ∫[1/4,∞) 4e^(-4x)dx = 4(-1/4)e^(-4x)│1/4∞ = e^(-1).
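This answer can be confirmed by simulation: draw Exp(4) waiting times and count how often they exceed a quarter hour.

```python
import math
import random

# Application 1: with a rate of 4 red cars per hour, the waiting time is
# Y ~ Exp(4), and P(Y > 1/4 hour) should equal e^(-1).
random.seed(7)
n = 200_000
frac = sum(1 for _ in range(n) if random.expovariate(4.0) > 0.25) / n
print(frac, math.exp(-1))   # both near 0.368
```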
Application 2:
Suppose clients enter the store at a rate of 300 over a 12-hour interval. If a client is
entering the store just now, what's the probability that another
one will enter during the next 10 minutes?
Solution:
Let's first decide on a unit of time, for instance 1 hour. If the rate of clients per 12 hours is
300, the rate of clients per hour will be 300/12 = 25. This means that X, the # of clients entering
the store per hour, is a P(25) random variable. But this means that the associated exponential
random variable Y, the waiting time between two successive clients entering the store, will
be an Exp(25) random variable. (It has the same parameter as the Poisson one, but take
care that the parameter of the Poisson one is per unit time interval!) What we want is:
P(Y<10 minutes) = P(Y<1/6 hours) = ∫[0,1/6] 25e^(-25x)dx =
= -e^(-25x)│0 1/6 = 1-e^(-25/6).
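As a sanity check, the same probability can be estimated by simulating Exp(25) waiting times and counting how often they fall below 10 minutes (1/6 hour):

```python
import math
import random

# Application 2: rate 300 clients per 12 hours = 25 per hour, so the waiting
# time is Y ~ Exp(25); P(Y < 1/6 hour) should equal 1 - e^(-25/6).
rate = 300 / 12    # 25 clients per hour
random.seed(8)
n = 200_000
frac = sum(1 for _ in range(n) if random.expovariate(rate) < 1 / 6) / n
print(frac, 1 - math.exp(-rate / 6))   # both near 0.984
```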