MTH/STA 561
EXPONENTIAL PROBABILITY DISTRIBUTION
As discussed in Example 1 (of the section on the Uniform Probability Distribution), in a Poisson process, events occur independently at random and at a uniform rate λ per unit of time. Suppose that we label as time zero the instant at which we begin observing the Poisson process. Now let T denote the time at which the first event occurs, and consider the event {T > t} that the time of the first event is greater than t. This event occurs if and only if there are zero events in the fixed interval (0, t]. Indeed, it follows from the section on the Poisson Probability Distribution that the probability of zero events occurring in any interval of fixed length t is

p(0; t) = (λt)^0 e^{−λt} / 0! = e^{−λt}
Therefore,

P{T > t} = e^{−λt}

from which we find the distribution function for T; that is,

F(t; λ) = 1 − e^{−λt}   for t > 0

and its density function is

f(t; λ) = d/dt F(t; λ) = λ e^{−λt}   for t > 0

This is called the exponential density function. Formally, we have the following distribution.
Definition 1. The continuous random variable Y has an exponential probability distribution with parameter λ > 0 if its probability density function is given by

f(y; λ) = λ e^{−λy}   for y > 0
f(y; λ) = 0           elsewhere.
Any continuous random variable that follows an exponential distribution with parameter λ > 0 is referred to as an exponential random variable with parameter λ > 0.
The distribution function for the exponential random variable Y is given by

F(y; λ) = 1 − e^{−λy}   for y > 0
F(y; λ) = 0             elsewhere.

The figures below plot the density and distribution functions for the exponential random variable with parameter λ.
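As an illustration (a hypothetical Python sketch, not part of the original notes; `lam` stands for the parameter λ), the density and distribution functions above can be evaluated directly, and a crude Riemann sum confirms that the distribution function is the integral of the density:

```python
import math

def exp_pdf(y, lam):
    """Density f(y; lambda) = lambda * e^(-lambda*y) for y > 0, else 0."""
    return lam * math.exp(-lam * y) if y > 0 else 0.0

def exp_cdf(y, lam):
    """Distribution function F(y; lambda) = 1 - e^(-lambda*y) for y > 0, else 0."""
    return 1.0 - math.exp(-lam * y) if y > 0 else 0.0

# Crude right-endpoint Riemann sum of the pdf over (0, 2] with step 0.0001;
# it should be close to F(2; lambda).
lam = 1.5
approx = sum(exp_pdf(k * 0.0001, lam) * 0.0001 for k in range(1, 20001))
print(round(approx, 3), round(exp_cdf(2.0, lam), 3))
```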
Theorem 1. The mean and variance of the exponential random variable Y with parameter λ are

μ = E(Y) = 1/λ   and   σ² = Var(Y) = 1/λ²,

respectively. The moment-generating function is

m_Y(t) = λ / (λ − t)   for t < λ.
Proof. The moment-generating function is given by

m_Y(t) = E(e^{tY}) = ∫_0^∞ e^{ty} λ e^{−λy} dy = λ ∫_0^∞ e^{−(λ−t)y} dy
       = λ [ −e^{−(λ−t)y} / (λ − t) ]_{y=0}^{y→∞} = λ / (λ − t)   for t < λ.

Thus,

E(Y) = d/dt m_Y(t) |_{t=0} = λ / (λ − t)² |_{t=0} = 1/λ

and

E(Y²) = d²/dt² m_Y(t) |_{t=0} = 2λ / (λ − t)³ |_{t=0} = 2/λ².

Hence,

Var(Y) = E(Y²) − [E(Y)]² = 2/λ² − 1/λ² = 1/λ².
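The conclusions of Theorem 1 can also be checked empirically. The following hypothetical Python sketch (not part of the original notes) draws a large sample with `random.expovariate`, whose argument is the rate λ, and compares the sample mean and variance with 1/λ and 1/λ²:

```python
import random

random.seed(0)
lam = 2.0
n = 200_000

# random.expovariate(lam) samples from the exponential distribution with rate lam.
sample = [random.expovariate(lam) for _ in range(n)]

mean = sum(sample) / n
var = sum((y - mean) ** 2 for y in sample) / n

# Theorem 1 predicts mean 1/lam = 0.5 and variance 1/lam^2 = 0.25.
print(round(mean, 3), round(var, 3))
```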
Example 1. Suppose that a system contains a certain type of component whose time (in years) to failure is given by the random variable T, distributed exponentially with parameter λ = 1/5. If five of these components are installed in different systems, what is the probability that at least two are still functioning at the end of 8 years?
Solution. We first observe that the probability that a given component is still functioning after 8 years is

P{T > 8} = e^{−8/5} ≈ 0.2.

Now let Y represent the number of components functioning after 8 years, which is clearly a binomial random variable with parameters n = 5 and p = 0.2. Therefore, the desired probability equals

P{Y ≥ 2} = 1 − P{Y ≤ 1} = 1 − Σ_{y=0}^{1} C(5, y) (0.2)^y (0.8)^{5−y} = 1 − 0.7373 = 0.2627.

Also, the mean and variance of the survival time T are

μ = E(T) = 1/(1/5) = 5   and   σ² = Var(T) = 1/(1/5)² = 25.
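The arithmetic of Example 1 is easy to reproduce. In this hypothetical Python sketch (not part of the original notes), we compute P{T > 8} and the binomial tail, both with the rounded value p = 0.2 used above and with the exact value e^{−8/5}:

```python
import math

lam = 1 / 5
p_exact = math.exp(-lam * 8)   # P{T > 8} = e^(-8/5), about 0.2019

# With the rounded value p = 0.2, as in the text:
p_at_most_1 = sum(math.comb(5, y) * 0.2**y * 0.8**(5 - y) for y in range(2))
answer = 1 - p_at_most_1
print(round(answer, 4))        # matches the 0.2627 in the solution

# With the exact p, the answer shifts slightly:
exact_tail = 1 - sum(math.comb(5, y) * p_exact**y * (1 - p_exact)**(5 - y)
                     for y in range(2))
print(round(exact_tail, 4))
```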
Memoryless Property. The exponential distribution has the same memoryless property that we found for the geometric distribution.
Theorem 2 (Memoryless Property). Suppose Y has an exponential distribution with
parameter . For any positive numbers a and b,
P (Y > a + b j Y > a) = P (Y > b) :
Proof. For any positive numbers a and b,

P(Y > a + b | Y > a) = P(Y > a + b and Y > a) / P(Y > a)
                     = P(Y > a + b) / P(Y > a)
                     = [1 − P(Y ≤ a + b)] / [1 − P(Y ≤ a)]
                     = [1 − F(a + b)] / [1 − F(a)]
                     = e^{−λ(a+b)} / e^{−λa}
                     = e^{−λb} = 1 − F(b) = P(Y > b).
Thus, if in Example 1 we have observed a = 4 years with no failures, the probability that it will be at least b = 2 years until the first failure is unchanged from the original value for this probability when we begin observation. The exponential distribution is the only continuous probability distribution with the memoryless property.
This property tells us that a used exponential component is essentially “as good as new”.
It may be equivalently stated as

P(Y > a + b) = P(Y > a) P(Y > b).     (4.1)
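Since P(Y > y) = e^{−λy}, both Theorem 2 and the product form (4.1) reduce to identities of exponentials, which a short hypothetical Python sketch (not part of the original notes) can confirm numerically:

```python
import math

def tail(y, lam):
    """P(Y > y) = e^(-lam*y) for an exponential random variable with rate lam."""
    return math.exp(-lam * y)

lam, a, b = 0.7, 4.0, 2.0

# Theorem 2: P(Y > a+b | Y > a) = P(Y > b)
cond = tail(a + b, lam) / tail(a, lam)
assert abs(cond - tail(b, lam)) < 1e-12

# (4.1): P(Y > a+b) = P(Y > a) * P(Y > b)
assert abs(tail(a + b, lam) - tail(a, lam) * tail(b, lam)) < 1e-12
print("memoryless property verified for a =", a, "b =", b)
```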
The converse of Theorem 2 can also be shown to be true, as follows.
Theorem 3. Let Y be a non-degenerate nonnegative random variable. If (4.1) holds for any numbers a > 0 and b > 0, then Y has an exponential distribution with some parameter λ > 0.
Proof
De…ne F (y) = 1 F (y) = P (Y > y) for all y. Let c > 0 and m and n be
positive integers. Then, by (4:1), it is easy to see that
h
c im
n
F (nc) = F (c)
and F (c) = F
:
m
n
Now we claim 0 < F (1) < 1. If F (1) = 1, then F (n) = F (1) = 1 for all n, which
m
= F (1) = 0 an so F m1 = 0 for
contradicts F (+1) = 0. If F (1) = 0, then F m1
all m, which contradicts F ( 1) = 1. Since 0 < F (1) < 1, write F (1) = e 1/ for some
appropriate > 0. It follows from the second equality of the above that
F
1
m
= F (1)
1=m
=e
1/ m
:
Thus, by the …rst equality of the above, we have
F
n
= F
m
1
m
n
=e
(n=m)/
;
that is, F (y) = e y/ for any positive rational number y. By the right continuity of F (y), it
follows that F (y) = e y/ for any positive real number y. Therefore, Y has an exponential
distribution with parameter > 0.
The reason that the geometric and exponential distributions should both have the memoryless property can be seen by remembering that the continuous-time Poisson process can be derived as a limit of a sequence of independent Bernoulli trials. A geometric random variable Y is the number of Bernoulli trials until the first success occurs, and the exponential random variable T is the time of occurrence of the first event (success). In fact, as seen in Section ??, if Y is a geometric random variable with parameter p, then

P{Y > n} = q^n = (1 − p)^n

where q = 1 − p. In deriving the Poisson process in Section ??, we set p = λΔt = λt/n and divide the interval (0, t] into n subintervals of length Δt = t/n. Then the events {Y > n} and {T > t} are equivalent and

P{T > t} = lim_{n→∞} P{Y > n} = lim_{n→∞} (1 − p)^n = lim_{n→∞} (1 − λt/n)^n = e^{−λt}

so the exponential distribution function is the limit of the geometric distribution function; the exponential random variable inherits the memoryless property from the geometric random variable.
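The limit above can be watched numerically. In this hypothetical Python sketch (not part of the original notes), λ and t are fixed and (1 − λt/n)^n is tabulated for increasing n against e^{−λt}:

```python
import math

lam, t = 1.0, 2.0
for n in (10, 100, 10_000, 1_000_000):
    p = lam * t / n                      # success probability per subinterval
    print(n, (1 - p) ** n)               # P{Y > n} = (1 - p)^n
print("e^(-lam*t) =", math.exp(-lam * t))
```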
Suppose that we begin observing a Poisson process at time zero and let Tr be the time to occurrence of the rth event (r ≥ 1). This random variable is associated with the negative binomial random variable defined on independent Bernoulli trials. Let t be any fixed positive number and consider the event {Tr > t} that the time to the rth event is greater than t. The event {Tr > t} is equivalent to the event {Y ≤ r − 1}, where Y is the number of events that occur in (0, t], because the time to the rth event can exceed t if and only if there are r − 1 or fewer events in (0, t]. Since Y is a Poisson random variable with parameter λt, we have

P{Tr > t} = P{Y ≤ r − 1} = Σ_{k=0}^{r−1} (λt)^k e^{−λt} / k!

and the distribution function for Tr, the time to the rth occurrence, is

F(t; r, λ) = P{Tr ≤ t} = 1 − P{Tr > t} = 1 − Σ_{k=0}^{r−1} (λt)^k e^{−λt} / k!
The random variable Tr is called the Erlang random variable with parameters r and λ. The density function of Tr is

f(t; r, λ) = d/dt F(t; r, λ)
           = d/dt [ 1 − Σ_{k=0}^{r−1} (λt)^k e^{−λt} / k! ]
           = Σ_{k=0}^{r−1} [ λ (λt)^k e^{−λt} / k! − λ (λt)^{k−1} e^{−λt} / (k − 1)! ]

where the k = 0 term contributes only λ e^{−λt}. The sum telescopes, leaving only the last positive term:

f(t; r, λ) = λ (λt)^{r−1} e^{−λt} / (r − 1)! = λ^r t^{r−1} e^{−λt} / (r − 1)!   for t > 0.
Erlang Distribution. Let Tr be the random variable representing the waiting time to occurrence of the rth event in a Poisson process with parameter λ. Then the density function of Tr is

f(t; r, λ) = λ^r t^{r−1} e^{−λt} / (r − 1)!   for t > 0
f(t; r, λ) = 0                                elsewhere

As will be derived later in the chapter, the mean and variance of the Erlang random variable are

μ = r/λ   and   σ² = r/λ²

and the moment-generating function is

m_{Tr}(t) = 1 / (1 − t/λ)^r   for t < λ.
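To tie the pieces together, this hypothetical Python sketch (not part of the original notes) implements the Erlang density and the Poisson-sum form of its distribution function, and checks with a crude Riemann sum that the density integrates to the distribution function:

```python
import math

def erlang_pdf(t, r, lam):
    """f(t; r, lam) = lam^r * t^(r-1) * e^(-lam*t) / (r-1)! for t > 0, else 0."""
    if t <= 0:
        return 0.0
    return lam**r * t**(r - 1) * math.exp(-lam * t) / math.factorial(r - 1)

def erlang_cdf(t, r, lam):
    """F(t; r, lam) = 1 - sum_{k=0}^{r-1} (lam*t)^k e^(-lam*t) / k!."""
    if t <= 0:
        return 0.0
    return 1.0 - sum((lam * t)**k * math.exp(-lam * t) / math.factorial(k)
                     for k in range(r))

# Crude right-endpoint Riemann sum of the pdf over (0, 3] with step 0.0005.
r, lam = 3, 2.0
approx = sum(erlang_pdf(k * 0.0005, r, lam) * 0.0005 for k in range(1, 6001))
print(round(approx, 3), round(erlang_cdf(3.0, r, lam), 3))
```

Note that for r = 1 both functions reduce to the exponential density and distribution functions derived at the start of the section.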