Note #2
Poisson processes and their properties
Poisson processes. A collection {N(t) : t ∈ [0, ∞)} of random variables indexed by time t is
called a continuous-time stochastic process. Furthermore, we call N(t) a Poisson process if
(a) starting with N(0) = 0, the process N(t) takes a nonnegative integer value 0, 1, 2, . . . for all
t > 0;
(b) the increment N(t + s) − N(t) is nonnegative for any s > 0;
(c) the increments
N(t1), N(t2) − N(t1), . . . , N(tn) − N(tn−1)
are independent for any 0 < t1 < t2 < · · · < tn−1 < tn;
(d) the increment N(t + s) − N(t) has a distribution which depends on the value s > 0
but not on t > 0.
Counting processes. A stochastic process satisfying (a) and (b) is called a counting process, in
which N(t) represents the total number of "events" counted up to time t. Properties (c) and (d)
are respectively called independent increments and stationary increments. In particular, by
applying (a) and (d) together, we obtain
P(N(t + s) − N(t) = k) = P(N(s) = k)    (2.1)
for all k = 0, 1, 2, . . ..
Arrival time and sojourn time. Events counted by a Poisson process N(t) are called Poisson
events. Let Tn denote the time when the n-th Poisson event occurs. Here we call Tn an arrival
time (also referred to as an "occurrence time"), and define the sojourn time Wn (or "interarrival
time") by
Wn := Tn − Tn−1 for n = 1, 2, . . .,
where T0 = 0 for convenience.
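The relation Wn = Tn − Tn−1 (with T0 = 0) can be sketched numerically. The snippet below is an illustration only and is not part of the note: it assumes exponential sojourn times with an arbitrary rate lam = 2.0, a fact the note justifies later.

```python
import random

# Illustrative sketch: arrival times T_n are cumulative sums of sojourn
# times W_n, with T_0 = 0.  Exponential sojourn times and lam = 2.0 are
# assumptions for this example.
random.seed(1)
lam = 2.0
W = [random.expovariate(lam) for _ in range(10)]  # sojourn times W_1..W_10

T = []
t = 0.0
for w in W:
    t += w           # T_n = T_{n-1} + W_n
    T.append(t)

# Recover the sojourn times back from the arrival times: W_n = T_n - T_{n-1}
W_back = [T[0]] + [T[i] - T[i - 1] for i in range(1, len(T))]
```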
Properties of sojourn time. We can observe that the event {Wn > s} for a sojourn time is
equivalently expressed as the event {N(Tn−1 + s) − N(Tn−1) = 0} in terms of the counting process,
and that by stationarity it has probability P(N(s) = 0). This justifies the following properties of
sojourn times:
(a) The sojourn times W1, W2, . . . are independent;
(b) the sojourn time Wn has a distribution which is independent of n.
Survival function. Consider the probability K(s) := P(W1 > s). The function K(s) is known
as a survival function. Then we obtain
K(t + s) = P(W1 > t + s) = P(N(t) = 0, N(t + s) − N(t) = 0)
= P(N(t) = 0) P(N(t + s) − N(t) = 0)    [independent increments]
= P(N(t) = 0) P(N(s) = 0)    [stationary increments]
= P(W1 > t) P(W1 > s) = K(t) K(s).
Special lecture/June 2016
Then the sojourn time W1 must have an exponential distribution with parameter λ; see Problem 1 below.
Problem 1. Let X be a non-negative random variable satisfying for s, t ≥ 0,
P (X > t + s | X > s) = P (X > t).
(2.2)
The above property is generally referred to as the memoryless property. By completing the following
questions, we will show that X must be an exponential random variable.
(a) Let K(t) = P (X > t) be the survival function of X. Show that the memoryless property
implies that K(s + t) = K(s)K(t) for s, t ≥ 0.
(b) Let κ = K(1). Argue that κ > 0.
(c) Show that K(1/n) = κ^{1/n} and K(m/n) = κ^{m/n}.
Therefore, we can obtain K(t) = κ^t for any t ≥ 0. By letting λ = − ln κ, we can write
K(t) = e^{−λt}, and therefore we can find the pdf f(t) = λe^{−λt}, t ≥ 0, for X.
Problem 2. Customers arrive at a service facility according to a Poisson process of rate λ.
Let N (t) be the number of customers that have arrived up to time t, and let T1 , T2 , . . . be the
successive arrival times of the customers.
(a) Find E(T4 − t|N (t) = 3).
(b) Find E(T4 |N (t) = 3).
(c) Find E(T5 |N (t) = 3).
Poisson distribution. Since W1, W2, . . . are independent and exponentially distributed with
the common parameter λ, the arrival time Tn = ∑_{k=1}^{n} Wk has the gamma distribution with
parameter (n, λ). As we have already seen in the previous lecture note, the discrete random
variable N(t) with a fixed time t has the Poisson distribution with parameter λt,
P(N(t) = k) = e^{−λt} (λt)^k / k!
for k = 0, 1, 2, . . ..
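A small Monte Carlo sketch can support this claim: counting exponential sojourn times until they exceed t should produce a Poisson(λt) count. The parameters below (lam = 1.5, t = 2.0) are arbitrary choices for illustration.

```python
import math
import random

# Monte Carlo sketch: simulate N(t) by accumulating exponential sojourn
# times until they exceed t, then compare with the Poisson(lam*t) law.
random.seed(4)
lam, t, trials = 1.5, 2.0, 20000   # lam*t = 3.0

def sample_N(rate, horizon):
    """Count exponential sojourn times fitting inside [0, horizon]."""
    n, total = 0, 0.0
    while True:
        total += random.expovariate(rate)
        if total > horizon:
            return n
        n += 1

samples = [sample_N(lam, t) for _ in range(trials)]
emp_mean = sum(samples) / trials        # should be close to lam*t = 3.0
p0_emp = samples.count(0) / trials      # should be close to exp(-lam*t)
p0_theory = math.exp(-lam * t)
```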
Problem 3. Let N (t) be a Poisson process with rate λ, and let 0 < s < t.
(a) Find P (N (t) − N (s) ≥ 1).
(b) Find P (N (s) = 0, N (t) ≥ 1).
(c) Find P (N (s) = 1|N (t) = 1).
Distribution of arrival times. Consider the small probability that a Poisson event occurs in
the time interval [t, t + dt). By noticing that e^{−λ dt} ≈ 1, we can approximate this probability as
P(N(dt) = 1) = e^{−λ dt} λ dt ≈ λ dt,
and we call λ an arrival rate. Let f1(s) be the conditional density of the arrival time T1 given
N(t) = 1. Then we can compute the infinitesimal probability P(s < T1 ≤ s + ds | N(t) = 1) =
f1(s) ds by
P(s < T1 ≤ s + ds | N(t) = 1)
= P(N(s) = 0, N(s + ds) − N(s) = 1, N(t) − N(s + ds) = 0 | N(t) = 1)
= P(N(s) = 0) P(N(ds) = 1) P(N(t − s − ds) = 0) / P(N(t) = 1)
= e^{−λt} λ ds / (e^{−λt} λt)
= (1/t) ds.    (2.3)
Thus, f1(s) is the uniform pdf on (0, t).
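Equation (2.3) can be illustrated by simulation: among sample paths with exactly one event in [0, t], the single arrival time T1 should behave like a uniform draw on (0, t). A minimal sketch, with arbitrary parameters:

```python
import random

# Illustrative check of (2.3): conditioned on N(t) = 1, T1 should be
# uniform on (0, t), so its mean should approach t/2.  lam and t are
# arbitrary choices for this example.
random.seed(3)
lam, t = 1.0, 2.0
t1_samples = []
while len(t1_samples) < 5000:
    # generate arrivals on [0, t] from exponential sojourn times
    arrivals, s = [], random.expovariate(lam)
    while s <= t:
        arrivals.append(s)
        s += random.expovariate(lam)
    if len(arrivals) == 1:              # condition on N(t) = 1
        t1_samples.append(arrivals[0])

mean_t1 = sum(t1_samples) / len(t1_samples)          # close to t/2 = 1.0
frac_below_half = sum(1 for x in t1_samples if x < t / 2) / len(t1_samples)
```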
General cases. Suppose that T1 , T2 , . . . , Tn are the arrival times up to the n-th occurrence
in a Poisson process. Then we can generalize the notion of infinitesimal probability to a joint
distribution of the arrival times T1 , T2 , . . . , Tn , and express an infinitesimal probability by
P (s1 < T1 ≤ s1 + ds1 , . . . , sn < Tn ≤ sn + dsn |N (t) = n)
= f (s1 , s2 , . . . , sn ) ds1 ds2 · · · dsn ,
(2.4)
where f (s1 , s2 , . . . , sn ) is the joint density function.
Problem 4. Show that the joint density function of (2.4) is given by f(s1, s2, . . . , sn) = n!/t^n for
0 < s1 < s2 < · · · < sn < t.
Joint distribution of arrival times. Let U1, U2, . . . , Un be independent and identically
distributed (iid) uniform random variables on (0, t), and let U(1) < U(2) < · · · < U(n) be the order
statistics of the Ui's. Then the joint infinitesimal probability is given by
P(s1 < U(1) ≤ s1 + ds1, s2 < U(2) ≤ s2 + ds2, . . . , sn < U(n) ≤ sn + dsn)
= (n!/t^n) ds1 ds2 · · · dsn.
Thus, conditioned upon N(t) = n, the joint distribution of the arrival times T1, T2, . . . , Tn is
identical to that of the order statistics U(1), U(2), . . . , U(n) of iid uniform random variables on
(0, t).
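A simulation sketch of this equivalence (with illustrative parameters only): conditioning on N(t) = 2, the mean arrival times should approach t·1/3 and t·2/3, the means of the order statistics of two iid uniforms on (0, t).

```python
import random

# Illustrative check: conditioned on N(t) = 2, the arrival times should
# match order statistics of two iid uniforms on (0, t), so
# E[T1 | N(t) = 2] ~ t/3 and E[T2 | N(t) = 2] ~ 2t/3.  Parameters arbitrary.
random.seed(7)
lam, t, target = 1.0, 2.0, 4000
pairs = []
while len(pairs) < target:
    arrivals, s = [], random.expovariate(lam)
    while s <= t:
        arrivals.append(s)
        s += random.expovariate(lam)
    if len(arrivals) == 2:              # condition on N(t) = 2
        pairs.append(arrivals)

m1 = sum(p[0] for p in pairs) / target  # close to t/3 = 2/3
m2 = sum(p[1] for p in pairs) / target  # close to 2t/3 = 4/3
```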
Problem 5. Let T1 , T2 , . . . , Tn be the arrival times in a Poisson process N (t) as in Problem 4.
(a) Find E(Ti |N (1) = n) for i = 1, . . . , n.
(b) Find E(T1 + T2 + · · · + Tn |N (1) = n).
Problem 6. Let N(t) be a Poisson process with rate λ, representing the number of customers
entering a store. Each customer spends a random duration in the store. Then let X(t) denote
the number of customers remaining in the store at time t. Assuming that the durations of the
customers are independent random variables identically distributed with pdf g(v), determine
the distribution of X(t).
Problem 7. Alpha particles are emitted according to a Poisson process with rate λ. Each
particle exists for a random duration and is then annihilated. Suppose that the lifetimes of the
particles are independent random variables, identically and exponentially distributed with parameter
β. Find the expected number of particles existing at time t.
Superposition of Poisson processes. The moment generating function (mgf) MX(t) for a
Poisson random variable X with parameter λ can be calculated as MX(t) = e^{λ(e^t − 1)}. Suppose
that X and Y are independent Poisson random variables with respective parameters λ and µ.
Then the mgf for X + Y is given by MX+Y(t) = MX(t) × MY(t) = e^{(λ+µ)(e^t − 1)}, which implies
that X + Y is also a Poisson random variable with parameter λ + µ. Now consider two independent
Poisson processes N(t) and M(t) having the respective arrival rates λ and µ. Then the
superposition
L(t) = N(t) + M(t)
becomes a Poisson process with rate λ + µ.
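The superposition claim can be checked numerically at a fixed time t: the combined count should have the mean and variance of a Poisson((λ + µ)t) random variable. A rough sketch, with arbitrary rates:

```python
import random

# Illustrative check: the sum of two independent Poisson process counts at
# time t behaves like a Poisson((lam + mu) * t) count, so its sample mean
# and variance should both be close to (lam + mu) * t = 5.0.
random.seed(11)
lam, mu, t, trials = 2.0, 3.0, 1.0, 20000

def poisson_count(rate, horizon):
    """Number of exponential sojourn times fitting inside [0, horizon]."""
    n, s = 0, random.expovariate(rate)
    while s <= horizon:
        n += 1
        s += random.expovariate(rate)
    return n

L = [poisson_count(lam, t) + poisson_count(mu, t) for _ in range(trials)]
mean_L = sum(L) / trials                              # ~ (lam + mu) * t
var_L = sum((x - mean_L) ** 2 for x in L) / trials    # ~ (lam + mu) * t
```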
Random sums. Let X1, X2, . . . be iid random variables with mean µ = E[Xi] and variance σ^2 =
Var(Xi) for all i. Suppose that a discrete random variable N takes a nonnegative integer value
0, 1, 2, . . ., and is independent of the random variables X1, X2, . . .. Then we define a random
sum by
Z = ∑_{i=1}^{N} Xi.
For the random variable Z observe that E(Z | N) = µN and Var(Z | N) = E((Z − µN)^2 | N) =
σ^2 N. We can compute
E[Z] = E[E(Z | N)] = µE[N],
Var(Z) = Var(E(Z | N)) + E[Var(Z | N)] = µ^2 Var(N) + σ^2 E[N].
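These two identities can be verified numerically. The sketch below makes illustrative assumptions: Xi ~ Uniform(0, 1), so µ = 1/2 and σ^2 = 1/12, and N uniform on {0, 1, 2, 3, 4}, so E[N] = 2 and Var(N) = 2.

```python
import random

# Numeric sketch of E[Z] = mu*E[N] and Var(Z) = mu^2*Var(N) + sigma^2*E[N].
# Assumed for this example: X_i ~ Uniform(0,1) (mu = 1/2, sigma^2 = 1/12)
# and N uniform on {0,...,4} (E[N] = 2, Var(N) = 2), so that
# E[Z] = 1.0 and Var(Z) = 0.25*2 + (1/12)*2 = 2/3.
random.seed(5)
trials = 40000
Z = []
for _ in range(trials):
    n = random.randint(0, 4)                       # sample N
    Z.append(sum(random.random() for _ in range(n)))  # Z = X_1 + ... + X_N

mean_Z = sum(Z) / trials                               # ~ 1.0
var_Z = sum((z - mean_Z) ** 2 for z in Z) / trials     # ~ 2/3
```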
Compound Poisson processes. Now suppose that N(t) is a Poisson process with arrival
rate λ, independent of X1, X2, . . .. Then we can introduce a compound Poisson process by
Z(t) = ∑_{i=1}^{N(t)} Xi.
The compound Poisson process is essentially a random sum. Since E[N(t)] = Var(N(t)) = λt,
we obtain E[Z(t)] = µλt and Var(Z(t)) = (µ^2 + σ^2)λt.
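A quick simulation sketch of these formulas, again assuming Xi ~ Uniform(0, 1) and arbitrary λ and t:

```python
import random

# Illustrative check of E[Z(t)] = mu*lam*t and Var(Z(t)) = (mu^2+sigma^2)*lam*t
# with assumed X_i ~ Uniform(0,1) (mu = 1/2, sigma^2 = 1/12) and lam*t = 3:
# E[Z(t)] = 1.5 and Var(Z(t)) = (1/4 + 1/12) * 3 = 1.0.
random.seed(9)
lam, t, trials = 2.0, 1.5, 20000

def poisson_count(rate, horizon):
    """Number of exponential sojourn times fitting inside [0, horizon]."""
    n, s = 0, random.expovariate(rate)
    while s <= horizon:
        n += 1
        s += random.expovariate(rate)
    return n

Z = [sum(random.random() for _ in range(poisson_count(lam, t)))
     for _ in range(trials)]
mean_Z = sum(Z) / trials                               # ~ 1.5
var_Z = sum((z - mean_Z) ** 2 for z in Z) / trials     # ~ 1.0
```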
Non-homogeneous Poisson processes. For an interval A on [0, ∞) we can introduce the
total number N(A) of Poisson events occurring on the time interval A, and calculate its expected
number by E[N(A)] = ∫_A λ dt. Suppose that the stationary-increment property (d) does not
hold [but (a)–(c) do], and that the expectation is instead determined by
E[N(A)] = m(A) := ∫_A τ(t) dt
with a nonnegative intensity function τ(t). Then the random variable N(A) has the Poisson
distribution with parameter m(A), and the process N(t) is called a non-homogeneous Poisson
process.
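One standard way to simulate such a process (a construction not covered in the note) is thinning: generate candidate arrivals at a dominating rate λmax ≥ τ(t) and keep each candidate at time s with probability τ(s)/λmax. The intensity function below is an illustrative assumption.

```python
import math
import random

# Sketch of simulating a non-homogeneous Poisson process by thinning.
# Assumed intensity: tau(t) = 1 + sin(t)^2, bounded by lam_max = 2, over
# [0, 10].  The expected count is m = integral of tau over [0, 10].
random.seed(13)
tau = lambda s: 1.0 + math.sin(s) ** 2   # illustrative intensity, tau <= 2
lam_max, horizon = 2.0, 10.0

def nonhomogeneous_arrivals():
    """Candidates at rate lam_max; keep each with probability tau(s)/lam_max."""
    arrivals, s = [], random.expovariate(lam_max)
    while s <= horizon:
        if random.random() < tau(s) / lam_max:
            arrivals.append(s)
        s += random.expovariate(lam_max)
    return arrivals

# m([0, horizon]) = horizon + horizon/2 - sin(2*horizon)/4 for this tau
m = 1.5 * horizon - math.sin(2 * horizon) / 4
counts = [len(nonhomogeneous_arrivals()) for _ in range(5000)]
emp_mean = sum(counts) / 5000            # should be close to m
```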
Spatial Poisson processes. Consider a multi-dimensional space S. We can introduce a
Poisson point process on S by extending the concept of a non-homogeneous Poisson process. For
a subset A of S let N(A) denote the total number of Poisson events each of which occurs at a
point in the subset A. Suppose that (i) the random variables N(A1), N(A2), . . . , N(An) are
independent for every sequence A1, A2, . . . , An of mutually disjoint subsets, and (ii) the expected
number E[N(A)] is given by E[N(A)] = m(A) := ∫_A τ(x) dx with a nonnegative intensity
function τ(x) on S. Then the random variable N(A) has the Poisson distribution with parameter
m(A). In particular, when τ(x) ≡ λ, we call the process homogeneous.
Problem 8. The number of bacteria distributed throughout a volume of liquid can be considered
as a spatial Poisson process. Suppose that the intensity function is τ(x) ≡ 0.6 (organisms per mm^3).
Find the probability that more than two bacteria are detected in a 10 mm^3 volume.
Marked point process. Consider a Poisson process N(t) with arrival rate λ, and iid random
variables X1, X2, . . . having a common pdf g(x) on a space S. Let T1, T2, . . . denote the arrival
times of the Poisson process. Then we can define a marked Poisson point process by introducing
Poisson events at the points (T1, X1), (T2, X2), . . . in the space [0, ∞) × S. For any subset A of
[0, ∞) × S, the number N(A) of Poisson events on A becomes a non-homogeneous Poisson point
process with
E[N(A)] = m(A) := ∬_A λ g(x) dx dt.
Problem 9. Customers arrive according to a Poisson process with rate λ = 8. Suppose that
each customer is independently classified as “high priority” with probability 0.2, or “low priority”
with probability 0.8. Then find the probability that three high priority and five low priority
customers arrive by t = 1.
Problem solutions
Problem 1. (a) Observe that
P(X > t + s | X > s) = P(X > t + s) / P(X > s) = K(t + s) / K(s).
By applying it to (2.2) we obtain K(s + t) = K(s)K(t).
(b) The existence of the conditional probability in (2.2) requires that P(X > s) > 0; thus, κ =
P(X > 1) > 0.
(c) Since K(1) = K(1/n)^n and K(m/n) = K(1/n)^m, we have K(1/n) = κ^{1/n} and
K(m/n) = κ^{m/n}.
Problem 2. Recall that the waiting time Vt = T(N(t)+1) − t has an exponential distribution
with parameter λ, and that Vt is independent of N(t).
(a) E(T4 − t | N(t) = 3) = E(T(N(t)+1) − t | N(t) = 3) = E(Vt | N(t) = 3) = 1/λ.
(b) E(T4 | N(t) = 3) = E(T(N(t)+1) | N(t) = 3) = E(t + Vt | N(t) = 3) = t + 1/λ.
(c) E(T5 | N(t) = 3) = E(T4 + W5 | N(t) = 3) = E(T4 | N(t) = 3) + E(W5 | N(t) = 3) = t + 2/λ,
where W5 is the sojourn time and it is independent of the event that N(t) = 3.
Problem 3. (a) P (N (t) − N (s) ≥ 1) = P (N (t − s) ≥ 1) = 1 − P (N (t − s) = 0) = 1 − e−λ(t−s)
(b) P (N (s) = 0, N (t) ≥ 1) = P (N (s) = 0, N (t) − N (s) ≥ 1) = P (N (s) = 0) × P (N (t) − N (s) ≥
1) = e−λs (1 − e−λ(t−s) )
(c) P(N(s) = 1 | N(t) = 1) = P(N(s) = 1, N(t) − N(s) = 0) / P(N(t) = 1)
= P(N(s) = 1) × P(N(t) − N(s) = 0) / P(N(t) = 1)
= e^{−λs}(λs) × e^{−λ(t−s)} / (e^{−λt}(λt)) = s/t.
Problem 4. We can demonstrate the extension of (2.3) in the case of n = 2.
P(s1 < T1 ≤ s1 + ds1, s2 < T2 ≤ s2 + ds2 | N(t) = 2)
= P(N(s1) = 0, N(s1 + ds1) − N(s1) = 1, N(s2) − N(s1 + ds1) = 0,
N(s2 + ds2) − N(s2) = 1, N(t) − N(s2 + ds2) = 0 | N(t) = 2)
= P(N(s1) = 0) P(N(ds1) = 1) P(N(s2 − s1 − ds1) = 0) P(N(ds2) = 1) P(N(t − s2 − ds2) = 0) / P(N(t) = 2)
= e^{−λt}(λ ds1)(λ ds2) / (e^{−λt}(λt)^2/2!)
= (2!/t^2) ds1 ds2.
Further extension for n ≥ 3 is obvious.
Problem 5. Let U1, U2, . . . , Un be iid uniform random variables on (0, 1), and let U(1) < U(2) <
· · · < U(n) be the order statistics of the Ui's.
(a) E(Ti | N(1) = n) = E[U(i)] = i/(n + 1).
(b) E(T1 + T2 + · · · + Tn | N(1) = n) = ∑_{i=1}^{n} i/(n + 1) = n/2.
Problem 6. Provided that N(t) = n, we call the first n customers "1", "2", . . . , "n". Let Ui
be the arrival time of customer "i", and let Vi be the duration of the same customer remaining
in the store [which is distributed with the pdf g(v)]. The pair (Ui, Vi) of random variables has the
joint density function
f(u, v) = (1/t) g(v),   0 ≤ u ≤ t, 0 ≤ v.
We can calculate
p = P(Ui + Vi ≥ t) = ∫_0^t du ∫_{t−u}^∞ f(u, v) dv = (1/t) ∫_0^t du ∫_{t−u}^∞ g(v) dv
= (1/t) ∫_0^t dz ∫_z^∞ g(v) dv,
where we substituted z = t − u in the last step.
Then the number X(t) of customers remaining in the store at time t counts all the events
Ui + Vi ≥ t, i = 1, . . . , n. We obtain
P(X(t) = k | N(t) = n) = (n choose k) p^k (1 − p)^{n−k}
and therefore,
P(X(t) = k) = ∑_{n=k}^∞ P(X(t) = k | N(t) = n) P(N(t) = n)
= e^{−λt} (λpt)^k/k! ∑_{n=k}^∞ (λ(1 − p)t)^{n−k}/(n − k)!
= e^{−λt} e^{λ(1−p)t} (λpt)^k/k! = e^{−λpt} (λpt)^k/k!,
which implies that X(t) has a Poisson distribution with parameter
λpt = λ ∫_0^t dz ∫_z^∞ g(v) dv.
Problem 7. This is an immediate application of Problem 6 with g(v) = βe^{−βv}. Thus, we
obtain
E[X(t)] = λpt = λ ∫_0^t dz ∫_z^∞ βe^{−βv} dv = λ ∫_0^t e^{−βz} dz = (λ/β)(1 − e^{−βt}).
Problem 8. Since m(A) = (0.6)(10) = 6, we obtain
P(N(A) > 2) = 1 − ∑_{k=0}^{2} e^{−m(A)} m(A)^k/k! ≈ 0.938.
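The arithmetic can be reproduced directly:

```python
import math

# Direct computation of the Problem 8 answer with m(A) = 0.6 * 10 = 6:
# P(N(A) > 2) = 1 - sum_{k=0}^{2} exp(-m) m^k / k!.
m = 0.6 * 10
p = 1 - sum(math.exp(-m) * m ** k / math.factorial(k) for k in range(3))
# p comes out to about 0.938
```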
Problem 9. Consider the marked Poisson point process on [0, ∞) × {0, 1} with probability
mass function g(0) = 0.8 and g(1) = 0.2. Let Alow = [0, 1] × {0} and Ahigh = [0, 1] × {1}. Then
we can calculate the probability as
P(N(Alow) = 5, N(Ahigh) = 3) = P(N(Alow) = 5) × P(N(Ahigh) = 3)
= e^{−6.4} (6.4)^5/5! × e^{−1.6} (1.6)^3/3! ≈ 0.0205.
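The final arithmetic can be reproduced directly, using the thinned rates 8 × 0.8 = 6.4 (low priority) and 8 × 0.2 = 1.6 (high priority):

```python
import math

# Direct computation of the Problem 9 answer: two independent Poisson
# counts over [0, 1] with means 6.4 (low priority) and 1.6 (high priority).
p_low = math.exp(-6.4) * 6.4 ** 5 / math.factorial(5)    # P(5 low priority)
p_high = math.exp(-1.6) * 1.6 ** 3 / math.factorial(3)   # P(3 high priority)
answer = p_low * p_high                                  # about 0.0205
```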