Journal of Asian Scientific Research, 2013, 3(3):337-343
Journal of Asian Scientific Research
journal homepage: http://aessweb.com/journal-detail.php?id=5003
JOINT DISTRIBUTION OF MINIMUM OF N IID EXPONENTIAL RANDOM
VARIABLES AND POISSON MARGINAL
Ali Hussein Mahmood Al-Obaidi
Department of Mathematics, University of Babylon, Babil, Iraq
ABSTRACT
We introduce a random vector $(X, N)$, where $N$ has a Poisson distribution and $X$ is the minimum of $N$ independent and identically distributed exponential random variables. We present fundamental properties of this vector, such as its PDF, CDF, and stochastic representations. Our results include explicit formulas for the marginal and conditional distributions, moments, and moment generating functions. We also derive moment estimators and maximum likelihood estimators of the parameters of this distribution.
Keywords: Hierarchical approach, joint distribution, Poisson marginal, moment estimators, maximum likelihood estimators, marginal distributions, conditional distributions, stochastic representations.
INTRODUCTION
We suppose that $X_1, X_2, X_3, \ldots, X_N$ are independent and identically distributed (IID) exponential random variables with parameter $\beta$, PDF

$f(x) = \beta e^{-\beta x}, \quad x \geq 0,$  (1)

and CDF

$F(x) = 1 - e^{-\beta x}, \quad x \geq 0,$  (2)

and let $N$ be a Poisson random variable with parameter $\lambda$ and probability mass function

$f_N(n) = \dfrac{e^{-\lambda}\lambda^n}{n!}, \quad n \in \{0, 1, 2, \ldots\}.$  (3)

In this article, we study the probability distribution of the new vector

$(X, N) = \left(\bigwedge_{i=1}^{N} X_i, \; N\right),$  (4)

where $\wedge$ denotes the minimum of the $X_i$. We shall refer to this distribution as the JMEPM distribution with parameters $\beta, \lambda > 0$, which stands for the joint distribution of the minimum of $n$ IID exponential random variables and a Poisson marginal. Note that the minimum of $n$ IID exponential variables $X_i$ is itself exponentially distributed, with parameter $n\beta$, PDF

$f_{X \mid N=n}(x) = n\beta e^{-n\beta x}, \quad x \geq 0,$  (5)

and CDF

$F_{X \mid N=n}(x) = 1 - e^{-n\beta x}, \quad x \geq 0.$  (6)

These follow (with $i = 1$) from the formula for the probability density of the $i$-th order statistic $X_{(i)}$, $i = 1, 2, \ldots, n$, given below:

$f_{X_{(i)}}(x) = \dfrac{n!}{(i-1)!\,(n-i)!}\,[F(x)]^{i-1}\,[1 - F(x)]^{n-i}\,f(x),$

with CDF

$F_{X_{(i)}}(x) = \sum_{j=i}^{n} \binom{n}{j}\,[F(x)]^{j}\,[1 - F(x)]^{n-j}$

(David and Nagaraja, 2003).
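As a quick numerical illustration of (5) and (6), the following Python snippet (our own sketch, not from the paper) checks that the minimum of $n$ IID Exp($\beta$) draws has sample mean close to $1/(n\beta)$:

import numpy as np

# Minimum of n IID exponentials with rate beta; its mean should be 1/(n*beta).
rng = np.random.default_rng(0)
n, beta = 5, 2.0
mins = rng.exponential(1 / beta, size=(100_000, n)).min(axis=1)
print(mins.mean(), 1 / (n * beta))  # the two values should agree to Monte Carlo error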
The JMEPM model has important applications in several fields of research, including hydrology, climate, weather, and finance. The Poisson distribution is closely related to the exponential distribution in the Poisson process, where the waiting time between observations is exponentially distributed and the number of observations is Poisson distributed (Ross, 2007); accordingly, we study the minimum customer wait time in this paper. Park (1970) found properties of the joint distribution of the total time for the emission of particles from a radioactive substance, whereas we discuss the minimum time for the emission of particles. Sarabia and Guillen (2008) emphasize the value of studying a joint distribution, rather than Park's joint distribution with constant $N$. They use the joint distribution of the random vector $(X, N)$; their application of interest is from the insurance industry, where $N$ represents the number of claims and $X$ represents the total claim amount, while we examine the minimum claim amount. Al-Obaidi and Al-Khafaji (2010) study the joint distribution of this random vector $(X, N)$ with $X$ a maximum of $n$ IID exponential random variables.
Our paper is organized as follows. We first present definitions and basic properties of the JMEPM distribution, followed by information about the marginals and the conditional distributions. We then discuss the moments and the moment generating function, and finally the estimation of the parameters of this distribution.
Definition and Basic Properties
Definition 1. A random vector $(X, N)$ with stochastic representation (4), where the $X_i$ are IID exponential variables (1) and $N$ is a Poisson variable (3), independent of the $X_i$, is said to have a JMEPM distribution with parameters $\beta, \lambda > 0$, denoted by $\mathbf{JMEPM}(\beta, \lambda)$.
The joint distribution of $(X, N)$ can be derived via a hierarchical approach and the standard technique of conditioning. Let $f(x, n)$ be the joint PDF of $(X, N)$ and write $f_{X \mid N=n}(x) = f(x \mid N = n)$; then

$f(x, n) = f_{X \mid N=n}(x)\,f_N(n) = f(x \mid N = n)\,f_N(n).$  (7)
Theorem 1. The PDF of $(X, N) \sim \mathbf{JMEPM}(\beta, \lambda)$ is

$f(x, n) = \begin{cases} n\beta e^{-n\beta x}\,\dfrac{e^{-\lambda}\lambda^n}{n!}, & x > 0, \; n \in \{1, 2, \ldots\} \\ e^{-\lambda}, & x = n = 0 \end{cases}$  (8)

and the CDF of $(X, N) \sim \mathbf{JMEPM}(\beta, \lambda)$ is

$F(x, n) = \begin{cases} e^{-\lambda} + \displaystyle\sum_{k=1}^{\lfloor n \rfloor} \dfrac{e^{-\lambda}\lambda^k}{k!}\left(1 - e^{-k\beta x}\right), & x > 0, \; n \geq 1 \\ e^{-\lambda}, & x = n = 0 \end{cases}$  (9)

where $\lfloor \cdot \rfloor$ denotes the greatest integer function, and the leading $e^{-\lambda}$ in the first case accounts for the point mass at $(0, 0)$.
If $n = 0$, then $f_N(n) = e^{-\lambda}$ and $f_{X \mid N=0}(x) = 1$, so that $f(x, n) = e^{-\lambda} I_{\{(0,0)\}}$ because there are zero events. Thus we have

$f(x, n) = e^{-\lambda} I_{\{(0,0)\}} + (1 - e^{-\lambda})\,\dfrac{n\beta}{1 - e^{-\lambda}}\,e^{-n\beta x}\,\dfrac{e^{-\lambda}\lambda^n}{n!}\,I_{\mathbb{R}^{+} \times \mathbb{N}},$  (10)

where $I_A$ denotes the indicator function of the set $A$. We see that $f(x, n)$ is a mixture distribution: with probability $e^{-\lambda}$ it is a point mass at $(0, 0)$, and with probability $1 - e^{-\lambda}$ it is the random vector $(\tilde{X}, \tilde{N}) \in \mathbb{R}^{+} \times \mathbb{N}$ given by the PDF

$g(x, n) = \dfrac{n\beta\,e^{-n\beta x}}{1 - e^{-\lambda}}\,\dfrac{e^{-\lambda}\lambda^n}{n!}, \quad x > 0, \; n \in \{1, 2, \ldots\}.$  (11)

Proposition 1. If $(X, N)$ has joint distribution $f(x, n)$, then

$(X, N) \stackrel{d}{=} (I\tilde{X}, \, I\tilde{N}),$  (12)

where $(\tilde{X}, \tilde{N})$ is a random vector with PDF $g(x, n)$ given by (11), and $I$ is an indicator random variable, independent of $(\tilde{X}, \tilde{N})$, taking the values 1 and 0 with probabilities $1 - e^{-\lambda}$ and $e^{-\lambda}$, respectively.
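The hierarchical construction behind (7)-(12) translates directly into a simulator. The Python sketch below (the function name sample_jmepm is ours, purely illustrative) draws $N \sim \mathrm{Poisson}(\lambda)$ and then, given $N = n > 0$, draws $X \sim \mathrm{Exp}(n\beta)$, producing the point mass at $(0, 0)$ with probability $e^{-\lambda}$:

import numpy as np

def sample_jmepm(beta, lam, size, seed=None):
    # N ~ Poisson(lam); given N = n > 0, X = min of n IID Exp(beta) ~ Exp(n*beta).
    rng = np.random.default_rng(seed)
    n = rng.poisson(lam, size)
    x = rng.exponential(1.0, size) / (np.maximum(n, 1) * beta)
    x[n == 0] = 0.0  # the (0, 0) atom of the mixture in (10)
    return x, n

x, n = sample_jmepm(beta=2.0, lam=3.0, size=100_000, seed=1)
print((n == 0).mean(), np.exp(-3.0))  # empirical vs. theoretical point-mass probability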
Marginal Distributions
It is useful to know the univariate distribution of each variable. We begin with the marginal distributions of $\tilde{X}$ and $\tilde{N}$ from the mixture representation in equation (12), starting with the marginal PDF and CDF of $\tilde{X}$.
Proposition 2. Let $(\tilde{X}, \tilde{N})$ have joint distribution $g(x, n)$. Then the marginal probability density function of $\tilde{X}$ is

$f_{\tilde{X}}(x) = \dfrac{\beta\lambda\,e^{-\beta x}\,e^{-\lambda}\,e^{\lambda e^{-\beta x}}}{1 - e^{-\lambda}}, \quad x > 0,$  (13)

and the marginal cumulative distribution function of $\tilde{X}$ is

$F_{\tilde{X}}(x) = \dfrac{1 - e^{-\lambda}\,e^{\lambda e^{-\beta x}}}{1 - e^{-\lambda}}, \quad x > 0.$  (14)

Next, we find the marginal CDF and PDF of $X$ by using equation (12), which implies $X \stackrel{d}{=} I\tilde{X}$, where $\tilde{X}$ is a continuous random variable with PDF and CDF given in equations (13) and (14), respectively.
Proposition 3. Let $(X, N) \sim \mathbf{JMEPM}(\beta, \lambda)$. Then the marginal cumulative distribution function of $X$ is

$F_X(x) = 1 + e^{-\lambda}\left(1 - e^{\lambda e^{-\beta x}}\right), \quad x \geq 0,$  (15)

and the marginal probability distribution of $X$ is

$f_X(x) = \begin{cases} \lambda\beta \exp\left(\lambda e^{-\beta x} - \beta x - \lambda\right), & x > 0 \\ e^{-\lambda}, & x = 0 \end{cases}$  (16)
On the other hand, we find the marginal PMF and CDF of $\tilde{N}$ as follows.
Proposition 4. Let $(\tilde{X}, \tilde{N})$ have joint distribution $g(x, n)$. Then the marginal probability mass function of $\tilde{N}$ is

$f_{\tilde{N}}(n) = \dfrac{e^{-\lambda}\lambda^n}{(1 - e^{-\lambda})\,n!}, \quad n \in \{1, 2, \ldots\},$  (17)

and the marginal cumulative distribution function of $\tilde{N}$ is

$F_{\tilde{N}}(n) = \dfrac{e^{-\lambda}}{1 - e^{-\lambda}} \sum_{i=1}^{\lfloor n \rfloor} \dfrac{\lambda^i}{i!}.$  (18)

By Definition 1, the marginal PMF of $N$ is the Poisson distribution in equation (3). In view of the representation in Proposition 1, we have the relation $N \stackrel{d}{=} I\tilde{N}$, where $\tilde{N}$ has the PMF in equation (17) while $I$, independent of $\tilde{N}$, takes the values 1 and 0 with probabilities $1 - e^{-\lambda}$ and $e^{-\lambda}$, respectively. Thus, it is clear that $N = 0$ if and only if $I = 0$, the probability of which is $e^{-\lambda}$, while for $n \in \mathbb{N}$ we have

$P(N = n) = P(I\tilde{N} = n) = (1 - e^{-\lambda})\,\dfrac{e^{-\lambda}\lambda^n}{(1 - e^{-\lambda})\,n!} = \dfrac{e^{-\lambda}\lambda^n}{n!}.$  (19)
Conditional Distributions
Another important distribution is the conditional distribution: the probability of one variable when the value of the other is known. We saw in equation (10) that the conditional distribution of $X$ for the case $N = 0$ is a unit point mass at $x = 0$, with $f_{X \mid N=0}(0) = 1$. Combining this with equation (5), we obtain

$f_{X \mid N=n}(x) = \begin{cases} n\beta e^{-n\beta x}, & x \geq 0, \; n \in \mathbb{N} \\ 1, & n = x = 0 \end{cases}$  (20)

We can then note that the conditional CDF of $X$ for the case $N = 0$ is

$F_{X \mid N=0}(x) = \begin{cases} 1, & x \geq 0 \\ 0, & x < 0 \end{cases}$

Combining this with equation (6), we obtain

$F_{X \mid N=n}(x) = \begin{cases} 1 - e^{-n\beta x}, & x \geq 0, \; n \in \mathbb{N} \\ 1, & n = 0 \text{ and } x \geq 0 \end{cases}$  (21)
Next, we find the conditional PMF of $N$ given $X = x$, stated in the following proposition.
Proposition 5. Let $(X, N) \sim \mathbf{JMEPM}(\beta, \lambda)$. Then the conditional probability distribution of $N$ given $X = x$ is

$f_{N \mid X=x}(n) = \begin{cases} \dfrac{e^{-\lambda e^{-\beta x}}\left(\lambda e^{-\beta x}\right)^{n-1}}{(n-1)!}, & n \in \mathbb{N} \text{ and } x > 0 \\ 1, & n = x = 0 \end{cases}$  (22)

Note that $N \mid X = x$ is a shifted Poisson random variable. More precisely, for $x > 0$, $N \mid X = x$ has the same distribution as $Z + 1$, where $Z$ is Poisson with parameter $\lambda e^{-\beta x}$.
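This shifted-Poisson structure is easy to check by simulation: conditioning on a small window around a fixed $x$, the sample mean of $N$ should be near $1 + \lambda e^{-\beta x}$. The sketch below reuses sample_jmepm from above; the window half-width is an arbitrary choice of ours:

x, n = sample_jmepm(beta=2.0, lam=3.0, size=500_000, seed=3)
x0, h = 0.3, 0.01
window = np.abs(x - x0) < h  # excludes the atom, since those samples have x = 0
print(n[window].mean(), 1 + 3.0 * np.exp(-2.0 * x0))  # mean of the shifted Poisson Z + 1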
Moments and Moment Generating Function
We find various representations for the bivariate and univariate moments connected with the joint distribution $f(x, n)$. To begin, we obtain a general expression for $E[X^\eta N^\gamma]$ and then derive various special cases for particular values of $\eta$ and $\gamma$. Recalling equation (12), we may proceed as follows:

$E[X^\eta N^\gamma] = E[I^{\eta+\gamma}]\,E[\tilde{X}^\eta \tilde{N}^\gamma].$  (23)

We assume throughout that $\eta, \gamma \geq 0$. Noting that for $\eta + \gamma > 0$ we have

$E[I^{\eta+\gamma}] = 1 - e^{-\lambda},$  (24)

we obtain

$E[\tilde{X}^\eta \tilde{N}^\gamma] = \begin{cases} E[X^\eta N^\gamma], & \eta = \gamma = 0 \\ \dfrac{E[X^\eta N^\gamma]}{1 - e^{-\lambda}}, & \text{otherwise} \end{cases}$  (25)

We now state an expression for $E[X^\eta N^\gamma]$ in the proposition below.
Proposition 6. If $(X, N) \sim \mathbf{JMEPM}(\beta, \lambda)$, then for any non-negative $\eta$ and $\gamma$ we have

$E[X^\eta N^\gamma] = \begin{cases} 1, & \eta = \gamma = 0 \\ \dfrac{\Gamma(\eta+1)}{\beta^\eta} \displaystyle\sum_{n=1}^{\infty} n^\gamma\,\dfrac{e^{-\lambda}\lambda^n}{n!\,n^\eta}, & \text{otherwise} \end{cases}$  (26)
We can derive expressions for the univariate moments by letting $\eta$ or $\gamma$ equal 0. The first moment in the univariate case gives the mean; this, together with the second moment, yields the variance. We will find the first and second moments of $X$ and $N$. Additionally, we will find the special case $E[XN]$, which is useful in deriving the covariance of $X$ and $N$.
The special case of $E[X^\eta N^\gamma]$ with $\gamma = 0$ gives us

$E[X^\eta] = \begin{cases} 1, & \eta = 0 \\ \dfrac{\Gamma(\eta+1)}{\beta^\eta} \displaystyle\sum_{n=1}^{\infty} \dfrac{e^{-\lambda}\lambda^n}{n!\,n^\eta}, & \text{otherwise} \end{cases}$  (27)

Therefore,

$E[X] = \dfrac{1}{\beta}\sum_{n=1}^{\infty}\dfrac{e^{-\lambda}\lambda^n}{n!\,n}, \qquad E[X^2] = \dfrac{2}{\beta^2}\sum_{n=1}^{\infty}\dfrac{e^{-\lambda}\lambda^n}{n!\,n^2}, \qquad E[XN] = \dfrac{1 - e^{-\lambda}}{\beta}.$

Another special case of $E[X^\eta N^\gamma]$, with $\eta = 0$, gives us

$E[N^\gamma] = \begin{cases} 1, & \gamma = 0 \\ \displaystyle\sum_{n=1}^{\infty} n^\gamma\,\dfrac{e^{-\lambda}\lambda^n}{n!}, & \text{otherwise} \end{cases}$  (28)

As expected, since $N$ is a Poisson variable,

$E[N] = \lambda, \qquad E[N^2] = \lambda^2 + \lambda.$
The covariance matrix has the following form:

$\Sigma = \begin{pmatrix} \dfrac{2}{\beta^2}\displaystyle\sum_{n=1}^{\infty}\dfrac{e^{-\lambda}\lambda^n}{n!\,n^2} - \left(\dfrac{1}{\beta}\displaystyle\sum_{n=1}^{\infty}\dfrac{e^{-\lambda}\lambda^n}{n!\,n}\right)^{2} & \dfrac{1}{\beta}\displaystyle\sum_{n=1}^{\infty}\dfrac{e^{-\lambda}\lambda^n}{n!}\left(1 - \dfrac{\lambda}{n}\right) \\ \dfrac{1}{\beta}\displaystyle\sum_{n=1}^{\infty}\dfrac{e^{-\lambda}\lambda^n}{n!}\left(1 - \dfrac{\lambda}{n}\right) & \lambda \end{pmatrix}$  (29)
We derive the moment generating function of the joint PDF $f(x, n)$ in the following proposition.
Proposition 7. Let $(X, N) \sim \mathbf{JMEPM}(\beta, \lambda)$. Then the moment generating function of $(X, N)$ is

$M(t_1, t_2) = e^{-\lambda} + \sum_{n=1}^{\infty} \dfrac{e^{t_2 n}\,e^{-\lambda}\lambda^n}{\left(1 - \dfrac{t_1}{n\beta}\right) n!}, \quad t_1 < \beta, \; t_2 \in \mathbb{R},$  (30)

where the leading $e^{-\lambda}$ is the contribution of the point mass at $(0, 0)$.
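As with the moments, (30) can be checked by truncating the series and comparing with the sample mean of $e^{t_1 X + t_2 N}$. The sketch below continues the previous snippet (reusing beta, lam, exp, factorial, and sample_jmepm); note that $t_1 < \beta$ is required:

t1, t2 = 0.5, 0.1
M = exp(-lam) + sum(exp(t2 * k) * exp(-lam) * lam**k / ((1 - t1 / (k * beta)) * factorial(k))
                    for k in range(1, 60))
x, n = sample_jmepm(beta, lam, size=1_000_000, seed=5)
print(M, np.mean(np.exp(t1 * x + t2 * n)))  # should agree to Monte Carlo error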
Estimation
In practice the values of the parameters of the joint distribution $f(x, n)$ are usually unknown and must be estimated from sample data. It is therefore valuable to develop methods of estimating our parameters, $\beta$ and $\lambda$.
Moment Estimators
Moment estimators are found by setting the sample moments equal to the corresponding population moments and solving for the parameters to be estimated.
Proposition 8. Suppose $(X_1, N_1), (X_2, N_2), \ldots, (X_m, N_m)$ is a random sample with $(X_i, N_i) \sim \mathbf{JMEPM}(\beta, \lambda)$ and at least one $N_i$ such that $N_i > 0$. Then the moment estimators $\hat{\beta}$ and $\hat{\lambda}$ of $\beta$ and $\lambda$, respectively, are

$\hat{\beta} = \dfrac{1}{\bar{X}} \sum_{n=1}^{\infty} \dfrac{e^{-\hat{\lambda}}\hat{\lambda}^n}{n!\,n},$  (31)

$\hat{\lambda} = \bar{N},$  (32)

where $\bar{X} = \frac{1}{m}\sum_{i=1}^{m} X_i$ and $\bar{N} = \frac{1}{m}\sum_{i=1}^{m} N_i$.
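A short sketch of the moment estimators (31)-(32) in Python, reusing sample_jmepm and the math imports from above (our code; the series in (31) is truncated at 100 terms):

def moment_estimators(x, n, k_max=100):
    lam_hat = n.mean()  # eq. (32)
    s = sum(exp(-lam_hat) * lam_hat**k / (factorial(k) * k) for k in range(1, k_max))
    beta_hat = s / x.mean()  # eq. (31)
    return beta_hat, lam_hat

x, n = sample_jmepm(beta=2.0, lam=3.0, size=100_000, seed=6)
print(moment_estimators(x, n))  # should be near (2.0, 3.0)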
Maximum Likelihood Estimators
In this section we derive the maximum likelihood estimators (MLEs) of $\beta$ and $\lambda$, stated in the proposition below.
Proposition 9. Suppose $(X_1, N_1), (X_2, N_2), \ldots, (X_m, N_m)$ is a random sample with $(X_i, N_i) \sim \mathbf{JMEPM}(\beta, \lambda)$ and at least one $N_i$ such that $N_i > 0$. Then the maximum likelihood estimators $\hat{\beta}$ and $\hat{\lambda}$ of $\beta$ and $\lambda$, respectively, are

$\hat{\beta} = \dfrac{A}{\sum_{j \in C} N_j X_j},$  (33)

$\hat{\lambda} = \bar{N},$  (34)

where $C$ is the set of all $j$ in $\{1, 2, 3, \ldots, m\}$ such that $N_j > 0$, and $A = |C|$ is the number of elements in $C$.
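The MLEs (33)-(34) are closed-form and equally short to implement (again an illustrative sketch of ours, reusing sample_jmepm, not code from the paper):

def mle(x, n):
    c = n > 0                                 # the index set C
    beta_hat = c.sum() / (n[c] * x[c]).sum()  # eq. (33): A / sum over C of N_j * X_j
    lam_hat = n.mean()                        # eq. (34)
    return beta_hat, lam_hat

x, n = sample_jmepm(beta=2.0, lam=3.0, size=100_000, seed=7)
print(mle(x, n))  # should be near (2.0, 3.0)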
CONCLUSIONS
We derived properties of the joint distribution of the random vector $(X, N)$, where $N$ has a Poisson distribution and $X$ is the minimum of $N$ independent and identically distributed exponential random variables: its PDF and CDF, the marginal and conditional distributions of each component, the moments and moment generating function of the bivariate distribution, and estimators of the parameters of this distribution.
ACKNOWLEDGEMENT
Thanks to everyone who helped me accomplish this work.
REFERENCES
Al-Obaidi, A.H. and K.A. Al-Khafaji, 2010. Some probabilistic properties of a Poisson exponential joint distribution. Department of Mathematics, College of Education, University of Babylon.
David, H.A. and H.N. Nagaraja, 2003. Order statistics. 3rd Edn. USA: John Wiley & Sons Inc.
Park, S., 1970. Regression and correlation in a bivariate distribution with different marginal densities. Technometrics, 12(3): 687-691.
Ross, S.M., 2007. Introduction to probability models. 9th Edn. Academic Press.
Sarabia, J.M. and M. Guillen, 2008. Joint modeling of the total amount and the number of claims by conditionals. Insurance: Mathematics and Economics, 43: 466-473.