Journal of Asian Scientific Research, 2013, 3(3): 337-343
Journal homepage: http://aessweb.com/journal-detail.php?id=5003

JOINT DISTRIBUTION OF MINIMUM OF N IID EXPONENTIAL RANDOM VARIABLES AND POISSON MARGINAL

Ali Hussein Mahmood Al-Obaidi
Department of Mathematics, University of Babylon, Babil, Iraq

ABSTRACT
We introduce a random vector $(X, N)$, where $N$ has a Poisson distribution and $X$ is the minimum of $N$ independent and identically distributed exponential random variables. We present fundamental properties of this vector, such as its PDF, CDF and stochastic representations. Our results include explicit formulas for the marginal and conditional distributions, moments and moment generating functions. We also derive moment estimators and maximum likelihood estimators of the parameters of this distribution.

Keywords: Hierarchical approach, joint distribution, Poisson marginal, moment estimators, maximum likelihood estimators, marginal distributions, conditional distributions, stochastic representations.

INTRODUCTION
Suppose that $X_1, X_2, X_3, \ldots, X_n$ are independent and identically distributed (IID) exponential random variables with parameter $\beta$, with PDF

$$f(x) = \beta e^{-\beta x}, \quad x \ge 0 \quad (1)$$

and CDF

$$F(x) = 1 - e^{-\beta x}, \quad x \ge 0, \quad (2)$$

and let $N$ be a Poisson random variable with parameter $\lambda$ and probability mass function

$$P(N = n) = \frac{e^{-\lambda} \lambda^n}{n!}, \quad n \in \{0, 1, 2, \ldots\}. \quad (3)$$

In this article, we study the probability distribution of the new vector

$$(X, N) = \left( \bigwedge_{i=1}^{N} X_i,\; N \right), \quad (4)$$

where $\wedge$ denotes the minimum of the $X_i$. We shall refer to this distribution as the JMEPM distribution with parameters $\beta, \lambda > 0$, which stands for the joint distribution of the minimum of n IID exponential random variables and a Poisson marginal. Note that the minimum of $n$ IID exponential variables with parameter $\beta$ is itself exponential with parameter $n\beta$, with PDF

$$f_{X \mid N = n}(x) = n\beta e^{-n\beta x}, \quad x \ge 0 \quad (5)$$

and CDF

$$F_{X \mid N = n}(x) = 1 - e^{-n\beta x}, \quad x \ge 0. \quad (6)$$

This follows from the formula for the probability density of the $i$-th order statistic $X_{(i)}$, $i = 1, 2, \ldots, n$, given below:

$$f_{X_{(i)}}(x) = \frac{n!}{(i-1)!\,(n-i)!}\, F(x)^{i-1} \big(1 - F(x)\big)^{n-i} f(x),$$

with CDF

$$F_{X_{(i)}}(x) = \sum_{k=i}^{n} \binom{n}{k} F(x)^k \big(1 - F(x)\big)^{n-k}$$

(David and Nagaraja, 2003); taking $i = 1$ yields (5).

The JMEPM model has important applications in several fields of research, including hydrology, climate, weather, and finance. The Poisson distribution is closely related to the exponential distribution through the Poisson process, where the waiting time between observations is exponentially distributed and the number of observations has a Poisson distribution (Ross, 2007); accordingly, we study the minimum customer waiting time in this paper. Park (1970) found properties of the joint distribution of the total time for the emission of particles from a radioactive substance, whereas we discuss the minimum time for the emission of particles. Sarabia and Guillen (2008) emphasize the value of studying the joint distribution of the random vector $(X, N)$ rather than Park's joint distribution with constant $n$. Their application of interest is from the insurance industry, where $N$ represents the number of claims and $X$ represents the total claim amount, while we examine the minimum claim amount. Al-Obaidi and Al-Khafaji (2010) studied the joint distribution of the random vector $(X, N)$ in which $X$ is the maximum of n IID exponential random variables.

Our paper is organized as follows. We first present definitions and basic properties of the JMEPM distribution, including its marginal and conditional distributions. Moments and the moment generating function are then discussed, and finally estimation of the parameters of this distribution is presented.
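To make the hierarchical construction concrete, here is a minimal simulation sketch in Python (assuming NumPy; the helper name simulate_jmepm is ours, chosen for illustration). It draws $N$ from the Poisson distribution (3) and then uses the fact, recorded in equations (5)-(6), that the minimum of $n$ IID $\mathrm{Exp}(\beta)$ variables is $\mathrm{Exp}(n\beta)$:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_jmepm(beta, lam, size):
    """Draw from the stochastic representation (4): N ~ Poisson(lam);
    given N = n >= 1, X = min of n IID Exp(beta) draws, i.e. Exp(n*beta);
    given N = 0, the vector is the point mass at (0, 0)."""
    n = rng.poisson(lam, size)
    x = np.zeros(size)
    pos = n > 0
    # By (5)-(6), the min of n IID Exp(beta) variables is Exp(n * beta).
    x[pos] = rng.exponential(1.0 / (n[pos] * beta))
    return x, n

beta, lam = 2.0, 3.0
x, n = simulate_jmepm(beta, lam, 200_000)
print("P(N = 0):", (n == 0).mean(), "vs exp(-lam):", np.exp(-lam))
print("E[X | N = 2]:", x[n == 2].mean(), "vs 1/(2*beta):", 1 / (2 * beta))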
Definition and Basic Properties
Definition 1. A random vector $(X, N)$ with stochastic representation (4), where the $X_i$ are IID exponential variables (1) and $N$ is a Poisson variable (3) independent of the $X_i$, is said to have a JMEPM distribution with parameters $\beta, \lambda > 0$, denoted by $JMEPM(\beta, \lambda)$.

The joint distribution of $(X, N)$ can be derived via a hierarchical approach and the standard technique of conditioning. Let $f(x, n)$ be the joint PDF of $(X, N)$ and write $f_{X \mid N = n}(x) = f(x \mid N = n)$; then

$$f(x, n) = f(x \mid N = n)\, P(N = n). \quad (7)$$

Theorem 1. The PDF of $(X, N) \sim JMEPM(\beta, \lambda)$ is

$$f(x, n) = n\beta e^{-n\beta x}\, \frac{e^{-\lambda} \lambda^n}{n!}, \quad x > 0,\ n \in \{1, 2, \ldots\} \quad (8)$$

and the CDF of $(X, N) \sim JMEPM(\beta, \lambda)$ is

$$F(x, n) = \sum_{k=1}^{\lfloor n \rfloor} \frac{e^{-\lambda} \lambda^k}{k!} \left(1 - e^{-k\beta x}\right), \quad x > 0,\ n \in \{1, 2, \ldots\}, \quad (9)$$

where $\lfloor \cdot \rfloor$ denotes the greatest integer function.

If $n = 0$, then $P(N = 0) = e^{-\lambda}$ and $F_{X \mid N = 0}(x) = 1$ for $x \ge 0$, so that $f(x, n) = e^{-\lambda} I_{\{(0,0)\}}(x, n)$ because there are zero events. Thus we have

$$f(x, n) = e^{-\lambda} I_{\{(0,0)\}}(x, n) + \left(1 - e^{-\lambda}\right) \frac{n\beta e^{-n\beta x}}{1 - e^{-\lambda}}\, \frac{e^{-\lambda} \lambda^n}{n!}\, I_{\mathbb{R}_+ \times \mathbb{N}}(x, n), \quad (10)$$

where $I_A$ denotes the indicator function of the set $A$. We see that $f(x, n)$ is a mixture distribution: with probability $e^{-\lambda}$ it is a point mass at $(0, 0)$, and with probability $1 - e^{-\lambda}$ it is the random vector $(Y, M) \in \mathbb{R}_+ \times \mathbb{N}$ given by the PDF

$$g(x, n) = \frac{n\beta e^{-n\beta x}}{1 - e^{-\lambda}}\, \frac{e^{-\lambda} \lambda^n}{n!}, \quad x > 0,\ n \in \{1, 2, \ldots\}. \quad (11)$$

Proposition 1. If $(X, N)$ has joint distribution $f(x, n)$, then

$$(X, N) \stackrel{d}{=} (IY, IM), \quad (12)$$

where $(Y, M)$ is a random vector with PDF $g(x, n)$ given by (11) and $I$ is an indicator random variable, independent of $(Y, M)$, taking on the values 1 and 0 with probabilities $1 - e^{-\lambda}$ and $e^{-\lambda}$, respectively.

Marginal Distributions
It is useful to know the univariate distribution of each variable. We begin by finding the marginal distributions of $Y$ and $M$ in the mixture representation (12). We start with the marginal PDF and CDF of $Y$.

Proposition 2. Let $(Y, M)$ have joint distribution $g(x, n)$. Then the marginal probability density function of $Y$ is

$$g_Y(x) = \frac{\beta \lambda\, e^{-\beta x}\, e^{-\lambda}\, e^{\lambda e^{-\beta x}}}{1 - e^{-\lambda}}, \quad x > 0, \quad (13)$$

and the marginal cumulative distribution function of $Y$ is

$$G_Y(x) = \frac{1 - e^{-\lambda} e^{\lambda e^{-\beta x}}}{1 - e^{-\lambda}}, \quad x > 0. \quad (14)$$

Next, we find the marginal CDF and PDF of $X$ by using equation (12), which implies $X \stackrel{d}{=} IY$, where $Y$ is a continuous random variable with PDF and CDF given in equations (13) and (14), respectively.

Proposition 3. Let $(X, N) \sim JMEPM(\beta, \lambda)$. Then the marginal cumulative distribution function of $X$ is

$$F_X(x) = 1 + e^{-\lambda}\left(1 - e^{\lambda e^{-\beta x}}\right), \quad x \ge 0, \quad (15)$$

and the marginal probability distribution of $X$ is

$$f_X(x) = \begin{cases} \lambda\beta \exp\left(\lambda e^{-\beta x} - \beta x - \lambda\right), & x > 0 \\ e^{-\lambda}, & x = 0. \end{cases} \quad (16)$$

On the other hand, we find the marginal PMF and CDF of $M$ as follows.

Proposition 4. Let $(Y, M)$ have joint distribution $g(x, n)$. Then the marginal probability mass function of $M$ is

$$g_M(n) = \frac{e^{-\lambda} \lambda^n}{\left(1 - e^{-\lambda}\right) n!}, \quad n \in \{1, 2, \ldots\}, \quad (17)$$

and the marginal cumulative distribution function of $M$ is

$$G_M(n) = \frac{e^{-\lambda}}{1 - e^{-\lambda}} \sum_{k=1}^{n} \frac{\lambda^k}{k!}. \quad (18)$$

By Definition 1, the marginal PMF of $N$ is the Poisson distribution in equation (3). In view of the representation in Proposition 1, we have the relation $N \stackrel{d}{=} IM$, where $M$ has the PMF in equation (17) while $I$, independent of $M$, takes on the values 1 and 0 with probabilities $1 - e^{-\lambda}$ and $e^{-\lambda}$, respectively. Thus it is clear that $N = 0$ if and only if $I = 0$, the probability of which is $e^{-\lambda}$, while for $n \in \mathbb{N}$ we have

$$P(N = n) = P(IM = n) = \frac{e^{-\lambda} \lambda^n}{n!}. \quad (19)$$
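The closed-form marginal CDF (15) can be checked against the hierarchical construction by Monte Carlo; the following sketch (again assuming NumPy) does so at a few points, including $x = 0$, where $F_X(0) = e^{-\lambda}$ recovers the point mass:

```python
import numpy as np

def marginal_cdf_x(x, beta, lam):
    """Closed-form marginal CDF of X from equation (15)."""
    return 1.0 + np.exp(-lam) * (1.0 - np.exp(lam * np.exp(-beta * x)))

rng = np.random.default_rng(1)
beta, lam = 2.0, 3.0
n = rng.poisson(lam, 200_000)
x = np.zeros(n.size)
x[n > 0] = rng.exponential(1.0 / (n[n > 0] * beta))  # Exp(n*beta) given N = n
for t in (0.0, 0.1, 0.5, 1.0):
    print(f"F_X({t}) = {marginal_cdf_x(t, beta, lam):.4f},"
          f" empirical = {np.mean(x <= t):.4f}")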
Conditional Distributions
Another important distribution is the conditional distribution: the probability law of one variable when the value of the other is known. We saw in equation (10) that, conditionally on $N = 0$, $X$ is a unit point mass at $x = 0$. Combining this with equation (5), we obtain

$$f_{X \mid N = n}(x) = \begin{cases} n\beta e^{-n\beta x}, & n \in \mathbb{N},\ x \ge 0 \\ 1, & n = x = 0. \end{cases} \quad (20)$$

We can then note that the conditional CDF of $X$ for the case $N = 0$ is

$$F_{X \mid N = 0}(x) = \begin{cases} 1, & x \ge 0 \\ 0, & x < 0. \end{cases}$$

Combining this with equation (6), we obtain

$$F_{X \mid N = n}(x) = \begin{cases} 1 - e^{-n\beta x}, & n \in \mathbb{N},\ x \ge 0 \\ 1, & n = 0 \text{ and } x \ge 0. \end{cases} \quad (21)$$

Next, we find the conditional PMF of $N$ given $X = x$. We make the following proposition.

Proposition 5. Let $(X, N) \sim JMEPM(\beta, \lambda)$. Then the conditional probability distribution of $N$ given $X = x$ is

$$f_{N \mid X = x}(n) = \begin{cases} \dfrac{e^{-\lambda e^{-\beta x}} \left(\lambda e^{-\beta x}\right)^{n-1}}{(n-1)!}, & n \in \mathbb{N} \text{ and } x > 0 \\ 1, & n = x = 0. \end{cases} \quad (22)$$

Note that $N \mid X = x$ is a shifted Poisson random variable. More precisely, for $x > 0$, $N \mid X = x$ has the same distribution as $M^* + 1$, where $M^*$ is Poisson with parameter $\lambda e^{-\beta x}$.

Moments and Moment Generating Function
We find various representations for bivariate and univariate moments connected with the joint distribution $f(x, n)$. To begin, we obtain a general expression for $E[X^r N^\gamma]$ and then proceed to various special cases for particular values of $r$ and $\gamma$. Recalling equation (12), we may proceed as follows:

$$E[X^r N^\gamma] = E[I^{r+\gamma}]\, E[Y^r M^\gamma]. \quad (23)$$

We assume throughout that $r, \gamma \ge 0$. Noting that for $r + \gamma > 0$ we have

$$E[I^{r+\gamma}] = 1 - e^{-\lambda}, \quad (24)$$

we obtain

$$E[X^r N^\gamma] = \begin{cases} 1, & r = \gamma = 0 \\ \left(1 - e^{-\lambda}\right) E[Y^r M^\gamma], & \text{otherwise}. \end{cases} \quad (25)$$

We now state a closed-form expression for $E[X^r N^\gamma]$ in the proposition below.

Proposition 6. If $(X, N) \sim JMEPM(\beta, \lambda)$, then for any non-negative $r$ and $\gamma$ we have

$$E[X^r N^\gamma] = \begin{cases} 1, & r = \gamma = 0 \\ \dfrac{\Gamma(r+1)}{\beta^r} \displaystyle\sum_{n=1}^{\infty} \frac{n^\gamma}{n^r}\, \frac{e^{-\lambda} \lambda^n}{n!}, & \text{otherwise}. \end{cases} \quad (26)$$

We can derive expressions for univariate moments by letting $r$ or $\gamma$ equal 0. The first moment in the univariate case is the mean; this, together with the second moment, yields the variance. We will find the first and second moments of $X$ and $N$. Additionally, we will find the special case $E[XN]$, which is useful for deriving the covariance of $X$ and $N$. The special case of $E[X^r N^\gamma]$ where $\gamma = 0$ gives us

$$E[X^r] = \begin{cases} 1, & r = 0 \\ \dfrac{\Gamma(r+1)}{\beta^r} \displaystyle\sum_{n=1}^{\infty} \frac{e^{-\lambda} \lambda^n}{n^r\, n!}, & \text{otherwise}. \end{cases} \quad (27)$$

Therefore,

$$E[X] = \frac{1}{\beta} \sum_{n=1}^{\infty} \frac{e^{-\lambda} \lambda^n}{n\, n!}, \qquad E[X^2] = \frac{2}{\beta^2} \sum_{n=1}^{\infty} \frac{e^{-\lambda} \lambda^n}{n^2\, n!}, \qquad E[XN] = \frac{1}{\beta} \left(1 - e^{-\lambda}\right).$$

Another special case of $E[X^r N^\gamma]$, where $r = 0$, gives us

$$E[N^\gamma] = \begin{cases} 1, & \gamma = 0 \\ \displaystyle\sum_{n=1}^{\infty} n^\gamma\, \frac{e^{-\lambda} \lambda^n}{n!}, & \text{otherwise}. \end{cases} \quad (28)$$

As expected, since $N$ is a Poisson variable, $E[N] = \lambda$ and $E[N^2] = \lambda^2 + \lambda$. The covariance matrix has the following form:

$$\Sigma = \begin{pmatrix} \dfrac{2}{\beta^2} \displaystyle\sum_{n=1}^{\infty} \dfrac{e^{-\lambda} \lambda^n}{n^2\, n!} - \left( \dfrac{1}{\beta} \displaystyle\sum_{n=1}^{\infty} \dfrac{e^{-\lambda} \lambda^n}{n\, n!} \right)^2 & \dfrac{1}{\beta} \left( 1 - e^{-\lambda} - \lambda \displaystyle\sum_{n=1}^{\infty} \dfrac{e^{-\lambda} \lambda^n}{n\, n!} \right) \\ \dfrac{1}{\beta} \left( 1 - e^{-\lambda} - \lambda \displaystyle\sum_{n=1}^{\infty} \dfrac{e^{-\lambda} \lambda^n}{n\, n!} \right) & \lambda \end{pmatrix}. \quad (29)$$

We derive the moment generating function of the joint PDF $f(x, n)$ in the following proposition.

Proposition 7. Let $(X, N) \sim JMEPM(\beta, \lambda)$. Then the moment generating function of $(X, N)$ is

$$M(t_1, t_2) = e^{-\lambda} + \sum_{n=1}^{\infty} \frac{e^{t_2 n}}{1 - \frac{t_1}{n\beta}}\, \frac{e^{-\lambda} \lambda^n}{n!}, \quad t_1 < \beta,\ t_2 \in \mathbb{R}, \quad (30)$$

where the leading $e^{-\lambda}$ term accounts for the point mass at $(0, 0)$.

Estimation
In practice the values of the parameters of the joint distribution $f(x, n)$ are almost always unknown and have to be estimated from sample data. It is therefore valuable to develop methods of estimation for our parameters $\beta$ and $\lambda$.

Moment Estimators
Moment estimators are found by setting the sample moments equal to the corresponding population moments and solving for the parameters to be estimated.

Proposition 8. Suppose $(X_1, N_1), (X_2, N_2), \ldots, (X_m, N_m)$ is a random sample where $(X_j, N_j) \sim JMEPM(\beta, \lambda)$ and there exists at least one $N_j$ such that $N_j > 0$. Then the moment estimators $\hat\beta$ and $\hat\lambda$ of $\beta$ and $\lambda$, respectively, are

$$\hat\beta = \frac{1}{\bar{X}} \sum_{n=1}^{\infty} \frac{e^{-\hat\lambda} \hat\lambda^n}{n\, n!} \quad (31)$$

and

$$\hat\lambda = \bar{N}, \quad (32)$$

where $\bar{X} = \frac{1}{m} \sum_{j=1}^{m} X_j$ and $\bar{N} = \frac{1}{m} \sum_{j=1}^{m} N_j$.

Maximum Likelihood Estimators
In this section we derive the maximum likelihood estimators (MLE) of $\beta$ and $\lambda$, stated in the proposition below.

Proposition 9. Suppose $(X_1, N_1), (X_2, N_2), \ldots, (X_m, N_m)$ is a random sample where $(X_j, N_j) \sim JMEPM(\beta, \lambda)$ and there exists at least one $N_j$ such that $N_j > 0$. Then the maximum likelihood estimators $\hat\beta$ and $\hat\lambda$ of $\beta$ and $\lambda$, respectively, are

$$\hat\beta = \frac{A}{\sum_{j \in C} N_j X_j} \quad (33)$$

and

$$\hat\lambda = \bar{N}, \quad (34)$$

where $C$ is the set of all $j \in \{1, 2, \ldots, m\}$ such that $N_j > 0$ and $A = |C|$ is the number of elements in $C$.
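Both sets of estimators are straightforward to compute. The sketch below (assuming NumPy; truncating the infinite series in (31) at 100 terms is our implementation choice, justified by its rapid convergence) applies Propositions 8 and 9 to a simulated sample:

```python
import numpy as np
from math import exp, factorial

def moment_estimators(x, n, terms=100):
    """Moment estimators (31)-(32): lambda_hat = mean of the N_j, and
    beta_hat = (1/X-bar) * sum_{k>=1} exp(-lam_hat) lam_hat^k / (k! k)."""
    lam_hat = n.mean()
    series = sum(exp(-lam_hat) * lam_hat**k / (factorial(k) * k)
                 for k in range(1, terms + 1))
    return series / x.mean(), lam_hat

def ml_estimators(x, n):
    """MLEs (33)-(34): with C = {j : N_j > 0} and A = |C|,
    beta_hat = A / sum_{j in C} N_j X_j and lambda_hat = mean of the N_j."""
    c = n > 0
    return c.sum() / np.sum(n[c] * x[c]), n.mean()

rng = np.random.default_rng(2)
beta, lam = 2.0, 3.0
n = rng.poisson(lam, 50_000)
x = np.zeros(n.size)
x[n > 0] = rng.exponential(1.0 / (n[n > 0] * beta))
print("moment estimators:", moment_estimators(x, n))
print("ML estimators:    ", ml_estimators(x, n))
```

For samples of this size, both pairs of estimates should be close to the true $(\beta, \lambda) = (2, 3)$.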
CONCLUSIONS
We derived properties of the joint distribution of the random vector $(X, N)$, where $N$ has a Poisson distribution and $X$ is the minimum of $N$ independent and identically distributed exponential random variables, including its PDF and CDF, the marginal and conditional distributions of each component, the moments and moment generating function of the bivariate distribution, and estimators of the parameters of the distribution.

ACKNOWLEDGEMENT
I thank everyone who helped me to accomplish this work.

REFERENCES
Al-Obaidi, A.H. and K.A. Al-Khafaji, 2010. Some probabilistic properties of a Poisson exponential joint distribution. Department of Mathematics, College of Education, University of Babylon.
David, H.A. and H.N. Nagaraja, 2003. Order statistics. 3rd Edn. USA: John Wiley & Sons, Inc.
Park, S., 1970. Regression and correlation in a bivariate distribution with different marginal densities. Technometrics, 12(3): 687-691.
Ross, S.M., 2007. Introduction to probability models. 9th Edn. Academic Press.
Sarabia, J.M. and M. Guillen, 2008. Joint modeling of the total amount and the number of claims by conditionals. Insurance: Mathematics and Economics, 43: 466-473.