Chapter 8. Some Approximations to Probability Distributions: Limit Theorems

Sections 8.2 -- 8.3: Convergence in Probability and in Distribution

Jiaping Wang, Department of Mathematical Science

04/22/2013, Monday

The UNIVERSITY of NORTH CAROLINA at CHAPEL HILL

Outline

* Convergence in Probability
* Convergence in Distribution

Part 1. Convergence in Probability

Introduction

Suppose that a coin has probability p, with 0 ≤ p ≤ 1, of coming up heads on a single flip. If we flip the coin n times, what can we say about the fraction of heads observed in the n flips? For example, with p = 0.5, a simulation with different numbers of trials gives:

n          100      200      300      400
%          0.4700   0.5200   0.4833   0.5050
|% - 0.5|  0.0300   0.0200   0.0167   0.0050

From the table, as n → ∞ the observed ratio gets closer to 0.5, and thus the difference gets closer to zero.

Definition 8.1

In mathematical notation, let X denote the number of heads observed in the n tosses. Then E(X) = np and V(X) = np(1 - p). One way to measure the closeness of X/n to p is to ascertain the probability that the distance |X/n - p| will be less than a preassigned small value $\varepsilon$, so that

$$P\left(\left|\frac{X}{n} - p\right| < \varepsilon\right) \to 1.$$

Definition 8.1: The sequence of random variables $X_1, X_2, \dots, X_n$ is said to converge in probability to the constant c if, for every positive number $\varepsilon$,

$$\lim_{n\to\infty} P(|X_n - c| < \varepsilon) = 1.$$

Theorem 8.1

Weak Law of Large Numbers: Let $X_1, X_2, \dots, X_n$ be independent and identically distributed random variables with $E(X_i) = \mu$ and $V(X_i) = \sigma^2 < \infty$ for each $i = 1, \dots, n$. Let $\bar{X}_n = \frac{1}{n}\sum_{i=1}^n X_i$. Then, for any positive real number $\varepsilon$,

$$\lim_{n\to\infty} P(|\bar{X}_n - \mu| \ge \varepsilon) = 0, \quad\text{or}\quad \lim_{n\to\infty} P(|\bar{X}_n - \mu| < \varepsilon) = 1.$$

Thus, $\bar{X}_n$ converges in probability toward $\mu$. The proof follows from Tchebysheff's theorem with X replaced by $\bar{X}_n$ and $\sigma^2$ by $\sigma^2/n$, then letting $k = \varepsilon\sqrt{n}/\sigma$.
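The coin-flip behavior described in the Introduction and formalized by Theorem 8.1 can be checked with a short simulation. This is a sketch added for illustration, not part of the slides; `fraction_of_heads` is a name introduced here, and only Python's standard `random` module is assumed.

```python
import random

random.seed(0)  # fix the seed so repeated runs give the same table

def fraction_of_heads(n, p=0.5):
    """Flip a p-coin n times and return the observed fraction of heads."""
    heads = sum(1 for _ in range(n) if random.random() < p)
    return heads / n

# As n grows, |fraction - 0.5| shrinks, as the weak law predicts.
for n in (100, 1000, 10000, 100000):
    frac = fraction_of_heads(n)
    print(f"n = {n:6d}   fraction = {frac:.4f}   |fraction - 0.5| = {abs(frac - 0.5):.4f}")
```

By the Tchebysheff bound used in the proof, $P(|\bar{X}_n - 0.5| \ge \varepsilon) \le 0.25/(n\varepsilon^2)$, so for n = 100000 and $\varepsilon = 0.01$ the deviation probability is at most 0.025.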
Theorem 8.2

Suppose that $X_n$ converges in probability toward $\mu_1$ and $Y_n$ converges in probability toward $\mu_2$. Then the following statements are also true.

1. $X_n + Y_n$ converges in probability toward $\mu_1 + \mu_2$.
2. $X_n Y_n$ converges in probability toward $\mu_1\mu_2$.
3. $X_n/Y_n$ converges in probability toward $\mu_1/\mu_2$, provided $\mu_2 \ne 0$.
4. $\sqrt{X_n}$ converges in probability toward $\sqrt{\mu_1}$, provided $P(X_n \ge 0) = 1$.

Example 8.1

Let X be a binomial random variable with probability of success p and number of trials n. Show that X/n converges in probability toward p.

Answer: We have seen that we can write $X = \sum Y_i$, with $Y_i = 1$ if the i-th trial results in success and $Y_i = 0$ otherwise. Then $X/n = \frac{1}{n}\sum Y_i$. Also $E(Y_i) = p$ and $V(Y_i) = p(1-p)$. The conditions of Theorem 8.1 are fulfilled with $\mu = p$ and $\sigma^2 = p(1-p) < \infty$, and thus we can conclude that, for any positive $\varepsilon$, $\lim_{n\to\infty} P(|X/n - p| \ge \varepsilon) = 0$.

Example 8.2

Suppose that $X_1, X_2, \dots, X_n$ are independent and identically distributed random variables with $E(X_i) = \mu_1$, $E(X_i^2) = \mu_2$, $E(X_i^3) = \mu_3$, and $E(X_i^4) = \mu_4$, all assumed finite. Let $S^2$ denote the sample variance given by

$$S^2 = \frac{1}{n}\sum_{i=1}^n (X_i - \bar{X})^2.$$

Show that $S^2$ converges in probability to $V(X_i)$.

Answer: Notice that $S^2 = \frac{1}{n}\sum_{i=1}^n X_i^2 - \bar{X}^2$, where $\bar{X} = \frac{1}{n}\sum_{i=1}^n X_i$. The quantity $\frac{1}{n}\sum_{i=1}^n X_i^2$ is the average of n independent and identically distributed variables of the form $X_i^2$, with $E(X_i^2) = \mu_2$ and $V(X_i^2) = \mu_4 - \mu_2^2$, which is finite. Thus Theorem 8.1 tells us that $\frac{1}{n}\sum_{i=1}^n X_i^2$ converges to $\mu_2$ in probability. Likewise, $\bar{X}$ converges in probability to $\mu_1$, so by Theorem 8.2, $\bar{X}^2$ converges in probability to $\mu_1^2$. Finally, again by Theorem 8.2, $S^2 = \frac{1}{n}\sum_{i=1}^n X_i^2 - \bar{X}^2$ converges in probability to $\mu_2 - \mu_1^2 = V(X_i)$.

This example shows that for large samples, the sample variance has a high probability of being close to the population variance.
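Example 8.2 can likewise be checked numerically: for Uniform(0, 1) samples, $V(X_i) = 1/12 \approx 0.0833$, and the sample variance should stabilize near that value as n grows. A minimal sketch added for illustration, not part of the slides; `sample_variance` is a helper name introduced here.

```python
import random

random.seed(0)

def sample_variance(xs):
    """(1/n) * sum((x_i - xbar)^2), the estimator from Example 8.2."""
    n = len(xs)
    xbar = sum(xs) / n
    return sum((x - xbar) ** 2 for x in xs) / n

# Uniform(0, 1) draws: the population variance is 1/12 ~= 0.08333.
for n in (100, 1000, 100000):
    xs = [random.random() for _ in range(n)]
    print(f"n = {n:6d}   sample variance = {sample_variance(xs):.5f}")
```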
Part 2. Convergence in Distribution

Definition 8.2

In the last section, we studied only the convergence of certain random variables toward constants. In this section, we study the probability distributions of certain types of random variables as n tends toward infinity.

Definition 8.2: Let $X_n$ be a random variable with distribution function $F_n(x)$, and let X be a random variable with distribution function $F(x)$. If

$$\lim_{n\to\infty} F_n(x) = F(x)$$

at every point x for which $F(x)$ is continuous, then $X_n$ is said to converge in distribution toward X. $F(x)$ is called the limiting distribution function of $X_n$.

Example 8.3

Let $X_1, X_2, \dots, X_n$ be independent uniform random variables over the interval $(\theta, 0)$ for a negative constant $\theta$. In addition, let $Y_n = \min(X_1, X_2, \dots, X_n)$. Find the limiting distribution of $Y_n$.

Answer: The distribution function for the uniform random variable $X_i$ is

$$F(x) = P(X_i \le x) = \begin{cases} 0, & x < \theta \\ \dfrac{x - \theta}{-\theta}, & \theta \le x \le 0 \\ 1, & x > 0. \end{cases}$$

We know

$$G(y) = P(Y_n \le y) = 1 - P(Y_n > y) = 1 - P(\min(X_1, \dots, X_n) > y) = 1 - P(X_1 > y)P(X_2 > y)\cdots P(X_n > y) = 1 - [1 - F(y)]^n,$$

so

$$G(y) = \begin{cases} 0, & y < \theta \\ 1 - (y/\theta)^n, & \theta \le y \le 0 \\ 1, & y > 0, \end{cases}$$

and we can find

$$\lim_{n\to\infty} G(y) = \begin{cases} 0, & y < \theta \\ 1, & y \ge \theta, \end{cases}$$

since $0 \le y/\theta < 1$ for $\theta < y \le 0$, so that $(y/\theta)^n \to 0$. Thus the limiting distribution of $Y_n$ is degenerate at $\theta$.

Theorem 8.3

Let $X_n$ and X be random variables with moment-generating functions $M_n(t)$ and $M(t)$, respectively. If

$$\lim_{n\to\infty} M_n(t) = M(t)$$

for all real t, then $X_n$ converges in distribution toward X.

Example 8.4

Let $X_n$ be a binomial random variable with n trials and probability p of success on each trial. If n tends toward infinity and p tends toward zero with np remaining fixed, show that $X_n$ converges in distribution toward a Poisson random variable.
Answer: We know the moment-generating function of the binomial random variable $X_n$ is

$$M_n(t) = (q + pe^t)^n = [1 + p(e^t - 1)]^n = \left[1 + \frac{\lambda(e^t - 1)}{n}\right]^n,$$

based on $np = \lambda$, so that $p = \lambda/n$ and $q = 1 - \lambda/n$. Recall that

$$\lim_{n\to\infty} \left(1 + \frac{k}{n}\right)^n = e^k.$$

Letting $k = \lambda(e^t - 1)$, we have

$$\lim_{n\to\infty} M_n(t) = \exp[\lambda(e^t - 1)],$$

which is the moment-generating function of the Poisson random variable.

As an example, when n = 10 and p = 0.1, the true probability from the binomial distribution that X is less than 2 is 0.73609, and the approximate value from the Poisson distribution is 0.73575; they are very close. So we can approximate binomial probabilities by the Poisson distribution when n is large and p is small.

Example 8.5

In monitoring for pollution, an experimenter collects a small volume of water and counts the number of bacteria in the sample. Unlike earlier problems, we have only one observation. For purposes of approximating the probability distribution of counts, we can think of the volume as the quantity that is getting large. Let X denote the bacteria count per cubic centimeter of water, and assume that X has a Poisson probability distribution with mean $\lambda$. We want to approximate probabilities for X when $\lambda$ is large, which we do by showing that $Y = \frac{X - \lambda}{\sqrt{\lambda}}$ converges in distribution toward a standard normal random variable as $\lambda$ tends toward infinity. Specifically, if the allowable pollution in a water supply is a count of 110 bacteria per cubic centimeter, approximate the probability that X will be at most 110, assuming that $\lambda = 100$.

Solution

Answer: We know the mgf of the Poisson random variable X is $M_X(t) = \exp[\lambda(e^t - 1)]$, thus the mgf of Y is

$$M_Y(t) = \exp(-t\sqrt{\lambda})\exp[\lambda(e^{t/\sqrt{\lambda}} - 1)].$$
The term $e^{t/\sqrt{\lambda}} - 1$ can be written as

$$e^{t/\sqrt{\lambda}} - 1 = \frac{t}{\sqrt{\lambda}} + \frac{t^2}{2\lambda} + \frac{t^3}{6\lambda^{3/2}} + \cdots$$

Thus

$$M_Y(t) = \exp\left[-t\sqrt{\lambda} + \lambda\left(\frac{t}{\sqrt{\lambda}} + \frac{t^2}{2\lambda} + \frac{t^3}{6\lambda^{3/2}} + \cdots\right)\right] = \exp\left[\frac{t^2}{2} + \frac{t^3}{6\sqrt{\lambda}} + \cdots\right].$$

When $\lambda \to \infty$, $M_Y(t) \to \exp(t^2/2)$, which is the mgf of the standard normal distribution. So we can approximate Poisson probabilities by the standard normal distribution when $\lambda$ is large enough (for example, $\lambda \ge 25$). Here,

$$P(X \le 110) = P\left(\frac{X - \lambda}{\sqrt{\lambda}} \le \frac{110 - 100}{10}\right) = P(Y \le 1) \approx 0.8413.$$
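The approximation in Example 8.5 can be compared against the exact Poisson sum. This is a sketch added for illustration, not part of the slides; `poisson_cdf` and `normal_cdf` are helper names introduced here, using only the standard `math` module.

```python
import math

def poisson_cdf(k, lam):
    """Exact P(X <= k) for X ~ Poisson(lam), summing the pmf term by term."""
    total = 0.0
    term = math.exp(-lam)  # P(X = 0)
    for i in range(k + 1):
        total += term
        term *= lam / (i + 1)  # recurrence: P(X = i + 1) = P(X = i) * lam / (i + 1)
    return total

def normal_cdf(z):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

lam = 100
z = (110 - lam) / math.sqrt(lam)  # (110 - 100)/10 = 1
print(f"normal approximation: {normal_cdf(z):.4f}")  # about 0.8413
print(f"exact Poisson value:  {poisson_cdf(110, lam):.4f}")
```

The exact value comes out slightly larger than the uncorrected normal approximation, and the gap shrinks as $\lambda$ grows, consistent with the convergence in distribution shown above.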