Chapter 5 Distributions of Functions of Random Variables
Theorem 5.4-2
Let X1, X2, . . . , Xn be independent chi-square random variables with r1, r2, . . . , rn degrees of freedom, respectively. Then Y = X1 + X2 + · · · + Xn is χ2(r1 + r2 + · · · + rn).
Proof By Theorem 5.4-1 with each ai = 1, the mgf of Y is
MY(t) = ∏_{i=1}^{n} MXi(t) = (1 − 2t)^{−r1/2} (1 − 2t)^{−r2/2} · · · (1 − 2t)^{−rn/2}
      = (1 − 2t)^{−(r1 + r2 + · · · + rn)/2},    with t < 1/2,
which is the mgf of a χ2(r1 + r2 + · · · + rn). Thus, Y is χ2(r1 + r2 + · · · + rn). □
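As a quick numerical illustration (not part of the text), the following Python sketch simulates independent χ2(3) and χ2(5) random variables and checks that their sum behaves like a χ2(8) random variable; it assumes numpy and scipy are available, and the degrees of freedom 3 and 5 are arbitrary choices.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Independent chi-square samples with 3 and 5 degrees of freedom (arbitrary).
    x1 = rng.chisquare(3, size=100_000)
    x2 = rng.chisquare(5, size=100_000)
    y = x1 + x2

    # Theorem 5.4-2: Y should be chi-square with 3 + 5 = 8 degrees of freedom.
    print(y.mean(), 8.0)                             # both close to 8
    print(stats.kstest(y, stats.chi2(df=8).cdf))     # large p-value expected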
The next two corollaries combine and extend the results of Theorems 3.3-2 and
5.4-2 and give one interpretation of degrees of freedom.
Corollary 5.4-2
Let Z1, Z2, . . . , Zn have standard normal distributions, N(0, 1). If these random variables are independent, then W = Z1² + Z2² + · · · + Zn² has a distribution that is χ2(n).
Proof By Theorem 3.3-2, Zi² is χ2(1) for i = 1, 2, . . . , n. From Theorem 5.4-2, with Y = W and ri = 1, it follows that W is χ2(n). □
Corollary 5.4-3
If X1, X2, . . . , Xn are independent and have normal distributions N(μi, σi²), i = 1, 2, . . . , n, respectively, then the distribution of
W = Σ_{i=1}^{n} (Xi − μi)²/σi²
is χ2(n).
Proof This follows from Corollary 5.4-2, since Zi = (Xi − μi )/σi is N(0, 1), and thus
Zi² = (Xi − μi)²/σi²
is χ2(1), i = 1, 2, . . . , n. □
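Corollaries 5.4-2 and 5.4-3 can be illustrated the same way. The sketch below (again assuming numpy and scipy, with arbitrary means and standard deviations) simulates W = Σ_{i=1}^{n} (Xi − μi)²/σi² and compares it with χ2(n).

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    mu = np.array([1.0, -2.0, 0.5])       # arbitrary means
    sigma = np.array([0.5, 2.0, 1.5])     # arbitrary standard deviations
    n = len(mu)

    # Simulate X_1, ..., X_n and form W = sum of ((X_i - mu_i)/sigma_i)**2.
    x = rng.normal(mu, sigma, size=(100_000, n))
    w = (((x - mu) / sigma) ** 2).sum(axis=1)

    # Corollary 5.4-3: W should be chi-square with n degrees of freedom.
    print(w.mean(), n)                               # both close to 3
    print(stats.kstest(w, stats.chi2(df=n).cdf))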
Exercises
5.4-1. Let X1 , X2 , X3 be a random sample of size 3 from
the distribution with pmf f (x) = 1/4, x = 1, 2, 3, 4.
For example, observe three independent rolls of a fair
four-sided die.
(a) Find the pmf of Y = X1 + X2 + X3 .
(b) Sketch a bar graph of the pmf of Y.
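One way to check the answer to Exercise 5.4-1 is brute-force enumeration of the 4³ equally likely outcomes; the sketch below is such a check (not required by the exercise) and uses only the Python standard library.

    from itertools import product
    from collections import Counter
    from fractions import Fraction

    # Enumerate all 4**3 equally likely outcomes of three rolls of a fair
    # four-sided die and tally the sums.
    counts = Counter(sum(rolls) for rolls in product([1, 2, 3, 4], repeat=3))
    pmf = {y: Fraction(c, 4 ** 3) for y, c in sorted(counts.items())}
    for y, prob in pmf.items():
        print(y, prob)                    # pmf of Y on y = 3, 4, ..., 12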
5.4-2. Let X1 and X2 have independent distributions
b(n1 , p) and b(n2 , p). Find the mgf of Y = X1 + X2 . How
is Y distributed?
5.4-3. Let X1 , X2 , X3 be mutually independent random
variables with Poisson distributions having means 2, 1,
and 4, respectively.
(a) Find the mgf of the sum Y = X1 + X2 + X3 .
(b) How is Y distributed?
(c) Compute P(3 ≤ Y ≤ 9).
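Once Y in Exercise 5.4-3 is recognized as a single Poisson random variable with mean 2 + 1 + 4 = 7 (see Exercise 5.4-4), part (c) can be checked numerically; a sketch assuming scipy is available:

    from scipy.stats import poisson

    # Y = X1 + X2 + X3 is Poisson with mean 2 + 1 + 4 = 7 (Exercise 5.4-4).
    p = poisson.cdf(9, 7) - poisson.cdf(2, 7)    # P(3 <= Y <= 9)
    print(p)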
5.4-4. Generalize Exercise 5.4-3 by showing that the sum
of n independent Poisson random variables with respective means μ1 , μ2 , . . . , μn is Poisson with mean
μ1 + μ2 + · · · + μn .
5.4-5. Let Z1, Z2, . . . , Z7 be a random sample from the standard normal distribution N(0, 1). Let W = Z1² + Z2² + · · · + Z7². Find P(1.69 < W < 14.07).
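By Corollary 5.4-2, W in Exercise 5.4-5 is χ2(7), so the probability can be found from a chi-square table or, as in the sketch below (assuming scipy), from the χ2(7) cdf.

    from scipy.stats import chi2

    # W = Z1**2 + ... + Z7**2 is chi-square with 7 degrees of freedom.
    p = chi2.cdf(14.07, df=7) - chi2.cdf(1.69, df=7)
    print(p)                                     # about 0.925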
5.4-6. Let X1 , X2 , X3 , X4 , X5 be a random sample of size
5 from a geometric distribution with p = 1/3.
(a) Find the mgf of Y = X1 + X2 + X3 + X4 + X5 .
(b) How is Y distributed?
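For Exercise 5.4-6, note that this text's geometric random variable counts the trials up to and including the first success, so Y counts the trials needed to obtain five successes. The sketch below (assuming numpy and scipy) checks this by simulation against scipy's negative binomial, which counts failures before the fifth success, so Y − 5 is what is compared.

    import numpy as np
    from scipy.stats import geom, nbinom

    rng = np.random.default_rng(2)
    p = 1 / 3

    # Five independent geometric variables (trials until the first success).
    y = geom.rvs(p, size=(100_000, 5), random_state=rng).sum(axis=1)

    # Y counts trials until the 5th success, so Y - 5 counts the failures,
    # which is scipy's negative binomial with parameters r = 5 and p.
    print(y.mean(), 5 / p)                           # both close to 15
    print(np.mean(y - 5 >= 10), nbinom.sf(9, 5, p))  # P(Y >= 15), two ways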
5.4-7. Let X1 , X2 , X3 denote a random sample of size 3
from a gamma distribution with α = 7 and θ = 5.
(a) Find the mgf of Y = X1 + X2 + X3 .
(b) How is Y distributed?
5.4-8. Let W = X1 + X2 + · · · + Xh , a sum of h mutually
independent and identically distributed exponential random variables with mean θ . Show that W has a gamma
distribution with parameters α = h and θ , respectively.
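A simulation check of Exercise 5.4-8, with the arbitrary choices h = 4 and θ = 3 (numpy and scipy assumed):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    h, theta = 4, 3.0                     # arbitrary illustrative values

    # W = sum of h independent exponential variables, each with mean theta.
    w = rng.exponential(theta, size=(100_000, h)).sum(axis=1)

    # Exercise 5.4-8: W should be gamma with shape alpha = h and scale theta.
    print(w.mean(), h * theta)                                   # both close to 12
    print(stats.kstest(w, stats.gamma(a=h, scale=theta).cdf))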
5.4-9. Let X and Y, with respective pmfs f (x) and g( y),
be independent discrete random variables, each of whose
support is a subset of the nonnegative integers 0, 1, 2, . . . .
Show that the pmf of W = X + Y is given by the
convolution formula
h(w) = Σ_{x=0}^{w} f(x)g(w − x),    w = 0, 1, 2, . . . .
Hint: Argue that h(w) = P(W = w) is the probability of the w + 1 mutually exclusive events (x, y = w − x), x = 0, 1, . . . , w.
5.4-10. Let X equal the outcome when a fair four-sided die that has its faces numbered 0, 1, 2, and 3 is rolled. Let Y equal the outcome when a fair four-sided die that has its faces numbered 0, 4, 8, and 12 is rolled.
(a) Define the mgf of X.
(b) Define the mgf of Y.
(c) Let W = X + Y, the sum when the pair of dice is rolled. Find the mgf of W.
(d) Give the pmf of W; that is, determine P(W = w), w = 0, 1, . . . , 15, from the mgf of W.
5.4-11. Let X and Y equal the outcomes when two fair six-sided dice are rolled. Let W = X + Y. Assuming independence, find the pmf of W when
(a) The first die has three faces numbered 0 and three faces numbered 2, and the second die has its faces numbered 0, 1, 4, 5, 8, and 9.
(b) The faces on the first die are numbered 0, 1, 2, 3, 4, and 5, and the faces on the second die are numbered 0, 6, 12, 18, 24, and 30.
5.4-12. Let X and Y be the outcomes when a pair of fair eight-sided dice is rolled. Let W = X + Y. How should the faces of the dice be numbered so that W has a uniform distribution on 0, 1, . . . , 15?
5.4-13. Let X1, X2, . . . , X8 be a random sample from a distribution having pmf f(x) = (x + 1)/6, x = 0, 1, 2.
(a) Use Exercise 5.4-9 to find the pmf of W1 = X1 + X2.
(b) What is the pmf of W2 = X3 + X4?
(c) Now find the pmf of W = W1 + W2 = X1 + X2 + X3 + X4.
(d) Find the pmf of Y = X1 + X2 + · · · + X8.
(e) Construct probability histograms for X1, W1, W, and Y. Are these histograms skewed or symmetric?
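The convolution formula of Exercise 5.4-9 is easy to apply by machine, and repeated convolutions give the sums in Exercise 5.4-13. The sketch below assumes numpy is available and uses the pmf f(x) = (x + 1)/6, x = 0, 1, 2, from Exercise 5.4-13.

    import numpy as np

    # pmf of each X_i in Exercise 5.4-13: f(x) = (x + 1)/6 for x = 0, 1, 2.
    f = np.array([1, 2, 3]) / 6

    # Convolution formula of Exercise 5.4-9: pmf of W1 = X1 + X2.
    w1 = np.convolve(f, f)                # support 0, 1, ..., 4

    # Convolving again gives W = X1 + ... + X4 and then Y = X1 + ... + X8.
    w = np.convolve(w1, w1)               # support 0, 1, ..., 8
    y = np.convolve(w, w)                 # support 0, 1, ..., 16
    print(w1)
    print(w.sum(), y.sum())               # each pmf sums to 1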
5.4-14. The number of accidents in a period of one week follows a Poisson distribution with mean 2. The numbers of accidents from week to week are independent. What is the probability of exactly seven accidents in a given three weeks? Hint: See Exercise 5.4-4.
5.4-15. Given a fair four-sided die, let Y equal the number of rolls needed to observe each face at least once.
(a) Argue that Y = X1 + X2 + X3 + X4, where Xi has a geometric distribution with pi = (5 − i)/4, i = 1, 2, 3, 4, and X1, X2, X3, X4 are independent.
(b) Find the mean and variance of Y.
(c) Find P(Y = y), y = 4, 5, 6, 7.
5.4-16. The number X of sick days taken during a year by
an employee follows a Poisson distribution with mean 2.
Let us observe four such employees. Assuming independence, compute the probability that their total number of
sick days exceeds 10.
5.4-17. In a study concerning a new treatment of a certain disease, two groups of 25 participants each were
followed for five years. Those in one group took the old
treatment and those in the other took the new treatment. The theoretical dropout rate for an individual was
50% in both groups over that 5-year period. Let X be
the number that dropped out in the first group and Y
the number in the second group. Assuming independence
where needed, give the sum that equals the probability that Y ≥ X + 2. Hint: What is the distribution of
Y − X + 25?
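A numerical check of Exercise 5.4-17 (assuming scipy): the double sum adds P(X = x)P(Y = y) over the pairs with y ≥ x + 2, and the hint's observation that Y − X + 25 is b(50, 1/2) gives the same value from a single binomial tail.

    from scipy.stats import binom

    n, p = 25, 0.5

    # Direct double sum of P(X = x) P(Y = y) over the pairs with y >= x + 2.
    direct = sum(binom.pmf(x, n, p) * binom.pmf(y, n, p)
                 for x in range(n + 1) for y in range(x + 2, n + 1))

    # Using the hint: Y - X + 25 is b(50, 1/2), so P(Y >= X + 2) = P(Y - X + 25 >= 27).
    via_hint = binom.sf(26, 2 * n, p)
    print(direct, via_hint)               # the two values agree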
5.4-18. The number of cracks on a highway averages 0.5 per mile and follows a Poisson distribution.
Assuming independence (which may not be a good
assumption; why?), what is the probability that, in a
40-mile stretch of that highway, there are fewer than 15
cracks?
5.4-19. A doorman at a hotel is trying to get three taxicabs for three different couples. The arrival of empty cabs has an exponential distribution with mean 2 minutes. Assuming independence, what is the probability that the doorman will get all three couples taken care of within 6 minutes?
5.4-20. The time X in minutes of a visit to a cardiovascular disease specialist by a patient is modeled by a gamma pdf with α = 1.5 and θ = 10. Suppose that you are such a patient and have four patients ahead of you. Assuming independence, what integral gives the probability that you will wait more than 90 minutes?
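Exercise 5.4-19 reduces to a gamma probability: by Exercise 5.4-8, the total time for the three cabs is gamma with α = 3 and θ = 2 minutes. The one-line check below assumes scipy; Exercise 5.4-20 reduces to a gamma cdf in the same way once the appropriate gamma parameters are identified.

    from scipy.stats import gamma

    # Exercise 5.4-19: the wait for three cabs is gamma with alpha = 3, theta = 2,
    # so the probability that all three couples are served within 6 minutes is:
    print(gamma.cdf(6, a=3, scale=2))     # roughly 0.58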
5.4-21. Let X and Y be independent with distributions N(5, 16) and N(6, 9), respectively. Evaluate P(X > Y) = P(X − Y > 0).
5.4-22. Let X1 and X2 be two independent random variables. Let X1 and Y = X1 + X2 be χ2(r1) and χ2(r), respectively, where r1 < r.
(a) Find the mgf of X2.
(b) What is its distribution?
5.4-23. Let X be N(0, 1). Use the mgf technique to show that Y = X² is χ2(1). Hint: Evaluate the integral representing E(e^{tX²}) by writing w = x√(1 − 2t).
5.5 RANDOM FUNCTIONS ASSOCIATED WITH NORMAL DISTRIBUTIONS
In statistical applications, it is often assumed that the population from which a sample is taken is normally distributed, N(μ, σ²). There is then interest in estimating the parameters μ and σ² or in testing conjectures about these parameters. The usual statistics that are used in these activities are the sample mean X̄ and the sample variance S²; thus, we need to know something about the distribution of these statistics or functions of these statistics.
We now use the mgf technique of Section 5.4 to prove a theorem that deals with
linear functions of independent normally distributed random variables.
Theorem 5.5-1
If X1, X2, . . . , Xn are n mutually independent normal variables with means μ1, μ2, . . . , μn and variances σ1², σ2², . . . , σn², respectively, then the linear function
Y = Σ_{i=1}^{n} ci Xi
has the normal distribution
N(Σ_{i=1}^{n} ci μi, Σ_{i=1}^{n} ci² σi²).
Proof By Theorem 5.4-1, we have, with −∞ < ci t < ∞, or −∞ < t < ∞,
MY(t) = ∏_{i=1}^{n} MXi(ci t) = ∏_{i=1}^{n} exp(μi ci t + σi² ci² t²/2)
because MXi(t) = exp(μi t + σi² t²/2), i = 1, 2, . . . , n. Thus,
MY(t) = exp[(Σ_{i=1}^{n} ci μi) t + (Σ_{i=1}^{n} ci² σi²) t²/2],
which is the mgf of the N(Σ_{i=1}^{n} ci μi, Σ_{i=1}^{n} ci² σi²) distribution, as claimed. □
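A simulation sketch of Theorem 5.5-1, with arbitrary coefficients, means, and standard deviations (numpy and scipy assumed):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    c = np.array([2.0, -1.0, 0.5])        # arbitrary coefficients
    mu = np.array([1.0, 3.0, -2.0])       # arbitrary means
    sigma = np.array([1.0, 0.5, 2.0])     # arbitrary standard deviations

    # Y = sum of c_i X_i for independent X_i ~ N(mu_i, sigma_i**2).
    x = rng.normal(mu, sigma, size=(100_000, len(c)))
    y = x @ c

    # Theorem 5.5-1: Y is N(sum c_i mu_i, sum c_i**2 sigma_i**2).
    mean_y = float(np.sum(c * mu))
    sd_y = float(np.sqrt(np.sum(c ** 2 * sigma ** 2)))
    print(y.mean(), mean_y)               # both close to -2
    print(y.std(), sd_y)                  # both close to sqrt(5.25)
    print(stats.kstest(y, stats.norm(mean_y, sd_y).cdf))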