274 Chapter 4 Special Distributions
4.6.1. An Arctic weather station has three electronic wind
gauges. Only one is used at any given time. The lifetime of each gauge is exponentially distributed with a
mean of one thousand hours. What is the pdf of Y , the
random variable measuring the time until the last gauge
wears out?
4.6.2. A service contract on a new university computer
system provides twenty-four free repair calls from a technician. Suppose the technician is required, on the average,
three times a month. What is the average time it will take
for the service contract to be fulfilled?
4.6.3. Suppose a set of measurements Y1 , Y2 , . . . , Y100 is
taken from a gamma pdf for which E(Y ) = 1.5 and
Var(Y ) = 0.75. How many Yi ’s would you expect to find
in the interval [1.0, 2.5]?
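A quick numerical sanity check on Exercise 4.6.3 (not a substitute for the analytic derivation) is possible because E(Y) = r/λ and Var(Y) = r/λ² pin down the parameters, and for integer r the gamma cdf has a closed form. The sketch below, including the helper name, is illustrative:

```python
import math

# E(Y) = r/lam = 1.5 and Var(Y) = r/lam^2 = 0.75
# imply lam = 1.5/0.75 = 2 and r = 3 (an integer, so the
# closed-form Erlang cdf applies).
r, lam = 3, 2.0

def gamma_cdf(x, r, lam):
    """Cdf of a gamma(r, lam) variable for integer r:
    F(x) = 1 - e^(-lam*x) * sum_{k=0}^{r-1} (lam*x)^k / k!"""
    s = sum((lam * x) ** k / math.factorial(k) for k in range(r))
    return 1.0 - math.exp(-lam * x) * s

p = gamma_cdf(2.5, r, lam) - gamma_cdf(1.0, r, lam)  # P(1.0 <= Y <= 2.5), about 0.55
print(round(p, 4))
print(round(100 * p))  # expected number of the 100 Yi's in the interval
```

With one hundred measurements, the expected count in the interval is just 100 times the interval probability.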
4.6.4. Demonstrate that λ plays the role of a scale parameter by showing that if Y is gamma with parameters r and
λ, then λY is gamma with parameters r and 1.
4.6.5. Show that a gamma pdf has the unique mode (r − 1)/λ;
that is, show that the function f_Y(y) = (λ^r/Γ(r)) y^(r−1) e^(−λy) takes its
maximum value at y_mode = (r − 1)/λ and at no other point.
4.6.6. Prove that Γ(1/2) = √π. [Hint: Consider E(Z²),
where Z is a standard normal random variable.]
4.6.7. Show that Γ(7/2) = (15/8)√π.
4.6.8. If the random variable Y has the gamma pdf with
integer parameter r and arbitrary λ > 0, show that

E(Y^m) = (m + r − 1)!/((r − 1)! λ^m)

[Hint: Use the fact that ∫₀^∞ y^(r−1) e^(−y) dy = (r − 1)! when r is
a positive integer.]
4.6.9. Differentiate the gamma moment-generating function to verify the formulas for E(Y ) and Var(Y ) given in
Theorem 4.6.3.
4.6.10. Differentiate the gamma moment-generating
function to show that the formula for E(Y m ) given in
Question 4.6.8 holds for arbitrary r > 0.
4.7 Taking a Second Look at Statistics (Monte Carlo Simulations)
Calculating probabilities associated with (1) single random variables and (2) functions of sets of random variables has been the overarching theme of Chapters 3
and 4. Facilitating those computations has been a variety of transformations, summation properties, and mathematical relationships linking one pdf with another.
Collectively, these results are enormously effective. Sometimes, though, the intrinsic
complexity of a random variable overwhelms our ability to model its probabilistic behavior in any formal or precise way. An alternative in those situations is
to use a computer to draw random samples from one or more distributions that
model portions of the random variable’s behavior. If a large enough number of
such samples is generated, a histogram (or density-scaled histogram) can be constructed that will accurately reflect the random variable’s true (but unknown)
distribution. Sampling “experiments” of this sort are known as Monte Carlo
simulations.
Real-life situations where a Monte Carlo analysis could be helpful are not
difficult to imagine. Suppose, for instance, you just bought a state-of-the-art, high-definition plasma screen television. In addition to the pricey initial cost, an optional
warranty is available that covers all repairs made during the first two years. According to an independent laboratory’s reliability study, this particular television is
likely to require 0.75 service call per year, on the average. Moreover, the costs
of service calls are expected to be normally distributed with a mean (μ) of $100
and a standard deviation (σ ) of $20. If the warranty sells for $200, should you
buy it?
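The warranty question lends itself to exactly the sort of simulation this section describes. The sketch below assumes the number of service calls over the two-year warranty period follows a Poisson distribution with mean 2 × 0.75 = 1.5; that modeling choice, the trial count, and the helper names are illustrative assumptions, not a prescription from the text:

```python
import math
import random

random.seed(1)  # make the simulated run reproducible

def poisson(lam):
    """Sample a Poisson(lam) count by Knuth's multiplication method."""
    limit = math.exp(-lam)
    k, prod = 0, random.random()
    while prod > limit:
        k += 1
        prod *= random.random()
    return k

TRIALS = 100_000
CALL_RATE = 0.75 * 2      # assumed: Poisson mean over the two-year warranty
MU, SIGMA = 100.0, 20.0   # cost per service call: normal, mean $100, sd $20
WARRANTY = 200.0

costs = []
for _ in range(TRIALS):
    n_calls = poisson(CALL_RATE)
    costs.append(sum(random.gauss(MU, SIGMA) for _ in range(n_calls)))

mean_cost = sum(costs) / TRIALS
frac_over = sum(c > WARRANTY for c in costs) / TRIALS
print(f"average two-year repair cost: ${mean_cost:.2f}")
print(f"fraction of trials costing more than $200: {frac_over:.3f}")
```

Since the expected total cost is 1.5 × $100 = $150, the simulated average should land below the $200 warranty price; whether the warranty is still worth buying depends on how much weight you give the minority of expensive outcomes, and a density-scaled histogram of the simulated costs is precisely the tool for seeing how large that minority is.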