Solutions to Problem Set #7
Section 8.1
1. A fair coin is tossed 100 times. The expected number of heads is 50, and the standard
deviation for the number of heads is (100 · 1/2 · 1/2)^(1/2) = 5. What does Chebyshev’s inequality
tell you about the probability that the number of heads that turn up deviates from the expected
number 50 by three or more standard deviations (i.e., by at least 15)?
Chebyshev’s Inequality tells us that
P(|X − µ| ≥ 3σ) = P(|X − 50| ≥ 15) ≤ σ²/15² = 25/225 = 1/9.
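As a sanity check (not part of the original solution), a short Python sketch using only the standard library computes the exact binomial tail probability, which turns out to be far smaller than the Chebyshev bound of 1/9:

from math import comb

# Exact tail probability P(|S_100 - 50| >= 15) for a fair coin, compared
# with the Chebyshev bound of 1/9 (Chebyshev is only a loose upper bound).
n = 100
pmf = [comb(n, k) * 0.5**n for k in range(n + 1)]
tail = sum(prob for k, prob in enumerate(pmf) if abs(k - 50) >= 15)

print(f"Chebyshev bound:   {1/9:.4f}")   # 0.1111
print(f"exact probability: {tail:.4f}")  # roughly 0.004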
4. A 1-dollar bet on craps has an expected winning of −.0141. What does the Law of Large
Numbers say about your winnings if you make a large number of 1-dollar bets at the craps
table? Does it assure you that your losses will be small? Does it assure you that if n is very
large you will lose?
The Law of Large Numbers does NOT assure you that your losses will be small. In fact, it says that your losses will average .0141 per game, and hence your total losses will, on average, grow without bound as n grows. Strictly speaking, it makes no assertion that you will lose: it only says that it is extremely unlikely that you do not lose when n is large. For no finite value of n, however, does it assure a loss.
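To see both effects numerically, here is a minimal simulation sketch (not part of the original solution); it assumes a ±1-dollar bet that wins with probability 244/495, i.e. the pass-line bet, so the expected winning per bet is −7/495 ≈ −.0141:

import random

# Simulation sketch (assumption: the bet pays +1 with probability 244/495
# and -1 otherwise, giving an expected winning of about -.0141 per bet).
p_win = 244 / 495
random.seed(0)

for n in (1_000, 100_000, 1_000_000):
    total = sum(1 if random.random() < p_win else -1 for _ in range(n))
    # The average per bet settles near -.0141 while the total loss keeps growing.
    print(f"n = {n:>9}: average per bet = {total / n:+.4f}, total = {total:+d}")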
6. Let Sn be the number of successes in n Bernoulli trials with probability p for success on each trial. Show, using Chebyshev’s Inequality, that for any ε > 0,

P(|Sn/n − p| ≥ ε) ≤ p(1 − p)/(nε²).

Let X = Sn/n. Then µ = p and σ² = p(1 − p)/n = pq/n. Plugging this into Chebyshev’s Inequality yields

P(|X − µ| ≥ ε) = P(|Sn/n − p| ≥ ε) ≤ V(X)/ε² = p(1 − p)/(nε²).
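A quick numerical check of this bound, with illustrative values n = 500, p = .3, and ε = .05 (these values are not from the original problem):

from math import comb

# Compare the exact probability P(|S_n/n - p| >= eps) with the
# Chebyshev bound p(1 - p)/(n * eps^2).
n, p, eps = 500, 0.3, 0.05

pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
exact = sum(prob for k, prob in enumerate(pmf) if abs(k / n - p) >= eps)
bound = p * (1 - p) / (n * eps**2)

print(f"exact probability = {exact:.4f}")  # roughly 0.017
print(f"Chebyshev bound   = {bound:.4f}")  # 0.168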
8. A fair coin is tossed a large number of times. Does the Law of Large Numbers assure us
that, if n is large enough, with probability > .99 the number of heads that turn up will not
deviate from n/2 by more than 100?
This question is open to interpretation. There are several reasonable arguments for answering “no”:
1) Strictly speaking, the Law of Large Numbers only tells us about what the outcome
should be on average. It doesn’t say anything about the actual number of heads that will
turn up.
2) You could interpret the statement in the problem to mean that the person who wrote
it is intending to take µ to be n/2, which tends to infinity as n tends to infinity, in which
case the Law of Large Numbers does not apply (one of the hypotheses for using the Law of
Large Numbers is that µ must be finite).
There is also a way to interpret the problem so that the answer is “yes”:
Let Sn be the number of successes in a Bernoulli trials process (in this case, the number of heads that you obtain), with expected value µ = 1/2 on each trial. Then the average number of heads can be represented by Sn/n. The Law of Large Numbers says that, as n → ∞,

P(|Sn/n − 1/2| ≥ ε) → 0,

for all ε > 0. This is equivalent to saying that

P(|Sn − n/2| ≥ nε) → 0,

for all ε > 0. Letting ε = 100/n yields the statement in the problem.
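To illustrate the limit used above (this check is not part of the original solution), the following sketch evaluates P(|Sn/n − 1/2| ≥ ε) for a fixed ε and increasing n; the binomial pmf is computed in log space so that large n does not underflow:

from math import lgamma, log, exp

def binom_pmf(n, k, p=0.5):
    # log-space binomial pmf: exp(log C(n, k) + k log p + (n - k) log(1 - p))
    logp = (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + k * log(p) + (n - k) * log(1 - p))
    return exp(logp)

eps = 0.01
for n in (1_000, 10_000, 100_000):
    tail = sum(binom_pmf(n, k) for k in range(n + 1) if abs(k / n - 0.5) >= eps)
    # The tail probability shrinks toward 0 as n grows, as the LLN asserts.
    print(f"n = {n:>7}: P(|S_n/n - 1/2| >= {eps}) = {tail:.6f}")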
Section 9.1
2. Let S200 be the number of heads that turn up in 200 tosses of a fair coin. Estimate:
(a) P(S200 = 100) ≈ 1/√(100π) ≈ 0.0564.
(b) P(S200 = 90) ≈ (1/√(100π)) e^(−1) ≈ 0.0208.
(c) P(S200 = 80) ≈ (1/√(100π)) e^(−4) ≈ 0.0010.
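For comparison (not part of the original solution), a short check of these Central Limit Theorem estimates against the exact binomial probabilities:

from math import comb, sqrt, pi, exp

# CLT estimate (1/sqrt(100*pi)) * exp(-(j - 100)^2 / 100) versus the exact
# binomial probability P(S_200 = j) for a fair coin.
n = 200
for j in (100, 90, 80):
    clt = exp(-(j - 100) ** 2 / 100) / sqrt(100 * pi)
    exact = comb(n, j) * 0.5**n
    print(f"P(S_200 = {j}): CLT estimate = {clt:.5f}, exact = {exact:.5f}")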
5. A rookie is brought to a baseball club on the assumption that he will have a .300 batting
average (Batting average is the ratio of the number of hits to the number of times at bat).
In the first year, he comes to bat 300 times and his batting average is .267. Assume that his
at bats can be considered Bernoulli trials with probability .3 for success. Could such a low
average be considered just bad luck or should he be sent back to the minor leagues? Comment
on the assumption of Bernoulli trials in this situation.
It’s probably not just bad luck; his performance has probably declined. The Central Limit Theorem approximates b(300, .3, 80), the probability of getting exactly 80 hits (a .267 average), to be about .023. Moreover, the probability of the event “having a batting average this low or lower,” which in our notation is P(A300 ≤ .267), is approximated by the Central Limit Theorem as .5 − NA(0, 1.26) ≈ .104. This is better attributed to poor performance than to bad luck.
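A numerical check of the two quantities used above (illustrative, not part of the original solution), treating the 300 at bats as Bernoulli trials with p = .3:

from math import comb, sqrt, pi, exp, erf

n, p = 300, 0.3
mu, var = n * p, n * p * (1 - p)

exact_80 = comb(n, 80) * p**80 * (1 - p) ** (n - 80)            # exact P(S_300 = 80)
clt_80 = exp(-(80 - mu) ** 2 / (2 * var)) / sqrt(2 * pi * var)  # CLT estimate, about .023

z = (80 - mu) / sqrt(var)                                       # z is about -1.26
prob_low = 0.5 * (1 + erf(z / sqrt(2)))                         # normal cdf, about .104

print(f"P(S_300 = 80):  exact = {exact_80:.4f}, CLT = {clt_80:.4f}")
print(f"P(S_300 <= 80) without continuity correction = {prob_low:.3f}")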