Bernoulli Trials and the Geometric Model
Bernoulli Trials
Slide 12-1
• The basis for the probability models we will examine in this chapter is the Bernoulli trial.
• We have Bernoulli trials if:
– there are two possible outcomes (success and failure).
– the probability of success, p, is constant.
– the trials are independent.

Independence
Slide 12-2
• One of the important requirements for Bernoulli trials is that the trials be independent.
• When we don’t have an infinite population, the trials are not independent. But there is a rule that allows us to pretend we have independent trials:
– The 10% condition – Bernoulli trials must be independent. If that assumption is violated, it is still okay to proceed as long as the sample is smaller than 10% of the population.

The Geometric Model
Slide 12-3
• A single Bernoulli trial is usually not all that interesting.
• A Geometric model tells us the probability for a random variable that counts the number of Bernoulli trials until the first success.
• Geometric models are completely specified by one parameter, p, the probability of success, and are denoted Geom(p).

The Geometric Model (cont.)
Slide 12-4
Geometric probability model for Bernoulli trials: Geom(p)
p = probability of success
q = 1 – p = probability of failure
X = # of trials until the first success occurs
P(X = x) = q^(x−1) p,  x = 1, 2, 3, …
• The expected value is E(X) = 1/p.
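The Geom(p) definitions above can be checked numerically. The sketch below (Python; function names such as `geom_pmf` are my own, not from the slides) simulates Bernoulli trials until the first success and compares the sample mean with E(X) = 1/p:

```python
import random

def geom_pmf(x, p):
    """P(X = x) = q^(x-1) * p for X ~ Geom(p), x = 1, 2, 3, ..."""
    return (1 - p) ** (x - 1) * p

def sample_geometric(p, rng):
    """Run Bernoulli(p) trials until the first success; return the trial count."""
    trials = 1
    while rng.random() >= p:  # failure, keep going
        trials += 1
    return trials

p = 0.2
rng = random.Random(1)
draws = [sample_geometric(p, rng) for _ in range(100_000)]

print(sum(geom_pmf(x, p) for x in range(1, 200)))  # ~1.0: the pmf sums to 1
print(sum(draws) / len(draws))                     # ~5.0: E(X) = 1/p
```

With 100,000 simulated runs the sample mean lands very close to 1/p = 5, illustrating the expected-value fact stated on the slide.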
The Binomial Model
Slide 12-5
• A Binomial model tells us the probability for a random variable that counts the number of successes in a fixed number of Bernoulli trials.
• Two parameters define the Binomial model: n, the number of trials; and p, the probability of success. We denote this Binom(n, p).

The Binomial Model (cont.)
Slide 12-6
Binomial probability model for Bernoulli trials: Binom(n, p)
n = number of trials
p = probability of success
q = 1 – p = probability of failure
X = # of successes in n trials

Moments of Binomial Distribution
If X ∼ Binom(n, p), then
E[X] = np
and
Var(X) = npq = np(1 − p)
and
sd(X) = √(npq) = √(np(1 − p))

Example
Two percent of the population carry a certain gene defect. In a random sample of 100 people, what is the probability that less than two carry the defect?
Here we can use the binomial model, because the experiment can be regarded as a sequence of Bernoulli trials with p = 0.02:
P(X < 2) = P(X = 0) + P(X = 1)
= (100 choose 0) 0.98^100 0.02^0 + (100 choose 1) 0.98^99 0.02^1
= 0.98^100 + 100 × 0.98^99 × 0.02
= 0.4033
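The gene-defect calculation above can be reproduced directly from the binomial pmf. A minimal sketch in Python (the helper `binom_pmf` is my own naming; `math.comb` supplies the binomial coefficient):

```python
from math import comb

def binom_pmf(x, n, p):
    """P(X = x) = C(n, x) * p^x * (1 - p)^(n - x) for X ~ Binom(n, p)."""
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

# Gene-defect example: n = 100 people sampled, p = 0.02 carry the defect
n, p = 100, 0.02
prob = binom_pmf(0, n, p) + binom_pmf(1, n, p)  # P(X < 2)
print(round(prob, 4))  # 0.4033
```

The result matches the hand calculation, 0.98^100 + 100 × 0.98^99 × 0.02 ≈ 0.4033.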
Multiple Choice Exams
• twenty questions
• each question – five options – one correct
• Suppose pass mark is 40% (i.e. 8 correct out of 20)
• What is the effect of guessing?
Probability model for guessing all answers:
• Let T be number correct
• Probability any question correct = 1/5 = 0.2
• T ∼ Binomial(20, 0.2)
P(T ≥ 8) = 1 − P(T < 8), etc.

Rare Events
Typographical Errors
• Book proofs contain typographical errors
• Suppose a literary book publisher produces proofs with an average of 3.6 errors per page
• Assume 320 words per page
=⇒ P(mistyped word) = 3.6/320 = 0.01125
• Let X = number of errors per page
• Assuming independence from word to word,
X ∼ Binomial(320, 0.01125)
P(X = 0) = (1 − 0.01125)^320 = 0.0268.
In fact page counts are variable, e.g. what if 360 words per page? Now, P(mistyped word) = 0.01 and
X ∼ Binomial(360, 0.01)
The probabilities under B(320, 0.01125) (the original slide also tabulates B(360, 0.01) and Poisson(3.6), which give very similar values):
P(X = 0): 0.0268
P(X = 1): 0.0975
P(X = 2): 0.1769
P(X = 3): 0.2134
P(X ≥ 4): 0.4854
Note, for large n (so p = µ/n is small) Binomial(n, µ/n) probabilities barely vary with n, i.e. they depend essentially only on µ.

The Poisson Distribution
The random variable X has a Poisson distribution with parameter λ if
P(X = x) = f_X(x) = e^(−λ) λ^x / x!,  x = 0, 1, 2, . . .
Write X ∼ Poisson(λ).
• Named after Siméon Denis Poisson (1781–1840)
• Approximation to the binomial for “rare events”, say p ≤ 0.05; improves as n gets larger:
Binomial(n, p) ≈ Poisson(np)
• More generally applicable to counts of events over time, or space.
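The closeness of the binomial and Poisson models with the same mean µ = 3.6, discussed above, can be checked numerically. A sketch (Python, stdlib only; the pmf helpers are my own naming):

```python
from math import comb, exp, factorial

def binom_pmf(x, n, p):
    """P(X = x) for X ~ Binom(n, p)."""
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

def poisson_pmf(x, lam):
    """P(X = x) = e^(-lam) * lam^x / x! for X ~ Poisson(lam)."""
    return exp(-lam) * lam ** x / factorial(x)

# Same mean mu = 3.6 under all three models
for x in range(4):
    print(x,
          round(binom_pmf(x, 320, 3.6 / 320), 4),   # B(320, 0.01125)
          round(binom_pmf(x, 360, 3.6 / 360), 4),   # B(360, 0.01)
          round(poisson_pmf(x, 3.6), 4))            # Poisson(3.6)
```

Row by row, the three columns agree to roughly two decimal places, which is the point of the slide: for large n and small p = µ/n, the binomial probabilities depend essentially only on µ.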
Examples
• Number of radioactive emissions in one minute
• Number of calls at a phone switchboard in an hour
• Number of flying bomb hits on London per km²
Common features
• independence of events over time or space
• events occurring singly rather than in groups
• events occurring at a constant average rate per unit time or area
If such events occur at rate λ per unit time, let
N(t) = number of events in an interval of length t;
then
N(t) ∼ Poisson(λt)

Moments of Poisson Distribution
If X ∼ Poisson(λ), then
E[X] = λ and Var(X) = λ.
• For a random sample of count data an obvious estimate of λ is x̄.
• Comparing the sample mean, x̄, and the sample variance, s², gives a simple assessment of the appropriateness of the Poisson distribution.

What Can Go Wrong?
Slide 12-7
• Don’t confuse Geometric and Binomial models.
• Geometric and Binomial distributions – be sure you have Bernoulli trials: two outcomes per trial, a constant probability of success, and independence.
• Poisson model assumes independence, with events occurring singly at a constant average rate.

So What Do We Know?
Slide 12-8
• We are particularly interested in Bernoulli trials.
• When we are looking for the probability for the number of Bernoulli trials until a success occurs, we have a Geometric model.
• When we are looking for the probability for the number of successes in a fixed number of Bernoulli trials, we have a Binomial model.
• Poisson model can be used for rare events and counts (over time, space).
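The mean–variance check described above can be sketched with a quick simulation (illustrative only; λ = 3.6 echoes the proof-errors example, and the sampler uses Knuth's multiplication method, which is one standard way to draw Poisson variates):

```python
import math
import random

def poisson_draw(lam, rng):
    """Sample from Poisson(lam) using Knuth's multiplication method."""
    threshold = math.exp(-lam)
    k, prod = 0, rng.random()
    while prod > threshold:
        k += 1
        prod *= rng.random()
    return k

lam = 3.6  # average errors per page, as in the proofreading example
rng = random.Random(42)
xs = [poisson_draw(lam, rng) for _ in range(50_000)]

mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / (len(xs) - 1)
print(round(mean, 2), round(var, 2))  # both should be close to lam = 3.6
```

For genuinely Poisson data, x̄ and s² come out nearly equal, as here; a sample variance far from the sample mean is a warning that the Poisson model may not fit.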