Chapter 4
Using Probability and
Discrete Probability
Distributions
Chapter 4 - Chapter Outcomes
After studying the material in this chapter, you
should be able to:
• Understand the three approaches to
assessing probabilities.
• Apply the common rules of probability.
• Identify the types of processes that are
represented by discrete probability
distributions.
Chapter 4 - Chapter Outcomes
(continued)
After studying the material in this chapter, you
should be able to:
• Know how to determine probabilities
associated with binomial and Poisson
distribution applications.
Probability
Probability refers to the chance that a
particular event will occur.
•The probability of an event will be a value in
the range 0.00 to 1.00. A value of 0.00 means
the event will not occur. A probability of 1.00
means the event will occur. Anything
between 0.00 and 1.00 reflects the uncertainty
of the event occurring.
Events and Sample Space
An experiment is a process that
produces a single outcome whose
result cannot be predicted with
certainty.
Events and Sample Space
Elementary events are the most
rudimentary outcomes resulting
from a simple experiment.
Events and Sample Space
The sample space is the collection
of all elementary outcomes that
can result from a selection or
decision.
Events and Sample Space
An event is a collection of
elementary events.
Events and Sample Space
(Able Accounting Example)
Elementary Event   Audit 1    Audit 2
e1                 early      early
e2                 early      on time
e3                 early      late
e4                 on time    early
e5                 on time    on time
e6                 on time    late
e7                 late       early
e8                 late       on time
e9                 late       late
Sample Space = SS = {e1, e2, e3, e4, e5, e6, e7, e8, e9}
Mutually Exclusive Events
Two events are mutually exclusive if
the occurrence of one event precludes
the occurrence of a second event.
Mutually Exclusive Events
(Able Accounting Example)
The event in which at least one of the two
audits is late:
E1 = {e3, e6, e7, e8, e9}
The event that neither audit is late:
E2 = {e1, e2, e4, e5}
E1 and E2 are mutually exclusive!
Independent and Dependent
Events
Two events are independent if the
occurrence of one event in no way
influences the probability of the
occurrence of the other event.
Independent and Dependent
Events
Two events are dependent if the
occurrence of one event impacts
the probability of the other event
occurring.
Classical Probability
Assessment
Classical Probability Assessment refers to the
method of determining probability based on
the ratio of the number of ways the event of
interest can occur to the total number of ways
any event can occur when the individual
elementary events are equally likely.
Classical Probability
Assessment
CLASSICAL PROBABILITY MEASUREMENT
P(Ei) = (Number of ways Ei can occur) / (Total number of elementary events)
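A minimal sketch in Python (assuming the Able Accounting sample space above, with all nine elementary events equally likely) shows the counting involved:

# Classical probability: favorable elementary events over all equally likely events
statuses = ["early", "on time", "late"]
sample_space = [(a1, a2) for a1 in statuses for a2 in statuses]  # 9 elementary events
favorable = [e for e in sample_space if "late" in e]             # E1: at least one audit is late
p_e1 = len(favorable) / len(sample_space)
print(p_e1)  # 5/9, about 0.556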
Relative Frequency of
Occurrence
Relative Frequency of Occurrence refers to a
method that defines probability as the
number of times an event occurs, divided
by the total number of times an experiment
is performed in a large number of trials.
Relative Frequency of
Occurrence
RELATIVE FREQUENCY OF OCCURRENCE
RF(Ei) = (Number of times Ei occurs) / n
where:
Ei = the event of interest
RF(Ei) = the relative frequency of Ei occurring
n = number of trials
Relative Frequency of
Occurrence
(Example 4-6)
                           Commercial   Residential   Total
Heating Systems                55           145        200
Air-Conditioning Systems       45           255        300
Total                         100           400        500

P(Residential) = RF(Residential) = 400/500 = 0.80
P(Heating) = RF(Heating) = 200/500 = 0.40
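The same relative-frequency calculation, sketched in Python with the Example 4-6 counts (the dictionary layout is just one convenient way to hold the table):

# Relative frequency of occurrence: RF(E) = (number of times E occurs) / n
counts = {
    ("Heating", "Commercial"): 55,
    ("Heating", "Residential"): 145,
    ("Air-Conditioning", "Commercial"): 45,
    ("Air-Conditioning", "Residential"): 255,
}
n = sum(counts.values())  # 500 systems in total
rf_residential = sum(v for (system, location), v in counts.items() if location == "Residential") / n
rf_heating = sum(v for (system, location), v in counts.items() if system == "Heating") / n
print(rf_residential, rf_heating)  # 0.8 0.4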
Subjective Probability
Assessment
Subjective Probability Assessment refers
to the method that defines probability of
an event as reflecting a decision maker’s
state of mind regarding the chances that
the particular event will occur.
The Rules of Probability
PROBABILITY RULE 1
For any event Ei
0.0 ≤ P(Ei) ≤ 1.0 for all i
The Rules of Probability
PROBABILITY RULE 2
 k
 Σ P(ei) = 1.0
i=1
where:
k = Number of elementary events in the sample space
ei = ith elementary event
The Rules of Probability
PROBABILITY RULE 3
The probability of an event Ei is equal to
the sum of the probabilities of the
elementary events forming Ei. That is, if:
Ei = {e1, e2, e3}
then:
P(Ei) = P(e1) + P(e2) + P(e3)
Complements
The complement of an event E is
the collection of all possible
elementary events not contained
in event E. The complement of
event E is represented by E̅.
The Rules of Probability
COMPLEMENT RULE
P(E̅) = 1 - P(E)
The Rules of Probability
PROBABILITY RULE 4
Addition rule for any two events E1 and E2:
P(E1 or E2) = P(E1) + P(E2) - P(E1 and E2)
The Rules of Probability
PROBABILITY RULE 5
Addition rule for mutually exclusive events E1
and E2:
P(E1 or E2) = P(E1) + P(E2)
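A quick numeric check of Rules 4 and 5 (the probabilities below are illustrative, not from the text):

# Rule 4: addition rule for any two events E1 and E2
p_e1, p_e2, p_e1_and_e2 = 0.40, 0.35, 0.15
p_e1_or_e2 = p_e1 + p_e2 - p_e1_and_e2     # 0.60
# Rule 5: for mutually exclusive events, P(E1 and E2) = 0, so the joint term drops out
p_or_mutually_exclusive = p_e1 + p_e2      # 0.75
print(p_e1_or_e2, p_or_mutually_exclusive)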
Conditional Probability
Conditional probability refers to
the probability that an event will
occur given that some other event
has already happened.
The Rules of Probability
PROBABILITY RULE 6
Conditional probability for any two events E1 , E2:
P(E1 | E2) = P(E1 and E2) / P(E2), where P(E2) > 0
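A minimal sketch of Rule 6 with illustrative numbers:

# Conditional probability: P(E1 | E2) = P(E1 and E2) / P(E2)
p_e2 = 0.40          # probability of the conditioning event E2
p_e1_and_e2 = 0.10   # joint probability of E1 and E2
p_e1_given_e2 = p_e1_and_e2 / p_e2
print(p_e1_given_e2)  # 0.25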
Tree Diagrams
Another way of organizing events
of an experiment that aids in the
calculation of probabilities is the
tree diagram.
Tree Diagrams
(Figure 4-1)
Figure 4-1 is a tree diagram. The first branching is on gender: Male, with P(E5) = 0.66, and Female, with P(E4) = 0.34. Each gender branch then splits into the events E1, E2, and E3. The joint probabilities at the ends of the six branches are:
P(E1 and E5) = 0.20    P(E1 and E4) = 0.18
P(E2 and E5) = 0.32    P(E2 and E4) = 0.12
P(E3 and E5) = 0.14    P(E3 and E4) = 0.04
Summing each event's joint probabilities across gender gives P(E1) = 0.38, P(E2) = 0.44, and P(E3) = 0.18.
The Rules of Probability
PROBABILITY RULE 7
Conditional probability for independent
events E1 , E2:
P(E1 | E2) = P(E1), P(E2) > 0
and
P(E2 | E1) = P(E2), P(E1) > 0
The Rules of Probability
PROBABILITY RULE 8
Multiplication rule for any two events E1 and E2:
P(E1 and E2) = P(E1) P(E2 | E1)
and
P(E2 and E1) = P(E2) P(E1 | E2)
The Rules of Probability
PROBABILITY RULE 9
Multiplication rule for independent events E1, E2:
P(E1 and E2) = P(E1) P(E2)
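A brief sketch contrasting Rules 8 and 9 (all values are illustrative):

# Rule 8: P(E1 and E2) = P(E1) * P(E2 | E1) holds for any two events
p_e1 = 0.60
p_e2_given_e1 = 0.50
p_joint = p_e1 * p_e2_given_e1        # 0.30
# Rule 9: if E1 and E2 are independent, P(E2 | E1) = P(E2)
p_e2 = 0.30
p_joint_independent = p_e1 * p_e2     # 0.18
print(p_joint, p_joint_independent)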
Bayes’ Theorem
BAYES’ THEOREM
P(Ei | B) = P(Ei) P(B | Ei) / [P(E1) P(B | E1) + P(E2) P(B | E2) + . . . + P(Ek) P(B | Ek)]
where:
Ei = ith event of interest of the k possible events
B = new event that might impact P(Ei)
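A minimal sketch of Bayes' Theorem for k = 2 events (the prior and conditional probabilities are hypothetical):

# Bayes' Theorem: revise the prior probabilities P(Ei) after observing event B
priors = [0.70, 0.30]        # P(E1), P(E2)
likelihoods = [0.20, 0.60]   # P(B | E1), P(B | E2)
p_b = sum(p * l for p, l in zip(priors, likelihoods))             # denominator: total probability of B
posteriors = [p * l / p_b for p, l in zip(priors, likelihoods)]   # P(Ei | B)
print(posteriors)  # [0.4375, 0.5625]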
Discrete Probability
Distributions
A random variable is a variable that
assigns a numerical value to each
outcome of a random experiment or trial.
Discrete Probability
Distributions
A discrete random variable is a
variable that can only assume a
countable number of values.
Discrete Probability
Distributions
A continuous random variable is a
variable that can assume any value on a
continuum. Alternatively, they are
random variables that can assume an
uncountable number of values.
Discrete Distributions
(Example 4-19)
Service Calls = x   Frequency   P(x)
0                       3       0.075
1                       4       0.100
2                      10       0.250
3                       8       0.200
4                       7       0.175
5                       6       0.150
6                       2       0.050
Total                  40       1.000
Discrete Distributions
(Example 4-19)
Bar chart: discrete probability distribution, P(x) plotted against x = number of service calls, for x = 0 through 6.
Discrete Probability
Distributions
The uniform probability distribution is a probability distribution that has equal probabilities for all possible outcomes of the random variable.
Discrete Distributions
(Example 4-20)
Bar chart: uniform probability distribution of delivery lead time, with the same probability (0.25) for each of the four outcomes: 1, 2, 3, and 4 weeks.
Mean and Standard Deviation
of Discrete Distributions
EXPECTED VALUE FOR A DISCRETE DISTRIBUTION
E(x) = Σ x P(x)
where:
E(x) = Expected value of the random variable
x = Values of the random variable
P(x) = Probability of the random variable taking on
the value of x
Mean and Standard Deviation
of Discrete Distributions
STANDARD DEVIATION FOR A DISCRETE DISTRIBUTION
σx = √[ Σ {x - E(x)}² P(x) ]
where:
E(x) = Expected value of the random variable
x = Values of the random variable
P(x) = Probability of the random variable having
the value of x
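A short sketch applying both formulas to the service-call distribution of Example 4-19:

# Expected value and standard deviation of a discrete distribution
from math import sqrt

dist = {0: 0.075, 1: 0.100, 2: 0.250, 3: 0.200, 4: 0.175, 5: 0.150, 6: 0.050}  # x: P(x)
e_x = sum(x * p for x, p in dist.items())                          # E(x) = sum of x * P(x)
sigma_x = sqrt(sum((x - e_x) ** 2 * p for x, p in dist.items()))   # sqrt of sum of (x - E(x))^2 * P(x)
print(round(e_x, 3), round(sigma_x, 3))  # 2.95 and about 1.60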
Binomial Probability
Distribution
• A manufacturing plant labels items as
either defective or acceptable.
• A firm bidding for a contract will either get
the contract or not.
• A marketing research firm receives survey
responses of “Yes, I will buy,” or “No, I
will not.”
• New job applicants either accept the offer
or reject it.
Binomial Probability Distribution
Characteristics of the Binomial Probability
Distribution:
• A trial has only two possible outcomes – a
success or a failure.
• There is a fixed number, n, of identical trials.
• The trials of the experiment are independent of
each other and randomly generated.
• The probability of a success, p, remains
constant from trial to trial.
• If p represents the probability of a success, then
(1-p) = q is the probability of a failure.
Combinations
A combination is an outcome of
an experiment where x objects
are selected from a group of n
objects.
Combinations
COUNTING RULE FOR COMBINATIONS
nCx = n! / [x!(n - x)!]
where:
n! = n(n - 1)(n - 2) . . . (2)(1)
x! = x(x - 1)(x - 2) . . . (2)(1)
0! = 1
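A quick check of the counting rule in Python (math.comb implements the same calculation):

# Number of ways to choose x = 3 objects from n = 10 objects, order not mattering
from math import comb, factorial

n, x = 10, 3
by_formula = factorial(n) // (factorial(x) * factorial(n - x))  # n! / (x!(n - x)!)
print(by_formula, comb(n, x))  # both give 120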
Binomial Probability Distribution
BINOMIAL FORMULA
P(x) = [n! / (x!(n - x)!)] p^x q^(n - x)
where:
n = sample size
x = number of successes
n - x = number of failures
p = probability of a success
q = 1 - p = probability of a failure
n! = n(n - 1)(n - 2) . . . (2)(1)
x! = x(x - 1)(x - 2) . . . (2)(1)
0! = 1
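A hedged sketch of the binomial formula, using illustrative values n = 10, p = 0.30, x = 4:

# Binomial probability: P(x) = nCx * p^x * q^(n - x)
from math import comb

n, p = 10, 0.30   # sample size and probability of a success (illustrative)
q = 1 - p
x = 4             # number of successes of interest
p_x = comb(n, x) * p**x * q**(n - x)
print(round(p_x, 4))  # about 0.2001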
Binomial Probability
Distribution Table
Binomial Probability
Distribution
MEAN OF THE BINOMIAL DISTRIBUTION
μx = E(x) = np
where:
n = Sample size
p = Probability of a success
Binomial Probability
Distribution
STANDARD DEVIATION FOR THE BINOMIAL DISTRIBUTION
σ = √(npq)
where:
n = Sample size
p = Probability of a success
q = (1 - p) = Probability of a failure
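Continuing the illustrative n = 10, p = 0.30 case:

# Mean and standard deviation of the binomial distribution
from math import sqrt

n, p = 10, 0.30
q = 1 - p
mu = n * p               # expected number of successes: 3.0
sigma = sqrt(n * p * q)  # about 1.449
print(mu, sigma)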
Poisson Probability Distribution
Characteristics of the Poisson Probability
Distribution:
• The outcomes of interest are rare relative to the
possible outcomes.
• The average number of outcomes of interest per segment is λt.
• The outcomes of interest occur randomly, and the occurrence of one outcome does not influence the chances of another outcome of interest.
• The probability that an outcome of interest occurs in a given segment is the same for all segments.
Poisson Probability
Distribution
POISSON PROBABILITY DISTRIBUTION
P(x) = (λt)^x e^(-λt) / x!
where:
x = number of successes in segment t
λt = expected number of successes in segment t
e = base of the natural number system (2.71828)
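A brief sketch evaluating the Poisson formula with illustrative values λt = 3 and x = 2:

# Poisson probability: P(x) = (lambda*t)^x * e^(-lambda*t) / x!
from math import exp, factorial

lam_t = 3.0   # expected number of successes in the segment (illustrative)
x = 2         # number of successes of interest
p_x = (lam_t ** x) * exp(-lam_t) / factorial(x)
print(round(p_x, 4))  # about 0.224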
Poisson Probability
Distribution Table
Mean and Standard Deviation for
the Poisson Probability
Distribution
MEAN OF THE POISSON DISTRIBUTION
μ = λt
STANDARD DEVIATION FOR THE POISSON DISTRIBUTION
σ = √(λt)
Key Terms
• Binomial Probability
Distribution
• Classical Probability
• Conditional Probability
• Continuous Random
Variable
• Dependent Events
• Discrete Random
Variable
• Elementary Events
• Event
• Independent Events
• Mutually Exclusive
Events
• Poisson Probability
Distribution
• Probability
• Random Variable
Key Terms
(continued)
• Relative Frequency of
Occurrence
• Sample Space
• Subjective Probability
Assessment
• Uniform Probability
Distribution