Chapter 4
Probability
Sample Space
The possible outcomes of a random
experiment are called the basic outcomes,
and the set of all basic outcomes is called
the sample space. The symbol S will be
used to denote the sample space.
Sample Space - An Example
What is the sample space for a roll of a single six-sided die?
S = [1, 2, 3, 4, 5, 6]
Mutually Exclusive
If the events A and B have no common basic outcomes, they are mutually exclusive, and their intersection A ∩ B is said to be the empty set, indicating that A ∩ B cannot occur.
More generally, the K events E1, E2, . . . , EK are said to be mutually exclusive if every pair of them is a pair of mutually exclusive events.
Venn Diagrams
Venn Diagrams are drawings, usually
using geometric shapes, used to depict
basic concepts in set theory and the
outcomes of random experiments.
Intersection of Events A and B
[Venn diagrams within the sample space S: (a) A ∩ B is the striped area where the two circles overlap; (b) A and B are mutually exclusive, shown as non-overlapping circles.]
Collectively Exhaustive
Given the K events E1, E2, . . ., EK in the sample space S, if E1 ∪ E2 ∪ . . . ∪ EK = S, these events are said to be collectively exhaustive.
Complement
Let A be an event in the sample space S. The set of basic outcomes of a random experiment belonging to S but not to A is called the complement of A and is denoted by Ā.
Venn Diagram for the Complement of Event A
[Venn diagram: the circle A inside S; the region of S outside the circle is Ā.]
Unions, Intersections, and Complements
A die is rolled. Let A be the event “Number rolled is even” and B be the event “Number rolled is at least 4.” Then
A = [2, 4, 6] and B = [4, 5, 6]
Ā = [1, 3, 5] and B̄ = [1, 2, 3]
A ∩ B = [4, 6]
A ∪ B = [2, 4, 5, 6]
A ∪ Ā = [1, 2, 3, 4, 5, 6] = S
Classical Probability
The classical definition of probability is the
proportion of times that an event will occur,
assuming that all outcomes in a sample space
are equally likely to occur. The probability of
an event is determined by counting the number
of outcomes in the sample space that satisfy
the event and dividing by the number of
outcomes in the sample space.
Classical Probability
The probability of an event A is
P(A) = NA / N
where NA is the number of outcomes that satisfy the condition of event A and N is the total number of outcomes in the sample space. The important idea here is that one can develop a probability from fundamental reasoning about the process.
Combinations
The counting process can be generalized by using the following equation to compute the number of combinations of n things taken k at a time:
C(n, k) = n! / (k!(n − k)!)
where 0! = 1.
Relative Frequency
The relative frequency definition of probability is the limit of the proportion of times that an event A occurs in a large number of trials, n:
P(A) = nA / n
where nA is the number of A outcomes and n is the total number of trials or outcomes in the population. The probability is the limit as n becomes large.
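The relative frequency idea can be illustrated by simulation; a sketch (the event "even roll" and the trial count are our illustrative choices):

```python
import random

random.seed(0)                  # fixed seed so the run is reproducible
n = 100_000                     # number of trials
# count trials where event A ("even roll") occurs
n_A = sum(1 for _ in range(n) if random.randint(1, 6) % 2 == 0)

print(n_A / n)                  # relative frequency; approaches P(A) = 0.5 as n grows
```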
Subjective Probability
The subjective definition of probability
expresses an individual’s degree of belief
about the chance that an event will occur.
These subjective probabilities are used in
certain management decision procedures.
Probability Postulates
Let S denote the sample space of a random experiment, Oi the basic outcomes, and A an event. For each event A of the sample space S, we assume that a number P(A) is defined and we have the postulates:
1. If A is any event in the sample space S, then 0 ≤ P(A) ≤ 1.
2. Let A be an event in S, and let Oi denote the basic outcomes. Then P(A) = Σ P(Oi), where the summation extends over all the basic outcomes in A.
3. P(S) = 1.
Probability Rules
Let A be an event and Ā its complement. Then the complement rule is:
P(Ā) = 1 − P(A)
Probability Rules
The Addition Rule of Probabilities:
Let A and B be two events. The probability of their union is
P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
Probability Rules
Conditional Probability:
Let A and B be two events. The conditional probability of event A, given that event B has occurred, is denoted by the symbol P(A | B) and is found to be:
P(A | B) = P(A ∩ B) / P(B)
provided that P(B) > 0.
Probability Rules
Conditional Probability:
Let A and B be two events. The conditional probability of event B, given that event A has occurred, is denoted by the symbol P(B | A) and is found to be:
P(B | A) = P(A ∩ B) / P(A)
provided that P(A) > 0.
Probability Rules
The Multiplication Rule of Probabilities:
Let A and B be two events. The probability of their intersection can be derived from the conditional probability as
P(A ∩ B) = P(A | B) P(B)
Also,
P(A ∩ B) = P(B | A) P(A)
Statistical Independence
Let A and B be two events. These events are said to be statistically independent if and only if
P(A ∩ B) = P(A) P(B)
From the multiplication rule it also follows that
P(A | B) = P(A)  (if P(B) > 0)
P(B | A) = P(B)  (if P(A) > 0)
More generally, the events E1, E2, . . ., EK are mutually statistically independent if and only if
P(E1 ∩ E2 ∩ . . . ∩ EK) = P(E1) P(E2) . . . P(EK)
Bivariate Probabilities

            B1           B2           ...   Bk
A1       P(A1 ∩ B1)   P(A1 ∩ B2)   ...   P(A1 ∩ Bk)
A2       P(A2 ∩ B1)   P(A2 ∩ B2)   ...   P(A2 ∩ Bk)
.            .            .                  .
.            .            .                  .
Ah       P(Ah ∩ B1)   P(Ah ∩ B2)   ...   P(Ah ∩ Bk)

Figure 4.1 Outcomes for Bivariate Events
Joint and Marginal
Probabilities
In the context of bivariate probabilities, the
intersection probabilities P(Ai  Bj) are called joint
probabilities. The probabilities for individual
events P(Ai) and P(Bj) are called marginal
probabilities. Marginal probabilities are at the
margin of a bivariate table and can be computed
by summing the corresponding row or column.
Probabilities for the Television Viewing and Income Example

Viewing Frequency   High Income   Middle Income   Low Income   Totals
Regular                 0.04           0.13           0.04      0.21
Occasional              0.10           0.11           0.06      0.27
Never                   0.13           0.17           0.22      0.52
Totals                  0.27           0.41           0.32      1.00
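Marginal probabilities can be computed from the viewing/income table above by summing rows or columns; a sketch (the dictionary keying is our own representation of the table):

```python
# Joint probabilities from the television viewing and income table,
# keyed as (viewing frequency, income level).
joint = {
    ("Regular", "High"): 0.04,    ("Regular", "Middle"): 0.13,    ("Regular", "Low"): 0.04,
    ("Occasional", "High"): 0.10, ("Occasional", "Middle"): 0.11, ("Occasional", "Low"): 0.06,
    ("Never", "High"): 0.13,      ("Never", "Middle"): 0.17,      ("Never", "Low"): 0.22,
}

# Marginal probability: sum the joint probabilities over the other attribute.
p_regular = sum(p for (v, _), p in joint.items() if v == "Regular")
p_high = sum(p for (_, i), p in joint.items() if i == "High")

print(round(p_regular, 2), round(p_high, 2))   # 0.21 0.27, matching the table's margins
```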
Tree Diagrams
[Tree diagram: starting from P(S) = 1, branches lead to the viewing events A1, A2 (with P(A2) = .27), and A3, and then to the joint probabilities:
P(A1 ∩ B1) = .04, P(A1 ∩ B2) = .13, P(A1 ∩ B3) = .04
P(A2 ∩ B1) = .10, P(A2 ∩ B2) = .11, P(A2 ∩ B3) = .06
P(A3 ∩ B1) = .13, P(A3 ∩ B2) = .17, P(A3 ∩ B3) = .22]
Probability Rules
Rule for Determining the Independence of Attributes
Let A and B be a pair of attributes, each broken into
mutually exclusive and collectively exhaustive
event categories denoted by labels A1, A2, . . ., Ah
and B1, B2, . . ., Bk. If every Ai is statistically
independent of every event Bj, then the attributes A
and B are independent.
Odds Ratio
The odds in favor of a particular event are given by the ratio of the probability of the event divided by the probability of its complement. The odds in favor of A are
odds = P(A) / (1 − P(A)) = P(A) / P(Ā)
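The odds definition is a one-line computation; a sketch (the helper name and the example probability are illustrative):

```python
def odds_in_favor(p):
    # odds = P(A) / (1 - P(A)); requires P(A) < 1
    return p / (1 - p)

print(odds_in_favor(0.75))   # 3.0, i.e. odds of 3 to 1 in favor
```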
Overinvolvement Ratio
The probability of event A1 conditional on event B1, divided by the probability of A1 conditional on event B2, is defined as the overinvolvement ratio:
P(A1 | B1) / P(A1 | B2)
An overinvolvement ratio greater than 1,
P(A1 | B1) / P(A1 | B2) > 1.0
implies that event A1 increases the conditional odds ratio in favor of B1:
P(B1 | A1) / P(B2 | A1) > P(B1) / P(B2)
Bayes’ Theorem
Let A and B be two events. Then Bayes’ Theorem states that:
P(B | A) = P(A | B) P(B) / P(A)
and
P(A | B) = P(B | A) P(A) / P(B)
Bayes’ Theorem (Alternative Statement)
Let E1, E2, . . . , EK be mutually exclusive and collectively exhaustive events and let A be some other event. The conditional probability of Ei given A can be expressed as Bayes’ Theorem:
P(Ei | A) = P(A | Ei) P(Ei) / [P(A | E1) P(E1) + P(A | E2) P(E2) + . . . + P(A | EK) P(EK)]
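The alternative statement can be sketched directly in code. The numbers below are hypothetical (a diagnostic-test scenario we chose for illustration; they do not come from the slides):

```python
# Bayes' theorem with mutually exclusive, collectively exhaustive events
# E1 = "disease", E2 = "no disease"; A = "test positive".
# Illustrative figures: 2% prevalence, 95% sensitivity, 10% false-positive rate.
prior = [0.02, 0.98]            # P(E1), P(E2)
likelihood = [0.95, 0.10]       # P(A | E1), P(A | E2)

# Denominator: total probability P(A) = sum of P(A | Ei) P(Ei)
denominator = sum(l * p for l, p in zip(likelihood, prior))
posterior = [l * p / denominator for l, p in zip(likelihood, prior)]

print(round(posterior[0], 3))   # P(E1 | A) ≈ 0.162: disease given a positive test
```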
Bayes’ Theorem - Solution Steps
1. Define the subset events from the problem.
2. Define the probabilities for the events defined in step 1.
3. Compute the complements of the probabilities.
4. Apply Bayes’ theorem to compute the probability for the problem solution.