§3.2 – Conditional Probability and Independence
Mark R. Woodard
Furman U
2010
Outline
1 Conditional Probability
2 Independence
3 Examples and Assignment

Conditional Probability
Main Idea
The main idea of conditional probability is that knowing some extra
information about an experiment can change the probability of an event in
a specific way. For example, if you pick a card from a deck at random, the
probability that the card is a diamond is 1/4, but if you somehow knew
that the card was red, the probability would jump to 1/2. We say that the
conditional probability of “diamond” given “red” is 1/2. The notation for
this is Pr(A|B), which we read as “the conditional probability of A given
B,” or simply “the probability of A given B.” Here A and B are events,
that is, subsets of the sample space.
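
As a quick sanity check of the card example, a minimal Python sketch along
the following lines enumerates a standard 52-card deck and counts outcomes,
assuming every card is equally likely to be drawn:

from fractions import Fraction

suits = ["clubs", "diamonds", "hearts", "spades"]
deck = [(rank, suit) for suit in suits for rank in range(1, 14)]  # 52 cards

red = [card for card in deck if card[1] in ("diamonds", "hearts")]
diamonds = [card for card in deck if card[1] == "diamonds"]

print(Fraction(len(diamonds), len(deck)))  # Pr(diamond)       = 1/4
print(Fraction(len(diamonds), len(red)))   # Pr(diamond | red) = 13/26 = 1/2
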
Conditional Probability
Definition
The formula for conditional probability is as follows:
Pr(A|B) = Pr(A ∩ B) / Pr(B).
In class, we will discuss why this is a reasonable formula, but you can take
this to be the definition of conditional probability if you like.
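
For a finite sample space with equally likely outcomes, the definition can be
sketched in a few lines of Python; the function name cond_prob below is just
an illustrative choice:

from fractions import Fraction

def cond_prob(A, B, omega):
    """Pr(A | B) = Pr(A ∩ B) / Pr(B), with equally likely outcomes in omega."""
    A, B, omega = set(A), set(B), set(omega)
    if not B:
        raise ValueError("Pr(B) = 0, so Pr(A | B) is undefined")
    pr_B = Fraction(len(B), len(omega))
    pr_A_and_B = Fraction(len(A & B), len(omega))
    return pr_A_and_B / pr_B

# One roll of a fair die: Pr(even | at least 4) = |{4, 6}| / |{4, 5, 6}| = 2/3.
print(cond_prob({2, 4, 6}, {4, 5, 6}, {1, 2, 3, 4, 5, 6}))  # 2/3
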
Independence
Independent Events
The following statements turn out to be equivalent:
Pr(A ∩ B) = Pr(A) · Pr(B).
Pr(A|B) = Pr(A).
Pr(B|A) = Pr(B).
Whenever any of the above are true, we say that A and B are independent
events.
Warning
Note that although the word “independent” seems something like the word
“disjoint”, they are very different in this context. In particular, if A and B
are disjoint, then Pr(A ∩ B) = Pr(φ) = 0, rather than as in the definition
above. You should think of events as being independent when knowing
that one occurs doesn’t have any effect on the probability of the other.
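
To see the first equivalent condition in action, here is a minimal Python
sketch under the same equally-likely-card assumption as before: when a single
card is drawn, “diamond” and “ace” satisfy the product rule and so are
independent, while “diamond” and “red” do not.

from fractions import Fraction

def pr(event, omega):
    return Fraction(len(set(event) & set(omega)), len(omega))

def independent(A, B, omega):
    # Product rule: Pr(A ∩ B) = Pr(A) · Pr(B).
    return pr(set(A) & set(B), omega) == pr(A, omega) * pr(B, omega)

suits = ["clubs", "diamonds", "hearts", "spades"]
deck = [(rank, suit) for suit in suits for rank in range(1, 14)]
diamond = {c for c in deck if c[1] == "diamonds"}
ace = {c for c in deck if c[0] == 1}
red = {c for c in deck if c[1] in ("diamonds", "hearts")}

print(independent(diamond, ace, deck))  # True:  1/52 == (1/4) * (1/13)
print(independent(diamond, red, deck))  # False: 1/4  != (1/4) * (1/2)
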
Examples and Assignment
Examples: 6, 10, 12, 18, 24.
Assignment: Numbers 1, 5, 9, 11, 15, 19, 23, 27, 31.