Example 1: Experiment: roll a die. Sample space {1, . . . , 6}. N - number shown
by the die. Event A - "N is even":
A = {2} ∪ {4} ∪ {6}.
These are three disjoint events, hence
P(A) = P({2}) + P({4}) + P({6}).
Let B = {2, 3}; then A ∪ B = {2, 3, 4, 6}, and for a fair die
P(A ∪ B) = 4/6 ≠ P(A) + P(B) = 5/6.
This is because A and B are not mutually exclusive, i.e. A ∩ B ≠ ∅. In general we have
P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
Since A ∩ B = {2} we have P(A ∩ B) = 1/6. (Note that here P(A ∩ B) =
P(A)P(B). Is this a coincidence?)
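The calculations in this example can be checked with a short Python sketch (the names A, B and the helper P are ours):

```python
from fractions import Fraction

# Fair die: each outcome 1..6 has probability 1/6.
p = {k: Fraction(1, 6) for k in range(1, 7)}

def P(E):
    """Probability of an event E, a subset of {1, ..., 6}."""
    return sum(p[k] for k in E)

A = {2, 4, 6}   # "N is even"
B = {2, 3}

# Inclusion-exclusion: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
assert P(A | B) == P(A) + P(B) - P(A & B) == Fraction(4, 6)
assert P(A & B) == P(A) * P(B)   # here the product rule happens to hold
```

Exact fractions avoid any floating-point rounding in these small computations.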
Example 2: Roll the die twice; the sample space then contains 36 elements {(1,1),
(1,2), . . . , (6,6)}. Consider the event that A is true in the first
roll and B = {2, 3} is true in the second roll, viz.
A ∩ B = {(2, 2), (4, 2), (6, 2), (2, 3), (4, 3), (6, 3)}
occurs. If the die is fair and one rolls it in an "independent" manner, then all
outcomes are equally probable and hence P(A ∩ B) = 6/36. We note that
P(A ∩ B) = P(A) · P(B) = 3/6 · 2/6. Are all events independent?
Here the event A concerns a property of the result of the first roll, while the event B
asks about a property of the result of the second roll, so the events are
independent. (This is the meaning of the assumption that the rolls are independent.)
Now if we keep the event A but instead introduce B = "the number shown on the die in
the second roll is bigger than in the first roll", then in general A and B are
dependent.
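Both cases can be seen at once by enumerating the 36 outcomes of two rolls in Python (a sketch; the event names are ours):

```python
from fractions import Fraction
from itertools import product

rolls = list(product(range(1, 7), repeat=2))   # 36 equally likely outcomes

def P(event):
    """Probability of an event, given as a predicate on an outcome (i, j)."""
    return Fraction(sum(1 for w in rolls if event(w)), len(rolls))

A = lambda w: w[0] % 2 == 0        # first roll is even
B = lambda w: w[1] in {2, 3}       # second roll lands in {2, 3}

# A and B concern different rolls, so the product rule holds:
assert P(lambda w: A(w) and B(w)) == P(A) * P(B) == Fraction(6, 36)

# B2 = "second roll bigger than the first" depends on the first roll:
B2 = lambda w: w[1] > w[0]
assert P(lambda w: A(w) and B2(w)) != P(A) * P(B2)
```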
It is not always easy to see whether two events are independent, since independence
depends on the probabilities, not just on the events themselves. Let us reconsider the sets A, B with only one roll.
Obviously A ∩ B = {2}. If the die is fair then
P(A ∩ B) = 1/6 = 3/6 · 2/6 = P(A) · P(B),
i.e. the events are independent. However, this is not the case for a loaded die;
see Example 1.6, p. 9.
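For a loaded die the same two events can fail the product rule. A minimal sketch, with hypothetical weights of our own choosing (the book's Example 1.6 may use different ones):

```python
from fractions import Fraction

# Hypothetical loaded die (these weights are an assumption for illustration):
p = {1: Fraction(2, 12), 2: Fraction(1, 12), 3: Fraction(2, 12),
     4: Fraction(3, 12), 5: Fraction(2, 12), 6: Fraction(2, 12)}
assert sum(p.values()) == 1

P = lambda E: sum(p[k] for k in E)
A, B = {2, 4, 6}, {2, 3}

# Same events as before, but now the product rule fails:
assert P(A & B) == Fraction(1, 12)
assert P(A) * P(B) == Fraction(1, 8)
assert P(A & B) != P(A) * P(B)   # A and B are dependent for this die
```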
Example 3: Independence is a powerful tool for computing probabilities; we
check Example 1.7 (Rescue Station) on page 9 in the book.
Example 3a: Consider the events Ai = "the result of the first roll is i", i = 1, . . . , 6, and let
B = "the number shown on the die in the second roll is bigger than in the first roll".
Obviously P(B) = 15/36 = 5/12. Clearly A1 and B are dependent, since
P(A1 ∩ B) = 5/36 ≠ P(A1)P(B) = 6/36 · 15/36.
Suppose that we have already performed the first roll and a one came up, i.e. A1 is
true. What would then be the probability that B is true? Clearly it is 5/6. This
probability is called the conditional probability of B given A1 and is
denoted by P(B|A1). We note that
P(A1 ∩ B) = 5/36 = P(B|A1)P(A1) = 5/6 · 1/6,
which basically is the definition of conditional probability.
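The numbers in this example can be verified by enumeration (a Python sketch; the names are ours):

```python
from fractions import Fraction
from itertools import product

rolls = list(product(range(1, 7), repeat=2))   # 36 equally likely outcomes
P = lambda ev: Fraction(sum(1 for w in rolls if ev(w)), len(rolls))

A1 = lambda w: w[0] == 1           # first roll shows a one
B  = lambda w: w[1] > w[0]         # second roll bigger than the first

P_A1_and_B = P(lambda w: A1(w) and B(w))
assert P_A1_and_B == Fraction(5, 36)

# Conditional probability and the multiplication rule:
P_B_given_A1 = P_A1_and_B / P(A1)
assert P_B_given_A1 == Fraction(5, 6)
assert P_A1_and_B == P_B_given_A1 * P(A1)
```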
Example 4: Obviously for independent events A and B we have P(B|A) =
P(A ∩ B)/P(A) = P(A)P(B)/P(A) = P(B). So conditional probabilities are mainly of interest for dependent events.
Consider a finite sample space S and an experiment in which all outcomes are equally
likely to occur. The conditional probability P(B|A) can then be computed by taking
S = A and, for this narrower sample space, still assuming that the outcomes are
equally probable.
Let us roll a die and let A be "the die shows an even number". As before, let B = {2, 3};
then
P(B|A) = P(A ∩ B)/P(A) = (1/6)/(1/2) = 1/3,
as we could have said at once.
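The "narrower sample space" recipe is easy to express in Python: restrict attention to the outcomes in A and count (a sketch with our own names):

```python
from fractions import Fraction

A = {2, 4, 6}   # "die shows an even number"
B = {2, 3}

# Treat A as the new sample space, still with equally likely outcomes:
P_B_given_A = Fraction(len(A & B), len(A))
assert P_B_given_A == Fraction(1, 3)
```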
Example 5: The convenience of the law of total probability lies in the possibility
of estimating the conditional probabilities. For example, suppose we have three producers of some component. In the laboratory we can check the strength (quality)
of their products. The tests could give the following results, where B = "a component (taken
at random) is of good quality":
P(B|producer I) = 0.95,
P(B|producer II) = 0.75,
P(B|producer III) = 0.99.
Now, knowing that producers I, II, III have 15%, 75% and 10% of the market, respectively, we get
P(B) = 0.95 · 0.15 + 0.75 · 0.75 + 0.99 · 0.10 = 0.804.
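The total-probability computation above can be sketched in Python (the dictionary names are ours; the numbers are from the example):

```python
# Market shares (priors) and quality likelihoods P(B | producer):
prior = {"I": 0.15, "II": 0.75, "III": 0.10}
lik   = {"I": 0.95, "II": 0.75, "III": 0.99}

# Law of total probability: P(B) = sum over producers of P(B | A_i) P(A_i)
P_B = sum(prior[k] * lik[k] for k in prior)
assert abs(P_B - 0.804) < 1e-12
```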
Example 6: Suppose that a component is of good quality, i.e. B is true. As
before we have three producers, and let the hypotheses A1, A2, A3 be that the component was produced by producer I, II, III, respectively. We want to know
the probabilities P(Ai|B). The likelihood function L(Ai) is given by
P(B|A1) = 0.95, P(B|A2) = 0.75, P(B|A3) = 0.99.
Hence, by Bayes' formula,
P(A1|B) = (P(A1)/P(B)) · P(B|A1) = (0.15/0.804) · 0.95 = 0.18,
P(A2|B) = 0.70, P(A3|B) = 0.12.
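All three posterior probabilities can be computed in one go (a Python sketch; the names are ours, the numbers are from the example):

```python
prior = {"I": 0.15, "II": 0.75, "III": 0.10}   # market shares P(A_i)
lik   = {"I": 0.95, "II": 0.75, "III": 0.99}   # likelihoods P(B | A_i)

P_B = sum(prior[k] * lik[k] for k in prior)    # 0.804, by total probability

# Bayes' formula: P(A_i | B) = P(A_i) P(B | A_i) / P(B)
post = {k: prior[k] * lik[k] / P_B for k in prior}
assert abs(post["I"]   - 0.18) < 0.005
assert abs(post["II"]  - 0.70) < 0.005
assert abs(post["III"] - 0.12) < 0.005
```

The posteriors sum to one, as they must, since the three hypotheses are exhaustive.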
Example 7: We consider a medical example. Let B be "a test for the disease
(BSE) is positive", while A1 = "the animal is infected" and A2 = "the animal is not infected". For the test the likelihoods P(B|Ai) are known from laboratory tests;
let P(B|A1) = 0.99 while P(B|A2) = 0.001. We are interested in the odds
that a positively tested animal is infected or not. Bayes' formula tells us
q1post = 0.99 · q1prior,
q2post = 0.001 · q2prior.
Since BSE is a very rare disease (there may be 100 infected animals in a population of 10^7),
the prior odds q1prior : q2prior are about 1:100 000. Consequently the posterior odds are about
1:100.
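The odds computation can be sketched as follows (variable names are ours):

```python
# Likelihoods of a positive test and prior odds (1 infected : 100000 healthy):
lik_infected, lik_healthy = 0.99, 0.001
prior_odds = 1 / 100_000

# Bayes' formula in odds form: posterior odds = likelihood ratio * prior odds
posterior_odds = (lik_infected / lik_healthy) * prior_odds
assert abs(posterior_odds - 1 / 101) < 1e-3   # roughly 1:100
```

Despite the very accurate test, most positively tested animals are not infected, because the disease is so rare.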