Math 112 Mathematics for Teachers, Spring 2012

Probability

1. Sample spaces, events, and probability measure

We consider experiments with outcomes that are random or probabilistic rather than deterministic. A single execution of an experiment is called a trial. The possible different outcomes of a given experiment make up the sample space, let's call it S. An event is a part of the sample space, technically a subset of S, and it can be empty or all of S. The event S is the certain event, and the void (or empty) subset ∅ is the impossible event.

Example 1.1.
• Experiment: Toss a die twice in succession.
• Sample space:
        (1,1) (1,2) (1,3) (1,4) (1,5) (1,6)
        (2,1) (2,2) (2,3) (2,4) (2,5) (2,6)
        (3,1) (3,2) (3,3) (3,4) (3,5) (3,6)
  S =   (4,1) (4,2) (4,3) (4,4) (4,5) (4,6)
        (5,1) (5,2) (5,3) (5,4) (5,5) (5,6)
        (6,1) (6,2) (6,3) (6,4) (6,5) (6,6)
• The event “odd numbers on both tosses” is {(1,1), (1,3), (1,5), (3,1), (3,3), (3,5), (5,1), (5,3), (5,5)}.
• The event “point total is 5” is {(1,4), (2,3), (3,2), (4,1)}.
• The event “4 on the first toss” is {(4,1), (4,2), (4,3), (4,4), (4,5), (4,6)}.

Example 1.2.
• Experiment: Flip a coin four times in succession. Each flip results in (H)eads or (T)ails. Hence the outcomes of our experiment are lists of four letters, each letter being either H or T. There are 16 possible outcomes.
• Sample space:
        HHHH HHHT HHTH HTHH
  S =   THHH HHTT HTHT THHT
        HTTH THTH TTHH HTTT
        THTT TTHT TTTH TTTT
• The event “the number of Heads is 2 (in the four flips)” is {HHTT, HTHT, HTTH, THHT, THTH, TTHH}.
• The event “there are more heads than tails” is {HHHH, HHHT, HHTH, HTHH, THHH}.

The probability of an event E, denoted by P(E), is a real number that is not negative and not larger than 1; the larger its probability, the more likely the event is to occur. The extreme cases are P(E) = 0, which means that E is impossible (for practical purposes), and P(E) = 1, which means that E is certain (for practical purposes). In particular, P(S) = 1 and P(∅) = 0.

Probability Interpretation: This “interpretation” is not part of the mathematical theory but gives us an idea of what probability means for practical purposes. Experience shows that the interpretation works in applications as expected. An event E with probability P(E) will occur roughly N · P(E) times if our experiment is performed N times and N is a very large number.

The equally-likely case: If every outcome of an experiment is as likely as any other, then the following formula allows us to compute the probability of an event E. If |E| is the number of outcomes belonging to E, and |S| is the number of all possible outcomes of the experiment, then

(1.3)   P(E) = |E| / |S|.

If we use a balanced die and a fair coin, then the two examples above fall under the equally-likely case. Thus
(1) P(“both tosses of the die produce odd numbers”) = 9/36 = 1/4.
(2) P(“the sum of the points is 5”) = 4/36 = 1/9.
(3) P(“the first toss produces a 4”) = 6/36 = 1/6.
(4) P(“there are more Heads than Tails”) = 5/16.

Some basic laws: The following laws facilitate the computation of probabilities by reducing more complicated situations to simpler ones, or by expressing unknown probabilities in terms of known ones.

The Multiplication Rule: If two events E1 and E2 are independent, then
P(E1 AND E2) = P(E1) · P(E2).
Two events are independent if the occurrence or non-occurrence of one has no effect on the occurrence or non-occurrence of the other.
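For readers who like to experiment, here is a small Python sketch (an illustration added to these notes, not part of the original) that lists the 36 outcomes of Example 1.1, applies formula (1.3), and checks the Multiplication Rule on two events that concern different tosses. The helper prob and the event functions are hypothetical names chosen for the illustration.

from fractions import Fraction
from itertools import product

S = list(product(range(1, 7), repeat=2))  # the 36 outcomes of two tosses of a die

def prob(event):
    # Equally-likely case, formula (1.3): P(E) = |E| / |S|.
    return Fraction(sum(1 for outcome in S if event(outcome)), len(S))

both_odd    = lambda o: o[0] % 2 == 1 and o[1] % 2 == 1   # "odd on both tosses"
sum_is_5    = lambda o: o[0] + o[1] == 5                  # "point total is 5"
four_first  = lambda o: o[0] == 4                         # "4 on the first toss"
even_second = lambda o: o[1] % 2 == 0                     # "even on the second toss"

print(prob(both_odd), prob(sum_is_5), prob(four_first))   # 1/4 1/9 1/6

# "4 on 1st" and "even on 2nd" concern different tosses, hence are independent,
# so the Multiplication Rule predicts P(E1 AND E2) = P(E1) * P(E2).
print(prob(lambda o: four_first(o) and even_second(o))
      == prob(four_first) * prob(even_second))            # True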
The Multiplication Rule is useful when we consider an experiment that consists of repeated independent trials of some other experiment. If an event E1 concerns only the outcome in one trial while the event E2 concerns only the outcome in another trial, then E1 and E2 are independent.

Example 1.4. Toss a die 5 times in succession. The events “4 on the second toss” and “an even number of points on the 4th toss” are independent. Hence
P(“4 on 2nd” AND “even on 4th”) = P(“4 on 2nd”) · P(“even on 4th”).
In addition, P(“4 on 2nd”) = 1/6 and P(“even on 4th”) = 1/2. Thus
P(“4 on 2nd” AND “even on 4th”) = 1/6 · 1/2 = 1/12.

The Addition Rule: If the events E1 and E2 are mutually exclusive, then
P(E1 OR E2) = P(E1) + P(E2).
Two events are mutually exclusive if they cannot occur simultaneously.

Example 1.5. Toss a die 2 times in succession. The events “sum of 4” and “sum of 5” are mutually exclusive. Thus
P(“sum of 4” OR “sum of 5”) = P(“sum of 4”) + P(“sum of 5”) = 3/36 + 4/36 = 7/36.
By contrast, “sum of 4” and “2 on 1st” are not mutually exclusive. In fact, the event
E = “sum of 4” OR “2 on 1st” = {(1,3), (2,2), (3,1), (2,1), (2,3), (2,4), (2,5), (2,6)}
and P(E) = 8/36 = 2/9, while P(“sum of 4”) + P(“2 on 1st”) = 3/36 + 6/36 = 9/36 = 1/4.

The General Addition Rule: For any two events E1 and E2,
P(E1 OR E2) = P(E1) + P(E2) − P(E1 AND E2).

The Subtraction Rule: For any event E, it is true that
P(NOT E) = 1 − P(E)   and   P(E) = 1 − P(NOT E).

Example 1.6. Toss a die 5 times in succession. P(“4 on 1st”) = 1/6, and by the Subtraction Rule, P(NOT “4 on 1st”) = 1 − 1/6 = 5/6. This saves a considerable amount of work. Also note: “NOT 4 on 1st” = “1, 2, 3, 5 or 6 on 1st”.

Exercise 1.7. Toss a fair die twice in succession. In each case list the event and compute the probability.
(1) P(“less than 4 on 1st”) = ?
(2) P(“3 or 4 on 2nd”) = ?
(3) P(“sum of 7”) = ?
(4) P(“sum NOT 7”) = ?
(5) P(“more points on 1st than on 2nd”) = ?
(6) P(“same points on 1st and 2nd”) = ?
(7) P(“more points on 2nd than on 1st”) = ?
(8) P(“sum of 7” OR “sum of 3”) = ?
(9) P(“sum of 3” OR “odd on 1st”) = ?
(10) P(“odd on 1st” OR “even on 2nd”) = ?
(11) P(“sum of 6” OR “sum of 5”) = ?

Exercise 1.8. Toss a fair coin 4 times in succession. In each case list the event and compute the probability.
(1) P(“exactly one head”) = ?
(2) P(“less than two heads”) = ?
(3) P(“3 or more tails”) = ?
(4) P(“heads on 1st and tails on 3rd”) = ?
(5) P(“4 heads”) = ?
(6) P(“more H than T”) = ?
(7) P(“fewer H than T”) = ?
(8) P(NOT “more H than T”) = ?
(9) P(“more H than T” OR “more T than H”) = ?
(10) P(“more H than T” AND “more T than H”) = ?
(11) P(“H on the first 3 tosses” AND “T on 4th”) = ?
(12) P(“T on the first 3 tosses” AND “H on 4th”) = ?

2. Expectation

Frequently a numerical value is attached to the outcome of a probabilistic experiment, such as the gain or loss in a gambling game. We will talk about the “pay-off”, which may be positive or negative and, in the case of a gambling game, includes both the wins and the losses. The technical mathematical term is “random variable”, but we will not use it and will stick to the “pay-offs” of gambling games and the like.

Interpretation: There is an interpretation of expectation that is analogous to our “probability interpretation”: the expectation in a game, or the expected pay-off, is the long-term average pay-off per game (or trial).
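The following Python sketch (an illustration added to these notes, not part of the original) shows this interpretation in action. It simulates the die game analyzed in Example 2.1 below, in which an even face wins that many dollars and an odd face loses that many dollars, and prints the average pay-off over more and more games; the averages drift toward the expectation, 1/2. The function names are chosen just for this illustration.

import random

def payoff(face):
    # Even face: win that many dollars; odd face: pay that many dollars.
    return face if face % 2 == 0 else -face

def average_payoff(num_games, seed=0):
    rng = random.Random(seed)      # fixed seed so the run is reproducible
    total = sum(payoff(rng.randint(1, 6)) for _ in range(num_games))
    return total / num_games

for n in (100, 10_000, 1_000_000):
    print(n, average_payoff(n))    # averages approach the expectation, 1/2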
Computation of the expectation: First list the possible different pay-offs; the pay-off 0 may be ignored. Suppose the different pay-offs are N1, N2, N3, .... Secondly, find the probabilities of these pay-offs. Suppose the probabilities are, respectively, P1, P2, P3, .... The expected pay-off is then given by the formula
N1 · P1 + N2 · P2 + N3 · P3 + · · ·
In words, the expectation is the sum, over all possible pay-offs, of (pay-off) × (probability of that pay-off).

Example 2.1. Suppose that a fair die is rolled. If an even number comes up, you get that number of dollars; if an odd number comes up, you must pay that number of dollars. We first note that there are six possible outcomes, 1, 2, 3, 4, 5, 6, and they all have probability 1/6. The possible pay-offs are −1, 2, −3, 4, −5, 6. The expectation is
(1/6)(−1) + (1/6)(2) + (1/6)(−3) + (1/6)(4) + (1/6)(−5) + (1/6)(6) = (1/6)(−1 + 2 − 3 + 4 − 5 + 6) = 3/6 = 1/2.

Example 2.2. A die is tossed three times in succession.
(1) You win $6 whenever a triple occurs, such as (1,1,1) or (4,4,4), and you lose $1 whenever each of the three throws results in an even number.
The list of pay-offs is N1 = 6, N2 = −1. The list of the corresponding probabilities is P1 = 6/(6 · 6 · 6) = 1/36, P2 = (3 · 3 · 3)/(6 · 6 · 6) = 1/8.
Expectation = 6 · (1/36) − 1 · (1/8) = 1/6 − 1/8 = 1/24.
It pays to play this game. However, the gain is small.
(2) You win $4 if there is no repetition in the three throws and you lose $5 if there is some repetition. What is your expected pay-off? Would you play this game?
The list of pay-offs is N1 = 4, N2 = −5. The list of the corresponding probabilities is P1 = (6 · 5 · 4)/(6 · 6 · 6) = 5/9, P2 = 1 − 5/9 = 4/9.
Expectation = 4 · (5/9) − 5 · (4/9) = 0.
This is a “fair game”. You may play it, and in the long run you will neither lose nor win.

Example 2.3. Toss a fair coin 5 times in succession. You win $4 if heads appears exactly once, you lose $1 if there are more heads than tails, and you lose $2 if all tosses turn up tails.
The list of pay-offs is N1 = 4, N2 = −1, N3 = −2. The list of corresponding probabilities is P1 = 5/(2 · 2 · 2 · 2 · 2) = 5/32, P2 = 1/2, P3 = 1/(2 · 2 · 2 · 2 · 2) = 1/32.
Expectation = 4 · (5/32) − 1 · (1/2) − 2 · (1/32) = 1/16.
You win in the long run at the rate of $1 per 16 games.
Note: P2 is obtained easily as follows. In five tosses there are either more tails than heads or more heads than tails, so P(“more H than T”) = 1 − P(“more T than H”); also P(“more T than H”) = P(“more H than T”) by symmetry. Hence P(“more H than T”) = 1 − P(“more H than T”), and this implies P(“more H than T”) = 1/2.

Exercise 2.4. A fair die is tossed twice in succession. Compute the expected pay-off.
(1) You win $10 if the sum of the throws is 5 or less, and you lose $1 if the sum is larger than 5.
(2) You win $12 if one or more 6's are thrown, you lose $3 if the two throws are both 3 or less, and you lose $1 on a double 5.
(3) You win $4 if there is no repetition in the two throws and you lose $5 if there is some repetition.

Exercise 2.5. A coin is tossed four times in succession. Compute the expected pay-off.
(1) You win $4 if exactly 2 heads turn up and you lose $8 if exactly 3 tails turn up.
(2) You win as many dollars as the positive difference between heads and tails. (The positive difference between heads and tails for the outcome TTTT and for HHHH is 4 − 0 = 4; for TTTH and for HHHT it is 3 − 1 = 2.)

Exercise 2.6. A coin is tossed three times in succession. Compute the expected pay-off. You win $4 whenever H appears on the first throw, and you lose $2 whenever T appears on the 1st throw and H appears on the 2nd or 3rd throw (or both).
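As a way to check the worked examples (and, if you wish, your answers to the exercises), here is a Python sketch, added for illustration and not part of the original notes, that computes an expectation by brute force: it enumerates all equally likely outcomes, evaluates the pay-off on each, and averages. The function name expectation and the pay-off functions are illustrative choices; the two checks reproduce the answers 1/24 and 1/16 obtained in Examples 2.2(1) and 2.3.

from fractions import Fraction
from itertools import product

def expectation(outcomes, payoff):
    # Average of payoff(outcome) over a collection of equally likely outcomes.
    outcomes = list(outcomes)
    return Fraction(sum(payoff(o) for o in outcomes), len(outcomes))

# Example 2.2 (1): three die tosses; win $6 on a triple, lose $1 if all three are even.
dice3 = product(range(1, 7), repeat=3)
pay_dice = lambda o: 6 * (o[0] == o[1] == o[2]) - 1 * all(x % 2 == 0 for x in o)
print(expectation(dice3, pay_dice))    # 1/24

# Example 2.3: five coin tosses; +$4 for exactly one head,
# -$1 for more heads than tails, -$2 for all tails.
coins5 = product("HT", repeat=5)
def pay_coins(o):
    heads = o.count("H")
    return 4 * (heads == 1) - 1 * (heads > 5 - heads) - 2 * (heads == 0)
print(expectation(coins5, pay_coins))  # 1/16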