Dice Games:
Probability and Pascal
by Lauren McCluskey
This PowerPoint was made with help from:
• Bayesian Learning Application to Text Classification Example: spam filtering by Marius Bulacu & prof. dr. Lambert Schomaker
• Mathematicians by www.2july-maths.co.uk/powerpoint/mathematicians.ppt
• Basic Models of Probability by Ron S. Kenett, Weizmann Institute of Science
• Introduction to Information Theory by Larry Yaeger, Professor of Informatics, Indiana University
• Access to Math, Probability published by www.pearsonlearning.com
• www.mtsu32.mtsu.edu:11208/Chap9Pres.ppt
Founders of Probability Theory
Blaise Pascal (1623-1662, France)
Pierre Fermat (1601-1665, France)
They laid the foundations of probability theory in a correspondence on a dice game.
From: Bayesian Learning Application to Text Classification Example: spam filtering
by Marius Bulacu & prof. dr. Lambert Schomaker
Pascal
from: Mathematicians by www.2july-maths.co.uk/powerpoint/mathematicians.ppt
Blaise Pascal
1623 - 1662
Blaise Pascal, according to contemporary observers, suffered migraines in his youth and deplorable health as an adult, and lived much of his brief life of 39 years in pain. Nevertheless, he managed to make considerable contributions in his fields of interest, mathematics and physics, aided by keen curiosity and penetrating analytical ability.
Pascal
from: Mathematicians by www.2july-maths.co.uk/powerpoint/mathematicians.ppt
Probability theory was Pascal's principal and perhaps most enduring contribution to mathematics; its foundations were established in a long exchange of letters between Pascal and his fellow French mathematician Fermat.
Basic Models of Probability
by Ron S. Kenett, Weizmann Institute of Science
The Paradox of the Chevalier de Mere - 1
Game A: roll one die 4 times. Success = at least one “1”
Basic Models of Probability
by Ron S. Kenett, Weizmann Institute of Science
The Paradox of the Chevalier de Mere - 2
Game B: roll a pair of dice 24 times. Success = at least one “1,1”
Basic Models of Probability
by Ron S. Kenett, Weizmann Institute of Science
The Paradox of the Chevalier de Mere - 3
The Chevalier's reasoning:
P(Success) = P(at least one “1”) = 4 × 1/6 = 2/3
P(Success) = P(at least one “1,1”) = 24 × 1/36 = 2/3
Experience proved otherwise!
Game A was a better game to play.
Basic Models of Probability
by Ron S. Kenett, Weizmann Institute of Science
The Paradox of the Chevalier de Mere - 4
The calculations of Pascal and Fermat
P(Failure) = P(no “1”) = (5/6)^4 ≈ .482
P(Success) = 1 - .482 = .518
P(Failure) = P(no “1,1”) = (35/36)^24 ≈ .509
P(Success) = 1 - .509 = .491
What went wrong before?
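As a quick check of these numbers (not part of the original slides; a minimal sketch assuming only the Python standard library, with illustrative function names), the exact formulas and a Monte Carlo simulation agree with Pascal and Fermat:

```python
import random

# Exact values from the Pascal-Fermat argument
p_game_a = 1 - (5/6) ** 4      # at least one "1" in 4 rolls of one die
p_game_b = 1 - (35/36) ** 24   # at least one "1,1" in 24 rolls of a pair
print(round(p_game_a, 3), round(p_game_b, 3))   # 0.518  0.491

# Monte Carlo check
def game_a():
    return any(random.randint(1, 6) == 1 for _ in range(4))

def game_b():
    return any(random.randint(1, 6) == 1 and random.randint(1, 6) == 1
               for _ in range(24))

trials = 100_000
print(sum(game_a() for _ in range(trials)) / trials)   # close to 0.518
print(sum(game_b() for _ in range(trials)) / trials)   # close to 0.491
```

The simulated frequencies land near .518 and .491, matching the calculation above rather than the Chevalier's 2/3.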
Sample Space for Dice
from: Introduction to Information Theory by Larry Yaeger
• Single die has six elementary outcomes:
• Two dice have 36 elementary outcomes:
Each roll of one die has a 1/6 chance of showing a “1”, so P(no “1” in 4 rolls) = 5/6 × 5/6 × 5/6 × 5/6 = (5/6)^4 ≈ .482, and P(at least one “1”) ≈ .518.
• While…
Each throw of the pair has a 1/36 chance of showing “1,1”, so P(no “1,1” in 24 throws) = (35/36)^24 ≈ .509, and P(at least one “1,1”) ≈ .491.
The naive answers (2/3 and 2/3) went wrong because the per-roll probabilities cannot simply be added.
Apply it!
When to sit and when to stand…?
How many times can we roll one die before
we get a “1”?
Try this:
S.K.U.N.K.
1. Stand up.
2. Someone rolls a die.
3. Sit down: Keep your score.
OR:
Remain standing: Add it up.
But…
Watch Out!
4. If you’re standing when a “1” is rolled: your score for the round = “0”!
5. New round: Stand up.
6. Repeat 5 times: one round for each
letter in the word S.K.U.N.K.
Reflection:
*What is your winning strategy?
*Why will this work?
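One way to explore the strategy question is a quick simulation (not in the original slides; a minimal Python sketch where the “stay standing for n rolls” strategy and the function name are illustrative assumptions):

```python
import random

def skunk_round(rolls_while_standing):
    """Play one S.K.U.N.K. round, sitting down after a fixed number of safe rolls."""
    total = 0
    for _ in range(rolls_while_standing):
        roll = random.randint(1, 6)
        if roll == 1:      # a "1" while you are still standing wipes out the round
            return 0
        total += roll
    return total           # sat down in time: keep the score

trials = 50_000
for n in range(1, 11):
    avg = sum(skunk_round(n) for _ in range(trials)) / trials
    print(f"stand for {n:2d} rolls: average round score is about {avg:.1f}")
```

Each extra roll adds roughly 4 points on average but risks the 1-in-6 chance of losing everything accumulated so far, so the average score levels off and then falls as you stay standing longer.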
Remember:
Sample Space for Dice
from: Introduction to Information Theory by Larry Yaeger
• Single die has six elementary outcomes:
• Two dice have 36 elementary outcomes:
Apply It!
• When to roll and when to stop…?
• How many times can we roll 2 dice before
we roll a “1” or a “1, 1”?
• Try this:
PIG
1. Take turns rolling 2 dice.
2. Keep rolling: Add it up.
3. Stop: Keep your score.
But…
Watch Out!
4. Roll “1”: Lose your turn.
Roll “1, 1”: Lose it ALL! (Back to “0”!)
5. Get a score of 100: You WIN!
Reflection:
*What is your winning strategy?
*Why will this work?
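The per-roll risk with a pair of dice can be read off the sample space, and a short turn simulation lets you compare stopping points (not in the original slides; the bank_at threshold and function name are illustrative assumptions):

```python
import random

# Per-roll outcomes when throwing a pair of dice:
#   25/36 safe (neither die shows a "1"), 10/36 exactly one "1" (turn total lost),
#   1/36 "1,1" (the whole banked score goes back to 0)

def pig_turn(bank_at):
    """One PIG turn: keep rolling until the turn total reaches bank_at.
    Returns (points banked this turn, whether the banked score was wiped out)."""
    turn_total = 0
    while turn_total < bank_at:
        d1, d2 = random.randint(1, 6), random.randint(1, 6)
        if d1 == 1 and d2 == 1:
            return 0, True         # "1,1": lose it ALL
        if d1 == 1 or d2 == 1:
            return 0, False        # a single "1": lose only this turn's total
        turn_total += d1 + d2
    return turn_total, False

trials = 50_000
for bank_at in (10, 20, 30, 40):
    results = [pig_turn(bank_at) for _ in range(trials)]
    avg = sum(points for points, _ in results) / trials
    wiped = sum(w for _, w in results) / trials
    print(f"bank at {bank_at}: average turn is about {avg:5.1f}, "
          f"wiped out {wiped:.1%} of turns")
```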
• Remember:
Sample Space for Dice
from: Introduction to Information Theory by Larry Yaeger
• Single die has six elementary outcomes:
• Two dice have 36 elementary outcomes:
1/36 chance of rolling “1,1”
Sample Space for Dice
from: Introduction to Information Theory by Larry Yaeger
• Single die has six elementary outcomes:
• Two dice have 36 elementary outcomes:
11/36 chances to roll at least one “1”
from: Introduction to Information Theory by Larry Yaeger
The Addition Rule
• Now throw a pair of black & white dice, and ask: What is the probability of throwing at least one “1”?
– Let event a = the white die will show a one
– Let event b = the black die will show a one
Basic Models of Probability
by Ron S. Kenett, Weizmann Institute of Science
P(“1” with 2 dice) = ?
To add or to multiply?
Independent Events
• “Independent events: two events with outcomes that do not depend on each other.” (from: Access to Math, Probability)
Independent Events: Either /OR
• When two events are independent, AND either
one is favorable, you add their probabilities.
Example:
What is the probability that I might roll a 1 on the
black die? 6/36 or 1/6
What is the probability that I might roll a 1 on the
white die? 6/36 or 1/6
What is the probability that I will roll a “1” on either the black die OR the white die? 12/36 or 1/3.
Independent Events: Either /OR
*Careful, though: when you count 6/36 for the white die and 6/36 for the black die, the outcome where BOTH dice show “1” (1W and 1B) gets counted twice, and it must only be counted once. So the probability of rolling at least one “1” is 11/36 instead of 12/36.
Sample Space for Dice
from: Introduction to Information Theory by Larry Yaeger
• Single die has six elementary outcomes:
• Two dice have 36 elementary outcomes:
11/36 chances to roll at least one “1”
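Counting the sample space directly confirms both numbers; here is a minimal sketch (not part of the original slides, standard library only):

```python
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))        # all 36 (white, black) pairs
at_least_one_1 = [o for o in outcomes if 1 in o]
double_1 = [o for o in outcomes if o == (1, 1)]

print(len(outcomes))         # 36
print(len(at_least_one_1))   # 11, not 12: (1, 1) is only counted once
print(len(double_1))         # 1, so P("1,1") = 1/36
```

The same count of the single (1, 1) cell gives the 1/36 used on the next slide.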
Independent Events:
And Then…
• When two events are independent, BUT you
want to have BOTH of them, you multiply their
probabilities.
• Example:
• What is the probability that I will roll a “1, 1”?
P(1,1) = P(1) × P(1) = 1/6 × 1/6 = 1/36.
• P(1,1) = 1/36 because there is only ONE of the 36 outcomes where both dice show “1”.
Over 24 throws of the pair, P(no “1,1”) = (35/36)^24 ≈ .509, so P(at least one “1,1”) ≈ .491.
Dependent Events
• “Dependent events: a set of events in
which the outcome of the first event affects
the outcome of the next event.”
(from: Access to Math, Probability)
Dependent Events
• To find the probability of dependent
events, multiply the probability of the first
by the probability of the second (given that
the first has occurred).
• Example: You have the letters M, A, T, and H in an envelope. What is the probability that you will pull an “M” and then an “A”?
Dependent Events
• P(M) = 1/4 because there are 4 cards, and
• P(A after M) = 1/3 because there are NOW only 3 cards left…
so 1/4 × 1/3 = 1/12.
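The same answer falls out of listing every ordered pair of draws; a minimal sketch (not part of the original slides, standard library only):

```python
from itertools import permutations

cards = ["M", "A", "T", "H"]
orders = list(permutations(cards, 2))       # ordered draws of 2 cards, no replacement
favorable = [o for o in orders if o == ("M", "A")]

print(len(favorable), "/", len(orders))     # 1 / 12, so P(M then A) = 1/12
```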
Apply It!
• Put the letters M, A, T, and H in an envelope and pull them out 1 at a time.
• Replace the cards, then do it again.
(Repeat 20 times.)
• Record your results.
*Think about it: what would happen if you hadn’t
replaced the cards each time?
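If you want to check your tallies against theory, here is a minimal simulation sketch (not part of the original activity; the trial count and function name are illustrative assumptions):

```python
import random

cards = ["M", "A", "T", "H"]
trials = 20_000

def m_then_a(with_replacement):
    """Draw two cards and report whether they came out as "M" then "A"."""
    first = random.choice(cards)
    remaining = cards if with_replacement else [c for c in cards if c != first]
    second = random.choice(remaining)
    return first == "M" and second == "A"

for with_replacement in (True, False):
    hits = sum(m_then_a(with_replacement) for _ in range(trials))
    label = "with replacement" if with_replacement else "without replacement"
    print(f"{label}: P(M then A) is about {hits / trials:.3f}")
# with replacement: about 1/16 = .063; without replacement: about 1/12 = .083
```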