Probability Theory, Discrete, and
Continuous Probability
Chapters 4, 5, and 6
I. Probability Theory
A. Set Theory The background for probability theory comes from set
theory. We shall define certain operations on the set of points which
make up the sample space. A set of points, sometimes called simply
a set, is an aggregate of elements having certain specified properties.
1. Points or Elements Points are representations of characteristics
or outcomes of interest. Simple points characterize the set of
all possible, mutually exclusive outcomes. Points are the basic
elements as suggested by the analogy to points as the building
blocks of lines and solids in geometry.
2. The Universal Set In each discussion there will be a universal
set, the sample space S, such that all other sets in the discussion
are subsets of S.
3. A Set is a collection of points.
a. Membership If s is a point or an element belonging to
the set A, we shall write s ∈ A.
b. Equality Two sets S1 and S2 are said to be equal if every
element or point of S1 is also a point of S2 , and every
point of S2 is also a point of S1 . This will be written
S1 = S2 .
c. Subsets If every point of S1 is also a point of S2 , then S1
will be called a subset of S2 , and we shall write this as
S1 ⊂ S2 .
4. The null or empty set If a set S1 contains no points, it will be
called the null set, which will be indicated by ∅.
5. Complement The complement of a set S1 with respect to the
sample space S will be the set of points in S but not in S1 .
This set will be denoted by S1^c.
6. Intersections Let S1 and S2 be any two events in the sample
space S; then the event which consists of all points which are in
S1 and S2 , is called the intersection of S1 and S2 , and is written
S1 ∩ S2 .
7. Union Let S1 and S2 be any two events in the sample space
S; then the event which consists of all points which are in S1
or S2 or both, is called the union of S1 and S2 , and is written
S1 ∪ S2 .
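As an added illustration (not part of the original notes), the set operations above can be checked directly with Python's built-in set type; the die outcomes below are an assumed example.

    # Assumed example: outcomes of one roll of a six-sided die
    S = {1, 2, 3, 4, 5, 6}       # sample space (universal set)
    S1 = {2, 4, 6}               # event "even number"
    S2 = {4, 5, 6}               # event "four or more"

    print(S - S1)                # complement of S1 in S -> {1, 3, 5}
    print(S1 & S2)               # intersection S1 ∩ S2  -> {4, 6}
    print(S1 | S2)               # union S1 ∪ S2         -> {2, 4, 5, 6}
    print(S1 <= S)               # S1 ⊂ S (subset check) -> True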
B. Probability Theory Probability theory builds upon set theory. It
begins by positing a probability function on the basic outcomes and
extends this to the probability of an event, which is a union of simple
outcomes.
1. The probability function For each basic outcome there is a relative likelihood, called a probability such that:
    0 ≤ Pr{ei} ≤ 1, and   Σ_{ei ∈ S} Pr{ei} = 1.0
An event A in the sample space S is defined to be a set A of
points in S, and when we say “The probability that event A
occurs”, we shall mean the probability that any of the points
of A occurs. Thus
    Pr{A} = Σ_{ei ∈ A} Pr{ei}
2. Correspondences
Set Theory       Probability Theory           Probability
Universal Set    Sample Space                 1.0
Point            Simple Outcome or Event      Pr{ei}
Set              Compound Outcome or Event    Σ_{ei ∈ A} Pr{ei}
Empty Set        Impossible Event             0
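A minimal numerical sketch of the probability function in B.1 (the fair-die probabilities are assumed for illustration and are not from the text):

    # Assumed example: a fair die, so Pr{ei} = 1/6 for every simple outcome
    pr = {e: 1/6 for e in range(1, 7)}

    # 0 <= Pr{ei} <= 1 for each point, and the probabilities over S sum to 1.0
    assert all(0 <= p <= 1 for p in pr.values())
    assert abs(sum(pr.values()) - 1.0) < 1e-12

    # Pr{A} is the sum of Pr{ei} over the points of A, e.g. A = "even number"
    A = {2, 4, 6}
    print(sum(pr[e] for e in A))    # ≈ 0.5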
C. Probability Structure
1. Pr(S) = 1.0.
2. Let S be a sample space, and let Pr be a probability function
on S. If S0 is the null set, then Pr(S0) = 0.
3. If S1 , S2 , S3 . . . is a sequence of mutually exclusive events in
S, then
Pr(S1 ∪ S2 ∪ · · · ) = Pr(S1 ) + Pr(S2 ) + · · ·
4. Pr(A) is a real number such that 0 ≤ Pr(A) ≤ 1 for every event A
in S.
5. Let S be a sample space, and let Pr be a probability function
on S. The probability that the event A does not happen is
1 − Pr(A).
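As a small check of rules 3 and 5 (an added sketch, again using the assumed fair-die example):

    # Fair die: S1 = {1} and S2 = {2} are mutually exclusive events
    pr = {e: 1/6 for e in range(1, 7)}
    pr_union = sum(pr[e] for e in {1} | {2})
    assert abs(pr_union - (pr[1] + pr[2])) < 1e-12   # rule 3: Pr(S1 ∪ S2) = Pr(S1) + Pr(S2)

    # Rule 5: the probability that A does not happen is 1 - Pr(A)
    A = {2, 4, 6}
    pr_not_A = sum(pr[e] for e in set(pr) - A)
    assert abs(pr_not_A - (1 - sum(pr[e] for e in A))) < 1e-12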
D. Conditional Probability and Independence Suppose a population of N people contains NA color-blind people, NB females, and
NA∩B people who are female and color-blind (NA∩B ≤ min(NA , NB )).
Let the outcome that a person chosen at random is color-blind be
event A, and the outcome that a person chosen at random is a female
be event B. Finally, let the outcome that a person is both color-blind
and female be denoted by A ∩ B. The probabilities associated with
each event are based upon the assertion that drawing any person
among the N is equally likely. These probabilities are:
    Pr(A) = NA/N,    Pr(B) = NB/N,    Pr(A ∩ B) = NA∩B/N
Instead of the entire population, we may only be interested in the
percentage of females who are color-blind. Quite often in probabilistic
situations, we have information which tells us that we are within a
particular subpopulation of the total population of a particular experiment. It is natural to ask whether or not the knowledge concerning
the subset allows us to refine our probability estimate of the event
we are interested in. For example, our probability estimate of having
more than one inch of rain fall on a particular day would differ if we
were told that it did or didn’t rain on that day.
    Pr(A|B) = Pr(A ∩ B) / Pr(B) = NA∩B / NB
1. Independence A group of events (A1, A2, . . . , An) is said to be
independent if (and only if)
    Pr(A1 ∩ A2 ∩ · · · ∩ An) = Pr(A1) · Pr(A2) · . . . · Pr(An),
with the same product rule holding for every sub-collection of the
events when more than two events are involved.
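A short numerical sketch of the conditional-probability formula and of the two-event independence check (all counts below are made up for illustration):

    # Made-up population counts (not data from the notes)
    N, N_A, N_B, N_AB = 1000, 40, 520, 6     # total, color-blind, female, both

    pr_A, pr_B, pr_AB = N_A / N, N_B / N, N_AB / N

    # Pr(A|B) = Pr(A ∩ B) / Pr(B) = N_AB / N_B
    pr_A_given_B = pr_AB / pr_B
    print(pr_A_given_B, N_AB / N_B)          # both ≈ 0.0115

    # A and B are independent only if Pr(A ∩ B) = Pr(A) · Pr(B)
    print(abs(pr_AB - pr_A * pr_B) < 1e-12)  # False for these counts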
II. Discrete and Continuous Probability Distributions
A. Introduction to Probability Distributions and Random Variables (Chapter 5).
1. What is a random variable?
2. What are the basic requirements of a discrete probability distribution?
a. 0 ≤ Pr(x) ≤ 1.0
b. Σ_{all x} Pr(x) = 1.0
3. What is a cumulative distribution?
4. What is the difference between a discrete and a continuous
random variable?
5. What are the basic requirements of a continuous probability
density?
a. height ≥ 0
b. Area under density = 1.0
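One way to picture requirements 2 and 3 above (an added sketch; the probability table is made up):

    # Hypothetical discrete distribution: value x and Pr(x)
    pmf = {0: 0.25, 1: 0.5, 2: 0.125, 3: 0.125}

    # Basic requirements: 0 <= Pr(x) <= 1 and the probabilities sum to 1.0
    assert all(0 <= p <= 1 for p in pmf.values())
    assert abs(sum(pmf.values()) - 1.0) < 1e-12

    # Cumulative distribution F(x) = Pr(X <= x), built from running sums
    cdf, running = {}, 0.0
    for x in sorted(pmf):
        running += pmf[x]
        cdf[x] = running
    print(cdf)    # {0: 0.25, 1: 0.75, 2: 0.875, 3: 1.0}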
B. Expected Values
1. What is meant by an expected value and how is it calculated?
2. How is the population mean related to expected value?
3. How is the population variance related to expected value?
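A minimal sketch of how the population mean and variance of a discrete random variable arise as expected values (same made-up distribution as in the previous sketch):

    # E[X] and Var(X) for the hypothetical distribution above
    pmf = {0: 0.25, 1: 0.5, 2: 0.125, 3: 0.125}

    # Population mean: mu = E[X] = sum of x * Pr(x)
    mu = sum(x * p for x, p in pmf.items())
    # Population variance: sigma^2 = E[(X - mu)^2] = sum of (x - mu)^2 * Pr(x)
    var = sum((x - mu) ** 2 * p for x, p in pmf.items())
    print(mu, var)    # 1.125 and 0.859375 for these made-up numbers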
C. Discrete Probability Distributions
1. The Binomial Distribution
a. What is the Binomial distribution?
b. What conditions/situations does it describe?
c. What is the mean of a binomial random variable?
d. What is the variance of a binomial random variable?
2. The Poisson Distribution
a. What is the Poisson distribution?
b. What conditions/situations does it describe?
c. What is the mean of a Poisson random variable?
d. What is the variance of a Poisson random variable? (A
numerical sketch of both distributions follows this list.)
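The following added sketch (with made-up parameters n, p, and lam; only the Python standard library is used) computes both probability functions directly and confirms the mean and variance formulas numerically:

    import math

    # Binomial(n, p): Pr(X = k) = C(n, k) * p**k * (1 - p)**(n - k)
    n, p = 10, 0.3
    binom = [math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
    mean_b = sum(k * q for k, q in enumerate(binom))
    var_b = sum((k - mean_b)**2 * q for k, q in enumerate(binom))
    print(mean_b, n * p)              # both ≈ 3.0 = n*p
    print(var_b, n * p * (1 - p))     # both ≈ 2.1 = n*p*(1-p)

    # Poisson(lam): Pr(X = k) = exp(-lam) * lam**k / k!
    lam = 2.5
    poisson = [math.exp(-lam) * lam**k / math.factorial(k) for k in range(60)]
    mean_p = sum(k * q for k, q in enumerate(poisson))
    var_p = sum((k - mean_p)**2 * q for k, q in enumerate(poisson))
    print(mean_p, var_p)              # both ≈ lam = 2.5 (sum truncated at k = 59)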
D. Continuous Distributions
1. What is the primary difference between a continuous and a
discrete distribution?
2. What is the probability that a continuous random variable takes
a particular (exact) value?
3. The Uniform Distribution
a. What is the uniform distribution?
b. What conditions/situations does it describe?
c. What is the mean of a uniform random variable?
d. What is the variance of a uniform random variable?
4. The standard normal distribution and its tables (6.2)
a. The table displays areas between the mean (0) and positive values of z. Remember that the probability that a
random variable is less than a particular value a is written F (a), where F (a) is the cumulative distribution. The
standard normal table then displays
    Table Area = F(z) − 0.5,    z > 0.
b. You should be able to use the table to compute the
Pr{0 < z < b} for a given b > 0.
c. You should be able to use the table to compute the
Pr{z > b}.
d. You should be able to use the table to compute the
Pr{a < z < b} .
e. You should be able to use the table to compute the
Pr{a > z or z > b}.
f. You should also be able to find a z which produces a
certain probability.
5. Transforming a normal random variable into a z-score and vice
versa (6.2).
a. There are two important formulas in this section:
b. The first formula transforms a normal x into a z-score:
    z = (x − µx) / σx
c. The second formula transforms a z-score into a normally
distributed x,
x = µx + zσx.
d. Using these formulae, you should be able to make any
probability statements from section 4-3 as they apply to
x (the non-standard normal).
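To close, an added sketch (not from the notes) showing how the table areas in item 4 and the two transformation formulas in item 5 can be reproduced with Python's standard library; the mean, standard deviation, and cutoffs below are assumed values.

    import math

    def phi(z):
        # Standard normal cumulative distribution F(z), via the error function
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

    # Table area between the mean 0 and a positive z: F(z) - 0.5
    b = 1.96
    print(phi(b) - 0.5)            # ≈ 0.4750, the usual table entry for z = 1.96

    # Pr{z > b} and Pr{a < z < b}
    a = -1.0
    print(1.0 - phi(b))            # ≈ 0.0250
    print(phi(b) - phi(a))         # ≈ 0.8163

    # Transforming between x ~ Normal(mu_x, sigma_x) and the z-score
    mu_x, sigma_x = 100.0, 15.0    # assumed parameters
    x = 130.0
    z = (x - mu_x) / sigma_x       # first formula: z = (x - mu_x) / sigma_x -> 2.0
    x_back = mu_x + z * sigma_x    # second formula recovers x -> 130.0
    print(z, x_back, 1.0 - phi(z)) # Pr{X > 130} ≈ 0.0228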