251probabl 9/28/04 (Open this document in 'Outline' view!)
H. Introduction to Probability
1. Experiments and Probability
Define a random experiment, a sample space, an outcome, a basic
outcome
a. Definition and rules for Statistical Probability.
(i) If A1 is impossible, P(A1) = 0.
(ii) If A1 is certain, P(A1) = 1.
(iii) For any outcome A1, 0 ≤ P(A1) ≤ 1.
(iv) If A1, A2, A3, ..., AN represent all possible outcomes
and are mutually exclusive, then
P(A1) + P(A2) + P(A3) + ... + P(AN) = 1.
b. An Event.
c. Symmetrical, Statistical and Subjective Probability.
2. The Venn Diagram.
A diagram representing events as sets of points or ‘puddles.’
a. The Addition Rule. P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
(i) Meaning of Union (or).
P(A ∪ B) means the probability of A occurring or of
B occurring or both. It always includes P(A ∩ B) if
it exists.
(ii) Meaning of Intersection (and).
P(A ∩ B) means the probability of both A and B
occurring. Note that if A and B are mutually
exclusive, P(A ∩ B) = 0.
[Diagram for dice problems: a 6 × 6 grid of points, one for each of the 36 equally likely outcomes when two dice are thrown; events can be marked as 'puddles' of these points.]
b. Meaning of Complement. P(Ā) = 1 − P(A).
This event, Ā, can be called 'not A'. Note that if A and B are
collectively exhaustive, P(A ∪ B) = 1. If A and B are both
collectively exhaustive and mutually exclusive, B is the
complement of A.
c. Extended Addition Rule.
P A  B  C   P A  PB  PC   P A  B  P A  C   PB  C   P A  B  C 
3. Conditional and Joint Probability.
a. The Multiplication Rule: P(A ∩ B) = P(A|B)·P(B), or
P(A|B) = P(A ∩ B)/P(B).
The conditional probability of A given B,
P(A|B) = P(A ∩ B)/P(B),
is the probability of event A assuming that
event B has occurred.
b. A Joint Probability Table.
What is the difference between joint, marginal and conditional
probabilities? Remember that we cannot read a conditional
probability directly from a joint probability table but must
compute it using the second version of the Multiplication Rule.
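As a concrete illustration (my own made-up numbers, not one of the 251 examples), here is a small joint probability table held in a Python dict, with marginal probabilities read off as row and column totals and a conditional probability computed with the Multiplication Rule:

    from fractions import Fraction as F

    # Hypothetical joint probability table:
    #               B        not B
    #     A         0.20     0.30
    #     not A     0.10     0.40
    joint = {('A', 'B'): F(2, 10), ('A', 'notB'): F(3, 10),
             ('notA', 'B'): F(1, 10), ('notA', 'notB'): F(4, 10)}

    P_A = joint[('A', 'B')] + joint[('A', 'notB')]   # marginal P(A) = 1/2
    P_B = joint[('A', 'B')] + joint[('notA', 'B')]   # marginal P(B) = 3/10

    # A conditional probability cannot be read directly; it must be computed:
    P_A_given_B = joint[('A', 'B')] / P_B            # P(A|B) = P(A and B)/P(B) = 2/3
    print(P_A, P_B, P_A_given_B)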
c. Extended Multiplication Rule.
P A  B  C   PC A  BPB AP A
d. Bayes' Rule. PB A 
PA B PB 
P A
4. Statistical Independence.
a. Definition: P(A|B) = P(A)
b. Consequence: P(A ∩ B) = P(A)·P(B)
c. Consequence: If A and B are independent, so are A and B̄, Ā and B, Ā and B̄, etc.
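Continuing the dice sketch above, independence is easy to verify numerically (again my own illustration):

    from fractions import Fraction

    space = [(i, j) for i in range(1, 7) for j in range(1, 7)]
    P = lambda event: Fraction(len(event), len(space))

    A = {pt for pt in space if pt[0] == 6}   # first die shows a 6
    B = {pt for pt in space if pt[1] == 6}   # second die shows a 6

    # The first die tells us nothing about the second, so A and B are independent.
    assert P(A & B) == P(A) * P(B)           # 1/36 == (1/6)(1/6)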
5. Review.
Rule                       In General                       A and B independent        A and B mutually exclusive
Multiplication P(A ∩ B)    = P(A|B)·P(B) = P(B|A)·P(A)      = P(A)·P(B)                = 0
Addition P(A ∪ B)          = P(A) + P(B) − P(A ∩ B)         = P(A) + P(B) − P(A)·P(B)  = P(A) + P(B)
Bayes' Rule P(A|B)         = P(B|A)·P(A)/P(B)               = P(A)                     = 0
Bayes' Rule P(B|A)         = P(A|B)·P(B)/P(A)               = P(B)                     = 0
I. Permutations and Combinations.
1. Counting Rule for Outcomes.
a. If an experiment has k steps and there are n₁
possible outcomes on the first step, n₂ possible
outcomes on the second step, etc., up to nₖ possible
outcomes on the k-th step, then the total number of
possible outcomes is the product n₁·n₂···nₖ.
b. Consequence. If there are exactly n outcomes at
each step, the total number of possible outcomes from
k steps is nᵏ.
2. Permutations.
a. The number of ways that one can arrange n
objects: n!
b. Pᵣⁿ = n!/(n − r)!
Order counts!
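For example (a quick Python check with hypothetical numbers, not from the notes), the number of ways to arrange 5 books on a shelf and the number of ordered ways to award three distinct medals among 8 runners:

    import math

    print(math.factorial(5))   # 120 arrangements of 5 distinct books
    print(math.perm(8, 3))     # 336 ordered ways to pick 3 medalists from 8 runners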
3. Combinations.
a. Cᵣⁿ = n!/[(n − r)! r!]
Order doesn't count!
b. Probability of getting a given combination
This is the number of ways of getting the specified
combination divided by the total number of possible
combinations.
If there are a equally likely ways to get what you want and
b equally likely possible outcomes, the probability of getting
the outcomes you want is a/b. Example: If there is only one
way to get 4 jacks from 4 jacks in a poker hand and C₁⁴⁸ ways
to get another card, a = 1·C₁⁴⁸. The number of ways to get a
poker hand of 5 cards is b = C₅⁵², so the probability of getting
a poker hand with 4 jacks is a/b = C₁⁴⁸/C₅⁵².
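A minimal Python check of the 4-jacks example, using the same counts as above:

    import math

    a = 1 * math.comb(48, 1)    # the 4 jacks, times 48 choices of a fifth card
    b = math.comb(52, 5)        # all possible 5-card poker hands
    print(a, b, a / b)          # 48 2598960 and roughly 1.85e-05 (about 1 in 54,145)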
J. Random Variables.
1. Definitions.
Discrete and Continuous Random Variables. Finite and infinite
populations. Sampling with replacement.
2. Probability Distribution of a Discrete Random
Variable.
By this we mean either a table with each possible value of a random
variable and the probability of each value (These probabilities better
add to one!) or a formula that will give us these results. We can still
speak of Relative Frequency and define Cumulative Frequency to a
point x₀ as the probability up to that point, i.e. F(x₀) = P(x ≤ x₀).
3. Expected Value (Expectation) of a Discrete Random
Variable. μₓ = E(x) = Σ x·P(x).
Rules for linear functions of a random variable:
a and b are constants. x is a random variable.
a. Eb  b
b. Eax  aEx 
c. Ex  b  Ex  b
d. Eax  b  aEx  b
4. Variance of a Discrete Random Variable.
σₓ² = Var(x) = E[(x − μ)²] = E(x²) − μ²
Rules for linear functions of a random variable:
a. Var(b) = 0
b. Var(ax) = a²·Var(x)
c. Var(x + b) = Var(x)
d. Var(ax + b) = a²·Var(x)
Example -- see 251probex1.
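The following is only a quick numerical check of the variance rules with the same fair-die distribution (not the referenced 251probex1):

    from fractions import Fraction

    dist = {x: Fraction(1, 6) for x in range(1, 7)}   # fair die again
    E = lambda f: sum(f(x) * p for x, p in dist.items())

    mean = E(lambda x: x)                             # 7/2
    var = E(lambda x: (x - mean) ** 2)                # 35/12
    assert var == E(lambda x: x ** 2) - mean ** 2     # Var(x) = E(x squared) - mean squared
    # Rule d: Var(ax + b) = a squared times Var(x); here a = 3, b = 2.
    assert E(lambda x: (3 * x + 2 - (3 * mean + 2)) ** 2) == 9 * var
    print(var)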
5. Summary
a. Rules for Means and Variances of Functions of
Random Variables. 251probex4
b. Standardized Random Variables, z = (x − μ)/σ.
See 251probex2.
6. Continuous Random Variables.
a. Normal Distribution (Overview).
The general formula is f(x) = [1/(σ√(2π))]·exp[−½((x − μ)/σ)²]. Don't try to
memorize or even use this formula. It is much more important
to remember what the normal curve looks like.
b. The Continuous Uniform Distribution.
f x  
1
d c
f x   0
F x  
for
cxd
otherwise.
xc
for c  x  d , F x   0 for x  c and
d c
F x   1 for x  d .
To find probabilities under this distribution, go to 251probex3.
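A minimal sketch of such a computation (my own numbers, not 251probex3): if x is uniform between c = 1 and d = 5, find P(2 < x < 4.5).

    c, d = 1.0, 5.0     # hypothetical limits of the uniform distribution

    def F(x):
        # Cumulative distribution function of the continuous uniform distribution.
        if x < c:
            return 0.0
        if x > d:
            return 1.0
        return (x - c) / (d - c)

    # P(2 < x < 4.5) = F(4.5) - F(2)
    print(F(4.5) - F(2))    # 0.625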
c. Cumulative Distributions, Means and Variances
for Continuous Distributions.
Discrete Distributions:
Cumulative Function: F(x₀) = P(x ≤ x₀) = Σ P(x), summed over all x ≤ x₀
Mean: μ = E(x) = Σ x·P(x)
Variance: σₓ² = E[(x − μ)²] = Σ (x − μ)²·P(x) = Σ x²·P(x) − μ² = E(x²) − μ²

Continuous Distributions:
Cumulative Function: F(x₀) = P(x ≤ x₀) = ∫ f(x) dx, integrated from −∞ to x₀
Mean: μ = E(x) = ∫ x·f(x) dx, integrated over all x
Variance: σₓ² = E[(x − μ)²] = ∫ (x − μ)²·f(x) dx = ∫ x²·f(x) dx − μ² = E(x²) − μ²
Example: For the Continuous Uniform Distribution,
x c
cd
d  c 2 .
(i) F x 0   0
(ii)  
and (iii)  2 
d c
2
12
The proofs below are intended only for those who have had
calculus!
d
1
1 1 2

  E x   x
dx 
x  1 x2 

c
d c
d  c  2 d 2 c 

Proof: (i) 
2
2
 1 d  c  c  d
 2 d c
2




 F x0  
(ii) 
 x 0  c
 d  c


(iii)  x2  

d


x0
c
1
1
dx 
d c
d c
x2
c
1 1 3
x
d  c  3

x0
dx 
c

1
x xc
d  c x0

1

dx   2
d  c 
d
cd 
 13 x 3   

c
  2 
2
13  dd  cc  14 c 2  2cd  d 2 
4
c 2  cd  d 2  123 c 2  2cd  d 2 
 12


3
3
 c
121 c 2  2cd  d 2   d 12
2
6
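Those who have not had calculus can still check (ii) and (iii) numerically. A rough Monte Carlo sketch with hypothetical limits c = 1 and d = 5, so that μ should be 3 and σ² should be 16/12 ≈ 1.333:

    import random

    random.seed(1)
    c, d = 1.0, 5.0
    xs = [random.uniform(c, d) for _ in range(200_000)]

    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    print(round(mean, 3), round(var, 3))   # close to (c + d)/2 = 3 and (d - c)**2/12 = 1.333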
d. Chebyshev's Inequality Again.
P(μ − kσ ≤ x ≤ μ + kσ) ≥ 1 − 1/k², and don't forget the
Empirical Rule.
Proof and extensions
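For example, a short Python comparison (my own illustration) of the Chebyshev bound with the approximate Empirical Rule percentages for bell-shaped data:

    # Chebyshev: at least 1 - 1/k**2 of any distribution lies within k standard
    # deviations of the mean; the Empirical Rule gives the approximate share
    # for roughly bell-shaped (Normal-like) data.
    empirical = {1: 0.68, 2: 0.95, 3: 0.997}
    for k in (1, 2, 3):
        print(k, 1 - 1 / k**2, empirical[k])   # e.g. k = 2: at least 0.75 vs about 0.95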
7. Skewness and Kurtosis (Short Summary).
Skewness: μ₃ = E[(x − μ)³]. For a discrete distribution, this means
Σ (x − μ)³·P(x), and, for a continuous distribution,
∫ (x − μ)³·f(x) dx.
Relative Skewness: γ₁ = μ₃/σ³.
Kurtosis: μ₄ = E[(x − μ)⁴]. For the Normal distribution μ₄ = 3σ⁴.
Coefficient of Excess: γ₂ = (μ₄ − 3σ⁴)/σ⁴.
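As a final sketch (my own example; the fair die again), these moment formulas give zero skewness for a symmetric distribution and a negative coefficient of excess for a distribution flatter than the Normal curve:

    from fractions import Fraction

    dist = {x: Fraction(1, 6) for x in range(1, 7)}   # fair die
    E = lambda f: sum(f(x) * p for x, p in dist.items())

    mu = E(lambda x: x)
    var = E(lambda x: (x - mu) ** 2)                  # sigma squared = 35/12
    mu3 = E(lambda x: (x - mu) ** 3)                  # third central moment
    mu4 = E(lambda x: (x - mu) ** 4)                  # fourth central moment

    gamma1 = float(mu3) / float(var) ** 1.5                         # relative skewness = 0.0
    gamma2 = (float(mu4) - 3 * float(var) ** 2) / float(var) ** 2   # about -1.27
    print(gamma1, gamma2)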