Probability
* Basic Definitions: Events, Sample Space, and Probabilities
* Basic Rules for Probability
* Conditional Probability
* Independence of Events
* Combinatorial Concepts
* The Law of Total Probability and Bayes' Theorem
* Random Variables
Probability
* A measure of uncertainty
* A measure of the strength of belief in the occurrence of an uncertain event
* A measure of the degree of chance or likelihood of occurrence of an uncertain event
* Measured by a number between 0 and 1 (or between 0% and 100%)
Types of Probability
* Objective or Classical Probability
  - based on equally-likely events
  - based on long-run relative frequency of events
  - not based on personal beliefs
  - is the same for all observers (objective)
  - examples: toss a coin, throw a die, pick a card
* Subjective Probability
  - based on personal beliefs, experiences, prejudices, intuition: personal judgment
  - different for all observers (subjective)
  - examples: Super Bowl, elections, new product introduction, snowfall
Basic Definitions
* Set: a collection of elements or objects of interest
* Empty set (denoted by ∅): a set containing no elements
* Universal set (denoted by S): a set containing all possible elements
* Complement (Not): the complement of A, denoted Ā, is a set containing all elements of S not in A
* Intersection (And), A ∩ B: a set containing all elements in both A and B
* Union (Or), A ∪ B: a set containing all elements in A or B or both
Basic Definitions (cont.)
* Mutually exclusive or disjoint sets: sets having no elements in common, having no intersection, whose intersection is the empty set
* Collectively exhaustive: a list of outcomes that includes all possible outcomes
* Partition: a collection of mutually exclusive sets which together include all possible elements, whose union is the universal set
Sets: Diagrams
[Venn diagrams: the complement Ā of a set A; the intersection A ∩ B; the union A ∪ B; and a partition of S into mutually exclusive sets A1, A2, A3, A4, A5]
Experiment
* Process that leads to one of several possible outcomes*, e.g.:
  - Coin toss: Heads, Tails
  - Throw die: 1, 2, 3, 4, 5, 6
  - Pick a card: AH, KH, QH, ...
* Each trial of an experiment has a single observed outcome.
* The precise outcome of a random experiment is unknown before a trial.

* Also called a basic outcome, elementary event, or simple event
Events
* Sample Space or Event Set: set of all possible outcomes (universal set) for a given experiment
  - E.g., throw die: S = {1, 2, 3, 4, 5, 6}
* Event: collection of outcomes having a common characteristic
  - E.g., even number: A = {2, 4, 6}
  - Event A occurs if an outcome in the set A occurs
* Probability of an event: sum of the probabilities of the outcomes of which it consists
  - P(A) = P(2) + P(4) + P(6)
Equally-likely Probabilities
* The probability of each equally-likely outcome is 1/n(S), i.e., 1 over the number of possible outcomes.
* For example, throw a die:
  - Six possible outcomes: 1, 2, 3, 4, 5, 6
  - If each is equally likely, the probability of each is 1/6 ≈ .1667 = 16.67%
* Event A (even number):
  - P(A) = P(2) + P(4) + P(6) = 1/6 + 1/6 + 1/6 = 1/2
  - In general: P(A) = Σ P(e) over all e in A = n(A)/n(S) = 3/6 = 1/2
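The counting rule P(A) = n(A)/n(S) above can be checked with a short Python sketch (not part of the original slides):

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}            # sample space for one throw of a fair die
A = {e for e in S if e % 2 == 0}  # event "even number"

# With equally-likely outcomes, P(A) = n(A) / n(S)
p_A = Fraction(len(A), len(S))
print(p_A)  # 1/2
```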
Pick a Card: Sample Space
[Diagram: the 52-card deck laid out by suit (Hearts, Diamonds, Clubs, Spades) and rank (A, K, Q, J, 10, ..., 2); the union and intersection of the events 'Heart' and 'Ace' are marked]
* Union of events 'Heart' and 'Ace':
  P(Heart ∪ Ace) = n(Heart ∪ Ace)/n(S) = 16/52 = 4/13
* Event 'Ace':
  P(Ace) = n(Ace)/n(S) = 4/52 = 1/13
* Event 'Heart':
  P(Heart) = n(Heart)/n(S) = 13/52 = 1/4
* The intersection of the events 'Heart' and 'Ace' comprises a single point, the ace of hearts:
  P(Heart ∩ Ace) = n(Heart ∩ Ace)/n(S) = 1/52
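These card-deck counts can be reproduced by enumerating the sample space; the following is a sketch in Python (the rank/suit labels are illustrative, not from the slides):

```python
from fractions import Fraction
from itertools import product

ranks = ["A", "K", "Q", "J", "10", "9", "8", "7", "6", "5", "4", "3", "2"]
suits = ["hearts", "diamonds", "clubs", "spades"]
deck = set(product(ranks, suits))  # 52 equally-likely outcomes

hearts = {card for card in deck if card[1] == "hearts"}
aces = {card for card in deck if card[0] == "A"}

def prob(event):
    # P(E) = n(E) / n(S) for equally-likely outcomes
    return Fraction(len(event), len(deck))

print(prob(hearts | aces))  # 4/13  (i.e., 16/52)
print(prob(hearts & aces))  # 1/52
```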
Basic Rules for Probability (1)
* Range of values: 0 ≤ P(A) ≤ 1
* Complements (probability of not A): P(Ā) = 1 − P(A)
* Intersection (probability of both A and B): P(A ∩ B) = n(A ∩ B)/n(S)
* Mutually exclusive events (A and C): P(A ∩ C) = 0
Basic Rules for Probability (2)
* Union (probability of A or B or both):
  P(A ∪ B) = n(A ∪ B)/n(S) = P(A) + P(B) − P(A ∩ B)
* Mutually exclusive events: P(A ∩ C) = 0, so P(A ∪ C) = P(A) + P(C)
* Conditional probability (probability of A given B):
  P(A|B) = P(A ∩ B)/P(B)
* Independent events: P(A|B) = P(A) and P(B|A) = P(B)
Conditional Probability
The conditional probability P(A|B) is the probability of A given B.
Rules of conditional probability:
  P(A|B) = P(A ∩ B)/P(B), so P(A ∩ B) = P(A|B)P(B) = P(B|A)P(A)
If events A and D are statistically independent:
  P(A|D) = P(A) and P(D|A) = P(D), so P(A ∩ D) = P(A)P(D)
Example (Contingency Table)

Counts:
                      AT&T    IBM    Total
  Telecommunication     40     10       50
  Computers             20     30       50
  Total                 60     40      100

Probabilities:
                      AT&T    IBM    Total
  Telecommunication    .40    .10      .50
  Computers            .20    .30      .50
  Total                .60    .40     1.00

Probability that a project is undertaken by IBM given it is a telecommunications project:
  P(IBM|T) = P(IBM ∩ T)/P(T) = .10/.50 = .2
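The conditional-probability calculation from this contingency table can be sketched in Python (dictionary keys are illustrative labels for the table's rows and columns):

```python
# Joint probabilities read off the contingency table
joint = {
    ("Telecom", "AT&T"): 0.40, ("Telecom", "IBM"): 0.10,
    ("Computers", "AT&T"): 0.20, ("Computers", "IBM"): 0.30,
}

# Marginal P(T): sum the Telecom row
p_T = sum(p for (industry, firm), p in joint.items() if industry == "Telecom")
p_IBM_and_T = joint[("Telecom", "IBM")]

# P(IBM | T) = P(IBM and T) / P(T) = .10 / .50
p_IBM_given_T = p_IBM_and_T / p_T
print(round(p_IBM_given_T, 2))  # 0.2
```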
Independence of Events
Conditions for the statistical independence of events A and B:
  P(A|B) = P(A),  P(B|A) = P(B),  and  P(A ∩ B) = P(A)P(B)
For the card example:
  P(Ace|Heart) = P(Ace ∩ Heart)/P(Heart) = (1/52)/(13/52) = 1/13 = P(Ace)
  P(Heart|Ace) = P(Heart ∩ Ace)/P(Ace) = (1/52)/(4/52) = 1/4 = P(Heart)
  P(Ace ∩ Heart) = 1/52 = (4/52)(13/52) = P(Ace)P(Heart)
so the events 'Ace' and 'Heart' are independent.
Combinatorial Concepts
Consider a pair of six-sided dice. There are six possible outcomes
from throwing the first die (1,2,3,4,5,6) and six possible outcomes
from throwing the second die (1,2,3,4,5,6). Altogether, there are
6*6=36 possible outcomes from throwing the two dice.
In general, if there are n events and event i can happen in N_i possible ways, then the number of ways in which the sequence of n events may occur is N_1 · N_2 · ... · N_n.
* Pick 5 cards from a deck of 52, with replacement:
  52 · 52 · 52 · 52 · 52 = 52^5 = 380,204,032 different possible outcomes
* Pick 5 cards from a deck of 52, without replacement:
  52 · 51 · 50 · 49 · 48 = 311,875,200 different possible outcomes
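The multiplication rule above is easy to verify directly; a minimal Python sketch:

```python
# Ordered draws of 5 cards from a 52-card deck, via the multiplication rule
with_replacement = 52 ** 5  # each draw has 52 possibilities

without_replacement = 1
for n in range(52, 47, -1):  # 52 * 51 * 50 * 49 * 48
    without_replacement *= n

print(with_replacement)     # 380204032
print(without_replacement)  # 311875200
```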
Factorial
How many ways can you order the 3 letters A, B, and C?
There are 3 choices for the first letter, 2 for the second, and 1 for
the last, so there are 3*2*1 = 6 possible ways to order the three
letters A, B, and C.
How many ways are there to order the 6 letters A, B, C, D, E,
and F? (6*5*4*3*2*1 = 720)
Factorial: For any positive integer n, we define n factorial as:
n(n-1)(n-2)...(1). We denote n factorial as n!.
The number n! is the number of ways in which n objects can be ordered. By definition, 1! = 1, and we also define 0! = 1.
Permutations
What if we chose only 3 out of the 6 letters A, B, C, D, E, and F?
There are 6 ways to choose the first letter, 5 ways to choose the
second letter, and 4 ways to choose the third letter (leaving 3
letters unchosen). That makes 6*5*4=120 possible orderings or
permutations.
Permutations are the possible ordered selections of r objects out
of a total of n objects. The number of permutations of n objects
taken r at a time is denoted nPr.
  nPr = n!/(n − r)!

For example:
  6P3 = 6!/(6 − 3)! = 6!/3! = (6 · 5 · 4 · 3 · 2 · 1)/(3 · 2 · 1) = 6 · 5 · 4 = 120
Combinations
Suppose that when we picked 3 letters out of the 6 letters A, B, C, D, E, and F we chose BCD, or BDC, or CBD, or CDB, or DBC, or DCB. (These are the 6 = 3! permutations or orderings of the 3 letters B, C, and D.) But these are orderings of the same combination of 3 letters. How many combinations of 6 different letters, taking 3 at a time, are there?

Combinations are the possible selections of r items from a group of n items regardless of the order of selection. The number of combinations is denoted (n choose r) and is read "n choose r". An alternative notation is nCr. We define the number of combinations of r out of n elements as:

  nCr = n!/(r!(n − r)!)

For example:
  6C3 = 6!/(3!(6 − 3)!) = 6!/(3! 3!) = (6 · 5 · 4 · 3 · 2 · 1)/((3 · 2 · 1)(3 · 2 · 1)) = (6 · 5 · 4)/(3 · 2 · 1) = 120/6 = 20
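Python's standard library computes all three counting quantities directly, which makes a quick check of the worked examples straightforward:

```python
import math

# Orderings of 6 letters, permutations of 3 out of 6, combinations of 3 out of 6
print(math.factorial(6))  # 720
print(math.perm(6, 3))    # 120  (6P3 = 6!/3!)
print(math.comb(6, 3))    # 20   (6C3 = 6!/(3!3!))
```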
The Law of Total Probability and Bayes' Theorem
The law of total probability:
  P(A) = P(A ∩ B) + P(A ∩ B̄)
In terms of conditional probabilities:
  P(A) = P(A ∩ B) + P(A ∩ B̄) = P(A|B)P(B) + P(A|B̄)P(B̄)
More generally (where the B_i make up a partition):
  P(A) = Σ_i P(A ∩ B_i) = Σ_i P(A|B_i)P(B_i)
Example - The Law of Total Probability
Event U: Stock market will go up in the next year
Event W: Economy will do well in the next year
  P(U|W) = .75
  P(U|W̄) = .30
  P(W) = .80, so P(W̄) = 1 − .8 = .2
  P(U) = P(U ∩ W) + P(U ∩ W̄)
       = P(U|W)P(W) + P(U|W̄)P(W̄)
       = (.75)(.80) + (.30)(.20)
       = .60 + .06 = .66
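The same total-probability computation in a few lines of Python:

```python
# Law of total probability: P(U) = P(U|W)P(W) + P(U|W')P(W')
p_W = 0.80
p_U_given_W = 0.75
p_U_given_notW = 0.30

p_U = p_U_given_W * p_W + p_U_given_notW * (1 - p_W)
print(round(p_U, 2))  # 0.66
```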
Bayes' Theorem
* Bayes' theorem enables you, knowing just a little more than the probability of A given B, to find the probability of B given A.
* It is based on the definition of conditional probability and the law of total probability.

  P(B|A) = P(A ∩ B)/P(A)
         = P(A ∩ B)/[P(A ∩ B) + P(A ∩ B̄)]          (applying the law of total probability to the denominator)
         = P(A|B)P(B)/[P(A|B)P(B) + P(A|B̄)P(B̄)]   (applying the definition of conditional probability throughout)
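Bayes' theorem as stated can be wrapped in a small function; the input probabilities below are made-up numbers purely for illustration:

```python
def bayes(p_B, p_A_given_B, p_A_given_notB):
    """P(B|A) = P(A|B)P(B) / [P(A|B)P(B) + P(A|not B)P(not B)]."""
    # Denominator via the law of total probability: P(A)
    p_A = p_A_given_B * p_B + p_A_given_notB * (1 - p_B)
    return p_A_given_B * p_B / p_A

# Hypothetical example: prior P(B) = .01, P(A|B) = .95, P(A|not B) = .05
print(round(bayes(0.01, 0.95, 0.05), 3))  # 0.161
```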
Random Variables
A random variable assigns a real number to every outcome or event in an experiment.
* If the outcome is already numerical or quantitative, the outcome number itself can serve as the random variable.
* If the outcome is not a number, define the RV so that every outcome corresponds to a unique number.
Random variables may be discrete or continuous.
Example
Consider the experiment of tossing two six-sided dice. There are 36 possible outcomes. Let the random variable X represent the sum of the numbers on the two dice:

  x      2     3     4     5     6     7     8     9    10    11    12
  P(x)* 1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36   (sums to 1)

[Figure: the 36 outcomes (1,1), (1,2), ..., (6,6) grouped by their sum x, and a bar chart of the probability distribution of the sum of two dice]

* Note that P(x) = (6 − |7 − x|)/36.
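The distribution of the sum of two dice can be built by enumerating all 36 outcomes; the following sketch also checks the closed-form note P(x) = (6 − |7 − x|)/36:

```python
from collections import Counter
from fractions import Fraction
from itertools import product

# X = sum of two fair dice: count how many of the 36 outcomes give each sum
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
pmf = {x: Fraction(c, 36) for x, c in sorted(counts.items())}

print(pmf[7])  # 1/6  (i.e., 6/36, the most likely sum)

# The tabulated probabilities match the formula (6 - |7 - x|)/36
assert all(pmf[x] == Fraction(6 - abs(7 - x), 36) for x in pmf)
```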
Example
Given the probability distribution of the number of switches:

  x      0    1    2    3    4    5
  P(x)  0.1  0.2  0.3  0.2  0.1  0.1   (sums to 1)

[Figure: bar chart of P(x) for x = 0, ..., 5]

Probability of more than 2:
  P(X > 2) = P(3) + P(4) + P(5) = 0.2 + 0.1 + 0.1 = 0.4
Probability of at least 1:
  P(X ≥ 1) = 1 − P(0) = 1 − 0.1 = .9
Discrete and Continuous Random Variables
A discrete random variable:
* has a countable number of possible values
* has discrete jumps between successive values
* has measurable probability associated with individual values
* counts
A continuous random variable:
* has an uncountably infinite number of possible values
* moves continuously from value to value
* has no measurable probability associated with each individual value
* measures (e.g., height, weight, speed, value, duration, length)
Discrete Probability Distributions (Probability Mass Function)
The probability distribution of a discrete random variable X, sometimes called the probability mass function, must satisfy the following two conditions:
1. P(x) ≥ 0 for all values of x.
2. Σ P(x) = 1, where the sum is over all x.
Corollary: 0 ≤ P(x) ≤ 1
Cumulative Distribution Function (Probability Distribution)
The cumulative distribution function, F(x), of a discrete random variable X is:
  F(x) = P(X ≤ x) = Σ P(i), summing over all i ≤ x

  x      0    1    2    3    4    5
  P(x)  0.1  0.2  0.3  0.2  0.1  0.1   (sums to 1)
  F(x)  0.1  0.3  0.6  0.8  0.9  1.0
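The F(x) row is just a running sum of the P(x) row; a minimal sketch:

```python
from itertools import accumulate

# Cumulative distribution F(x) from the switches pmf
xs = [0, 1, 2, 3, 4, 5]
pmf = [0.1, 0.2, 0.3, 0.2, 0.1, 0.1]
cdf = list(accumulate(pmf))  # F(x) = P(X <= x)

print([round(F, 1) for F in cdf])  # [0.1, 0.3, 0.6, 0.8, 0.9, 1.0]
```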
Example
The probability that at most three switches will occur:

  x      0    1    2    3    4    5
  P(x)  0.1  0.2  0.3  0.2  0.1  0.1
  F(x)  0.1  0.3  0.6  0.8  0.9  1.0

  P(X ≤ 3) = F(3) = 0.8

[Figure: bar chart of P(x) with the bars for x = 0, 1, 2, 3 shaded, titled "The Probability That at Most Three Switches Will Occur"]
Expected Values of Discrete Random Variables
The mean of a probability distribution is a measure of its centrality or location. The mean is also known as the expected value (or expectation) of a random variable, because it is the value that is expected to occur, on average.

The expected value of a discrete random variable X is equal to the sum of each value of the random variable multiplied by its probability:
  E(X) = Σ x P(x)

  x     P(x)   xP(x)
  0     0.1    0.0
  1     0.2    0.2
  2     0.3    0.6
  3     0.2    0.6
  4     0.1    0.4
  5     0.1    0.5
        1.0    2.3 = E(X)
Expected Value of a Function of a Discrete Random Variable
The expected value of a function g of a discrete random variable X is:
  E[g(X)] = Σ g(x) P(x)

Example: Monthly sales of a certain product are believed to follow the probability distribution given below. Suppose the company has a fixed monthly production cost of $8000 and that each item brings $2. Find the expected monthly profit from product sales. Here the profit function is g(x) = 2x − 8000.

  Number of items, x   P(x)   xP(x)   g(x)    g(x)P(x)
  5000                 0.2    1000     2000     400
  6000                 0.3    1800     4000    1200
  7000                 0.2    1400     6000    1200
  8000                 0.2    1600     8000    1600
  9000                 0.1     900    10000    1000
                       1.0    6700             5400

  E[g(X)] = Σ g(x) P(x) = 5400

Note that E(X) = 6700 and, because g is linear, E[g(X)] = 2 E(X) − 8000 = 5400.
Variance and Standard Deviation of a Random Variable
The variance of a random variable is the expected squared deviation from the mean:
  σ² = V(X) = E[(X − μ)²] = Σ (x − μ)² P(x)
            = E(X²) − [E(X)]² = Σ x² P(x) − [Σ x P(x)]²
The standard deviation of a random variable is the square root of its variance:
  σ = SD(X) = √V(X)
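Applying these formulas to the switches distribution used earlier gives a quick numerical check:

```python
# Mean, variance, and standard deviation of the switches distribution
xs = [0, 1, 2, 3, 4, 5]
pmf = [0.1, 0.2, 0.3, 0.2, 0.1, 0.1]

mean = sum(x * p for x, p in zip(xs, pmf))              # E(X)
var = sum(x**2 * p for x, p in zip(xs, pmf)) - mean**2  # E(X^2) - [E(X)]^2
sd = var ** 0.5

print(round(mean, 2), round(var, 2), round(sd, 2))  # 2.3 2.01 1.42
```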
The Binomial Distribution
Bernoulli trials are a sequence of n identical trials satisfying the
following conditions:
1. Each trial has two possible outcomes, called success and failure.
The two outcomes are mutually exclusive and exhaustive.
2. The probability of success, denoted by p, remains constant from trial
to trial. The probability of failure is denoted by q, where q = 1-p.
3. The n trials are independent. That is, the outcome of any trial does
not affect the outcomes of the other trials.
A random variable, X, that counts the number of successes in n
Bernoulli trials, where p is the probability of success* in any given trial,
is said to follow the binomial probability distribution with parameters
n (number of trials) and p (probability of success). We call X the
binomial random variable.
Binomial Probabilities (Continued)
In general:
1. The probability of a given sequence of x successes out of n trials, with probability of success p and probability of failure q, is equal to p^x q^(n−x).
2. The number of different sequences of n trials that result in exactly x successes is equal to the number of choices of x elements out of a total of n elements. This number is denoted:
  nCx = (n choose x) = n!/(x!(n − x)!)
The Binomial Probability Distribution
The binomial probability distribution:
  P(x) = (n choose x) p^x q^(n−x) = [n!/(x!(n − x)!)] p^x q^(n−x)
where:
  p is the probability of success in a single trial,
  q = 1 − p,
  n is the number of trials, and
  x is the number of successes.

  Number of successes, x   Probability P(x)
  0                        [n!/(0!(n − 0)!)] p^0 q^(n−0)
  1                        [n!/(1!(n − 1)!)] p^1 q^(n−1)
  2                        [n!/(2!(n − 2)!)] p^2 q^(n−2)
  3                        [n!/(3!(n − 3)!)] p^3 q^(n−3)
  ...                      ...
  n                        [n!/(n!(n − n)!)] p^n q^(n−n)
  Total                    1.00
Mean, Variance, and Standard Deviation of the Binomial Distribution
Mean of a binomial distribution:
  μ = E(X) = np
Variance of a binomial distribution:
  σ² = V(X) = npq
Standard deviation of a binomial distribution:
  σ = SD(X) = √(npq)
For example, if H counts the number of heads in five tosses of a fair coin:
  μ_H = E(H) = (5)(.5) = 2.5
  σ²_H = V(H) = (5)(.5)(.5) = 1.25
  σ_H = SD(H) = √1.25 ≈ 1.118
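The coin-toss example can be verified by building the binomial pmf from the formula and computing its mean and variance directly:

```python
import math

n, p = 5, 0.5  # five tosses of a fair coin; H = number of heads
q = 1 - p

# Binomial pmf: P(x) = C(n, x) p^x q^(n-x)
pmf = [math.comb(n, x) * p**x * q**(n - x) for x in range(n + 1)]

mean = sum(x * px for x, px in enumerate(pmf))             # should equal np
var = sum(x**2 * px for x, px in enumerate(pmf)) - mean**2 # should equal npq

print(mean, var)  # 2.5 1.25
```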
Continuous Random Variables
A continuous random variable is a random variable that can take on any value in an interval of numbers.
The probabilities associated with a continuous random variable X are determined by the probability density function of the random variable. The function, denoted f(x), has the following properties:
1. f(x) ≥ 0 for all x.
2. The probability that X will be between two numbers a and b is equal to the area under f(x) between a and b.
3. The total area under the curve of f(x) is equal to 1.00.
The cumulative distribution function of a continuous random variable:
  F(x) = P(X ≤ x) = area under f(x) between the smallest possible value of X (often −∞) and the point x.
Probability Density Function and Cumulative Distribution Function
[Figure: the CDF F(x) rising from 0 to 1, with F(a) and F(b) marked; below it, the density f(x) with the area between a and b shaded]
  P(a ≤ X ≤ b) = F(b) − F(a) = area under f(x) between a and b
Normal Distribution: Introduction
As n increases, the binomial distribution approaches a normal distribution.
[Figures: binomial distributions with p = .5 for n = 6, n = 10, and n = 14, alongside the normal distribution with μ = 0, σ = 1]
The Normal Probability Distribution
The normal probability density function:
  f(x) = [1/√(2πσ²)] e^{−(x − μ)²/(2σ²)},  for −∞ < x < ∞
where e = 2.7182818... and π = 3.14159265...
[Figure: Normal Distribution with μ = 0, σ = 1]
The Normal Probability Distribution
* The normal is a family of bell-shaped and symmetric distributions. Because the distribution is symmetric, one-half (.50 or 50%) lies on either side of the mean.
* Each is characterized by a different pair of mean, μ, and variance, σ². That is: X~N(μ, σ²).
* Each is asymptotic to the horizontal axis.
Normal Probability Distributions
All of these are normal probability density functions, though each has a different mean and variance:
[Figures: W~N(40,1), X~N(30,25), Y~N(50,9), and Z~N(0,1)]
Consider:
  P(39 ≤ W ≤ 41),  P(25 ≤ X ≤ 35),  P(47 ≤ Y ≤ 53),  P(−1 ≤ Z ≤ 1)
The probability in each case is an area under a normal probability density function.
The Standard Normal Distribution
The standard normal random variable, Z, is the normal random variable with mean μ = 0 and standard deviation σ = 1: Z~N(0,1²).
[Figure: the standard normal density, centered at μ = 0 with σ = 1]
Finding Probabilities of the Standard Normal Distribution: P(0 ≤ Z ≤ 1.56)
Look in the row labeled 1.5 and the column labeled .06 of the standard normal table to find
  P(0 ≤ Z ≤ 1.56) = .4406
[Figure: standard normal density with the area between 0 and 1.56 shaded]

Standard Normal Probabilities: P(0 ≤ Z ≤ z)

  z     .00     .01     .02     .03     .04     .05     .06     .07     .08     .09
  0.0  0.0000  0.0040  0.0080  0.0120  0.0160  0.0199  0.0239  0.0279  0.0319  0.0359
  0.1  0.0398  0.0438  0.0478  0.0517  0.0557  0.0596  0.0636  0.0675  0.0714  0.0753
  0.2  0.0793  0.0832  0.0871  0.0910  0.0948  0.0987  0.1026  0.1064  0.1103  0.1141
  0.3  0.1179  0.1217  0.1255  0.1293  0.1331  0.1368  0.1406  0.1443  0.1480  0.1517
  0.4  0.1554  0.1591  0.1628  0.1664  0.1700  0.1736  0.1772  0.1808  0.1844  0.1879
  0.5  0.1915  0.1950  0.1985  0.2019  0.2054  0.2088  0.2123  0.2157  0.2190  0.2224
  0.6  0.2257  0.2291  0.2324  0.2357  0.2389  0.2422  0.2454  0.2486  0.2517  0.2549
  0.7  0.2580  0.2611  0.2642  0.2673  0.2704  0.2734  0.2764  0.2794  0.2823  0.2852
  0.8  0.2881  0.2910  0.2939  0.2967  0.2995  0.3023  0.3051  0.3078  0.3106  0.3133
  0.9  0.3159  0.3186  0.3212  0.3238  0.3264  0.3289  0.3315  0.3340  0.3365  0.3389
  1.0  0.3413  0.3438  0.3461  0.3485  0.3508  0.3531  0.3554  0.3577  0.3599  0.3621
  1.1  0.3643  0.3665  0.3686  0.3708  0.3729  0.3749  0.3770  0.3790  0.3810  0.3830
  1.2  0.3849  0.3869  0.3888  0.3907  0.3925  0.3944  0.3962  0.3980  0.3997  0.4015
  1.3  0.4032  0.4049  0.4066  0.4082  0.4099  0.4115  0.4131  0.4147  0.4162  0.4177
  1.4  0.4192  0.4207  0.4222  0.4236  0.4251  0.4265  0.4279  0.4292  0.4306  0.4319
  1.5  0.4332  0.4345  0.4357  0.4370  0.4382  0.4394  0.4406  0.4418  0.4429  0.4441
  1.6  0.4452  0.4463  0.4474  0.4484  0.4495  0.4505  0.4515  0.4525  0.4535  0.4545
  1.7  0.4554  0.4564  0.4573  0.4582  0.4591  0.4599  0.4608  0.4616  0.4625  0.4633
  1.8  0.4641  0.4649  0.4656  0.4664  0.4671  0.4678  0.4686  0.4693  0.4699  0.4706
  1.9  0.4713  0.4719  0.4726  0.4732  0.4738  0.4744  0.4750  0.4756  0.4761  0.4767
  2.0  0.4772  0.4778  0.4783  0.4788  0.4793  0.4798  0.4803  0.4808  0.4812  0.4817
  2.1  0.4821  0.4826  0.4830  0.4834  0.4838  0.4842  0.4846  0.4850  0.4854  0.4857
  2.2  0.4861  0.4864  0.4868  0.4871  0.4875  0.4878  0.4881  0.4884  0.4887  0.4890
  2.3  0.4893  0.4896  0.4898  0.4901  0.4904  0.4906  0.4909  0.4911  0.4913  0.4916
  2.4  0.4918  0.4920  0.4922  0.4925  0.4927  0.4929  0.4931  0.4932  0.4934  0.4936
  2.5  0.4938  0.4940  0.4941  0.4943  0.4945  0.4946  0.4948  0.4949  0.4951  0.4952
  2.6  0.4953  0.4955  0.4956  0.4957  0.4959  0.4960  0.4961  0.4962  0.4963  0.4964
  2.7  0.4965  0.4966  0.4967  0.4968  0.4969  0.4970  0.4971  0.4972  0.4973  0.4974
  2.8  0.4974  0.4975  0.4976  0.4977  0.4977  0.4978  0.4979  0.4979  0.4980  0.4981
  2.9  0.4981  0.4982  0.4982  0.4983  0.4984  0.4984  0.4985  0.4985  0.4986  0.4986
  3.0  0.4987  0.4987  0.4987  0.4988  0.4988  0.4989  0.4989  0.4989  0.4990  0.4990
Finding Probabilities of the Standard Normal Distribution: P(Z < -2.47)
To find P(Z < −2.47):
  Find the table area for 2.47: P(0 < Z < 2.47) = .4932
  By symmetry, P(Z < −2.47) = .5 − P(0 < Z < 2.47) = .5 − .4932 = 0.0068
[Figure: standard normal density showing the table area for 2.47 (0.4932) and the shaded area to the left of −2.47 (0.0068)]
Finding Probabilities of the Standard Normal Distribution: P(1 ≤ Z ≤ 2)
To find P(1 ≤ Z ≤ 2):
1. Find the table area for 2.00: F(2) = P(Z ≤ 2.00) = .5 + .4772 = .9772
2. Find the table area for 1.00: F(1) = P(Z ≤ 1.00) = .5 + .3413 = .8413
3. P(1 ≤ Z ≤ 2.00) = P(Z ≤ 2.00) − P(Z ≤ 1.00) = .9772 − .8413 = .1359
[Figure: standard normal density with the area between 1 and 2 (0.1359) shaded]
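Instead of a printed table, the standard normal CDF can be computed from the error function in Python's math module; this sketch reproduces the three table lookups above:

```python
import math

def phi(z):
    """Standard normal CDF: Phi(z) = (1 + erf(z / sqrt(2))) / 2."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

print(round(phi(1.56) - phi(0), 4))  # 0.4406  -> P(0 <= Z <= 1.56)
print(round(phi(-2.47), 4))          # 0.0068  -> P(Z < -2.47)
print(round(phi(2) - phi(1), 4))     # 0.1359  -> P(1 <= Z <= 2)
```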
The Transformation of Normal Random Variables
The area within k standard deviations of the mean is the same for all normal random variables. So an area under any normal distribution is equivalent to an area under the standard normal. In this example, P(40 ≤ X ≤ 60) = P(−1 ≤ Z ≤ 1), since μ = 50 and σ = 10.
The transformation of X to Z:
  Z = (X − μ_X)/σ_X
[Figure: X~N(50,10²) mapped onto the standard normal Z~N(0,1) by (1) subtraction, X − μ_X, and (2) division by σ_X]
The inverse transformation of Z to X:
  X = μ_X + Zσ_X
Using the Normal Transformation
Example 4-1: X~N(160, 30²)
  P(100 ≤ X ≤ 180) = P((100 − 160)/30 ≤ Z ≤ (180 − 160)/30)
                   = P(−2 ≤ Z ≤ .6667)
                   = .4772 + .2475 = .7247
Example 4-2: X~N(127, 22²)
  P(X < 150) = P(Z < (150 − 127)/22)
             = P(Z < 1.045)
             = .5 + .3520 = .8520
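Both examples follow the same pattern of standardizing and taking a difference of CDF values, which can be sketched as (the exact decimals differ slightly from the table-lookup answers because the table rounds to four places):

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def p_between(a, b, mu, sigma):
    # Transform X to Z and take the difference of CDF values
    return phi((b - mu) / sigma) - phi((a - mu) / sigma)

print(round(p_between(100, 180, 160, 30), 3))  # Example 4-1: ~0.725
print(round(phi((150 - 127) / 22), 3))         # Example 4-2: ~0.852
```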
The Transformation of Normal Random Variables
The transformation of X to Z:
  Z = (X − μ_X)/σ_X
The inverse transformation of Z to X:
  X = μ_X + Zσ_X
The transformation of X to Z, where a and b are numbers:
  P(X < a) = P(Z < (a − μ)/σ)
  P(X > b) = P(Z > (b − μ)/σ)
  P(a < X < b) = P((a − μ)/σ < Z < (b − μ)/σ)