Lecture 5: Mathematical Expectation
Assist. Prof. Dr. Emel YAVUZ DUMAN
MCB1007 Introduction to Probability and Statistics
İstanbul Kültür University
Outline
1. Introduction
2. The Expected Value of a Random Variable
3. Moments
4. Chebyshev's Theorem
5. Moment Generating Functions
6. Product Moments
7. Conditional Expectation
Introduction
A very important concept in probability and statistics is that of the
mathematical expectation, expected value, or briefly the
expectation, of a random variable. Originally, the concept of a
mathematical expectation arose in connection with games of
chance, and in its simplest form it is the product of the amount a
player stands to win and the probability that he or she will win. For
instance, if we hold one of 10,000 tickets in a raffle for which the
grand prize is a trip worth $4,800, our mathematical expectation is

4,800 · (1/10,000) = $0.48.

This amount will have to be interpreted in the sense of an average:
altogether the 10,000 tickets pay $4,800, or on the average
$4,800/10,000 = $0.48 per ticket.
The Expected Value of a Random Variable
Definition 1
If X is a discrete random variable and f(x) is the value of its
probability distribution at x, the expected value of X is

E(X) = Σ_x x f(x).

Correspondingly, if X is a continuous random variable and f(x) is
the value of its probability density at x, the expected value of X is

E(X) = ∫_{−∞}^{∞} x f(x) dx.

In this definition it is assumed, of course, that the sum or the
integral exists; otherwise, the mathematical expectation is
undefined.
Example 2
Imagine a game in which, on any play, a player has a 20% chance
of winning $3 and an 80% chance of losing $1. The probability
distribution of the random variable X, the amount won or lost on a
single play, is

x      $3    −$1
f(x)   0.2   0.8

and so the average amount won (actually lost, since it is negative)
in the long run is

E(X) = ($3)(0.2) + (−$1)(0.8) = −$0.20.

What does "in the long run" mean? If you play, are you
guaranteed to lose no more than 20 cents?
Solution. If you play and lose, you are guaranteed to lose $1. An
expected loss of 20 cents means that if you played the game over
and over and over and over · · · again, the average of your $3
winnings and your $1 losses would be a 20 cent loss. “In the long
run” means that you can’t draw conclusions about one or two
plays, but rather thousands and thousands of plays.
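
To see the long-run behavior concretely, here is a minimal Python sketch
(illustrative, not part of the original lecture) that simulates many plays
and compares the running average with E(X) = −$0.20:

    import random

    random.seed(1)                  # reproducible run
    n = 100_000
    total = sum(3 if random.random() < 0.2 else -1 for _ in range(n))
    print(total / n)                # close to -0.20 for large n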
Example 3
A lot of 12 television sets includes 2 with white cords. If three of
the sets are chosen at random for shipment to a hotel, how many
sets with white cords can the shipper expect to send to the hotel?
Solution. Since x of the two sets with white cords and 3 − x of
the 10 other sets can be chosen in C(2, x) · C(10, 3 − x) ways, three
of the 12 sets can be chosen in C(12, 3) ways, and these C(12, 3)
possibilities are presumably equiprobable, we find that the
probability distribution of X, the number of sets with white cords
shipped to the hotel, is given by

f(x) = C(2, x) · C(10, 3 − x) / C(12, 3)   for x = 0, 1, 2
or, in tabular form,

x      0      1      2
f(x)   6/11   9/22   1/22

Now,

E(X) = Σ_{x=0}^{2} x f(x) = 0 · 6/11 + 1 · 9/22 + 2 · 1/22 = 1/2

and since half a set cannot possibly be shipped, it should be clear
that the term "expect" is not used in its colloquial sense. Indeed,
it should be interpreted as an average pertaining to repeated
shipments made under the given conditions.
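
As a quick numerical check (a sketch, not part of the lecture), the
hypergeometric distribution above can be tabulated in Python:

    from math import comb

    f = {x: comb(2, x) * comb(10, 3 - x) / comb(12, 3) for x in range(3)}
    print(f)                                 # {0: 6/11, 1: 9/22, 2: 1/22} as floats
    print(sum(x * p for x, p in f.items()))  # 0.5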
Example 4
A saleswoman has been offered a new job with a fixed salary of
$290. Her records from her present job show that her weekly
commissions have the following probabilities:

commission    0      $100   $200   $300   $400
probability   0.05   0.15   0.25   0.45   0.1

Should she change jobs?
Solution. The expected value (expected income) from her present
job is

($0 × 0.05) + ($100 × 0.15) + ($200 × 0.25) + ($300 × 0.45) + ($400 × 0.1) = $240.

If she is making a decision only on the amount of money earned,
then clearly she ought to change jobs.
Example 5
Dan, Eric, and Nick decide to play a fun game. Each player puts
$2 on the table and secretly writes down either heads or tails.
Then one of them tosses a fair coin. The $6 on the table is divided
evenly among the players who correctly predicted the outcome of
the coin toss. If everyone guessed incorrectly, then everyone takes
their money back. After many repetitions of this game, Dan has
lost a lot of money - more than can be explained by bad luck.
What’s going on?
Solution. A tree diagram for this problem is worked out below,
under the assumptions that everyone guesses correctly with
probability 1/2 and everyone is correct independently.
Introduction The Expected Value of a Random Variable Moments Chebyshev’s Theorem Moment Generating Functions Product
Nick right?
Eric right?
Y
1/2
Y
Dan right?
Y
1/2
1/2
N
1/2
Y
1/2
N
1/2
1/2
N
Y
1/2
1/2
N
Y
1/2
1/2
N
1/2
Y
N
1/2
1/2
N
Dan’s payoff
Probability
$0
1/8
$1
1/8
$1
1/8
$4
1/8
−$2
1/8
−$2
1/8
−$2
1/8
$0
1/8
In the payoff column, we are accounting for the fact that Dan has
to put in $2 just to play. So, for example, if he guesses correctly
and Eric and Nick are wrong, then he takes all $6 on the table, but
his net profit is only $4. Working from the tree diagram, Dan's
expected payoff is:

E(payoff) = 0 · 1/8 + 1 · 1/8 + 1 · 1/8 + 4 · 1/8 + (−2) · 1/8
          + (−2) · 1/8 + (−2) · 1/8 + 0 · 1/8 = 0.

So the game is perfectly fair! Over time, he should neither win nor
lose money.
A game is called fair if the expected value of the game is zero.
Let us say that Nick and Eric are collaborating; in particular, they
always make opposite guesses. So our assumption that everyone is
correct independently is wrong; actually the events that Nick is
correct and Eric is correct are mutually exclusive! As a result, Dan
can never win all the money on the table. When he guesses
correctly, he always has to split his winnings with someone else.
This lowers his overall expectation, as the corrected tree diagram
below shows:
[Corrected tree diagram: because Nick and Eric always guess
oppositely, the branches on which they are both right or both
wrong now have probability 0.]

Dan's payoff   $0   $1    $1    $4   −$2   −$2   −$2   $0
Probability    0    1/4   1/4   0    0     1/4   1/4   0
From this revised tree diagram, we can work out Dan's actual
expected payoff:

E(payoff) = 0 · 0 + 1 · 1/4 + 1 · 1/4 + 4 · 0 + (−2) · 0
          + (−2) · 1/4 + (−2) · 1/4 + 0 · 0 = −1/2.

So he loses an average of a half-dollar per game!
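
A short simulation (an illustrative sketch, not from the lecture) confirms
the half-dollar loss once Eric and Nick always guess oppositely:

    import random

    random.seed(1)
    n, total = 100_000, 0
    for _ in range(n):
        coin = random.choice("HT")
        dan = random.choice("HT")
        eric = random.choice("HT")
        nick = "H" if eric == "T" else "T"          # always the opposite guess
        winners = sum(g == coin for g in (dan, eric, nick))
        if dan == coin:
            total += 6 / winners - 2                # split the $6 pot, minus the $2 stake
        elif winners:
            total += -2                             # someone else takes the pot
    print(total / n)                                # close to -0.5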
Example 6
Consider the following game in which two players each spin a fair
wheel in turn.

[Figure: Player A's wheel shows the numbers 2, 4, and 12; Player
B's wheel shows the numbers 5 and 6.]

The player that spins the higher number wins, and the loser has to
pay the winner the difference of the two numbers (in dollars). Who
would win more often? What is that player's expected winnings?
Solution. Assuming 3 equally likely outcomes for spinning A's
wheel, and 2 equally likely outcomes for B's wheel, there are then
3 × 2 = 6 possible equally likely outcomes for this game:

Sample Space
B \ A   2        4        12
5       (2, 5)   (4, 5)   (12, 5)
6       (2, 6)   (4, 6)   (12, 6)

We see here that player A wins in 2/6 games, and player B wins in
4/6 games, so player B wins the most often. But let's compute
the expected winnings for player B:

Winnings, X, for Player B
B \ A   2    4    12
5       $3   $1   −$7
6       $4   $2   −$6
Each of these possible amounts won has the same probability
(1/6). The expected winnings for player B are then simply:

E(X) = $3 · 1/6 + $4 · 1/6 + $1 · 1/6 + $2 · 1/6 + (−$7) · 1/6 + (−$6) · 1/6 = −$0.50.

Hence, we see that even though player B is more likely to win on a
single game, his expected winnings are actually negative. If he keeps
on playing the game, he will eventually start losing money!
Example 7
Find the expected value of the random variable X whose
probability density is given by

f(x) = x       for 0 < x < 1,
       2 − x   for 1 ≤ x < 2,
       0       elsewhere.
Solution.

E(X) = ∫_{−∞}^{∞} x f(x) dx
     = ∫_{−∞}^{0} x f(x) dx + ∫_{0}^{1} x · x dx + ∫_{1}^{2} x(2 − x) dx + ∫_{2}^{∞} x f(x) dx
     = [x³/3]_{0}^{1} + [x² − x³/3]_{1}^{2}
     = 1/3 + (2² − 2³/3) − (1 − 1/3) = 1.
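
A crude Riemann-sum check of this integral (an illustrative sketch, not
part of the lecture):

    n = 1_000_000
    dx = 2.0 / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * dx              # midpoint of the i-th subinterval of (0, 2)
        fx = x if x < 1 else 2.0 - x    # the density of Example 7
        total += x * fx * dx
    print(total)                        # ≈ 1.0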
Example 8
Certain coded measurements of the pitch diameter of threads of a
fitting have the probability density

f(x) = 4/(π(1 + x²))   for 0 < x < 1,
       0               elsewhere.

Find the expected value of this random variable.
Solution.

E(X) = ∫_{−∞}^{∞} x f(x) dx = ∫_{0}^{1} x · 4/(π(1 + x²)) dx = (2/π) ∫_{0}^{1} 2x/(1 + x²) dx
     = (2/π) [ln(x² + 1)]_{0}^{1} = (2/π)(ln 2 − ln 1) = ln 4/π ≈ 0.4412712003.
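
The closed form is easy to verify numerically (a sketch, assuming the same
density, not part of the lecture):

    from math import log, pi

    n = 1_000_000
    dx = 1.0 / n
    total = sum(((i + 0.5) * dx) * 4 / (pi * (1 + ((i + 0.5) * dx) ** 2)) * dx
                for i in range(n))
    print(total, log(4) / pi)   # both ≈ 0.44127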
Theorem 9
If X is a discrete random variable and f(x) is the value of its
probability distribution at x, the expected value of g(X) is given by

E[g(X)] = Σ_x g(x) f(x).

Correspondingly, if X is a continuous random variable and f(x) is
the value of its probability density at x, the expected value of
g(X) is given by

E[g(X)] = ∫_{−∞}^{∞} g(x) f(x) dx.
Example 10
Roulette, introduced in Paris in 1765, is a very popular casino
game. Suppose that Hannah's House of Gambling has a roulette
wheel containing 38 numbers: zero (0), double zero (00), and the
numbers 1, 2, 3, · · · , 36. Let X denote the number on which the
ball lands and u(x) denote the amount of money paid to the
gambler, such that:

u(x) = $5    if x = 0,
       $10   if x = 00,
       $1    if x is odd,
       $2    if x is even.

How much would Hannah's House of Gambling have to charge each
gambler to play in order to ensure that Hannah's House of
Gambling made some money?
Solution. Assuming that the ball has an equally likely chance of
landing on each number, the probability distribution function of X
is

f(x) = 1/38

for each x = 0, 00, 1, 2, 3, · · · , 36. Therefore, the expected value of
u(X) is

E[u(X)] = Σ_x u(x) f(x)
        = $5 · (1/38) + $10 · (1/38) + $1 · (1/38 × 18) + $2 · (1/38 × 18)
        ≈ $1.82
Note that the 18 that is multiplied by the $1 and $2 is because
there are 18 odd and 18 even numbers on the wheel. Our
calculation tells us that, in the long run, Hannah’s House of
Gambling would expect to have to pay out $1.82 for each spin of
the roulette wheel. Therefore, in order to ensure that the House
made money, the House would have to charge at least $1.82 per
play.
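
The expected payout can be reproduced exactly (an illustrative sketch,
not from the lecture), using fractions to avoid rounding:

    from fractions import Fraction

    numbers = ["0", "00"] + [str(i) for i in range(1, 37)]

    def u(x):
        if x == "0":
            return 5
        if x == "00":
            return 10
        return 1 if int(x) % 2 == 1 else 2   # $1 for odd, $2 for even

    expected = sum(Fraction(u(x), 38) for x in numbers)
    print(expected, float(expected))         # 69/38 ≈ 1.8158, so charge at least $1.82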
Example 11
If X is the number of points rolled with a balanced die, find the
expected value of g(X) = 2X² + 1.
Solution. Since each possible outcome has the probability 1/6, we
get

E[g(X)] = E[2X² + 1] = Σ_{x=1}^{6} g(x) f(x) = Σ_{x=1}^{6} (2x² + 1) · 1/6
        = (2 · 1² + 1) · 1/6 + (2 · 2² + 1) · 1/6 + · · · + (2 · 6² + 1) · 1/6
        = 94/3.
Example 12
If X has the probability density

f(x) = e^{−x}   for x > 0,
       0        elsewhere,

find the expected value of g(X) = e^{3X/4}.
Solution.

E[g(X)] = E[e^{3X/4}] = ∫_{−∞}^{∞} g(x) f(x) dx = lim_{c→∞} ∫_{0}^{c} e^{3x/4} e^{−x} dx
        = lim_{c→∞} ∫_{0}^{c} e^{−x/4} dx = −4 lim_{c→∞} [e^{−x/4}]_{0}^{c}
        = −4 lim_{c→∞} (e^{−c/4} − e⁰) = 4.
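
Since the integrand reduces to e^{−x/4}, the value 4 can be checked with a
truncated Riemann sum (an illustrative sketch, not part of the lecture):

    from math import exp

    n = 1_000_000
    upper = 40.0                 # e^(-x/4) is negligible beyond x = 40
    h = upper / n
    total = sum(exp(-((i + 0.5) * h) / 4) * h for i in range(n))
    print(total)                 # ≈ 4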
Some Theorems on Expectation
The determination of mathematical expectation can often be
simplified by using the following theorems. Since the steps are
essentially the same, some proofs will be given either for the
discrete case or the continuous case; others are left for the reader
as exercises.
Theorem 13
If a and b are any constants, then

E(aX + b) = aE(X) + b.

Proof.
Using Theorem 9 with g(X) = aX + b, we get

E(aX + b) = ∫_{−∞}^{∞} (ax + b) f(x) dx
          = a ∫_{−∞}^{∞} x f(x) dx + b ∫_{−∞}^{∞} f(x) dx
          = aE(X) + b · 1 = aE(X) + b.
If we set b = 0 or a = 0, it follows from Theorem 13 that
Corollary 14
If a is a constant, then E(aX) = aE(X).
Corollary 15
If b is a constant, then E(b) = b.
Theorem 16
If c_1, c_2, · · · , c_n are constants, then

E[c_1 g_1(X) + c_2 g_2(X) + · · · + c_n g_n(X)]
= c_1 E[g_1(X)] + c_2 E[g_2(X)] + · · · + c_n E[g_n(X)].
The concept of a mathematical expectation can easily be extended
to situations involving more than one random variable. For
instance, if Z is the random variable whose values are related to
those of the two random variables X and Y by means of the
equation z = g(x, y), it can be shown that
Theorem 17
If X and Y are discrete random variables and f(x, y) is the value
of their joint probability distribution at (x, y), the expected value
of g(X, Y) is

E[g(X, Y)] = Σ_x Σ_y g(x, y) · f(x, y).
Theorem 18
If X and Y are any random variables, then

E(X + Y) = E(X) + E(Y).

Proof. Let f(x, y) be the joint probability distribution of X and Y,
assumed discrete. Then

E(X + Y) = Σ_x Σ_y (x + y) f(x, y)
         = Σ_x Σ_y x f(x, y) + Σ_x Σ_y y f(x, y)
         = E(X) + E(Y).

Note that the theorem is true whether or not X and Y are
independent.
Theorem 19
If X and Y are independent random variables, then

E(X · Y) = E(X) · E(Y).

Proof. Let f(x, y) be the joint probability distribution of X and Y,
assumed discrete. If the variables X and Y are independent, we
have f(x, y) = g(x) · h(y). Therefore,

E(X · Y) = Σ_x Σ_y (x · y) f(x, y) = Σ_x Σ_y (x · y) g(x) · h(y)
         = Σ_x x · g(x) Σ_y y · h(y)
         = Σ_x (x · g(x) E(Y)) = E(X) · E(Y).
Note that the validity of this theorem hinges on whether f (x, y )
can be expressed as a function of x multiplied by a function of y ,
for all x and y , i.e., on whether X and Y are independent. For
dependent variables it is not true in general.
Example 20
Suppose the probability distribution of the discrete random variable
X is

x      0     1     2     3
f(x)   0.2   0.1   0.4   0.3

Find E(X) and E(X²). Knowing these, what are E(4X²) and
E(3X + 2X²)?
Solution.
E(X) = Σ_{x=0}^{3} x f(x) = 0 · 0.2 + 1 · 0.1 + 2 · 0.4 + 3 · 0.3 = 1.8,
E(X²) = Σ_{x=0}^{3} x² f(x) = 0² · 0.2 + 1² · 0.1 + 2² · 0.4 + 3² · 0.3 = 4.4,
E(4X²) = 4E(X²) = 4 · 4.4 = 17.6,
E(3X + 2X²) = 3E(X) + 2E(X²) = 3 · 1.8 + 2 · 4.4 = 14.2.
Example 21
If the probability density of X is given by

f(x) = 2(1 − x)   for 0 < x < 1,
       0          elsewhere,

(a) show that

E(X^r) = 2/((r + 1)(r + 2)),

(b) and use this result to evaluate

E[(2X + 1)²].
Solution.
(a)

E(X^r) = ∫_{−∞}^{∞} x^r f(x) dx = ∫_{0}^{1} x^r · 2(1 − x) dx
       = 2 ∫_{0}^{1} (x^r − x^{r+1}) dx = 2 [x^{r+1}/(r + 1) − x^{r+2}/(r + 2)]_{0}^{1}
       = 2 (1/(r + 1) − 1/(r + 2)) = 2/((r + 1)(r + 2)).
(b) Since

E[(2X + 1)²] = E[4X² + 4X + 1] = 4E(X²) + 4E(X) + 1

and substitution of r = 1 and r = 2 into the preceding
formula yields

E(X) = 2/((2)(3)) = 1/3 and E(X²) = 2/((3)(4)) = 1/6,

we get

E[(2X + 1)²] = 4 · 1/6 + 4 · 1/3 + 1 = 3.
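
A direct numerical check of part (b) against the density (an illustrative
sketch, not part of the lecture):

    n = 1_000_000
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        total += (2 * x + 1) ** 2 * 2 * (1 - x) * h   # E[(2X+1)^2] for f(x) = 2(1-x)
    print(total)                                      # ≈ 3.0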
Moments
Definition 22
The r-th moment about the origin of a random variable X,
denoted by μ′_r, is the expected value of X^r; symbolically,

μ′_r = E(X^r) = Σ_x x^r f(x)

for r = 0, 1, 2, · · · when X is discrete, and

μ′_r = E(X^r) = ∫_{−∞}^{∞} x^r f(x) dx

when X is continuous.
1. When r = 0, we have μ′_0 = E(X⁰) = E(1) = 1.
2. When r = 1, we have μ′_1 = E(X¹) = E(X), which is just the
expected value of the random variable X, and we give it a
special symbol and a special name.
Definition 23
μ′_1 is called the mean of the distribution of X, or simply
the mean of X, and it is denoted by μ.
The special moments we shall define next are of importance in
statistics because they serve to describe the shape of the
distribution of a random variable, that is, the shape of the graph of
its probability distribution or probability density.
Definition 24
The r-th moment about the mean of a random variable X,
denoted by μ_r, is the expected value of (X − μ)^r; symbolically,

μ_r = E[(X − μ)^r] = Σ_x (x − μ)^r f(x)

for r = 0, 1, 2, · · · when X is discrete, and

μ_r = E[(X − μ)^r] = ∫_{−∞}^{∞} (x − μ)^r f(x) dx

when X is continuous.
1. When r = 0, we have μ_0 = E[(X − μ)⁰] = 1.
2. When r = 1, we have
μ_1 = E[(X − μ)¹] = E(X − μ) = E(X) − E(μ) = μ − μ = 0
for any random variable for which μ exists.
The second moment about the mean is of special importance in
statistics because it is indicative of the spread or dispersion of the
distribution of a random variable; thus, it is given a special symbol
and a special name.
Definition 25
μ_2 is called the variance of the distribution of X, or simply the
variance of X, and it is denoted by σ², var(X), or V(X); σ, the
positive square root of the variance, is called the standard
deviation.
This figure shows how the variance reflects the spread or dispersion
of the distribution of a random variable. Here we show the
histograms of the probability distributions of four random variables
with the same mean μ = 5 but variances equaling 5.26, 3.18, 1.66,
and 0.88.
A large standard deviation indicates that the data points are far
from the mean and a small standard deviation indicates that they
are clustered closely around the mean. For example, each of the
three populations {0, 0, 14, 14}, {0, 6, 8, 14} and {6, 6, 8, 8} has a
mean of 7. Their standard deviations are 7, 5, and 1, respectively.
The third population has a much smaller standard deviation than
the other two because its values are all close to 7. It will have the
same units as the data points themselves. If, for instance, the data
set {0, 6, 8, 14} represents the ages of a population of four siblings
in years, the standard deviation is 5 years. As another example, the
population {1000, 1006, 1008, 1014} may represent the distances
traveled by four athletes, measured in meters. It has a mean of
1007 meters, and a standard deviation of 5 meters.
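
These population standard deviations can be confirmed with the standard
library (an illustrative sketch, not part of the lecture):

    import statistics

    for data in ([0, 0, 14, 14], [0, 6, 8, 14], [6, 6, 8, 8]):
        print(statistics.mean(data), statistics.pstdev(data))   # means 7; stdevs 7, 5, 1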
Theorem 26
σ² = μ′_2 − μ².
Proof.

σ² = E[(X − μ)²] = E(X² − 2Xμ + μ²)
   = E(X²) − 2μE(X) + μ²
   = E(X²) − 2μ · μ + μ²
   = E(X²) − μ²
   = μ′_2 − μ².
Example 27
Use Theorem 26 to calculate the variance of X, representing the
number of points rolled with a balanced die.
Solution. First we compute

μ = E(X) = 1 · 1/6 + 2 · 1/6 + 3 · 1/6 + 4 · 1/6 + 5 · 1/6 + 6 · 1/6 = 7/2.

Now,

μ′_2 = E(X²) = 1² · 1/6 + 2² · 1/6 + 3² · 1/6 + 4² · 1/6 + 5² · 1/6 + 6² · 1/6 = 91/6

and it follows that

σ² = μ′_2 − μ² = 91/6 − (7/2)² = 35/12.
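
The same numbers fall out of a few lines of exact arithmetic (a sketch,
not part of the lecture):

    from fractions import Fraction

    mu = sum(Fraction(x, 6) for x in range(1, 7))
    mu2 = sum(Fraction(x * x, 6) for x in range(1, 7))
    print(mu, mu2 - mu ** 2)   # 7/2 and 35/12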
Example 28
With reference to Example 8, find the standard deviation of the
random variable X.
Solution. In Example 8 we showed that μ = E(X) = ln 4/π. Now,

μ′_2 = E(X²) = (4/π) ∫_{0}^{1} x²/(1 + x²) dx = (4/π) ∫_{0}^{1} (1 − 1/(1 + x²)) dx
     = (4/π) [x − arctan x]_{0}^{1} = (4/π)((1 − arctan 1) − (0 − arctan 0))
     = (4/π)(1 − π/4) = 4/π − 1

and it follows that

σ² = 4/π − 1 − (ln 4/π)² ≈ 0.078519272

and σ = √0.078519272 ≈ 0.280212905.
Example 29
Find μ, μ′_2, and σ² for the random variable X that has the
probability distribution f(x) = 1/2 for x = −2 and x = 2.
Solution.

μ = E(X) = Σ_x x f(x) = (−2) · 1/2 + 2 · 1/2 = 0,
μ′_2 = E(X²) = Σ_x x² f(x) = (−2)² · 1/2 + 2² · 1/2 = 4,
σ² = μ_2 = μ′_2 − μ² = 4 − 0² = 4,

or

σ² = μ_2 = E[(X − μ)²] = Σ_x (x − μ)² f(x)
   = (−2 − 0)² · 1/2 + (2 − 0)² · 1/2 = 4.
Example 30
Find μ, μ′_2, and σ² for the random variable X that has the
probability density

f(x) = x/2   for 0 < x < 2,
       0     elsewhere.

Solution.

μ = E(X) = ∫_{−∞}^{∞} x f(x) dx = ∫_{0}^{2} x · (x/2) dx = [x³/6]_{0}^{2} = 4/3,
μ′_2 = E(X²) = ∫_{−∞}^{∞} x² f(x) dx = ∫_{0}^{2} x² · (x/2) dx = [x⁴/8]_{0}^{2} = 2,
σ² = μ_2 = μ′_2 − μ² = 2 − (4/3)² = 2/9,

or

σ² = μ_2 = E[(X − μ)²] = ∫_{−∞}^{∞} (x − μ)² f(x) dx = ∫_{0}^{2} (x − 4/3)² · (x/2) dx
   = (1/2) ∫_{0}^{2} (x³ − (8/3)x² + (16/9)x) dx
   = (1/2) [x⁴/4 − (8/9)x³ + (8/9)x²]_{0}^{2} = 2/9.
Theorem 31
If X has the variance σ², then

var(aX + b) = a²σ².

Proof.
Since

E[aX + b] = aE(X) + b and E[(aX + b)²] = a²E(X²) + 2abE(X) + b²,

we have

var(aX + b) = a²E(X²) + 2abE(X) + b² − [aE(X) + b]²
            = a²E(X²) + 2abE(X) + b² − a²[E(X)]² − 2abE(X) − b²
            = a²(E(X²) − [E(X)]²)
            = a²σ².
var(aX + b) = a²σ².
1. For a = 1, we find that the addition of a constant to the
values of a random variable, resulting in a shift of all the
values of X to the left or to the right, in no way affects the
spread of its distribution.
2. For b = 0, we find that if the values of a random variable are
multiplied by a constant, the variance is multiplied by the
square of that constant, resulting in a corresponding change in
the spread of the distribution.
Theorem 32
If X and Y are independent random variables, then

var(X + Y) = var(X) + var(Y) and var(X − Y) = var(X) + var(Y).

Proof.
The proof is left as an exercise.
Example 33
Suppose that X and Y are independent, with var(X) = 6,
E(X²Y²) = 150, E(Y) = 2, E(X) = 2. Compute var(Y).
Solution.
var(X) = 6 = E(X²) − (E(X))² = E(X²) − 2² ⇒ E(X²) = 10.
Since X and Y are independent, so are X² and Y², and hence
E(X²Y²) = E(X²)E(Y²). Therefore,
E(X²Y²) = 150 = E(X²)E(Y²) = 10 · E(Y²) ⇒ E(Y²) = 15.
Also, since var(Y) = E(Y²) − (E(Y))², we obtain
var(Y) = 15 − 2² = 11.
Chebyshev’s Theorem
Theorem 34 (Chebyshev's Theorem)
Suppose that X is a random variable (discrete or continuous)
having mean μ and variance σ², which are finite. Then if ε is any
positive number,

P(|X − μ| ≥ ε) ≤ σ²/ε²

or, with ε = kσ,

P(|X − μ| ≥ kσ) ≤ 1/k².
Example 35
A random variable X has density function given by

f(x) = 2e^{−2x}   for x ≥ 0,
       0          for x < 0.

(a) Find P(|X − μ| ≥ 1).
(b) Use Chebyshev's theorem to obtain an upper bound on
P(|X − μ| ≥ 1) and compare with the result in (a).
Solution.
(a) First, we need to find μ:

μ = E(X) = ∫_{−∞}^{∞} x f(x) dx = ∫_{0}^{∞} x · 2e^{−2x} dx
  = lim_{c→∞} ∫_{0}^{c} 2x e^{−2x} dx
    (integrating by parts with u = x, dv = 2e^{−2x} dx, so du = dx, v = −e^{−2x})
  = lim_{c→∞} ( [−x e^{−2x}]_{0}^{c} + ∫_{0}^{c} e^{−2x} dx )
  = lim_{c→∞} [−e^{−2x}(x + 1/2)]_{0}^{c}
  = −lim_{c→∞} e^{−2c}(c + 1/2) + e⁰(0 + 1/2) = 1/2 = 0.5.

Since P(|X − μ| ≥ 1) = 1 − P(|X − μ| < 1) and
|X − μ| = |X − 0.5| < 1 ⇔ −1 < X − 0.5 < 1 ⇔ −0.5 < X < 1.5, then
P(|X − μ| ≥ 1) = 1 − P(|X − μ| < 1) = 1 − P(−0.5 < X < 1.5)
= 1 − ∫_{0}^{1.5} 2e^{−2x} dx = 1 + [e^{−2x}]_{0}^{1.5}
= 1 + (e^{−3} − e⁰) = e^{−3} ≈ 0.049787068.

(b) If we take ε = 1 in Chebyshev's theorem we obtain

P(|X − μ| ≥ 1) ≤ σ²/1² = σ².

One can show that σ² = 0.25. So an upper bound for the given
probability is 0.25, which is quite far from the true value.
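
The gap between the exact probability and the Chebyshev bound is easy to
display (an illustrative sketch, not part of the lecture):

    from math import exp

    mu, var = 0.5, 0.25      # mean and variance of the density 2e^(-2x)
    exact = exp(-3)          # P(|X - mu| >= 1) from part (a)
    bound = var / 1 ** 2     # Chebyshev bound with epsilon = 1
    print(exact, bound)      # ≈ 0.0498 vs 0.25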
Example 36
If a random variable X is such that E[(X − 1)²] = 10 and
E[(X − 2)²] = 6,
(a) Find the mean of X.
(b) Find the standard deviation of X.
(c) Find an upper bound for P(|X − 7/2| ≥ √15).
Solution.
(a) Since

E[(X − 1)²] = E[X² − 2X + 1] = E[X²] − 2E[X] + 1 = 10 ⇒ E[X²] − 2E[X] = 9,
E[(X − 2)²] = E[X² − 4X + 4] = E[X²] − 4E[X] + 4 = 6 ⇒ E[X²] − 4E[X] = 2,

then we have that E[X²] = 16 and μ = E[X] = 7/2.

(b) σ = √var(X) = √(E[X²] − (E[X])²) = √(16 − 49/4) = √15/2.

(c) If we use Chebyshev's inequality by taking μ = 7/2, ε = √15,
and σ² = 15/4, we obtain

P(|X − μ| ≥ ε) ≤ σ²/ε² ⇒ P(|X − 7/2| ≥ √15) ≤ (15/4)/15 = 1/4.
Moment Generating Functions
Although the moments of most distributions can be determined directly
by evaluating the necessary integrals or sums, an alternative procedure
sometimes provides considerable simplification. This technique utilizes
moment-generating functions.
Definition 37
The moment-generating function of a random variable X, where
it exists, is given by

M_X(t) = E[e^{tX}] = Σ_x e^{tx} f(x)

when X is discrete and

M_X(t) = E[e^{tX}] = ∫_{−∞}^{∞} e^{tx} f(x) dx

when X is continuous.
To explain why we refer to this function as a moment-generating function,
let us substitute for e^{tx} its Maclaurin series expansion, that is,

e^{tx} = 1 + tx + t²x²/2! + t³x³/3! + · · · + t^r x^r/r! + · · · .

For the discrete case, we thus get

M_X(t) = Σ_x (1 + tx + t²x²/2! + · · · + t^r x^r/r! + · · ·) f(x)
       = Σ_x f(x) + t Σ_x x f(x) + (t²/2!) Σ_x x² f(x) + · · · + (t^r/r!) Σ_x x^r f(x) + · · ·
       = 1 + μt + μ′_2 t²/2! + · · · + μ′_r t^r/r! + · · ·

and it can be seen that in the Maclaurin series of the moment-generating
function of X the coefficient of t^r/r! is μ′_r, the r-th moment
about the origin. In the continuous case, the argument is the same.
Example 38
Find the moment-generating function of the random variable
whose probability density is given by

f(x) = e^{−x}   for x > 0,
       0        elsewhere

and use it to find an expression for μ′_r.
Solution. By definition

M_X(t) = E(e^{tX}) = ∫_{0}^{∞} e^{tx} e^{−x} dx = ∫_{0}^{∞} e^{−x(1−t)} dx
       = lim_{c→∞} ∫_{0}^{c} e^{−x(1−t)} dx = −(1/(1 − t)) lim_{c→∞} [e^{−x(1−t)}]_{0}^{c}
       = −(1/(1 − t)) lim_{c→∞} (e^{−c(1−t)} − e⁰) = 1/(1 − t)   for t < 1.

As is well known, when |t| < 1 the Maclaurin series for this
moment-generating function is

M_X(t) = 1/(1 − t) = 1 + t + t² + t³ + · · · + t^r + · · ·
       = 1 + 1! · t/1! + 2! · t²/2! + 3! · t³/3! + · · · + r! · t^r/r! + · · ·

and hence μ′_r = r! for r = 0, 1, 2, · · · .
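
These moments can also be checked by integrating x^r e^{−x} numerically
(an illustrative sketch, not part of the lecture; the integral is truncated
where the integrand is negligible):

    from math import exp, factorial

    def moment(r, upper=60.0, n=600_000):
        h = upper / n
        return sum(((i + 0.5) * h) ** r * exp(-(i + 0.5) * h) * h for i in range(n))

    for r in range(5):
        print(r, round(moment(r), 3), factorial(r))   # numeric vs exact r!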
Theorem 39

d^r M_X(t)/dt^r |_{t=0} = μ′_r.

This follows from the fact that if a function is expanded as a power
series in t, the coefficient of t^r/r! is the r-th derivative of the
function with respect to t at t = 0.
Example 40
Given that X has the probability distribution f(x) = (1/8) C(3, x) for
x = 0, 1, 2, and 3, find the moment-generating function of this
random variable and use it to determine μ′_1 and μ′_2.
Solution. Since

M_X(t) = E(e^{tX}) = Σ_{x=0}^{3} e^{tx} · (1/8) C(3, x)
       = (1/8)(1 + 3e^t + 3e^{2t} + e^{3t}) = (1/8)(1 + e^t)³.

Then, by Theorem 39,

μ′_1 = M′_X(t)|_{t=0} = (3/8)(1 + e^t)² e^t |_{t=0} = 3/2,

and

μ′_2 = M″_X(t)|_{t=0} = [(3/4)(1 + e^t) e^{2t} + (3/8)(1 + e^t)² e^t]|_{t=0} = 3.
Often the work involved in using moment-generating functions can
be simplified by making use of the following theorem:
Theorem 41
If a and b are constants, then
(a) M_{X+a}(t) = E[e^{(X+a)t}] = e^{at} M_X(t);
(b) M_{bX}(t) = E(e^{bXt}) = M_X(bt);
(c) M_{(X+a)/b}(t) = E[e^{((X+a)/b)t}] = e^{(a/b)t} M_X(t/b).
Proof.
The proof is left as an exercise.
Product Moments
Definition 42
The r-th and s-th product moment about the origin of the
random variables X and Y, denoted by μ′_{r,s}, is the expected value
of X^r Y^s; symbolically,

μ′_{r,s} = E(X^r Y^s) = Σ_x Σ_y x^r y^s f(x, y)

for r = 0, 1, 2, · · · and s = 0, 1, 2, · · · when X and Y are discrete.
The double summation extends over the entire joint range of the
two random variables. Note that μ′_{1,0} = E(X), which we denote
here by μ_X, and that μ′_{0,1} = E(Y), which we denote here by μ_Y.
Definition 43
The r-th and s-th product moment about the means of the
random variables X and Y, denoted by μ_{r,s}, is the expected value
of (X − μ_X)^r (Y − μ_Y)^s; symbolically,

μ_{r,s} = E[(X − μ_X)^r (Y − μ_Y)^s] = Σ_x Σ_y (x − μ_X)^r (y − μ_Y)^s f(x, y)

for r = 0, 1, 2, · · · and s = 0, 1, 2, · · · when X and Y are discrete.
In statistics, μ_{1,1} is of special importance because it is indicative of
the relationship, if any, between the values of X and Y; thus it is
given a special symbol and special name.
Definition 44
μ_{1,1} is called the covariance of X and Y, and it is denoted by
σ_XY, cov(X, Y), or C(X, Y).
Observe that if there is a high probability that large values of X
will go with large values of Y and small values of X with small
values of Y, the covariance will be positive; if there is a high
probability that large values of X will go with small values of Y,
and vice versa, the covariance will be negative.

[Figure: the (x, y) plane divided at the point (μ_X, μ_Y) into four
quadrants, marked + where (x − μ_X)(y − μ_Y) is positive (upper
right, lower left) and − where it is negative (upper left, lower
right).]

It is in this sense that the covariance measures the relationship, or
association, between the values of X and Y.
Theorem 45
σ_XY = μ′_{1,1} − μ_X μ_Y.
Proof.
Using the various theorems about expected values, we can write

σ_XY = E[(X − μ_X)(Y − μ_Y)]
     = E[XY − Xμ_Y − Yμ_X + μ_X μ_Y]
     = E(XY) − μ_Y E(X) − μ_X E(Y) + μ_X μ_Y
     = E(XY) − μ_Y μ_X − μ_X μ_Y + μ_X μ_Y
     = μ′_{1,1} − μ_X μ_Y.
Example 46
We know that the joint and marginal probabilities of X and Y,
the numbers of aspirin and sedative caplets among two caplets
drawn at random from a bottle containing three aspirin, two
sedative, and four laxative caplets, were recorded as follows:

x \ y    0       1       2       g(x)
0        6/36    8/36    1/36    15/36
1        12/36   6/36    0       18/36
2        3/36    0       0       3/36
h(y)     21/36   14/36   1/36    1

Find the covariance of X and Y.
Solution. Referring to the joint probabilities given here, we get

μ′_{1,1} = E(XY) = Σ_{x=0}^{2} Σ_{y=0}^{2} x y f(x, y)
        = 0 · 0 · 6/36 + 0 · 1 · 8/36 + 0 · 2 · 1/36
        + 1 · 0 · 12/36 + 1 · 1 · 6/36 + 2 · 0 · 3/36
        = 6/36
and using the marginal probabilities, we get

μ_X = E(X) = Σ_{x=0}^{2} x g(x) = 0 · 15/36 + 1 · 18/36 + 2 · 3/36 = 24/36

and

μ_Y = E(Y) = Σ_{y=0}^{2} y h(y) = 0 · 21/36 + 1 · 14/36 + 2 · 1/36 = 16/36.
It follows that

σ_XY = μ′_{1,1} − μ_X μ_Y = 6/36 − (24/36)(16/36) = −7/54.

The negative result suggests that the more aspirin tablets we get,
the fewer sedative tablets we will get, and vice versa, and this, of
course, makes sense.
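
The covariance can be recomputed exactly from the joint table (an
illustrative sketch, not part of the lecture; zero cells are omitted):

    from fractions import Fraction as F

    f = {(0, 0): F(6, 36), (0, 1): F(8, 36), (0, 2): F(1, 36),
         (1, 0): F(12, 36), (1, 1): F(6, 36), (2, 0): F(3, 36)}
    e_xy = sum(x * y * p for (x, y), p in f.items())
    e_x = sum(x * p for (x, y), p in f.items())
    e_y = sum(y * p for (x, y), p in f.items())
    print(e_xy - e_x * e_y)   # -7/54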
Example 47
A quarter is bent so that the probabilities of heads and tails are 0.4
and 0.6. If it is tossed twice, what is the covariance of Z, the
number of heads obtained on the first toss, and W, the total
number of heads obtained in the two tosses of the coin?
Solution.
f(0, 0) = P(T, T) = 0.6 · 0.6 = 0.36,   f(1, 1) = P(H, T) = 0.4 · 0.6 = 0.24,
f(0, 1) = P(T, H) = 0.6 · 0.4 = 0.24,   f(1, 2) = P(H, H) = 0.4 · 0.4 = 0.16.

z \ w    0      1      2      g(z)
0        0.36   0.24   0      0.60
1        0      0.24   0.16   0.40
h(w)     0.36   0.48   0.16   1
σ_ZW = μ′_{1,1} − μ_Z μ_W = E(ZW) − E(Z)E(W).

μ′_{1,1} = E(ZW) = Σ_{z=0}^{1} Σ_{w=0}^{2} z w f(z, w)
        = 0 · 0 · 0.36 + 0 · 1 · 0.24 + 1 · 1 · 0.24 + 1 · 2 · 0.16 = 0.56,

μ_Z = E(Z) = Σ_{z=0}^{1} z g(z) = 0 · 0.60 + 1 · 0.40 = 0.40,

μ_W = E(W) = Σ_{w=0}^{2} w h(w) = 0 · 0.36 + 1 · 0.48 + 2 · 0.16 = 0.80

⇒ σ_ZW = 0.56 − 0.40 · 0.80 = 0.24.
As far as the relation between X and Y is concerned, observe that
if X and Y are independent, their covariance is zero; symbolically,
Theorem 48
If X and Y are independent, then σ_XY = 0.
Proof.
If X and Y are independent, then we know that
E(XY) = E(X)E(Y).
Since

σ_XY = μ′_{1,1} − μ_X μ_Y = E(XY) − E(X)E(Y),

we thus have

σ_XY = E(X)E(Y) − E(X)E(Y) = 0.
It is of interest to note that the independence of two random
variables implies a zero covariance, but a zero covariance does not
necessarily imply their independence. This is illustrated by the
following example.
Example 49
If the joint distribution of X and Y is given by

x \ y    −1    0    1    g(x)
−1       1/6   0    1/6  2/6
0        2/6   0    0    2/6
1        1/6   0    1/6  2/6
h(y)     4/6   0    2/6  1

show that their covariance is zero even though the two random
variables are not independent.
Solution. Using the probabilities shown in the margins, we get

μ_X = Σ_{x=−1}^{1} x g(x) = (−1) · 2/6 + 0 · 2/6 + 1 · 2/6 = 0,

μ_Y = Σ_{y=−1}^{1} y h(y) = (−1) · 4/6 + 0 · 0 + 1 · 2/6 = −2/6,

μ′_{1,1} = Σ_{x=−1}^{1} Σ_{y=−1}^{1} x y f(x, y)
        = (−1)(−1) · 1/6 + 1 · (−1) · 1/6 + (−1) · 1 · 1/6 + 1 · 1 · 1/6 = 0.

Thus, σ_XY = 0 − 0 · (−2/6) = 0: the covariance is zero, but the two
random variables are not independent. For instance,
f(x, y) ≠ g(x)h(y) for x = −1 and y = −1.
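
Both facts, zero covariance and dependence, can be checked mechanically
(an illustrative sketch, not part of the lecture; zero cells are omitted):

    from fractions import Fraction as F

    f = {(-1, -1): F(1, 6), (-1, 1): F(1, 6), (0, -1): F(2, 6),
         (1, -1): F(1, 6), (1, 1): F(1, 6)}
    e_x = sum(x * p for (x, y), p in f.items())
    e_y = sum(y * p for (x, y), p in f.items())
    cov = sum(x * y * p for (x, y), p in f.items()) - e_x * e_y
    g_m1 = sum(p for (x, y), p in f.items() if x == -1)   # g(-1)
    h_m1 = sum(p for (x, y), p in f.items() if y == -1)   # h(-1)
    print(cov, f[(-1, -1)], g_m1 * h_m1)   # 0, 1/6, 2/9 -> not independent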
Theorem 50
var(X ± Y) = var(X) + var(Y) ± 2cov(X, Y), or

σ²_{X±Y} = σ²_X + σ²_Y ± 2σ_XY.

Proof.

var(X ± Y) = E[((X ± Y) − (μ_X ± μ_Y))²]
           = E[((X − μ_X) ± (Y − μ_Y))²]
           = E[(X − μ_X)² + (Y − μ_Y)² ± 2(X − μ_X)(Y − μ_Y)]
           = E[(X − μ_X)²] + E[(Y − μ_Y)²] ± 2E[(X − μ_X)(Y − μ_Y)]
           = var(X) + var(Y) ± 2cov(X, Y).
Outline
1
Introduction
2
The Expected Value of a Random Variable
3
Moments
4
Chebyshev’s Theorem
5
Moment Generating Functions
6
Product Moments
7
Conditional Expectation
Introduction The Expected Value of a Random Variable Moments Chebyshev’s Theorem Moment Generating Functions Product
Conditional Expectation
Definition 51
If X is a discrete random variable and f(x|y) is the value of the
conditional probability distribution of X given Y = y at x, the
conditional expectation of u(X) given Y = y is

E[u(X)|y] = Σ_x u(x) f(x|y).

Similar expressions based on the conditional probability distribution
of Y given X = x define the conditional expectation of v(Y) given
X = x.
Definition 52
If we let u(X) = X in Definition 51, we obtain the conditional
mean of the random variable X given Y = y, which we denote by

μ_{X|y} = E(X|y) = Σ_x x f(x|y).

Definition 53
Correspondingly, the conditional variance of X given Y = y is

σ²_{X|y} = E[(X − μ_{X|y})²|y] = E(X²|y) − μ²_{X|y}

where E(X²|y) is given by Definition 51 with u(X) = X².
Example 54
With reference to Example 46, find the conditional mean of X
given Y = 1.
Solution. Making use of the results obtained before, that is,
f(0|1) = 4/7, f(1|1) = 3/7, and f(2|1) = 0, we get

E(X|1) = Σ_{x=0}^{2} x f(x|1) = 0 · 4/7 + 1 · 3/7 + 2 · 0 = 3/7.
Example 55
Suppose that we have the following joint probability distribution:

y \ x    1      2      3      4      h(y)
1        0.01   0.07   0.09   0.03   0.20
2        0.20   0.00   0.05   0.25   0.50
3        0.09   0.03   0.06   0.12   0.30
g(x)     0.30   0.10   0.20   0.40   1

(a) Find the conditional expectation of u(X) = X² given Y = 2.
(b) Find the conditional mean of X given Y = 2.
(c) Find the conditional variance of X given Y = 2.
Solution. Since f(x|y) = f(x, y)/h(y), we have
f(x|2) = f(x, 2)/h(2) = f(x, 2)/0.5 = 2f(x, 2), and hence

f(x|2) = 2f(1, 2) = 0.40   for x = 1,
         2f(2, 2) = 0.00   for x = 2,
         2f(3, 2) = 0.10   for x = 3,
         2f(4, 2) = 0.50   for x = 4.

(a) The conditional expectation of u(X) = X² given Y = 2 is

E[X²|Y = 2] = Σ_{x=1}^{4} x² f(x|2)
            = 1² f(1|2) + 2² f(2|2) + 3² f(3|2) + 4² f(4|2)
            = 1(0.40) + 4(0.00) + 9(0.10) + 16(0.50)
            = 9.30.
(b) The conditional mean of X given Y = 2 is

μ_{X|2} = E[X|Y = 2] = Σ_{x=1}^{4} x f(x|2)
        = 1f(1|2) + 2f(2|2) + 3f(3|2) + 4f(4|2)
        = 1(0.40) + 2(0.00) + 3(0.10) + 4(0.50)
        = 2.70.
(c) The conditional variance of X given Y = 2 is

σ²_{X|2} = E[(X − μ_{X|2})²|2] = Σ_{x=1}^{4} (x − 2.7)² f(x|2)
        = (1 − 2.7)² f(1|2) + (2 − 2.7)² f(2|2)
        + (3 − 2.7)² f(3|2) + (4 − 2.7)² f(4|2)
        = 2.89(0.40) + 0.49(0.00) + 0.09(0.10) + 1.69(0.50)
        = 2.01,

or simply

σ²_{X|2} = E(X²|Y = 2) − μ²_{X|2} = 9.30 − (2.70)² = 2.01.
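
The three conditional quantities follow from the conditional distribution
in a few lines (an illustrative sketch, not part of the lecture):

    f2 = {1: 0.40, 2: 0.00, 3: 0.10, 4: 0.50}   # f(x|2) from above
    mean = sum(x * p for x, p in f2.items())
    second = sum(x * x * p for x, p in f2.items())
    print(second, mean, second - mean ** 2)     # ≈ 9.30, 2.70, 2.01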
Introduction The Expected Value of a Random Variable Moments Chebyshev’s Theorem Moment Generating Functions Product
Thank You!!!