Chapter 4 Mathematical Expectation

4.1 Mean of Random Variables

4.2 Variance and Covariance

4.3 Means and Variances of Linear Combinations of Random Variables

4.4 Chebyshev’s Theorem
4.1 Mean of a Random Variable
Example
Consider the experiment of tossing two coins 16 times. What is the average number of heads observed per toss? Let X be the number of heads that occur per toss; then the possible values of X are 0, 1, and 2. Suppose that the experiment yields no heads, one head, and two heads a total of 4, 7, and 5 times, respectively. The average number of heads per toss of the two coins over the 16 tosses is

\frac{(0)(4) + (1)(7) + (2)(5)}{16} = 1.06

Notice that this average value 1.06 is not a possible outcome of X.
Rewrite the above computation as

(0)\left(\frac{4}{16}\right) + (1)\left(\frac{7}{16}\right) + (2)\left(\frac{5}{16}\right) = 1.06

The fractions 4/16, 7/16, and 5/16 are the fractions of the total tosses resulting in 0, 1, and 2 heads, respectively.
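As a quick numeric check of this weighted average (an illustrative Python sketch, not part of the original slides):

from fractions import Fraction

counts = {0: 4, 1: 7, 2: 5}                      # heads per toss -> number of tosses observed
total = sum(counts.values())                     # 16 tosses in all
average = sum(Fraction(x * n, total) for x, n in counts.items())
print(average, float(average))                   # 17/16 1.0625, which rounds to 1.06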
Definition 4.1 Let X be a random variable with probability distribution f(x). The mean or expected value of X is

\mu = E(X) = \sum_{x} x f(x)

if X is discrete, and

\mu = E(X) = \int_{-\infty}^{\infty} x f(x)\,dx

if X is continuous.
Remark: The mean μ of a random variable X can be thought of as a measure of the “center of location” of its distribution, in the sense that it indicates where the “center” of the density lies.
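As a concrete illustration of both branches of Definition 4.1 (a sketch using two stand-in distributions that are not from the slides: a fair six-sided die for the discrete case and the uniform density on (0, 1) for the continuous case):

import sympy as sp

# Discrete case: E(X) = sum over x of x*f(x), with f(x) = 1/6 for x = 1, ..., 6
die_mean = sum(x * sp.Rational(1, 6) for x in range(1, 7))
print(die_mean)                                  # 7/2

# Continuous case: E(X) = integral of x*f(x) dx, with f(x) = 1 on (0, 1)
x = sp.symbols('x')
print(sp.integrate(x * 1, (x, 0, 1)))            # 1/2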
Example 4.1, page 89
The probability distribution of a random variable X is given by

f(x) = \frac{\binom{4}{x}\binom{3}{3-x}}{\binom{7}{3}},    x = 0, 1, 2, 3,

so that f(0) = 1/35, f(1) = 12/35, f(2) = 18/35, and f(3) = 4/35.
\mu = E(X) = (0)\left(\tfrac{1}{35}\right) + (1)\left(\tfrac{12}{35}\right) + (2)\left(\tfrac{18}{35}\right) + (3)\left(\tfrac{4}{35}\right) = \tfrac{12}{7}
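A minimal check of this pmf and its mean (an illustrative sketch, not part of the original slides):

from fractions import Fraction
from math import comb

# f(x) = C(4, x) * C(3, 3 - x) / C(7, 3) for x = 0, 1, 2, 3
f = {x: Fraction(comb(4, x) * comb(3, 3 - x), comb(7, 3)) for x in range(4)}
mean = sum(x * p for x, p in f.items())
print(f)                                         # probabilities 1/35, 12/35, 18/35, 4/35
print(mean)                                      # 12/7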
Example
The probability distribution of a random variable X is given by

f(x) = \begin{cases} e^{-x}, & x > 0 \\ 0, & \text{elsewhere} \end{cases}

Then, integrating by parts,

\mu = E(X) = \int_{0}^{\infty} x e^{-x}\,dx = \left(-x e^{-x}\right)\Big|_{0}^{\infty} + \int_{0}^{\infty} e^{-x}\,dx = 0 + 1 = 1
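A quick symbolic check of this integral (an illustrative sketch using sympy):

import sympy as sp

x = sp.symbols('x', positive=True)
# E(X) for the density f(x) = exp(-x), x > 0
print(sp.integrate(x * sp.exp(-x), (x, 0, sp.oo)))   # 1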
Example 4.2, page 90
In a gambling game a man is paid $5 if he gets all heads or
all tails when three coins are tossed, and he will pay out
$3 if either one or two heads show. What is his expected
gain?
Let Y be the amount of gain per bet. The possible
values are 5 and –3 dollars.
Let X be the number of heads that occur in tossing
three coins. The possible values of X are 0, 1, 2, and 3.
Solution:
P(Y = 5) = P(X = 0 or X = 3) = 1/8 + 1/8 = 1/4
P(Y = -3) = P(X = 1 or X = 2) = 6/8 = 3/4
μ = E(Y) = (5)(1/4) + (-3)(3/4) = -1
Interpretation: Over the long run, the gambler will, on average, lose $1 per bet. The more games he plays, the more he can expect to lose.
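A small check of this expected gain by enumerating the eight equally likely outcomes of three coin tosses (an illustrative sketch, not part of the original slides):

from fractions import Fraction
from itertools import product

def gain(outcome):
    heads = sum(outcome)                         # payoff rule from Example 4.2
    return 5 if heads in (0, 3) else -3

outcomes = list(product((0, 1), repeat=3))       # 8 equally likely head/tail patterns
expected_gain = sum(Fraction(gain(o), len(outcomes)) for o in outcomes)
print(expected_gain)                             # -1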
Notice that in the preceding example there are two random variables, X and Y, and Y is a function of X; that is, if we let

Y = g(X) = \begin{cases} 5, & X = 0, 3 \\ -3, & X = 1, 2 \end{cases}

then

E(Y) = E[g(X)]
= (5)P(Y = 5) + (-3)P(Y = -3)
= (5)[P(X = 0) + P(X = 3)] + (-3)[P(X = 1) + P(X = 2)]
= (5)P(X = 0) + (5)P(X = 3) + (-3)P(X = 1) + (-3)P(X = 2)
= g(0)P(X = 0) + g(3)P(X = 3) + g(1)P(X = 1) + g(2)P(X = 2)
= \sum_{x} g(x) f(x)
Theorem 4.1
Let X be a random variable with probability distribution f(x). The mean or expected value of the random variable g(X) is

\mu_{g(X)} = E[g(X)] = \sum_{x} g(x) f(x)

if X is discrete, and

\mu_{g(X)} = E[g(X)] = \int_{-\infty}^{\infty} g(x) f(x)\,dx

if X is continuous.
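As an illustration, Theorem 4.1 recovers the expected gain of Example 4.2 directly from the pmf of X (a sketch; the pmf of X is binomial with n = 3 and p = 1/2):

from fractions import Fraction
from math import comb

# pmf of X = number of heads in three fair coin tosses
f = {x: Fraction(comb(3, x), 8) for x in range(4)}

def g(x):
    return 5 if x in (0, 3) else -3              # payoff when x heads are observed

# E[g(X)] = sum over x of g(x) * f(x)
print(sum(g(x) * f[x] for x in f))               # -1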
Example
Let X denote the length in minutes of a long-distance telephone conversation. Assume that the density for X is given by

f(x) = \frac{1}{10} e^{-x/10},    x > 0.

Find E(X) and E(2X + 3).
Solution:

E(X) = \int_{-\infty}^{\infty} x f(x)\,dx = \int_{0}^{\infty} \frac{x}{10} e^{-x/10}\,dx = 10

E(2X + 3) = \int_{0}^{\infty} (2x + 3) \frac{1}{10} e^{-x/10}\,dx = 2(10) + 3 = 23
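A quick symbolic check of both expectations (an illustrative sketch using sympy):

import sympy as sp

x = sp.symbols('x', positive=True)
f = sp.exp(-x / 10) / 10                               # density of X for x > 0

print(sp.integrate(x * f, (x, 0, sp.oo)))              # 10
print(sp.integrate((2 * x + 3) * f, (x, 0, sp.oo)))    # 23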
Extension
Definition 4.2 Let X and Y be random variables with joint probability distribution f(x, y). The mean or expected value of the random variable g(X, Y) is

\mu_{g(X,Y)} = E[g(X, Y)] = \sum_{x} \sum_{y} g(x, y) f(x, y)

if X and Y are discrete, and

\mu_{g(X,Y)} = E[g(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y) f(x, y)\,dx\,dy

if X and Y are continuous.
Example

Example: Suppose two dice are rolled, one red and
one white. Let X be the number on the top face of the
red die, and Y be the number on the top face of the
white one. Find E(X+Y).
6
6
E[X + Y] =( x, y )( x  y) P( X  x, Y  y) = y1 x1( x  y ) P( X  x, Y  y )
6
6
6
6
6
6
= y1 x1 xP( X  x, Y  y )  y1 x1 yP( X  x, Y  y )
6
6
= y1 x1 x(1 / 36)  y1 x1 y (1 / 36)
= 3.5 + 3.5 = 7
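A brute-force check over all 36 equally likely outcomes (an illustrative sketch, not part of the original slides):

from fractions import Fraction

# Each (red, white) pair has probability 1/36
expected_sum = sum(Fraction(x + y, 36) for x in range(1, 7) for y in range(1, 7))
print(expected_sum)                              # 7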
Example 4.7, page 93. Find E[Y/X] for the density

f(x, y) = \begin{cases} \dfrac{x^3 y^3}{16}, & 0 < x < 2,\ 0 < y < 2 \\ 0, & \text{elsewhere} \end{cases}

Solution:

\mu_{g(X,Y)} = E[g(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y) f(x, y)\,dx\,dy

E[Y/X] = \int_{0}^{2} \int_{0}^{2} \frac{y}{x} \cdot \frac{x^3 y^3}{16}\,dx\,dy = \int_{0}^{2} \int_{0}^{2} \frac{x^2 y^4}{16}\,dx\,dy = \frac{16}{15}
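A symbolic check of this double integral (an illustrative sketch using sympy):

import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = x**3 * y**3 / 16                                     # joint density on 0 < x < 2, 0 < y < 2

# E[Y/X] = double integral of (y/x) * f(x, y) over the support
print(sp.integrate((y / x) * f, (x, 0, 2), (y, 0, 2)))   # 16/15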
In general
If X and Y are two random variables and f(x, y) is their joint density function, then:

E(X) = \sum_{x} \sum_{y} x f(x, y) = \sum_{x} x g(x)    (discrete case)

E(X) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x f(x, y)\,dx\,dy = \int_{-\infty}^{\infty} x g(x)\,dx    (continuous case)

E(Y) = \sum_{y} \sum_{x} y f(x, y) = \sum_{y} y h(y)    (discrete case)

E(Y) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} y f(x, y)\,dx\,dy = \int_{-\infty}^{\infty} y h(y)\,dy    (continuous case)

Here g(x) and h(y) are the marginal probability distributions of X and Y, respectively.
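As an illustration of computing a mean either from the joint density or from the marginal (a sketch reusing the density of Example 4.7; both routes give the same value):

import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = x**3 * y**3 / 16                                    # joint density on 0 < x < 2, 0 < y < 2

g = sp.integrate(f, (y, 0, 2))                          # marginal density of X: g(x) = x**3 / 4
E_X_from_joint = sp.integrate(x * f, (x, 0, 2), (y, 0, 2))
E_X_from_marginal = sp.integrate(x * g, (x, 0, 2))
print(E_X_from_joint, E_X_from_marginal)                # 8/5 8/5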