MTH/STA 561
THE EXPECTED VALUE FOR A CONTINUOUS RANDOM VARIABLE
Suppose that a commercial flight is delayed and arrives at a random time Y > 0, measured from the scheduled arrival time (taken as time zero on the time scale), with density function f(y) given by

f(y) = \begin{cases} y/200 & \text{for } 0 < y < 20 \\ 0 & \text{elsewhere} \end{cases}

and sketched below, where time is measured in minutes.
Let us partition the interval (0, 20) into a large number n of small pieces of length \Delta y = 20/n, and let y_1, y_2, \ldots, y_n be the midpoints of these small subintervals. Then the probability (relative frequency) that the flight arrives at time y_i is approximately equal to f(y_i) \Delta y. Just as previously discussed for the discrete case, in the long run we would expect the average delay time to be approximately

\sum_{i=1}^{n} y_i f(y_i) \, \Delta y,

that is, the sum of products of the length of time y_i that the flight is delayed times the probability f(y_i) \Delta y of this delay occurring. If we take the limit of this sum as n, the number of pieces into which we subdivide the interval (0, 20), approaches \infty, the expected value converges to
\lim_{n \to \infty} \sum_{i=1}^{n} y_i f(y_i) \, \Delta y
    = \int_0^{20} y f(y) \, dy
    = \int_0^{20} y \cdot \frac{y}{200} \, dy
    = \int_0^{20} \frac{y^2}{200} \, dy
    = \left[ \frac{y^3}{600} \right]_{y=0}^{y=20}
    = \frac{40}{3} = 13.333 \text{ minutes.}
If the density function for Y truly describes the flight's delay time, then the average or expected delay is 13.333 minutes. Again, this expected value is a measure of the central tendency of the probability distribution of Y.
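As a quick numerical check (an addition to these notes, not part of the original), the midpoint Riemann sum above can be evaluated in Python for increasing n; the approximations converge to 40/3 = 13.333...:

    # Midpoint Riemann sum for E(Y) with f(y) = y/200 on (0, 20).
    def f(y):
        return y / 200 if 0 < y < 20 else 0.0

    for n in (10, 100, 1000, 10000):
        dy = 20 / n
        # y_i = (i + 0.5) * dy are the midpoints of the n subintervals.
        approx = sum((i + 0.5) * dy * f((i + 0.5) * dy) * dy for i in range(n))
        print(n, approx)  # approaches 40/3 = 13.333...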
Recall that if Y is a discrete random variable with probability distribution p(y), then the expected value of Y is defined to be

E(Y) = \sum_{y} y \, p(y)

and the expected value of a function u(Y) of the random variable Y is given by

E[u(Y)] = \sum_{y} u(y) \, p(y).
The expected value of a continuous random variable can likewise be obtained by replacing the summation operator with the integration operator, as shown below.
Definition 1. Let f(y) be the probability density function of a continuous random variable Y. The expected value (or mean) of the random variable Y is given by

E(Y) = \int_{-\infty}^{\infty} y f(y) \, dy,

provided that the integral exists.

Let u(Y) be a function of a continuous random variable Y with probability density function f(y). Then the expected value (or mean) of u(Y) is given by

E[u(Y)] = \int_{-\infty}^{\infty} u(y) f(y) \, dy,

provided that the integral exists.
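For instance (a sketch added to these notes, with u(y) = \sqrt{y} chosen arbitrarily for illustration), both integrals in Definition 1 can be evaluated symbolically with sympy for the flight-delay density f(y) = y/200 on (0, 20):

    import sympy as sp

    y = sp.symbols('y')
    f = y / 200                                    # flight-delay density on (0, 20)
    EY = sp.integrate(y * f, (y, 0, 20))           # E(Y) = 40/3
    Eu = sp.integrate(sp.sqrt(y) * f, (y, 0, 20))  # E[u(Y)] with u(y) = sqrt(y)
    print(EY, Eu)                                  # 40/3 and 8*sqrt(5)/5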
Theorem 1. Let u(Y), u_1(Y), u_2(Y), \ldots, u_k(Y) be functions of a continuous random variable Y with probability density function f(y). Then

(1) E(c) = c for any constant c.
(2) E[c\,u(Y)] = c\,E[u(Y)] for any constant c.
(3) E[u_1(Y) + u_2(Y) + \cdots + u_k(Y)] = E[u_1(Y)] + E[u_2(Y)] + \cdots + E[u_k(Y)].

Proof. By definition, it is easy to see that

(1) E(c) = \int_{-\infty}^{\infty} c f(y) \, dy = c \int_{-\infty}^{\infty} f(y) \, dy = c.

(2) E[c\,u(Y)] = \int_{-\infty}^{\infty} c\,u(y) f(y) \, dy = c \int_{-\infty}^{\infty} u(y) f(y) \, dy = c\,E[u(Y)].

(3) E[u_1(Y) + u_2(Y) + \cdots + u_k(Y)]
    = \int_{-\infty}^{\infty} [u_1(y) + u_2(y) + \cdots + u_k(y)] f(y) \, dy
    = \int_{-\infty}^{\infty} u_1(y) f(y) \, dy + \int_{-\infty}^{\infty} u_2(y) f(y) \, dy + \cdots + \int_{-\infty}^{\infty} u_k(y) f(y) \, dy
    = E[u_1(Y)] + E[u_2(Y)] + \cdots + E[u_k(Y)].
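As an added aside, property (3) can be checked symbolically for a concrete case; the sketch below uses the density of Example 1 (introduced next) with the sample choices u_1(y) = y and u_2(y) = y^2:

    import sympy as sp

    y = sp.symbols('y')
    f = 2 * (1 - y)                  # density of Example 1 below, supported on [0, 1]
    u1, u2 = y, y**2                 # sample functions chosen for illustration
    lhs = sp.integrate((u1 + u2) * f, (y, 0, 1))
    rhs = sp.integrate(u1 * f, (y, 0, 1)) + sp.integrate(u2 * f, (y, 0, 1))
    print(lhs, rhs)                  # both print 1/2, as Theorem 1(3) requires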
Example 1. Let Y be a continuous random variable with the probability density function given by

f(y) = \begin{cases} 2(1 - y) & \text{for } 0 \le y \le 1 \\ 0 & \text{elsewhere.} \end{cases}

Then

E(Y) = \int_0^1 y \cdot 2(1 - y) \, dy = \int_0^1 (2y - 2y^2) \, dy = \left[ y^2 - \frac{2y^3}{3} \right]_{y=0}^{y=1} = 1 - \frac{2}{3} = \frac{1}{3}

and

E(Y^2) = \int_0^1 y^2 \cdot 2(1 - y) \, dy = 2 \int_0^1 (y^2 - y^3) \, dy = 2 \left[ \frac{y^3}{3} - \frac{y^4}{4} \right]_{y=0}^{y=1} = 2 \left( \frac{1}{3} - \frac{1}{4} \right) = \frac{1}{6}.

Hence,

Var(Y) = E(Y^2) - [E(Y)]^2 = \frac{1}{6} - \left( \frac{1}{3} \right)^2 = \frac{1}{18}.

Also, by the preceding theorem, we have

E(7Y + 3Y^2) = 7E(Y) + 3E(Y^2) = 7 \cdot \frac{1}{3} + 3 \cdot \frac{1}{6} = \frac{7}{3} + \frac{1}{2} = \frac{17}{6}.
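These computations are routine to verify with sympy (a sketch added to these notes):

    import sympy as sp

    y = sp.symbols('y')
    f = 2 * (1 - y)                            # density of Example 1 on [0, 1]
    EY = sp.integrate(y * f, (y, 0, 1))        # 1/3
    EY2 = sp.integrate(y**2 * f, (y, 0, 1))    # 1/6
    print(EY, EY2, EY2 - EY**2, 7*EY + 3*EY2)  # 1/3 1/6 1/18 17/6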
Example 2. Suppose that the probability density function of a continuous random variable Y is given by

f(y) = \begin{cases} 3y^2/26 & \text{for } 1 \le y \le 3 \\ 0 & \text{elsewhere.} \end{cases}

Then

E(Y) = \int_1^3 y \cdot \frac{3y^2}{26} \, dy = \frac{3}{26} \int_1^3 y^3 \, dy = \frac{3}{26} \left[ \frac{y^4}{4} \right]_{y=1}^{y=3} = \frac{30}{13}

and

E(Y^2) = \int_1^3 y^2 \cdot \frac{3y^2}{26} \, dy = \frac{3}{26} \int_1^3 y^4 \, dy = \frac{3}{26} \left[ \frac{y^5}{5} \right]_{y=1}^{y=3} = \frac{363}{65}.

Hence,

Var(Y) = E(Y^2) - [E(Y)]^2 = \frac{363}{65} - \left( \frac{30}{13} \right)^2 = \frac{219}{845}.

Also, by virtue of Theorem 1, we have

E(5Y^2 - 3Y) = 5E(Y^2) - 3E(Y) = 5 \cdot \frac{363}{65} - 3 \cdot \frac{30}{13} = \frac{363}{13} - \frac{90}{13} = \frac{273}{13}.
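Again, a short sympy sketch (an addition to these notes) confirms the arithmetic:

    import sympy as sp

    y = sp.symbols('y')
    f = sp.Rational(3, 26) * y**2              # density of Example 2 on [1, 3]
    EY = sp.integrate(y * f, (y, 1, 3))        # 30/13
    EY2 = sp.integrate(y**2 * f, (y, 1, 3))    # 363/65
    print(EY, EY2, EY2 - EY**2, 5*EY2 - 3*EY)  # 30/13 363/65 219/845 273/13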
Example 3. Suppose that the probability density function of a continuous random variable Y is given by

f(y) = \begin{cases} 3(5 - y)^2 & \text{for } 4 \le y \le 5 \\ 0 & \text{elsewhere.} \end{cases}

By definition, we have

E(Y) = \int_4^5 y \cdot 3(5 - y)^2 \, dy = 3 \int_4^5 (25y - 10y^2 + y^3) \, dy = 3 \left[ \frac{25y^2}{2} - \frac{10y^3}{3} + \frac{y^4}{4} \right]_{y=4}^{y=5} = \frac{17}{4}

and

E(Y^2) = \int_4^5 y^2 \cdot 3(5 - y)^2 \, dy = 3 \int_4^5 (25y^2 - 10y^3 + y^4) \, dy = 3 \left[ \frac{25y^3}{3} - \frac{10y^4}{4} + \frac{y^5}{5} \right]_{y=4}^{y=5} = \frac{181}{10}.

Hence,

Var(Y) = E(Y^2) - [E(Y)]^2 = \frac{181}{10} - \left( \frac{17}{4} \right)^2 = \frac{3}{80}.

Also, by means of Theorem 1, we have

E(10Y^2 - 8Y) = 10E(Y^2) - 8E(Y) = 10 \cdot \frac{181}{10} - 8 \cdot \frac{17}{4} = 181 - 34 = 147.
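The same sympy pattern (an added sketch) verifies this example as well:

    import sympy as sp

    y = sp.symbols('y')
    f = 3 * (5 - y)**2                           # density of Example 3 on [4, 5]
    EY = sp.integrate(y * f, (y, 4, 5))          # 17/4
    EY2 = sp.integrate(y**2 * f, (y, 4, 5))      # 181/10
    print(EY, EY2, EY2 - EY**2, 10*EY2 - 8*EY)   # 17/4 181/10 3/80 147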
Example 4. Let Y be a Laplace random variable; that is, its density function is given by

f(y) = \frac{e^{-|y|}}{2} \quad \text{for } -\infty < y < \infty.

By definition, we have

E(Y) = \int_{-\infty}^{\infty} \frac{y\,e^{-|y|}}{2} \, dy
     = \frac{1}{2} \int_{-\infty}^{0} y e^{y} \, dy + \frac{1}{2} \int_{0}^{\infty} y e^{-y} \, dy
     = \frac{1}{2} \lim_{a \to -\infty} \int_a^0 y e^{y} \, dy + \frac{1}{2} \lim_{b \to \infty} \int_0^b y e^{-y} \, dy.

Using integration by parts, we obtain

E(Y) = \frac{1}{2} \lim_{a \to -\infty} \left[ (y - 1) e^{y} \right]_a^0 + \frac{1}{2} \lim_{b \to \infty} \left[ -(y + 1) e^{-y} \right]_0^b
     = \frac{1}{2} \lim_{a \to -\infty} \left[ -1 - (a - 1) e^{a} \right] + \frac{1}{2} \lim_{b \to \infty} \left[ 1 - (b + 1) e^{-b} \right]
     = \frac{1}{2} (-1) + \frac{1}{2} (1) = 0,

since, by L'Hôpital's rule, \lim_{a \to -\infty} (a - 1) e^{a} = \lim_{b \to \infty} (b + 1) e^{-b} = \lim_{b \to \infty} \frac{b + 1}{e^{b}} = 0.
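The two-sided computation can be mirrored in sympy by splitting the integral at zero exactly as above (a sketch added to these notes):

    import sympy as sp

    y = sp.symbols('y')
    # E(Y) for the Laplace density e^{-|y|}/2, split at zero as in the text.
    left = sp.integrate(y * sp.exp(y) / 2, (y, -sp.oo, 0))    # -1/2
    right = sp.integrate(y * sp.exp(-y) / 2, (y, 0, sp.oo))   # 1/2
    print(left, right, left + right)                          # -1/2 1/2 0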
Example 5. Let Y be a continuous random variable with probability density function given by

f(y) = \begin{cases} 4y^3 & \text{for } 0 < y < 1 \\ 0 & \text{elsewhere.} \end{cases}

Then

E(Y) = \int_0^1 y \cdot 4y^3 \, dy = 4 \int_0^1 y^4 \, dy = \left[ \frac{4y^5}{5} \right]_{y=0}^{y=1} = \frac{4}{5},

E(\sqrt{Y}) = \int_0^1 \sqrt{y} \cdot 4y^3 \, dy = 4 \int_0^1 y^{7/2} \, dy = \left[ \frac{8}{9} y^{9/2} \right]_{y=0}^{y=1} = \frac{8}{9},

E\!\left( \frac{1}{Y} \right) = \int_0^1 \frac{1}{y} \cdot 4y^3 \, dy = 4 \int_0^1 y^2 \, dy = \left[ \frac{4y^3}{3} \right]_{y=0}^{y=1} = \frac{4}{3},

and

E(Y^2) = \int_0^1 y^2 \cdot 4y^3 \, dy = 4 \int_0^1 y^5 \, dy = \left[ \frac{2y^6}{3} \right]_{y=0}^{y=1} = \frac{2}{3}.

Hence,

Var(Y) = E(Y^2) - [E(Y)]^2 = \frac{2}{3} - \left( \frac{4}{5} \right)^2 = \frac{2}{75}.
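All four expectations, including those of the functions \sqrt{Y} and 1/Y, can be checked in one sympy sketch (an addition to these notes):

    import sympy as sp

    y = sp.symbols('y', positive=True)
    f = 4 * y**3                                      # density of Example 5 on (0, 1)
    EY = sp.integrate(y * f, (y, 0, 1))               # 4/5
    EsqrtY = sp.integrate(sp.sqrt(y) * f, (y, 0, 1))  # 8/9
    EinvY = sp.integrate(f / y, (y, 0, 1))            # 4/3
    EY2 = sp.integrate(y**2 * f, (y, 0, 1))           # 2/3
    print(EY, EsqrtY, EinvY, EY2, EY2 - EY**2)        # 4/5 8/9 4/3 2/3 2/75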
Moment-Generating Function for Continuous Random Variables.
Moments and moment-generating functions for continuous random variables can be defined analogously to those given for the discrete case.
Definition 2. If Y is a continuous random variable, then the k-th moment about the origin is given by

\mu'_k = E(Y^k) = \int_{-\infty}^{\infty} y^k f(y) \, dy, \qquad k = 1, 2, 3, \ldots,

provided that the integral exists. The k-th moment about the mean, or k-th central moment, is given by

\mu_k = E\!\left[ (Y - \mu)^k \right] = \int_{-\infty}^{\infty} (y - \mu)^k f(y) \, dy, \qquad k = 1, 2, 3, \ldots,

where \mu = E(Y), provided that the integral exists.
Example 6. Find \mu'_k for the random variable defined in Example 2.

Solution. By definition,

\mu'_k = \int_1^3 y^k \cdot \frac{3y^2}{26} \, dy = \frac{3}{26} \int_1^3 y^{k+2} \, dy = \frac{3}{26} \left[ \frac{y^{k+3}}{k+3} \right]_{y=1}^{y=3} = \frac{3\,(3^{k+3} - 1)}{26\,(k+3)}.

Thus,

E(Y) = \mu'_1 = \frac{3\,(81 - 1)}{26 \cdot 4} = \frac{30}{13},

E(Y^2) = \mu'_2 = \frac{3\,(243 - 1)}{26 \cdot 5} = \frac{363}{65},

E(Y^3) = \mu'_3 = \frac{3\,(729 - 1)}{26 \cdot 6} = \frac{182}{13}.
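The closed form for \mu'_k can also be obtained symbolically (an added sketch; declaring k positive keeps sympy from branching on the special case k + 3 = 0):

    import sympy as sp

    y = sp.symbols('y')
    k = sp.symbols('k', positive=True)
    f = sp.Rational(3, 26) * y**2                 # density from Example 2
    mu_k = sp.integrate(y**k * f, (y, 1, 3))      # 3*(3**(k+3) - 1)/(26*(k+3))
    print(sp.simplify(mu_k))
    print([mu_k.subs(k, i) for i in (1, 2, 3)])   # [30/13, 363/65, 182/13]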
Definition 3. If Y is a continuous random variable, then the moment-generating function of Y is given by

m_Y(t) = E(e^{tY}) = \int_{-\infty}^{\infty} e^{ty} f(y) \, dy.

The moment-generating function is said to exist if there exists a constant b such that m_Y(t) is finite for |t| \le b.
Theorem 2. If the moment-generating function m_Y(t) of a random variable Y exists, then for any positive integer k,

\left. \frac{d^k}{dt^k} m_Y(t) \right|_{t=0} = m_Y^{(k)}(0) = \mu'_k.
Proof. Since

e^{y} = \sum_{k=0}^{\infty} \frac{y^k}{k!} = 1 + y + \frac{y^2}{2!} + \frac{y^3}{3!} + \cdots,

it follows that

m_Y(t) = E(e^{tY}) = \int_{-\infty}^{\infty} e^{ty} f(y) \, dy
       = \int_{-\infty}^{\infty} \left[ 1 + ty + \frac{t^2 y^2}{2!} + \frac{t^3 y^3}{3!} + \cdots \right] f(y) \, dy
       = \int_{-\infty}^{\infty} f(y) \, dy + t \int_{-\infty}^{\infty} y f(y) \, dy + \frac{t^2}{2!} \int_{-\infty}^{\infty} y^2 f(y) \, dy + \frac{t^3}{3!} \int_{-\infty}^{\infty} y^3 f(y) \, dy + \cdots
       = 1 + t \mu'_1 + \frac{t^2}{2!} \mu'_2 + \frac{t^3}{3!} \mu'_3 + \cdots

Taking derivatives with respect to t consecutively yields

\frac{d}{dt} m_Y(t) = \mu'_1 + t \mu'_2 + \frac{t^2}{2!} \mu'_3 + \cdots

\frac{d^2}{dt^2} m_Y(t) = \mu'_2 + t \mu'_3 + \frac{t^2}{2!} \mu'_4 + \cdots

\frac{d^3}{dt^3} m_Y(t) = \mu'_3 + t \mu'_4 + \frac{t^2}{2!} \mu'_5 + \cdots

Continuing the differentiation of E(e^{tY}), it is easy to see that

\frac{d^k}{dt^k} m_Y(t) = \mu'_k + t \mu'_{k+1} + \frac{t^2}{2!} \mu'_{k+2} + \cdots

Hence,

\left. \frac{d^k}{dt^k} m_Y(t) \right|_{t=0} = \mu'_k

for k = 1, 2, 3, \ldots.
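As an added illustration of Theorem 2, take the exponential density e^{-y} of Example 7 below, whose moment-generating function is 1/(1 - t); derivatives of the mgf at t = 0 reproduce the moments obtained by direct integration:

    import sympy as sp

    t, y = sp.symbols('t y')
    m = 1 / (1 - t)   # mgf of the density e^{-y}, y > 0 (derived in Example 7 below)
    for k in (1, 2, 3):
        from_mgf = sp.diff(m, t, k).subs(t, 0)
        direct = sp.integrate(y**k * sp.exp(-y), (y, 0, sp.oo))
        print(k, from_mgf, direct)   # both equal k! : 1, 2, 6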
Example 7. Let Y be a continuous random variable with probability density function given by

f(y) = \begin{cases} e^{-y} & \text{for } y > 0 \\ 0 & \text{elsewhere.} \end{cases}

Find the moment-generating function for Y and then the mean and variance of Y.
Solution. By definition,

m_Y(t) = \int_0^{\infty} e^{ty} e^{-y} \, dy = \int_0^{\infty} e^{-(1-t)y} \, dy = \lim_{b \to \infty} \int_0^b e^{-(1-t)y} \, dy
       = \lim_{b \to \infty} \left[ -\frac{e^{-(1-t)y}}{1-t} \right]_{y=0}^{y=b}
       = \lim_{b \to \infty} \left[ -\frac{e^{-(1-t)b}}{1-t} + \frac{1}{1-t} \right]
       = \frac{1}{1-t} \quad \text{for } t < 1.

Now it is easy to see that

\frac{d}{dt} m_Y(t) = \frac{1}{(1-t)^2} \quad \text{and} \quad \frac{d^2}{dt^2} m_Y(t) = \frac{2}{(1-t)^3}.

Thus,

E(Y) = \left. \frac{d}{dt} m_Y(t) \right|_{t=0} = 1 \quad \text{and} \quad E(Y^2) = \left. \frac{d^2}{dt^2} m_Y(t) \right|_{t=0} = 2.

Hence,

Var(Y) = E(Y^2) - [E(Y)]^2 = 2 - 1^2 = 1.
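The whole computation can be reproduced in sympy (an added sketch; conds='none' tells sympy to skip the convergence condition, which holds here for t < 1):

    import sympy as sp

    t, y = sp.symbols('t y')
    # mgf of f(y) = e^{-y}, y > 0; the integral converges for t < 1.
    m = sp.integrate(sp.exp(t*y) * sp.exp(-y), (y, 0, sp.oo), conds='none')
    EY = sp.diff(m, t, 1).subs(t, 0)             # 1
    EY2 = sp.diff(m, t, 2).subs(t, 0)            # 2
    print(sp.simplify(m), EY, EY2, EY2 - EY**2)  # 1/(1 - t), 1, 2, variance 1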
Just as in the discrete case, the moment-generating function can be used to establish the
equivalence of two probability density functions.
Theorem 3 (Uniqueness Theorem). Let Y_1 and Y_2 be two random variables with moment-generating functions m_{Y_1}(t) and m_{Y_2}(t), respectively. If m_{Y_1}(t) = m_{Y_2}(t) for all values of t, then Y_1 and Y_2 have the same probability density function.
Example 8. If W is a continuous random variable with moment-generating function given by

m_W(t) = \frac{1}{1 - t} \quad \text{for } t < 1,

what is the probability density function for W?

Solution. By the Uniqueness Theorem for moment-generating functions, it follows from Example 7 that W has the probability density function

f(w) = \begin{cases} e^{-w} & \text{for } w > 0 \\ 0 & \text{elsewhere.} \end{cases}
More examples of finding moment-generating functions of continuous random variables will be given in the subsequent sections.