Section 5 (cont.): Moments of a Random Variable, Moment Generating Function
October 7th, 2014
Lesson 10
In an earlier lecture, we defined the n-th moment of the random variable X to be the number E[X^n]. The variance Var[X] = E[X^2] - (E[X])^2 is in fact equal to E[(X - \mu_X)^2], the 2nd central moment of X about the mean \mu_X. In this lecture we will introduce the idea of the moment generating function of a random variable X.
The moment generating function (mgf) of a random variable X is denoted by M_X(t) (sometimes M(t), m_X(t), or m(t)), and it is defined as
M_X(t) = E[e^{tX}].
If X is a discrete random variable, then
M_X(t) = \sum_x e^{tx} p(x).
If X is a continuous random variable, then
M_X(t) = \int_{-\infty}^{\infty} e^{tx} f(x)\, dx.
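As a quick illustration of the discrete formula (an added example, not one of the numbered examples in these notes): if X takes the value 1 with probability p and 0 with probability 1 - p, then
M_X(t) = \sum_x e^{tx} p(x) = (1 - p)e^{t \cdot 0} + p\, e^{t \cdot 1} = 1 - p + p e^t,
and in particular M_X(0) = 1.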
Here are some useful facts about moment generating functions.
It is always the case that M_X(0) = 1.
We can find the moments of X by taking the derivatives of M_X(t):
M_X'(0) = E[X],  M_X''(0) = E[X^2],  M_X^{(n)}(0) = E[X^n].
We also have
\frac{d}{dt} \ln[M_X(t)] \Big|_{t=0} = \frac{M_X'(0)}{M_X(0)} = E[X]
and
\frac{d^2}{dt^2} \ln[M_X(t)] \Big|_{t=0} = Var[X].
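To see these facts in action, here is a short sketch (not part of the original notes) that checks them with Python's sympy for one concrete mgf, M(t) = e^{t^2/2}, the mgf of a standard normal random variable; the choice of mgf is purely an assumption for illustration.

import sympy as sp

t = sp.symbols('t')

# Illustrative mgf (an assumed example): M(t) = exp(t^2/2),
# the mgf of a standard normal random variable.
M = sp.exp(t**2 / 2)

print(M.subs(t, 0))                          # M(0) = 1
print(sp.diff(M, t, 1).subs(t, 0))           # M'(0)  = E[X]   -> 0
print(sp.diff(M, t, 2).subs(t, 0))           # M''(0) = E[X^2] -> 1

# Log-derivative shortcuts from the notes:
print(sp.diff(sp.log(M), t, 1).subs(t, 0))   # E[X]   -> 0
print(sp.diff(sp.log(M), t, 2).subs(t, 0))   # Var[X] -> 1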
Example (5.8)
The pdf of X is f(x) = 5e^{-5x} for x > 0. Find the moment generating function of X and use it to find the first and second moments of X, and the variance of X.
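A sketch of one way to work this (not the official solution in the notes): for t < 5,
M_X(t) = \int_0^\infty e^{tx} \cdot 5e^{-5x}\, dx = \int_0^\infty 5 e^{-(5-t)x}\, dx = \frac{5}{5 - t}.
Then M_X'(t) = \frac{5}{(5-t)^2} and M_X''(t) = \frac{10}{(5-t)^3}, so E[X] = M_X'(0) = \frac{1}{5}, E[X^2] = M_X''(0) = \frac{2}{25}, and Var[X] = \frac{2}{25} - \left(\frac{1}{5}\right)^2 = \frac{1}{25}.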
Example (5.9)
The moment generating function of X is given as M_X(t) = \frac{\alpha}{\alpha - t} for t < \alpha, where \alpha > 0. Find Var[X].
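One possible approach (a sketch, not the notes' worked solution): \ln M_X(t) = \ln\alpha - \ln(\alpha - t), so \frac{d}{dt}\ln M_X(t) = \frac{1}{\alpha - t} and \frac{d^2}{dt^2}\ln M_X(t) = \frac{1}{(\alpha - t)^2}. By the second log-derivative fact above, Var[X] = \frac{1}{(\alpha - 0)^2} = \frac{1}{\alpha^2}.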
Example (5.10)
If the moment generating function for the random variable X is M_X(t) = \frac{1}{1 - t}, find the third moment of X about the point x = 3.
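A sketch of one approach (not the official solution): since M_X(t) = (1 - t)^{-1}, we have M_X^{(n)}(t) = n!\,(1 - t)^{-(n+1)}, so E[X^n] = M_X^{(n)}(0) = n!. The third moment about x = 3 is then E[(X - 3)^3] = E[X^3] - 9E[X^2] + 27E[X] - 27 = 6 - 18 + 27 - 27 = -12.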
Example (5.11)
Let X_1, X_2, X_3 be a random sample from a discrete distribution with probability function
p(x) = 1/4 for x = 0,
p(x) = 3/4 for x = 1,
p(x) = 0 otherwise.
Determine the moment generating function, M(t), of Y = X_1 X_2 X_3.
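A sketch of one approach (not the notes' worked solution): Y = X_1 X_2 X_3 equals 1 exactly when all three X_i equal 1, which happens with probability (3/4)^3 = 27/64; otherwise Y = 0, which happens with probability 37/64. Therefore M(t) = E[e^{tY}] = \frac{37}{64} + \frac{27}{64} e^{t}.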
Here we define some concepts that are in the section but will
not come up in the exercises.
For a random variable X, the skewness of X is defined to be E[(X - \mu)^3]/\sigma^3. If the skewness is positive, we say the distribution is skewed to the right. If it is negative, we say the distribution is skewed to the left.
If h is a function such that \frac{d^2}{dx^2} h(x) = h''(x) \ge 0 for all points x with non-zero density or probability for X, then E[h(X)] \ge h(E[X]). The inequality reverses if h'' \le 0. This is known as Jensen's inequality.
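As a quick illustration (an added example, not from the notes): taking h(x) = x^2, which satisfies h''(x) = 2 \ge 0 everywhere, Jensen's inequality gives E[X^2] \ge (E[X])^2, which is just the statement that Var[X] \ge 0.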
For a finite collection of random variables X_1, \ldots, X_k with density or probability functions f_1(x), \ldots, f_k(x), and for real numbers 0 \le \alpha_i \le 1 with \sum_{i=1}^{k} \alpha_i = 1 (these are called "weights"), we can construct a new random variable X with density function
f(x) = \alpha_1 f_1(x) + \cdots + \alpha_k f_k(x).
Then for X we have
E[X^n] = \alpha_1 E[X_1^n] + \cdots + \alpha_k E[X_k^n]
and
M_X(t) = \alpha_1 M_{X_1}(t) + \cdots + \alpha_k M_{X_k}(t).
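A small illustration (an added example, not from the notes): if f_1(x) = e^{-x} and f_2(x) = 2e^{-2x} for x > 0 are combined with weights \alpha_1 = \alpha_2 = 1/2, then f(x) = \frac{1}{2}e^{-x} + e^{-2x}, and the formulas give E[X] = \frac{1}{2}(1) + \frac{1}{2}\left(\frac{1}{2}\right) = \frac{3}{4} and M_X(t) = \frac{1}{2}\cdot\frac{1}{1-t} + \frac{1}{2}\cdot\frac{2}{2-t}.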