Expectation and Moment Generating Function

Paper: Probability and Statistics
Lesson: Expectation and Moment Generating Function
Lesson Developer: Dr. Shiv Kumar Kaushik
College/Department: Kirori Mal College, Department of Mathematics, University of Delhi
Institute of Lifelong Learning, University of Delhi
Contents
1. Introduction
2. Definition and Examples of Expectation
   Solved Problems
3. Some Special Expectations
   Chebyshev's Theorem
   Some Solved Problems
4. Moment Generating Function
Exercises
References
1. Introduction
In the previous chapter, we studied the concept of random (non-deterministic) experiments, followed by the classical definition of the probability function and the axiomatic approach to probability theory. We also studied one-variate distribution theory, in which discrete and continuous random variables and the distribution functions associated with them were introduced, together with various examples and some important properties of these concepts.
In the present chapter, we will study the mathematical expectation of a random variable, along with some special expectations, namely the higher-order moments, including the mean, variance and standard deviation of a random variable. Further, the moment generating function and the characteristic function will be studied.
2. Definition and Examples of Expectation
We begin this section with the definition of the expectation of a continuous and of a discrete random variable, followed by various results and examples.

Definition 2.1 Let X be a continuous random variable with probability density function (p.d.f.) f(x). If the integral

    \int_{-\infty}^{\infty} |x|\,f(x)\,dx

exists (i.e., is convergent), the expectation of X, denoted by E(X), is defined as

    E(X) = \int_{-\infty}^{\infty} x\,f(x)\,dx.

If X is a discrete random variable with probability mass function (p.m.f.) p(x), and if the series

    \sum_{x} |x|\,p(x)

exists (i.e., is convergent), the expectation of X, denoted by E(X), is defined as

    E(X) = \sum_{x} x\,p(x).

The expectation E(X) of a random variable X is also called the mathematical expectation of X, the expected value of X, or the mean of X, and is also denoted by \mu.

Note that if \sum_{x} x\,p(x) is conditionally convergent (i.e., \sum_{x} x\,p(x) is convergent but not absolutely convergent), E(X) does not exist. The condition of absolute convergence of \sum_{x} x\,p(x) (i.e., the convergence of \sum_{x} |x|\,p(x)) is therefore essential for the existence of E(X). Thus, E(X) exists if and only if E(|X|) exists.
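The two formulas in Definition 2.1 can be evaluated directly with a computer algebra system. The following is a minimal Python sketch, assuming an exponential p.d.f. and a fair-die p.m.f. purely as illustrative choices.

    # Minimal sketch: evaluating E(X) as in Definition 2.1 for two assumed distributions.
    import sympy as sp

    x, lam = sp.symbols('x lam', positive=True)

    # Continuous case: assumed p.d.f. f(x) = lam*exp(-lam*x) on [0, oo) (exponential).
    f = lam * sp.exp(-lam * x)
    E_continuous = sp.integrate(x * f, (x, 0, sp.oo))   # E(X) = 1/lam
    print(E_continuous)

    # Discrete case: assumed p.m.f. of a fair die, p(k) = 1/6 for k = 1, ..., 6.
    pmf = {k: sp.Rational(1, 6) for k in range(1, 7)}
    E_discrete = sum(k * p for k, p in pmf.items())      # E(X) = 7/2
    print(E_discrete)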
Let us see some examples of the expectation of a random variable X.
Example 2.2 Let X be a discrete random variable with p.m.f. p(x) given by a table of values. Here, p(x) = 0 if x is other than the square of one of the first four natural numbers, so that X takes only the values 1, 4, 9 and 16. Now, since the sum \sum_x |x|\,p(x) has only finitely many non-zero terms, it is convergent, and the expectation is

    E(X) = \sum_x x\,p(x) = 1\,p(1) + 4\,p(4) + 9\,p(9) + 16\,p(16).
Example 2.3 Let X be a continuous random variable with p.d.f. f(x) given by a specific formula. Since

    \int_{-\infty}^{\infty} |x|\,f(x)\,dx

is convergent, the expectation is E(X) = \int_{-\infty}^{\infty} x\,f(x)\,dx.
In the next example, we show that the expectation of a constant random variable is again a constant.

Example 2.4 Consider a constant random variable X, i.e., a random variable having all its mass at a constant c, so that P(X = c) = 1. Clearly, X is a discrete random variable with p.m.f. p(c) = 1 and p(x) = 0 for x \ne c. Since \sum_x |x|\,p(x) = |c| < \infty, the expectation is

    E(X) = \sum_x x\,p(x) = c.

Similarly, in the case of a continuous random variable X with p.d.f. f(x), consider the function g(X) = c, which is a constant random variable. Also, since

    \int_{-\infty}^{\infty} c\,f(x)\,dx = c \int_{-\infty}^{\infty} f(x)\,dx = c    (since \int_{-\infty}^{\infty} f(x)\,dx = 1),

the expectation is given by E(c) = c.

Thus, the expectation of a constant is again a constant.
In the next theorem, we will determine the expectation of a function g(X) of a discrete random variable X using the distribution of X.

Theorem 2.5 Let X be a discrete random variable with p.m.f. p(x) and let g be any real-valued function of X. Then, the expected value of g(X) is given by

    E[g(X)] = \sum_{x} g(x)\,p(x).

Proof. Let X be a discrete random variable and suppose first that X assumes a finite number of values. Let y_1, y_2, \ldots, y_m be the possible values of g(X). For each j, 1 \le j \le m, let A_j denote the set of values x of X such that g(x) = y_j. Then,

    P[g(X) = y_j] = \sum_{x \in A_j} p(x).

Therefore, we have

    E[g(X)] = \sum_{j=1}^{m} y_j\,P[g(X) = y_j] = \sum_{j=1}^{m} y_j \sum_{x \in A_j} p(x) = \sum_{j=1}^{m} \sum_{x \in A_j} g(x)\,p(x) = \sum_{x} g(x)\,p(x).

If X takes countably infinitely many values with positive probability, the properties of absolutely convergent series allow the same conclusion.
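Theorem 2.5 can be checked numerically. The following minimal Python sketch assumes a fair-die p.m.f. and the function g(x) = x^2 as illustrative choices, and computes E[g(X)] both through the distribution of g(X) and directly as the sum of g(x)p(x) over x.

    # Minimal sketch of Theorem 2.5: E[g(X)] computed two ways for an assumed fair-die p.m.f.
    from fractions import Fraction
    from collections import defaultdict

    pmf = {k: Fraction(1, 6) for k in range(1, 7)}   # assumed p.m.f. of a fair die
    g = lambda x: x * x                              # assumed function g(x) = x^2

    # Way 1: via the distribution of Y = g(X), i.e. sum over y of y * P[g(X) = y].
    pmf_g = defaultdict(Fraction)
    for x, p in pmf.items():
        pmf_g[g(x)] += p
    lhs = sum(y * p for y, p in pmf_g.items())

    # Way 2: directly, sum over x of g(x) * p(x).
    rhs = sum(g(x) * p for x, p in pmf.items())

    print(lhs, rhs, lhs == rhs)   # 91/6 91/6 True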
The next result gives a geometrical interpretation of E(X).

Theorem 2.6 Let X be a continuous random variable with p.d.f. f(x) and let F denote the distribution function of X. Then

    E(X) = \int_{0}^{\infty} [1 - F(x)]\,dx - \int_{-\infty}^{0} F(x)\,dx.

Proof. By definition, we have

    E(X) = \int_{-\infty}^{\infty} x\,f(x)\,dx = \int_{-\infty}^{0} x\,f(x)\,dx + \int_{0}^{\infty} x\,f(x)\,dx.    ... (1)

We know that

    F(x) = \int_{-\infty}^{x} f(t)\,dt   and   1 - F(x) = \int_{x}^{\infty} f(t)\,dt.

Now, consider

    \int_{0}^{\infty} [1 - F(x)]\,dx = \int_{0}^{\infty} \int_{x}^{\infty} f(t)\,dt\,dx = \int_{0}^{\infty} \Big( \int_{0}^{t} dx \Big) f(t)\,dt = \int_{0}^{\infty} t\,f(t)\,dt

(by change of order of integration in the region 0 \le x \le t < \infty). Therefore,

    \int_{0}^{\infty} [1 - F(x)]\,dx = \int_{0}^{\infty} x\,f(x)\,dx.    ... (2)

Consider

    \int_{-\infty}^{0} F(x)\,dx = \int_{-\infty}^{0} \int_{-\infty}^{x} f(t)\,dt\,dx = \int_{-\infty}^{0} \Big( \int_{t}^{0} dx \Big) f(t)\,dt = -\int_{-\infty}^{0} t\,f(t)\,dt

(by change of order of integration in the region -\infty < t \le x \le 0). Therefore,

    \int_{-\infty}^{0} F(x)\,dx = -\int_{-\infty}^{0} x\,f(x)\,dx.    ... (3)

From (1), (2) and (3), we have

    E(X) = \int_{0}^{\infty} [1 - F(x)]\,dx - \int_{-\infty}^{0} F(x)\,dx.
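Theorem 2.6 is easy to verify numerically for a particular distribution. The following minimal Python sketch assumes a normal distribution with mean 1 and standard deviation 2 (an illustrative choice) and evaluates both sides.

    # Minimal sketch of Theorem 2.6 for an assumed normal distribution with mean 1.
    import numpy as np
    from scipy import integrate
    from scipy.stats import norm

    X = norm(loc=1.0, scale=2.0)   # assumed distribution; E(X) should be 1.0

    upper, _ = integrate.quad(lambda x: 1.0 - X.cdf(x), 0.0, np.inf)
    lower, _ = integrate.quad(lambda x: X.cdf(x), -np.inf, 0.0)

    print(upper - lower)   # approximately 1.0
    print(X.mean())        # 1.0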
In the next theorem, we will determine the expectation of a function g(X) of a continuous random variable X using the distribution of X.

Theorem 2.7 Let X be a continuous random variable with p.d.f. f(x) and let g be any real-valued function of X. Then, the expected value of g(X) is given by

    E[g(X)] = \int_{-\infty}^{\infty} g(x)\,f(x)\,dx.

Proof. Consider the random variable Y = g(X). Applying Theorem 2.6 to Y, we have

    E[g(X)] = \int_{0}^{\infty} P[g(X) > y]\,dy - \int_{-\infty}^{0} P[g(X) \le y]\,dy.    ... (1)

Replacing the probabilities in (1) by integrals of f, and denoting the sets B_y = \{x : g(x) > y\} and C_y = \{x : g(x) \le y\}, we have

    E[g(X)] = \int_{0}^{\infty} \int_{B_y} f(x)\,dx\,dy - \int_{-\infty}^{0} \int_{C_y} f(x)\,dx\,dy
            = \int_{\{x : g(x) > 0\}} \Big( \int_{0}^{g(x)} dy \Big) f(x)\,dx - \int_{\{x : g(x) \le 0\}} \Big( \int_{g(x)}^{0} dy \Big) f(x)\,dx    (interchange of the order of integration)
            = \int_{\{x : g(x) > 0\}} g(x)\,f(x)\,dx + \int_{\{x : g(x) \le 0\}} g(x)\,f(x)\,dx
            = \int_{-\infty}^{\infty} g(x)\,f(x)\,dx.
Does there exist any random variable whose expected value is not finite?
Yes. The answer to this question is given in the next example.

Example 2.8 Consider a continuous random variable X whose p.d.f. f(x) satisfies \int_{-\infty}^{\infty} f(x)\,dx = 1 but whose tails decay so slowly that \int_{-\infty}^{\infty} |x|\,f(x)\,dx diverges. Then, since \int_{-\infty}^{\infty} f(x)\,dx = 1, f is a p.d.f. of X. We have

    \int_{-\infty}^{\infty} |x|\,f(x)\,dx = \infty,

i.e., E(X) is not finite.
In the next example, we will show that even when the mean E(X) is finite, the second moment E(X^2) may not be finite, i.e., the variance does not exist.

Example 2.9 Let X be a continuous random variable with p.d.f. f(x) for which \int_{-\infty}^{\infty} |x|\,f(x)\,dx converges. Therefore, E(X) is finite. But

    \int_{-\infty}^{\infty} x^2\,f(x)\,dx = \infty,

so E(X^2) is not finite.
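Concrete densities with the behaviour described in Examples 2.8 and 2.9 can be checked symbolically. The following minimal Python sketch uses the assumed densities f(x) = 1/x^2 and f(x) = 2/x^3 on [1, oo); both formulas are illustrative assumptions, not necessarily the densities used above.

    # Minimal sketch: convergence/divergence of moments for two assumed densities.
    import sympy as sp

    x = sp.symbols('x', positive=True)

    # Assumed density 1: f(x) = 1/x**2 on [1, oo). Total mass 1, but E(X) diverges.
    f1 = 1 / x**2
    print(sp.integrate(f1, (x, 1, sp.oo)))            # 1  -> valid p.d.f.
    print(sp.integrate(x * f1, (x, 1, sp.oo)))        # oo -> E(X) is not finite

    # Assumed density 2: f(x) = 2/x**3 on [1, oo). E(X) is finite, E(X**2) diverges.
    f2 = 2 / x**3
    print(sp.integrate(f2, (x, 1, sp.oo)))            # 1  -> valid p.d.f.
    print(sp.integrate(x * f2, (x, 1, sp.oo)))        # 2  -> E(X) = 2
    print(sp.integrate(x**2 * f2, (x, 1, sp.oo)))     # oo -> E(X**2) is not finite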
Next, we give some properties of expectation in terms of the following results.

Theorem 2.10 Let g_1 and g_2 be two real-valued functions and let a and b be arbitrary real numbers. Then

    E[a\,g_1(X) + b\,g_2(X)] = a\,E[g_1(X)] + b\,E[g_2(X)].    ... (1)

Proof. Suppose that X is a continuous random variable with p.d.f. f(x). Then, using the definition, we have

    E[a\,g_1(X) + b\,g_2(X)] = \int_{-\infty}^{\infty} [a\,g_1(x) + b\,g_2(x)]\,f(x)\,dx
                             = a \int_{-\infty}^{\infty} g_1(x)\,f(x)\,dx + b \int_{-\infty}^{\infty} g_2(x)\,f(x)\,dx
                             = a\,E[g_1(X)] + b\,E[g_2(X)].

Corollary 2.11 If a and b are constants, then E(aX + b) = a\,E(X) + b.
Proof. By replacing g_1(X) by X and g_2(X) by 1 in (1) of Theorem 2.10, we obtain the required result.

Corollary 2.12 If b is constant, then E(b) = b.
Proof. If we take a = 0 in Corollary 2.11, then E(b) = b.

Corollary 2.13 If a is constant, then E(aX) = a\,E(X).
Proof. If we take b = 0 in Corollary 2.11, then E(aX) = a\,E(X).

Theorem 2.14 If X \ge 0, then E(X) \ge 0.
Proof. Let X be a continuous random variable with p.d.f. f(x). Since X \ge 0, we have f(x) = 0 for x < 0, so that the integrand x\,f(x) is non-negative. Hence, provided E(X) exists,

    E(X) = \int_{0}^{\infty} x\,f(x)\,dx \ge 0.
Theorem 2.15 The expected value of a bounded random variable always exists.
Proof. Let X be a continuous random variable with p.d.f. f(x). Since X is given to be bounded, |X| \le M for some constant M > 0. Now, consider

    \int_{-\infty}^{\infty} |x|\,f(x)\,dx \le M \int_{-\infty}^{\infty} f(x)\,dx = M < \infty.

If X is discrete and bounded, i.e., |x_i| \le M for all i, we have

    \sum_{i} |x_i|\,p(x_i) \le M \sum_{i} p(x_i) = M < \infty.

Thus, in either case, the defining series or integral is absolutely convergent and therefore the expectation necessarily exists.

Theorem 2.16 Let X be a random variable for which E(X) exists; then |E(X)| \le E(|X|).
Proof. Since -|x| \le x \le |x| for all x, we have -|x|\,f(x) \le x\,f(x) \le |x|\,f(x) for all x (and, in the discrete case, -|x|\,p(x) \le x\,p(x) \le |x|\,p(x) for all x). This gives

    -E(|X|) \le E(X) \le E(|X|).

Thus, |E(X)| \le E(|X|).
Solved Problems

Problem 1 Prove that the expected value E(X) is not defined for each of the following random variables:
a) a discrete random variable X whose p.m.f. assigns to the value x a probability p(x) proportional to 1/x^2, for x = 1, 2, 3, \ldots;
b) a continuous random variable X whose p.d.f. f(x) has tails of order 1/x^2.

Solution. a) We have

    \sum_{x=1}^{\infty} x\,p(x) = c \sum_{x=1}^{\infty} \frac{1}{x},   for some constant c > 0.

The p-series \sum_{n=1}^{\infty} 1/n^{p} is divergent, i.e., not convergent, if p \le 1. Hence the above sum does not converge, E(X) does not exist and therefore E(X) is not defined.

b) We have

    \int_{-\infty}^{\infty} |x|\,f(x)\,dx = \infty.

Since the integral does not converge, E(X) does not exist for the given p.d.f.
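For concreteness, the following minimal Python sketch carries out the two computations of Problem 1 for the assumed choices p(x) = 6/(pi^2 x^2), x = 1, 2, ..., and the standard Cauchy density f(x) = 1/[pi(1 + x^2)]; both choices are assumptions made here for illustration.

    # Minimal sketch: two distributions whose expectation is not defined (assumed examples).
    import sympy as sp

    k = sp.symbols('k', integer=True, positive=True)
    t = sp.symbols('t', real=True)

    # a) Assumed p.m.f. p(k) = 6/(pi**2 * k**2), k = 1, 2, ...
    p = 6 / (sp.pi**2 * k**2)
    print(sp.summation(p, (k, 1, sp.oo)))        # 1  -> valid p.m.f.
    print(sp.summation(k * p, (k, 1, sp.oo)))    # oo -> E(X) is not defined

    # b) Assumed p.d.f. of the standard Cauchy distribution, f(t) = 1/(pi*(1 + t**2)).
    f = 1 / (sp.pi * (1 + t**2))
    print(sp.integrate(f, (t, -sp.oo, sp.oo)))   # 1  -> valid p.d.f.
    # By symmetry, E(|X|) = 2 * integral of t*f(t) over [0, oo).
    print(2 * sp.integrate(t * f, (t, 0, sp.oo)))  # oo -> E(X) is not defined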
Problem 2 Let X be a random variable with p.d.f. f(x) depending on two unknown constants on a given interval, with f(x) = 0 otherwise.
(i) Given the value of E(X), find the two constants.
(ii) Using these constants, find E(X^2) and Var(X).

Solution. (i) Since f is a p.d.f., we have

    \int_{-\infty}^{\infty} f(x)\,dx = 1,    ... (1)

and, from the given value of the mean,

    \int_{-\infty}^{\infty} x\,f(x)\,dx = E(X).    ... (2)

From (1) and (2), we have the values of the two constants.
(ii) With these constants, E(X^2) = \int_{-\infty}^{\infty} x^2 f(x)\,dx and Var(X) = E(X^2) - [E(X)]^2 follow by direct integration.
Problem 3 Let X be a discrete random variable with p.m.f. p(x) given by a table of values. Then, find the expected value of a given function g(X) of X.

Solution. Using Theorem 2.5, we have

    E[g(X)] = \sum_{x} g(x)\,p(x),

which is evaluated term by term from the table.
3. Some Special Expectations
In this section, we will study the r-th order moments of a random variable, including the variance and standard deviation, their properties and some useful results, followed by various examples and solved problems.
Definition 3.1 The r-th order moment of a random variable X about a constant c, denoted by \mu_r', is defined by

    \mu_r' = E[(X - c)^r],   for r = 1, 2, 3, \ldots,

i.e.,

    \mu_r' = \int_{-\infty}^{\infty} (x - c)^r f(x)\,dx   when X is continuous with p.d.f. f(x),   and
    \mu_r' = \sum_{x} (x - c)^r p(x)   when X is discrete with p.m.f. p(x).

Recall that E[(X - c)^r] exists if and only if E[|X - c|^r] exists.
If c = 0, then \mu_r' = E(X^r) is called the r-th order moment about the origin, and in particular if r = 1, then \mu_1' = E(X) is known as the mean of X or the expected value of X.
Definition 3.2 The r-th order moment of a random variable X about the mean \mu, denoted by \mu_r, is defined by

    \mu_r = E[(X - \mu)^r],   for r = 1, 2, 3, \ldots,

i.e.,

    \mu_r = \int_{-\infty}^{\infty} (x - \mu)^r f(x)\,dx   when X is continuous with p.d.f. f(x),   and
    \mu_r = \sum_{x} (x - \mu)^r p(x)   when X is discrete with p.m.f. p(x).

Note that if r = 1, then \mu_1 = E(X - \mu) = 0, and if r = 2, then \mu_2 = E[(X - \mu)^2], which is the variance of X defined next.
Definition 3.3 Let X be a random variable with mean \mu; then the variance of X, written as Var(X) or \sigma^2, is defined as

    Var(X) = \mu_2 = E[(X - \mu)^2].

The positive square root \sigma of the variance is called the standard deviation of X. Thus, if X is discrete with p.m.f. p(x), then

    Var(X) = \sum_{x} (x - \mu)^2 p(x),

and if X is continuous with p.d.f. f(x), then

    Var(X) = \int_{-\infty}^{\infty} (x - \mu)^2 f(x)\,dx.

Next, we give some properties of Var(X) in the form of the following result.

Theorem 3.4 Let X be a random variable with mean \mu, and let a and b be constants. Then
(i) Var(X) = E(X^2) - \mu^2;
(ii) Var(X + b) = Var(X);
(iii) Var(aX + b) = a^2\,Var(X).

Proof.
(i) Consider

    Var(X) = E[(X - \mu)^2] = E(X^2 - 2\mu X + \mu^2) = E(X^2) - 2\mu\,E(X) + \mu^2 = E(X^2) - \mu^2   (using linearity of E).

(ii) Consider Y = X + b, another random variable with mean \mu + b. Then

    Var(X + b) = E[(X + b - (\mu + b))^2] = E[(X - \mu)^2] = Var(X)   (using linearity of E and Definition 3.3).

(iii) Consider Y = aX + b, another random variable with mean a\mu + b, where a and b are constants. This gives

    Var(aX + b) = E[(aX + b - (a\mu + b))^2] = E[a^2 (X - \mu)^2] = a^2\,Var(X)   (using linearity of E and Definition 3.3).
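The identities of Theorem 3.4 can be confirmed numerically. The following minimal Python sketch assumes a fair-die p.m.f. and the constants a = 3, b = 5 purely for illustration.

    # Minimal sketch: checking Var(X) = E(X^2) - mu^2 and Var(aX + b) = a^2 Var(X).
    from fractions import Fraction

    pmf = {k: Fraction(1, 6) for k in range(1, 7)}   # assumed p.m.f. of a fair die
    a, b = 3, 5                                      # assumed constants

    def expectation(g, pmf):
        """E[g(X)] for a discrete X, as in Theorem 2.5."""
        return sum(g(x) * p for x, p in pmf.items())

    mu = expectation(lambda x: x, pmf)
    var_def = expectation(lambda x: (x - mu) ** 2, pmf)          # E[(X - mu)^2]
    var_alt = expectation(lambda x: x ** 2, pmf) - mu ** 2       # E(X^2) - mu^2

    mu_y = expectation(lambda x: a * x + b, pmf)
    var_y = expectation(lambda x: (a * x + b - mu_y) ** 2, pmf)  # Var(aX + b)

    print(var_def, var_alt)          # 35/12 35/12
    print(var_y, a * a * var_def)    # 105/4 105/4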
Value Addition: Minimal Property of Variance
Consider, for any constant c,

    E[(X - c)^2] = E[(X - \mu + \mu - c)^2] = E[(X - \mu)^2] + 2(\mu - c)\,E(X - \mu) + (\mu - c)^2 = Var(X) + (\mu - c)^2.

This gives

    E[(X - c)^2] \ge Var(X),

i.e., E[(X - c)^2] is minimum when c = \mu. Thus, Var(X) is the smallest second order moment: E[(X - c)^2] is minimized by taking c = \mu.
In the next theorem, we will show that the probability that a random variable X takes on a value within k standard deviations of its mean is at least 1 - 1/k^2. In other words, we will establish that the variance, or the standard deviation, tells us about the spread or dispersion of the distribution of a random variable.

Chebyshev's Theorem

Theorem 3.5 Let \mu and \sigma be the mean and the standard deviation of a random variable X with p.d.f. f(x). Then, for any constant k > 0, we have

    P(|X - \mu| < k\sigma) \ge 1 - \frac{1}{k^2}.

Proof. Consider

    \sigma^2 = E[(X - \mu)^2] = \int_{-\infty}^{\infty} (x - \mu)^2 f(x)\,dx
             = \int_{|x - \mu| < k\sigma} (x - \mu)^2 f(x)\,dx + \int_{|x - \mu| \ge k\sigma} (x - \mu)^2 f(x)\,dx.

Since (x - \mu)^2 f(x) \ge 0,

    \sigma^2 \ge \int_{|x - \mu| \ge k\sigma} (x - \mu)^2 f(x)\,dx.

Now, since (x - \mu)^2 \ge k^2\sigma^2 for x \le \mu - k\sigma or x \ge \mu + k\sigma, it therefore follows that

    \sigma^2 \ge k^2\sigma^2 \int_{|x - \mu| \ge k\sigma} f(x)\,dx = k^2\sigma^2\,P(|X - \mu| \ge k\sigma).

This gives

    P(|X - \mu| \ge k\sigma) \le \frac{1}{k^2},   provided that \sigma > 0.

Thus,

    P(|X - \mu| < k\sigma) = 1 - P(|X - \mu| \ge k\sigma),

and hence it follows that

    P(|X - \mu| < k\sigma) \ge 1 - \frac{1}{k^2}.
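The following minimal Python sketch compares the Chebyshev bound with an empirical probability for an assumed exponential distribution and k = 2; both the distribution and the value of k are illustrative assumptions.

    # Minimal sketch: Chebyshev's bound versus an empirical probability for an assumed distribution.
    import numpy as np

    rng = np.random.default_rng(0)
    k = 2.0                                                # assumed number of standard deviations
    sample = rng.exponential(scale=1.0, size=1_000_000)    # assumed distribution: Exp(1)

    mu, sigma = sample.mean(), sample.std()
    within = np.mean(np.abs(sample - mu) < k * sigma)      # empirical P(|X - mu| < k*sigma)

    print(f"empirical probability: {within:.4f}")          # roughly 0.95 for Exp(1)
    print(f"Chebyshev lower bound: {1 - 1/k**2:.4f}")      # 0.75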
Example 3.6 For a given k > 1, consider a discrete random variable X whose p.m.f. is given by

    p(-1) = \frac{1}{2k^2},   p(0) = 1 - \frac{1}{k^2},   p(1) = \frac{1}{2k^2}.

Now, we have

    \mu = E(X) = (-1)\cdot\frac{1}{2k^2} + 0\cdot\Big(1 - \frac{1}{k^2}\Big) + 1\cdot\frac{1}{2k^2} = 0.

Further,

    \sigma^2 = E(X^2) = \frac{1}{2k^2} + \frac{1}{2k^2} = \frac{1}{k^2},   so that   \sigma = \frac{1}{k}.

Using Chebyshev's Theorem, we have

    P(|X - \mu| < k\sigma) = P(|X| < 1) \ge 1 - \frac{1}{k^2}.    ... (1)

Also, we have, directly from the p.m.f.,

    P(|X| < 1) = P(X = 0) = 1 - \frac{1}{k^2}.    ... (2)

Since the results given by (1) and (2) coincide, Chebyshev's inequality cannot be improved.
Some Solved Problems
Problem 4 If E(X^2) exists, then show that E(X) exists and [E(X)]^2 \le E(X^2).

Solution. Since |x| \le 1 + x^2 for every real x, using the p.d.f. f(x) we have

    \int_{-\infty}^{\infty} |x|\,f(x)\,dx \le \int_{-\infty}^{\infty} (1 + x^2)\,f(x)\,dx = 1 + E(X^2) < \infty.

This gives that \int_{-\infty}^{\infty} |x|\,f(x)\,dx is convergent. Thus, E(X) exists.

By the Minimal Property of Variance, we have E[(X - c)^2] \ge Var(X) \ge 0 for every constant c. Let c = 0; then it follows that

    E(X^2) \ge Var(X) = E(X^2) - [E(X)]^2,

and since Var(X) \ge 0, this gives [E(X)]^2 \le E(X^2).
Problem 5 Let the distribution of X depend on a parameter. Then, find the value of the parameter for which Var(X) is maximum.

Solution. We first compute E(X) = \sum_x x\,p(x) and E(X^2) = \sum_x x^2 p(x) as functions of the parameter. Using Theorem 3.4, we have

    Var(X) = E(X^2) - [E(X)]^2.

Thus, Var(X) is maximum at the value of the parameter for which E(X^2) - [E(X)]^2 attains its largest value.
Problem 6 Let X be a random variable such that E(X) and E(X^2) are given. Then, show that Chebyshev's Theorem yields a lower bound for the probability that X lies in a given interval centred at E(X).

Solution. Consider

    \sigma^2 = Var(X) = E(X^2) - [E(X)]^2.

Now, let k be chosen so that k\sigma equals the half-width of the given interval. Using Chebyshev's Theorem and on taking this value of k, we have

    P(|X - E(X)| < k\sigma) \ge 1 - \frac{1}{k^2}.

Thus, the probability that X lies in the given interval is at least 1 - 1/k^2. Therefore, the required bound follows.
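As a concrete worked instance of Problem 6, the following minimal Python sketch assumes the values E(X) = 3 and E(X^2) = 13 (chosen here purely for illustration) and computes the Chebyshev bound for the interval -2 < X < 8.

    # Minimal sketch of Problem 6 with assumed values E(X) = 3 and E(X^2) = 13.
    from fractions import Fraction
    from math import sqrt

    EX = Fraction(3)          # assumed E(X)
    EX2 = Fraction(13)        # assumed E(X^2)

    var = EX2 - EX**2         # Var(X) = E(X^2) - [E(X)]^2 = 4
    sigma = sqrt(var)         # standard deviation = 2.0

    # The interval -2 < X < 8 is |X - 3| < 5, i.e. k*sigma = 5, so k = 5/2.
    k = 5 / sigma
    bound = 1 - 1 / k**2      # Chebyshev lower bound

    print(var, sigma, k, bound)   # 4, 2.0, 2.5 and a bound of approximately 0.84 (= 21/25)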
4. Moment Generating Function
In this section, we will study an alternative method for calculating the moments of discrete and continuous distributions. This method employs moment generating functions. Properties and examples of the moment generating function are also given.
Definition 4.1 The moment generating function (m.g.f.) of a random variable X about the point x = a, denoted by M_X(t) (about a), is given, where it exists, by

    M_X(t)\ (\text{about } a) = E\big[e^{t(X - a)}\big].

About the origin, i.e., at a = 0, the m.g.f. is defined as

    M_X(t) = E(e^{tX}) = \sum_{x} e^{tx}\,p(x)   if X is a discrete random variable with p.m.f. p(x),   and
    M_X(t) = E(e^{tX}) = \int_{-\infty}^{\infty} e^{tx}\,f(x)\,dx   if X is a continuous random variable with p.d.f. f(x).
Why do we call this function the moment generating function?
Let us replace e^{tx} in the formula for the m.g.f. of the continuous random variable X by the Maclaurin's series expansion of e^{tx}, given by

    e^{tx} = 1 + tx + \frac{t^2 x^2}{2!} + \cdots + \frac{t^r x^r}{r!} + \cdots;

then

    M_X(t) = \int_{-\infty}^{\infty} \Big(1 + tx + \frac{t^2 x^2}{2!} + \cdots\Big) f(x)\,dx
           = 1 + t\,E(X) + \frac{t^2}{2!} E(X^2) + \cdots + \frac{t^r}{r!} E(X^r) + \cdots.    ... (1)

Now, note that the coefficient of t^r/r! in the Maclaurin's series expansion of the m.g.f. of X is E(X^r) = \mu_r', i.e., the r-th moment about the origin.
Thus, we see that the function M_X(t) given by (1) generates moments, and that is why it is called the moment generating function. The same argument can be given for the discrete case.
In the next result, we will show that the r-th derivative of the moment generating function with respect to t, evaluated at t = 0, is the same as the coefficient of t^r/r! in the Maclaurin's series expansion of the moment generating function of X, namely \mu_r' = E(X^r), the r-th moment about the origin.

Theorem 4.2 Let M_X(t), the moment generating function associated with the variate X, exist; then

    \mu_r' = E(X^r) = \Big[\frac{d^r}{dt^r} M_X(t)\Big]_{t = 0}.

Proof. Since M_X(t) exists, M_X(t) is continuously differentiable in some neighborhood of the origin. Then, using (1), we have

    M_X(t) = 1 + t\,\mu_1' + \frac{t^2}{2!}\mu_2' + \cdots + \frac{t^r}{r!}\mu_r' + \cdots.

On differentiating r times w.r.t. t, we have

    \frac{d^r}{dt^r} M_X(t) = \mu_r' + t\,\mu_{r+1}' + \frac{t^2}{2!}\mu_{r+2}' + \cdots.

Taking t = 0, we have

    \Big[\frac{d^r}{dt^r} M_X(t)\Big]_{t = 0} = \mu_r'.

Hence \mu_r' = E(X^r) = \big[d^r M_X(t)/dt^r\big]_{t = 0}.
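Theorem 4.2 can be illustrated with a computer algebra system. The following minimal Python sketch assumes a Bernoulli(p) p.m.f. (an illustrative choice), builds M_X(t) directly from the definition and recovers the first two moments by differentiation at t = 0.

    # Minimal sketch of Theorem 4.2: moments of an assumed Bernoulli(p) variable from its m.g.f.
    import sympy as sp

    t, p = sp.symbols('t p')

    # Assumed p.m.f.: P(X = 1) = p, P(X = 0) = 1 - p, so M_X(t) = E(e^{tX}).
    M = (1 - p) + p * sp.exp(t)

    mu1 = sp.diff(M, t, 1).subs(t, 0)    # E(X)   = p
    mu2 = sp.diff(M, t, 2).subs(t, 0)    # E(X^2) = p

    print(mu1, mu2, mu2 - mu1**2)        # p, p, p - p**2, i.e. Var(X) = p(1 - p)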
Note that the above theorem serves as a convenient method of calculating moments.
Next, we give some properties of the moment generating function (m.g.f.).

Theorem 4.3 Let M_X(t), the moment generating function associated with the variate X, exist, and let c be any constant; then
(i) M_{cX}(t) = M_X(ct);
(ii) M_{X + c}(t) = e^{ct}\,M_X(t).

Proof. (i) We have M_{cX}(t) = E\big(e^{t(cX)}\big) = E\big(e^{(ct)X}\big). Now, the right-hand side is M_X(ct), so M_{cX}(t) = M_X(ct).
(ii) We have M_{X + c}(t) = E\big(e^{t(X + c)}\big) = e^{ct}\,E(e^{tX}) = e^{ct}\,M_X(t).
Theorem 4.4 Let X and Y be two independent random variables; then

    M_{X + Y}(t) = M_X(t)\,M_Y(t).

Proof. Consider

    M_{X + Y}(t) = E\big(e^{t(X + Y)}\big) = E\big(e^{tX} e^{tY}\big) = E(e^{tX})\,E(e^{tY})   (by independence).

Hence M_{X + Y}(t) = M_X(t)\,M_Y(t).
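The following minimal Python sketch checks Theorem 4.4 symbolically for two assumed independent Bernoulli(1/2) (fair-coin) variables.

    # Minimal sketch of Theorem 4.4 for two assumed independent Bernoulli(1/2) variables.
    import sympy as sp

    t = sp.symbols('t')
    half = sp.Rational(1, 2)

    def mgf(pmf):
        """M(t) = E(e^{tX}) for a discrete p.m.f. given as {value: probability}."""
        return sum(prob * sp.exp(t * value) for value, prob in pmf.items())

    pmf_X = {0: half, 1: half}
    pmf_Y = {0: half, 1: half}

    # p.m.f. of X + Y under independence: P(X + Y = s) = sum over x of P(X = x) P(Y = s - x).
    pmf_sum = {}
    for x_val, px in pmf_X.items():
        for y_val, py in pmf_Y.items():
            pmf_sum[x_val + y_val] = pmf_sum.get(x_val + y_val, 0) + px * py

    lhs = sp.expand(mgf(pmf_sum))
    rhs = sp.expand(mgf(pmf_X) * mgf(pmf_Y))
    print(sp.simplify(lhs - rhs) == 0)   # True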
What is the effect of a change of origin and scale on the moment generating function M_X(t)?
Consider the transformation U = (X - a)/h (where a and h are constants, h \ne 0) corresponding to a change of origin and scale. Then, the m.g.f. of U is given by

    M_U(t) = E\big(e^{tU}\big) = E\big(e^{t(X - a)/h}\big) = e^{-at/h}\,E\big(e^{(t/h)X}\big) = e^{-at/h}\,M_X(t/h).

Thus,

    M_U(t) = e^{-at/h}\,M_X(t/h).
Next, we give an example of a random variable with p.d.f. f(x) such that M_X(t) does not exist.

Example 4.5 Consider a continuous random variable X whose p.d.f. f(x) has tails that decay only polynomially (for instance, like 1/x^2, as for the Cauchy density). Now, \int_{-\infty}^{\infty} f(x)\,dx = 1, so f is indeed a p.d.f. The m.g.f. of X is given by

    M_X(t) = \int_{-\infty}^{\infty} e^{tx} f(x)\,dx.

Clearly, this integral diverges for every t \ne 0, since e^{tx} grows faster than any polynomial decays; hence M_X(t) does not exist for any t \ne 0.
Let us see another example of an m.g.f.

Example 4.6 Let us toss a fair coin until the first head appears; then the sample space S is of the form

    S = \{H,\ TH,\ TTH,\ TTTH,\ \ldots\}.

If X denotes the number of tosses required, then X takes the values 1, 2, 3, \ldots. Clearly,

    P(X = 1) = \frac{1}{2},   P(X = 2) = \frac{1}{4},   P(X = 3) = \frac{1}{8},\ \ldots

Note that each additional tail before the first head halves the probability. So, the probability mass function of X is given by

    p(k) = P(X = k) = \Big(\frac{1}{2}\Big)^k,   k = 1, 2, 3, \ldots

Now, the m.g.f. of X is given by

    M_X(t) = E(e^{tX}) = \sum_{k=1}^{\infty} e^{tk}\Big(\frac{1}{2}\Big)^k = \sum_{k=1}^{\infty} \Big(\frac{e^t}{2}\Big)^k = \frac{e^t/2}{1 - e^t/2} = \frac{e^t}{2 - e^t},   for e^t < 2, i.e., t < \ln 2.
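A minimal Python sketch confirming the closed form obtained in Example 4.6 and, via Theorem 4.2, the expected number of tosses; the fair-coin model assumed above is the one being checked, and the evaluation point t = 0.1 is an arbitrary illustrative choice.

    # Minimal sketch: checking M_X(t) = e^t/(2 - e^t) for the fair-coin example, and E(X) = 2.
    import math
    import sympy as sp

    t = sp.symbols('t')
    M_closed = sp.exp(t) / (2 - sp.exp(t))   # claimed closed form, valid for e^t < 2

    # Numerical check of the series sum_{k>=1} (e^t / 2)^k at the assumed point t = 0.1.
    t0 = 0.1
    r = math.exp(t0) / 2
    series_value = sum(r ** k for k in range(1, 200))
    print(series_value, float(M_closed.subs(t, t0)))   # both approximately 1.2351

    # Theorem 4.2: E(X) = M'(0), the expected number of tosses.
    print(sp.diff(M_closed, t).subs(t, 0))             # 2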
Value Addition
Note that there are several distributions for which the moment generating function does not exist, but there exists, for every distribution, a function of the form

    \varphi_X(t) = E\big(e^{itX}\big)   (where i denotes the imaginary unit and t is an arbitrary real number).

Such a function is known as the characteristic function of the distribution.
If X is continuous, we have

    \varphi_X(t) = \int_{-\infty}^{\infty} e^{itx} f(x)\,dx.

Now, since the p.d.f. f is non-negative and |e^{itx}| = 1,

    |\varphi_X(t)| \le \int_{-\infty}^{\infty} |e^{itx}|\,f(x)\,dx = \int_{-\infty}^{\infty} f(x)\,dx = 1.

This gives |\varphi_X(t)| \le 1 < \infty. Thus, \varphi_X(t) exists for all t. A similar argument applies to the discrete case.
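The characteristic function exists even for distributions whose m.g.f. does not, such as the standard Cauchy distribution. The following minimal Python sketch estimates \varphi_X(t) = E(e^{itX}) by simulation for the standard Cauchy and compares it with its known closed form e^{-|t|}; the choice of distribution, sample size and evaluation points are illustrative assumptions.

    # Minimal sketch: the characteristic function of a standard Cauchy variable exists and is bounded by 1.
    import numpy as np

    rng = np.random.default_rng(0)
    sample = rng.standard_cauchy(size=1_000_000)    # assumed distribution: standard Cauchy

    for t in (0.5, 1.0, 2.0):                       # assumed evaluation points
        phi_hat = np.mean(np.exp(1j * t * sample))  # Monte Carlo estimate of E(e^{itX})
        print(t, abs(phi_hat), np.exp(-abs(t)))     # estimate vs the exact value e^{-|t|}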
Exercises
1. Let X denote the absolute difference of the upturned faces in the experiment of tossing two dice. Find E(X) and Var(X).
2. Let the p.d.f. of X be given by f(x) = ... . Find E(X) and Var(X).
3. A person draws cards one by one from a pack until he draws all the aces. How many cards may he be expected to draw?
4. Show that the expected number of throws of a coin necessary to produce heads is ... .
5. The p.m.f. of a variate X is given by ... . Find the M.G.F. of X and hence obtain E(X) and Var(X).
6. In a game of chance, a man is allowed to throw a coin indefinitely. He receives Rs. ... if he throws a head at the first, second, third, ... trials respectively. If the entry fee to participate in the game is Rs. ..., then show that the expected value of the net gain is zero.
7. Show that if a random variable is bounded, it has moments of every order.
8. Let the p.d.f. of X be given by f(x) = ... . Does E(X) exist?
9. Let f(x) = ... ; then
   i. find the constant for which f is a probability density function;
   ii. find the cumulative distribution function;
   iii. find E(X).
10. Suppose that a pair of dice is thrown once. If X denotes the sum of the numbers showing up, then prove the bound on P(|X - 7| \ge k\sigma) given by Chebyshev's Theorem. Compare this value with the exact probability.
11. Suppose we toss two balls into five bags in such a way that each ball is equally likely to fall into any bag. If X denotes the number of balls in the first bag, then
   i. what is the density function of X?
   ii. find the mean and variance of X.
   iii. find the m.g.f. of X.
12. Can the function M(t) = ... be the M.G.F. of some random variable?
References
1. Robert V. Hogg, Joseph W. McKean and Allen T. Craig, Introduction to Mathematical
Statistics, Pearson Education, Asia, 2007.
2. Irwin Miller and Marylees Miller, John E. Freund’s Mathematical Statistics
with Applications (7th Edition), Pearson Education, Asia, 2006.
3. Sheldon Ross, Introduction to Probability Models (9th Edition), Academic Press,
Indian Reprint, 2007.