Expected Values of Random Variables
Examples will be for discrete RVs. General principles extend to continuous RVs.
Definition:
µ_X = E(X) = Σ_x x p(x) = Σ_x x P{X = x}, where the sum is taken over all values of x.
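As a quick numerical check of this definition in R (using a fair six-sided die, an example not in the transcript):
x <- 1:6            # possible values of X
p <- rep(1/6, 6)    # p(x) for a fair die
sum(x * p)          # E(X)
[1] 3.5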
"Law of Unconscious Statistician":
Y = g(X) is a RV.
µ_Y = E(Y) = E(g(X)) = Σ_x g(x) p(x) = Σ_x g(x) P{X = x}.
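Continuing the die example in R, E(g(X)) for g(x) = x^2 is the same weighted sum with g(x) in place of x:
sum(x^2 * p)        # E(X^2), by the law above
[1] 15.16667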
Particular functions g of interest:
g(x) = x^2:
µ_2' = E(X^2) is called the second moment of X.
g(x) = (x – µ)^2:
µ_2 = E[(X – µ)^2] = V(X),
also called the second moment about the mean.
Recall: V(X) = E(X^2) – µ^2.
Similarly, for any k, g(x) = x^k gives the kth moment, µ_k' = E(X^k).
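As a numerical check of the identity V(X) = E(X^2) – µ^2, again with the die values x and p from above:
mu <- sum(x * p)         # µ = E(X)
sum((x - mu)^2 * p)      # E[(X - µ)^2], the variance directly
[1] 2.916667
sum(x^2 * p) - mu^2      # E(X^2) - µ^2, the same value
[1] 2.916667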
Example: X ~ GEOM(1/2).
X is the number of tosses of a fair coin until we see the first Head.
p(x) = 1/2^x, for x = 1, 2, 3, ... .
How do we know this is a "real" distribution?
That is, how do we know that the infinite series
A = Σ_x p(x) = 1/2 + 1/4 + 1/8 + ... converges to 1?
In R, we get a pretty good indication that the sum is 1:
x <- 1:20
sum(1/2^x)
[1] 0.999999
Analytic answer: Sum the geometric series: A = Σ_x (1/2)^x = (1/2)/(1 – 1/2) = 1.
Expectations are not always easy to find.
Here an analytic solution for E(X) = Σ_x x/2^x = 1/2 + 2/4 + 3/8 + 4/16 + ...
is not so easy and will be postponed for the moment.
Intuitively, it seems the answer should be 2.
In R, we can get an idea this must be right:
x <- 1:20
sum(x/2^x)
[1] 1.999979
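A simulation points the same way. Note that R's rgeom() counts the failures before the first success, so our X corresponds to rgeom(...) + 1:
mean(rgeom(10^6, 1/2) + 1)    # sample mean of a million simulated values
The result is close to 2, varying slightly from run to run.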
Expectations do not always exist.
Let Y = g(X) = 2^X.
E(Y) = E(2^X) = Σ_x 2^x p(x) = 2(1/2) + 4(1/4) + 8(1/8) + ... = 1 + 1 + 1 + ...,
which diverges to ∞.
Let U = h(X) = (–2)^X.
E(U) = E((–2)^X) = Σ_x (–2)^x p(x) = –2(1/2) + 4(1/4) – 8(1/8) + ... = –1 + 1 – 1 + ...,
which does not converge at all.
Important technicality: We say that E(X) exists only if E(|X|) = Σ_x |x| p(x) is finite.
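In R, the partial sums give a quick sketch of both kinds of failure:
x <- 1:8
cumsum(2^x / 2^x)        # partial sums for E(2^X): grow without bound
[1] 1 2 3 4 5 6 7 8
cumsum((-2)^x / 2^x)     # partial sums for E((-2)^X): oscillate forever
[1] -1  0 -1  0 -1  0 -1  0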
Taylor (Maclaurin) expansion of e^x:
e^x = Σ_i x^i / i!, where the sum is taken over i = 0, 1, 2, ... .
Supposedly, this is proved in Calculus III.
For example: e = 1/0! + 1/1! + 1/2! + 1/3! + 1/4! + ... = 1 + 1 + 1/2 + 1/6 + 1/24 + ...
In R, we can get a pretty good idea it's right:
x <- 0:20
sum(1/factorial(x))
[1] 2.718282
exp(1)
[1] 2.718282
# In earlier R, use gamma(x+1)
Similarly,
e^2 = 2^0/0! + 2^1/1! + 2^2/2! + 2^3/3! + 2^4/4! + 2^5/5! + ...
= 1 + 2 + 4/2 + 8/6 + 16/24 + 32/120 + ... .
In R:
sum(2^x/factorial(x))
[1] 7.389056
exp(2)
[1] 7.389056
Outline: Expectation of a Geometric RV:
Let X ~ GEOM(p). We want to show (analytically) that E(X) = 1/p.
Already demonstrated in R for p = 1/2.
Consider the random variables g_t(X) = e^(tX).
By summing a geometric series we can show that
m_X(t) = E(e^(tX)) = Σ_x e^(tx) q^(x–1) p = p e^t / (1 – q e^t), where q = 1 – p,
for t in a neighborhood of t = 0.
(Essentially, we need q e^t < 1, so the series converges and the denominator is positive.)
Using the expansion of e^x, m_X(t) = E(e^(tX)) = Σ_i µ_i' t^i / i!, so the derivative at t = 0 picks out the first moment:
m_X'(0) = [d m_X(t) / dt]_(t=0) = E(X).
Taking the derivative of p e^t / (1 – q e^t) gives m_X'(t) = p e^t / (1 – q e^t)^2,
and setting t = 0 gives p/(1 – q)^2 = p/p^2 = 1/p.
Hence E(X) = 1/p.
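We can check the differentiation in R, which computes symbolic derivatives with D() (a sketch, again with p = 1/2):
m <- expression(p * exp(t) / (1 - q * exp(t)))
dm <- D(m, "t")              # d m_X(t) / dt
p <- 1/2; q <- 1 - p; t <- 0
eval(dm)                     # m_X'(0) = E(X) = 1/p
[1] 2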
Moment generating functions (MGFs).
In general, the MGF of X is defined as m_X(t) = E(e^(tX)),
whenever this expectation exists for t in a neighborhood of t = 0.
Moreover, µ_k' = E(X^k) = m_X^[k](0),
where [k] denotes taking the kth derivative with respect to t.
This is why m_X(t) = E(e^(tX)) is called the "moment generating function" of X.
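Continuing the symbolic sketch above, the second derivative at t = 0 returns the second moment, and a partial sum of Σ_x x^2 p(x) agrees (for p = 1/2, E(X^2) = (1 + q)/p^2 = 6):
d2m <- D(dm, "t")        # second derivative of the MGF
eval(d2m)                # µ_2' = E(X^2)
[1] 6
x <- 1:60
sum(x^2 / 2^x)           # direct partial sum of Σ x^2 p(x)
[1] 6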
Important facts about MGFs:
• MGFs assist in finding moments
• Uniqueness: No two distributions have the same MGF.
Thus the MGF is another way (in addition to PDF and CDF)
to encode the probability information of a distribution.
• If all of the moments µ_1', µ_2', µ_3', ... of a distribution are known,
the distribution is determined.
• MGFs can be used to find the distribution of the sum of two independent RVs.
• MGFs can be used to prove limit theorems.
We will illustrate the last two properties later in the course.