Expected Value and Variance
Mark Huiskes, LIACS
[email protected]
22/11/2006
Probability and Statistics, Mark Huiskes, LIACS, Lecture 7
Introduction
• In this class we will, finally, discuss expectation and variance.
• These are often-used concepts that summarize probability distributions: what outcome to expect, and how much it varies around that expectation.
• As usual we first look at the discrete case, then at the continuous one. For the discrete case we only look at variables with numerical values (for categorical ones, the expectation usually doesn't make sense).
Expected Value
• Let X be a numerically valued discrete rv with sample space
Ω and distribution function m(x). The expected value E(X) is
defined by
$$E(X) = \sum_{x \in \Omega} x\, m(x)$$
[provided that this sum converges; the expectation is often referred to as the mean, and denoted by µ]
• [Expectation can be interpreted as the average outcome: Law of Large Numbers; next time]
Examples
• Example 1: Throw a die: what is the expected outcome?
E(X) = (1 + 2 + 3 + 4 + 5 + 6)/6 = 7/2
• Example 2: Toss a coin three times: what is the expected number of heads that appear?
E(X) = 0 · 1/8 + 1 · 3/8 + 2 · 3/8 + 3 · 1/8 = 3/2
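A minimal Python sketch (my own addition, not part of the slides; the helper name expected_value is illustrative) that evaluates both examples directly from the definition:

```python
from fractions import Fraction

def expected_value(dist):
    """E(X) = sum of x * m(x) over the sample space."""
    return sum(x * p for x, p in dist.items())

# Example 1: a fair die
die = {x: Fraction(1, 6) for x in range(1, 7)}
print(expected_value(die))  # 7/2

# Example 2: number of heads in three coin tosses
heads = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}
print(expected_value(heads))  # 3/2
```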
Expectation of the Geometric Distribution
• T: time of the first success in a Bernoulli trials process with parameter p; sample space: Ω = {1, 2, 3, . . .}
• Distribution function: $m(j) = P(T = j) = q^{j-1} p$
• Expectation:
$$E(T) = 1 \cdot p + 2 \cdot qp + 3 \cdot q^2 p + \ldots = p(1 + 2q + 3q^2 + \ldots)$$
Differentiate the geometric series:
$$1 + s + s^2 + s^3 + \ldots = \frac{1}{1-s} = (1-s)^{-1}$$
$$1 + 2s + 3s^2 + \ldots = \frac{1}{(1-s)^2}$$
So
$$E(T) = \frac{p}{(1-q)^2} = \frac{p}{p^2} = \frac{1}{p}$$
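A quick numerical check of this derivation (my own addition): truncating the series $\sum_j j\, q^{j-1} p$ should approach 1/p.

```python
def geometric_expectation(p, terms=10_000):
    """Truncated sum of j * q**(j - 1) * p, which converges to 1/p."""
    q = 1 - p
    return sum(j * q ** (j - 1) * p for j in range(1, terms + 1))

print(geometric_expectation(0.25))  # ~4.0, i.e. 1/p
```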
Expectation of a Function of a Random Variable
X a random variable with sample space Ω and distribution function m(x); φ : Ω → ℝ a function. Then
$$E(\phi(X)) = \sum_{x \in \Omega} \phi(x)\, m(x)$$
Example: Ω = {−1, 1, 3} with m(−1) = 1/8, m(1) = 1/4, m(3) = 5/8.
What is the expectation of φ(X) = X²?
Method 1: first find the distribution of the new variable.
X² has possible outcomes 1 and 9, with probabilities 1/8 + 1/4 = 3/8 and 5/8. So
E(X²) = 1 · 3/8 + 9 · 5/8 = 48/8 = 6
Method 2: use the law (of the unconscious statistician):
E(X²) = (−1)² · 1/8 + 1² · 1/4 + 3² · 5/8 = 48/8 = 6
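Both methods are easy to mirror in code; a small Python sketch (my own addition; the names m and phi are illustrative):

```python
from collections import defaultdict
from fractions import Fraction

m = {-1: Fraction(1, 8), 1: Fraction(1, 4), 3: Fraction(5, 8)}
phi = lambda x: x * x

# Method 1: derive the distribution of phi(X) first, then take its mean
m_phi = defaultdict(Fraction)
for x, p in m.items():
    m_phi[phi(x)] += p
method1 = sum(y * p for y, p in m_phi.items())

# Method 2: law of the unconscious statistician
method2 = sum(phi(x) * p for x, p in m.items())

print(method1, method2)  # both 6
```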
More Properties
• Sum of random variables (they do not have to be independent!):
$$E(X + Y) = E(X) + E(Y)$$
• Scalar multiplication:
$$E(cX) = cE(X)$$
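A small check (my own addition) that linearity needs no independence, taking Y = X² on a fair die so that X and Y are clearly dependent:

```python
from fractions import Fraction

die = {x: Fraction(1, 6) for x in range(1, 7)}
E = lambda f: sum(f(x) * p for x, p in die.items())

# X and Y = X^2 are dependent, yet E(X + Y) = E(X) + E(Y).
print(E(lambda x: x + x * x))               # E(X + Y) = 56/3
print(E(lambda x: x) + E(lambda x: x * x))  # E(X) + E(Y) = 56/3
```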
Expectation of Binomial Distribution
• Counts the number of successes B in a Bernoulli trials
process with parameters n and p
• Sample space: Ω = {0, 1, 2, . . . , n}
Write B as a sum of indicator variables X_i, where X_i = 1 if trial i is a success and 0 otherwise:
$$B = \sum_{i=1}^{n} X_i$$
$$E(X_i) = 0 \cdot q + 1 \cdot p = p$$
$$E(B) = \sum_{i=1}^{n} E(X_i) = np$$
Recall the distribution itself:
$$b(n, p, k) = \binom{n}{k} p^k q^{n-k}$$
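A numerical check (my own addition) that summing k · b(n, p, k) over the sample space indeed gives np:

```python
from fractions import Fraction
from math import comb

def binomial_expectation(n, p):
    """Sum of k * C(n, k) * p^k * q^(n - k); should equal n * p."""
    q = 1 - p
    return sum(k * comb(n, k) * p**k * q**(n - k) for k in range(n + 1))

print(binomial_expectation(10, Fraction(3, 10)))  # 3, i.e. n * p
```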
Expectation of the Poisson Distribution
• The Poisson distribution approximates the binomial distribution for large n and small p, with λ = np.
• So the expectation of a Poisson variable with parameter λ is λ.
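A truncated-series check (my own addition) that the Poisson mean $\sum_k k\, e^{-\lambda} \lambda^k / k!$ converges to λ; the incremental term update avoids computing huge factorials:

```python
from math import exp

def poisson_expectation(lam, terms=100):
    """Truncated sum of k * P(X = k); the term is updated incrementally."""
    term = exp(-lam)  # P(X = 0)
    total = 0.0
    for k in range(1, terms):
        term *= lam / k  # now P(X = k)
        total += k * term
    return total

print(poisson_expectation(4.0))  # ~4.0, i.e. lambda
```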
Independence and Expectation
• If X and Y are independent then E(XY) = E(X)E(Y) [for variables that are not independent this does not have to be true]
• Very easy to prove, using P(X = x, Y = y) = P(X = x)P(Y = y)
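A sketch (my own addition) that checks the product rule for two independent dice, and shows it failing in a dependent case (Y = X):

```python
from fractions import Fraction

die = {x: Fraction(1, 6) for x in range(1, 7)}
E_X = sum(x * p for x, p in die.items())  # 7/2

# Independent X, Y (two dice): E(XY) over the product space equals E(X)E(Y).
E_XY = sum(x * y * px * py for x, px in die.items() for y, py in die.items())
print(E_XY, E_X * E_X)  # both 49/4

# Dependent case Y = X: E(X * X) = 91/6, which differs from (7/2)^2.
print(sum(x * x * p for x, p in die.items()), E_X * E_X)
```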
Conditional Expectation
• Let X be a random variable with sample space Ω and F an event; then:
$$E(X \mid F) = \sum_j x_j\, P(X = x_j \mid F)$$
• [Probably skip: Often used as follows. If $\Omega = \cup_j F_j$ with $F_i \cap F_j = \emptyset$, then
$$E(X) = \sum_j E(X \mid F_j)\, P(F_j)$$]
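A check of this decomposition (my own addition), conditioning a fair die on the partition {even, odd}:

```python
from fractions import Fraction

die = {x: Fraction(1, 6) for x in range(1, 7)}

total = Fraction(0)
for event in ({2, 4, 6}, {1, 3, 5}):  # partition of the sample space
    p_f = sum(die[x] for x in event)               # P(F_j)
    e_cond = sum(x * die[x] / p_f for x in event)  # E(X | F_j)
    total += e_cond * p_f

print(total)  # 7/2 = E(X)
```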
Variance of Discrete Random Variables
• [The expectation tells you what to expect; the variance is a measure of how much the actual outcome is expected to deviate from it]
• Let X be a numerically valued rv with distribution function m(x) and expected value µ = E(X). Then the variance V(X) of X is:
$$V(X) = E((X - \mu)^2) = \sum_x (x - \mu)^2\, m(x)$$
• The standard deviation of X is the square root of the variance and is usually denoted by σ:
$$V(X) = \sigma^2$$
Example
• Compute the variance of throwing a die; E(X) = 7/2
• V(X) = 1/6 · (25/4 + 9/4 + 1/4 + 1/4 + 9/4 + 25/4) = 35/12
Alternative Method of Computation
$$V(X) = E(X^2) - \mu^2$$
• Prove it… (when there's time)
• Example: Throwing a die, again
• E(X²) = 1/6 · (1 + 4 + 9 + 16 + 25 + 36) = 91/6, so V(X) = 91/6 − 49/4 = 35/12
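Both formulas are easy to verify in code; a minimal sketch (my own addition) computes the die's variance both ways:

```python
from fractions import Fraction

die = {x: Fraction(1, 6) for x in range(1, 7)}
mu = sum(x * p for x, p in die.items())  # 7/2

v1 = sum((x - mu) ** 2 * p for x, p in die.items())    # definition
v2 = sum(x * x * p for x, p in die.items()) - mu ** 2  # E(X^2) - mu^2
print(v1, v2)  # both 35/12
```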
Properties
• Variance is not linear like the expectation
V (cX) = c2 V (X)
V (X + c) = V (X)
• If X and Y are independent:
V (X + Y ) = V (X) + V (Y )
• [Point out Theorem 6.9 as a direct consequence]
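These properties can be checked numerically as well; a sketch (my own addition, the helper name var is illustrative):

```python
from fractions import Fraction

die = {x: Fraction(1, 6) for x in range(1, 7)}

def var(dist):
    mu = sum(x * p for x, p in dist.items())
    return sum((x - mu) ** 2 * p for x, p in dist.items())

c = 3
print(var({c * x: p for x, p in die.items()}), c**2 * var(die))  # V(cX) = c^2 V(X)
print(var({x + c: p for x, p in die.items()}), var(die))         # V(X + c) = V(X)

# Independent sum of two dice: V(X + Y) = V(X) + V(Y) = 35/6.
pair_sum = {}
for x, px in die.items():
    for y, py in die.items():
        pair_sum[x + y] = pair_sum.get(x + y, Fraction(0)) + px * py
print(var(pair_sum), 2 * var(die))
```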
Bernoulli, Binomial, Geometric, Poisson variables
• X: success with probability p, failure with probability q
• Variance: p − p² = p(1 − p) = pq
• Binomial: npq
• Geometric: no time to show it, but E(X) = 1/p and V(X) = q/p²; see book Example 6.19
• Poisson: easiest via the binomial variance npq with λ = np held fixed: as p → 0 we have q → 1, so npq → λ
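A truncated-series check of the geometric variance (my own addition), using V(X) = E(X²) − E(X)²:

```python
p, q = 0.25, 0.75
js = range(1, 10_000)

E_T = sum(j * q ** (j - 1) * p for j in js)
E_T2 = sum(j * j * q ** (j - 1) * p for j in js)
print(E_T2 - E_T ** 2, q / p**2)  # both ~12.0
```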
Continuous Random Variables
• Expected Value:
$$\mu = E(X) = \int_{-\infty}^{\infty} x f(x)\, dx$$
• The same properties apply (including E(XY) = E(X)E(Y) for independent variables)
• Example: the uniform distribution on [0, 1] has expectation 1/2
• Expectation of a function of a random variable:
$$E(\phi(X)) = \int_{-\infty}^{\infty} \phi(x) f(x)\, dx$$
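A numerical check (my own addition, assuming SciPy is available) for the uniform density on [0, 1]:

```python
from scipy.integrate import quad

f = lambda x: 1.0  # uniform density on [0, 1]
mu, _ = quad(lambda x: x * f(x), 0, 1)
print(mu)  # 0.5
```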
Variance
• Is still:
$$V(X) = E((X - \mu)^2)$$
$$V(X) = \sigma^2 = \int_{-\infty}^{\infty} (x - \mu)^2 f(x)\, dx$$
• Properties stay the same.
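Continuing the same SciPy-based check (my own addition): the variance of the uniform distribution on [0, 1] comes out to 1/12.

```python
from scipy.integrate import quad

f = lambda x: 1.0  # uniform density on [0, 1]
mu, _ = quad(lambda x: x * f(x), 0, 1)
var, _ = quad(lambda x: (x - mu) ** 2 * f(x), 0, 1)
print(var)  # ~0.08333, i.e. 1/12
```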
Assignment
• 1. (a), (b)