Continuous Random Variables,
Moments and Moment
Generating Function
Continuous random variable
- Definition: A random variable X is continuous if the CDF F_X(x) = P(X ≤ x) is a continuous function of x.
- Example 1: Weight of a newborn baby.
- Example 2: Waiting time at a bus stop.
Density function
- Definition: The probability density function f_X(x) of a continuous random variable X is the function that satisfies F_X(x) = ∫_{−∞}^{x} f_X(t) dt.
- Remark: There exist continuous random variables which do not have densities. We will only discuss the case where the density exists.
CDF and density function
- If X has a density f_X(x), then F_X(x) = ∫_{−∞}^{x} f_X(u) du. If F_X(x) is differentiable, then (d/dx) F_X(x)|_{x=x_0} = F_X'(x_0), but F_X'(x_0) need not equal f_X(x_0).
- If f_X(x) is continuous at the point x_0, then (d/dx) F_X(x)|_{x=x_0} = F_X'(x_0) = f_X(x_0).
Example
Define

    F_X(x) = 2x,             0 ≤ x < 1/3
           = (1/2)x + 1/2,   1/3 ≤ x ≤ 1

    f_X(x) = 2,     0 ≤ x < 1/3
           = 1/2,   1/3 ≤ x ≤ 1
           = 0,     otherwise

    f*_X(x) = 2,     0 ≤ x < 1/3, x ≠ 1/6
            = 1,     x = 1/6
            = 1/2,   1/3 ≤ x ≤ 1
            = 0,     otherwise.

(a) F_X(x) = ∫_{−∞}^{x} f_X(t) dt = ∫_{−∞}^{x} f*_X(t) dt.
(b) dF_X(x)/dx at x = 1/3 does not exist, but f_X(1/3) = 1/2.
(c) dF_X(x)/dx at x = 1/6 = f_X(1/6) ≠ f*_X(1/6).
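The point of (a) can be checked numerically. The following sketch (my own illustration, not part of the slides) verifies by a midpoint Riemann sum that f_X and the modified density f*_X, which differ only at the single point x = 1/6, integrate to the same CDF F_X:

```python
def F(x):
    """The CDF from the example."""
    if x < 0:
        return 0.0
    if x < 1/3:
        return 2*x
    if x <= 1:
        return 0.5*x + 0.5
    return 1.0

def f(x):
    """Density: 2 on [0, 1/3), 1/2 on [1/3, 1]."""
    if 0 <= x < 1/3:
        return 2.0
    if 1/3 <= x <= 1:
        return 0.5
    return 0.0

def f_star(x):
    """Same density, altered at the single point x = 1/6."""
    return 1.0 if x == 1/6 else f(x)

def integrate(g, a, b, n=200_000):
    """Midpoint Riemann sum of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5)*h) for i in range(n)) * h

# Changing a density on a set of measure zero does not change the CDF.
for x in (0.2, 0.5, 1.0):
    assert abs(integrate(f, 0, x) - F(x)) < 1e-4
    assert abs(integrate(f_star, 0, x) - F(x)) < 1e-4
```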
Density is not probability
- Note that f_X(t) is not a probability. In fact, P(X = t) = 0 for any t if X is a continuous random variable: since {X = t} ⊂ {t − ε < X ≤ t} for any ε > 0, we have P(X = t) ≤ P(t − ε < X ≤ t) = F_X(t) − F_X(t − ε). Hence 0 ≤ P(X = t) ≤ lim_{ε→0} [F_X(t) − F_X(t − ε)] = 0 by the continuity of F_X.
- Any meaningful statement about probability must consider X lying in some interval. Probability is interpreted as the area under the density function.
Example: Logistic distribution
A random variable X has the logistic distribution if F_X(x) = 1/(1 + e^{−x}). Then

    f_X(x) = dF_X(x)/dx = e^{−x}/(1 + e^{−x})^2

and

    P(a < X < b) = F_X(b) − F_X(a) = ∫_{−∞}^{b} f_X(x) dx − ∫_{−∞}^{a} f_X(x) dx = ∫_{a}^{b} f_X(x) dx.

If ∆x is small, P(a ≤ X ≤ a + ∆x) ≈ f_X(a)∆x.
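The logistic facts above are easy to confirm numerically. A minimal sketch (my own, with arbitrary endpoints a = −1, b = 2) checks that integrating the density reproduces F_X(b) − F_X(a) and that f_X(a)∆x approximates a small interval probability:

```python
import math

def F(x):            # logistic CDF
    return 1 / (1 + math.exp(-x))

def f(x):            # logistic density e^{-x} / (1 + e^{-x})^2
    return math.exp(-x) / (1 + math.exp(-x))**2

a, b = -1.0, 2.0

# P(a < X < b) by the CDF vs. by integrating the density (midpoint rule).
n = 100_000
h = (b - a) / n
integral = sum(f(a + (i + 0.5) * h) for i in range(n)) * h
assert abs(integral - (F(b) - F(a))) < 1e-8

# Small-interval approximation: P(a <= X <= a + dx) ≈ f(a) * dx.
dx = 1e-4
assert abs((F(a + dx) - F(a)) - f(a) * dx) < 1e-6
```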
Quantile and median
- Definition: Let X be a random variable with CDF F_X(x). For any 0 < α < 1, an α-quantile of X is any number x_α satisfying F_X(x_α) ≥ α and F_X(x_α−) ≤ α. The median is x_{0.5}.
- Quantiles defined above may not be unique. To make the quantile unique, we usually define it as x_α = inf{x : F_X(x) ≥ α}.
Example
Let X have CDF

    F(u) = 0,                u < 0
         = u^2/2,            0 ≤ u < 1
         = 3/4,              1 ≤ u < 2
         = 3/4 + (u − 2)/4,  2 ≤ u < 3
         = 1,                u ≥ 3.

Where are the 0.6, 0.65, and 0.75 quantiles, and where is the median?
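The inf-based quantile definition can be implemented directly. The sketch below (mine, using bisection on the non-decreasing CDF of this example) computes x_α = inf{x : F(x) ≥ α}; note that F jumps from 1/2 to 3/4 at u = 1, which pins several quantiles to that jump point:

```python
def F(u):
    """The piecewise CDF from the example."""
    if u < 0:
        return 0.0
    if u < 1:
        return u*u/2
    if u < 2:
        return 0.75
    if u < 3:
        return 0.75 + (u - 2)/4
    return 1.0

def quantile(alpha, lo=-1.0, hi=4.0, tol=1e-9):
    """Bisection for inf{x : F(x) >= alpha}; valid since F is non-decreasing."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if F(mid) >= alpha:
            hi = mid
        else:
            lo = mid
    return hi

# F(1-) = 1/2 and F(1) = 3/4, so every alpha in (0.5, 0.75] has its
# inf-quantile at the jump point u = 1; the median is also 1.
for alpha in (0.5, 0.6, 0.65, 0.75):
    assert abs(quantile(alpha) - 1.0) < 1e-6
assert abs(quantile(0.8) - 2.2) < 1e-6   # 3/4 + (u-2)/4 = 0.8 at u = 2.2
```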
Expected value
- Definition: Let X be a continuous random variable with density f(x). Then the expected value of a random variable g(X) is E(g(X)) = ∫ g(x)f(x) dx, provided that ∫ |g(x)|f(x) dx exists.
- Linearity: E(ag_1(X) + bg_2(X) + c) = aE(g_1(X)) + bE(g_2(X)) + c.
Examples
- Example 1: If X is exponential(λ), i.e., f_X(x) = (1/λ) exp(−x/λ) for x ≥ 0 and λ > 0, what is the expectation of X?
- Example 2: If X has the Cauchy distribution, the density of X is f_X(x) = (1/π) · 1/(1 + x^2), −∞ < x < ∞. What is the expectation of X?
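A numerical sketch (my own, not worked on the slides) contrasts the two examples: under the mean parametrization above, the exponential(λ) expectation is λ, while the Cauchy expectation fails to exist because ∫ |x| f(x) dx diverges:

```python
import math

lam = 2.0

def f_exp(x):
    return math.exp(-x/lam)/lam

def integrate(g, a, b, n=400_000):
    h = (b - a)/n
    return sum(g(a + (i + 0.5)*h) for i in range(n)) * h

# E(X) = ∫ x f(x) dx ≈ lambda (tail beyond 60 is negligible here).
mean = integrate(lambda x: x*f_exp(x), 0.0, 60.0)
assert abs(mean - lam) < 1e-3

# For Cauchy, ∫_{-T}^{T} |x| f(x) dx = (1/pi) log(1 + T^2) grows without
# bound as T -> infinity, so E(X) does not exist.
def tail(T):
    return integrate(lambda x: abs(x)/(math.pi*(1 + x*x)), -T, T)

assert tail(1e3) > tail(1e2) > tail(1e1)   # keeps growing with T
```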
Mixture of continuous and discrete random variables
- A random variable X could take both continuous and discrete values:

      P(X ∈ A) = α Σ_{x∈A} f_d(x) + (1 − α) ∫_A f_c(x) dx

  for any Borel set A and 0 < α < 1, where f_d(x) is a pmf and f_c(x) is a density.
- The expectation of X is

      E(X) = α Σ_x x f_d(x) + (1 − α) ∫ x f_c(x) dx.
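As a concrete (hypothetical, my own) instance of the mixture formula: take f_d a fair coin on {0, 1}, f_c an exponential density with mean 2, and α = 0.3, so E(X) combines a sum and an integral:

```python
import math

alpha = 0.3
pmf = {0: 0.5, 1: 0.5}     # f_d: fair coin on {0, 1} (assumed example)
lam = 2.0                   # f_c: exponential with mean 2 (assumed example)

# Discrete part of the expectation: sum of x * f_d(x).
discrete_part = sum(x*p for x, p in pmf.items())          # 0.5

def integrate(g, a, b, n=200_000):
    h = (b - a)/n
    return sum(g(a + (i + 0.5)*h) for i in range(n)) * h

# Continuous part: ∫ x f_c(x) dx ≈ lam = 2.
continuous_part = integrate(lambda x: x*math.exp(-x/lam)/lam, 0.0, 60.0)

EX = alpha*discrete_part + (1 - alpha)*continuous_part
assert abs(EX - (0.3*0.5 + 0.7*2.0)) < 1e-3               # E(X) = 1.55
```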
Example: Jelly Donut Problem
Suppose that the Jelly Donut Company wants to decide how many jelly donuts to bake every day. Each package sells for s dollars and costs c dollars to make. The demand D is a continuous random variable with density f and CDF F. To maximize the profit, how many packages should the company make?
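One standard way to attack this (the classical newsvendor argument, not spelled out on the slide, and assuming unsold packages are worthless): the expected profit π(q) = s·E[min(D, q)] − c·q is maximized at q* with F(q*) = (s − c)/s. The sketch below checks this numerically for assumed values s = 3, c = 1 and exponential demand with mean 100:

```python
import math

s, c, lam = 3.0, 1.0, 100.0   # price, cost, mean demand (assumed values)

def F(q):                      # exponential(mean lam) demand CDF
    return 1 - math.exp(-q/lam)

def expected_profit(q, n=200_000):
    # E[min(D, q)] = ∫_0^q (1 - F(x)) dx for a nonnegative demand D.
    h = q/n
    e_min = sum(1 - F((i + 0.5)*h) for i in range(n)) * h
    return s*e_min - c*q

# Critical fractile: F(q*) = (s - c)/s, so q* = lam * log(s/c) here.
q_star = lam*math.log(s/c)
assert abs(F(q_star) - (s - c)/s) < 1e-12

# Profit at the critical fractile beats nearby order quantities.
assert expected_profit(q_star) > expected_profit(q_star - 10)
assert expected_profit(q_star) > expected_profit(q_star + 10)
```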
Variance
- For any random variable X, the variance is Var(X) = E[(X − E(X))^2].
- Example: If X is exponential(λ), what is the variance of X?
Moments
Definition: For each n, the nth moment of X is µ_n = E(X^n) and the nth central moment of X is κ_n = E(X − µ)^n, where µ = E(X).
The moments can be used to measure some aspects of a distribution. For example, we can measure the degree of asymmetry of a distribution by the coefficient of skewness γ_1 = κ_3/σ^3, and for symmetric densities (pmfs) we can measure peakedness by the coefficient of kurtosis γ_2 = κ_4/σ^4 − 3, where σ^2 = Var(X).
Symmetry
- A random variable is said to be symmetric about 0 if X and −X have the same distribution.
- If X and −X have the same distribution, then F(u) + F(−u) = 1 + P(X = −u).
- If X is continuous and f_X(x) = f_X(−x) for all x except countably many x, then X is symmetric about 0. If X is discrete, we require that f_X(x) = f_X(−x) for all x.
Left skew and right skew
Examples: Skewness
What are the coefficients of skewness for exponential(λ) and
Binomial(n, p)?
[Figures: Exponential densities (λ = 2, 5), which are right-skewed, and Binomial(n = 10) pmfs (p = 0.1, 0.8), which are right- and left-skewed respectively.]
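The skewness question can be answered numerically (my own computation, not shown on the slides): γ_1 = 2 for every exponential, and γ_1 = (1 − 2p)/√(np(1 − p)) for Binomial(n, p), which is positive (right skew) for small p and negative (left skew) for large p:

```python
import math

def binom_skewness(n, p):
    """gamma_1 = kappa_3 / sigma^3 computed directly from the pmf."""
    pmf = [math.comb(n, k)*p**k*(1 - p)**(n - k) for k in range(n + 1)]
    mu = sum(k*q for k, q in enumerate(pmf))
    var = sum((k - mu)**2*q for k, q in enumerate(pmf))
    k3 = sum((k - mu)**3*q for k, q in enumerate(pmf))
    return k3/var**1.5

assert abs(binom_skewness(10, 0.1) - (1 - 0.2)/math.sqrt(10*0.1*0.9)) < 1e-12
assert binom_skewness(10, 0.1) > 0        # right skew
assert binom_skewness(10, 0.8) < 0        # left skew

# Exponential(lam): kappa_3 = 2*lam^3 and sigma^3 = lam^3, so gamma_1 = 2
# for any lam; checked here by numerical integration with lam = 2.
lam = 2.0
def integrate(g, a, b, n=400_000):
    h = (b - a)/n
    return sum(g(a + (i + 0.5)*h) for i in range(n)) * h
f = lambda x: math.exp(-x/lam)/lam
mu = integrate(lambda x: x*f(x), 0, 80)
var = integrate(lambda x: (x - mu)**2*f(x), 0, 80)
k3 = integrate(lambda x: (x - mu)**3*f(x), 0, 80)
assert abs(k3/var**1.5 - 2.0) < 1e-3
```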
Kurtosis
Kurtosis γ_2 = κ_4/σ^4 − 3 is compared with the kurtosis of the standard normal distribution, which has γ_2 = 0.
Moments might not fully determine a distribution
- Moments reflect some aspects of a distribution, but they cannot determine a distribution. Two totally different distributions can have all of their moments the same.
- Example: Consider two random variables X and Y having densities

      f_X(x) = (1/(√(2π) x)) exp(−(log x)^2/2),   0 < x < ∞,
      f_Y(y) = f_X(y)(1 + sin(2π log y)),         0 < y < ∞.
Example continuation
[Figure: the densities f_X(x) and f_Y(y) plotted on (0, 10); the two curves are visibly different even though all their moments agree.]
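The agreement of moments can be checked numerically (my own sketch): substituting u = log x, the moment difference E(X^n) − E(Y^n) = −∫ e^{nu} φ(u) sin(2πu) du, which is exactly 0 for every integer n, and a midpoint rule in log-space confirms it is tiny relative to E(X^n) = e^{n²/2}:

```python
import math

def phi(u):
    """Standard normal density."""
    return math.exp(-u*u/2)/math.sqrt(2*math.pi)

def moment_diff(n, lo=-12.0, hi=18.0, steps=600_000):
    """Midpoint rule for ∫ e^{nu} phi(u) sin(2*pi*u) du in log-space."""
    h = (hi - lo)/steps
    total = 0.0
    for i in range(steps):
        u = lo + (i + 0.5)*h
        total += math.exp(n*u)*phi(u)*math.sin(2*math.pi*u)
    return total*h

# The difference is negligible compared with E(X^n) = exp(n^2/2).
for n in (1, 2, 3):
    assert abs(moment_diff(n)) < 1e-5 * math.exp(n*n/2)
```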
Moment generating function
- Let X be a random variable with CDF F_X. The moment generating function (MGF) of X, denoted by M_X(t), is M_X(t) = E(e^{tX}), provided that the expectation exists for t in some neighborhood of 0; that is, E(e^{tX}) exists for all −h < t < h for some h > 0.
- For continuous random variables with density function f_X(x), M_X(t) = ∫ e^{tx} f_X(x) dx.
- For discrete random variables with probability mass function p_X(x), M_X(t) = Σ_x e^{tx} p_X(x).
Properties
- For any constants a and b, the MGF of the random variable aX + b is given by M_{aX+b}(t) = e^{bt} M_X(at).
- For independent and identically distributed random variables X, X_1, ..., X_n, let S_n = X_1 + X_2 + ... + X_n. Then M_{S_n}(t) = [M_X(t)]^n, where M_X(t) is the MGF of X.
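Both properties can be verified exactly for a small discrete case (my own example): a Bernoulli(p) X has MGF M_X(t) = 1 − p + pe^t, and S_n = X_1 + ... + X_n is Binomial(n, p):

```python
import math

p, n, t = 0.3, 8, 0.7
M_X = lambda t: 1 - p + p*math.exp(t)   # Bernoulli(p) MGF

# MGF of aX + b computed from the pmf of X vs. e^{bt} M_X(at).
a, b = 2.0, -1.0
lhs = (1 - p)*math.exp(t*(a*0 + b)) + p*math.exp(t*(a*1 + b))
assert abs(lhs - math.exp(b*t)*M_X(a*t)) < 1e-12

# MGF of S_n from the Binomial(n, p) pmf vs. M_X(t)^n.
M_Sn = sum(math.comb(n, k)*p**k*(1 - p)**(n - k)*math.exp(t*k)
           for k in range(n + 1))
assert abs(M_Sn - M_X(t)**n) < 1e-12
```

The S_n check is just the binomial theorem: Σ_k C(n, k)(pe^t)^k (1 − p)^{n−k} = (1 − p + pe^t)^n.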
Calculate moments using MGF
If X has MGF M_X(t), then

    E(X^n) = (d^n/dt^n) M_X(t)|_{t=0}.

That is, the nth moment is equal to the nth derivative of M_X(t) evaluated at t = 0.
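A quick sketch (mine, using the closed-form MGF M(t) = 1/(1 − λt) of the mean-λ exponential) recovers the first two moments E(X) = λ and E(X²) = 2λ² by finite-difference derivatives at t = 0:

```python
lam = 2.0
M = lambda t: 1/(1 - lam*t)     # exponential(mean lam) MGF, valid for t < 1/lam

h = 1e-4
d1 = (M(h) - M(-h)) / (2*h)                 # central difference ≈ M'(0) = lam
d2 = (M(h) - 2*M(0) + M(-h)) / h**2         # ≈ M''(0) = 2*lam^2

assert abs(d1 - lam) < 1e-5                 # E(X)   = lam   = 2
assert abs(d2 - 2*lam**2) < 1e-3            # E(X^2) = 2*lam^2 = 8
```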
Examples
- (Discrete case) If X is Binomial(n, p), what is the moment generating function of X?
- (Continuous case) If X is Exponential(λ), what is the moment generating function of X?
MGF and CDF
- Let F_X(x) and F_Y(y) be two CDFs. If the moment generating functions exist and M_X(t) = M_Y(t) for all t in some neighborhood of 0, then F_X(u) = F_Y(u) for all u.
- Suppose {X_n : n = 1, 2, ...} is a sequence of random variables, each with MGF M_{X_n}. Suppose further that lim_{n→∞} M_{X_n}(t) = M_X(t) for all t in a neighborhood of 0, where M_X(t) is the MGF of X. Then there exists a unique CDF F_X(x) whose moments are determined by M_X(t), and for all x where F_X(x) is continuous, lim_{n→∞} F_{X_n}(x) = F_X(x).
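A classical instance of this convergence theorem (my own illustration) is the Poisson limit of the binomial: the MGF of Binomial(n, λ/n) converges to the Poisson(λ) MGF exp(λ(e^t − 1)) as n → ∞:

```python
import math

lam, t = 3.0, 0.5
M_pois = math.exp(lam*(math.exp(t) - 1))    # Poisson(lam) MGF at t

def M_binom(n):
    """Binomial(n, lam/n) MGF at t: (1 - p + p e^t)^n with p = lam/n."""
    p = lam/n
    return (1 - p + p*math.exp(t))**n

# The approximation improves monotonically as n grows.
errors = [abs(M_binom(n) - M_pois) for n in (10, 100, 1000, 10000)]
assert errors == sorted(errors, reverse=True)
assert errors[-1] < 1e-2 * M_pois
```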