Point Estimators
STATISTICS – Lecture no. 10
Jiří Neubauer
Department of Econometrics, FEM UO Brno
office 69a, tel. 973 442029
email: [email protected]
8. 12. 2009
Introduction
Suppose that we manufacture lightbulbs and we want to state the
average lifetime on the box. Let us say that we have the following five
observed lifetimes (in hours):
983, 1063, 1241, 1040, 1103,
whose average is 1086. If this is all the information we have, it
seems reasonable to state 1086 as the average lifetime.
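The arithmetic above is easy to check directly; a short Python sketch (an illustration added here, not part of the original slides):

```python
# Sample mean of the five observed lifetimes from the slide.
lifetimes = [983, 1063, 1241, 1040, 1103]
average = sum(lifetimes) / len(lifetimes)
print(average)  # 1086.0
```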
Let the random variable X be the lifetime of a lightbulb, and let
E(X) = µ. Here µ is an unknown parameter. We decide to repeat the
experiment to measure a lifetime 5 times, and will then get an outcome
on the five random variables X1, ..., X5 that are i.i.d. (independent,
identically distributed). We now estimate µ by
$$\bar{X} = \frac{1}{5}\sum_{i=1}^{5} X_i,$$
which is the sample mean.
Point Estimator
Definition
Let X1, ..., Xn be a random sample. The statistic (random variable)
$$T = T(X_1, X_2, \dots, X_n) = T(\mathbf{X}),$$
which is a function of the random sample and is used to estimate
an unknown parameter θ, is called a point estimator of θ. We
write T(X) = θ̂.
Unbiased Estimator
Definition
The estimator T(X) is said to be an unbiased estimator of the
parameter θ if
$$E[T(\mathbf{X})] = \theta.$$
The difference
$$B(\theta, T) = E[T(\mathbf{X})] - \theta$$
is called the bias of the estimator T(X).
Example
Let X1, X2, ..., Xn be a random sample from a distribution with
mean µ and variance σ².
The sample mean X̄ is an unbiased estimator of µ, because
$$E(\bar{X}) = E\left(\frac{1}{n}\sum_{i=1}^{n} X_i\right) = \frac{1}{n}\sum_{i=1}^{n} E(X_i) = \mu.$$
The sample variance S² is an unbiased estimator of σ², because
$$E(S^2) = E\left(\frac{1}{n-1}\sum_{i=1}^{n} (X_i - \bar{X})^2\right) = \dots = \sigma^2.$$
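These expectations can be illustrated numerically. The following Python sketch (an illustration added here, not part of the lecture; normal data with µ = 10, σ² = 4 and sample size n = 5 are arbitrary choices) averages the estimators over many simulated samples; `statistics.variance` divides by n − 1, `statistics.pvariance` by n:

```python
import random
import statistics

# Monte Carlo illustration: average each estimator over many samples.
random.seed(1)
mu, sigma, n, reps = 10.0, 2.0, 5, 20000

mean_of_means = 0.0
mean_of_s2 = 0.0   # sample variance S^2, denominator n - 1
mean_of_sn2 = 0.0  # moment variance S_n^2, denominator n
for _ in range(reps):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    mean_of_means += statistics.mean(xs) / reps
    mean_of_s2 += statistics.variance(xs) / reps
    mean_of_sn2 += statistics.pvariance(xs) / reps

print(round(mean_of_means, 2))  # close to mu = 10
print(round(mean_of_s2, 2))     # close to sigma^2 = 4
print(round(mean_of_sn2, 2))    # close to (n - 1)/n * sigma^2 = 3.2
```

The third average previews the next slide: the denominator-n variance systematically underestimates σ².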
Example
Let X1, X2, ..., Xn be a random sample from a distribution with
mean µ and variance σ².
The (moment) variance Sn² is a biased estimator of σ², because
$$E(S_n^2) = E\left(\frac{1}{n}\sum_{i=1}^{n} (X_i - \bar{X})^2\right) = \dots = \frac{n-1}{n}\sigma^2.$$
The bias of the estimator Sn² is
$$B(\sigma^2, S_n^2) = E(S_n^2) - \sigma^2 = \frac{n-1}{n}\sigma^2 - \sigma^2 = -\frac{1}{n}\sigma^2.$$
The bias decreases (in absolute value) as n grows.
Asymptotically Unbiased Estimator
Some estimators are biased, but their bias decreases as n increases.
Definition
If
$$\lim_{n\to\infty} E[T(\mathbf{X})] = \theta,$$
then the estimator T(X) is said to be an asymptotically unbiased
estimator of the parameter θ.
It is easy to see that then
$$\lim_{n\to\infty} E[T(\mathbf{X}) - \theta] = 0.$$
Example
The (moment) variance is an asymptotically unbiased estimator of
σ², because
$$\lim_{n\to\infty} E(S_n^2) = \lim_{n\to\infty} \frac{n-1}{n}\sigma^2 = \sigma^2.$$
Consistent Estimator
Definition
The statistic T(X) is a consistent estimator of the parameter θ if
for every ε > 0
$$\lim_{n\to\infty} P(|T(\mathbf{X}) - \theta| < \varepsilon) = 1.$$
If
$$\lim_{n\to\infty} B(\theta, T) = 0 \quad\text{and}\quad \lim_{n\to\infty} D[T(\mathbf{X})] = 0,$$
then T(X) is a consistent estimator of θ.
Example
Prove that the sample mean is a consistent estimator of the
expected value µ.
From E(X̄) = µ and D(X̄) = σ²/n we obtain
$$B(\mu, \bar{X}) = E(\bar{X}) - \mu = 0$$
and
$$\lim_{n\to\infty} D(\bar{X}) = \lim_{n\to\infty} \frac{\sigma^2}{n} = 0.$$
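Consistency can also be watched happening in a simulation. A sketch (an added illustration; µ = 0, σ = 1 and ε = 0.2 are arbitrary choices) estimating P(|X̄ − µ| < ε) for growing sample sizes:

```python
import random
import statistics

# Estimate P(|Xbar - mu| < eps) by simulation for several sample sizes n.
random.seed(2)
mu, sigma, eps, reps = 0.0, 1.0, 0.2, 1000

def coverage(n):
    hits = 0
    for _ in range(reps):
        xbar = statistics.mean(random.gauss(mu, sigma) for _ in range(n))
        hits += abs(xbar - mu) < eps
    return hits / reps

results = {n: coverage(n) for n in (10, 100, 500)}
print(results)  # the probability climbs towards 1 as n grows
```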
Efficiency of Estimators
If we have two unbiased estimators T1(X) = θ̂ and T2(X) = θ̃,
which should we choose? Intuitively, we should choose the one
that tends to be closer to θ, and since E(T1) = E(T2) = θ, it
makes sense to choose the estimator with the smaller variance.
Definition
Suppose that T1(X) = θ̂ and T2(X) = θ̃ are two unbiased
estimators of θ. If
$$D(T_1(\mathbf{X})) < D(T_2(\mathbf{X})),$$
then T1(X) = θ̂ is said to be more efficient than T2(X) = θ̃.
Example
We can find two unbiased estimators of the parameter λ of the
Poisson distribution:
$$E(\bar{X}) = \lambda \quad\text{and}\quad E(S^2) = \lambda.$$
It is possible to calculate that
$$D(\bar{X}) < D(S^2).$$
The estimator X̄ is therefore more efficient than the estimator S².
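The variance comparison can be checked by simulation. A sketch (added illustration; λ = 4 and n = 10 are arbitrary choices, and the Poisson sampler uses Knuth's method since the standard library provides none):

```python
import math
import random
import statistics

random.seed(3)
lam, n, reps = 4.0, 10, 5000

def poisson(lam):
    # Knuth's method: multiply uniforms until the product drops below e^(-lam).
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

xbars, s2s = [], []
for _ in range(reps):
    xs = [poisson(lam) for _ in range(n)]
    xbars.append(statistics.mean(xs))     # estimator Xbar
    s2s.append(statistics.variance(xs))   # estimator S^2

print(statistics.pvariance(xbars))  # about lam / n = 0.4
print(statistics.pvariance(s2s))    # noticeably larger
```

Both columns of estimates average out near λ = 4, but the spread of the S² column is several times larger, which is the efficiency statement above in numbers.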
How to Compare Estimators?
Let us suppose we would like to compare unbiased and biased estimators
of the parameter θ. In this case it might not be suitable to choose the
one with the smallest variance. The estimator T has the smallest
variance but a large bias. Even the estimator with the smallest bias
is not necessarily the best one. The estimator U has no bias, but its
variance is too large. The estimator V seems to be the best.
Mean Square Error
Definition
The mean square error of the estimator T of a parameter θ is
defined as
$$MSE(T) = E(T - \theta)^2 = D(T) + B^2(\theta, T)$$
(MSE of estimator = variance of estimator + bias²),
where T − θ is the sampling error.
Mean Square Error
The mean square error
indicates the "average" sampling error of the estimates, taken
over all possible random samples of size n;
combines the two required properties (a small bias and a small
variance), which is why it is a universal criterion.
If T is an unbiased estimator, then MSE(T) = D(T).
Another way to measure the accuracy of an estimator is the
standard error
$$SE = \sqrt{D(T)}.$$
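The decomposition MSE = D(T) + B²(θ, T) can be verified numerically. A sketch for the biased estimator Sn² (an added illustration; normal data with σ² = 4 and n = 5 are arbitrary assumptions):

```python
import random
import statistics

# Numeric check of MSE(T) = D(T) + B^2(theta, T) for the moment variance S_n^2.
random.seed(4)
sigma2, n, reps = 4.0, 5, 20000

est = [statistics.pvariance([random.gauss(0.0, 2.0) for _ in range(n)])
       for _ in range(reps)]

mse = statistics.mean((e - sigma2) ** 2 for e in est)  # E(T - theta)^2
var = statistics.pvariance(est)                        # D(T)
bias = statistics.mean(est) - sigma2                   # B(theta, T)

print(round(mse, 3), round(var + bias ** 2, 3))  # the two quantities agree
```

The empirical bias lands near the theoretical value −σ²/n = −0.8 from the earlier example, and the two printed numbers coincide, as the identity demands.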
Example
The sample mean is an unbiased estimator of the expected value µ;
its standard error is equal to the standard deviation of the
sample mean,
$$SE = \sqrt{D(\bar{X})} = \sigma(\bar{X}) = \frac{\sigma(X)}{\sqrt{n}}.$$
Since σ(X) is unknown, we have to estimate it by the sample standard
deviation, and we get the estimate
$$\widehat{SE} = \frac{\hat{\sigma}(X)}{\sqrt{n}} = \frac{S}{\sqrt{n}}.$$
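Applied to the five lightbulb lifetimes from the introduction (an added illustration, not from the slides):

```python
import math
import statistics

# Estimated standard error of the sample mean: S / sqrt(n).
lifetimes = [983, 1063, 1241, 1040, 1103]
n = len(lifetimes)
s = statistics.stdev(lifetimes)  # sample standard deviation (n - 1 denominator)
se_hat = s / math.sqrt(n)
print(round(se_hat, 1))  # about 43.3 hours
```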
Example
Find the mean square error of S² and Sn². Let us start with the
statistic S², which is an unbiased estimator of σ²:
$$MSE(S^2) = D(S^2) = E(S^2 - \sigma^2)^2 = E(S^4) - 2\sigma^2 E(S^2) + \sigma^4 = E(S^4) - \sigma^4 = \frac{2\sigma^4}{n-1}.$$
The MSE of the estimator Sn² is
$$MSE(S_n^2) = E(S_n^2 - \sigma^2)^2 = E(S_n^4) - 2\,\frac{n-1}{n}\sigma^4 + \sigma^4 = E(S_n^4) - \frac{n-2}{n}\sigma^4 = \frac{2n-1}{n^2}\sigma^4.$$
Then
$$MSE(S_n^2) < MSE(S^2),$$
because
$$\frac{2n-1}{n^2} < \frac{2}{n-1}.$$
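The closing inequality is elementary: cross-multiplying gives (2n − 1)(n − 1) = 2n² − 3n + 1 < 2n² for n ≥ 2. A quick numeric check (an added illustration):

```python
# Verify (2n - 1)/n^2 < 2/(n - 1), i.e. MSE(S_n^2) < MSE(S^2), for small n.
for n in range(2, 11):
    mse_sn2 = (2 * n - 1) / n ** 2  # MSE(S_n^2) in units of sigma^4
    mse_s2 = 2 / (n - 1)            # MSE(S^2) in units of sigma^4
    assert mse_sn2 < mse_s2
print("inequality holds for n = 2, ..., 10")
```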
Methods of Point Estimations
The definitions of unbiasedness and the other properties of estimators do
not provide any guidance about how good estimators can be
obtained. In this part, we discuss two methods for obtaining point
estimators:
the method of moments,
the method of maximum likelihood.
Maximum likelihood estimates are generally preferable to moment
estimators because they have better efficiency properties. However,
moment estimators are sometimes easier to compute. Both
methods can produce unbiased point estimators.
Method of Moments
The general idea behind the method of moments is to equate
population moments, which are defined in terms of expected
values, to the corresponding sample moments. The population
moments will be functions of the unknown parameters. Then these
equations are solved to yield estimators of the unknown
parameters.
Let us assume a distribution with m ≥ 1 real parameters
θ1, θ2, ..., θm, and let X1, X2, ..., Xn be a random sample from this
distribution. Let us suppose that the moments
$$\mu'_r = E(X_i^r), \quad r = 1, 2, \dots, m,$$
exist. These moments depend on the parameters θ1, θ2, ..., θm. Sample
moments are defined by the formula
$$M'_r = \frac{1}{n}\sum_{i=1}^{n} X_i^r, \quad r = 1, 2, \dots.$$
Let X1, ..., Xn be a random sample from either a probability
function or a probability density function with m unknown
parameters θ1, ..., θm. The moment estimators are found by
equating the first m population moments to the first m sample
moments and solving the resulting equations for the unknown
parameters:
$$\mu'_r = M'_r, \quad r = 1, 2, \dots, m.$$
Example
Estimation of the parameter λ – Poisson distribution.
Suppose that X1, ..., Xn is a random sample from the Poisson
distribution Po(λ). We get one equation,
$$\mu'_1 = M'_1 \quad\Rightarrow\quad E(X_i) = \frac{1}{n}\sum_{i=1}^{n} X_i,$$
and the estimator λ̂ of the parameter λ is
$$\hat{\lambda} = \bar{X}.$$
Example
Estimation of the parameters µ and σ² – normal distribution.
Suppose that X1, ..., Xn is a random sample from the normal distribution
N(µ, σ²). The two moment equations are
$$\mu'_1 = M'_1 \quad\Rightarrow\quad E(X_i) = \frac{1}{n}\sum_{i=1}^{n} X_i,$$
$$\mu'_2 = M'_2 \quad\Rightarrow\quad E(X_i^2) = \frac{1}{n}\sum_{i=1}^{n} X_i^2 \;\Leftrightarrow\; D(X_i) + [E(X_i)]^2 = \frac{1}{n}\sum_{i=1}^{n} X_i^2,$$
i.e.
$$\sigma^2 + \mu^2 = \frac{1}{n}\sum_{i=1}^{n} X_i^2.$$
We obtain the estimators
$$\hat{\mu} = \bar{X}, \qquad \hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n} X_i^2 - \bar{X}^2 = \frac{1}{n}\sum_{i=1}^{n} (X_i - \bar{X})^2 = S_n^2 = \frac{n-1}{n}\,S^2.$$
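On simulated data, the two moment equations can be solved directly. A sketch (an added illustration; µ = 3, σ = 2 and the sample size are arbitrary assumptions):

```python
import random
import statistics

# Method of moments for N(mu, sigma^2) on simulated data.
random.seed(5)
xs = [random.gauss(3.0, 2.0) for _ in range(50000)]

m1 = statistics.mean(xs)                 # first sample moment M_1'
m2 = statistics.mean(x * x for x in xs)  # second sample moment M_2'

mu_hat = m1                  # from mu = M_1'
sigma2_hat = m2 - m1 ** 2    # from sigma^2 + mu^2 = M_2'

print(round(mu_hat, 2), round(sigma2_hat, 2))  # near mu = 3 and sigma^2 = 4
```

Note that m2 − m1² is exactly the moment variance Sn² of the slide, just computed from the raw moments instead of the centred deviations.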
Method of Maximum Likelihood
Let X1, X2, ..., Xn be a random sample from either a probability
density function f(x, θ) or a probability function p(x, θ) with an
unknown parameter θ = (θ1, θ2, ..., θm). The random vector
X = (X1, X2, ..., Xn) has either the joint probability density function
$$g(\mathbf{x}, \theta) = g(x_1, x_2, \dots, x_n, \theta) = f(x_1, \theta) f(x_2, \theta) \cdots f(x_n, \theta)$$
or the joint probability function
$$g(\mathbf{x}, \theta) = g(x_1, x_2, \dots, x_n, \theta) = p(x_1, \theta) p(x_2, \theta) \cdots p(x_n, \theta).$$
The density g(x, θ) is a function of x for a given value of θ. If the
values x are given (observed data), then g(x, θ) is a function of the
variable θ. We denote it L(θ, x) and call it the likelihood
function.
If there exists some θ̂ which fulfils
$$L(\hat{\theta}, \mathbf{x}) \ge L(\theta, \mathbf{x}) \quad\text{for all } \theta,$$
then θ̂ is a maximum likelihood estimator of the parameter θ.
It is sometimes more convenient to work with the logarithm of the
likelihood function, ℓ(θ, x) = ln L(θ, x). For the maximum likelihood
estimator we can equivalently write
$$\ell(\hat{\theta}, \mathbf{x}) \ge \ell(\theta, \mathbf{x}),$$
because the logarithm is an increasing function.
The maximum likelihood estimator of the vector
θ = (θ1, θ2, ..., θm) is obtained by solving the system of equations
$$\frac{\partial \ell(\theta, \mathbf{x})}{\partial \theta_i} = 0, \quad i = 1, 2, \dots, m.$$
Example
Let X be a Bernoulli random variable. The probability function is
$$p(x) = \begin{cases} \pi^x (1-\pi)^{1-x}, & x = 0, 1,\\ 0 & \text{otherwise.} \end{cases}$$
The likelihood function is
$$L(\pi, \mathbf{x}) = \pi^{x_1}(1-\pi)^{1-x_1}\, \pi^{x_2}(1-\pi)^{1-x_2} \cdots \pi^{x_n}(1-\pi)^{1-x_n} = \pi^{\sum_{i=1}^{n} x_i}\, (1-\pi)^{n - \sum_{i=1}^{n} x_i}.$$
The logarithm of L(π, x) is
$$\ell(\pi, \mathbf{x}) = \sum_{i=1}^{n} x_i \ln \pi + \left(n - \sum_{i=1}^{n} x_i\right) \ln(1-\pi).$$
We calculate the maximum of ℓ(π, x),
$$\frac{d\ell(\pi, \mathbf{x})}{d\pi} = \frac{\sum_{i=1}^{n} x_i}{\pi} - \frac{n - \sum_{i=1}^{n} x_i}{1-\pi} = 0,$$
and get the estimator
$$\hat{\pi} = \frac{\sum_{i=1}^{n} x_i}{n} = \bar{x}.$$
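The result can be confirmed numerically by scanning the log-likelihood over a grid of π values. A sketch (an added illustration; the seed, the sample size and the true π = 0.3 are arbitrary assumptions):

```python
import math
import random

# Check numerically that the Bernoulli log-likelihood peaks at pi_hat = xbar.
random.seed(6)
xs = [1 if random.random() < 0.3 else 0 for _ in range(1000)]
n, s = len(xs), sum(xs)
xbar = s / n

def loglik(pi):
    return s * math.log(pi) + (n - s) * math.log(1 - pi)

grid = [i / 1000 for i in range(1, 1000)]  # pi in (0, 1)
best = max(grid, key=loglik)
print(xbar, best)  # the grid maximiser coincides with xbar
```

Because the log-likelihood is strictly concave in π and x̄ here falls exactly on a grid point, the grid search recovers x̄ exactly.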
Example
Find a maximum likelihood estimator of the parameter λ of the Poisson
distribution Po(λ).
$$L(\lambda, \mathbf{x}) = e^{-n\lambda}\, \frac{\lambda^{\sum_{i=1}^{n} x_i}}{x_1! \, x_2! \cdots x_n!},$$
$$\ell(\lambda, \mathbf{x}) = \ln L(\lambda, \mathbf{x}) = -n\lambda + \sum_{i=1}^{n} x_i \ln \lambda - \ln(x_1! \, x_2! \cdots x_n!),$$
$$\frac{d\ell(\lambda, \mathbf{x})}{d\lambda} = -n + \sum_{i=1}^{n} x_i \cdot \frac{1}{\lambda} = 0,$$
$$\hat{\lambda} = \frac{1}{n}\sum_{i=1}^{n} x_i = \bar{x}.$$
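The same grid-search check works here. A sketch (an added illustration; λ = 2.5, the seed and the sample size are arbitrary assumptions, and the factorial term of ℓ is dropped because it does not depend on λ):

```python
import math
import random

# Check numerically that the Poisson log-likelihood peaks at lambda_hat = xbar.
random.seed(7)

def poisson(lam):
    # Knuth's method: multiply uniforms until the product drops below e^(-lam).
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

xs = [poisson(2.5) for _ in range(500)]
n, s = len(xs), sum(xs)
xbar = s / n

def loglik(lam):
    return -n * lam + s * math.log(lam)  # constant factorial term omitted

grid = [i / 100 for i in range(1, 601)]  # lambda in (0, 6]
best = max(grid, key=loglik)
print(xbar, best)  # the grid maximiser lies within one grid step of xbar
```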