Ch05
Statistical Estimation
CHAPTER CONTENTS
5.1 Introduction
5.2 The Methods of Finding Point Estimators
5.3 Some Desirable Properties of Point Estimators
5.4 A Method of Finding the Confidence Interval: Pivotal Method
5.5 One Sample Confidence Intervals
5.6 A Confidence Interval for the Population Variance
5.7 Confidence Interval Concerning Two Population Parameters
5.8 Chapter Summary
5.9 Computer Examples
Projects for Chapter 5
5.1 Introduction
Unknown population parameters
To estimate:
point estimation
interval estimation
How much money do I have in my pocket?
Point estimate: $1000
Interval estimate: ($700, $1200)
5.2 The Methods of Finding Point Estimators
X1, . . ., Xn: independent and identically distributed (iid) random variables (in statistical language, a random sample)
f(x; θ1, . . ., θl): pdf or pmf of the population
θ1, . . ., θl: the unknown population parameters
Point estimation:
to determine statistics gi(X1, . . ., Xn), i = 1, . . ., l, which can be used to estimate the value of each of the parameters
Estimator of θi (i = 1, . . ., l): θ̂i = gi(X1, . . ., Xn)
Capital letters such as X and S2 to represent the estimators;
Lowercase letters such as x and s2 to represent the estimates.
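The estimator/estimate distinction can be made concrete with a small sketch (the data below are hypothetical, chosen only for illustration): the estimator is the rule, a function of the sample; the estimate is the number that rule produces for one observed sample.

```python
# Estimators vs. estimates, using hypothetical data.

def sample_mean(xs):
    # Estimator X-bar: the rule mapping a sample to its average.
    return sum(xs) / len(xs)

def sample_variance(xs):
    # Estimator S^2 with the n - 1 divisor.
    n = len(xs)
    m = sample_mean(xs)
    return sum((x - m) ** 2 for x in xs) / (n - 1)

# One observed sample (the lowercase x's of the text's notation).
x = [4.2, 5.1, 3.8, 4.9, 5.0]
x_bar = sample_mean(x)    # estimate of the population mean: 4.6
s2 = sample_variance(x)   # estimate of the population variance: 0.325
print(x_bar, s2)
```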
Three of the more popular methods of estimation:
the method of moments (this chapter)
the method of maximum likelihood (this chapter)
Bayes’ method (Chapter 11)
Unbiasedness and bias
An estimator θ̂ is unbiased if the mean of its sampling distribution is the parameter θ.
Bias: B = E(θ̂) − θ
Consistency
An estimator is said to satisfy the consistency property if the sample estimator has a high probability of being close to the population value θ for a large sample size.
Efficiency
Among unbiased estimators, the one with the smaller variance is the more efficient.
5.2.1 THE METHOD OF MOMENTS
μ′k = E(X^k): the kth population moment about the origin of a random variable X
m′k = (1/n) Σ Xi^k: the kth sample moment of the random variable X
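As a sketch of the method in action, assume a normal population (an assumption made here only for illustration). Matching the first two population moments E(X) = μ and E(X²) = σ² + μ² to the corresponding sample moments and solving gives μ̂ = m′1 and σ̂² = m′2 − (m′1)²:

```python
# Method of moments for an assumed normal population.
# Equate population moments E[X] = mu, E[X^2] = sigma^2 + mu^2
# to the sample moments m1, m2 and solve for mu and sigma^2.

def method_of_moments_normal(xs):
    n = len(xs)
    m1 = sum(xs) / n                 # first sample moment
    m2 = sum(x * x for x in xs) / n  # second sample moment
    mu_hat = m1
    sigma2_hat = m2 - m1 ** 2        # note the n divisor: slightly biased
    return mu_hat, sigma2_hat

data = [2.0, 3.0, 4.0, 5.0, 6.0]   # hypothetical sample
print(method_of_moments_normal(data))  # (4.0, 2.0)
```

The σ̂² produced here divides by n rather than n − 1, which is one reason moment estimators are not always "good" estimators.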
5.2.2 THE METHOD OF MAXIMUM LIKELIHOOD
Even though the method of moments is intuitive and easy to apply, it usually does
not yield “good” estimators.
The method of maximum likelihood is intuitively appealing, because we attempt
to find the values of the true parameters that would have most likely produced
the data that we in fact observed.
For most cases of practical interest, the performance of MLEs is optimal for large
enough data.
This is one of the most versatile methods for fitting parametric statistical models
to data.
Maximum likelihood estimates give the parameter values for which the
observed sample is most likely to have been generated.
At times, the MLEs may be hard to calculate. It may be necessary to use numerical methods to
approximate values of the estimate.
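A minimal sketch of both situations, assuming an exponential(λ) model and hypothetical data: the log-likelihood ℓ(λ) = n log λ − λ Σxi has the closed-form maximizer λ̂ = n/Σxi = 1/x̄, and a crude grid search illustrates the kind of numerical approximation needed when no closed form exists.

```python
import math

# Maximum likelihood for an assumed exponential(lambda) model.
# Log-likelihood: l(lambda) = n*log(lambda) - lambda * sum(x_i).

def log_likelihood(lam, xs):
    return len(xs) * math.log(lam) - lam * sum(xs)

def mle_exponential(xs):
    # Closed-form MLE: setting l'(lambda) = 0 gives n / sum(x_i) = 1 / x-bar.
    return len(xs) / sum(xs)

data = [0.5, 1.2, 0.8, 2.1, 0.4]   # hypothetical sample; sum = 5.0
lam_hat = mle_exponential(data)    # 1.0

# When no closed form exists, a numerical search can approximate the MLE:
grid = [k / 1000 for k in range(1, 5001)]
lam_num = max(grid, key=lambda lam: log_likelihood(lam, data))
print(lam_hat, lam_num)
```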
5.3 Some Desirable Properties of Point Estimators
5.3.1 UNBIASED ESTIMATORS
The sample mean is always an unbiased estimator of the population mean.
Sample variance: S² = (1/(n − 1)) Σ (Xi − X̄)², summing over i = 1, . . ., n
Population variance (population of size N with elements X1, X2, . . ., XN):
σ² = (1/N) Σ (Xi − μ)², summing over i = 1, . . ., N
Unbiased estimators need not be unique.
If θ̂1 and θ̂2 are two unbiased estimators, then every weighted average aθ̂1 + (1 − a)θ̂2 with 0 ≤ a ≤ 1 is also unbiased, so there are infinitely many unbiased estimators.
It is better to have an estimator that has low bias as well as low variance.
For unbiased estimators, the mean squared error reduces to the variance of the estimator.
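The role of the n − 1 divisor in S² can be checked empirically. A small simulation sketch (the population below, normal with σ² = 4, is an arbitrary choice for illustration) suggests that S² averages close to σ², while dividing by n systematically underestimates it:

```python
import random

# Simulation: the n - 1 divisor vs. the n divisor for the variance.
random.seed(0)

def variances(xs):
    n = len(xs)
    m = sum(xs) / n
    ss = sum((x - m) ** 2 for x in xs)
    return ss / (n - 1), ss / n   # (S^2, biased version)

n, reps = 5, 20000
s2_sum = biased_sum = 0.0
for _ in range(reps):
    xs = [random.gauss(0.0, 2.0) for _ in range(n)]  # true sigma^2 = 4
    s2, b = variances(xs)
    s2_sum += s2
    biased_sum += b
print(s2_sum / reps, biased_sum / reps)  # ~4.0 vs ~3.2
```

The biased version averages near (n − 1)/n · σ² = 3.2, matching the theory.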
5.3.2 SUFFICIENCY*
5.4 A Method of Finding the Confidence Interval: Pivotal Method
5.5 One Sample Confidence Intervals
5.6 A Confidence Interval for the Population Variance
5.7 Confidence Interval Concerning Two Population Parameters
5.8 Chapter Summary
5.9 Computer Examples (Optional)
Projects for Chapter 5