PSTAT 120B Probability and Statistics - Week 4
Fang-I Chu
University of California, Santa Barbara
October 25, 2012
A couple of notes about HW #3
Average for HW #3: 72.35/100
About #2(8.19)
Using the formula MSE(θ̂) = V(θ̂) + [B(θ̂)]² usually gives an easier computation; since θ̂ is unbiased here, MSE(θ̂) = V(θ̂).
Show your work for how you obtain Var(θ̂).
A complete answer:
Var(θ̂) = Var(nY(1)) = n² Var(Y(1)) = n² · θ²/n² = θ², since Y(1) ∼ Exp(θ/n).
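If you want a quick numerical sanity check of this (not a substitute for showing the work), here is a small simulation sketch; θ = 2 and n = 5 are arbitrary values chosen just for the check:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 5, 200_000            # theta and n are arbitrary illustration values

# each row is a sample of size n from an exponential with mean theta; Y_(1) is the row minimum
samples = rng.exponential(scale=theta, size=(reps, n))
theta_hat = n * samples.min(axis=1)         # the estimator n * Y_(1)

print(theta_hat.mean())                     # should be close to theta (unbiasedness)
print(theta_hat.var())                      # should be close to theta**2
```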
About #4(b,c)(8.44)
for (b), make sure you know how to obtain fZ(z) via transformation. (You can derive it directly from the cdf or use the transformation formula.)
for (c), after you solve for b, remember your goal is to find L, so your answer should be L = Y/b.
In this question, solving the quadratic equation gives two values of b, namely b = 0.68 and b = 1.32. We discard b = 1.32 because z is restricted to the range 0 < z < 1.
A couple of notes about section
PLEASE put the section you are enrolled in on your homework when you hand it in.
Slides will be available online at
http://www.pstat.ucsb.edu/graduate/FangI/pstat120B/index.htm by Saturday
noon.
Even though you can download the slides online, section attendance is still STRONGLY encouraged. Sections are for your benefit. I am here to help you and answer your questions!
You should have received two emails from me by now; if you haven't received any, please put your name and email on the attendance sheet.
Topics for review
Unbiased and Consistent Estimator
Hint for #1 (Exercise 9.3)
Hint for #2 (Exercise 9.15)
Hint for #3 (Exercise 9.19)
Sufficient Estimator
Exercise 9.41 (similar to #4, Exercise 9.40)
Hint for #5 (Exercise 9.43)
Unbiased and Consistent Estimator
Unbiasedness: Let θ̂ be a point estimator for a parameter θ.
Then θ̂ is an unbiased estimator if E(θ̂) = θ. If E(θ̂) ≠ θ, θ̂ is
said to be biased.
Bias: The bias of a point estimator θ̂ is given by
B(θ̂) = E (θ̂) − θ.
Consistency: The estimator θ̂n is said to be a consistent estimator of θ if, for any positive number ε,
lim_{n→∞} P(|θ̂n − θ| ≤ ε) = 1,
or equivalently,
lim_{n→∞} P(|θ̂n − θ| > ε) = 0.
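As a rough numerical illustration of this definition (a sketch only; it uses the sample mean of exponential data with θ = 1 and ε = 0.1, both chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(1)
theta, eps, reps = 1.0, 0.1, 20_000         # arbitrary illustration values

for n in (10, 100, 1000):
    # theta_hat_n = sample mean of n observations with mean theta, repeated reps times
    theta_hat = rng.exponential(scale=theta, size=(reps, n)).mean(axis=1)
    print(n, np.mean(np.abs(theta_hat - theta) > eps))   # P(|theta_hat_n - theta| > eps) -> 0
```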
Hint for #1(Exercise 9.3)
9.3
Let Y1 , Y2 , . . . , Yn denote a random sample from the uniform
distribution on the interval (θ, θ + 1). Let
θ̂1 = Ȳ − 1/2 and θ̂2 = Y(n) − n/(n + 1).
Hint for #1(Exercise 9.3)(a)
9.3(a)
Show that both θ̂1 and θ̂2 are unbiased estimators of θ.
Hint for (a):
1. Information:
Y ∼ uniform(θ, θ + 1), θ̂1 = Ȳ − 1/2 and θ̂2 = Y(n) − n/(n + 1).
E(Y) = (2θ + 1)/2 (why?)
From Section 6.7, the pdf of Y(n) is gY(n)(y) = n(y − θ)^(n−1) for θ ≤ y ≤ θ + 1.
2. Goal: E(θ̂1) = θ and E(θ̂2) = θ
Continued: Hint for #1 (Exercise 9.3)(a)
9.3(a)
Show that both θ̂1 and θ̂2 are unbiased estimators of θ.
Hint for (a):
3. Bridge:
E(θ̂1) = E(Ȳ − 1/2) = E(Ȳ) − 1/2
E(Ȳ) = E((Σ_{i=1}^n Yi)/n) = (1/n) E(Σ_{i=1}^n Yi) = ?
E(θ̂2) = E(Y(n) − n/(n + 1)) = E(Y(n)) − n/(n + 1)
E(Y(n)) = ∫_θ^{θ+1} y gY(n)(y) dy = ?
4. Fine tune: you have all the pieces you need, make it work!
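If it helps, here is a short Monte Carlo sketch of the goal (θ = 3 and n = 8 are arbitrary illustration values; this does not replace the expectation calculations above):

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 3.0, 8, 200_000            # arbitrary illustration values

y = rng.uniform(theta, theta + 1, size=(reps, n))
theta1_hat = y.mean(axis=1) - 0.5           # Ybar - 1/2
theta2_hat = y.max(axis=1) - n / (n + 1)    # Y_(n) - n/(n+1)

print(theta1_hat.mean(), theta2_hat.mean()) # both should be close to theta
```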
Hint for #1(Exercise 9.3)(b)
9.3(b)
Find the efficiency of θ̂1 relative to θ̂2 .
Hint for (b):
Known: Y ∼ uniform(θ, θ + 1)
Facts:
Var(Y) = ((θ + 1) − θ)²/12 = 1/12
formula: eff(θ̂1, θ̂2) = Var(θ̂2)/Var(θ̂1) (Definition 9.1, page 445)
Goal: find eff(θ̂1, θ̂2)
Hint for #1(Exercise 9.3)(b)
9.3(b)
Find the efficiency of θ̂1 relative to θ̂2 .
Hint for (b):
Way to approach:
Var(θ̂1) = Var(Ȳ) = Var((Σ_{i=1}^n Yi)/n) = (1/n²) Var(Σ_{i=1}^n Yi) = ?
Var(Y(n)) = n/((n + 2)(n + 1)²) (You need to show COMPLETE work for how you get this.)
Note: Var(Y(n)) = E(Y(n)²) − [E(Y(n))]².
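A hedged numerical cross-check of the two variances and the resulting efficiency (θ = 3 and n = 8 are arbitrary; you still need to show the analytic work):

```python
import numpy as np

rng = np.random.default_rng(3)
theta, n, reps = 3.0, 8, 200_000            # arbitrary illustration values

y = rng.uniform(theta, theta + 1, size=(reps, n))
var1 = (y.mean(axis=1) - 0.5).var()         # Var(theta1_hat); theory: 1/(12n)
var2 = (y.max(axis=1) - n / (n + 1)).var()  # Var(theta2_hat); theory: n/((n+2)(n+1)**2)

print(var1, 1 / (12 * n))
print(var2, n / ((n + 2) * (n + 1) ** 2))
print(var2 / var1)                          # simulated eff(theta1_hat, theta2_hat)
```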
Hint for #2(Exercise 9.15)
9.15
Refer to Exercise 9.3. Show that both θ̂1 and θ̂2 are consistent
estimators for θ.
1. Information: From Exercise 9.3, we know that both θ̂1 and θ̂2
are unbiased estimators for θ.
2. Goal: θ̂1 and θ̂2 are consistent estimators for θ
3. Bridge:
Theorem 9.1 states that an unbiased estimator θ̂n for θ is a consistent estimator of θ if lim_{n→∞} V(θ̂n) = 0.
Think: How do variances of θ̂1 and θ̂2 behave when n → ∞?
4. Fine tune: you could wrap it up!
Note: Look at Example 9.2 on page 451.
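To see the bridge numerically (a sketch only; θ = 3 is arbitrary), the simulated variances of both estimators shrink as n grows:

```python
import numpy as np

rng = np.random.default_rng(4)
theta, reps = 3.0, 20_000                   # arbitrary illustration values

for n in (5, 50, 500):
    y = rng.uniform(theta, theta + 1, size=(reps, n))
    v1 = (y.mean(axis=1) - 0.5).var()           # Var(theta1_hat) -> 0
    v2 = (y.max(axis=1) - n / (n + 1)).var()    # Var(theta2_hat) -> 0
    print(n, v1, v2)
```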
Hint for #3(Exercise 9.19)
9.19
Let Y1 , Y2 , . . . , Yn denote a random sample from the probability
density function
fY(y) = θy^(θ−1) for 0 < y < 1, and 0 elsewhere,
where θ > 0. Show that Ȳ is a consistent estimator of θ/(θ + 1).
Hint for #3(Exercise 9.19)
Proof outline for 9.19 :
1. Information: Recognize that Y ∼ Beta(θ, 1)
2. Goal: Show that Ȳ is a consistent estimator of θ/(θ + 1).
3. Bridge:
E(Y) = θ/(θ + 1) and Var(Y) = θ/((θ + 2)(θ + 1)²)
E(Ȳ) = E((Σ_{i=1}^n Yi)/n) = (1/n) E(Σ_{i=1}^n Yi) = ?
Var(Ȳ) = Var((Σ_{i=1}^n Yi)/n) = (1/n²) Var(Σ_{i=1}^n Yi) = ?
4. Fine tune: You can wrap it up!
Think: Is Ȳ an unbiased estimator of θ/(θ + 1)? Can we apply Theorem 9.1 here?
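A quick simulation in the same spirit as steps 1–3 (θ = 2 is arbitrary; Beta(θ, 1) draws play the role of Y):

```python
import numpy as np

rng = np.random.default_rng(5)
theta, reps = 2.0, 10_000                   # arbitrary illustration values
target = theta / (theta + 1)

for n in (10, 100, 1000):
    ybar = rng.beta(theta, 1, size=(reps, n)).mean(axis=1)
    print(n, target, ybar.mean(), ybar.var())   # mean stays near theta/(theta+1); variance -> 0
```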
Sufficiency
Let Y1 , Y2 , . . . , Yn denote a random sample from a probability
distribution with unknown parameter θ. Then the statistic
U = g (Y1 , Y2 , . . . , Yn ) is said to be sufficient for θ if the
conditional distribution of Y1 , Y2 , . . . , Yn , given U, does not
depend on θ.
Theorem 9.4. Let U be a statistic based on the random
sample Y1 , Y2 , . . . , Yn . Then U is a sufficient statistic for the
estimation of a parameter θ if and only if the likelihood
L(θ) = L(y1 , y2 , . . . , yn |θ) can be factored into two
nonnegative functions,
L(y1 , y2 , . . . , yn |θ) = g (u, θ) × h(y1 , y2 , . . . , yn )
where g (u, θ) is a function only of u and θ and
h(y1 , y2 , . . . , yn ) is not a function of θ.
Exercise 9.41 (similar to #4, Exercise 9.40)
9.41
Let Y1 , Y2 , . . . , Yn denote a random sample from a Weibull distribution with known m and unknown α. (Refer to Exercise 6.26.) Show that Σ_{i=1}^n Yi^m is sufficient for α.
Exercise 9.41
Proof:
1. Information: Y ∼ Weibull(α)
2. Goal: Show that Σ_{i=1}^n Yi^m is sufficient for α.
3. Bridge:
Likelihood function: L(α) = α^(−n) m^n (Π_{i=1}^n yi^(m−1)) exp(−Σ_{i=1}^n yi^m / α)
Use Theorem 9.4 (factorization criterion)
4. Fine tune:
g(u, α) = α^(−n) exp(−u/α) and h(y) = m^n Π_{i=1}^n yi^(m−1), where u = Σ_{i=1}^n yi^m.
By the factorization criterion, U = Σ_{i=1}^n Yi^m is sufficient for α.
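If you want to convince yourself numerically that this factorization is consistent (a sketch only; the density below is the Exercise 6.26 parametrization matching the likelihood above, and m, α, and the data are arbitrary illustration values):

```python
import numpy as np

def weibull_density(y, m, alpha):
    # f(y) = (m/alpha) * y**(m-1) * exp(-y**m / alpha) for y > 0, matching the likelihood above
    return (m / alpha) * y ** (m - 1) * np.exp(-(y ** m) / alpha)

m, alpha = 2.0, 1.5                                 # arbitrary illustration values
y = np.array([0.4, 1.1, 0.9])                       # arbitrary illustration data
n = y.size

L_direct = np.prod(weibull_density(y, m, alpha))    # likelihood as a product of densities
u = np.sum(y ** m)                                  # candidate sufficient statistic
g = alpha ** (-n) * np.exp(-u / alpha)              # depends on the data only through u
h = m ** n * np.prod(y ** (m - 1))                  # free of alpha
print(L_direct, g * h)                              # the two agree
```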
Hint for #5 (Exercise 9.43)
9.43
Let Y1 , Y2 , . . . , Yn denote independent and identically distributed
random variables from a power family distribution with parameters
α and θ. Then, by the result in Exercise 6.17, if α, θ > 0,
f(y | α, θ) = αy^(α−1)/θ^α for 0 ≤ y ≤ θ, and 0 elsewhere.
If θ is known, show that Π_{i=1}^n Yi is sufficient for α.
Exercise 9.43
Proof:
1. Information: Given that Y has pdf
f(y | α, θ) = αy^(α−1)/θ^α for 0 ≤ y ≤ θ, and 0 elsewhere.
2. Goal: Show that Π_{i=1}^n Yi is sufficient for α.
3. Bridge:
Likelihood function: L(α) = α^n θ^(−nα) (Π_{i=1}^n yi)^(α−1)
Use Theorem 9.4 (factorization criterion)
4. Fine tune: you can wrap it up!
Remark
Make sure you know how to obtain the likelihood function. Review the definition of the likelihood function from Wednesday's lecture.
For most sufficiency problems, the first step is usually to write out the likelihood function and then apply the factorization criterion.
It is legal to have h(y) = 1.
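For example, here is a minimal sketch of building a likelihood numerically for the Exercise 9.43 power-family density (the log scale avoids underflow; the values of α, θ, and the data are arbitrary illustration choices):

```python
import numpy as np

def power_family_log_likelihood(alpha, theta, y):
    # log L(alpha) = sum_i log f(y_i | alpha, theta), with f(y) = alpha * y**(alpha-1) / theta**alpha
    y = np.asarray(y, dtype=float)
    return np.sum(np.log(alpha) + (alpha - 1) * np.log(y) - alpha * np.log(theta))

# arbitrary illustration: theta is known, observations lie in (0, theta)
print(power_family_log_likelihood(alpha=2.0, theta=1.5, y=[0.3, 0.8, 1.1]))
```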