Department of Economics
UCLA
Fall 2015
Comprehensive Examination
Quantitative Methods
This exam consists of three parts. You are required to answer all the questions in all the parts.
Each part is worth 100 points, with relative weights given by the points assigned to each question.
Allocate your time wisely. Good luck!
Part I - 203A
Question 1 (20 points)
Suppose that $A_1$, $A_2$, $B_1$, and $B_2$ are four mutually independent random variables with, respectively, marginal cumulative distribution functions $F_{A_1}$, $F_{A_2}$, $F_{B_1}$, and $F_{B_2}$. Assume that $F_{A_1}$, $F_{A_2}$, $F_{B_1}$, and $F_{B_2}$ are strictly increasing and differentiable. Let
$A = \min\{A_1, A_2\}$, $B = \min\{B_1, B_2\}$, and $C = \max\{A, B\}$.
Answer the following questions in terms of $F_{A_1}$, $F_{A_2}$, $F_{B_1}$, and $F_{B_2}$.
(a; 10 points) Derive the probability densities of $(A, B)$ and of $C$. Are $A$ and $B$ independent?
Justify your answers.
(b; 10 points) Let $E = (A_1)^2 + (B_1)^2$. Derive the density of $E$.
Question 2 (80 points):
Consider the following model
$Y_1 = \alpha + \beta Y_2 + \varepsilon_1$
$Y_2 = \gamma + \delta X + \varepsilon_2$
where $\alpha$, $\beta$, $\gamma$, and $\delta$ are parameters, the probability density of $\varepsilon_2$ conditional on $X = x$ is, for a parameter $\sigma_2 > 0$,
$f_{\varepsilon_2 \mid X = x}(\varepsilon_2) = \frac{1}{\sqrt{2\pi}\,\sigma_2} \exp\!\left(-\frac{1}{2}\left(\frac{\varepsilon_2 - x}{\sigma_2}\right)^2\right), \qquad -\infty < \varepsilon_2 < \infty,$
and the probability density of $\varepsilon_1$ is, for a parameter $\sigma_1 > 0$,
$f_{\varepsilon_1}(\varepsilon_1) = \frac{1}{\sqrt{2\pi}\,\sigma_1} \exp\!\left(-\frac{1}{2}\left(\frac{\varepsilon_1}{\sigma_1}\right)^2\right), \qquad -\infty < \varepsilon_1 < \infty.$
The random variable $\varepsilon_1$ is assumed to be distributed independently of $X$, and the (marginal) density of $X$ is given by
$f_X(x) = \begin{cases} \dfrac{1}{\theta_2 - \theta_1} & \theta_1 < x < \theta_2 \\[4pt] 0 & \text{otherwise} \end{cases}$
for parameters $\theta_1$ and $\theta_2$.
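For intuition only, here is a minimal simulation sketch of this data-generating process. The particular values assigned to $\alpha, \beta, \gamma, \delta, \sigma_1, \sigma_2, \theta_1, \theta_2$ below are illustrative assumptions and are not given in the problem.

```python
# Illustrative simulation of the triangular model above (assumed parameter values).
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Assumed parameter values, purely for illustration (not given in the exam).
alpha, beta, gamma, delta = 1.0, 0.5, 2.0, 1.5
sigma1, sigma2 = 1.0, 2.0
theta1, theta2 = 0.5, 3.0                          # X ~ Uniform(theta1, theta2)

X = rng.uniform(theta1, theta2, size=n)
eps2 = rng.normal(loc=X, scale=sigma2)             # eps2 | X = x  ~  N(x, sigma2^2)
eps1 = rng.normal(loc=0.0, scale=sigma1, size=n)   # eps1 ~ N(0, sigma1^2), independent of X
Y2 = gamma + delta * X + eps2
Y1 = alpha + beta * Y2 + eps1

print("sample mean of Y1:", Y1.mean())
print("sample P(Y2 <= 0):", (Y2 <= 0).mean())
```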
The answers to (a)-(c) should be in terms of parameters.
(a; 10 points) What is the density of $Y_1$ conditional on $X = x$ for $x$ in $(\theta_1, \theta_2)$? What is the expectation of $Y_1$?
(b; 10 points) What is the probability that $(Y_2 \le 0)$? What is the probability that $(X \le 0)$ conditional on $\varepsilon_2$?
(c; 10 points) What is the joint cumulative distribution of $(Y_1, Y_2)$? What is the moment generating function of $Y_2$?
Suppose now that the density of $\varepsilon_2$ conditional on $X = x$, for $x$ in $(\theta_1, \theta_2)$, is an unknown continuous function, $f_{\varepsilon_2 \mid X = x}$, of $\varepsilon_2$ and $x$, and, as above,
$Y_2 = \gamma + \delta X + \varepsilon_2.$
(d; 15 points) What can and what cannot be identified about the function $f_{\varepsilon_2 \mid X = x}$ from the distribution of $(Y_2, X)$ under the following cases?
(i) No additional information.
(ii) $\gamma = 0$ and $\delta = 1$.
(iii) For all $x$ in $(\theta_1, \theta_2)$, the expectation of $\varepsilon_2$ given $X = x$ is 0.
(iv) For all $x$ in $(\theta_1, \theta_2)$:
$\int_{-\infty}^{0} f_{\varepsilon_2 \mid X = x}(t)\, dt = 0.5.$
Justify your answers.
(e; 15 points) What can and what cannot be identified about the function $f_{\varepsilon_2 \mid X = x}$ from the probability that $(Y_2 \le 0)$ conditional on $X = x$, for all $x$ in $(\theta_1, \theta_2)$, under the following cases?
(i) No additional information.
(ii) $\gamma = 0$ and $\delta = 1$.
(iii) For all $x$ in $(\theta_1, \theta_2)$, the expectation of $\varepsilon_2$ given $X = x$ is 0.
(iv) $\varepsilon_2$ and $X$ are distributed independently.
Justify your answers.
Suppose that you can observe an i.i.d. sample $\{X_1, X_2, \ldots\}$ of size $N$ from the distribution of $X$.
(f; 20 points) Assume that $\theta_1 > 0$. Let $\mu = \sqrt{\theta_1 + \theta_2}$. Provide a consistent estimator for $\mu$ and an approximate confidence interval, in terms of the observations.
Part II - 203B
Question 1
Suppose that
$y_i = x_i \beta + \varepsilon_i$
$x_i = \tfrac{1}{2} z_{i1} + \tfrac{1}{3} z_{i2} + v_i,$
where we assume that (i) $(z_{i1}, z_{i2})'$ is independent of $(\varepsilon_i, v_i)'$; (ii) $z_{i1}$ and $z_{i2}$ are independent of each other; (iii) $E[z_{i1}] = E[z_{i2}] = E[\varepsilon_i] = E[v_i] = 0$; (iv) $E[z_{i1}^2] = E[z_{i2}^2] = E[\varepsilon_i^2] = E[v_i^2] = 1$; and (v) $(x_i, z_{i1}, z_{i2}, \varepsilon_i, v_i)'$, $i = 1, 2, \ldots$, are i.i.d.
Question 1-1 (10 pts.)
No derivation is required for this question; your derivation will not be read anyway.
Let
$b_1 = \frac{\sum_{i=1}^n z_{i1} y_i}{\sum_{i=1}^n z_{i1} x_i}, \qquad b_2 = \frac{\sum_{i=1}^n z_{i2} y_i}{\sum_{i=1}^n z_{i2} x_i}.$
It can be shown that
$\begin{bmatrix} \sqrt{n}\,(b_1 - \beta) \\ \sqrt{n}\,(b_2 - \beta) \end{bmatrix}$
is asymptotically normal with mean zero. Provide a numerical characterization of the asymptotic
variance matrix. (You do not have to establish asymptotic normality. Just state the asymptotic
variance matrix. Your answer should take the form of numbers; an abstract formula will not
be accepted as an answer.)
Question 1-2 (10 pts.)
No derivation is required for this question; your derivation will not be read anyway.
Let
$b = \frac{\left(\sum_i x_i z_i'\right)\left(\sum_i z_i z_i'\right)^{-1}\left(\sum_i z_i y_i\right)}{\left(\sum_i x_i z_i'\right)\left(\sum_i z_i z_i'\right)^{-1}\left(\sum_i z_i x_i\right)} = \beta + \frac{\left(\frac{1}{n}\sum_i x_i z_i'\right)\left(\frac{1}{n}\sum_i z_i z_i'\right)^{-1}\left(\frac{1}{n}\sum_i z_i \varepsilon_i\right)}{\left(\frac{1}{n}\sum_i x_i z_i'\right)\left(\frac{1}{n}\sum_i z_i z_i'\right)^{-1}\left(\frac{1}{n}\sum_i z_i x_i\right)},$
where $z_i = (z_{i1}, z_{i2})'$, denote the 2SLS estimator. It can be shown that $\sqrt{n}\,(b - \beta)$ is asymptotically normal with mean zero.
Provide a numerical characterization of the asymptotic variance. (You do not have to establish
asymptotic normality. Just state the asymptotic variance. Your answer should take the form
of a number; an abstract formula will not be accepted as an answer.)
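As a numerical sanity check only, the sketch below simulates the Question 1 design and evaluates the two just-identified IV estimators $b_1$, $b_2$ and the 2SLS estimator $b$ exactly as defined above. The value assigned to $\beta$ is an illustrative assumption, not part of the question.

```python
# Simulation of the Question 1 design; beta's value is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
beta = 1.0                                   # assumed "true" value, for illustration only

z1 = rng.normal(size=n)
z2 = rng.normal(size=n)
v = rng.normal(size=n)
eps = rng.normal(size=n)
x = 0.5 * z1 + (1.0 / 3.0) * z2 + v
y = x * beta + eps

# Just-identified IV estimators using z1 and z2 separately.
b1 = (z1 @ y) / (z1 @ x)
b2 = (z2 @ y) / (z2 @ x)

# 2SLS with instrument vector z_i = (z_{i1}, z_{i2})'.
Z = np.column_stack([z1, z2])
Szz_inv = np.linalg.inv(Z.T @ Z)
b_2sls = ((x @ Z) @ Szz_inv @ (Z.T @ y)) / ((x @ Z) @ Szz_inv @ (Z.T @ x))

print(b1, b2, b_2sls)                        # all should be close to beta
```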
Question 2 (10 pts.)
No derivation is required for this question; your derivation will not be read anyway.
Consider the following two-equation model:
$y_1 = \alpha_1 y_2 + \beta_{11} x_1 + \beta_{31} x_3 + \varepsilon_1$
$y_2 = \alpha_2 y_1 + \beta_{12} x_1 + \beta_{22} x_2 + \varepsilon_2$
where we assume that
$E\!\begin{bmatrix} x_1 \varepsilon_1 \\ x_2 \varepsilon_1 \\ x_3 \varepsilon_1 \end{bmatrix} = E\!\begin{bmatrix} x_1 \varepsilon_2 \\ x_2 \varepsilon_2 \\ x_3 \varepsilon_2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}.$
We know that
$y_1 = x_1 + 2x_2 + 3x_3 + u_1$
$y_2 = 4x_1 + 5x_2 + 6x_3 + u_2$
where
$E\!\begin{bmatrix} x_1 u_1 \\ x_2 u_1 \\ x_3 u_1 \end{bmatrix} = E\!\begin{bmatrix} x_1 u_2 \\ x_2 u_2 \\ x_3 u_2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}.$
What are the numerical values of the $\alpha$'s and $\beta$'s?
Question 3 (20 pts.)
No derivation is required for this question; your derivation will not be read anyway.
Suppose that
$y_i = \alpha + \beta x_i + \varepsilon_i$
with $(y_i, x_i)'$, $i = 1, \ldots, n$, i.i.d., and $E[\varepsilon_i] = E[x_i \varepsilon_i] = 0$. You are given the following data set
with $n = 3$:
$\begin{bmatrix} y_1 & x_1 \\ y_2 & x_2 \\ y_3 & x_3 \end{bmatrix} = \begin{bmatrix} 2 & 0 \\ 0 & 1 \\ 4 & 2 \end{bmatrix}.$
The OLS estimates from a regression of $y$ on a constant and $x$ using the given data are
$\begin{bmatrix} \hat\alpha \\ \hat\beta \end{bmatrix} = \left( \begin{bmatrix} 1 & 1 & 1 \\ 0 & 1 & 2 \end{bmatrix} \begin{bmatrix} 1 & 0 \\ 1 & 1 \\ 1 & 2 \end{bmatrix} \right)^{-1} \begin{bmatrix} 1 & 1 & 1 \\ 0 & 1 & 2 \end{bmatrix} \begin{bmatrix} 2 \\ 0 \\ 4 \end{bmatrix} = \begin{bmatrix} 1 \\ 1 \end{bmatrix},$
and the residual vector is
$e = \begin{bmatrix} 2 \\ 0 \\ 4 \end{bmatrix} - \begin{bmatrix} 1 & 0 \\ 1 & 1 \\ 1 & 2 \end{bmatrix} \begin{bmatrix} 1 \\ 1 \end{bmatrix} = \begin{bmatrix} 1 \\ -2 \\ 1 \end{bmatrix}.$
Assuming heteroskedasticity and using White's heteroskedasticity-corrected asymptotic variance
estimator, provide a 95% asymptotic confidence interval for $\beta$. If your answer involves a square
root, try to simplify as much as you can.
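The estimates and residuals stated in the question can be verified mechanically. The short sketch below reproduces the OLS fit from the three observations given above; it does not compute the requested confidence interval.

```python
# Verify the OLS estimates and residual vector reported in Question 3.
import numpy as np

y = np.array([2.0, 0.0, 4.0])
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])              # column of ones and the regressor x = (0, 1, 2)'

coef = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ coef

print(coef)                             # expected: [1., 1.]
print(resid)                            # expected: [ 1., -2.,  1.]
```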
Question 4 (10 pts.)
No derivation is required for this question; your derivation will not be read anyway.
Suppose that
$y_i = \beta_i x_i + \varepsilon_i, \qquad i = 1, \ldots, n,$
and that (i) $(\beta_i, x_i, \varepsilon_i)$, $i = 1, 2, \ldots$, are i.i.d.; (ii) $x_i$, $\beta_i$, and $\varepsilon_i$ are independent of each other;
(iii) $x_i \sim N(0, 1)$, $\beta_i \sim N(\beta, 1)$, $\varepsilon_i \sim N(0, 1)$. Note that $\beta = E[\beta_i]$. Let
$\hat\beta = \frac{\sum_{i=1}^n x_i y_i}{\sum_{i=1}^n x_i^2}$
denote the coefficient of $x_i$ when $y_i$ is regressed on $x_i$. What is the asymptotic distribution of
$\sqrt{n}\,(\hat\beta - \beta)$? Make sure that the asymptotic variance is a concrete number, not an abstract
formula.
Question 5 (20 pts.)
In this question, your derivation as well as your answer will be read and evaluated.
Suppose that we have
$y_i = \alpha + \beta x_i + \varepsilon_i, \qquad i = 1, \ldots, 10,$
$y_i = \gamma + \delta x_i + \varepsilon_i, \qquad i = 11, \ldots, 20.$
We assume that $x_1, \ldots, x_{20}$ are nonstochastic, and that the $\varepsilon_i$ are i.i.d. such that $\varepsilon_i \sim N(0, 1)$. We
assume that
$\sum_{i=1}^{10} x_i = 0, \qquad \sum_{i=1}^{10} x_i^2 = 20, \qquad \sum_{i=11}^{20} x_i = 0, \qquad \sum_{i=11}^{20} x_i^2 = 40.$
Let $(\hat\alpha, \hat\beta)$ and $(\hat\gamma, \hat\delta)$ denote the OLS coefficients when $y$ is regressed on a constant and $x$ for
$i = 1, \ldots, 10$ and for $i = 11, \ldots, 20$, respectively. What is the distribution of $(\hat\beta - \hat\delta) - (\beta - \delta)$? (You will get
at most 50% of the credit if you do not establish the joint distribution of $(\hat\beta, \hat\delta)$ rigorously.)
Question 6 (20 pts.)
In this question, your derivation as well as your answer will be read and evaluated.
Suppose that
$y_i = \beta_i x_i + \varepsilon_i, \qquad i = 1, \ldots, n,$
and that (i) $(\beta_i, x_i, \varepsilon_i)$, $i = 1, 2, \ldots$, are i.i.d.; (ii) $x_i$ is independent of $\varepsilon_i$; (iii) $x_i \sim N(1, 1)$;
and (iv) $\beta_i \mid x_i \sim N(x_i, 1)$ and $\varepsilon_i \sim N(0, 1)$ are independent of each other. Let $\hat\beta$ denote the
coefficient of $x_i$ when $y_i$ is regressed on $x_i$. What is the numerical value of $\operatorname{plim} \hat\beta - E[\beta_i]$?
You may want to recall that the moment generating function of $N(\mu, \sigma^2)$ is $\exp\!\left(\mu t + \frac{\sigma^2 t^2}{2}\right)$.
Part III - 203C
1. A random variable $X$ is called exponential($\theta$) if it has the probability density function
$f(x) = \begin{cases} \theta \exp(-\theta x), & \text{if } x \ge 0 \\ 0, & \text{otherwise.} \end{cases}$
Suppose that we have two independent random samples: $\{X_i\}_{i=1}^n$ from exponential($\lambda$) and
$\{Y_j\}_{j=1}^m$ from exponential($\mu$).
(a) (10 points) Find the likelihood ratio test of $H_0: \lambda = \mu$ vs. $H_1: \lambda \neq \mu$.
(b) (10 points) Show that the test in part (a) can be based on the statistic
$T_{n,m} = \frac{\sum_{i=1}^n X_i}{\sum_{i=1}^n X_i + \sum_{j=1}^m Y_j}.$
(c) (5 points) Find the finite sample distribution of $T_{n,m}$ when $H_0$ is true. Moreover, find
the critical value of the level-$\alpha$ test in part (a) based on $T_{n,m}$.
(d) (10 points) Find the asymptotic properties of the level-$\alpha$ test in (c) under $H_0$ in
the following three separate scenarios: (i) $n/m \to 0$; (ii) $n/m \to \infty$; and (iii)
$n/m \to c \in (0, \infty)$.
(e) (10 points) Find the asymptotic properties of the level-$\alpha$ test in (c) under $H_1$ in
the following three separate scenarios: (i) $n/m \to 0$; (ii) $n/m \to \infty$; and (iii)
$n/m \to c \in (0, \infty)$.
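For concreteness, the sketch below simply evaluates the statistic $T_{n,m}$ from part (b) on simulated exponential samples. The sample sizes and rates are illustrative assumptions, and the sketch does not carry out the test itself.

```python
# Compute T_{n,m} from part (b) on simulated exponential data (illustrative rates).
import numpy as np

rng = np.random.default_rng(2)
n, m = 50, 80
lam, mu = 2.0, 2.0                              # assumed rates; equal here, i.e. H0 holds

X = rng.exponential(scale=1.0 / lam, size=n)    # exponential(lam) has mean 1/lam
Y = rng.exponential(scale=1.0 / mu, size=m)

T = X.sum() / (X.sum() + Y.sum())
print("T_{n,m} =", T)
```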
2. Suppose $\{X_t\}$ is generated from the following model
$X_t = \rho X_{t-1} + u_t + \theta u_{t-1},$
where $u_t$ is i.i.d. $(0, \sigma_u^2)$ with finite 4-th moment and $|\rho| < 1$. We have $T$ observations
$\{X_t\}_{t=1}^T$ from this process.
(a) (10 points) Find the spectral density of the process $\{X_t\}$.
(b) (10 points) Show that $E[(X_t - \rho X_{t-1}) X_{t-2}] = 0$ for any $|\rho| < 1$ and for any $\theta$.
Construct the method of moments (MM) estimator of $\rho$ and find its asymptotic
distribution.
(c) (10 points) Suppose that one is interested in testing $H_0: \rho = 0$ versus $H_1: \rho \neq 0$.
Construct a test using the MM estimator in part (b) and study its asymptotic
properties under both $H_0$ and $H_1$.
(d) (10 points) Suppose that we know that $\rho \in (0, 1)$ and $\theta > 0$. Prove or disprove the
following statement: $\rho$ and $\theta$ are identified by the following moment conditions
$E[(X_t - \rho X_{t-1}) X_{t-2}] = 0,$
$E[(X_t - (\rho + \theta) X_{t-1} + \rho\theta X_{t-2}) X_{t-3}] = 0.$
(e) (15 points) Consider the LS estimator of $\rho$:
$\hat\rho_n = \frac{\sum_{t=2}^T X_t X_{t-1}}{\sum_{t=1}^T X_t^2}.$
From $\hat\rho_n$, we can construct the fitted residual $\hat u_t = X_t - \hat\rho_n X_{t-1}$ for $t = 2, \ldots, T$, and
then construct the following statistic
$\hat\theta_n = \frac{\sum_{t=3}^T \hat u_t \hat u_{t-1}}{\sum_{t=2}^T \hat u_t^2}.$
Under the conditions in (d), find the probability limit of $\hat\theta_n$.
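To make the definitions in part (e) concrete, a minimal sketch that simulates the ARMA(1,1) process and evaluates $\hat\rho_n$ and $\hat\theta_n$ exactly as defined above. The values of $\rho$, $\theta$, and $\sigma_u$ are illustrative assumptions; the sketch is not an answer to (e).

```python
# Simulate X_t = rho*X_{t-1} + u_t + theta*u_{t-1} and compute the part-(e) statistics.
import numpy as np

rng = np.random.default_rng(3)
T = 100_000
rho, theta, sigma_u = 0.5, 0.4, 1.0          # illustrative parameter values (assumptions)

u = rng.normal(scale=sigma_u, size=T + 1)
X = np.zeros(T + 1)
for t in range(1, T + 1):
    X[t] = rho * X[t - 1] + u[t] + theta * u[t - 1]
X = X[1:]                                    # keep T observations X_1, ..., X_T

# LS estimator of rho as defined in part (e).
rho_hat = (X[1:] @ X[:-1]) / (X @ X)

# Fitted residuals u_hat_t (t = 2, ..., T) and the ratio statistic theta_hat.
u_hat = X[1:] - rho_hat * X[:-1]
theta_hat = (u_hat[1:] @ u_hat[:-1]) / (u_hat @ u_hat)

print("rho_hat   =", rho_hat)
print("theta_hat =", theta_hat)
```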
Some Useful Theorems and Lemmas
Theorem 1 If $X_1, X_2, \ldots, X_k$ are i.i.d. from exponential($\theta$), then $\sum_i X_i$ is a gamma($k, \theta$)
random variable. Moreover, if $X$ is a gamma($k_1, \theta$) random variable, $Y$ is a gamma($k_2, \theta$)
random variable, and $X$ and $Y$ are independent, then $X/(X+Y)$ is a beta($k_1, k_2$) random
variable whose pdf only depends on $k_1$ and $k_2$.
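A quick Monte Carlo check of Theorem 1 (the rate $\theta$ and the shapes $k_1$, $k_2$ below are illustrative assumptions): the ratio $X/(X+Y)$ should match the mean and variance of a beta($k_1, k_2$) distribution regardless of $\theta$.

```python
# Monte Carlo check of Theorem 1: sums of exponentials are gamma, and the
# ratio X/(X+Y) of independent gammas with a common rate is beta(k1, k2).
import numpy as np

rng = np.random.default_rng(4)
theta, k1, k2, reps = 3.0, 4, 6, 100_000      # illustrative values (assumptions)

X = rng.exponential(scale=1.0 / theta, size=(reps, k1)).sum(axis=1)  # gamma(k1, theta)
Y = rng.exponential(scale=1.0 / theta, size=(reps, k2)).sum(axis=1)  # gamma(k2, theta)
R = X / (X + Y)

# Compare simulated moments with those of beta(k1, k2).
print("simulated mean:", R.mean(), " beta mean:", k1 / (k1 + k2))
print("simulated var: ", R.var(),  " beta var: ",
      k1 * k2 / ((k1 + k2) ** 2 * (k1 + k2 + 1)))
```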
Theorem 2 (Martingale Convergence Theorem) Let $\{(X_t, \mathcal{F}_t)\}_{t \in \mathbb{Z}_+}$ be a martingale in
$L^2$. If $\sup_t E|X_t|^2 < \infty$, then $X_n \to X_\infty$ almost surely, where $X_\infty$ is some element in $L^2$.
Theorem 3 (Martingale CLT) Let $\{X_{t,n}, \mathcal{F}_{t,n}\}$ be a martingale difference array such that
$E[|X_{t,n}|^{2+\delta}] < \Delta < \infty$ for some $\delta > 0$ and for all $t$ and $n$. If $\bar\sigma_n^2 > \delta_1 > 0$ for all $n$ sufficiently
large and $\frac{1}{n}\sum_{t=1}^n X_{t,n}^2 - \bar\sigma_n^2 \to_p 0$, then $\sqrt{n}\,\bar X_n / \bar\sigma_n \to_d N(0, 1)$.
Theorem 4 (LLN of Linear Processes) Suppose that $Z_t$ is i.i.d. with mean zero and $E[|Z_0|] < \infty$.
Let $X_t = \sum_{k=0}^{\infty} \varphi_k Z_{t-k}$, where $\varphi_k$ is a sequence of real numbers with $\sum_{k=0}^{\infty} k\,|\varphi_k| < \infty$. Then
$\frac{1}{n} \sum_{t=1}^n X_t \to_{a.s.} 0$.
Theorem 5 (CLT of Linear Processes) Suppose that $Z_t$ is i.i.d. with mean zero and $E[Z_0^2] = \sigma_Z^2 < \infty$.
Let $X_t = \sum_{k=0}^{\infty} \varphi_k Z_{t-k}$, where $\varphi_k$ is a sequence of real numbers with $\sum_{k=0}^{\infty} k\,\varphi_k^2 < \infty$.
Then $n^{-1/2} \sum_{t=1}^n X_t \to_d N[0, \varphi(1)^2 \sigma_Z^2]$.
Theorem 6 (LLN of Sample Variance) Suppose that $Z_t$ is i.i.d. with mean zero and $E[Z_0^2] = \sigma_Z^2 < \infty$.
Let $X_t = \sum_{k=0}^{\infty} \varphi_k Z_{t-k}$, where $\varphi_k$ is a sequence of real numbers with $\sum_{k=0}^{\infty} k\,\varphi_k^2 < \infty$.
Then
$\frac{1}{n} \sum_{t=1}^n X_t X_{t-h} \to_p \gamma_X(h) = E[X_t X_{t-h}].$
Theorem 7 (Donsker) Let $\{u_t\}$ be a sequence of random variables generated by $u_t = \sum_{k=0}^{\infty} \varphi_k \varepsilon_{t-k} \equiv \varphi(L)\varepsilon_t$,
where $\{\varepsilon_t\}$ is i.i.d. $(0, \sigma_\varepsilon^2)$ with finite fourth moment and $\{\varphi_k\}$ is a sequence of constants
with $\sum_{k=0}^{\infty} k\,|\varphi_k| < \infty$. Then $B_{u,n}(\cdot) = n^{-1/2} \sum_{t=1}^{[n\,\cdot\,]} u_t \to_d B(\cdot)$, where $B(\cdot)$ is a Brownian
motion with variance $\lambda^2$ and $\lambda = \sigma_\varepsilon \varphi(1)$.
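To make Theorem 7 concrete, a minimal sketch that builds the scaled partial-sum process $B_{u,n}(r) = n^{-1/2}\sum_{t \le [nr]} u_t$ for an MA(1) choice of $\varphi(L)$ (an illustrative assumption); by the theorem, the variance of $B_{u,n}(1)$ should be close to $\sigma_\varepsilon^2\,\varphi(1)^2$.

```python
# Partial-sum process of the linear process u_t = eps_t + 0.5*eps_{t-1} (illustrative).
import numpy as np

rng = np.random.default_rng(5)
n, reps = 2_000, 5_000
phi = np.array([1.0, 0.5])                   # varphi_0, varphi_1; varphi(1) = 1.5
sigma_eps = 1.0

eps = rng.normal(scale=sigma_eps, size=(reps, n + 1))
u = phi[0] * eps[:, 1:] + phi[1] * eps[:, :-1]       # u_t = eps_t + 0.5 * eps_{t-1}

B_at_1 = u.sum(axis=1) / np.sqrt(n)          # B_{u,n}(1) for each replication
print("simulated Var[B_{u,n}(1)]:", B_at_1.var())
print("sigma_eps^2 * varphi(1)^2:", sigma_eps**2 * phi.sum()**2)   # = 2.25
```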