OLS & GLS
Bo Sjö, 30/10/2014

OLS
• Let X be an (N x K) matrix of K explanatory variables, including a column of ones for the constant term, over N observations.
• Let y be a vector of N observations on the dependent variable.
• Let B be a vector of parameters.
• Let e be a vector of N residual terms.
• y = XB + e

Minimizing the sum of squares
• Min (e'e) => S(B), the objective function.
• S(B) = (y - XB)'(y - XB)
• δS(B)/δB = -2(X'y - X'XB) = 0
• Solving for B gives the estimate b:
  b = (X'X)^-1 (X'y)
• Substituting y = XB + e gives b = B + (X'X)^-1 (X'e).

Properties of OLS: the Gauss-Markov conditions
• yt = xt'β + εt, or in matrix form y = Xβ + ε
• εt is a random variable (a process)
• E{εt} = 0: correct specification (+ no errors in variables)
• E{εt^2} = σ^2: homoscedasticity
• E{εt εt-k} = 0 for all k ≠ 0: no autocorrelation
• E{εt | X} = 0: weak exogeneity, with Var{εt | X} = σ^2
• We can add linearity and normality of εt for inference.
• If these conditions are fulfilled, OLS is the Best Linear Unbiased Estimator (BLUE), and the estimated coefficients are 'good' estimates of the true parameters of interest. The estimates asymptotically have a normal distribution (CLT).

GLS and FGLS (or EGLS)
• In your first course you learn how to use OLS and how to interpret the coefficients in a multivariate linear regression model. You focus on the problems of heteroscedasticity and autocorrelation.
• These problems can be analysed and 'solved' with Generalized Least Squares (GLS). If you know and can estimate the heteroscedasticity and autocorrelation correctly, you can pre-wash your data to restore the desired residual properties.
• Since GLS assumes that we know the covariance matrix, it must be replaced by an estimate, which leads to Feasible Generalized Least Squares (FGLS), also called Estimated Generalized Least Squares (EGLS).

Estimating the OLS
• In the basic textbook, the estimated parameters are given by b = (X'X)^-1 (X'y).
• We want this estimator to be unbiased: E(b) = B.
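The closed-form estimator b = (X'X)^-1 (X'y) can be sketched numerically on simulated data. This is a minimal illustration, not part of the course material: the true parameter vector, sample size, and random design below are all assumptions chosen for the example.

```python
import numpy as np

# Minimal sketch of OLS via the normal equations, b = (X'X)^-1 (X'y).
# The data-generating process below is simulated for illustration only.
rng = np.random.default_rng(42)
N = 500
B_true = np.array([1.0, 2.0, -0.5])          # constant plus two slopes (assumed)

X = np.column_stack([np.ones(N),             # column of ones for the constant term
                     rng.normal(size=(N, 2))])
e = rng.normal(scale=0.1, size=N)            # well-behaved (spherical) residuals
y = X @ B_true + e

# Solve (X'X) b = X'y; solve() avoids forming the explicit inverse
b = np.linalg.solve(X.T @ X, X.T @ y)
```

With residuals satisfying the Gauss-Markov conditions, b lands close to B_true, in line with the unbiasedness argument that follows.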
Unbiasedness
• Take expectations and substitute y = XB + e:
  E(b) = B + E{(X'X)^-1 (X'e)}
• Since B is a constant, E(B) = B.

What happened with the last term?
• E(b) = B + E{(X'X)^-1 (X'e)}
• If the Gauss-Markov conditions hold, it has an expected value of zero.
• Taking plim shows: 1) that (X'X)^-1 can be viewed as converging to a constant matrix; 2) that X'e ->p 0, typically because E(X'e) = E(X)E(e) = E(X) x 0 = 0 if X and e are independent.
• The last term, E{(X'X)^-1 (X'e)}, is always interesting:
  – It tells about bias, if non-zero.
  – It gives the variance (and efficiency), as we look at Var(b - B).
  – Also consistency: drive T to infinity and analyse what happens with the estimates.

Besides heteroscedasticity and autocorrelation
• Specification
• System equations: exogeneity
• Errors-in-variables problems (measurement)?
• What is y? Continuous, truncated, only positive values, ordered variables, a probability, a time-measure variable (duration)?

GLS estimation
• You want E{εε'} = σ^2 I, where I is the identity matrix, but you get E{εε'} = σ^2 Ω.
• To get what you want, find the inverse of Ω, such that Ω Ω^-1 = I.
• Finding the inverse means finding a P such that P'P = Ω^-1.
• Use P to construct new variables such that Py = PXβ + Pε, i.e. y* = X*β + ε*, where V{ε*} = σ^2 I.

FGLS & GMM
• Since we cannot know the P matrix, it must be estimated -> Feasible GLS. Here you operate on the variables.
• But suppose the problem is misspecification: then FGLS goes totally wrong. In time series, use dynamic specification (lags) instead.
• You can also operate on the σ^2 Ω expression directly. This is GMM.
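The GLS transformation above can be sketched for the simplest case, a known diagonal Ω (pure heteroscedasticity). Everything in this example (the weights, the design, the true β) is an illustrative assumption; in practice Ω is unknown, which is exactly why one moves to FGLS.

```python
import numpy as np

# Sketch of the GLS transformation for a known diagonal Omega:
# Var(eps_t) = sigma^2 * w_t, so Omega = diag(w), and P = diag(1/sqrt(w))
# satisfies P'P = Omega^-1. The weights w here are assumed known.
rng = np.random.default_rng(0)
N = 1000
beta_true = np.array([1.0, 3.0])              # constant and slope (assumed)

X = np.column_stack([np.ones(N), rng.normal(size=N)])
w = rng.uniform(0.5, 5.0, size=N)             # known variance weights
eps = rng.normal(size=N) * np.sqrt(w)         # heteroscedastic errors
y = X @ beta_true + eps

# Pre-wash the data: Py = PX beta + P eps restores spherical errors
P = np.diag(1.0 / np.sqrt(w))
Xs, ys = P @ X, P @ y
beta_gls = np.linalg.solve(Xs.T @ Xs, Xs.T @ ys)
```

Running OLS on the transformed variables (y*, X*) is exactly GLS; replacing the known w with an estimate of it would turn this into FGLS.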