QM 6203: Econometrics I (4/9/2009)
Mohd-Pisal Zainal, Ph.D., Department of Banking, INCEIF
Basic Econometrics, Chapter 7
MULTIPLE REGRESSION ANALYSIS: The Problem of Estimation

7-1. The Three-Variable Model: Notation and Assumptions
• Yi = β1 + β2 X2i + β3 X3i + ui    (7.1.1)
• β2 and β3 are partial regression coefficients
• The model rests on the following assumptions:
  – Zero mean value of ui: E(ui | X2i, X3i) = 0 for all i    (7.1.2)
  – No serial correlation: Cov(ui, uj) = 0 for all i ≠ j    (7.1.3)
  – Homoscedasticity: Var(ui) = σ^2    (7.1.4)
  – Zero covariance between ui and each regressor: Cov(ui, X2i) = Cov(ui, X3i) = 0    (7.1.5)
  – No specification bias, i.e. the model is correctly specified    (7.1.6)
  – No exact collinearity between the X variables    (7.1.7)
• (With more explanatory variables, the last assumption becomes "no perfect multicollinearity": if an exact linear relationship exists among them, the X variables are said to be linearly dependent.)
• The model is linear in the parameters.

7-2. Interpretation of Multiple Regression
• E(Yi | X2i, X3i) = β1 + β2 X2i + β3 X3i    (7.2.1)
• Equation (7.2.1) gives the conditional mean or expected value of Y, conditional upon the given or fixed values of X2 and X3.

7-3. The Meaning of Partial Regression Coefficients
• Yi = β1 + β2 X2i + β3 X3i + … + βs Xsi + ui
• βk measures the change in the mean value of Y per unit change in Xk, holding the remaining explanatory variables constant. It gives the "direct" effect of a unit change in Xk on E(Yi), net of the Xj (j ≠ k).
• How do we control for the "true" effect of a unit change in Xk on Y? (read pages 195-197)

7-4. OLS and ML Estimation of the Partial Regression Coefficients
• This section (pages 197-201) provides:
  1. The OLS estimators in the case of the three-variable regression Yi = β1 + β2 X2i + β3 X3i + ui
  2. Variances and standard errors of the OLS estimators
  3. Eight properties of the OLS estimators (pp. 199-201)
  4. An understanding of the ML estimators

7-5. The Multiple Coefficient of Determination R2 and the Multiple Coefficient of Correlation R
• This section provides:
  1. The definition of R2 in the context of multiple regression, analogous to r2 in the two-variable case
  2. R = ±√R2, the coefficient of multiple correlation, which measures the degree of association between Y and all the explanatory variables jointly
  3. The variance of a partial regression coefficient (see the sketch below):
     Var(β^k) = (σ^2 / Σx^2_k) · 1/(1 − R^2_k)    (7.5.6)
     where β^k is the partial regression coefficient of regressor Xk and R^2_k is the R2 in the regression of Xk on the remaining regressors
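To make Sections 7-1 to 7-5 concrete, here is a minimal numerical sketch on simulated data (not the textbook's examples). It estimates the three-variable model by OLS and checks formula (7.5.6) against the usual σ^2 (X'X)^-1 variance. All variable names and the data-generating values (2.0, 1.5, -0.8) are illustrative assumptions, not from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data for the three-variable model Yi = b1 + b2*X2i + b3*X3i + ui
n = 50
X2 = rng.normal(10.0, 2.0, n)
X3 = 0.5 * X2 + rng.normal(0.0, 1.5, n)      # correlated regressors, but no exact collinearity
u = rng.normal(0.0, 1.0, n)                  # homoscedastic, serially uncorrelated errors
Y = 2.0 + 1.5 * X2 - 0.8 * X3 + u

# OLS: beta_hat = (X'X)^(-1) X'Y
X = np.column_stack([np.ones(n), X2, X3])
k = X.shape[1]                               # number of parameters, including the intercept
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)

resid = Y - X @ beta_hat
sigma2_hat = resid @ resid / (n - k)         # unbiased estimator of sigma^2

# Conventional variance-covariance matrix: sigma^2 * (X'X)^(-1)
var_beta = sigma2_hat * np.linalg.inv(X.T @ X)

# Formula (7.5.6): Var(beta^k) = sigma^2 / sum(x_k^2) * 1/(1 - R^2_k), where x_k are the
# deviations of Xk from its mean and R^2_k comes from regressing Xk on the remaining
# regressors (here: X2 on X3, with an intercept).
xk = X2 - X2.mean()
aux = np.column_stack([np.ones(n), X3])
gamma = np.linalg.solve(aux.T @ aux, aux.T @ X2)
aux_resid = X2 - aux @ gamma
Rk2 = 1.0 - (aux_resid @ aux_resid) / (xk @ xk)
var_b2_formula = sigma2_hat / (xk @ xk) * 1.0 / (1.0 - Rk2)

print("beta_hat:", beta_hat)
print("Var(b2) from sigma^2 (X'X)^-1:", var_beta[1, 1])
print("Var(b2) from (7.5.6):        ", var_b2_formula)
```

The two variance calculations agree, which is the point of (7.5.6): the precision of a partial regression coefficient falls as Xk becomes more collinear with the other regressors (R^2_k closer to 1).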
7-6. Example 7.1: The Expectations-Augmented Phillips Curve for the US (1970-1982)
• This section provides an illustration of the ideas introduced in the chapter
• Regression model (7.6.1)
• The data set is in Table 7.1

7-7. Simple Regression in the Context of Multiple Regression: Introduction to Specification Bias
• This section explains simple regression in the context of multiple regression: fitting a simple regression when the true model is a multiple regression causes specification bias, which is discussed further in Chapter 13.

7-8. R2 and the Adjusted R2
• R2 is a non-decreasing function of the number of explanatory variables: an additional X variable will never decrease R2.
  R2 = ESS/TSS = 1 − RSS/TSS = 1 − Σu^2_i / Σy^2_i    (7.8.1)
• This gives the wrong incentive to add more (possibly irrelevant) variables to the regression, and motivates an adjusted R2 (R2bar) that takes the degrees of freedom into account:
  R2bar = 1 − [Σu^2_i /(n − k)] / [Σy^2_i /(n − 1)], or R2bar = 1 − σ^2 / S^2_Y    (7.8.2)
  where S^2_Y is the sample variance of Y and k is the number of parameters, including the intercept term.
  – Substituting (7.8.1) into (7.8.2) gives
    R2bar = 1 − (1 − R2)(n − 1)/(n − k)    (7.8.4)
  – For k > 1, R2bar < R2; thus as the number of X variables increases, R2bar increases by less than R2, and R2bar can be negative.
• Comparing two R2 values: to compare, the sample size n and the dependent variable must be the same.
• Example 7-2: Coffee Demand Function Revisited (page 210)
• The "game" of maximizing the adjusted R2: choosing the model that gives the highest R2bar may be dangerous, for in regression analysis our objective is not to maximize R2bar but to obtain dependable estimates of the true population regression coefficients and to draw statistical inferences about them.
• We should be more concerned with the logical or theoretical relevance of the explanatory variables to the dependent variable and with their statistical significance.

7-9. Partial Correlation Coefficients
• This section provides:
  1. An explanation of simple and partial correlation coefficients
  2. The interpretation of simple and partial correlation coefficients (pages 211-214)

7-10. Example 7.3: The Cobb-Douglas Production Function: More on Functional Form
• Yi = β1 X2i^β2 X3i^β3 e^ui    (7.10.1)
• Log-transforming this model gives a form that is linear in the parameters (see the sketch after the summary below):
  ln Yi = ln β1 + β2 ln X2i + β3 ln X3i + ui
        = β0 + β2 ln X2i + β3 ln X3i + ui    (7.10.2)
• The data set is in Table 7.3; the results are reported on page 216

7-11. Polynomial Regression Models
• Yi = β0 + β1 Xi + β2 Xi^2 + … + βk Xi^k + ui    (7.11.3)
• Example 7.4: Estimating the Total Cost Function
• The data set is in Table 7.4; the empirical results are on page 221

7-12. Summary and Conclusions (page 221)
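The Cobb-Douglas model (7.10.2) and the polynomial model (7.11.3) are both linear in the parameters after transforming the data, so they are estimated with the same OLS machinery as above. The following is a minimal sketch on synthetic data, not the textbook's Tables 7.3 and 7.4; the small ols helper, the variable names, and all parameter values are illustrative assumptions. It also computes the adjusted R2 via (7.8.4).

```python
import numpy as np

rng = np.random.default_rng(1)

def ols(X, y):
    """Return OLS coefficients and R^2 for y on X (X already includes the constant)."""
    b = np.linalg.solve(X.T @ X, X.T @ y)
    resid = y - X @ b
    r2 = 1.0 - resid @ resid / np.sum((y - y.mean()) ** 2)
    return b, r2

n = 60

# Cobb-Douglas (7.10.1): Y = b1 * X2^b2 * X3^b3 * exp(u); logs give the linear form (7.10.2).
X2 = rng.uniform(50, 150, n)     # e.g. labour input (hypothetical)
X3 = rng.uniform(20, 80, n)      # e.g. capital input (hypothetical)
u = rng.normal(0.0, 0.05, n)
Y = 2.0 * X2**0.6 * X3**0.3 * np.exp(u)

Xlog = np.column_stack([np.ones(n), np.log(X2), np.log(X3)])
b_cd, r2_cd = ols(Xlog, np.log(Y))
print("Cobb-Douglas log-linear fit:", b_cd, "R^2 =", r2_cd)   # b_cd[0] estimates ln(b1)

# Polynomial model (7.11.3): a cubic total cost function. It is still linear in the
# parameters because the regressors are just powers of output.
output = rng.uniform(1, 10, n)
cost = 300 + 60 * output - 10 * output**2 + 1.0 * output**3 + rng.normal(0, 15, n)
Xpoly = np.column_stack([np.ones(n), output, output**2, output**3])
b_poly, r2_poly = ols(Xpoly, cost)

# Adjusted R^2 via (7.8.4): R2bar = 1 - (1 - R^2)(n - 1)/(n - k)
k = Xpoly.shape[1]
r2_bar = 1.0 - (1.0 - r2_poly) * (n - 1) / (n - k)
print("Cubic cost fit:", b_poly, "R^2 =", r2_poly, "adjusted R^2 =", r2_bar)
```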