MTH-416, REGRESSION ANALYSIS Lecture Notes - Lecture 27: Principal Component Regression, Null Hypothesis, Observational Error


Document Summary

The random error component of the transformed model is written as ε* simply to distinguish it from ε. The reduced coefficient vector δ_r contains the coefficients associated with the r retained principal components. Applying OLS to the model with the retained principal components gives the estimator δ̂_r = (Z_r'Z_r)^(-1) Z_r'y, which is then transformed back to the original explanatory variables as β̂_pc = V_r δ̂_r, the principal component regression estimator of β. This method improves efficiency as well as mitigating the effect of multicollinearity.

Ridge regression: the OLSE is the best linear unbiased estimator of the regression coefficient in the sense that it has minimum variance in the class of linear unbiased estimators. However, if the condition of unbiasedness is relaxed, it is possible to find a biased estimator of the regression coefficient, say β̂, that has smaller variance than the unbiased OLSE b. The mean squared error of β̂ is MSE(β̂) = Var(β̂) + [Bias(β̂)]², so the variance can be reduced by introducing a small bias. The ridge regression estimator is obtained by solving the least squares normal equations with a small positive constant k added to the diagonal elements of X'X, giving β̂_ridge = (X'X + kI)^(-1) X'y.
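The sketch below, which is not part of the original notes, illustrates both estimators with plain NumPy under illustrative assumptions: the simulated data, the number of retained components r, the ridge constant k, and all variable names (Xc, V_r, delta_r_hat, beta_pc, beta_ridge) are chosen here for demonstration only.

```python
# Minimal sketch of principal component regression and ridge regression.
# All data and tuning constants below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Simulated design with strong multicollinearity: x3 is nearly a copy of x1.
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = x1 + rng.normal(scale=0.01, size=n)
X = np.column_stack([x1, x2, x3])
y = 2.0 * x1 - 1.0 * x2 + 0.5 * x3 + rng.normal(scale=0.5, size=n)

# Centre the data so no intercept column is needed.
Xc = X - X.mean(axis=0)
yc = y - y.mean()

# ---- Principal component regression ------------------------------------
# Eigen-decomposition of X'X gives the component directions V (Z = X V).
eigvals, V = np.linalg.eigh(Xc.T @ Xc)
order = np.argsort(eigvals)[::-1]        # sort components by eigenvalue, largest first
eigvals, V = eigvals[order], V[:, order]

r = 2                                    # retain the first r components (assumed choice)
V_r = V[:, :r]
Z_r = Xc @ V_r                           # retained principal components

# OLS of y on the retained components gives delta_r_hat;
# beta_pc = V_r @ delta_r_hat transforms it back to the original X scale.
delta_r_hat = np.linalg.solve(Z_r.T @ Z_r, Z_r.T @ yc)
beta_pc = V_r @ delta_r_hat

# ---- Ridge regression ----------------------------------------------------
# beta_ridge = (X'X + k I)^(-1) X'y for a small positive constant k.
k = 1.0                                  # illustrative ridge constant
p = Xc.shape[1]
beta_ridge = np.linalg.solve(Xc.T @ Xc + k * np.eye(p), Xc.T @ yc)

# Ordinary least squares for comparison (unbiased, but unstable here).
beta_ols = np.linalg.solve(Xc.T @ Xc, Xc.T @ yc)

print("OLS   :", beta_ols)
print("PCR   :", beta_pc)
print("Ridge :", beta_ridge)
```

Running the sketch shows the typical trade-off described above: the OLS coefficients on the collinear pair (x1, x3) are erratic, while the principal component and ridge estimates are biased but much more stable.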
