MTH-416, REGRESSION ANALYSIS Lecture Notes - Lecture 7: Simple Linear Regression, Symmetric Matrix, Bias Of An Estimator


Document Summary

The usual linear regression model assumes that the random error components are independently and identically distributed with constant variance. When this assumption is violated, the ordinary least squares estimator (OLSE) of the regression coefficients loses its minimum-variance property within the class of linear unbiased estimators. In such cases, the covariance matrix of the random error components is no longer an identity matrix but can be any positive definite matrix, and under this assumption the OLSE is no longer efficient. The generalized or weighted least squares method is used in such situations to estimate the parameters of the model. In this method, the deviation between the observed and expected values of y_i is multiplied by a weight w_i, where w_i is chosen to be inversely proportional to the variance of y_i. For a simple linear regression model, the weighted least squares function is S(b0, b1) = sum_i w_i (y_i - b0 - b1 x_i)^2.
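The weighted least squares procedure described above can be sketched in a few lines of NumPy. This is an illustrative example, not code from the lecture: the data, true coefficients, and variance structure are all hypothetical, chosen so that the error standard deviation grows with x and the weights w_i = 1/Var(y_i) are known.

```python
import numpy as np

# Hypothetical heteroscedastic data: Var(y_i) grows with x_i,
# so the constant-variance assumption of ordinary least squares fails.
rng = np.random.default_rng(0)
x = np.linspace(1.0, 10.0, 50)
sigma = 0.5 * x                         # error std dev proportional to x (assumed known)
y = 2.0 + 3.0 * x + rng.normal(0.0, sigma)

# Weights inversely proportional to the variance of y_i, as in the notes.
w = 1.0 / sigma**2

# Minimize S(b0, b1) = sum_i w_i (y_i - b0 - b1 x_i)^2
# via the weighted normal equations (X' W X) beta = X' W y.
X = np.column_stack([np.ones_like(x), x])
W = np.diag(w)
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print(beta)   # estimates of (b0, b1), near the true values (2, 3)
```

Down-weighting the noisy high-x observations is exactly what restores efficiency here: OLS would give all 50 points equal say, letting the high-variance points inflate the variance of the estimator.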
