Statistical Sciences 2141A/B Lecture Notes - Lecture 30: Squared Deviations From The Mean, Likelihood Function, Independent And Identically Distributed Random Variables


Document Summary

The regression line is fitted to the data points (x1, y1), (x2, y2), ..., (xn, yn) by finding the line that is closest to those points. This is done by looking at the vertical distances (deviations) between the line and the data points, yi − (β0 + β1·xi), and minimizing Q, the sum of the squares of these vertical deviations. This is referred to as the least squares fit. Since the values y1, y2, ..., yn are observations of independent and identically distributed (iid) normal random variables, minimizing Q is equivalent to maximizing the likelihood function of the yi's. Therefore, the point estimates of β0 and β1 are maximum likelihood estimates. The parameter estimates are found by taking the partial derivatives of Q with respect to β0 and β1 and setting the resulting equations equal to 0. The parameter estimates are therefore the solutions to the normal equations. Solving the normal equations gives β̂1 = Σ(xi − x̄)(yi − ȳ) / Σ(xi − x̄)² and β̂0 = ȳ − β̂1·x̄.
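The least squares procedure described above can be sketched in a few lines of Python. The data values here are made up for illustration; the estimates follow the standard closed-form solutions of the normal equations.

```python
# Least squares fit via the normal equations (sketch with illustrative data).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(x)
x_bar = sum(x) / n
y_bar = sum(y) / n

# Sums of cross-deviations and squared deviations from the mean.
s_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
s_xx = sum((xi - x_bar) ** 2 for xi in x)

beta1_hat = s_xy / s_xx                 # slope estimate
beta0_hat = y_bar - beta1_hat * x_bar   # intercept estimate

# Q, the sum of squared vertical deviations, is minimized at these estimates.
q = sum((yi - (beta0_hat + beta1_hat * xi)) ** 2 for xi, yi in zip(x, y))
```

Any other line through the data would give a larger value of Q than the one computed at (β̂0, β̂1), which is exactly what "least squares" means.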
