STAT 2040 Lecture Notes - Lecture 14: Confidence Interval, Statistical Model, Variance
Document Summary
In simple linear regression there is a single explanatory variable x. In multiple linear regression there is more than one explanatory variable (x1, x2, ...).

In the simple linear regression model, we assume a linear relationship between y and x: μ_{y|x} = β0 + β1·x, where μ_{y|x} = E(y | x) represents the true mean of y for a given value of x. The observed values of y will vary about the line.

In order to carry out valid statistical procedures in regression, we need to make a few assumptions. First, the observations are assumed to be independent. In addition, the error term e is assumed to be a random variable that has a mean of 0 and has the same variance at every value of x.

To test for a linear relationship between x and y, we test H0: β1 = 0 against one of three alternative hypotheses: Ha: β1 ≠ 0, Ha: β1 < 0, or Ha: β1 > 0.
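As a minimal sketch of the ideas above, the snippet below simulates data from the model μ_{y|x} = β0 + β1·x with independent errors e of mean 0 and constant variance (the data, seed, and true parameter values are invented for illustration), fits a simple linear regression, and reports the two-sided p-value for testing H0: β1 = 0 against Ha: β1 ≠ 0.

```python
import numpy as np
from scipy import stats

# Simulated data (illustrative only): true beta0 = 2.0, true beta1 = 0.5
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
e = rng.normal(loc=0.0, scale=1.0, size=x.size)  # errors: mean 0, same variance at every x
y = 2.0 + 0.5 * x + e

# Fit the least-squares line and test H0: beta1 = 0 vs Ha: beta1 != 0
res = stats.linregress(x, y)
print(f"estimated intercept b0 = {res.intercept:.3f}")
print(f"estimated slope     b1 = {res.slope:.3f}")
print(f"two-sided p-value for H0: beta1 = 0 is {res.pvalue:.2e}")
```

A small p-value here is evidence of a linear relationship between x and y; the one-sided alternatives Ha: β1 < 0 or Ha: β1 > 0 would halve this p-value when the estimated slope points in the hypothesized direction.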