STAT 4444 Lecture Notes - Lecture 17: Simple Linear Regression, Likelihood Function, Posterior Probability


Document Summary

Lecture 17: Linear regression with the Bayesian approach. Class business: Homework IV is due on April 17 at 11:59 p.m.

The model assumes Var[y_i] = \sigma^2, so the data distribution can be written as y_i \mid x_i, \beta_0, \beta_1, \sigma^2 \sim N(\beta_0 + \beta_1(x_i - \bar{x}), \sigma^2), for i = 1, 2, \ldots, n. From this, the likelihood function for \beta_0, \beta_1, \sigma^2 \mid y_i, x_i follows. The parameter of greatest interest in Bayesian simple linear regression is usually the slope \beta_1. Thus, we need to find the joint posterior density of all three regression parameters before integrating out \beta_0 and \sigma^2 to get the marginal posterior density of \beta_1. The likelihood of y_1, y_2, \ldots, y_n is the product of the individual likelihoods p(y_i \mid x_i, \beta_0, \beta_1, \sigma^2):

p(y \mid \beta_0, \beta_1, \sigma^2) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\left(-\frac{(y_i - \beta_0 - \beta_1(x_i - \bar{x}))^2}{2\sigma^2}\right) \propto (\sigma^2)^{-n/2} \exp\!\left(-\frac{\sum_{i=1}^{n}(y_i - \beta_0 - \beta_1(x_i - \bar{x}))^2}{2\sigma^2}\right).
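
As a rough illustration (not part of the original notes), here is a minimal Python sketch of this likelihood on the log scale, assuming hypothetical data arrays x and y and trial values for \beta_0, \beta_1, \sigma^2:

```python
import numpy as np

def log_likelihood(y, x, beta0, beta1, sigma2):
    """Log-likelihood of the centered simple linear regression model:
    y_i | x_i ~ N(beta0 + beta1 * (x_i - x_bar), sigma2)."""
    n = len(y)
    # Residuals: y_i - beta0 - beta1 * (x_i - x_bar)
    resid = y - (beta0 + beta1 * (x - x.mean()))
    # Sum of log normal densities = -(n/2) log(2*pi*sigma2) - SSR / (2*sigma2)
    return -0.5 * n * np.log(2 * np.pi * sigma2) - np.sum(resid**2) / (2 * sigma2)

# Hypothetical example: simulated data and trial parameter values
rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = 1.0 + 2.0 * (x - x.mean()) + rng.normal(scale=0.5, size=50)
print(log_likelihood(y, x, beta0=1.0, beta1=2.0, sigma2=0.25))
```

In a Bayesian analysis this (log) likelihood would be combined with a prior on (\beta_0, \beta_1, \sigma^2) to form the joint posterior, from which \beta_0 and \sigma^2 are integrated out to obtain the marginal posterior of \beta_1.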
