COMPSCI 189 Study Guide - Final Guide: Anatoly Efros, Decision Tree Learning, Logistic Regression

18 pages
8 Jan 2019
Document Summary

If you are not sure of your answer, you may wish to provide a brief explanation. Do not attach any extra sheets; the total number of points is 150. No partial credit: the set of all correct answers must be checked. First and last name of the student to your left. First and last name of the student to your right.

One question fragment asks which options can reduce the L1-penalized residual sum of squares.

In this problem, we want to investigate whether, given enough training examples, the Bayes decision rule gives more accurate results than nearest neighbors. A life insurance company needs to estimate whether a client is at risk of dying in the year to come, based on their age and blood pressure. We call x = [a, b] (a = age, b = blood pressure) the two-dimensional input vector and y the outcome (y = 1 if the client dies and y = -1 otherwise).
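The last problem in the summary contrasts the Bayes decision rule with nearest neighbors on the insurance example. A minimal sketch of that comparison on synthetic data follows; the generating rule, feature ranges, and sample sizes are all assumptions for illustration, not taken from the exam:

```python
import math
import random

random.seed(0)

def sample_point():
    """Draw one synthetic client: x = (age, blood pressure), y in {+1, -1}.
    The 'true' risk rule below is an assumed stand-in for illustration."""
    age = random.uniform(20, 90)
    bp = random.uniform(90, 180)
    y = 1 if age + 0.5 * bp > 130 else -1
    return (age, bp), y

train = [sample_point() for _ in range(200)]
test = [sample_point() for _ in range(100)]

def nn_predict(x, data):
    """1-nearest-neighbor prediction under Euclidean distance."""
    _, label = min(data, key=lambda p: math.dist(x, p[0]))
    return label

def bayes_predict(x):
    """With deterministic labels, the Bayes rule is the generating rule itself."""
    age, bp = x
    return 1 if age + 0.5 * bp > 130 else -1

nn_acc = sum(nn_predict(x, train) == y for x, y in test) / len(test)
bayes_acc = sum(bayes_predict(x) == y for x, y in test) / len(test)
print(f"1-NN accuracy:  {nn_acc:.2f}")
print(f"Bayes accuracy: {bayes_acc:.2f}")
```

Because the labels here are a deterministic function of x, the Bayes rule is exact, while 1-NN only approaches it as the training set grows; this mirrors the question's point about having "enough training examples".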