ITM 760 Lecture Notes - Lecture 8: Feature Vector, Binary Classification, Discriminant


Document Summary

ITM760, Lecture #8: Large Scale Machine Learning II. Input: vectors x and labels y; the vectors x are real-valued. Perceptron [Rosenblatt '58] (artificial neurons): f(x) takes the feature values and calculates the weighted sum, i.e. the sum over features of each feature's weight times that feature's value (so a 5-dimensional input means 5 weights). (Very) loose motivation: the neuron. If f(x) is positive, predict +1; if negative, predict -1. Spam example: binary or real-valued feature vector x of word occurrences, with d features (words plus other things, d ≈ 100,000); class y: spam (+1), not spam (-1). Training a perceptron with zero threshold (estimating w): start with w(0) = 0, which makes the initial weighted sum zero. If the prediction is correct (i.e., the predicted label equals y(t)), make no change: w(t+1) = w(t).
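The training loop sketched in the summary can be written as a short program. This is a minimal sketch on made-up toy data; note that the summary above only states the "no mistake" case, so the mistake update w(t+1) = w(t) + y·x used below is the standard perceptron rule, not something taken verbatim from these notes.

```python
# Minimal perceptron with zero threshold (sketch; toy data below is made up).

def predict(w, x):
    # f(x) = weighted sum of the features: sum_i w_i * x_i
    s = sum(wi * xi for wi, xi in zip(w, x))
    # positive -> predict +1, otherwise predict -1 (tie at 0 broken toward -1)
    return 1 if s > 0 else -1

def train(examples, d, epochs=10):
    w = [0.0] * d                       # start with w(0) = 0
    for _ in range(epochs):
        for x, y in examples:
            if predict(w, x) != y:      # mistake: standard update w <- w + y*x
                w = [wi + y * xi for wi, xi in zip(w, x)]
            # correct prediction: no change, w(t+1) = w(t)
    return w

# Toy "spam" data: 3 binary word-occurrence features; +1 = spam, -1 = not spam
data = [([1, 1, 0], 1), ([1, 0, 0], 1), ([0, 1, 1], -1), ([0, 0, 1], -1)]
w = train(data, d=3)
```

After training on this linearly separable toy set, the learned w classifies all four examples correctly; for real spam filtering d would be on the order of 100,000, as the notes say.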
