PSYC 532 Lecture Notes - Lecture 10: Database, Joint Probability Distribution, Crossmodal


Document Summary

Psyc 532, Lecture 22, Deep Learning: An Overview. Deep learning uses multiple layers of representation and abstraction (hence a hierarchical construct). "Information" at higher layers is constructed by "combining" the "information" at lower layers; the last neural net Google made had 150 layers. "Deep" refers to building a hierarchical structure, adding layers on top of each other to learn more and more abstract representations. Each layer comes up with new features to better do the task (e.g., classification). Training requires a training set, plus a test set to verify that the model can predict interesting things about held-out input data. Lately, two major things have happened that are both necessary and complementary: advances in computational statistics for learning in high-dimensional spaces (understanding images with many pixels, or text with many words) demand a lot of input and training data, and there has been a massive increase in available data from Google, Facebook, Instagram, Twitter, etc.
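The layering idea above can be sketched in a few lines: each layer transforms the previous layer's output, so higher layers compute features of features. This is a minimal illustrative forward pass only; the weights below are arbitrary made-up values, not a trained network or any specific model from the lecture.

```python
# Minimal sketch of a deep (multi-layer) network forward pass.
# Each layer's output becomes the next layer's input, which is
# how higher layers build more abstract representations.
# Weights and biases are arbitrary illustrative values.

def relu(v):
    """Elementwise rectified linear activation."""
    return [max(0.0, x) for x in v]

def layer(inputs, weights, biases):
    # One fully connected layer: out_j = relu(sum_i w[j][i]*x[i] + b[j])
    return relu([
        sum(w * x for w, x in zip(row, inputs)) + b
        for row, b in zip(weights, biases)
    ])

def forward(x, layers):
    # Stack layers: feed each layer's output into the next one.
    for weights, biases in layers:
        x = layer(x, weights, biases)
    return x

# A toy "deep" network with two layers of two units each.
net = [
    ([[1.0, -1.0], [0.5, 0.5]], [0.0, 0.0]),   # layer 1
    ([[1.0, 1.0], [-1.0, 2.0]], [0.1, 0.0]),   # layer 2
]
print(forward([2.0, 1.0], net))
```

Adding more `(weights, biases)` pairs to `net` deepens the hierarchy without changing any other code, which is the structural point the notes make about stacking layers.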
