COGS 100 Lecture Notes - Lecture 17: Dementia, Parallel Computing, Recurrent Neural Network


Document Summary

Weights (connections) are what change over time: the network learns and adjusts them so that it produces the output you want it to produce. It has no ready-made, semantically interpretable representations. It serves as a model of conceptual hierarchy, concept learning, and semantic memory: what things are like and what they are related to. For example, it first differentiates plants from animals, then fish from birds, then robin from canary (general to more specific). Activity is in the nodes, but the nodes themselves do not do very much: learning is in the weights and representation is in the weights, building a model of a particular chunk of the world (sense-think-act; mainly a thinking model). Processing is parallel, not serial: processing and thinking flow layer to layer, many units at a time, without semantically interpretable representations. Earlier in learning the network shows one pattern of activity; later in learning it shows a different pattern. By contrast, SHRDLU's representations are stable and always represent the same things: language-like rules, e.g. in a symbolic system, how to move an object, making sub-goals, tracking status. With the network you need to ask (probe) the model what it knows; you cannot just read it off.
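
To make the contrast concrete, here is a minimal sketch (not from the lecture; the concept names, property names, and network sizes are illustrative assumptions) of a tiny two-layer network trained by gradient descent. The only things that change during learning are the weight matrices W1 and W2, all concepts are processed in parallel on each pass, and to find out what the trained network represents you have to probe it with an input rather than read off a rule.

# Minimal sketch of "learning is in the weights" (assumed example, not the
# lecture's own model). Concepts and properties below are illustrative.
import numpy as np

rng = np.random.default_rng(0)

concepts = ["robin", "canary", "salmon", "rose"]            # one-hot inputs
properties = ["is_animal", "is_plant", "can_fly", "can_swim"]

# Target property pattern for each concept (rows align with `concepts`).
targets = np.array([
    [1, 0, 1, 0],   # robin:  animal, can fly
    [1, 0, 1, 0],   # canary: animal, can fly
    [1, 0, 0, 1],   # salmon: animal, can swim
    [0, 1, 0, 0],   # rose:   plant
], dtype=float)

inputs = np.eye(len(concepts))                              # one-hot codes

# The weights (connections) are the only things that change during learning.
W1 = rng.normal(scale=0.1, size=(4, 8))                     # input -> hidden
W2 = rng.normal(scale=0.1, size=(8, 4))                     # hidden -> output

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(x):
    h = sigmoid(x @ W1)            # activity lives in the nodes...
    return h, sigmoid(h @ W2)

lr = 1.0
for epoch in range(3000):
    h, out = forward(inputs)       # all concepts processed in parallel
    err = out - targets
    # Backpropagate the error and nudge the weights a little each pass.
    d_out = err * out * (1 - out)
    d_hid = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out / len(inputs)
    W1 -= lr * inputs.T @ d_hid / len(inputs)

# What the network "knows" is distributed across W1 and W2, not stored as
# rules, so you have to probe it with an input to see what it represents.
_, probe = forward(np.eye(4)[0])   # ask about "robin"
print(dict(zip(properties, probe.round(2))))

Early in training the probe gives a vague, undifferentiated answer; after training it settles on the robin-specific pattern, which is the general-to-specific differentiation the summary describes.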
