PSYC2013 Lecture Notes - Lecture 5: Connectionism, Backpropagation, Supervised Learning


Document Summary

Computational models: computer programs inspired by the neural metaphor. They consist of a set of interconnected processing nodes (analogous to neurones) that "communicate" by sending activation or inhibition, together with a learning rule for adjusting the connections throughout the network. Example: the backpropagation learning rule, a mechanism that allows a network to learn to associate a given input pattern with a given output pattern by comparing actual responses against correct ones. During the early stages of learning, the output units often produce an incorrect pattern or response following presentation of the input pattern. Backprop compares this imperfect pattern with the known required response, noting the errors. It then back-propagates the error signal through the network so that the weights between the units are adjusted to produce the required pattern. This process is repeated until the network produces the required pattern; the model thus learns the appropriate behaviour without being explicitly programmed to do so. Supervised learning: adjusting connection weights to reduce error.
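The loop described above (compare output with target, note the error, back-propagate it, adjust weights, repeat) can be sketched in a few lines of code. This is a minimal illustrative example, not from the lecture: a tiny 2-2-1 network with sigmoid units learns to associate one input pattern with one target output. All names and sizes are assumptions chosen for clarity.

```python
import random
from math import exp

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + exp(-x))

# Connection weights: input->hidden (2x2) and hidden->output (2),
# initialised to small random values.
w_ih = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
w_ho = [random.uniform(-1, 1) for _ in range(2)]

x = [1.0, 0.0]   # a given input pattern
t = 1.0          # the required (correct) output

def forward(x):
    # Each unit sums weighted activation from the layer below.
    h = [sigmoid(sum(w_ih[j][i] * x[i] for i in range(2))) for j in range(2)]
    y = sigmoid(sum(w_ho[j] * h[j] for j in range(2)))
    return h, y

lr = 0.5
_, y = forward(x)
err_before = (t - y) ** 2  # error of the early, imperfect response

for _ in range(200):
    h, y = forward(x)
    # Compare actual response with the required one (output error),
    # scaled by the sigmoid derivative y * (1 - y).
    delta_o = (t - y) * y * (1 - y)
    # Back-propagate the error signal to the hidden units.
    delta_h = [delta_o * w_ho[j] * h[j] * (1 - h[j]) for j in range(2)]
    # Adjust connection weights to reduce the error (supervised learning).
    for j in range(2):
        w_ho[j] += lr * delta_o * h[j]
        for i in range(2):
            w_ih[j][i] += lr * delta_h[j] * x[i]

_, y = forward(x)
err_after = (t - y) ** 2
print(err_after < err_before)  # the repeated adjustments shrink the error
```

With each repetition the output moves closer to the required pattern, so the squared error after training is smaller than before, without the correct weights ever being programmed in explicitly.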
