PSYC 532 Lecture Notes - Lecture 5: Network Topology, Composite Number, Prime Number


Document Summary

Psyc 532, Lecture 13: Constructive Learning.

1. Back propagation is too slow: it needs hundreds or thousands of epochs (full passes through the set of examples) to learn.
   a. Step-size problem: a better algorithm would take large, decisive steps rather than constantly oscillating with relatively small ones. As curvature gets large (near the minimum of the error parabola), smaller steps should be taken. This is inspired by the Taylor series, where the more derivatives you know, the closer your approximation is to the function.
   b. Herd effect: each hidden unit tries to become a feature detector that contributes to the solution, but all units are changing at once, so units must decide which subtask to solve. If subtask A generates a larger or more coherent error signal than subtask B, the units converge on A and ignore B; once A is solved, they move toward B, and A's error reappears. The units converge on reducing A's error signal (adjusting their weights to reduce it), but this can make B the new biggest source of error.
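The step-size point above can be sketched with a toy example (all numbers here are hypothetical illustrations, not from the lecture): on a 1-D quadratic error surface, a fixed learning rate overshoots and oscillates when curvature is high, while a curvature-scaled (Newton-style) step, which uses the second derivative as a Taylor expansion would, reaches the minimum in a single decisive move.

```python
# Toy 1-D error surface E(w) = 0.5 * c * w**2, where c is the curvature.
# (Hypothetical example; c and lr are made-up values.)
def grad(w, c):
    return c * w      # first derivative E'(w)

def curvature(c):
    return c          # second derivative E''(w), constant for a parabola

w_plain = 5.0         # plain gradient descent with a fixed learning rate
w_newton = 5.0        # curvature-scaled (Newton-style) step
lr = 0.9
c = 3.0               # high curvature: lr * c > 1, so fixed steps overshoot

for _ in range(20):
    # Fixed step overshoots the minimum and oscillates with growing amplitude.
    w_plain -= lr * grad(w_plain, c)
    # Dividing by the curvature scales the step to land exactly at the minimum.
    w_newton -= grad(w_newton, c) / curvature(c)

print(w_plain, w_newton)
```

The design point is the same as the lecture's: the more you know about the local shape of the error surface (higher derivatives), the better you can size the step, instead of taking many small oscillating ones.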
