CS486 Lecture 13: Back Propagation


Document Summary

Learning neural networks (11/26/18): a perceptron applies a non-linear activation function f(sum_i w_i x_i) to input values x0, x1, ..., xn to produce an output value y; each input has a weight that models its strength. The step function f(x) = 1 if x > 0, else 0, is simple but non-differentiable. A feed-forward network has one input layer, at least one hidden layer, and one output layer; the outputs of a unit cannot influence its own inputs (no loops), and each unit uses the sigmoid threshold function. XOR (true iff the two inputs differ) is the simplest function that requires a hidden layer. Using the step function as the activation, h1 = f(x1 + x2 - 0.5) computes x1 OR x2, and h2 = f(-x1 - x2 + 1.5) computes NOT (x1 AND x2). In the training algorithm, if the stopping criterion is not met, go back to step 2 and repeat with another training example.
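The XOR construction above can be sketched directly in code. The hidden-unit weights (h1 and h2) are the ones stated in the notes; the output unit combining them as h1 AND h2 (threshold 1.5) is an assumption filled in to complete the standard construction, since the summary does not state the output weights.

```python
def step(x):
    # Step activation from the notes: 1 if x > 0, else 0 (non-differentiable).
    return 1 if x > 0 else 0

def xor(x1, x2):
    h1 = step(x1 + x2 - 0.5)    # computes x1 OR x2
    h2 = step(-x1 - x2 + 1.5)   # computes NOT (x1 AND x2)
    # Output unit (assumed weights): h1 AND h2, which equals XOR.
    return step(h1 + h2 - 1.5)
```

Checking all four input pairs confirms the hidden layer is what makes XOR expressible: `xor(0, 0)` and `xor(1, 1)` return 0, while `xor(0, 1)` and `xor(1, 0)` return 1.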
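The "repeat with another training example" step describes stochastic gradient descent. As a minimal sketch (not the lecture's exact algorithm), here is a single sigmoid unit trained on one example at a time with a squared-error gradient update; the learning rate, iteration count, and squared-error loss are assumptions.

```python
import math
import random

def sigmoid(z):
    # Differentiable sigmoid threshold function used in place of the step.
    return 1.0 / (1.0 + math.exp(-z))

def train(data, lr=1.0, iterations=10000):
    # data: list of ((x1, x2), target) pairs.
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(iterations):          # stopping criterion: fixed iteration budget
        x, t = random.choice(data)       # step 2: pick another training example
        y = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
        # For squared error E = (y - t)^2 / 2, dE/dz = (y - t) * y * (1 - y).
        delta = (y - t) * y * (1 - y)
        w[0] -= lr * delta * x[0]
        w[1] -= lr * delta * x[1]
        b -= lr * delta
    return w, b
```

A single unit like this can learn linearly separable functions such as OR, but not XOR, which is why the notes introduce the hidden layer.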

