COGS 2160 Lecture Notes - Lecture 11: Artificial Neural Network, Hebbian Theory, Unsupervised Learning
Document Summary
Hebbian learning: neurons that fire together wire together; neurons that are out of synch fail to link. Simple formal expression: w12 = n x a1 x a2, where a1 = activation level of the first node, a2 = activation level of the second node, and n is a learning-rate constant. Hebbian learning also features in more complex learning algorithms, e.g. competitive learning.

Feedforward activation: each unit computes its output activation, which is fed through and transmits the activity level to the units in the next layer.

The delta rule: if delta is positive, then the intended output was bigger than the actual output: the network has undershot. So the weights need to be increased and the threshold needs to be decreased.

Limits of the perceptron: it is not capable of computing all Boolean functions; there are some that it cannot compute. XOR is not linearly separable. To output 1 when only i2 = 1, we need 1 x w2 > t; likewise, to output 1 when only i1 = 1, we need 1 x w1 > t; but to output 0 when i1 = i2 = 1, we need w1 + w2 <= t. Since outputting 0 for (0, 0) requires t >= 0, the first two inequalities give w1 + w2 > t, a contradiction.

The basic problem: multilayer networks can be constructed to compute any Turing-computable function, but they cannot be trained using the perceptron convergence rule, because the rule needs an error signal at each unit and hidden units have no direct error signal.
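The Hebbian update described above can be sketched in a few lines of Python (a minimal illustration; the function name, learning rate, and activation values are hypothetical, not from the lecture):

```python
def hebbian_update(w12, a1, a2, eta=0.1):
    """Return the updated weight between two connected nodes.

    The weight change is proportional to the product of the two
    activation levels (w12 = eta x a1 x a2): co-active nodes strengthen
    their link, while a zero activation on either side leaves the
    weight unchanged -- nodes out of synch fail to link.
    """
    return w12 + eta * a1 * a2

w = 0.0
w = hebbian_update(w, a1=1.0, a2=1.0)   # both active: weight grows
w = hebbian_update(w, a1=1.0, a2=0.0)   # out of synch: weight unchanged
print(w)
```

Note that the rule is unsupervised: the update uses only the two activation levels, with no target output anywhere.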
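The delta-rule behaviour (undershoot raises weights and lowers the threshold) and the XOR limitation can both be seen in a small sketch of a single threshold unit (the training loop, learning rate, and helper names are assumptions for illustration, not the lecture's own code):

```python
def train_perceptron(samples, epochs=25, eta=0.1):
    """Train one threshold unit with the perceptron convergence rule."""
    w = [0.0, 0.0]
    t = 0.0
    for _ in range(epochs):
        for (i1, i2), target in samples:
            actual = 1 if i1 * w[0] + i2 * w[1] > t else 0
            delta = target - actual          # positive delta: undershot
            w[0] += eta * delta * i1         # raise weights on undershoot
            w[1] += eta * delta * i2
            t -= eta * delta                 # lower threshold on undershoot
    return w, t

def accuracy(samples, w, t):
    return sum(
        (1 if i1 * w[0] + i2 * w[1] > t else 0) == target
        for (i1, i2), target in samples
    ) / len(samples)

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

w, t = train_perceptron(AND)
print(accuracy(AND, w, t))   # linearly separable: learned perfectly
w, t = train_perceptron(XOR)
print(accuracy(XOR, w, t))   # XOR: never reaches 1.0, as argued above
```

AND is linearly separable, so the convergence rule settles on correct weights; no weight/threshold setting can classify all four XOR cases, so training never succeeds there no matter how long it runs.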