Backprop
ThoughtStorms Wiki
In multi-layer NeuralNetworks (aka multi-layer perceptrons) there are "hidden layers", i.e. those between the input and output layers.
https://en.wikipedia.org/wiki/Backpropagation
In order to figure out how much each hidden weight contributes to the final error, you need to "back-propagate" the error from the output layer through the preceding layers. That tells you how to update the weights in the hidden layers.
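The idea can be sketched in plain numpy. Everything here is an illustrative assumption, not from the wiki page: a tiny two-layer sigmoid network trained on XOR, with the output-layer error delta multiplied back through the hidden-to-output weights to get the hidden-layer deltas.

```python
import numpy as np

# Illustrative sketch (not from the wiki page): a 2-input, 4-hidden,
# 1-output sigmoid network learning XOR by back-propagation.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4))   # input -> hidden weights
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1))   # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    h = sigmoid(X @ W1 + b1)       # hidden-layer activations
    out = sigmoid(h @ W2 + b2)     # output-layer activations
    return h, out

initial_loss = float(np.mean((forward(X)[1] - y) ** 2))

lr = 0.5
for epoch in range(10000):
    h, out = forward(X)

    # Output-layer delta: error times sigmoid derivative.
    delta_out = (out - y) * out * (1 - out)
    # Back-propagate: the hidden delta is the output delta pushed
    # backwards through W2, times the hidden sigmoid derivative.
    delta_hid = (delta_out @ W2.T) * h * (1 - h)

    # Gradient-descent updates for both layers.
    W2 -= lr * h.T @ delta_out
    b2 -= lr * delta_out.sum(axis=0)
    W1 -= lr * X.T @ delta_hid
    b1 -= lr * delta_hid.sum(axis=0)

final_loss = float(np.mean((forward(X)[1] - y) ** 2))
```

Without the back-propagation step there would be no gradient signal for W1 and b1; only the output layer could learn.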
GeoffreyHinton was one of the main researchers who developed this.