How Backpropagation Works (Cont.)
Step 3. Calculating the Updated Weight Value
Now, let's put all the values together and calculate the updated value of W5 using the gradient-descent update rule:
W5(new) = W5 − η · ∂E_total/∂W5
where η is the learning rate, which we'll set to 0.5.
- Similarly, we can calculate the updated values of the other weights.
- After that, we propagate forward again, calculate the output, and recompute the error.
- If the error is small enough, we stop there; otherwise, we propagate backward again and update the weight values.
- This process keeps repeating until the error reaches a minimum.
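The update step for a single weight can be sketched in a few lines of Python. The concrete values for W5 and its gradient below are placeholders for illustration; the actual numbers depend on the network and error values computed in the earlier steps.

```python
# Gradient-descent update for one weight: w_new = w - eta * dE/dw
eta = 0.5        # learning rate, as chosen in the text
w5 = 0.40        # current value of W5 (illustrative placeholder)
dE_dw5 = 0.08    # partial derivative of total error w.r.t. W5 (placeholder)

w5_updated = w5 - eta * dE_dw5
print(w5_updated)  # 0.36
```

The same one-line update is applied to every weight in the network, each with its own partial derivative.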
Conclusion
The pseudocode for backpropagation is as follows:
Backpropagation Algorithm
initialize network weights (often small random values)
do {
    foreach training example named ex {
        // forward pass
        prediction = neural-net-output( network, ex )
        actual = teacher-output( ex )
        compute error ( prediction - actual ) at the output units
        // backward pass
        for all weights from hidden layer to output layer
            compute Δw_h
        // backward pass continued
        for all weights from input layer to hidden layer
            compute Δw_i
        // input layer not modified by error estimate
        update network weights
    }
} until all examples classified correctly or another stopping criterion satisfied
return the network
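The pseudocode above can be turned into a minimal runnable sketch. The 2-2-1 architecture, sigmoid activations, single training example, and initial weight ranges below are assumptions chosen purely for illustration, not details from the text.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Tiny 2-2-1 network (architecture assumed for illustration).
random.seed(0)
w_ih = [[random.uniform(-0.5, 0.5) for _ in range(2)] for _ in range(2)]  # input -> hidden
w_ho = [random.uniform(-0.5, 0.5) for _ in range(2)]                      # hidden -> output
eta = 0.5

# One training example (inputs, teacher output) -- values assumed.
examples = [([0.05, 0.10], 0.01)]

for epoch in range(10000):
    total_error = 0.0
    for x, target in examples:
        # forward pass
        h = [sigmoid(sum(w_ih[j][i] * x[i] for i in range(2))) for j in range(2)]
        o = sigmoid(sum(w_ho[j] * h[j] for j in range(2)))
        total_error += 0.5 * (target - o) ** 2

        # backward pass: delta at the output, then hidden->output gradients (Δw_h)
        delta_o = (o - target) * o * (1 - o)
        grad_ho = [delta_o * h[j] for j in range(2)]

        # backward pass continued: input->hidden gradients (Δw_i)
        delta_h = [delta_o * w_ho[j] * h[j] * (1 - h[j]) for j in range(2)]
        grad_ih = [[delta_h[j] * x[i] for i in range(2)] for j in range(2)]

        # update network weights
        for j in range(2):
            w_ho[j] -= eta * grad_ho[j]
            for i in range(2):
                w_ih[j][i] -= eta * grad_ih[j][i]

    if total_error < 1e-5:  # stopping criterion
        break

print(epoch, total_error)
```

Running this, the error shrinks over the epochs exactly as the bullet points describe: forward pass, error, backward pass, weight update, repeated until a stopping criterion is met.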