1. What is the fundamental equation that guides changes to a weight wij in a BP network? Describe its components.
Backpropagation:
It is a supervised learning algorithm for artificial neural networks based on gradient descent: it computes the gradient of the error function with respect to each of the network's weights and uses that gradient to update them.
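The fundamental equation the question asks about is the generalized delta rule: each weight is changed in proportion to the negative gradient of the error,

```latex
\Delta w_{ij} \;=\; -\eta \,\frac{\partial E}{\partial w_{ij}} \;=\; \eta\, \delta_j\, o_i
```

Its components: $\eta$ is the learning rate (step size); $o_i$ is the output of unit $i$ on the input side of the connection; and $\delta_j$ is the error signal of unit $j$. For an output unit, $\delta_j = (t_j - o_j)\,f'(\mathrm{net}_j)$, where $t_j$ is the desired (target) output and $f$ is the activation function; for a hidden unit, $\delta_j = f'(\mathrm{net}_j)\sum_k \delta_k w_{jk}$, i.e. the error is propagated backward from the units $k$ in the next layer.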
Five steps of the backpropagation learning algorithm:
1. Initialize weights with random values and set other parameters.
2. Read the input vector and desired output.
3. Compute the actual output via the calculations, working forward through the layers.
4. Compute the error between the actual and desired output.
5. Adjust the weights by propagating the error backward through the layers, from output toward input, and repeat from step 2 until the error is acceptably small.
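The steps above can be sketched in a minimal NumPy implementation. This is a hypothetical toy example (the network size, learning rate, and XOR task are my choices, not from the answer): a two-layer sigmoid network trained with the delta rule Δw_ij = η·δ_j·o_i.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Step 1: initialize weights with random values and set the learning rate.
W1 = rng.normal(scale=0.5, size=(2, 4))  # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(4, 1))  # hidden -> output weights
eta = 0.5                                # learning rate

# Step 2: read the input vectors and desired outputs (XOR as a toy task).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

mse_start = float(np.mean((T - sigmoid(sigmoid(X @ W1) @ W2)) ** 2))

for epoch in range(5000):
    # Step 3: compute the actual output, working forward through the layers.
    h = sigmoid(X @ W1)      # hidden-layer outputs
    y = sigmoid(h @ W2)      # network output
    # Step 4: compute the error at the output.
    err = T - y
    # Step 5: propagate the error backward and update each weight
    # by Delta w_ij = eta * delta_j * o_i.
    delta_out = err * y * (1 - y)
    delta_hid = (delta_out @ W2.T) * h * (1 - h)
    W2 += eta * h.T @ delta_out
    W1 += eta * X.T @ delta_hid

mse_end = float(np.mean((T - y) ** 2))
```

After training, the mean squared error has dropped well below its starting value, which is the stopping criterion described in step 5.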