Backpropagation Algorithm
Take a look at the mathematics of the backpropagation algorithm.
Neural networks (NNs) are non-linear classifiers that can be formulated as a series of matrix multiplications, each followed by a non-linear activation. Just like linear classifiers, they can be trained using the same principles we followed before, namely the gradient descent algorithm. The difficulty arises in computing the gradients.
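To make the "series of matrix multiplications" concrete, here is a minimal sketch of a two-layer forward pass in NumPy. The shapes and names (W1, b1, and so on) are illustrative assumptions, not taken from this lesson:

```python
import numpy as np

def sigmoid(z):
    # element-wise non-linear activation
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 1))                         # input vector
W1, b1 = rng.normal(size=(3, 4)), np.zeros((3, 1))  # layer-1 parameters
W2, b2 = rng.normal(size=(1, 3)), np.zeros((1, 1))  # layer-2 parameters

a1 = sigmoid(W1 @ x + b1)   # first matrix multiplication + activation
a2 = sigmoid(W2 @ a1 + b2)  # second matrix multiplication + activation
```

Without the activation function, the two matrix multiplications would collapse into a single linear map, which is why the non-linearity is essential.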
But first things first.
Let’s start with a straightforward example of a two-layered NN, with each layer containing just one neuron.
Notations
- The superscript $L$ denotes the layer that we are in.
- $a^{L}$ denotes the activation of layer $L$.
- $w^{L}$ is a scalar weight of layer $L$.
- $b^{L}$ is the bias term of layer $L$.
- $C$ is the cost function, $y$ is our target class, and $\sigma$ is the activation function.
Forward pass
Our lovely model would look something like this in a simple sketch:

[Sketch: the input $a^{0}$ feeds the single neuron of layer 1, producing $a^{1}$; that feeds the single neuron of layer 2, producing $a^{2}$, which the cost $C$ compares against the target $y$.]
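Written out with the notation above, this model is just two applications of a weight, a bias, and the activation function (a standard formulation; the squared-error cost is an assumption here):

$$
z^{1} = w^{1} a^{0} + b^{1}, \qquad a^{1} = \sigma\left(z^{1}\right)
$$

$$
z^{2} = w^{2} a^{1} + b^{2}, \qquad a^{2} = \sigma\left(z^{2}\right), \qquad C = \left(a^{2} - y\right)^{2}
$$

The sketch below traces this forward pass in plain Python and then applies the chain rule layer by layer, which is exactly the gradient computation that backpropagation organizes. All names and values are illustrative assumptions:

```python
import math

def sigma(z):
    # sigmoid activation
    return 1.0 / (1.0 + math.exp(-z))

def sigma_prime(z):
    # derivative of the sigmoid: sigma(z) * (1 - sigma(z))
    s = sigma(z)
    return s * (1.0 - s)

# parameters and data (arbitrary example values)
w1, b1 = 0.5, 0.1
w2, b2 = -0.3, 0.2
a0, y = 1.0, 0.0            # input activation a^0 and target y

# forward pass
z1 = w1 * a0 + b1
a1 = sigma(z1)              # layer 1
z2 = w2 * a1 + b2
a2 = sigma(z2)              # layer 2
C = (a2 - y) ** 2           # squared-error cost

# backward pass: chain rule applied layer by layer
dC_da2 = 2 * (a2 - y)                  # dC/da^2
dC_dz2 = dC_da2 * sigma_prime(z2)      # dC/dz^2
dC_dw2, dC_db2 = dC_dz2 * a1, dC_dz2   # layer-2 gradients
dC_da1 = dC_dz2 * w2                   # dC/da^1
dC_dz1 = dC_da1 * sigma_prime(z1)      # dC/dz^1
dC_dw1, dC_db1 = dC_dz1 * a0, dC_dz1   # layer-1 gradients
```

A gradient-descent step would then update each parameter, e.g. w1 -= lr * dC_dw1 for some learning rate lr.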