Introduction to Neural Networks for Self-Driving Cars (Foundational Concepts -- Part 2)
So now let's study gradient descent. Imagine we're standing somewhere on Mount ABC and we need to get down. The inputs of the function are the weights W1 and W2, and the error function is given by E. The gradient of E is the vector of the partial derivatives of E with respect to W1 and W2. This gradient tells us the direction to move if we want to increase the error function the most. Thus, if we take the negative of the gradient, it tells us the direction to move to decrease the error function the most, so at each step we update each weight by subtracting a small multiple of its partial derivative.
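The descent down the mountain can be sketched in code. This is a minimal illustration, not the course's implementation: the quadratic E(w1, w2) = w1^2 + w2^2 is a stand-in error function chosen so the partial derivatives are easy to write by hand, and the learning rate and step count are arbitrary.

```python
def error(w1, w2):
    # Stand-in error function E(w1, w2); a real network's E
    # would depend on the training data and model outputs.
    return w1 ** 2 + w2 ** 2

def gradient(w1, w2):
    # Partial derivatives of E with respect to w1 and w2;
    # together they form the gradient vector of E.
    return 2 * w1, 2 * w2

def gradient_descent(w1, w2, learning_rate=0.1, steps=50):
    for _ in range(steps):
        dw1, dw2 = gradient(w1, w2)
        # Step in the NEGATIVE gradient direction to decrease E.
        w1 -= learning_rate * dw1
        w2 -= learning_rate * dw2
    return w1, w2

w1, w2 = 3.0, -4.0  # an arbitrary starting point "on the mountain"
start = error(w1, w2)
w1, w2 = gradient_descent(w1, w2)
print(error(w1, w2) < start)
```

Each iteration moves the weights a small distance opposite the gradient, so the error shrinks step by step, just like taking the steepest downhill direction on the mountain.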
Sep-8-2022, 17:00:10 GMT