Understanding Gradients, Loss Functions, and Optimization Techniques in Neural Networks
What is a Gradient?
A gradient measures how much a small change in a parameter (like a weight) affects the model's output. In neural networks, gradients guide how to adjust weights to minimize the error (loss).
Intuition: Imagine standing on a hill. The gradient points in the direction of steepest ascent, so to reduce the loss you take a small step in the opposite direction, walking downhill toward the lowest point.
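To make the "step downhill" idea concrete, here is a minimal sketch of gradient descent on a one-parameter model. The loss function, data values, and learning rate below are illustrative choices, not taken from the article:

```python
# Minimal sketch: gradient descent on a toy one-parameter model.
# All names and values here are illustrative assumptions.

def loss(w, x, y):
    """Squared error of the linear model y_hat = w * x."""
    return (w * x - y) ** 2

def gradient(w, x, y):
    """Analytic derivative of the loss with respect to the weight w."""
    return 2 * x * (w * x - y)

# One observation: input x = 2.0, target y = 4.0 (so the ideal w is 2.0).
x, y = 2.0, 4.0
w = 0.0      # start far from the optimum
lr = 0.1     # learning rate (step size)

for step in range(25):
    w -= lr * gradient(w, x, y)   # step opposite the gradient: downhill

print(round(w, 4))
```

Each iteration nudges `w` against the gradient, so the loss shrinks until `w` settles near the value that makes the error zero.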