The Calculus That Powers Neural Networks
Here is my guide to the calculus concepts that actually matter for ML.
1. Derivatives and Gradients
At its heart, training a model is an optimization problem. We define a loss function L that measures how "bad" the model's predictions are, and training amounts to finding the parameters that minimize L.
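To make this concrete, here is a minimal sketch of minimizing a loss with its derivative. The loss L(w) = (w - 3)^2, the learning rate, and the starting point are illustrative choices, not from the article; the point is only that stepping against the derivative drives L downward.

```python
def loss(w):
    # A toy loss: smallest when w == 3
    return (w - 3) ** 2

def grad(w):
    # Analytic derivative of the loss above: dL/dw = 2 * (w - 3)
    return 2 * (w - 3)

w = 0.0    # arbitrary starting point
lr = 0.1   # learning rate (step size), chosen for illustration

for _ in range(100):
    w -= lr * grad(w)  # step downhill along the negative derivative

print(round(w, 4))  # -> 3.0, the minimizer of the loss
```

In a real network, w is a vector of millions of parameters and the derivative generalizes to the gradient, but the update rule is the same idea.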