Sanika Nandpure · sanikanandpure.hashnode.dev · May 19, 2024
Optimizers (the Adam Algorithm)
The Adam (Adaptive Moment Estimation) algorithm optimizes gradient descent. If it notices that gradient descent is taking small steps in the same direction, it increases the effective learning rate alpha so that it can reach the minimum faster; on the other extreme...
Tag: neural network optimizers
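The teaser describes Adam's adaptive behavior in words only. As a minimal sketch of the standard Adam update rule (this code is illustrative, not from the article; hyperparameter defaults follow the common convention of beta1=0.9, beta2=0.999, eps=1e-8):

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.05, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: exponential moving averages of the gradient (m) and
    its square (v), bias-corrected, then a per-parameter scaled step."""
    m = beta1 * m + (1 - beta1) * grad        # first moment (momentum)
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment (uncentered variance)
    m_hat = m / (1 - beta1 ** t)              # bias correction for early steps
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Minimize f(theta) = theta^2 starting from theta = 5.0
theta, m, v = np.array([5.0]), np.zeros(1), np.zeros(1)
for t in range(1, 2001):
    grad = 2 * theta
    theta, m, v = adam_step(theta, grad, m, v, t)
```

When gradients keep pointing the same way, m grows while sqrt(v) stays comparable, so the effective per-parameter step m_hat / (sqrt(v_hat) + eps) is large, which is the "reach the minimum faster" behavior the teaser alludes to.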
Kaan Berke UGURLAR · kaanberke.hashnode.dev · Sep 17, 2023
This New Optimizer Called Lion Could Replace Adam as the Go-To for Training Neural Nets
Google Brain researchers have discovered a new optimization algorithm for training deep neural networks, called Lion, that outperforms the popular Adam optimizer on a variety of computer vision and natural language processing tasks. In a paper published...
Tag: Optimizer
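For comparison with Adam, a rough sketch of the Lion update rule as published by Google Brain (Chen et al., 2023): the step direction is the sign of an interpolation between the momentum and the current gradient, so every parameter moves by a fixed magnitude lr. The code below is an illustrative reimplementation, not taken from the article or the paper.

```python
import numpy as np

def lion_step(theta, grad, m, lr=0.01, beta1=0.9, beta2=0.99, wd=0.0):
    """One Lion update: sign of an interpolated gradient, plus decoupled
    weight decay; momentum m is updated afterwards with a second beta."""
    update = np.sign(beta1 * m + (1 - beta1) * grad)  # direction only, magnitude lr
    theta = theta - lr * (update + wd * theta)
    m = beta2 * m + (1 - beta2) * grad                # momentum tracks raw gradients
    return theta, m

# Minimize f(theta) = theta^2 starting from theta = 5.0
theta, m = np.array([5.0]), np.zeros(1)
for _ in range(1000):
    grad = 2 * theta
    theta, m = lion_step(theta, grad, m)
```

Because only the sign of the combined term is used, Lion needs less per-parameter state than Adam (one moment instead of two), which is part of its appeal for large models.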