Neural networks are hard to train. The deeper they get, the more likely they are to suffer from unstable gradients. Gradients can either explode or vanish, and neither is good for training our network. The vanishing gradient...
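To build intuition for why depth destabilizes gradients, here is a minimal sketch (with hypothetical numbers, not taken from the article): backpropagation multiplies one derivative factor per layer, and the sigmoid's derivative never exceeds 0.25, so an upper bound on the gradient scale shrinks geometrically with depth.

```python
def sigmoid_grad_bound(depth: int) -> float:
    """Upper bound on the gradient scale after `depth` sigmoid layers.

    The sigmoid derivative s(x) * (1 - s(x)) peaks at 0.25, so the
    product of `depth` such factors is at most 0.25 ** depth.
    """
    return 0.25 ** depth

for depth in (1, 5, 10, 30):
    print(f"{depth:2d} layers -> gradient scale <= {sigmoid_grad_bound(depth):.3e}")
```

The same multiplicative structure explains explosion: if each layer instead contributes a factor greater than 1, the product grows geometrically with depth.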
jeande.hashnode.dev