Introduction to Transformers and Self Attention
Why did we switch to Transformers in the first place? RNNs and LSTMs were developed well before the 2000s. They managed to translate text and were used in time-series prediction to forecast outcomes from past events; LSTMs even helped increase...
kunal221.hashnode.dev · 14 min read
Prateek Shokeen