Attention Is All You Need: How the Transformer Architecture in NLP Began
Original paper: "Attention Is All You Need" (Vaswani et al., 2017).
This was THE paper that introduced the Transformer architecture to NLP. This transformative idea led to the rise of LLMs and addressed the long-standing problem of producing contextualized word embeddings, where a word's representation depends on the words around it.
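The paper's core mechanism is scaled dot-product attention: each token's output is a weighted average of all token values, with weights given by softmax(QK^T / sqrt(d_k)). Here is a minimal NumPy sketch of that formula; the function name and the toy inputs are illustrative, not from the original text.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Scaled dot-product attention from the paper:
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # (n, n) similarity scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax over keys
    return weights @ V, weights

# Toy self-attention: 3 token embeddings of dimension 4 attend to each other.
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(x, x, x)      # out has shape (3, 4)
```

Because Q, K, and V all come from the same sequence here (self-attention), each output row is a context-dependent mix of every token's embedding, which is exactly what makes the resulting embeddings "contextualized".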
Ok, some context first:
Let's...