bharathgaddam.hashnode.dev

LECTURE-2: Implementing Word2Vec with Negative Sampling from Scratch
Natural Language Processing (NLP) has taken tremendous strides over the last decade, and at the heart of many modern NLP techniques lies the idea of word embeddings: vector representations of words that capture their semantic and syntactic meanings. O...
Apr 14, 2025 · 4 min read
LECTURE-1: Understanding Word2Vec (Skip-Gram)
📌 Introduction: Understanding language with neural networks has become the backbone of many NLP applications. One of the first major breakthroughs was Word2Vec, developed by Tomas Mikolov in 2013. It brought the concept of word embeddings: dense vec...
Apr 9, 2025 · 5 min read
✨ Transfer Learning with BERT for Text Classification
🔍 Introduction: In this blog, we'll walk through a full pipeline for applying transfer learning using BERT (Bidirectional Encoder Representations from Transformers) to a text classification task. We'll use the Hugging Face Transformers library along ...
Apr 7, 2025 · 4 min read
PART-3: Neural Machine Translation with Attention
In recent years, Neural Machine Translation (NMT) has transformed how machines translate languages, moving from traditional statistical models to deep learning-based approaches. This research, based on the seminal work by Bahdanau et al. (2014), expl...
Jan 20, 2025 · 4 min read
PART-2: Universal Language Model Fine-tuning for Text Classification
Welcome to the series celebrating the foundational works that have shaped modern Natural Language Processing (NLP). Today we will discuss "Universal Language Model Fine-tuning for Text Classification", a paper that introduced an effective transfer...
Jan 3, 2025 · 4 min read