TANBIR for Machine Learning Club, NIT Silchar · ml-club-nits.hashnode.dev · Oct 24, 2024
Cracking the Code of Language: The Art of Word Embeddings.
🌟 INTRODUCTION: What are these Word Embeddings? In a broad sense, these are just representations of words using numbers or vectors. Well, there's more to it, and in this blog I will be going through the whole story of WORD EMBEDDINGS. "Okay, but why '...
29 reads · word embedding

TANBIR · word-embedding.hashnode.dev · Oct 17, 2024
Cracking the Code of Language: The Art of Word Embeddings.
🌟 INTRODUCTION: What are these Word Embeddings? In a broad sense, these are just representations of words using numbers or vectors. Well, there's more to it, and in this blog I will be going through the whole story of WORD EMBEDDINGS. "Okay, but why '...
10 likes · 48 reads · Machine Learning

Muneer Ahmed · thisismuneer.hashnode.dev · Oct 1, 2024
You are the average of the five others around you.
At its core, modern Natural Language Processing is built on a fascinating idea: "A word is characterized by the company it keeps." When I first learned about vector embeddings, this concept instantly reminded me of the saying "You are the average of ...
1 like · 41 reads · vector embeddings

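As an aside on the excerpt above: the "average of the company you keep" intuition can be shown with a toy sketch (my own illustration, not code from this post), where a word's vector is approximated by averaging the vectors of its context words, roughly the idea that CBOW-style models formalize. The words and numbers below are invented.

```python
import numpy as np

# Toy 3-dimensional "embeddings" for a few context words (values are invented).
context_vectors = {
    "royal":  np.array([0.9, 0.1, 0.2]),
    "throne": np.array([0.8, 0.2, 0.1]),
    "crown":  np.array([0.7, 0.1, 0.3]),
    "rule":   np.array([0.6, 0.3, 0.2]),
    "palace": np.array([0.8, 0.1, 0.1]),
}

# "You are the average of the company you keep":
# approximate the target word's vector as the mean of its neighbours.
target_estimate = np.mean(list(context_vectors.values()), axis=0)
print(target_estimate)  # [0.76 0.16 0.18]
```
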
pankaj chauhan · pankajchauhanblogs.hashnode.dev · Aug 30, 2024
From Bag of Words to Self-Attention: The Evolution of Understanding Text in Machine Learning
Introduction: The Journey of Text Understanding. In the realm of natural language processing (NLP), one of the most intriguing challenges has always been how machines can effectively understand and interpret human language. This journey has seen the e...
nlp

Junyu Chen for pgvecto.rs · blog.pgvecto.rs · Jul 5, 2024
Unleash the power of sparse vector
In the past, hybrid search combined two search methods: traditional keyword-based search and vector-based similarity search. However, a sparse vector can act as a substitute for keyword search, unlocking the full potential of the data pipeline with pur...
2 likes · 247 reads · sparsevectors

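A rough sketch of what "sparse vector as a substitute for keyword search" means in practice (my own illustration, not code from the pgvecto.rs post): each document and the query are weight maps over terms, and relevance is a dot product over the few terms they share.

```python
# Minimal sparse-vector scoring sketch: documents and the query are
# {term: weight} mappings; relevance is the dot product over shared terms.
def sparse_dot(query: dict[str, float], doc: dict[str, float]) -> float:
    return sum(weight * doc[term] for term, weight in query.items() if term in doc)

docs = {
    "doc1": {"hybrid": 1.2, "search": 0.9, "vector": 1.5},
    "doc2": {"keyword": 1.1, "search": 1.0},
}
query = {"vector": 1.0, "search": 0.5}

ranked = sorted(docs.items(), key=lambda kv: sparse_dot(query, kv[1]), reverse=True)
print(ranked)  # doc1 scores 1.5*1.0 + 0.9*0.5 = 1.95; doc2 scores 0.5
```
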
Saurabh Naik · saurabhz.hashnode.dev · Apr 18, 2024
Cracking the Code: Understanding Encoder-Decoder Architecture and Distance Measures for Word Embeddings with Python
Introduction: Word embeddings, a cornerstone of natural language processing (NLP), provide a means to represent words in a machine-understandable format by capturing their semantic meaning. In this technical blog, we delve into the encoder-decoder ar...
30 reads · Generative AI · distances

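To make the distance measures mentioned in this post concrete, here is a minimal Python sketch (my own, not taken from the blog) of the two most common choices for comparing word embeddings, Euclidean distance and cosine similarity. The example vectors are illustrative only.

```python
import numpy as np

def euclidean_distance(a: np.ndarray, b: np.ndarray) -> float:
    # Straight-line distance between two embedding vectors.
    return float(np.linalg.norm(a - b))

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Angle-based similarity: 1.0 means same direction, 0.0 means orthogonal.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Illustrative 3-d vectors; real word embeddings typically have 50-300 dimensions.
king  = np.array([0.8, 0.65, 0.1])
queen = np.array([0.75, 0.7, 0.15])
apple = np.array([0.1, 0.2, 0.9])

print(euclidean_distance(king, queen))   # small: similar words
print(cosine_similarity(king, queen))    # close to 1.0
print(cosine_similarity(king, apple))    # much lower
```
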
Fotie M. Constant · blog.fotiecodes.com · Apr 11, 2024
Explaining Embeddings in Machine Learning like I'm 10
Words are very powerful! To help people understand us, we use them to communicate our ideas and thoughts. The funny thing is, although humans understand words, AI models don't actually understand them. All they understand are numbers! Thus, we need t...
10 likes · 46 reads · Machine Learning

Nitin Agarwal · cognibits.hashnode.dev · Jan 24, 2024
Understanding Word Embeddings: Word2Vec and GloVe
Word embeddings are a crucial component of natural language processing (NLP) and machine learning applications. They represent words as dense vectors in a continuous vector space, capturing semantic relationships between words. In this article, we'll...
nlp

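A minimal Word2Vec sketch that makes the "dense vectors in a continuous vector space" idea concrete (assuming gensim 4.x is installed; the toy corpus and hyperparameters are my own illustrative choices, not taken from the article):

```python
from gensim.models import Word2Vec

# A tiny toy corpus: each "sentence" is a list of tokens.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "cat", "chases", "the", "mouse"],
    ["the", "dog", "chases", "the", "cat"],
]

# Train dense vectors; sg=1 selects the skip-gram variant.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=100)

print(model.wv["king"].shape)                 # (50,) -- a dense, real-valued vector
print(model.wv.similarity("king", "queen"))   # cosine similarity of two embeddings
```

With such a tiny corpus the similarities are noisy; the point is only the shape of the workflow: tokenized sentences in, one dense vector per vocabulary word out.
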
Atrij Paul · atrijpaul.hashnode.dev · Nov 16, 2023
Exquisite Exposition of Word Embeddings
What is Word Embedding? In Natural Language Processing, word embedding is a term used for the representation of words for text analysis, typically in the form of real-valued vectors that encode the meaning of the word such that the words that are...
54 likes · 405 reads · word2vec

pici · pici.hashnode.dev · Oct 31, 2023
Word embedding
Introduction: Word embeddings have become a foundational technology in Natural Language Processing (NLP), providing a way to represent words and documents in a numerical format. In this blog post, we'll explore the use of word embeddings in the contex...
Natural Language Processing · nlp