Apr 7 · 7 min read · Mem0 embedding is a technique that transforms data into dense vector representations, enabling AI agents to store and retrieve information semantically. This method allows for rapid, similarity-based searches of past experiences or knowledge, crucial...
Feb 12 · 4 min read · 🔹 Why Do We Need Relationships in MongoDB? In real applications, data is connected. Real-life example (Railway system 🚆): one Train has many Coaches, one User can have many Bookings, one Booking belongs to one Train. So the question is: 👉 How do ...
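The one-to-many pattern that teaser describes can be sketched without a running database: each "coach" document stores a reference to its parent "train", and a lookup filters on that reference. The collection and field names (`trains`, `coaches`, `train_id`) and the sample data are illustrative, not from the article.

```python
# Minimal sketch of MongoDB-style one-to-many references, modeled with plain dicts.
# All names and data here are hypothetical examples.
trains = [{"_id": 1, "name": "Express 12951"}]
coaches = [
    {"_id": 101, "train_id": 1, "coach_class": "AC"},
    {"_id": 102, "train_id": 1, "coach_class": "Sleeper"},
]

def coaches_for(train_id):
    # Rough equivalent of db.coaches.find({"train_id": train_id})
    return [c for c in coaches if c["train_id"] == train_id]

print([c["_id"] for c in coaches_for(1)])  # [101, 102]
```

Storing the reference on the "many" side keeps each train document small; the alternative is embedding the coach documents inside the train, which trades query flexibility for fewer lookups.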
Feb 10 · 11 min read · You're building a RAG system and need to pick an embedding model. The options are overwhelming: OpenAI, Voyage, Google, Cohere, or self-hosted open-source. Prices range from free to $0.13 per million tokens. Dimensions range from 256 to 3072. How do ...
Feb 8 · 14 min read · If you've ever wondered how AI "understands" that "king" is closer to "queen" than to "pizza," you're about to find out. And no, it's not magic, it's math. Specifically, it's embeddings and vector similarity. This is the foundation that powers semant...
Jan 24 · 4 min read · Have you ever wondered why a language model can tell that “king” and “queen” are related, or that “apple” and “banana” belong to the same category? At first glance, these connections seem obvious — but under the hood, language models don’t “know” any...
Dec 30, 2025 · 4 min read · Before a machine can reason, retrieve knowledge, or generate human-like responses, it must first answer a deceptively simple question: how should text be represented as numbers? Natural language, rich with context and nuance, cannot be processed dire...
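One of the simplest answers to "how should text be represented as numbers" is a bag-of-words count vector, which later approaches like learned embeddings improve on. A minimal sketch with a two-document toy corpus (the corpus and vocabulary here are invented for illustration):

```python
# Bag-of-words: represent each text as word counts over a fixed vocabulary.
corpus = ["the cat sat", "the dog sat"]

# Build a sorted vocabulary from every word in the corpus.
vocab = sorted({word for doc in corpus for word in doc.split()})
# vocab == ['cat', 'dog', 'sat', 'the']

def vectorize(text):
    words = text.split()
    return [words.count(term) for term in vocab]

print(vectorize("the cat sat"))  # [1, 0, 1, 1]
```

The limitation that motivates embeddings is visible even here: "cat" and "dog" get unrelated dimensions, so the vectors carry no notion that the two words are similar.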
Dec 17, 2025 · 7 min read · Brewing GenAI — One Cup of Chai at a Time ☕ This is the first blog in a series I’ll be posting over the coming days, covering concepts from my Generative AI (GenAI) course—basically a learning diary for me. Just like how we sip chai slowly and enjoy ...