retzam.hashnode.dev · BERT in its Elements · Hello, I know, still on BERT, right? Well, it's a lot to learn, and it's worth it, no shortcuts. We learned about General Language Pre-training and a little about the architecture of BERT. W... · 11h ago · 8 min read
retzam.hashnode.dev · So I have been using OpenClaw for a week now · Hello! Today we'll be taking another spontaneous detour from our AI series for a Mid-Series Special. This time it'll be about the highly popular OpenClaw project. If you haven't heard of OpenClaw, it's an open-source tool/project that provides a... · Feb 16 · 8 min read
retzam.hashnode.dev · BERT and GLP · Hello, We learned the core philosophy behind BERT in the previous chapter of our series. In this chapter, we'll learn more about BERT and another pivotal process in Machine Learning known as General Language Pre-training (GLP)... · Feb 9 · 6 min read
retzam.hashnode.dev · Introduction to BERT (Bidirectional Encoder Representations from Transformers) · Aloha, We learned a little about Large Language Models in our last series chapter, which brought us to the 3 types of LLMs (BERT, GPT, LLaMA). We'll start with BERT. Let's get right into it. It all started with Google Engineers yet again. Jacob ... · Jan 26 · 4 min read
retzam.hashnode.dev · Introduction To Large Language Models (LLMs) · Hello, We've come a long way in our AI series. Let me start with a brief recap. We started from scratch with Supervised and Unsupervised Learning models, where we learned and built models like K-Nearest Neighbors, Naive Bayes, K-Means Clustering, ... · Jan 19 · 4 min read