Sam Schneider · blog.samschneider.me · Aug 16, 2024
A Very Simple Example of Fine-Tuning an LLM
Normally when you fine-tune an LLM, you end up making Jeff Bezos just a little bit richer due to the enormous compute power required for even the simplest fine-tuning. I tried every free avenue I could think of to demonstrate fine-tuning using Mist...
Tagged: finetuning

Sam Schneider · blog.samschneider.me · Jul 30, 2024
Type Code, Get AI
Many people are extremely conversant in AI but still haven't typed that first command to interact with a real AI model. Let's change that in a single blog post. The first technology you need is called a Jupyter Notebook. There are lots of online host...
Tagged: Jupyter Notebook

Arnab Dey · arnabde.hashnode.dev · Jul 17, 2024
Analyzing the Architectural Components in Large Language Models (LLMs)
We'll explore the architecture of the GPT-2 model and understand how it generates text, deep diving into the intricacies of LLMs and their pivotal role in the realm of GenAI. I'll be doing this in Google Colab with Python and a couple of packages call...
Tagged: LLM's

DataChef for DataChef's Blog · blog.datachef.co · Jul 29, 2021
Paraphrasing and Style Transfer with GPT-2
Since their introduction in 2017, transformers have grown more popular by the day. They were initially proposed in the paper Attention Is All You Need. In Natural Language Processing, a transformer is a structure that helps perform seq2seq tasks (te...
Tagged: GPT-2