Brian King · solodev.app · Apr 24, 2024 — Installing Twinny for VS Code
TL;DR: Installing Twinny for VS Code involves creating two Miniconda environments to run different variations of Code Llama using LiteLLM, which provides accessible IP addresses and ports for local LLMs. Twinny, an AI code completion tool, is integra…
The AI Series · #Twinny
Brian King · solodev.app · Mar 12, 2024 — Installing AutoGen Studio
TL;DR: This post is a guide to installing and setting up Microsoft AutoGen Studio, a tool for orchestrating collaborative AI agents. It covers prerequisites such as setting up Ubuntu with Miniconda, Ollama and Llama2, and LiteLLM. The process inclu…
The AI Series · AutoGen Studio
Jensen Low · jensenlwt.io · Nov 24, 2023 — Extending Ollama 🚀: Leverage the Power of Ollama and LiteLLM for OpenAI Compatibility
What is Ollama lacking? Before we deep dive into extending Ollama, let me just state that I absolutely love using Ollama. The ease of installing it on my macOS, and downloading the latest LLMs to run models like Yi and Mistral, cannot be over-emph…
LLM Tools · litellm
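The common thread in these posts is using LiteLLM to put an OpenAI-compatible endpoint in front of local Ollama models. As a rough sketch of that setup (the model name, port, and api_base below are assumptions for illustration, not taken from the posts themselves):

```yaml
# Minimal LiteLLM proxy config sketch: expose a local Ollama model
# behind an OpenAI-compatible HTTP endpoint.
# "mistral" and the api_base are placeholder assumptions; adjust to your setup.
model_list:
  - model_name: mistral
    litellm_params:
      model: ollama/mistral
      api_base: http://localhost:11434   # Ollama's default local port
```

With a config like this, a proxy started via something like `litellm --config config.yaml` can serve requests from any OpenAI-style client pointed at the proxy's address, which is the kind of accessible IP/port arrangement these posts build on.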