Setting Up a Local LLM Chat Server with Ollama and OpenWebUI on Alpine Linux
Running your own local Large Language Model (LLM) server can be an excellent way to repurpose older hardware while maintaining control over your AI interactions. This guide will walk you through setting up Ollama and OpenWebUI on Alpine Linux, a lightweight distribution well suited to modest hardware.
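As a rough preview of where the guide is headed, the whole stack can be brought up with a few commands. This is only a sketch under some assumptions: Docker is installed on the Alpine host (Ollama's official install script targets glibc, while Alpine uses musl, so containers are the safer route), and `llama3` is just an example model name.

```shell
# Run Ollama in a container; its API listens on port 11434
docker run -d --name ollama \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama

# Pull an example model inside the container
docker exec -it ollama ollama pull llama3

# Run OpenWebUI on port 3000, pointed at the Ollama API on the host.
# --add-host makes host.docker.internal resolve on Linux.
docker run -d --name open-webui \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  ghcr.io/open-webui/open-webui:main
```

After that, the chat interface is reachable at `http://<host>:3000` in a browser. The rest of the guide covers these steps in detail.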