Using Ollama with Quadlets
There are numerous posts around the internet about getting Ollama running for local AI, but not many seem to know about the convenience of Quadlets! Quadlets let you run Podman containers as systemd services. With my AMD GPU, I'm using the :rocm version of the image, as in the sketch below.
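To make that concrete, here is a minimal sketch of what such a Quadlet file could look like. The file location shown is the standard rootless-Podman Quadlet directory; the volume name and description are illustrative assumptions, while the :rocm image tag, the /dev/kfd and /dev/dri device mappings, and port 11434 follow Ollama's own documentation for AMD GPUs:

```ini
# ~/.config/containers/systemd/ollama.container
[Unit]
Description=Ollama local LLM server

[Container]
# The :rocm tag of the official image enables AMD GPU acceleration
Image=docker.io/ollama/ollama:rocm
# Expose the AMD GPU devices to the container
AddDevice=/dev/kfd
AddDevice=/dev/dri
# Persist downloaded models in a named volume (name is illustrative)
Volume=ollama:/root/.ollama
# Ollama's default API port
PublishPort=11434:11434

[Install]
# Start automatically with the user session
WantedBy=default.target
```

After running `systemctl --user daemon-reload`, Quadlet generates an `ollama.service` unit, and `systemctl --user start ollama.service` launches it like any other systemd service.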