Run Ollama on a Pi5
This Sunday morning, I decided to check whether Ollama could run on a Pi5.
My Pi5 has 8 GB of RAM and uses a SanDisk 256 GB Extreme PRO microSDXC card (up to 200 MB/s). I plan to do more experiments with an NVMe Base extension board + SSD in the future, but th...
k33g.hashnode.dev · 5 min read
It works like a charm on Fedora with podman compose! I had to tweak the ollama-service a tiny bit, with Philippe Charrière's help:
```yaml
# https://cheshirecat.ai/local-models-with-ollama/
services:
  ollama-service:
    container_name: ollama_pi_local
    user: root
    image: ollama/ollama:0.1.27
    volumes:
      - ollama-data:/root/.ollama
    ports:
      - 11434:11434

volumes:
  ollama-data:
```
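Once the container is up (for example with `podman compose up -d`), the service exposes Ollama's REST API on port 11434. A minimal sketch of calling the `/api/generate` endpoint from Python, using only the standard library; the model name `tinyllama` is an assumption here, substitute whatever model you have pulled:

```python
import json
import urllib.request

# Assumes the compose stack above is running on the same machine.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(model: str, prompt: str) -> bytes:
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False asks for a single JSON response instead of a
    stream of chunks.
    """
    payload = {"model": model, "prompt": prompt, "stream": False}
    return json.dumps(payload).encode("utf-8")


def generate(model: str, prompt: str) -> str:
    """Send the prompt to the local Ollama container and return the text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # "tinyllama" is just a placeholder for a small model that fits in 8 GB.
    print(generate("tinyllama", "Why is the sky blue?"))
```

On an 8 GB Pi5 it makes sense to stick to small quantized models; anything that fits comfortably in RAM will avoid swapping to the microSD card.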