Thanks Joseph, the dev space is changing drastically right now and it is difficult to keep up with the new trends. Your blogs have hit the right spot for me so far, as I am also looking into working with local AI.
Great practical walkthrough! I've been running Ollama on a Mac Mini with 64GB unified memory for months now, and the local inference experience has improved dramatically.
One thing I found running multiple models for different tasks: `qwen3:30b` handles complex reasoning surprisingly well for its size, while `gemma3:27b` is solid for summarization. The key was keeping Ollama's `OLLAMA_MAX_LOADED_MODELS` tuned so that swapping between models doesn't kill your memory.

Curious about the CORS setup in production: did you end up using a reverse proxy, or does Ollama's built-in `OLLAMA_ORIGINS` env var cover most use cases? In my setup I had to whitelist specific origins when multiple local apps were hitting the same Ollama instance simultaneously.
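For anyone else trying this, a minimal sketch of the setup I'm describing, set before launching the server. The specific values and origin URLs here are just placeholders for my own machine, not recommendations:

```shell
# Cap how many models stay resident at once so swapping
# between them doesn't exhaust unified memory (value depends on your RAM)
export OLLAMA_MAX_LOADED_MODELS=2

# Whitelist the specific local apps allowed to call the API,
# instead of leaving CORS wide open (example origins only)
export OLLAMA_ORIGINS="http://localhost:3000,http://localhost:5173"

ollama serve
```

A reverse proxy gives you finer control (auth, rate limiting), but for a handful of local apps the env var alone has been enough for me.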