Anubhav Singh · xprilion.com · Jun 27, 2024

Ollama Models on Cloud Run

This blog is a read-along for the repository xprilion/ollama-cloud-run, which shows how to deploy various models behind the Ollama API on Cloud Run and run CPU-only inference on a serverless platform, so you incur bills only when the models are actually used. Olla...
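As a rough sketch of what querying such a deployment could look like once the service is up (the service URL and model name below are placeholders, not values taken from the post), a client simply sends requests to the standard Ollama HTTP API exposed by the Cloud Run URL:

```python
import requests

# Hypothetical Cloud Run service URL -- replace with the URL printed by
# `gcloud run deploy` for your own deployment.
BASE_URL = "https://ollama-service-xxxxxxxx-uc.a.run.app"

# The Ollama HTTP API exposes /api/generate for one-shot completions.
resp = requests.post(
    f"{BASE_URL}/api/generate",
    json={
        "model": "gemma:2b",   # assumes this model was pulled into the container image
        "prompt": "Why is the sky blue?",
        "stream": False,       # return a single JSON object instead of a token stream
    },
    timeout=300,               # CPU-only inference on Cloud Run can take a while
)
resp.raise_for_status()
print(resp.json()["response"])
```

Because Cloud Run scales the service to zero between requests, the first call after a cold start may be noticeably slower than subsequent ones.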