Philippe Charrière · k33g.hashnode.dev · Apr 4, 2024

Create a Java GenAI Stack

In this series, "AI Experiments with Ollama on a Pi5," I explained that you could run an LLM locally, even on a small machine. This is possible using "small" LLMs like DeepSeek Coder, TinyLlama, TinyDolphin, etc. I ran all my examples with Docker Compos...

Tags: Docker Compose, Ollama, LangChain4j, Vert.x
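Since the post's examples run Ollama through Docker Compose, here is a minimal sketch of what such a Compose file might look like. This is an assumption for illustration, not the file from the series; the service name, volume name, and image tag are placeholders, while 11434 is Ollama's default API port.

```yaml
# Hypothetical sketch: serve Ollama locally via Docker Compose.
# Not taken from the post; names and tags are illustrative.
services:
  ollama:
    image: ollama/ollama:latest
    ports:
      - "11434:11434"        # Ollama's default HTTP API port
    volumes:
      - ollama-data:/root/.ollama  # persist pulled models across restarts

volumes:
  ollama-data:
```

With the service up, a small model such as TinyLlama could then be pulled with `docker compose exec ollama ollama pull tinyllama` and queried from a Java client like LangChain4j.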