Philippe Charrière · k33g.hashnode.dev · Apr 4, 2024

Create a Java GenAI Stack

In this series, "AI Experiments with Ollama on a Pi5," I explained that you can run an LLM locally, even on a small machine. This is possible using "small" LLMs like DeepSeek Coder, TinyLlama, TinyDolphin, etc. I ran all my examples with Docker Compose...

Tags: Docker Compose, Ollama, LangChain4J, Vert.x
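As a rough sketch of the setup the teaser describes — running a small LLM locally through Ollama with Docker Compose — a minimal compose file might look like this. The service name, volume name, and model choice are illustrative assumptions, not taken from the article:

```yaml
# Hypothetical minimal compose.yml for a local Ollama service.
# Port 11434 is Ollama's default HTTP API port.
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    volumes:
      # Persist downloaded models between container restarts.
      - ollama_data:/root/.ollama

volumes:
  ollama_data:
```

After `docker compose up -d`, a small model such as TinyLlama could then be fetched with `docker compose exec ollama ollama pull tinyllama` and queried through the API on port 11434.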