Local LLMs on WSL2
The example below demonstrates running Ollama and open-webui locally in a containerized environment on WSL2.
- The model Ollama is running in this example: qwen2.5-coder:0.5b
- It presently runs on the CPU, as I have Intel Iri...
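For this kind of setup, the two containers can be wired together with a Compose file along these lines. This is a sketch, not the article's own configuration: the image tags, port mappings, volume name, and the `OLLAMA_BASE_URL` environment variable are assumptions based on the two projects' published defaults.

```yaml
services:
  ollama:
    image: ollama/ollama            # CPU-only; no GPU device mapping configured
    volumes:
      - ollama:/root/.ollama        # persist pulled models across restarts
    ports:
      - "11434:11434"               # Ollama's default API port

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      # Point the UI at the ollama service over the Compose network
      - OLLAMA_BASE_URL=http://ollama:11434
    ports:
      - "3000:8080"                 # web UI reachable at http://localhost:3000
    depends_on:
      - ollama

volumes:
  ollama:
```

With something like this in place, `docker compose up -d` starts both services inside WSL2, and the model can then be pulled with `docker compose exec ollama ollama pull qwen2.5-coder:0.5b`.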
lostinopensrc.hashnode.dev · 1 min read
Lamri Abdellah Ramdane
Developer passionate about clean code, open source, and exploring new tech.