Local LLMs on WSL2
The example below demonstrates running Ollama and Open-WebUI locally in a containerized WSL2 environment (a sketch of the container commands follows the notes below).
- The model Ollama is running in this example: qwen2.5-coder:0.5b
- Presently it runs on the CPU, as I have Intel Iris...
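For reference, here is a minimal sketch of how such a setup can be brought up with Docker inside WSL2. It follows the publicly documented commands for the ollama/ollama and ghcr.io/open-webui/open-webui images rather than the exact steps from this post, so the port mappings, volume names, and the host.docker.internal mapping are assumptions you may need to adjust.

```bash
# Assumed sketch: Ollama + Open-WebUI as containers inside WSL2
# (requires Docker running in the WSL2 distro, or Docker Desktop with the WSL2 backend).

# 1. Start the Ollama container; it serves its HTTP API on port 11434.
docker run -d --name ollama \
  -p 11434:11434 \
  -v ollama:/root/.ollama \
  ollama/ollama

# 2. Pull the small coder model used in this example.
docker exec -it ollama ollama pull qwen2.5-coder:0.5b

# 3. Start Open-WebUI and point it at the Ollama API on the host.
docker run -d --name open-webui \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:main
```

With WSL2's localhost forwarding, an Open-WebUI instance started this way is typically reachable from a Windows browser at http://localhost:3000, and the model runs entirely on the local CPU unless GPU passthrough is configured.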