Docker Can Run LLMs Locally—Wait, What!?
Using Docker to run Large Language Models (LLMs) locally? Yes, you heard that right. Docker is now about much more than running container images: with Docker Model Runner, you can pull, run, and interact with LLMs locally.
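As a rough sketch of what that workflow looks like, the commands below assume Docker Desktop with the Model Runner feature enabled; the model name `ai/smollm2` is just an illustrative example from Docker's model catalog:

```shell
# Pull a model image from Docker Hub's model catalog (example model)
docker model pull ai/smollm2

# List models available locally
docker model list

# Run the model and send it a one-off prompt
docker model run ai/smollm2 "Explain containers in one sentence."
```

Run without a prompt, `docker model run` drops you into an interactive chat session instead.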
It’s a no-brainer that we’ve seen ...
pradumnasaraf.hashnode.dev · 6 min read