2025-03-29 10:29am
The next version of Docker Desktop (v4.40) will add native LLM capability to the Docker CLI. Docker Model Runner is not yet publicly released, but it adds commands like docker model run that run LLMs outside of containers. Initial reports look promising, and it may be a nice replacement for running llama.cpp, koboldcpp, or ollama locally.
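A rough sketch of what the workflow might look like once it ships (the model name below is only an illustration, and exact commands and flags may differ in the final release):

    # pull a model, then run a one-off prompt against it
    docker model pull ai/smollm2
    docker model run ai/smollm2 "Summarize what Docker Model Runner does."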