Your own LLM platform is now one 'pip install' away
Kalavai worker and seed nodes now run on Docker, and so can your LLMs
Kalavai is a tool that turns your devices into a scalable LLM platform for fast, robust prototyping and a seamless transition to production, where LLMs really deliver value. Ideal for pros and hobbyists alike: put those RTXs, Quadros and GTXs to work as one large supercomputer.
Kalavai is now more accessible than ever. We have published it as a Python package, and along with it we have whittled down the requirements to run it to almost zero. All you need to get started is python 3.4+ and docker.
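If you want to check those two prerequisites before installing, here is a minimal sketch. It is a hypothetical helper, not part of the kalavai-client package, and it only verifies that the interpreter version is new enough and that a `docker` executable is on your PATH:

```python
import shutil
import sys


def check_prerequisites(min_python=(3, 4)):
    """Return a list of prerequisites missing for running kalavai.

    Checks the running Python version and whether a 'docker'
    executable can be found on the PATH. (Illustrative helper only.)
    """
    missing = []
    if sys.version_info < min_python:
        missing.append("python %s+" % ".".join(map(str, min_python)))
    if shutil.which("docker") is None:
        missing.append("docker")
    return missing


missing = check_prerequisites()
if missing:
    print("Install before continuing:", ", ".join(missing))
else:
    print("Ready for 'pip install kalavai-client'")
```

Note that `shutil.which` only confirms the Docker client binary exists; it does not guarantee the Docker daemon is running.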
It really could not get any easier:
pip install kalavai-client
With that simple command you get:
- An LLM platform to easily deploy Large Language Models like DeepSeek.
- Support for multiple model engines (vLLM, llama.cpp, Petals and more).
- Ultimate DIY: easily add your devices to get more computing power. Kalavai handles the distribution of workload.
Why should you care?
You want to run the latest models, not litter your computer with Python libraries and OS plugins. With Docker, Kalavai is now far less intrusive on your computer (no third-party installations required).
Workers run in Docker containers, which are isolated from your system for safe experimentation. Docker also lets us support more systems: if you can run Docker, you can run Kalavai! Yes, that includes Linux, Windows and MacOS*.
What are you waiting for? Give it a shot. Get started with minimal effort on our public LLM pool, or go full DIY by hosting your own.
If you are visually-oriented, here’s a short series of videos to get you kalavai-ready:
* Support for Windows and MacOS workers is experimental: kalavai workers run in Docker containers that require access to the host network interfaces, so workers on systems that do not run containers natively (Windows and MacOS) may have trouble finding each other.