Self-Host Kyomi
Run the full platform on your own infrastructure. Your data, your LLM key, your rules.
Full Control: Nothing leaves your network. Data, queries, and AI conversations stay on your infrastructure.
Bring Your Own LLM: Use your Anthropic, OpenAI, Gemini, or any OpenAI-compatible API key. Pay your provider directly.
Open Source: AGPL-3.0 licensed. Audit the code, contribute, or fork it on GitHub.
Get Started in 30 Seconds
Choose your preferred installation method.
Option 1: Docker (recommended)
One command installs Kyomi with PostgreSQL. Requires Docker with Compose.
$ curl -fsSL https://get.kyomi.ai | sh
The installer will prompt for your LLM API key and access URL, generate security keys, and start everything.
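If you'd rather manage the stack yourself than run the installer, the setup can be sketched as a Compose file. This is a minimal illustration, not the installer's actual output: the service layout, image tag, and every variable name other than `LLM_PROVIDER` and `LLM_API_KEY` (which appear in the standalone instructions below) are assumptions.

```yaml
# Hedged sketch of a Kyomi + PostgreSQL stack.
# Names other than LLM_PROVIDER / LLM_API_KEY are assumptions.
services:
  kyomi:
    image: ghcr.io/kyomi-ai/kyomi:latest
    ports:
      - "3000:3000"                 # standalone binary serves on 3000; assumed same here
    environment:
      LLM_PROVIDER: anthropic
      LLM_API_KEY: ${LLM_API_KEY}
      DATABASE_URL: postgres://kyomi:kyomi@db:5432/kyomi   # assumed variable name
    depends_on:
      - db
  db:
    image: postgres:16
    environment:
      POSTGRES_USER: kyomi
      POSTGRES_PASSWORD: kyomi      # replace with a generated secret
      POSTGRES_DB: kyomi
    volumes:
      - pgdata:/var/lib/postgresql/data
volumes:
  pgdata:
```

The installer generates its own secrets and configuration; treat this only as a starting point for a hand-managed deployment.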
Option 2: Standalone Binary
A single self-contained binary with the frontend, AI model, and server built in. Uses SQLite — no external database needed.
# Download for your platform
$ curl -L https://github.com/kyomi-ai/kyomi/releases/latest/download/kyomi-linux-amd64.tar.gz | tar xz
# Set your LLM API key and run
$ export LLM_PROVIDER=anthropic
$ export LLM_API_KEY=sk-ant-...
$ ./kyomi
Open http://localhost:3000 in your browser. Data is stored in ./data/ by default.
Downloads
Pre-built binaries for every major platform. All releases are available on GitHub Releases.
| Platform | Architecture | Download |
|---|---|---|
| Linux | x86_64 (amd64) | tar.gz |
| Linux | ARM64 (aarch64) | tar.gz |
| macOS | Apple Silicon (arm64) | tar.gz |
| macOS | Intel (amd64) | tar.gz |
| Docker | Multi-arch (amd64 + arm64) | ghcr.io/kyomi-ai/kyomi |
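For a quick trial of the container image without the installer, a single `docker run` should work, assuming the image listens on port 3000 (as the standalone binary does) and accepts the same `LLM_*` environment variables; the exact flags and defaults here are assumptions, not documented behavior.

```shell
# Hedged sketch: port and env-var names carried over from the standalone instructions.
docker run -d \
  -p 3000:3000 \
  -e LLM_PROVIDER=anthropic \
  -e LLM_API_KEY=sk-ant-... \
  ghcr.io/kyomi-ai/kyomi:latest
```

Note that this runs without PostgreSQL, so it is only suitable for a quick look; use the installer or a Compose file for anything persistent.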
System Requirements
Standalone Binary
- 2 GB RAM minimum
- 1 GB disk space
- An LLM API key
- Linux (glibc) or macOS
- No external database needed (uses SQLite)
Docker Compose
- 4 GB RAM minimum
- Docker with Compose plugin
- An LLM API key
- Uses PostgreSQL (included in compose)
- Better for production and multi-user
Self-Hosted vs Cloud
| Feature | Self-Hosted | Cloud |
|---|---|---|
| AI Chat & Analysis | Your LLM key, unlimited | Included (per-tier budget) |
| Dashboards | Unlimited | Per plan |
| Datasources | All 9 supported | All 9 supported |
| SQL Editor | Full | Full |
| Forecasting | Built-in | Built-in |
| Kyomi Watch | Included | Pro & Team |
| MCP Support | Included | All plans |
| Users | Unlimited | Per plan |
| Data Residency | Your infrastructure | Kyomi Cloud (AU) |
| Updates | Manual (upgrade script) | Automatic |
| Support | Community (GitHub) | Email / Priority |
| Cost | Free (+ your LLM costs) | From $0/month |
Supported LLM Providers
Self-hosted Kyomi works with any of these AI providers. You bring your own API key.
Anthropic: Claude 4, Claude 3.5 Sonnet
OpenAI: GPT-4o, GPT-4 Turbo
Google Gemini: Gemini 2.5 Pro, Flash
OpenAI-Compatible: Ollama, vLLM, LiteLLM, any compatible API
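To point self-hosted Kyomi at a local OpenAI-compatible server such as Ollama, configuration along these lines should work. Only `LLM_PROVIDER` and `LLM_API_KEY` appear in the instructions above; the `openai-compatible` value and the `LLM_API_BASE` variable are assumed names, so check your installation's configuration reference.

```shell
# Assumed variable names for an OpenAI-compatible endpoint (e.g. a local Ollama server).
export LLM_PROVIDER=openai-compatible          # assumed provider value
export LLM_API_BASE=http://localhost:11434/v1  # assumed variable; Ollama's OpenAI-compatible endpoint
export LLM_API_KEY=ollama                      # many local servers accept any non-empty key
./kyomi
```

Running against a local model this way keeps even the AI conversations off third-party infrastructure entirely.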
Try It Now
One command to install. Five minutes to your first insight.