
Self-Host Kyomi

Run the full platform on your own infrastructure. Your data, your LLM key, your rules.

Full Control

Nothing leaves your network. Data, queries, and AI conversations stay on your infrastructure.

Bring Your Own LLM

Use an Anthropic, OpenAI, or Gemini key, or any OpenAI-compatible API. Pay your provider directly.

Open Source

AGPL-3.0 licensed. Audit the code, contribute, or fork it. View on GitHub

Get Started in 30 Seconds

Choose your preferred installation method.

Option 1: Docker (recommended)

One command installs Kyomi with PostgreSQL. Requires Docker with Compose.

$ curl -fsSL https://get.kyomi.ai | sh

The installer will prompt for your LLM API key and access URL, generate security keys, and start everything.
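The result is a standard Docker Compose project, so day-two operations use ordinary Compose commands. A sketch, run from the directory containing the generated compose file (the exact service names may differ; check the file the installer writes):

```shell
# Check that the app and PostgreSQL containers are up
docker compose ps

# Follow logs while diagnosing a problem
docker compose logs -f

# Upgrade: pull newer images, then recreate the containers
docker compose pull
docker compose up -d
```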

Option 2: Standalone Binary

A single self-contained binary with the frontend, AI model, and server built in. Uses SQLite — no external database needed.

# Download for your platform
$ curl -L https://github.com/kyomi-ai/kyomi/releases/latest/download/kyomi-linux-amd64.tar.gz | tar xz

# Set your LLM API key and run
$ export LLM_PROVIDER=anthropic
$ export LLM_API_KEY=sk-ant-...
$ ./kyomi

Open http://localhost:3000 in your browser. Data is stored in ./data/ by default.

Downloads

Pre-built binaries for every major platform. All releases are available on GitHub Releases.

Platform   Architecture                 Download
Linux      x86_64 (amd64)               tar.gz
Linux      ARM64 (aarch64)              tar.gz
macOS      Apple Silicon (arm64)        tar.gz
macOS      Intel (amd64)                tar.gz
Docker     Multi-arch (amd64 + arm64)   ghcr.io/kyomi-ai/kyomi
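The quick start above downloads kyomi-linux-amd64; assuming the other artifacts follow the same kyomi-<os>-<arch> naming (only the Linux amd64 name is confirmed, the rest are inferred), the right URL can be derived from uname:

```shell
# Map this machine's OS/CPU to a release artifact name.
# Only kyomi-linux-amd64 appears in the docs; the other names are assumptions.
case "$(uname -s)-$(uname -m)" in
  Linux-x86_64)   target=kyomi-linux-amd64 ;;
  Linux-aarch64)  target=kyomi-linux-arm64 ;;
  Darwin-arm64)   target=kyomi-darwin-arm64 ;;
  Darwin-x86_64)  target=kyomi-darwin-amd64 ;;
  *) echo "unsupported platform" >&2; exit 1 ;;
esac

# GitHub's "latest" redirect always points at the newest release.
url="https://github.com/kyomi-ai/kyomi/releases/latest/download/${target}.tar.gz"
echo "$url"
```

Piping that URL through `curl -L ... | tar xz` reproduces the quick-start download on any of the listed platforms.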

System Requirements

Standalone Binary

  • 2 GB RAM minimum
  • 1 GB disk space
  • An LLM API key
  • Linux (glibc) or macOS
  • No external database needed (uses SQLite)

Docker Compose

  • 4 GB RAM minimum
  • Docker with Compose plugin
  • An LLM API key
  • Uses PostgreSQL (included in compose)
  • Better for production and multi-user

Self-Hosted vs Cloud

Feature              Self-Hosted                Cloud
AI Chat & Analysis   Your LLM key, unlimited    Included (per-tier budget)
Dashboards           Unlimited                  Per plan
Datasources          All 9 supported            All 9 supported
SQL Editor           Full                       Full
Forecasting          Built-in                   Built-in
Kyomi Watch          Included                   Pro & Team
MCP Support          Included                   All plans
Users                Unlimited                  Per plan
Data Residency       Your infrastructure        Kyomi Cloud (AU)
Updates              Manual (upgrade script)    Automatic
Support              Community (GitHub)         Email / Priority
Cost                 Free (+ your LLM costs)    From $0/month

Supported LLM Providers

Self-hosted Kyomi works with any of these AI providers. You bring your own API key.

Anthropic

Claude 4, Claude 3.5 Sonnet

OpenAI

GPT-4o, GPT-4 Turbo

Google Gemini

Gemini 2.5 Pro, Flash

OpenAI-Compatible

Ollama, vLLM, LiteLLM, any compatible API
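For a local model, Ollama exposes an OpenAI-compatible endpoint at http://localhost:11434/v1. A sketch of wiring Kyomi to it: LLM_PROVIDER and LLM_API_KEY come from the quick start above, while LLM_BASE_URL and the provider value "openai" are hypothetical names used for illustration; check the configuration reference for the exact variables.

```shell
# Point Kyomi at a local Ollama server instead of a hosted provider.
# LLM_BASE_URL and the "openai" provider value are assumptions, not
# confirmed settings -- consult the configuration docs.
export LLM_PROVIDER=openai
export LLM_BASE_URL=http://localhost:11434/v1   # Ollama's OpenAI-compatible API
export LLM_API_KEY=ollama                       # Ollama ignores the key; any non-empty value works

# Then start Kyomi as in the quick start:  ./kyomi
```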

Try It Now

One command to install. Five minutes to your first insight.