Self-host Kyomi.

Your infrastructure. Your LLM key. Your data.

Coming soon under AGPL-3.0. The public release lands alongside Cloud GA.


What self-hosted Kyomi will be

When the public repo opens, you'll be able to run the full Kyomi platform on your own infrastructure — same product, same code, same features as Cloud. Nothing leaves your network. You pay your LLM provider directly; you pay us nothing.

  • AGPL-3.0 licensed — audit every line, fork it, contribute back
  • Bring your own LLM — Anthropic, OpenAI, Google Gemini, or any OpenAI-compatible endpoint (Ollama, vLLM, LiteLLM)
  • All 9 datasources — BigQuery, Snowflake, Postgres, MySQL, ClickHouse, Redshift, Databricks, SQL Server, Azure Synapse
  • Same product — AI chat, dashboards, Kyomi Watch, MCP, PDF export, website analytics
  • Deploy how you want — standalone binary (SQLite) or Docker Compose (Postgres). Your infrastructure, your call.
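As a sketch of what the Docker Compose path could look like once the repo opens — every image name, port, and environment variable below is an illustrative assumption, not Kyomi's published configuration:

```yaml
# Hypothetical docker-compose.yml for self-hosted Kyomi.
# Service names, image tags, and env vars are assumptions for illustration.
services:
  kyomi:
    image: ghcr.io/kyomi-ai/kyomi:latest   # assumed image location
    ports:
      - "3000:3000"
    environment:
      DATABASE_URL: postgres://kyomi:kyomi@db:5432/kyomi
      # Bring your own LLM: any OpenAI-compatible endpoint,
      # e.g. a local Ollama server on the same network.
      LLM_BASE_URL: http://ollama:11434/v1
      LLM_API_KEY: ${LLM_API_KEY}
    depends_on:
      - db
  db:
    image: postgres:16
    environment:
      POSTGRES_USER: kyomi
      POSTGRES_PASSWORD: kyomi
      POSTGRES_DB: kyomi
    volumes:
      - kyomi-db:/var/lib/postgresql/data
volumes:
  kyomi-db:
```

The same shape applies to the standalone-binary path, minus the `db` service: point the app at a local SQLite file instead of a Postgres URL.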

Available today: Kyomi Connect

If you need your database credentials to stay on-prem right now, Kyomi Connect is open source and ready to run today. It's the on-prem data gateway: your warehouse credentials never leave your network; only query results travel to Kyomi. Apache-2.0 licensed, and separate from the main Kyomi app.

Kyomi Connect on GitHub →


Until the public release lands

Three ways to stay close:

  • Watch the org. Follow github.com/kyomi-ai — when the main repo opens, you'll see it there first.
  • Run Connect today. If on-prem credential control is what you need now, Kyomi Connect solves that alone.
  • Use Cloud with a 30-day free trial. Same product, hosted. You can always migrate later.

Try Cloud while you wait.

30-day free trial. $5 per user, per month after. No credit card to start.

Start free trial