Data Sovereignty

Zero external calls — fully air-gapped

Set ALLOW_OLLAMA_IN_PRODUCTION=true and Kiedo replaces every cloud API with a locally running equivalent:

  • Embeddings: Ollama mxbai-embed-large replaces Voyage AI
  • LLM inference: phi4-mini, llama3.2:3b/1b, qwen3:4b, or ministral-3:3b
  • Vector store: PostgreSQL 16 + pgvector stays on your server
  • No Stripe required: Use manual or simulated billing on isolated networks
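As a sketch of what the embedding swap looks like in practice, the snippet below builds a request against Ollama's `/api/embeddings` endpoint (the endpoint and `{"model", "prompt"}` payload follow Ollama's public API; the helper names are illustrative, not Kiedo's actual client):

```python
import json
from urllib import request

OLLAMA_BASE_URL = "http://ollama:11434"  # matches OLLAMA_BASE_URL in .env.local

def build_embedding_request(text: str, model: str = "mxbai-embed-large") -> dict:
    """Build the JSON payload Ollama's /api/embeddings endpoint expects."""
    return {"model": model, "prompt": text}

def embed(text: str, base_url: str = OLLAMA_BASE_URL) -> list[float]:
    """POST the payload to a running Ollama instance and return the vector."""
    payload = json.dumps(build_embedding_request(text)).encode()
    req = request.Request(
        f"{base_url}/api/embeddings",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["embedding"]
```

No API key, no outbound traffic: the only network hop is to the `ollama` container on your own Docker network.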
Discuss deployment →
.env.local
# Enable fully air-gapped mode
ALLOW_OLLAMA_IN_PRODUCTION=true
OLLAMA_BASE_URL=http://ollama:11434
EMBEDDING_MODEL=mxbai-embed-large
LLM_MODEL=phi4-mini

# Vector store (stays on-prem)
POSTGRES_URL=postgresql://kiedo:password@db:5432/kiedo
REDIS_URL=redis://redis:6379/0

# Billing (simulated for air-gapped)
BILLING_PROVIDER=simulated
Docker Compose Stack

Seven services, one command

The entire Kiedo stack is defined in a single Docker Compose file — clone the repo, configure your .env.local, and run docker compose up -d. All services are health-checked and networked automatically.

nginx — Reverse Proxy
TLS termination, static file serving, upstream routing to FastAPI
FastAPI — API Server
Python 3, SQLAlchemy async, arq task workers — REST + WebSocket endpoints
PostgreSQL 16 + pgvector
Primary data store and vector index (HNSW, cosine distance)
Redis 7
Cache, pub/sub for WebSocket fanout, rate limiting, arq task queue
React Admin Panel
React 18 / Vite / Tailwind — served as a static bundle via nginx
Grafana
Dashboards provisioned on deploy — available at port 3001
Ollama (optional)
On-prem LLM and embedding server — activated by ALLOW_OLLAMA_IN_PRODUCTION=true
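An abridged sketch of what such a Compose file can look like — the service names, images, and healthchecks here are illustrative, not Kiedo's shipped file:

```yaml
services:
  db:
    image: pgvector/pgvector:pg16      # PostgreSQL 16 with the pgvector extension
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U kiedo"]
      interval: 10s
  redis:
    image: redis:7
  api:
    build: .
    depends_on:
      db:
        condition: service_healthy     # API waits for a healthy database
      redis:
        condition: service_started
  ollama:
    image: ollama/ollama
    profiles: ["airgapped"]            # started only when on-prem models are enabled
```

Putting Ollama behind a Compose profile is one way to keep the optional service out of the default `docker compose up -d`.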
# Clone and configure
git clone https://github.com/kiedo/kiedo.git && cd kiedo
cp .env.example .env.local
# Edit .env.local with your settings…

# Deploy all 7 services
docker compose up -d

# Check service health
curl http://localhost/health
# → {"db":"ok","redis":"ok"}

# Admin panel available at:
# http://localhost/admin
# Grafana dashboards at:
# http://localhost:3001
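The HNSW + cosine-distance index mentioned for the PostgreSQL service can be sketched in SQL — the table and column names below are illustrative, not Kiedo's actual schema:

```sql
-- Illustrative schema; Kiedo's real table names may differ.
CREATE EXTENSION IF NOT EXISTS vector;

CREATE TABLE IF NOT EXISTS chunks (
    id        bigserial PRIMARY KEY,
    content   text NOT NULL,
    embedding vector(1024)  -- mxbai-embed-large produces 1024-dim vectors
);

-- HNSW index using cosine distance, per the pgvector operator classes
CREATE INDEX ON chunks USING hnsw (embedding vector_cosine_ops);
```

Because both the table and the index live in the same PostgreSQL instance, the vector search path never leaves your server.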
Compliance & Residency

Built for regulated industries

Healthcare, fintech, and legal teams choose self-hosted Kiedo because it removes all third-party data exposure from the AI layer.

GDPR / Data Residency

All conversation data, embeddings, and customer records stay within your chosen jurisdiction. No EU data crosses to US-based LLM APIs.

HIPAA / PHI Isolation

Healthcare teams can process patient-related queries without any PHI leaving their private cloud. Argon2 password hashing plus encrypted secrets at rest throughout.

No Vendor Lock-In

Swap the LLM provider, embedding model, or billing provider at any time via environment variables. The business logic doesn't change — only the adapter does.
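A minimal sketch of that adapter pattern, assuming a hypothetical backend-selection function (the adapter table and function names are illustrative, not Kiedo's internals):

```python
import os

# Hypothetical model-to-backend table; real adapter modules are not shown here.
EMBEDDING_ADAPTERS = {
    "mxbai-embed-large": "ollama",
    "voyage-3": "voyage",
}

def select_embedding_backend(env: dict[str, str]) -> str:
    """Pick the embedding backend from environment variables alone."""
    if env.get("ALLOW_OLLAMA_IN_PRODUCTION") == "true":
        return "ollama"                # air-gapped mode forces the local adapter
    model = env.get("EMBEDDING_MODEL", "")
    return EMBEDDING_ADAPTERS.get(model, "voyage")

backend = select_embedding_backend(dict(os.environ))
```

Because the decision is made entirely from environment variables, switching providers is a config change and a restart — no code edits.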

Ready to deploy on your hardware?

Our team will scope your infrastructure requirements and provide a deployment guide tailored to your environment and compliance needs.

Talk to sales →
Security overview →