Your data. Your infrastructure. Your rules.
Deploy the complete Kiedo stack on any Linux server with Docker Compose. Enable fully air-gapped mode for zero external API calls — your knowledge base never leaves your network.
Zero external calls — fully air-gapped
Set ALLOW_OLLAMA_IN_PRODUCTION=true and Kiedo replaces every cloud API with a locally-running model:
- ✓ Embeddings: Ollama mxbai-embed-large replaces Voyage AI
- ✓ LLM inference: phi4-mini, llama3.2:3b/1b, qwen3:4b, or ministral-3:3b
- ✓ Vector store: PostgreSQL 16 + pgvector stays on your server
- ✓ No Stripe required: use manual or simulated billing on isolated networks
```
# Enable fully air-gapped mode
ALLOW_OLLAMA_IN_PRODUCTION=true
OLLAMA_BASE_URL=http://ollama:11434
EMBEDDING_MODEL=mxbai-embed-large
LLM_MODEL=phi4-mini

# Vector store (stays on-prem)
POSTGRES_URL=postgresql://kiedo:password@db:5432/kiedo
REDIS_URL=redis://redis:6379/0

# Billing (simulated for air-gapped)
BILLING_PROVIDER=simulated
```
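Before cutting off network access, it can help to sanity-check that the configuration really routes everything to internal hosts. A minimal sketch of such a check (the `parse_env` and `validate_airgap` helpers and their rules are illustrative, not part of Kiedo):

```python
import re

def parse_env(text: str) -> dict:
    """Parse KEY=VALUE lines from a .env-style string, skipping comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

def validate_airgap(env: dict) -> list:
    """Return a list of settings that would break air-gapped operation."""
    problems = []
    if env.get("ALLOW_OLLAMA_IN_PRODUCTION") != "true":
        problems.append("ALLOW_OLLAMA_IN_PRODUCTION must be 'true'")
    if env.get("BILLING_PROVIDER") not in ("simulated", "manual"):
        problems.append("billing must be simulated or manual off-network")
    # Every URL should point at an in-network service host, never the internet.
    internal = re.compile(r"^[a-z]+://([^@/]*@)?(localhost|ollama|db|redis)(:\d+)?(/|$)")
    for key, value in env.items():
        if "://" in value and not internal.match(value):
            problems.append(f"{key} points outside the network: {value}")
    return problems

config = """
ALLOW_OLLAMA_IN_PRODUCTION=true
OLLAMA_BASE_URL=http://ollama:11434
POSTGRES_URL=postgresql://kiedo:password@db:5432/kiedo
REDIS_URL=redis://redis:6379/0
BILLING_PROVIDER=simulated
"""
print(validate_airgap(parse_env(config)))  # → []
```

Run against a config that still references a cloud endpoint, the validator flags the offending key before you deploy.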
Seven services, one command
The entire Kiedo stack ships as a single Docker Compose file: clone the repo, configure your .env.local, and run docker compose up -d. All services are health-checked and networked automatically.
```
# Clone and configure
git clone https://github.com/kiedo/kiedo.git && cd kiedo
cp .env.example .env.local
# Edit .env.local with your settings…

# Deploy all 7 services
docker compose up -d

# Check service health
curl http://localhost/health
# → {"db":"ok","redis":"ok"}

# Admin panel available at:
# http://localhost/admin
# Grafana dashboards at:
# http://localhost:3001
```
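In automation, the curl check above is often wrapped in a readiness loop that waits for every service to report ok. A sketch, assuming the /health payload shape shown above (the retry parameters are arbitrary, and wait_for_health is a hypothetical helper, not a Kiedo CLI):

```python
import json
import time
import urllib.request

def all_healthy(payload: dict) -> bool:
    """True once every reported service status is 'ok'."""
    return bool(payload) and all(status == "ok" for status in payload.values())

def wait_for_health(url: str, attempts: int = 30, delay: float = 2.0) -> bool:
    """Poll the health endpoint until the whole stack reports healthy."""
    for _ in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                if all_healthy(json.load(resp)):
                    return True
        except OSError:
            pass  # stack may still be starting up; retry after a pause
        time.sleep(delay)
    return False

if __name__ == "__main__":
    ok = wait_for_health("http://localhost/health")
    print("healthy" if ok else "gave up waiting")
```

A deploy script can call this right after docker compose up -d and gate any smoke tests on its result.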
Built for regulated industries
Healthcare, fintech, and legal teams choose self-hosted Kiedo because it removes all third-party data exposure from the AI layer.
GDPR / Data Residency
All conversation data, embeddings, and customer records stay within your chosen jurisdiction. No EU data crosses to US-based LLM APIs.
HIPAA / PHI Isolation
Healthcare teams can process patient-related queries without any PHI leaving their private cloud. Argon2 password hashing plus secrets encrypted at rest throughout.
No Vendor Lock-In
Swap the LLM provider, embedding model, or billing provider at any time via environment variables. The business logic doesn't change — only the adapter does.
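The adapter idea can be sketched like this (the class names and the factory function are illustrative stand-ins, not Kiedo's actual internals; only the ALLOW_OLLAMA_IN_PRODUCTION flag comes from the configuration above):

```python
import os
from typing import Protocol

class EmbeddingProvider(Protocol):
    def embed(self, text: str) -> list[float]: ...

class OllamaEmbeddings:
    """Stand-in for the local adapter (would POST to OLLAMA_BASE_URL)."""
    name = "ollama"
    def embed(self, text: str) -> list[float]:
        raise NotImplementedError

class VoyageEmbeddings:
    """Stand-in for the cloud adapter (would call the Voyage AI API)."""
    name = "voyage"
    def embed(self, text: str) -> list[float]:
        raise NotImplementedError

def embedding_provider() -> EmbeddingProvider:
    """Pick the adapter from the environment; callers never change."""
    if os.environ.get("ALLOW_OLLAMA_IN_PRODUCTION") == "true":
        return OllamaEmbeddings()
    return VoyageEmbeddings()

os.environ["ALLOW_OLLAMA_IN_PRODUCTION"] = "true"
print(embedding_provider().name)  # → ollama
```

Because everything downstream depends only on the EmbeddingProvider interface, flipping the environment variable swaps the backend without touching business logic.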