# Get started
## Set up global config
Configure your LLM provider and defaults with `dorgu init --global` (optional but recommended). This prompts for:

- LLM provider — `openai`, `anthropic`, `gemini`, or `ollama`
- API key — your provider API key
- Default namespace — Kubernetes namespace (e.g., `default`)
- Container registry — e.g., `ghcr.io/my-org`
- Organization name — for Kubernetes labels
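The saved file is plain YAML. A hypothetical example, with key names inferred from the prompts above (the actual schema may differ):

```yaml
# ~/.config/dorgu/config.yaml (illustrative; real key names may differ)
llm_provider: openai          # one of: openai, anthropic, gemini, ollama
api_key: <your-provider-key>
namespace: default            # default Kubernetes namespace
registry: ghcr.io/my-org      # container registry for built images
organization: my-org          # used for Kubernetes labels
```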
Settings are saved to `~/.config/dorgu/config.yaml`.

## Initialize your app
Navigate to your application directory (it must contain a `Dockerfile` or `docker-compose.yml`) and initialize the app. This creates a `.dorgu.yaml` file with the app name, team, repository (auto-detected from git), and other metadata.

## Preview with dry run

Print generated manifests to stdout without writing any files, so you can review them first.

## Use LLM-enhanced analysis

For deeper analysis (framework-specific recommendations, resource sizing, security practices), run generation with LLM analysis enabled. This requires an API key, configured either via `dorgu init --global` or the `OPENAI_API_KEY` environment variable. See LLM Providers for all supported providers.
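Taken together, the steps above might look like the following session. Only `dorgu init --global` and `dorgu generate` are confirmed command names on this page; the per-app `dorgu init` form and the `--dry-run` and `--llm` flags are assumptions for illustration, so check the Command Reference for the real spellings.

```shell
# Run from the directory containing your Dockerfile or docker-compose.yml.
cd my-app
dorgu init                # assumed: writes .dorgu.yaml (app name, team, repo)
dorgu generate --dry-run  # hypothetical flag: print manifests, write nothing
dorgu generate --llm      # hypothetical flag: LLM-enhanced analysis
```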
## Output walkthrough
| File | Description |
|---|---|
| `k8s/deployment.yaml` | Kubernetes Deployment with resource limits, health probes, security context |
| `k8s/service.yaml` | ClusterIP Service mapping detected ports |
| `k8s/ingress.yaml` | Ingress with TLS via cert-manager (if configured) |
| `k8s/hpa.yaml` | HorizontalPodAutoscaler with CPU-based scaling |
| `k8s/argocd/application.yaml` | ArgoCD Application pointing to your repo |
| `.github/workflows/deploy.yaml` | GitHub Actions CI/CD pipeline (build, push, deploy) |
| `PERSONA.md` | Human-readable persona document describing your app |
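The generated manifests are standard Kubernetes YAML, so you can sanity-check them with ordinary tooling before committing. This is a generic `kubectl` invocation, not a dorgu feature:

```shell
# Client-side validation of the generated manifests; nothing is applied.
kubectl apply --dry-run=client -f k8s/
```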
## Next steps

- Command Reference: see all flags and options for `dorgu generate`.
- Configuration: customize manifests with layered configuration.
- Cluster Setup: bootstrap a production-ready Kubernetes stack.
- Working with Personas: apply ApplicationPersona CRDs to your cluster.