Configuration Overview

Dorgu uses a layered configuration system that lets you set defaults globally, override them per workspace, and fine-tune per application. Higher-priority sources override lower ones on a per-key basis.

Priority Order

Configuration is resolved from six sources, highest priority first:
| Priority | Source | Description |
|---|---|---|
| 1 (highest) | CLI flags | `--namespace`, `--llm-provider`, etc. |
| 2 | App `.dorgu.yaml` | Config file in the application directory being analyzed |
| 3 | Workspace `.dorgu.yaml` | Config file in the current working directory |
| 4 | Global config | `~/.config/dorgu/config.yaml` |
| 5 | Environment variables | `OPENAI_API_KEY`, `KUBECONFIG`, etc. |
| 6 (lowest) | Built-in defaults | Sensible defaults shipped with Dorgu |

Config File Locations

| File | Location | Created by |
|---|---|---|
| App config | `<app-dir>/.dorgu.yaml` | `dorgu init` |
| Workspace config | `<cwd>/.dorgu.yaml` | `dorgu init` |
| Global config | `~/.config/dorgu/config.yaml` | `dorgu init --global` |
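A minimal `.dorgu.yaml` might look like the sketch below. The `llm.provider` key and the list of providers come from elsewhere on this page; the top-level `namespace` key is an assumption inferred from the `--namespace` CLI flag:

```yaml
# <app-dir>/.dorgu.yaml or <cwd>/.dorgu.yaml
namespace: staging    # assumed key, mirroring the --namespace CLI flag
llm:
  provider: openai    # one of: openai, anthropic, gemini, ollama
```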

How Merging Works

Dorgu reads every available configuration source and merges the results key by key: a higher-priority source overrides a lower one only for the keys it actually sets, never as a wholesale replacement. This lets you set organization-wide defaults in the global config and override specific values (such as resource limits or the LLM provider) at the workspace or app level. For example, if your global config sets llm.provider: openai and your app .dorgu.yaml sets llm.provider: anthropic, the app-level value wins, while all other global settings still apply.
CLI flags always take the highest priority. Use them for one-off overrides without modifying any config file.
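The per-key merge can be sketched as two config fragments. The `llm.provider` values come from the example above; the `llm.model` key is hypothetical, added only to show that unrelated keys survive the merge:

```yaml
# ~/.config/dorgu/config.yaml — global defaults
llm:
  provider: openai
  model: gpt-4o        # hypothetical key, shown only to illustrate merging

# <app-dir>/.dorgu.yaml — app-level override
llm:
  provider: anthropic  # overrides only this key for this app
```

With both files present, the effective config for the app resolves `llm.provider` to `anthropic` while the (hypothetical) `llm.model` value still comes from the global file.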

Next Steps

- App Configuration: `.dorgu.yaml` reference for workspace and application config
- Global Configuration: `~/.config/dorgu/config.yaml` reference
- LLM Providers: configure OpenAI, Anthropic, Gemini, or Ollama
- Environment Variables: all environment variables recognized by Dorgu