LLM Providers
Dorgu can use large language models (LLMs) to enhance its application analysis. LLM integration provides deeper code understanding, framework-specific recommendations, smarter resource sizing, and security best practices tailored to your stack.

LLM enhancement is optional. Dorgu's static analysis and manifest generation work without any LLM provider configured; the LLM layer adds refinements on top of the base analysis.
Supported Providers
| Provider | Models | Env Variable | Default Model |
|---|---|---|---|
| OpenAI | gpt-4, gpt-3.5-turbo | OPENAI_API_KEY | gpt-4 |
| Anthropic | claude-3-sonnet | ANTHROPIC_API_KEY | claude-3-sonnet |
| Gemini | gemini-1.5-pro | GEMINI_API_KEY or GOOGLE_API_KEY | gemini-1.5-pro |
| Ollama | any locally hosted model | OLLAMA_HOST (endpoint URL) | configurable |
What LLM Enhances
When an LLM provider is configured, Dorgu uses it to improve several areas of analysis:

- Deeper code analysis — understands business logic, not just file structure
- Framework-specific recommendations — tailored settings for Express, Spring Boot, Django, etc.
- Resource sizing — more accurate CPU/memory recommendations based on application patterns
- Security best practices — context-aware security recommendations beyond generic defaults
OpenAI
Setup
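The original setup snippet is not shown here; a minimal sketch, assuming the key is supplied via the `OPENAI_API_KEY` environment variable listed in the table above:

```shell
# Export your OpenAI API key so Dorgu can find it.
# The environment variable is the first source Dorgu checks
# (see "API Key Resolution Order" below).
export OPENAI_API_KEY="sk-..."
```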
Usage
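A hedged example invocation. The `analyze` subcommand and `--llm-provider` flag are assumptions for illustration, not confirmed Dorgu CLI syntax:

```shell
# Hypothetical: run Dorgu's analysis with OpenAI as the LLM provider.
# With OPENAI_API_KEY exported, the default model (gpt-4) is used.
dorgu analyze --llm-provider openai ./my-app
```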
Model Override
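A sketch of overriding the default `gpt-4` model; the `--llm-model` flag is a hypothetical name, not confirmed by this page:

```shell
# Hypothetical: select gpt-3.5-turbo instead of the default gpt-4.
dorgu analyze --llm-provider openai --llm-model gpt-3.5-turbo ./my-app
```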
Anthropic
Setup
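A minimal sketch, assuming the key is supplied via the `ANTHROPIC_API_KEY` environment variable from the table above:

```shell
# Export your Anthropic API key so Dorgu can find it.
export ANTHROPIC_API_KEY="sk-ant-..."
```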
Usage
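A hedged example; the `analyze` subcommand and `--llm-provider` flag are assumptions for illustration:

```shell
# Hypothetical: run analysis with Anthropic (default model claude-3-sonnet).
dorgu analyze --llm-provider anthropic ./my-app
```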
Gemini
Setup
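A minimal sketch. Per the table above, either `GEMINI_API_KEY` or `GOOGLE_API_KEY` is accepted:

```shell
# Export a Gemini key; GOOGLE_API_KEY works as an alternative.
export GEMINI_API_KEY="..."
# or: export GOOGLE_API_KEY="..."
```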
Usage
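A hedged example; the `analyze` subcommand and `--llm-provider` flag are assumptions for illustration:

```shell
# Hypothetical: run analysis with Gemini (default model gemini-1.5-pro).
dorgu analyze --llm-provider gemini ./my-app
```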
Ollama
Ollama lets you run LLMs locally without an API key. You need a running Ollama instance with at least one model pulled.

Prerequisites
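For example, after installing Ollama, start the server and pull a model (`llama3` here is just an example; any locally hosted model works):

```shell
# Start the Ollama server (the installer usually sets this up as a service).
ollama serve &

# Pull at least one model for Dorgu to use.
ollama pull llama3
```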
Setup
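Ollama's server listens on `http://localhost:11434` by default, so this step is only needed for a non-default address:

```shell
# Only required when Ollama is not running on the default
# http://localhost:11434 — the address below is an example.
export OLLAMA_HOST="http://192.168.1.50:11434"
```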
Usage
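A hedged example; the `analyze` subcommand and `--llm-provider` flag are assumptions for illustration:

```shell
# Hypothetical: run analysis against a local Ollama instance. No API key
# is required; OLLAMA_HOST is read only if set.
dorgu analyze --llm-provider ollama ./my-app
```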
Ollama does not require an API key. The `OLLAMA_HOST` variable is only needed if your Ollama instance runs on a non-default address.

API Key Resolution Order
When Dorgu needs an API key, it checks the following sources in order:

1. Environment variable — `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, `GEMINI_API_KEY`, or `GOOGLE_API_KEY`
2. Global config — `llm.api_key` in `~/.config/dorgu/config.yaml`
3. Interactive prompt — if running in a terminal and no key is found, Dorgu prompts you to enter one
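The global-config option can be sketched as below. The nested `llm:`/`api_key:` layout is an assumption inferred from the dotted `llm.api_key` path; check your actual config file's structure:

```shell
# Hypothetical: store the key in the global config instead of the environment.
mkdir -p ~/.config/dorgu
cat >> ~/.config/dorgu/config.yaml <<'EOF'
llm:
  api_key: sk-...
EOF
```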