# Ollama Provider
Run analysis and queries using local Ollama models. No API key required.
## Configuration
```json
{
  "providers": {
    "ollama": {
      "type": "ollama",
      "model": "llama3.1",
      "baseUrl": "http://localhost:11434/v1"
    }
  }
}
```

## Options
| Field | Default | Description |
|---|---|---|
| `model` | — | Name of the Ollama model to use |
| `baseUrl` | `http://localhost:11434/v1` | Base URL of the Ollama API |
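
Since `baseUrl` is configurable, the provider does not have to run on the same machine as Ollama. As a sketch (the hostname below is a placeholder, not part of this tool's defaults), a config pointing at a remote Ollama instance might look like:

```json
{
  "providers": {
    "ollama": {
      "type": "ollama",
      "model": "llama3.1",
      "baseUrl": "http://gpu-box.internal:11434/v1"
    }
  }
}
```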
## Setup
- Install Ollama from [ollama.ai](https://ollama.ai)
- Pull a model:

  ```shell
  ollama pull llama3.1
  ```

- Ollama runs automatically on port 11434
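
To see what the provider is doing under the hood: the `baseUrl` above is Ollama's OpenAI-compatible endpoint, so requests go to `POST <baseUrl>/chat/completions`. The sketch below (not part of this tool, standard library only) shows roughly what such a request looks like; the actual network call is commented out since it needs a running Ollama server.

```python
import json
import urllib.request

BASE_URL = "http://localhost:11434/v1"  # default Ollama OpenAI-compatible endpoint


def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a chat-completions request against an OpenAI-compatible API."""
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        base_url.rstrip("/") + "/chat/completions",
        data=payload,  # presence of a body makes this a POST
        headers={"Content-Type": "application/json"},
    )


req = build_chat_request(BASE_URL, "llama3.1", "Say hello")
# with urllib.request.urlopen(req) as resp:  # requires Ollama running locally
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

No API key header is needed, which is why the provider configuration has no key field.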