Ollama Provider

Run analysis and queries using local Ollama models. No API key required.

Configuration

{
  "providers": {
    "ollama": {
      "type": "ollama",
      "model": "llama3.1",
      "baseUrl": "http://localhost:11434/v1"
    }
  }
}

Options

Field     Default                      Description
model     (none)                       Ollama model name
baseUrl   http://localhost:11434/v1    Ollama API URL
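If Ollama is running on a different machine, only baseUrl needs to change. A sketch (the host address below is a placeholder for your own server):

```json
{
  "providers": {
    "ollama": {
      "type": "ollama",
      "model": "llama3.1",
      "baseUrl": "http://192.168.1.50:11434/v1"
    }
  }
}
```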

Setup

  1. Install Ollama from ollama.ai
  2. Pull a model: ollama pull llama3.1
  3. Ollama's server starts automatically and listens on port 11434
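The setup above can be spot-checked from a terminal using Ollama's OpenAI-compatible endpoints. The model name matches the pull command and the URL matches the default baseUrl:

```shell
# List the models available through the OpenAI-compatible API
curl http://localhost:11434/v1/models

# Send a minimal chat completion to the same endpoint
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3.1",
    "messages": [{"role": "user", "content": "Say hello"}]
  }'
```

If both requests return JSON rather than a connection error, the provider configuration should work as-is.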