OpenAI-Compatible Provider

Connect to any API that implements the OpenAI Chat Completions format.

Configuration

{
  "providers": {
    "local-llm": {
      "type": "openai-compatible",
      "model": "my-model",
      "baseUrl": "http://localhost:8000/v1",
      "apiKeyEnv": "LOCAL_LLM_KEY"
    }
  }
}

Options

Field      Default                    Description
model      —                          Model identifier
baseUrl    http://localhost:8000/v1   API base URL
apiKeyEnv  —                          Optional environment variable holding the API key
headers    —                          Additional HTTP headers sent with each request
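The headers option is useful for gateways or proxies that expect extra request metadata. For example (the provider name, URL, and header name here are placeholders):

```json
{
  "providers": {
    "gateway-llm": {
      "type": "openai-compatible",
      "model": "my-model",
      "baseUrl": "https://gateway.example.com/v1",
      "apiKeyEnv": "GATEWAY_KEY",
      "headers": {
        "X-Custom-Header": "value"
      }
    }
  }
}
```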

Compatible Services

This provider works with any service that implements the OpenAI Chat Completions API, including LM Studio, vLLM, text-generation-webui, and more.
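For instance, LM Studio's local server listens on port 1234 by default, so pointing this provider at it typically only requires changing baseUrl (no API key is needed for a local instance):

```json
{
  "providers": {
    "lm-studio": {
      "type": "openai-compatible",
      "model": "my-model",
      "baseUrl": "http://localhost:1234/v1"
    }
  }
}
```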