# Together Provider
Together is exposed as a named preset over SwarmVault's OpenAI-compatible adapter. Together hosts a wide range of open-weight models with an OpenAI-compatible API.
## Configuration
```json
{
  "providers": {
    "primary": {
      "type": "together",
      "model": "meta-llama/Llama-3.3-70B-Instruct-Turbo"
    }
  }
}
```

## Options
| Field | Default | Description |
|---|---|---|
| `model` | -- | Model ID in `org/model` format (e.g., `meta-llama/Llama-3.3-70B-Instruct-Turbo`) |
| `apiKeyEnv` | `"TOGETHER_API_KEY"` | Environment variable holding the API key |
| `baseUrl` | `https://api.together.xyz/v1` | API base URL |
| `apiStyle` | `"chat"` | API style (`chat` for Chat Completions) |
| `capabilities` | -- | Override auto-detected capabilities |
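The optional fields can also be set explicitly. This sketch simply spells out the documented defaults from the table above (`capabilities` is omitted because its override shape is not documented here; the provider name `primary` is arbitrary):

```json
{
  "providers": {
    "primary": {
      "type": "together",
      "model": "meta-llama/Llama-3.3-70B-Instruct-Turbo",
      "apiKeyEnv": "TOGETHER_API_KEY",
      "baseUrl": "https://api.together.xyz/v1",
      "apiStyle": "chat"
    }
  }
}
```

Setting `baseUrl` explicitly is mainly useful if you route Together traffic through a proxy or gateway; otherwise the defaults are sufficient.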
## Environment Variable
```bash
export TOGETHER_API_KEY=...
```

## Notes
- **Model naming:** Together uses `org/model` format for model IDs (e.g., `meta-llama/Llama-3.3-70B-Instruct-Turbo`). Check the Together docs for the full model catalog.
- **Turbo variants:** Together offers optimized "Turbo" variants of popular models that provide faster inference at comparable quality.
- **Embedding models:** Together also hosts embedding models. If you configure an embedding-capable model, it can serve as your `embeddingProvider` for semantic graph queries.
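As a sketch of the embedding setup, assuming a second provider entry can be referenced by name from a top-level `embeddingProvider` key (the exact wiring may differ in your SwarmVault version, and the model ID shown is one example from Together's catalog, not a recommendation):

```json
{
  "providers": {
    "primary": {
      "type": "together",
      "model": "meta-llama/Llama-3.3-70B-Instruct-Turbo"
    },
    "embeddings": {
      "type": "together",
      "model": "togethercomputer/m2-bert-80M-8k-retrieval"
    }
  },
  "embeddingProvider": "embeddings"
}
```

Both entries share the same `TOGETHER_API_KEY` default, so no extra credentials are needed for the embedding provider.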