# SwarmVault
SwarmVault is a local-first LLM knowledge compiler.
It ingests raw research inputs, stores them immutably, and compiles them into a persistent markdown wiki, a structured graph, and a local search index. Instead of losing work inside chat history, the vault becomes the durable artifact.
## What You Get

- **Local-first artifacts** - keep sources, wiki pages, graph data, and search state on disk
- **Schema-guided compilation** - teach each vault its own naming, categorization, and grounding rules with `swarmvault.schema.md`
- **Markdown + graph outputs** - generate readable wiki pages and structured `graph.json`
- **Pluggable providers** - use OpenAI, Anthropic, Gemini, Ollama, generic OpenAI-compatible APIs, custom adapters, or the built-in heuristic
- **Automation loop** - import from `inbox/`, watch for changes, and keep run logs in `state/jobs.ndjson`
- **MCP access** - expose the vault to compatible agents with `swarmvault mcp`
- **Agent-ready** - install rules for Codex, Claude, and Cursor
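These artifacts live as plain files on disk. The sketch below assembles one plausible vault layout from the paths named in this README; any path not named above (such as `sources/` and the root directory name) is an assumption, and the real tree may differ:

```
my-vault/
├── swarmvault.schema.md   # naming, categorization, and grounding rules
├── inbox/                 # drop files here for inbox import
├── sources/               # immutable ingested sources (assumed name)
├── wiki/
│   └── outputs/           # saved query answers
├── graph.json             # structured graph output
└── state/
    └── jobs.ndjson        # run logs
```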
## Install

SwarmVault requires Node >= 24.

```shell
npm install -g @swarmvaultai/cli
```

## Core Workflow
- **Ingest** a file path or URL into immutable source storage
- **Shape the vault** with `swarmvault.schema.md`
- **Compile** source manifests into wiki pages, graph data, and local search
- **Query** the compiled vault and save useful answers back into `wiki/outputs/`
- **Automate** capture workflows with `inbox import`, `watch`, and MCP
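Since `swarmvault.schema.md` is itself a markdown file, its rules read as prose the compiler follows. The sketch below is purely hypothetical: every heading and rule is invented for illustration and is not taken from SwarmVault's actual schema format.

```markdown
## Naming
- Title wiki pages as singular nouns, e.g. "Vector Index", not "Vector Indexes".

## Categorization
- File every page under exactly one of: Concepts, Tools, Papers, Notes.

## Grounding
- Every claim on a wiki page must cite at least one ingested source manifest.
```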
## Quick Start

```shell
swarmvault init
swarmvault ingest ./my-document.pdf
swarmvault compile
swarmvault query "What are the key concepts?" --save
swarmvault graph serve
```

## Why SwarmVault
Most "chat with your docs" tools answer a question and discard the work. SwarmVault keeps the work. Source manifests, schema rules, markdown pages, query outputs, graph edges, and freshness state all remain as inspectable files you can diff, search, and reuse.
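Because the compiled artifacts are ordinary files, standard tooling applies without any SwarmVault command. A minimal sketch using a throwaway directory rather than a real vault (`demo-vault` and the page contents are invented; only the `wiki/` layout mirrors this README):

```shell
# Simulate a compiled vault containing one wiki page (illustrative only).
mkdir -p demo-vault/wiki
printf '# Key Concepts\n\nCompiled from source manifests.\n' > demo-vault/wiki/concepts.md

# Plain markdown means plain tools: grep, diff, and git all work directly.
grep -rl "Key Concepts" demo-vault/wiki
```

The same applies to `graph.json` and `state/jobs.ndjson`: they can be versioned, diffed, and queried with whatever tools you already use.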