Compilation Pipeline
The swarmvault compile command runs a multi-step pipeline that transforms raw sources into structured knowledge.
Steps
1. Load Sources
Reads all source manifests from state/manifests/ and existing analyses from state/analyses/.
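The loading step can be sketched as a pair of directory scans. This is a minimal sketch, assuming manifests and analyses are stored as one JSON file each; the `load_state` helper and filename-stem keying are illustrative, not the actual implementation.

```python
import json
from pathlib import Path

def load_state(state_dir: str = "state") -> tuple[dict, dict]:
    """Load source manifests and any existing analyses, keyed by filename stem.

    Assumes one JSON document per file under state/manifests/ and
    state/analyses/ (a hypothetical layout for illustration).
    """
    manifests, analyses = {}, {}
    for path in Path(state_dir, "manifests").glob("*.json"):
        manifests[path.stem] = json.loads(path.read_text())
    for path in Path(state_dir, "analyses").glob("*.json"):
        analyses[path.stem] = json.loads(path.read_text())
    return manifests, analyses
```

Keying both maps the same way makes the later "new or changed" comparison a simple dictionary lookup per source.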
2. Analyze Sources
For each new or changed source, the configured compileProvider extracts:
- Concepts (max 12) — Key ideas and topics with descriptions
- Entities (max 12) — Named things (people, tools, organizations) with descriptions
- Claims (max 8) — Factual assertions with confidence scores and polarity (positive/negative/neutral)
- Questions (max 6) — Questions the source raises or answers
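The extraction output above can be modeled as a small record type. This is a sketch of the shape only; the `SourceAnalysis` and `Claim` names, the `clamp` helper, and the field types are assumptions, with the per-category caps taken from the limits listed above.

```python
from dataclasses import dataclass, field

# Caps per category, as documented: 12 concepts, 12 entities, 8 claims, 6 questions.
LIMITS = {"concepts": 12, "entities": 12, "claims": 8, "questions": 6}

@dataclass
class Claim:
    text: str
    confidence: float  # 0.0-1.0
    polarity: str      # "positive" | "negative" | "neutral"

@dataclass
class SourceAnalysis:
    source_id: str
    concepts: list = field(default_factory=list)
    entities: list = field(default_factory=list)
    claims: list = field(default_factory=list)
    questions: list = field(default_factory=list)

    def clamp(self) -> "SourceAnalysis":
        """Enforce the per-category caps on whatever the provider returned."""
        for name, cap in LIMITS.items():
            setattr(self, name, getattr(self, name)[:cap])
        return self
```

Clamping after extraction keeps downstream graph building bounded even if the provider over-produces.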
3. Build Knowledge Graph
Merges all analyses into a unified graph:
- Nodes — Sources, concepts, and entities with freshness tracking
- Edges — Relationships between nodes with claim status (extracted, inferred, conflicted, stale)
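A merge along these lines could look as follows. This is a rough sketch under stated assumptions: node ids are namespaced strings, concepts and entities are deduplicated by lowercased name, and every source-to-node edge starts in the `extracted` status; the real merge (and how `inferred`, `conflicted`, and `stale` statuses are assigned) is not shown here.

```python
def build_graph(analyses: dict) -> dict:
    """Merge per-source analyses into one graph of nodes and edges (sketch)."""
    nodes, edges = {}, []
    for source_id, analysis in analyses.items():
        nodes[f"source:{source_id}"] = {"kind": "source", "fresh": True}
        for concept in analysis.get("concepts", []):
            node_id = f"concept:{concept['name'].lower()}"
            # setdefault dedupes: the same concept from two sources is one node.
            nodes.setdefault(node_id, {"kind": "concept", "fresh": True})
            edges.append({"from": f"source:{source_id}", "to": node_id,
                          "status": "extracted"})
        for entity in analysis.get("entities", []):
            node_id = f"entity:{entity['name'].lower()}"
            nodes.setdefault(node_id, {"kind": "entity", "fresh": True})
            edges.append({"from": f"source:{source_id}", "to": node_id,
                          "status": "extracted"})
    return {"nodes": nodes, "edges": edges}
```

The dedup-by-name step is what makes the graph "unified": two sources that mention the same concept share one node with two incoming edges.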
4. Generate Wiki Pages
Creates Markdown pages from the graph:
- wiki/index.md — Home page with overview
- wiki/sources/ — One page per ingested source
- wiki/concepts/ — One page per extracted concept
- wiki/entities/ — One page per extracted entity
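Page generation can be sketched as one Markdown file per graph node plus an index. The `write_wiki` helper and the page bodies are illustrative; only the output layout (index page plus sources/, concepts/, and entities/ folders) comes from the documentation above.

```python
from pathlib import Path

def write_wiki(graph: dict, out_dir: str = "wiki") -> list[str]:
    """Render one Markdown page per graph node, plus wiki/index.md (sketch)."""
    root = Path(out_dir)
    folders = {"source": root / "sources",
               "concept": root / "concepts",
               "entity": root / "entities"}
    written = []
    for node_id, node in graph["nodes"].items():
        kind, _, name = node_id.partition(":")
        folder = folders[kind]
        folder.mkdir(parents=True, exist_ok=True)
        page = folder / f"{name.replace('/', '-')}.md"
        page.write_text(f"# {name}\n\nKind: {kind}\n")
        written.append(str(page))
    root.mkdir(parents=True, exist_ok=True)
    (root / "index.md").write_text(
        "# Knowledge Base\n\n" + "\n".join(f"- {p}" for p in sorted(written)) + "\n")
    return written
```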
5. Build Search Index
Rebuilds the SQLite FTS index over all wiki pages for fast full-text search.
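A rebuild-from-scratch FTS index can be sketched with SQLite's FTS5 extension (the documentation confirms SQLite FTS; the table name, schema, and `build_fts_index`/`search` helpers here are assumptions).

```python
import sqlite3

def build_fts_index(db_path: str, pages: dict) -> None:
    """Drop and rebuild a full-text index over wiki pages using SQLite FTS5."""
    conn = sqlite3.connect(db_path)
    conn.execute("DROP TABLE IF EXISTS pages_fts")
    conn.execute("CREATE VIRTUAL TABLE pages_fts USING fts5(path, body)")
    conn.executemany("INSERT INTO pages_fts (path, body) VALUES (?, ?)",
                     pages.items())
    conn.commit()
    conn.close()

def search(db_path: str, query: str) -> list[str]:
    """Return page paths matching an FTS5 query, best match first."""
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT path FROM pages_fts WHERE pages_fts MATCH ? ORDER BY rank",
        (query,)).fetchall()
    conn.close()
    return [r[0] for r in rows]
```

Dropping and recreating the virtual table keeps the rebuild idempotent, which matches the "rebuilds the index over all wiki pages" behavior described above.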
Incremental Compilation
Each analysis records a content signature (a hash of the source content). If a source's current signature matches the stored one, its existing analysis is reused, avoiding redundant LLM API calls.
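The change check can be sketched as a hash comparison. SHA-256 and the `needs_analysis` helper are assumptions for illustration; the documentation only says a content signature is stored and compared.

```python
import hashlib
from typing import Optional

def content_signature(text: str) -> str:
    """Hash used to detect whether a source changed since the last compile."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def needs_analysis(source_text: str, cached: Optional[dict]) -> bool:
    """Skip the LLM call when a cached analysis matches the current content."""
    if cached is None:
        return True  # never analyzed before
    return cached.get("signature") != content_signature(source_text)
```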