Past two parallel sessions, the human stops being a strategic director and starts being a clipboard — manually ferrying decisions, specs, and context between terminal windows. Token costs spiral. Sessions drift. Mistakes compound.
Two parallel sessions are manageable. Three, and you're spending more time coordinating than directing. Four, and you're a full-time message bus.
Decisions made in session A need to reach session B. You copy-paste. You paraphrase. You forget. The information degrades every hop.
Each session cold-starts from its context window. No session knows what the others decided an hour ago. Contradictions ship. You find them in production.
Worse, each session re-derives from scratch what the others already worked out. The same approach gets reasoned into existence three times across three sessions. Multiply by every resume, every parallel branch, every week.
The other tools wrap one AI session in a visual workspace.
Vinculum sits underneath whatever clients you already run — every session reads what every other session decided, without you copy-pasting between windows.
One human. One chat. N workers. Zero clipboards.
You don't run Vinculum to do your AI work in. You run it so the sessions you already use share a coherent picture of what you're building and what's already been decided.
The role model wasn't designed upfront — it appeared within the first day of using the system. The names stuck because they described what was already happening.
Strategic intent, written in plain language to your chat session. “Spawn a couple workers — I want a new onboarding flow. Here's the brief.”
You never open a terminal. You never manage a git branch. You never relay context. You describe what you want and supervise the dashboard while it happens.
Tactical translation and worker direction, never implementation. Your chat session (claude.ai web, Windows app, CLI Claude in a terminal) writes directives to Vinculum, targeting the right workers on the right branches. It coordinates; it doesn't code.
The colonel's unique value is the gap between what you said and what the workers need to hear. It holds that gap without collapsing it.
Decomposers, auditors, committers. Lieutenants take a colonel-sized brief, break it into atomic grunt directives, dispatch the grunts, batch-review their work, and commit. Always Sonnet or Opus — the work needs judgment Haiku can't reliably give.
Real-military analogy: an LT leads a platoon.
Focused execution in one domain, writing back everything they decide. Two model tiers under one substrate role: Sergeants (Sonnet) handle judgment-heavy work and can peer-coordinate when an LT directs them. Privates (Haiku) handle mechanical execution.
Both write implementations, questions, and blockers back to the substrate so the colonel and the LT stay current without polling.
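As a concrete sketch of what that write-back could look like: the snippet below builds a small structured entry a worker might push to the substrate after finishing a task. The field names ("kind", "branch", "refs") are illustrative assumptions for this sketch, not Vinculum's documented schema.

```python
# Hypothetical sketch of a worker's write-back entry.
# Field names are assumptions, not Vinculum's actual schema.
import json
import time

def make_entry(author, kind, body, branch, refs=()):
    """Build a substrate entry a worker could write after finishing a task."""
    return {
        "author": author,      # session label of the writer
        "kind": kind,          # e.g. "decision", "question", "blocker"
        "body": body,          # plain-language content for other sessions
        "branch": branch,      # thread this entry belongs to
        "refs": list(refs),    # ids of earlier entries this builds on
        "ts": time.time(),     # write time, for ordering on resume
    }

entry = make_entry(
    author="builder-auth-flow",
    kind="decision",
    body="Using httponly session cookies; JWT ruled out for onboarding.",
    branch="onboarding",
)
print(json.dumps(entry, indent=2))
```

The point of the shape, whatever the real schema is: the colonel and the LT read these entries instead of polling the worker, and a resumed session reads them instead of re-deriving the decision.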
Call spawn_grunt and a worker process materializes on your host — stdin pipe open, focus declared, ready to work. No terminal ceremony. No manual session management.

spawn_grunt(role="grunt", model="claude-sonnet-4-6", session_label="builder-auth-flow")

The primitives for running parallel AI agents exist now. The substrate they share — the persistent memory underneath — is the empty lane.
Multi-agent coordination becomes a first-class API primitive. AI sessions can delegate work to other AI sessions. The orchestration substrate exists. The memory substrate doesn't.
Process and lifecycle management for long-running agents. Spinning up a worker on a remote host is a documented, supported operation. The spawn primitive is real. The memory those spawns share is still ephemeral.
Every competitor answered "how do I run AI sessions in a better wrapper?" Nobody answered "how do those sessions remember anything across the team's work?"
Intelligence features route through MCP sampling — your existing Claude subscription, no extra charges. Self-hosted stays free forever under AGPL v3.
The dashboard is the workspace, not a picture of it. Nodes are entries, branches are threads, edges are relations. Click a node to see its context. Filter by role, author, or time. Live via SSE — new entries appear as you work.
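The live feed rides on standard Server-Sent Events (text/event-stream), so any client can consume it. A minimal sketch of parsing such a stream — the event name "entry.created" is an illustrative assumption, not a documented Vinculum event type; only the wire format is standard:

```python
# Minimal SSE (text/event-stream) parser sketch.
# "entry.created" is a hypothetical event name for illustration.

def parse_sse(stream_text):
    """Split a raw SSE stream into (event, data) pairs."""
    events = []
    event, data = "message", []
    for line in stream_text.splitlines():
        if line.startswith("event:"):
            event = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data.append(line[len("data:"):].strip())
        elif line == "":  # a blank line terminates one event
            if data:
                events.append((event, "\n".join(data)))
            event, data = "message", []
    return events

raw = (
    "event: entry.created\n"
    'data: {"author": "builder-auth-flow", "kind": "decision"}\n'
    "\n"
)
print(parse_sse(raw))
```

A real client would read the stream incrementally over HTTP rather than from a string, but the framing — event lines, data lines, blank-line terminators — is the same.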
Vinculum scales from “I want my AI to remember what we were doing” to “I'm running a six-session parallel build and need a real coordination layer.”
You use Claude in chat. You want it to remember what you decided last session without you recapping every time. Vinculum is the memory layer. One session, one project, always caught up.
You already pay for Claude Max. You run two, four, six sessions in parallel. You're tired of being the message bus. You want to describe what needs doing and supervise a dashboard while workers execute.
Self-hosted is free and stays free. The hosted tier adds auth, background intelligence, and managed ops. Full pricing breakdown →
Self-host or 1-project hosted. AGPL v3, free forever (commercial license for enterprises).
Unlimited projects. Unlimited retention. Live intelligence.
Unlimited projects. Background intelligence on your Anthropic key. No ops.
Up to 10 members. Shared projects. Audit log.
Enterprise — SSO · on-prem · custom RBAC · SLA · Contact us →
Run uvx vinculum-mcp and you're up in about 60 seconds. The hosted service at vinculum.run runs the same code with managed auth, multi-tenant Postgres, and background intelligence on our infrastructure. Use it if you don't want to run a box; skip it if you do.

Self-hosted is free and always will be. The hosted tier starts at $5/mo and handles all the ops. Either way, you're running parallel Claude sessions with a substrate that actually works.
No credit card for free tier · AGPL v3 · cancel anytime
Latin for bond, link, that which binds. In mathematics, the bar over a repeating decimal — the mark that says these digits recur, indefinitely, as a unit. That is the architectural claim: a substrate that accumulates a project's intent across its entire life, holding parallel sessions together not by synchronizing them, but by giving them a single authoritative surface to read from and write to. The sessions are ephemeral. The graph is not.
— the substrate that persists —