
AI & MCP

OSS Release: Granary

Daniel Kovács

February 10, 2026 - 5 min read

Today we’re releasing Granary — an open source CLI tool that gives AI agents the shared memory and coordination layer they’ve been missing. If you’ve tried running multiple agents on a real codebase, you’ve probably hit a wall: agents lose context between sessions, duplicate each other’s work, or silently conflict. Granary fixes that.

Check it out on GitHub → 

Why we built Granary

At Speakeasy, we lean heavily on AI agents for everything from SDK generation to documentation. As we scaled up agentic workflows — running multiple agents in parallel across large codebases — a pattern kept emerging: agents don’t coordinate well on their own.

An agent finishes a task, but the next agent picking up work has no idea what was done. Two agents claim the same file and produce conflicting changes. A long-running migration stalls because there’s no way to checkpoint progress and resume. Context gets lost between sessions, and work gets repeated.

We needed infrastructure — not another prompt trick — to make multi-agent work reliable. Something that could track what each agent is doing, what context it has, and what’s left to do. That’s Granary.

Core capabilities

Granary is built around four phases of agentic work: plan, execute, coordinate, and hand off.

  • Session-centric context: Every agent run gets explicit context tracking, so nothing is lost between sessions
  • Peak context window efficiency: Granary structures and compresses context so models always operate within their optimal window — no wasted tokens, no degraded reasoning
  • Lossless planning: Agents can clear their context freely without losing critical information. Plans, decisions, and progress are persisted in Granary and always make it to the next agent
  • Concurrency safety: Task claiming with leases prevents multiple agents from colliding on the same work
  • LLM-native composite commands: Commands like plan, work, and initiate bundle multiple operations into single calls — fewer tool invocations, more targeted context delivered exactly when agents need it (see the sketch after this list)
  • Event-driven automation: Launch agents, capture feedback loops, run agentic tools — or anything else — in response to Granary state changes like task completion or context updates
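
As a rough sketch of what a composite call can look like from the shell (the argument to plan and its exact syntax are illustrative assumptions, not documented usage; --format prompt is the LLM-oriented output flag described under Architecture):

# Hypothetical composite call: create a plan, persist its context, and
# return LLM-ready output in a single invocation (argument syntax assumed)
granary plan "Migrate endpoints to v2" --format prompt

# Claim the next unblocked task under a lease so no other agent collides
# with it (shown without arguments, as in the workflows below)
granary work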

Getting started

Granary installs in seconds and initializes in your project directory:

# Install on macOS/Linux
curl -sSfL https://raw.githubusercontent.com/speakeasy-api/granary/main/scripts/install.sh | sh

# Initialize a workspace
granary init

# Kick off the agent in the background, watch progress in the foreground
claude -p "use granary to plan: Migrate endpoints to v2" &
granary summary --watch

How it works in practice

Here are some workflows where Granary makes a real difference:

Multi-agent code review: Spin up specialized agents for security, performance, and style review in parallel. Each agent claims its review tasks through Granary, works independently, and hands off findings without stepping on each other.
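
Reusing the background-agent pattern from the getting-started snippet, kicking that off might look like this (the prompts are illustrative; Granary’s task claiming keeps the reviewers from colliding):

# Three specialized reviewers working in parallel, each claiming its own
# review tasks through Granary
claude -p "use granary to claim and run the security review tasks" &
claude -p "use granary to claim and run the performance review tasks" &
claude -p "use granary to claim and run the style review tasks" &

# Follow overall progress from the foreground
granary summary --watch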

Codebase migrations: Break a large refactor into hundreds of granular tasks. Agents pick up work with granary work and checkpoint after each file, so progress is tracked across the entire migration. If an agent session ends, the next one picks up exactly where it left off.
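
For example (the prompt wording and migration name are illustrative), resuming is just starting another session; the checkpointed state in Granary carries the work forward:

# A session works through migration tasks, checkpointing after each file
claude -p "use granary to work through the v2 endpoint migration"

# If that session ends partway, a fresh session resumes from the last
# checkpoint instead of starting over
claude -p "use granary to work through the v2 endpoint migration"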

Feature development pipelines: Orchestrate a full cycle from planning through implementation, testing, and review. Agents checkpoint between stages, and handoff context packages ensure smooth transitions between phases.

Bug investigation: Structure a research-reproduce-fix-verify workflow with granary session and granary focus. The investigation context stays organized even as the agent explores different hypotheses.
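
A hedged sketch of kicking that off, again driving Granary through an agent prompt (the bug description is made up for illustration):

# Run the investigation in the background and watch progress in the foreground
claude -p "use granary session and granary focus to investigate: intermittent timeouts in the export job" &
granary summary --watch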

End-to-end issue automation: Connect external systems like Linear to Granary and let issues flow straight into agentic workflows. A new ticket triggers project creation, Granary runners expand the brief into rich context using capable LLMs, and agents pick up the work — implementing changes, reviewing each other’s output, and shipping PRs without human intervention. The entire lifecycle from issue to merged code runs autonomously.

Architecture

Granary is deliberately simple in its design:

  • Local-first: Everything is stored in SQLite on your machine. No network calls, no cloud dependency, no data leaving your environment.
  • Single binary: Written in Rust, Granary compiles to a single binary with no runtime dependencies.
  • Centralized storage, separate workspaces: The database lives in $HOME by default, keeping each workspace’s data separate while avoiding scattered config directories. For fully portable setups, workspaces can be stored locally and moved between machines with the migrate command.

All output is designed for both humans and machines. Run any command with --json for structured output, --format prompt for LLM-optimized context, or plain text for readable tables.
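
For example, with the summary command from the getting-started snippet:

# Human-readable output (default)
granary summary

# Structured output for scripts and CI
granary summary --json

# LLM-optimized context, ready to hand to an agent
granary summary --format prompt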

Workers and event automation

Granary includes an event-driven worker system for automating repetitive coordination. Workers subscribe to Granary events — like a task becoming unblocked — and can automatically spawn agent sessions in response.

This means you can set up pipelines where completing one task automatically triggers the next agent to start working, without manual intervention.

Open source and community-driven

Granary is fully open source under the MIT license. We built it because we needed it, and we think anyone running agentic workflows at scale needs something like it too.

The project is written in Rust with a clean architecture designed for contributors. Whether it’s new workflow patterns, integrations with other agent frameworks, or bug fixes — we’d love to see what the community builds on top of it.

Try it out

Granary is available now on GitHub and at granary.dev. Install it, run granary init, and see how much smoother your agentic workflows get when agents can actually share context and coordinate.

We’re excited to see how teams use Granary to push the boundaries of what’s possible with multi-agent development workflows.
