namzu.ai
Bahadır Arda · 3 min read

Introducing Namzu: an open-source agent kernel

Namzu is an open-source TypeScript agent kernel and SDK for building AI agents. Here is what it is, what it is not, and why we are giving it away.

announcement · agent-kernel

Namzu is an open-source TypeScript agent kernel and SDK for building AI agents. It owns the runtime layer beneath agent frameworks: process lifecycle, scheduling, memory, IPC, sandboxing, and checkpoint/resume. Today it ships as @namzu/sdk on npm, with provider packages for Anthropic, OpenAI, OpenRouter, Bedrock, Ollama, LM Studio, and a generic HTTP backend. The kernel is released under FSL-1.1-MIT.

This post is the short version of what Namzu is, what it isn't, and the constraints we built it under.

What an agent kernel does

A kernel owns the runtime. For an AI agent, that means six things:

  • Lifecycle. An agent is a process. It is created, paused, resumed, killed, and restored. The kernel owns those transitions.
  • Scheduling. When you have more than one agent, something has to decide who runs when. The kernel runs the scheduler.
  • Memory. State across turns, working memory, and durable conversation history are different problems. The kernel separates them.
  • IPC. Agents talk to each other (and to the host) through a defined interface. The kernel owns the transport.
  • Sandboxing. When an agent runs untrusted tool code, the kernel enforces the boundary at the OS level, not at the language level.
  • Checkpoint and resume. A long agent run should survive restarts. The kernel knows how to snapshot the live state and bring it back.
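The lifecycle and checkpoint/resume responsibilities above can be sketched as a small state machine. Everything here is illustrative only — the type names, methods, and states are assumptions for the sake of the sketch, not Namzu's actual API:

```typescript
// Illustrative sketch only -- these names are NOT the Namzu API.
// An agent is a process: created, running, paused, killed -- and the
// kernel owns those transitions, plus snapshotting live state.

type AgentState = "created" | "running" | "paused" | "killed";

interface Checkpoint {
  state: AgentState;
  turn: number;
  memory: Record<string, unknown>;
}

class AgentProcess {
  state: AgentState = "created";
  turn = 0;
  memory: Record<string, unknown> = {};

  start(): void {
    if (this.state !== "created" && this.state !== "paused") {
      throw new Error(`cannot start from ${this.state}`);
    }
    this.state = "running";
  }

  pause(): void {
    if (this.state !== "running") {
      throw new Error(`cannot pause from ${this.state}`);
    }
    this.state = "paused";
  }

  kill(): void {
    this.state = "killed";
  }

  // Snapshot the live state so a long run can survive a restart.
  checkpoint(): Checkpoint {
    return { state: this.state, turn: this.turn, memory: { ...this.memory } };
  }

  // Restore from a snapshot, e.g. after the host restarts. A process
  // that was running comes back paused, ready to be resumed.
  static restore(snap: Checkpoint): AgentProcess {
    const agent = new AgentProcess();
    agent.state = snap.state === "running" ? "paused" : snap.state;
    agent.turn = snap.turn;
    agent.memory = { ...snap.memory };
    return agent;
  }
}

// Run a couple of turns, snapshot, simulate a restart, and resume.
const agent = new AgentProcess();
agent.start();
agent.turn = 2;
agent.memory.goal = "summarise the report";
const snap = agent.checkpoint();

const restored = AgentProcess.restore(snap); // comes back paused
restored.start();                            // resumes where it left off
```

The point of the sketch is the division of labour: callers ask for transitions, but only the kernel decides whether a transition is legal and what survives a restart.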

Frameworks (LangGraph, CrewAI, Mastra) sit above the kernel and decide what an agent is for. The kernel decides how it runs.
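The scheduling responsibility is the easiest to make concrete. Here is one classic answer to "who runs when" — a cooperative round-robin scheduler. Again, the names are hypothetical and this is not Namzu's scheduler, just an instance of the kind of decision a kernel owns:

```typescript
// Illustrative only: a cooperative round-robin scheduler, one possible
// answer to "who runs when" when more than one agent wants the turn.

interface Runnable {
  id: string;
  // Run one step; return false when this agent has no work left.
  step(): boolean;
}

class RoundRobinScheduler {
  private queue: Runnable[] = [];
  readonly trace: string[] = []; // which agent ran on each tick

  add(agent: Runnable): void {
    this.queue.push(agent);
  }

  // Hand out single steps until every agent is done. An agent that
  // still has work goes to the back of the queue, so no one starves.
  runToCompletion(): void {
    while (this.queue.length > 0) {
      const agent = this.queue.shift()!;
      this.trace.push(agent.id);
      if (agent.step()) this.queue.push(agent);
    }
  }
}

// Two agents with different amounts of work interleave fairly.
const makeAgent = (id: string, steps: number): Runnable => {
  let left = steps;
  return { id, step: () => --left > 0 };
};

const sched = new RoundRobinScheduler();
sched.add(makeAgent("a", 3));
sched.add(makeAgent("b", 1));
sched.runToCompletion();
// sched.trace is ["a", "b", "a", "a"]
```

A framework above the kernel might still express priorities or crew structure; the kernel is where that policy becomes an actual execution order.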

What Namzu is not

It is not a hosted service. There is no cloud component, no control plane, no telemetry. You install it from npm and run it on your own infrastructure.

It is not a UI or an SDK for a UI. Streaming model output into a React chat UI is the Vercel AI SDK's job, and it does it well.

It is not opinionated about composition. If you want a crew-of-agents abstraction, CrewAI ships one. If you want an explicit state graph, LangGraph is purpose-built. Namzu is the kernel underneath whichever of those you pick.

It is not tied to one model vendor. The core exposes a narrow LLMProvider interface; the providers live in sibling packages so the core never imports a vendor SDK. No model is a citizen of the kernel.
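The design constraint — a narrow interface in the core, vendor SDKs only in sibling packages — can be sketched in a few lines. The interface below is an assumption for illustration, not the actual LLMProvider definition:

```typescript
// Illustrative sketch of the constraint, not Namzu's real interface:
// the core depends only on a narrow contract, never on a vendor SDK.

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface LLMProvider {
  readonly name: string;
  complete(messages: ChatMessage[]): Promise<string>;
}

// Kernel-side code only ever sees the interface...
async function runTurn(
  provider: LLMProvider,
  history: ChatMessage[],
): Promise<ChatMessage> {
  const content = await provider.complete(history);
  return { role: "assistant", content };
}

// ...while anything vendor-specific lives behind it, in its own package.
// Here, a trivial echo provider stands in for a real backend.
const echoProvider: LLMProvider = {
  name: "echo",
  complete: async (messages) =>
    `echo: ${messages[messages.length - 1].content}`,
};

const reply = await runTurn(echoProvider, [
  { role: "user", content: "hello" },
]);
```

Swapping Anthropic for Ollama then means swapping which object implements the contract; nothing in the core changes.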

Why TypeScript today

TypeScript is the language with the largest pool of developers who actually ship server software for AI workloads in 2026. Starting there made it possible to ship something useful on day one.

The kernel itself is a specification, not a language. The TypeScript implementation is the first; Rust, Go, and Python kernels sharing the same spec are on the public roadmap. The portability claim is structural, not aspirational: the spec lives in ADRs, the public surface stays small, and provider packages are the only place vendor SDKs are allowed to leak in.

Why open source

The argument is in the manifesto, but in one sentence: a runtime you cannot fork is not your runtime. We can disappear, change course, raise prices, or be acquired. None of that should be able to take an agent away from the team that built it.

FSL-1.1-MIT was chosen deliberately. It is a Functional Source License that converts to MIT after two years. The first two years protect against direct competitive resale of the core; after that, every release becomes pure MIT. The license is in the repo.

What ships today

The current public surface is in the changelog, but the headline is:

  • @namzu/sdk — the kernel, scheduler, conversation store, public API.
  • Provider packages for Anthropic, OpenAI, OpenRouter, Bedrock, Ollama, LM Studio, and HTTP.
  • @namzu/computer-use — sandboxed desktop control, in preview.

The session-hierarchy work (projects, sessions, handoffs) is in flight and lands behind the public API as it stabilises. ADRs land in the kernel repo before the code does, so the direction is visible before the patch.

Where to start

The kernel exists. The shape is settled enough to build on. The roadmap is public, the license is permissive, and the runtime is yours.