Namzu vs Vercel AI SDK: agent kernel vs model SDK
The two tools sit at different layers of the agent stack. The short version: Namzu is an agent kernel (runtime, scheduling, sandboxing); the Vercel AI SDK is a frontend-first model SDK. They compose more often than they compete.
Namzu
Namzu is an open-source TypeScript agent kernel and SDK. It runs server-side (Node.js today) and owns the runtime layer beneath agent frameworks. Its concerns are processes, scheduling, memory, sandboxing, and persistence — not UI streaming.
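To make "owns the runtime layer" concrete, here is an illustrative sketch of what kernel-level concerns look like in code. The import, method names, and options below are hypothetical, chosen only to mirror the concerns listed above (process isolation, scheduling, sandboxing, checkpoint/resume); they are not Namzu's actual API surface.

```typescript
// Hypothetical kernel-style API -- illustrative only, not Namzu's real interface.
import { Kernel } from 'namzu'; // hypothetical import

const kernel = new Kernel();

// An agent runs as an isolated, schedulable process owned by the kernel,
// not as a handler tied to a single HTTP request.
const proc = await kernel.spawn({
  agent: 'research-agent',
  sandbox: { network: 'allowlist', fs: 'readonly' }, // hypothetical options
});

// Persistence belongs to the kernel: an agent can be checkpointed,
// survive a restart, and resume where it left off.
await kernel.checkpoint(proc.id);
await kernel.resume(proc.id);
```

The point of the sketch is the shape, not the names: the unit of work is a long-lived process with a lifecycle, not a request/response cycle.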
Vercel AI SDK
The Vercel AI SDK is a model-access SDK with first-class support for streaming model output into React, Vue, and Svelte UIs. It abstracts over LLM providers, exposes a tool-calling primitive, and ships React hooks for chat UIs. It is not a runtime layer.
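A typical use looks like the route handler below: `streamText` abstracts the provider and streams tokens back to a client hook such as `useChat`. This is a minimal sketch; exact helper names (for example, the response-conversion method) vary between AI SDK major versions, and it assumes the `ai` and `@ai-sdk/openai` packages plus an `OPENAI_API_KEY` in the environment.

```typescript
// Sketch of a streaming chat route with the Vercel AI SDK.
// Assumes the `ai` and `@ai-sdk/openai` packages are installed.
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';

export async function POST(req: Request) {
  const { messages } = await req.json();

  // Provider abstraction: swap `openai(...)` for another provider package
  // without changing the streaming code.
  const result = streamText({
    model: openai('gpt-4o-mini'),
    messages,
  });

  // Stream tokens to the browser; the client-side chat hook consumes this.
  return result.toDataStreamResponse();
}
```

Note what is absent: no scheduling, no isolation, no persistence. The SDK's job ends when the response stream closes, which is exactly the "not a runtime layer" point.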
Choose Namzu if…
- You need the runtime concerns: process isolation, scheduling, memory, IPC, checkpoint/resume.
- Your agents are long-running, multi-tenant, or run untrusted tool code.
- You want vendor-neutral provider packages without React/Vue/Svelte coupling.
Choose Vercel AI SDK if…
- Your primary need is streaming model output into a frontend chat UI.
- You want React hooks and adapters for chat-style applications.
- You are building a single-tenant assistant where one agent === one HTTP request.
The two are orthogonal. A backend Namzu agent can stream tokens through the Vercel AI SDK on the way to a React chat UI. The Vercel AI SDK is a great answer to "how do I stream into a UI"; it is not an answer to "how do I run a long-lived, isolated, schedulable agent process". Use both when the answer to both questions matters.
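The composition can be sketched with standard web-platform streaming: a long-lived backend agent emits text chunks, and a route handler forwards them to the chat UI. The agent handle and its `textStream()` method below are hypothetical stand-ins for whatever the backend kernel exposes; the `ReadableStream` plumbing is standard and is what frontend SDK hooks ultimately consume.

```typescript
// Composition sketch: bridge a long-lived backend agent's output stream
// to an HTTP response that a frontend chat client can consume.
// `getAgentForSession` and `agent.textStream()` are hypothetical.
export async function POST(req: Request) {
  const agent = await getAgentForSession(req); // hypothetical session lookup

  const encoder = new TextEncoder();
  const stream = new ReadableStream<Uint8Array>({
    async start(controller) {
      // Forward each text chunk from the backend agent as it arrives.
      for await (const chunk of agent.textStream()) {
        controller.enqueue(encoder.encode(chunk));
      }
      controller.close();
    },
  });

  return new Response(stream, {
    headers: { 'Content-Type': 'text/plain; charset=utf-8' },
  });
}
```

The backend agent keeps its own lifecycle (it may outlive this request entirely); the route handler only taps its output for as long as the client is connected.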
Try the kernel underneath your stack.
Namzu installs from npm and runs on Node.js today. Pair it with whichever agent layer you already use.