Integration · @namzu/anthropic
Namzu × Anthropic. The frontier model API behind Claude.
The @namzu/anthropic package wires Anthropic's Messages API into Namzu's LLMProvider interface. Tool calling, streaming, structured outputs, and stop sequences are mapped one-to-one with no abstraction tax.
01 · Install
$ pnpm add @namzu/sdk @namzu/anthropic
02 · Why pair the kernel with Anthropic
Claude's tool-use API is the closest first-party fit to a kernel-style runtime — explicit tool blocks, deterministic stop reasons, and a stable schema across model generations. Namzu surfaces those primitives as kernel signals rather than burying them in a framework abstraction.
- Tool calling and tool-result handling without DSL translation
- Streaming responses surfaced as kernel events for IPC and persistence
- Per-agent model + temperature config via the kernel's provider scope
- Vision and document inputs passed through unchanged
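The "no DSL translation" claim above comes down to how an adapter maps Anthropic's content blocks and stop reasons into kernel events. The sketch below is illustrative only: the event shape is an assumption, not the actual @namzu/sdk LLMProvider interface, though the response shape follows Anthropic's documented Messages API content blocks.

```typescript
// Assumed kernel event shape -- NOT the real @namzu/sdk interface.
type KernelEvent =
  | { kind: 'text'; text: string }
  | { kind: 'tool_call'; name: string; input: unknown }
  | { kind: 'stop'; reason: 'end_turn' | 'tool_use' | 'max_tokens' | 'stop_sequence' }

// Shape of an Anthropic Messages API response, reduced to the fields used here.
interface MessagesResponse {
  content: Array<
    | { type: 'text'; text: string }
    | { type: 'tool_use'; name: string; input: unknown }
  >
  stop_reason: 'end_turn' | 'tool_use' | 'max_tokens' | 'stop_sequence'
}

// Map Anthropic content blocks one-to-one onto kernel events,
// then append the stop reason as a terminal event.
function toKernelEvents(response: MessagesResponse): KernelEvent[] {
  const events: KernelEvent[] = response.content.map((block) =>
    block.type === 'text'
      ? { kind: 'text' as const, text: block.text }
      : { kind: 'tool_call' as const, name: block.name, input: block.input },
  )
  events.push({ kind: 'stop', reason: response.stop_reason })
  return events
}
```

Because the mapping is block-for-block, a `tool_use` block survives as a `tool_call` event with its name and input intact, and the kernel can persist or route it without re-parsing.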
03 · Example
import { createKernel } from '@namzu/sdk'
import { anthropic } from '@namzu/anthropic'
const kernel = createKernel({
provider: anthropic({
apiKey: process.env.ANTHROPIC_API_KEY!,
model: 'claude-sonnet-4-6',
}),
})
const agent = kernel.spawn({
systemPrompt: 'You are a careful research assistant.',
tools: [searchWeb, summariseDocument],
})
await agent.run('Find the latest paper on agent kernels and summarise it.')

Ship Anthropic agents on a runtime you can own.
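The example passes `searchWeb` and `summariseDocument` as tools without defining them. One plausible shape for such a tool is sketched below, assuming a plain-object contract (name, description, JSON Schema input, async handler) modeled on Anthropic's tool-use format; the `Tool` interface and stub handler here are hypothetical, not the actual @namzu/sdk API.

```typescript
// Hypothetical tool contract -- @namzu/sdk's real Tool type may differ.
interface Tool {
  name: string
  description: string
  // JSON Schema for the tool's input, as in Anthropic's tool-use format.
  input_schema: Record<string, unknown>
  handler: (input: { query: string }) => Promise<string>
}

const searchWeb: Tool = {
  name: 'search_web',
  description: 'Search the web and return the top results as plain text.',
  input_schema: {
    type: 'object',
    properties: { query: { type: 'string' } },
    required: ['query'],
  },
  // Stub handler for illustration; a real one would call a search API.
  handler: async ({ query }) => `Stub results for "${query}".`,
}
```

Because the schema is plain JSON Schema, the adapter can forward it to Anthropic's tool-use API unchanged, which is what "no abstraction tax" means in practice.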
Anthropic owns the model. Namzu owns the runtime. The provider package is a thin adapter, not a wrapper, so you keep everything Anthropic ships and add the kernel concerns you do not get from the model API alone.