Integration · @namzu/openrouter
Namzu × OpenRouter. Multi-model unified gateway.
@namzu/openrouter routes through OpenRouter, the unified gateway over Anthropic, OpenAI, Google, Mistral, Meta, and dozens of open-weight models. Switching models is a config change, not a code change.
01 · Install
$ pnpm add @namzu/sdk @namzu/openrouter
02 · Why pair the kernel with OpenRouter
The kernel's value proposition is that no model is a citizen of the core. OpenRouter is the operational embodiment of that — agent code stays identical while a config flip swaps Claude for GPT-5 for Llama. Useful for model evals, fallback chains, and price-sensitive workloads.
- One config field switches between hundreds of models
- Provider-side fallback when a model is rate-limited
- Per-request model override for A/B testing
- Price and token-level usage exposed as kernel events
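The per-request override bullet can be sketched without touching the provider at all: a deterministic bucketing helper assigns each request to one of two OpenRouter model slugs, and the chosen slug is what you would pass as the per-request model override. The `pickModel` helper and its hashing scheme are illustrative, not part of @namzu/openrouter.

```typescript
// Hypothetical A/B helper — plain TypeScript, no SDK dependency.
// Hashes the request ID so the same request always lands in the same bucket.
function pickModel(
  requestId: string,
  models: [string, string],
  split = 0.5, // fraction of traffic routed to models[0]
): string {
  let hash = 0
  for (const ch of requestId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0
  }
  const bucket = (hash % 1000) / 1000
  return bucket < split ? models[0] : models[1]
}

const model = pickModel('req-42', [
  'anthropic/claude-sonnet-4.6',
  'openai/gpt-5.1',
])
```

Because the bucket is derived from the request ID rather than randomness, retries and replays stay in the same experiment arm.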
03 · Example
import { createKernel } from '@namzu/sdk'
import { openrouter } from '@namzu/openrouter'
// Swap models in config — agent code stays identical.
const kernel = createKernel({
  provider: openrouter({
    apiKey: process.env.OPENROUTER_API_KEY!,
    model: 'anthropic/claude-sonnet-4.6', // or 'openai/gpt-5.1', etc.
  }),
})

Ship OpenRouter agents on a runtime you can own.
OpenRouter owns the model. Namzu owns the runtime. The provider package is a thin adapter, not a wrapper, so you keep everything OpenRouter ships and add the kernel concerns you do not get from the model API alone.
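One of the kernel concerns named above is price and token-level usage surfaced as events. The event name and payload shape below are assumptions, not the actual @namzu/sdk API; the sketch only shows how a handler could fold token-level usage events into a running cost figure.

```typescript
// Assumed event payload — not the real @namzu/sdk type.
type UsageEvent = {
  model: string
  promptTokens: number
  completionTokens: number
}

// Illustrative USD-per-million-token prices; real prices come from
// OpenRouter's model catalog, not from this table.
const pricePerMTok: Record<string, { in: number; out: number }> = {
  'anthropic/claude-sonnet-4.6': { in: 3, out: 15 },
  'openai/gpt-5.1': { in: 2.5, out: 10 },
}

function costOf(e: UsageEvent): number {
  const p = pricePerMTok[e.model]
  if (!p) return 0
  return (e.promptTokens * p.in + e.completionTokens * p.out) / 1_000_000
}

let totalUsd = 0
// In a real app this handler would be registered on the kernel's event bus.
function onUsage(e: UsageEvent) {
  totalUsd += costOf(e)
}

onUsage({
  model: 'anthropic/claude-sonnet-4.6',
  promptTokens: 1200,
  completionTokens: 300,
})
```

Keeping the price table outside the handler makes it a config concern, which matches the page's thesis: switching models changes a lookup key, not the accounting code.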