OpenRouter
OpenRouter exposes an OpenAI-compatible HTTP API. Use the official openai package with a custom baseURL and your OpenRouter key, then pass the client to noryen.wrap() like any other OpenAI client.
Example
openrouter.ts
```ts
import OpenAI from "openai";
import { noryen } from "@noryen/sdk";

noryen.init({ apiKey: process.env.NORYEN_API_KEY! });

// Point the OpenAI client at OpenRouter's OpenAI-compatible endpoint.
const openrouter = new OpenAI({
  baseURL: "https://openrouter.ai/api/v1",
  apiKey: process.env.OPENROUTER_API_KEY!,
});

// Wrap the client so Noryen traces every call made through it.
const client = noryen.wrap(openrouter);

const response = await client.chat.completions.create({
  model: "meta-llama/llama-3-8b-instruct",
  messages: [{ role: "user", content: "Hello Noryen!" }],
});

// Send any buffered traces before the process exits.
await noryen.flush();
```
Model names follow OpenRouter's identifiers (e.g. meta-llama/…). Noryen records the model string from the wrapped call, so you can filter by it in the UI.
More on wrapping: Wrappers. In serverless deployments, call flush() before the handler returns: the runtime may freeze or recycle the process as soon as the response is sent, dropping any traces still buffered.
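As a minimal sketch of that serverless pattern, the handler below flushes in a finally block so traces are sent even when the request throws. The handler shape (a Web-standard Request/Response function) is illustrative; adapt it to your platform's handler signature.

```typescript
import OpenAI from "openai";
import { noryen } from "@noryen/sdk";

noryen.init({ apiKey: process.env.NORYEN_API_KEY! });

const client = noryen.wrap(
  new OpenAI({
    baseURL: "https://openrouter.ai/api/v1",
    apiKey: process.env.OPENROUTER_API_KEY!,
  })
);

// Hypothetical serverless entry point; the Request/Response signature is
// an assumption for illustration.
export async function handler(req: Request): Promise<Response> {
  try {
    const { prompt } = await req.json();
    const completion = await client.chat.completions.create({
      model: "meta-llama/llama-3-8b-instruct",
      messages: [{ role: "user", content: prompt }],
    });
    return Response.json({ text: completion.choices[0].message.content });
  } finally {
    // Flush here, not after returning: the runtime may freeze the
    // process once the response is sent.
    await noryen.flush();
  }
}
```

Putting flush() in finally rather than after the create() call means a failed request still ships its trace, which is usually when you most want it.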