The OpenRouter provider for the Vercel AI SDK gives access to over 300 large language models through the OpenRouter chat and completion APIs.
```sh
# For pnpm
pnpm add @openrouter/ai-sdk-provider

# For npm
npm install @openrouter/ai-sdk-provider

# For yarn
yarn add @openrouter/ai-sdk-provider
```
You can import the default provider instance `openrouter` from `@openrouter/ai-sdk-provider`:

```ts
import { openrouter } from '@openrouter/ai-sdk-provider';
```
```ts
import { openrouter } from '@openrouter/ai-sdk-provider';
import { generateText } from 'ai';

const { text } = await generateText({
  model: openrouter('openai/gpt-4o'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});
```
No list of models here is definitive: the set of models supported by OpenRouter constantly changes as we add new models (and deprecate old ones).
You can find the latest list of models supported by OpenRouter here.
You can find the latest list of models that support tool calls here. (Note: this list may contain models that are not compatible with the AI SDK.)
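Since the model list changes over time, it can be useful to query it programmatically rather than hard-coding model IDs. As a minimal sketch (not part of this provider's API), the snippet below assumes OpenRouter's public models endpoint returns JSON of the shape `{ data: [{ id: string, ... }] }` and extracts the model IDs from it; the `extractModelIds` helper is hypothetical, introduced here only for illustration.

```typescript
// Assumed response shape of GET https://openrouter.ai/api/v1/models:
// an object with a `data` array whose entries each carry a model `id`.
type ModelsResponse = { data: Array<{ id: string }> };

// Hypothetical helper: pull the list of model IDs out of the payload.
function extractModelIds(payload: ModelsResponse): string[] {
  return payload.data.map((m) => m.id);
}

// Sketch of usage against the live endpoint:
// const res = await fetch('https://openrouter.ai/api/v1/models');
// const ids = extractModelIds((await res.json()) as ModelsResponse);
```

Checking the fetched IDs at startup is one way to fail fast when a model you depend on has been deprecated.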
There are three ways to pass an extra request body to OpenRouter:
- Via the `providerOptions.openrouter` property:

  ```ts
  import { createOpenRouter } from '@openrouter/ai-sdk-provider';
  import { streamText } from 'ai';

  const openrouter = createOpenRouter({ apiKey: 'your-api-key' });
  const model = openrouter('anthropic/claude-3.7-sonnet:thinking');
  await streamText({
    model,
    messages: [{ role: 'user', content: 'Hello' }],
    providerOptions: {
      openrouter: {
        reasoning: {
          max_tokens: 10,
        },
      },
    },
  });
  ```
- Via the `extraBody` property in the model settings:

  ```ts
  import { createOpenRouter } from '@openrouter/ai-sdk-provider';
  import { streamText } from 'ai';

  const openrouter = createOpenRouter({ apiKey: 'your-api-key' });
  const model = openrouter('anthropic/claude-3.7-sonnet:thinking', {
    extraBody: {
      reasoning: {
        max_tokens: 10,
      },
    },
  });
  await streamText({
    model,
    messages: [{ role: 'user', content: 'Hello' }],
  });
  ```
- Via the `extraBody` property in the model factory:

  ```ts
  import { createOpenRouter } from '@openrouter/ai-sdk-provider';
  import { streamText } from 'ai';

  const openrouter = createOpenRouter({
    apiKey: 'your-api-key',
    extraBody: {
      reasoning: {
        max_tokens: 10,
      },
    },
  });
  const model = openrouter('anthropic/claude-3.7-sonnet:thinking');
  await streamText({
    model,
    messages: [{ role: 'user', content: 'Hello' }],
  });
  ```
You can include Anthropic-specific options directly in your messages when using functions like `streamText`. The OpenRouter provider will automatically convert these messages to the correct format internally.
```ts
import { createOpenRouter } from '@openrouter/ai-sdk-provider';
import { streamText } from 'ai';

const openrouter = createOpenRouter({ apiKey: 'your-api-key' });
const model = openrouter('anthropic/<supported-caching-model>');

await streamText({
  model,
  messages: [
    {
      role: 'system',
      content: 'You are a helpful assistant.',
      // Add provider options at the message level
      providerMetadata: {
        openrouter: {
          // cache_control also works
          // cache_control: { type: 'ephemeral' }
          cacheControl: { type: 'ephemeral' },
        },
      },
    },
    {
      role: 'user',
      content: 'Hello, how are you?',
    },
  ],
});
```