Vercel AI SDK

The Vercel AI SDK (ai package) is the standard way to add streaming AI to Next.js and Node.js applications. When you set baseURL to the Keeptrusts gateway, every request made by streamText, generateText, or streamObject passes through Keeptrusts's real-time policy enforcement layer — with zero changes to your application logic.

Use this page when

  • You need the exact command, config, API, or integration details for Vercel AI SDK.
  • You are wiring automation or AI retrieval and need canonical names, examples, and constraints.
  • You want a guided rollout instead of a reference page: use the linked workflow pages in Next steps.

Primary audience

  • Primary: AI Agents, Technical Engineers
  • Secondary: Technical Leaders

Prerequisites

  • Keeptrusts CLI installed and a policy-config.yaml created
  • ai and @ai-sdk/openai installed in your project
  • The upstream provider API key available as an environment variable
Install the SDK packages:

npm install ai @ai-sdk/openai

Configuration

pack:
  name: "vercel-ai-sdk-gateway"
  version: "0.1.0"
  enabled: true

policies:
  chain:
    - prompt-injection
    - pii-detector
    - audit-logger

providers:
  targets:
    - id: "openai-via-gateway"
      provider: "openai"
      model: "gpt-4o"
      base_url: "https://api.openai.com/v1"
      secret_key_ref:
        env: "OPENAI_API_KEY"

Start the gateway:

export OPENAI_API_KEY="sk-..."
kt gateway run --listen 127.0.0.1:41002 --policy-config policy-config.yaml

Provider Fields

| Field | Type | Default | Description |
| --- | --- | --- | --- |
| id | string | — | Unique target identifier |
| provider | string | — | Upstream provider (openai, anthropic, etc.) |
| model | string | — | Default model passed to the upstream |
| base_url | string | provider default | Upstream API base URL |
| secret_key_ref | object | provider default | Object reference to the env var holding the upstream API key |
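As a rough sketch, the fields above can be modeled as a TypeScript shape. The interface names and which fields are optional are our own reading of the table, not part of any Keeptrusts SDK:

```typescript
// Illustrative only: mirrors the provider target fields documented above.
interface SecretKeyRef {
  env: string; // name of the env var holding the upstream API key
}

interface ProviderTarget {
  id: string;            // unique target identifier
  provider: string;      // upstream provider, e.g. "openai" or "anthropic"
  model: string;         // default model passed to the upstream
  base_url?: string;     // upstream API base URL (provider default if omitted)
  secret_key_ref?: SecretKeyRef; // provider default if omitted
}

// The target from the policy-config.yaml example above, expressed as an object:
const target: ProviderTarget = {
  id: 'openai-via-gateway',
  provider: 'openai',
  model: 'gpt-4o',
  base_url: 'https://api.openai.com/v1',
  secret_key_ref: { env: 'OPENAI_API_KEY' },
};
```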

Supported Models

Any model supported by the underlying upstream provider is accessible through the gateway. When using @ai-sdk/openai pointed at Keeptrusts, pass any valid OpenAI model name:

| Model | Notes |
| --- | --- |
| gpt-4o | Latest GPT-4o flagship |
| gpt-4o-mini | Cost-efficient, fast |
| gpt-4-turbo | Previous generation flagship |
| o1, o1-mini | Reasoning models |
| Any Anthropic model | Use @ai-sdk/anthropic variant (see Advanced Configuration) |

Client Examples

The Keeptrusts gateway exposes an OpenAI-compatible /v1 surface. Configure @ai-sdk/openai to point at http://localhost:41002/v1:

import { createOpenAI } from '@ai-sdk/openai';
import { streamText } from 'ai';

const openai = createOpenAI({
  baseURL: 'http://localhost:41002/v1',
  apiKey: 'any', // The gateway handles auth to the upstream
});

const { textStream } = await streamText({
  model: openai('gpt-4o'),
  prompt: 'Summarize the key benefits of a zero-trust security model.',
});

for await (const chunk of textStream) {
  process.stdout.write(chunk);
}

Streaming

The Vercel AI SDK's streamText and streamObject functions work natively with the Keeptrusts gateway. The gateway forwards server-sent events (SSE) to the client without buffering:

import { createOpenAI } from '@ai-sdk/openai';
import { streamText } from 'ai';

const openai = createOpenAI({
  baseURL: 'http://localhost:41002/v1',
  apiKey: 'any',
});

// Stream tokens in a Next.js Route Handler
export async function POST(req: Request) {
  const { prompt } = await req.json();

  const result = await streamText({
    model: openai('gpt-4o'),
    messages: [
      { role: 'system', content: 'You are a helpful assistant.' },
      { role: 'user', content: prompt },
    ],
  });

  return result.toDataStreamResponse();
}

Advanced Configuration

Using @ai-sdk/anthropic

Point the Anthropic provider at the gateway's /v1 surface. Keeptrusts performs automatic format translation between OpenAI and Anthropic wire formats:

import { createAnthropic } from '@ai-sdk/anthropic';
import { generateText } from 'ai';

const anthropic = createAnthropic({
  baseURL: 'http://localhost:41002/v1',
  apiKey: 'any',
});

const { text } = await generateText({
  model: anthropic('claude-3-5-sonnet-20241022'),
  prompt: 'Explain chain-of-thought prompting.',
});

Update policy-config.yaml to route to the Anthropic upstream:

pack:
  name: vercel-providers-2
  version: 1.0.0
  enabled: true

providers:
  targets:
    - id: anthropic-via-gateway
      provider: anthropic
      model: claude-3-5-sonnet-20241022
      base_url: https://api.anthropic.com/v1
      secret_key_ref:
        env: ANTHROPIC_API_KEY
      provider_type: anthropic
      format: anthropic

policies:
  chain:
    - audit-logger

policy:
  audit-logger:
    immutable: true
    retention_days: 365
    log_all_access: true

Structured Output with streamObject

import { createOpenAI } from '@ai-sdk/openai';
import { streamObject } from 'ai';
import { z } from 'zod';

const openai = createOpenAI({
  baseURL: 'http://localhost:41002/v1',
  apiKey: 'any',
});

const { partialObjectStream } = await streamObject({
  model: openai('gpt-4o'),
  schema: z.object({
    title: z.string(),
    summary: z.string(),
    tags: z.array(z.string()),
  }),
  prompt: 'Describe the Keeptrusts gateway in structured form.',
});

for await (const partial of partialObjectStream) {
  console.log(partial);
}

Best Practices

  • Set apiKey: 'any' — the gateway manages upstream auth; no real API key should be sent from the browser
  • Use baseURL, not baseUrl: @ai-sdk/openai uses the camelCase baseURL option
  • Keep the gateway local in development — run kt gateway run alongside your Next.js dev server on port 41002; configure KEEPTRUSTS_GATEWAY_URL for staging and production
  • Apply PII redaction — add the pii-detector policy to prevent sensitive data from leaving your network before it reaches the upstream provider
  • Audit all requests — include audit-logger in the policy chain to capture every prompt and response for compliance review
  • Pin the gateway port — standardize on 41002 across environments to avoid configuration drift
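The environment switch described above can be sketched as a small helper. The function name is ours; only the KEEPTRUSTS_GATEWAY_URL variable and port 41002 come from this page:

```typescript
// Resolve the gateway address: use KEEPTRUSTS_GATEWAY_URL when set
// (staging/production), otherwise fall back to the local dev gateway.
function gatewayBaseURL(
  env: Record<string, string | undefined> = process.env,
): string {
  return env.KEEPTRUSTS_GATEWAY_URL ?? 'http://localhost:41002/v1';
}
```

Pass the result to the provider factory, e.g. createOpenAI({ baseURL: gatewayBaseURL(), apiKey: 'any' }), so the same code runs unchanged across environments.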

For AI systems

  • Canonical terms: Keeptrusts gateway, Vercel AI SDK, @ai-sdk/openai, Next.js, Edge Functions, createOpenAI, baseURL, provider target.
  • Integration pattern: Use createOpenAI({ baseURL: "http://localhost:41002/v1" }) in the Vercel AI SDK to route through Keeptrusts.
  • Key behavior: The Vercel AI SDK's OpenAI provider adapter points at Keeptrusts instead of OpenAI directly — all AI SDK features (streaming, tool calling) work unchanged.
  • Best next pages: Node.js SDK integration, OpenAI integration, Quickstart.

For engineers

  • Prerequisites: Next.js project with ai and @ai-sdk/openai packages, Keeptrusts gateway running on a known port (default 41002).
  • Set baseURL in createOpenAI() to your Keeptrusts gateway address (e.g., http://localhost:41002/v1).
  • Include audit-logger in the policy chain to capture every prompt and response for compliance review.
  • Pin the gateway port (standardize on 41002) across environments to avoid configuration drift.
  • Validate: deploy your Next.js app and check the Keeptrusts console Events dashboard for request records.
  • All Vercel AI SDK features (streaming, tool calling, structured output) work unchanged through the gateway.

For leaders

  • Zero framework code changes — existing Vercel AI SDK applications adopt Keeptrusts governance by changing only the baseURL configuration.
  • All AI requests from your Next.js application are audit-logged, providing compliance evidence for AI-powered features.
  • Keeptrusts policies (PII redaction, prompt-injection) apply to all traffic regardless of which model the AI SDK routes to.
  • Standardizing the gateway port across environments prevents configuration drift between development, staging, and production.

Next steps