# Vercel AI SDK Integration

Enforce Control Zero policies in applications built with the Vercel AI SDK.
## Overview

The Vercel AI SDK provides a unified interface for building AI-powered applications in TypeScript/JavaScript. Control Zero integrates through explicit `enforce` calls placed before model calls and tool invocations execute.
## Installation

```bash
npm install @controlzero/sdk ai @ai-sdk/openai
```
## Setup

```typescript
import { ControlZero } from '@controlzero/sdk';
import { generateText, streamText, tool } from 'ai';
import { openai } from '@ai-sdk/openai';

const cz = new ControlZero({
  apiKey: process.env.CONTROLZERO_API_KEY,
  projectId: process.env.CONTROLZERO_PROJECT_ID,
});

await cz.initialize();
```
## Basic Usage

### Enforce Before Generation

```typescript
async function generate(prompt: string, model: string = 'gpt-4') {
  // Enforce policy before the AI call
  await cz.enforce({
    action: 'llm.generate',
    resource: `model/${model}`,
    context: { agent_id: 'web-assistant' },
  });

  const result = await generateText({
    model: openai(model),
    prompt,
  });
  return result.text;
}
```
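If `cz.enforce` rejects with an error when a policy denies the action (the behavior assumed throughout this page — confirm against your SDK version), a small wrapper can turn a denial into a typed result instead of an unhandled rejection. `tryEnforce` below is a hypothetical helper, not part of the Control Zero SDK:

```typescript
// Hypothetical helper: run an enforcement check, then the guarded work.
// Assumes the enforcement function rejects (throws) on a policy denial.
type EnforceResult<T> =
  | { ok: true; value: T }
  | { ok: false; reason: string };

async function tryEnforce<T>(
  enforce: () => Promise<void>,
  run: () => Promise<T>,
): Promise<EnforceResult<T>> {
  try {
    await enforce();
  } catch (err) {
    // Surface the denial instead of crashing the caller
    return { ok: false, reason: err instanceof Error ? err.message : String(err) };
  }
  return { ok: true, value: await run() };
}
```

With a wrapper like this, `generate` could return a fallback message when the policy denies `llm.generate` rather than letting the error propagate to the user.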
### Enforce Tool Calls

```typescript
import { z } from 'zod';

async function runAgent(prompt: string) {
  // Enforce the LLM call
  await cz.enforce({
    action: 'llm.generate',
    resource: 'model/gpt-4',
    context: { agent_id: 'web-assistant' },
  });

  const result = await generateText({
    model: openai('gpt-4'),
    prompt,
    tools: {
      searchWeb: tool({
        description: 'Search the web for information',
        parameters: z.object({ query: z.string() }),
        execute: async ({ query }) => {
          // Enforce the tool call before it runs
          await cz.enforce({
            action: 'tool.call',
            resource: 'tool/search_web',
            context: { agent_id: 'web-assistant' },
          });
          return `Results for: ${query}`;
        },
      }),
      readDatabase: tool({
        description: 'Query the database',
        parameters: z.object({ sql: z.string() }),
        execute: async ({ sql }) => {
          // Enforce the tool call before it runs
          await cz.enforce({
            action: 'tool.call',
            resource: 'tool/read_database',
            context: { agent_id: 'web-assistant' },
          });
          return `DB results for: ${sql}`;
        },
      }),
    },
  });
  return result.text;
}
```
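The example above follows a naming convention in which a camelCase tool key (`searchWeb`) maps to a snake_case resource path (`tool/search_web`). If you adopt that convention, a small helper keeps the tool definitions and the policy resources from drifting apart; `toolResource` is a hypothetical utility, not part of either SDK:

```typescript
// Hypothetical helper: derive a Control Zero resource path from a
// camelCase tool name, matching the tool/search_web convention above.
function toolResource(toolName: string): string {
  const snake = toolName.replace(/([a-z0-9])([A-Z])/g, '$1_$2').toLowerCase();
  return `tool/${snake}`;
}
```

Each `execute` callback can then call `cz.enforce({ action: 'tool.call', resource: toolResource('searchWeb'), ... })` instead of repeating string literals that must match the policy by hand.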
## Streaming with Enforcement

```typescript
async function streamResponse(prompt: string) {
  await cz.enforce({
    action: 'llm.generate',
    resource: 'model/gpt-4',
    context: { agent_id: 'web-assistant' },
  });

  const result = streamText({
    model: openai('gpt-4'),
    prompt,
  });
  return result.toDataStreamResponse();
}
```
## Next.js Route Handler Example

```typescript
// app/api/chat/route.ts
import { ControlZero } from '@controlzero/sdk';
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';

const cz = new ControlZero();

export async function POST(req: Request) {
  await cz.initialize();
  const { messages } = await req.json();

  // Enforce before streaming
  await cz.enforce({
    action: 'llm.generate',
    resource: 'model/gpt-4',
    context: { agent_id: 'chat-assistant' },
  });

  const result = streamText({
    model: openai('gpt-4'),
    messages,
  });
  return result.toDataStreamResponse();
}
```
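In a real route handler you will usually want the enforcement context to identify the caller, not just a static agent id. A minimal sketch, assuming your own auth layer sets an `x-user-id` header and that you pass the headers in as a plain map (both are assumptions about your setup, not Control Zero requirements):

```typescript
// Hypothetical helper: build an enforcement context from request headers.
// The x-user-id header is an assumption about your own auth setup.
function buildContext(
  headers: Record<string, string | undefined>,
  agentId: string,
): Record<string, string> {
  const context: Record<string, string> = { agent_id: agentId };
  const userId = headers['x-user-id'];
  if (userId) {
    context.user_id = userId;
  }
  return context;
}
```

In the route handler, `context: buildContext(Object.fromEntries(req.headers), 'chat-assistant')` would then let per-user policy rules apply to the same `llm.generate` check.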
Example Policy
{
"name": "vercel-ai-policy",
"rules": [
{
"effect": "allow",
"action": "llm.generate",
"resource": "model/gpt-4"
},
{
"effect": "allow",
"action": "tool.call",
"resource": "tool/search_web"
},
{
"effect": "deny",
"action": "tool.call",
"resource": "tool/read_database"
}
]
}
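Under this policy, `tool.call` on `tool/read_database` is denied while `tool/search_web` is allowed. One way to keep the agent loop alive when a tool is denied is to catch the denial inside `execute` and return an explanatory string the model can react to. A sketch against a stubbed enforcer interface (the `Enforcer` type and `guardedExecute` helper are illustrative, not SDK APIs, and assume `enforce` throws on denial):

```typescript
// Illustrative type: a minimal slice of an enforcement client.
interface Enforcer {
  enforce(check: { action: string; resource: string }): Promise<void>;
}

// Hypothetical helper: run a tool body, but if enforcement denies the
// call, return a message the model can see instead of crashing the run.
function guardedExecute<A, R>(
  enforcer: Enforcer,
  resource: string,
  body: (args: A) => Promise<R>,
): (args: A) => Promise<R | string> {
  return async (args: A) => {
    try {
      await enforcer.enforce({ action: 'tool.call', resource });
    } catch {
      return `Tool ${resource} was denied by policy.`;
    }
    return body(args);
  };
}
```

Wrapping `readDatabase`'s `execute` this way means a denied query produces a visible refusal in the tool result rather than an aborted generation.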
## Next Steps

- See the Node.js SDK for the full API reference.
- Explore the Customer Support Guide for a complete chat application example.