A custom provider that enables Cloudflare Workers AI models to be used with the Vercel AI SDK.
```sh
npm install workers-ai-provider
```
First, set up an AI binding in `wrangler.toml` in your Workers project:

```toml
# ...
[ai]
binding = "AI"
# ...
```
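If your project uses a JSON configuration file instead of TOML, the equivalent binding would look roughly like this (a sketch; check the Wrangler documentation for the exact schema your Wrangler version expects):

```jsonc
{
  // ...
  "ai": {
    "binding": "AI"
  }
  // ...
}
```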
Then in your Worker, import the factory function and create a new AI provider:

```ts
// index.ts
import { createWorkersAI } from "workers-ai-provider";
import { streamText } from "ai";

type Env = {
  AI: Ai;
};

export default {
  async fetch(req: Request, env: Env) {
    const workersai = createWorkersAI({ binding: env.AI });

    // Use the AI provider to interact with the Vercel AI SDK
    // Here, we generate a chat stream based on a prompt
    const text = await streamText({
      model: workersai("@cf/meta/llama-2-7b-chat-int8"),
      messages: [
        {
          role: "user",
          content: "Write an essay about hello world",
        },
      ],
    });

    return text.toTextStreamResponse({
      headers: {
        // add these headers to ensure that the
        // response is chunked and streamed
        "Content-Type": "text/x-unknown",
        "content-encoding": "identity",
        "transfer-encoding": "chunked",
      },
    });
  },
};
```
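The Worker above returns a chunked text stream. On the client side, one way to consume such a stream incrementally is to read the `Response` body with a `ReadableStream` reader; a minimal sketch (the Worker URL in the usage comment is a placeholder, not part of this package):

```typescript
// Sketch: read a streamed text Response chunk by chunk.
async function readTextStream(res: Response): Promise<string> {
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let out = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    // stream: true keeps multi-byte characters intact across chunk boundaries
    out += decoder.decode(value, { stream: true });
  }
  // Flush any bytes the decoder is still holding.
  return out + decoder.decode();
}

// Usage (hypothetical Worker URL):
// const res = await fetch("https://my-worker.example.workers.dev");
// const essay = await readTextStream(res);
```

In practice you would typically append each decoded chunk to the UI as it arrives rather than accumulating the whole string first.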
For more information, refer to the Vercel AI SDK documentation.
Based on work by Dhravya Shah and the Workers AI team at Cloudflare.