
Vercel AI SDK

The Vercel AI SDK is a TypeScript toolkit for building AI applications. SideSeat captures traces from generateText, streamText, and other AI SDK calls via OpenTelemetry.

Prerequisites

  • SideSeat running locally (sideseat)
  • Node.js 18+
  • AWS credentials configured for Amazon Bedrock, or another supported provider

Using the SideSeat SDK
  1. Start SideSeat

    npx sideseat
  2. Install dependencies

    npm install @sideseat/sdk ai @ai-sdk/amazon-bedrock
  3. Add telemetry

    import { init, Frameworks } from '@sideseat/sdk';
    import { generateText } from 'ai';
    import { bedrock } from '@ai-sdk/amazon-bedrock';

    init({ framework: Frameworks.VercelAI });

    const { text } = await generateText({
      model: bedrock('us.anthropic.claude-sonnet-4-5-20250929-v1:0'),
      prompt: 'What is the capital of France?',
      experimental_telemetry: { isEnabled: true },
    });

    console.log(text);
  4. View runs

    Open http://localhost:5388 to see your runs.
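The telemetry settings accept more than isEnabled. A minimal sketch of labeling runs, assuming the AI SDK's optional functionId and metadata fields (the values capital-lookup and user-123 are illustrative):

```typescript
// Optional AI SDK telemetry settings: functionId labels spans with a
// logical name, and metadata entries become custom span attributes.
// The specific values here are illustrative.
const telemetry = {
  isEnabled: true,
  functionId: 'capital-lookup',
  metadata: { userId: 'user-123' },
};

// Passed as experimental_telemetry in a generateText/streamText call.
console.log(telemetry.functionId);
```

Labeling calls this way makes it easier to tell different call sites apart when several functions share one model.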

Using OpenTelemetry

  1. Start SideSeat

    npx sideseat
  2. Set the endpoint

    export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:5388/otel/default
  3. Install dependencies

    npm install @opentelemetry/sdk-node @opentelemetry/exporter-trace-otlp-http ai @ai-sdk/amazon-bedrock
  4. Add telemetry

    import { NodeSDK } from '@opentelemetry/sdk-node';
    import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-http';
    import { generateText } from 'ai';
    import { bedrock } from '@ai-sdk/amazon-bedrock';

    const sdk = new NodeSDK({ traceExporter: new OTLPTraceExporter() });
    sdk.start();

    const { text } = await generateText({
      model: bedrock('us.anthropic.claude-sonnet-4-5-20250929-v1:0'),
      prompt: 'What is the capital of France?',
      experimental_telemetry: { isEnabled: true },
    });

    console.log(text);
  5. View runs

    Open http://localhost:5388 to see your runs.

SideSeat shows each generateText/streamText call with the model name, token counts, prompt messages, and the full response. Tool calls appear as child spans.

Streaming

streamText calls are traced the same way:

import { init, Frameworks } from '@sideseat/sdk';
import { streamText } from 'ai';
import { bedrock } from '@ai-sdk/amazon-bedrock';

init({ framework: Frameworks.VercelAI });

const result = streamText({
  model: bedrock('us.anthropic.claude-sonnet-4-5-20250929-v1:0'),
  prompt: 'Tell me a story',
  experimental_telemetry: { isEnabled: true },
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}

Tool calls

Tool executions appear as child spans of the generateText call:

import { init, Frameworks } from '@sideseat/sdk';
import { generateText, tool } from 'ai';
import { bedrock } from '@ai-sdk/amazon-bedrock';
import { z } from 'zod';

init({ framework: Frameworks.VercelAI });

const { text } = await generateText({
  model: bedrock('us.anthropic.claude-sonnet-4-5-20250929-v1:0'),
  prompt: 'What is the weather in Paris?',
  tools: {
    getWeather: tool({
      description: 'Get weather for a location',
      parameters: z.object({ location: z.string() }),
      execute: async ({ location }) => `Sunny in ${location}`,
    }),
  },
  experimental_telemetry: { isEnabled: true },
});
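Because prompts and responses are recorded on spans, sensitive inputs can end up in traces. A sketch assuming the AI SDK's optional recordInputs / recordOutputs telemetry flags, which omit message content from spans while keeping the model name and token counts:

```typescript
// Telemetry settings that keep tracing on but omit message content.
// recordInputs/recordOutputs are optional AI SDK telemetry flags and
// are enabled by default when telemetry is on.
const redactedTelemetry = {
  isEnabled: true,
  recordInputs: false,  // omit prompt messages from spans
  recordOutputs: false, // omit response text from spans
};

console.log(JSON.stringify(redactedTelemetry));
```

With these flags set, runs still show up in SideSeat with timing and token usage, just without the prompt or response bodies.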