Vercel AI SDK

The Vercel AI SDK is a TypeScript toolkit for building AI applications. SideSeat captures traces from generateText, streamText, and other AI SDK calls via OpenTelemetry.

  1. Install dependencies

    Terminal window
    npm install @sideseat/sdk ai @ai-sdk/openai
  2. Add telemetry

    import { init } from '@sideseat/sdk';

    init();

    import { generateText } from 'ai';
    import { openai } from '@ai-sdk/openai';

    const { text } = await generateText({
      model: openai('gpt-5-mini'),
      prompt: 'What is the capital of France?',
      experimental_telemetry: { isEnabled: true },
    });

    console.log(text);
  3. View runs

    Open http://localhost:5388 to see your runs.

Alternatively, configure the exporter with the OpenTelemetry SDK directly:

import { NodeSDK } from '@opentelemetry/sdk-node';
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-http';

const sdk = new NodeSDK({
  traceExporter: new OTLPTraceExporter({
    url: 'http://localhost:5388/otel/default/v1/traces',
  }),
});

sdk.start();

Or set the environment variable:

Terminal window
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:5388/otel/default/v1/traces
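For example, the variable can be set inline for a single run; the entry-point filename app.js here is hypothetical:

```shell
# Export traces to SideSeat for this invocation only (app.js is a placeholder
# for your application's entry point).
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:5388/otel/default/v1/traces \
  node app.js
```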

SideSeat shows each generateText/streamText call with the model name, token counts, prompt messages, and the full response. Tool calls appear as child spans.

Streaming with streamText:

import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';

const result = streamText({
  model: openai('gpt-5-mini'),
  prompt: 'Tell me a story',
  experimental_telemetry: { isEnabled: true },
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
Tool calls:

import { generateText, tool } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';

const { text } = await generateText({
  model: openai('gpt-5-mini'),
  prompt: 'What is the weather in Paris?',
  tools: {
    getWeather: tool({
      description: 'Get weather for a location',
      parameters: z.object({ location: z.string() }),
      execute: async ({ location }) => `Sunny in ${location}`,
    }),
  },
  experimental_telemetry: { isEnabled: true },
});
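Beyond isEnabled, the AI SDK's experimental_telemetry option also accepts a functionId and free-form metadata, which end up on the emitted spans and can be used to label and filter runs. A minimal sketch; the field names come from the AI SDK's telemetry options, while the values here are purely illustrative:

```typescript
// Telemetry options for an AI SDK call. isEnabled, functionId, and metadata
// are standard experimental_telemetry fields; the values are hypothetical.
const telemetry = {
  isEnabled: true,
  functionId: 'answer-question',  // logical name for this call site
  metadata: { userId: 'u_123' },  // custom key-value pairs attached to the span
};
```

Pass the object to any traced call, e.g. `generateText({ ..., experimental_telemetry: telemetry })`.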