First Run
Get from zero to your first run in the local AI development workbench.
Prerequisites
- Node.js 18+ (to run SideSeat)
- Python 3.9+ or Node.js 18+ (for your AI app)
- Model credentials for your provider (if required)
Start SideSeat
1. Run SideSeat locally

   ```sh
   npx sideseat
   ```

   You’ll see output like:

   ```
   SideSeat v1.x
   Local: http://127.0.0.1:5388
   OTLP:  http://127.0.0.1:5388/otel/default/v1/traces
   ```

2. Open the workbench

   Navigate to http://localhost:5388 in your browser.
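If you’re scripting this step (for example in CI or a task runner), you may want to wait until the workbench is listening before launching your app. A minimal sketch using only the Python standard library; the host and port come from the startup output above, and the helper names are illustrative, not part of SideSeat:

```python
import socket
import time


def workbench_ready(host: str = "127.0.0.1", port: int = 5388, timeout: float = 1.0) -> bool:
    """Return True if something is accepting TCP connections at host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


def wait_for_workbench(retries: int = 20, delay: float = 0.5) -> bool:
    """Poll until the workbench answers, or give up after `retries` attempts."""
    for _ in range(retries):
        if workbench_ready():
            return True
        time.sleep(delay)
    return False
```

Note this only checks TCP reachability; it doesn’t verify that the HTTP endpoint itself is serving.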
Install the SDK
Python:

```sh
pip install sideseat
# or
uv add sideseat
```

TypeScript:

```sh
npm install @sideseat/sdk
```

Instrument Your App
Add two lines at the top of your entry point:
Python:

```python
from sideseat import SideSeat, Frameworks

SideSeat(framework=Frameworks.Strands)

# Your existing code
from strands import Agent

agent = Agent()
response = agent("What is the capital of France?")
print(response)
```

TypeScript:

```ts
import { init } from '@sideseat/sdk';

init();

// Your existing code
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

const { text } = await generateText({
  model: openai('gpt-5-mini'),
  prompt: 'What is the capital of France?',
  experimental_telemetry: { isEnabled: true },
});
console.log(text);
```

The SDK auto-detects your framework and configures telemetry accordingly.
What You’ll See
Run your app. A new run appears in the workbench showing a timeline of each LLM call, tool execution, and response. Token counts and costs are calculated automatically.
You should see:
- A live run timeline with each step
- Prompt and response messages grouped by step
- Token, latency, and cost summaries
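The workbench computes these summaries for you, but the arithmetic behind a cost figure is simple to reason about: token counts times per-token rates. A hypothetical sketch in Python — the model name and per-million-token prices below are made up for illustration, not SideSeat’s actual price table:

```python
# Hypothetical prices in USD per 1M tokens; real prices vary by provider and model.
PRICES = {
    "example-model": {"input": 0.25, "output": 2.00},
}


def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Cost = input tokens * input rate + output tokens * output rate."""
    p = PRICES[model]
    return (input_tokens * p["input"] + output_tokens * p["output"]) / 1_000_000


# e.g. 1,000 input + 200 output tokens on the example model
print(estimate_cost("example-model", 1000, 200))  # → 0.00065
```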
Why Local?
SideSeat runs locally by default. Your data stays on your machine.
| Benefit | What It Means |
|---|---|
| No signup | Run `npx sideseat` and start debugging immediately |
| No data egress | Traces stay on your machine — no cloud uploads |
| No latency | Real-time streaming without network roundtrips |
| No vendor lock-in | Standard OpenTelemetry traces work with any backend |
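Because the traces endpoint is standard OTLP over HTTP, an app already instrumented with a stock OpenTelemetry SDK could export to SideSeat without the `sideseat` or `@sideseat/sdk` packages at all. A sketch using the standard OTel exporter environment variables, assuming the endpoint shown in the startup output:

```shell
export OTEL_EXPORTER_OTLP_TRACES_ENDPOINT="http://127.0.0.1:5388/otel/default/v1/traces"
export OTEL_EXPORTER_OTLP_PROTOCOL="http/protobuf"
```

Whether SideSeat expects protobuf or JSON payloads isn’t stated here, so treat the protocol setting as an assumption to verify against the Integrations docs.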
Next Steps
- Core Concepts — understand runs, steps, and messages
- Integrations — connect your framework or provider
- Troubleshooting — fix common issues