
First Run

Get from zero to your first run in the local AI development workbench.

Prerequisites:

  • Node.js 18+ (to run SideSeat)
  • Python 3.9+ or Node.js 18+ (for your AI app)
  • Model credentials for your provider (if required)
  1. Run SideSeat locally

    Terminal window
    npx sideseat

    You’ll see output like:

    SideSeat v1.x
    Local: http://127.0.0.1:5388
    OTLP: http://127.0.0.1:5388/otel/default/v1/traces
  2. Open the workbench

    Navigate to http://localhost:5388 in your browser.
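The OTLP address printed at startup follows the pattern shown above: the local base URL plus `/otel/<project>/v1/traces`. A minimal sketch of that pattern, assuming `default` in the printed URL is a project name (that naming is a guess from the startup output, not documented API):

```python
def otlp_traces_endpoint(base_url: str, project: str = "default") -> str:
    """Build the OTLP/HTTP traces URL that SideSeat prints at startup.

    Assumes the path pattern /otel/<project>/v1/traces seen in the
    startup output; 'project' is an illustrative name, not documented API.
    """
    return f"{base_url.rstrip('/')}/otel/{project}/v1/traces"

print(otlp_traces_endpoint("http://127.0.0.1:5388"))
# http://127.0.0.1:5388/otel/default/v1/traces
```

This is the endpoint to use if you point another OpenTelemetry exporter at SideSeat instead of using the SDK below.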

Install the SDK in your app's Python environment:

Terminal window
pip install sideseat
# or
uv add sideseat

Add two lines at the top of your entry point, before your framework imports:

from sideseat import SideSeat, Frameworks
SideSeat(framework=Frameworks.Strands)
# Your existing code
from strands import Agent
agent = Agent()
response = agent("What is the capital of France?")
print(response)

The SDK auto-detects your framework and configures telemetry accordingly.

Run your app. A new run appears in the workbench showing a timeline of each LLM call, tool execution, and response. Token counts and costs are calculated automatically.

You should see:

  • A live run timeline with each step
  • Prompt and response messages grouped by step
  • Token, latency, and cost summaries
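The cost figures in those summaries are derived from token counts. A minimal sketch of that arithmetic, with placeholder per-1k-token prices (the prices and function name here are illustrative, not SideSeat's actual pricing table or API):

```python
def llm_cost(prompt_tokens: int, completion_tokens: int,
             input_price_per_1k: float, output_price_per_1k: float) -> float:
    """Estimate one LLM call's cost from its token counts.

    Prices are per 1,000 tokens; the values used below are
    placeholders, not any provider's real rates.
    """
    return (prompt_tokens / 1000) * input_price_per_1k \
         + (completion_tokens / 1000) * output_price_per_1k

# 1,200 prompt tokens and 300 completion tokens at $0.003 / $0.015 per 1k:
print(round(llm_cost(1200, 300, 0.003, 0.015), 6))  # 0.0081
```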

SideSeat runs locally by default. Your data stays on your machine.

  • No signup: run npx sideseat and start debugging immediately
  • No data egress: traces stay on your machine, with no cloud uploads
  • No latency: real-time streaming without network roundtrips
  • No vendor lock-in: standard OpenTelemetry traces work with any backend
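Because the endpoint speaks standard OTLP, any OpenTelemetry SDK can export to it, and even a hand-rolled OTLP/HTTP JSON request works. A stdlib-only sketch, assuming the endpoint printed at startup and using field names from the OTLP JSON encoding (the trace/span IDs and service name are dummy values):

```python
import json
import time
import urllib.request

def minimal_otlp_payload(span_name: str) -> dict:
    """Build a one-span OTLP/HTTP JSON trace payload."""
    now = time.time_ns()
    return {
        "resourceSpans": [{
            "resource": {"attributes": [{
                "key": "service.name",
                "value": {"stringValue": "demo-app"},  # dummy service name
            }]},
            "scopeSpans": [{
                "scope": {"name": "manual"},
                "spans": [{
                    # Dummy IDs; a real SDK generates random ones.
                    "traceId": "0123456789abcdef0123456789abcdef",
                    "spanId": "0123456789abcdef",
                    "name": span_name,
                    "kind": 1,  # SPAN_KIND_INTERNAL
                    "startTimeUnixNano": str(now),
                    "endTimeUnixNano": str(now + 1_000_000),
                }],
            }],
        }],
    }

def send(payload: dict) -> None:
    """POST the payload to the OTLP endpoint SideSeat prints at startup."""
    req = urllib.request.Request(
        "http://127.0.0.1:5388/otel/default/v1/traces",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)
```

In practice you would let an OpenTelemetry SDK handle encoding and export; the point is that nothing here is proprietary, so the same traces can go to any OTLP-compatible backend.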