
Troubleshooting

Quick fixes for common issues with the local AI development workbench.

  1. Confirm SideSeat is running — you should see Local: http://127.0.0.1:5388 in the terminal
  2. Check the OTLP endpoint — default is http://localhost:5388/otel/default/v1/traces
  3. Flush before exit — call client.shutdown() (Python) or await shutdown() (JavaScript) before the script exits
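The exact shutdown call depends on your SDK, but the flush-before-exit pattern from step 3 can be sketched with the standard library's atexit hook. The TracingClient class below is a hypothetical stand-in for the real client; only the pattern of registering shutdown() is the point:

```python
import atexit

class TracingClient:
    """Hypothetical stand-in for a tracing client. In a real SDK,
    shutdown() flushes any buffered spans to the collector."""
    def __init__(self):
        self.flushed = False

    def shutdown(self):
        # Real SDKs export queued spans here; we just record the call.
        self.flushed = True

client = TracingClient()

# Registering shutdown with atexit ensures the flush runs even if the
# script returns early, so short-lived scripts don't drop final spans.
atexit.register(client.shutdown)
```

Explicitly calling client.shutdown() at the end of main() works too; atexit is simply a safety net for early exits.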

If port 5388 is occupied:

sideseat --port 5390

Then update the OTLP endpoint in your SDK configuration to match, e.g. http://localhost:5390/otel/default/v1/traces.
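Before switching ports, you can verify that 5388 is actually taken. A minimal sketch using only the Python standard library (port_in_use is a helper defined here, not part of SideSeat; the port numbers come from this page):

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        # connect_ex returns 0 on success (a listener answered),
        # and an error code (e.g. ECONNREFUSED) if the port is free.
        return s.connect_ex((host, port)) == 0

# Fall back to an alternate port when the default is occupied.
port = 5388 if not port_in_use(5388) else 5390
print(f"sideseat --port {port}")
```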

If the UI asks for a token or refuses to load:

  • Open the tokenized URL printed at startup when auth is enabled
  • Or disable auth locally: sideseat --no-auth

If model IDs or tool calls are missing from traces:

  • Ensure the model ID is captured in gen_ai.request.model
  • Use a recognized provider (OpenAI, Anthropic, Amazon Bedrock, Vertex, etc.)
  • Confirm your framework emits tool spans via OpenTelemetry
  • Use the SideSeat SDK for best results with tool I/O normalization
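
As a sketch of the attribute checks above: the key names below follow the OpenTelemetry GenAI semantic conventions, but the values are purely illustrative and nothing here is confirmed SideSeat behavior:

```python
# Span attributes a GenAI trace viewer typically inspects (assumed;
# key names are from the OpenTelemetry GenAI semantic conventions).
span_attributes = {
    "gen_ai.system": "anthropic",             # a recognized provider name
    "gen_ai.request.model": "example-model",  # model ID must be captured
    "gen_ai.operation.name": "chat",          # operation being traced
}

# Quick sanity check to run against your own exported spans:
missing = [key for key in ("gen_ai.system", "gen_ai.request.model")
           if key not in span_attributes]
print("missing attributes:", missing)  # prints: missing attributes: []
```

If your framework emits spans without these keys, the SDK's normalization layer is the usual fix, per the last bullet above.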