# Azure OpenAI
SideSeat instruments the Azure OpenAI client to capture model information, token usage, messages, and costs from the Chat Completions API. Azure OpenAI uses the same OpenAI Python SDK with an Azure-specific client.
## Prerequisites

- SideSeat running locally (`sideseat`)
- Python SDK installed with the OpenAI extra (`pip install "sideseat[openai]"`)
- Azure OpenAI credentials configured (API key or Managed Identity)
## Chat Completions API

SideSeat instruments the OpenAI SDK automatically. Initialize SideSeat with the OpenAI framework, then use the `AzureOpenAI` client as usual:

```python
from sideseat import SideSeat, Frameworks
from openai import AzureOpenAI

SideSeat(framework=Frameworks.OpenAI)

azure = AzureOpenAI(
    api_key="your-api-key",
    api_version="2024-02-01",
    azure_endpoint="https://your-resource.openai.azure.com",
)

response = azure.chat.completions.create(
    model="gpt-5-mini",  # Your deployment name
    messages=[
        {"role": "system", "content": "Answer in one sentence."},
        {"role": "user", "content": "What is the speed of light?"},
    ],
)

print(response.choices[0].message.content)
```
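SideSeat derives costs from the token counts the API reports in `response.usage`. The arithmetic is straightforward; here is a minimal sketch with hypothetical per-million-token prices (real Azure pricing depends on your model, deployment, and region):

```python
def estimate_cost(prompt_tokens: int, completion_tokens: int,
                  input_price_per_m: float, output_price_per_m: float) -> float:
    """Estimate call cost in dollars from token counts and per-million-token prices."""
    return (prompt_tokens * input_price_per_m
            + completion_tokens * output_price_per_m) / 1_000_000

# Illustrative prices only, not real Azure rates:
cost = estimate_cost(1_000, 2_000, input_price_per_m=0.25, output_price_per_m=2.00)
print(f"${cost:.6f}")  # → $0.004250
```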
## Traces

By default, each API call produces its own independent trace. Use `client.trace()` to group related calls under a single root span:
```python
from sideseat import SideSeat, Frameworks
from openai import AzureOpenAI

client = SideSeat(framework=Frameworks.OpenAI)
azure = AzureOpenAI(
    api_key="your-api-key",
    api_version="2024-02-01",
    azure_endpoint="https://your-resource.openai.azure.com",
)

with client.trace("geography-chat"):
    messages = [
        {"role": "system", "content": "You are a geography assistant. Answer in 1-2 sentences."},
    ]

    # Turn 1
    messages.append({"role": "user", "content": "What is the capital of France?"})
    response = azure.chat.completions.create(model="gpt-5-mini", messages=messages)
    messages.append({"role": "assistant", "content": response.choices[0].message.content})

    # Turn 2
    messages.append({"role": "user", "content": "What about Germany?"})
    response = azure.chat.completions.create(model="gpt-5-mini", messages=messages)
    messages.append({"role": "assistant", "content": response.choices[0].message.content})

    # Turn 3
    messages.append({"role": "user", "content": "Which city has a larger population?"})
    response = azure.chat.completions.create(model="gpt-5-mini", messages=messages)
```

This produces the following span hierarchy:

```
geography-chat (root span)
├── ChatCompletion (turn 1)
├── ChatCompletion (turn 2)
└── ChatCompletion (turn 3)
```

All three calls appear as child spans in the SideSeat UI, with the full multi-turn conversation visible in the trace detail view.
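The per-turn bookkeeping in the example above is plain list manipulation; a small helper (hypothetical, not part of the SideSeat API) can keep the appends tidy:

```python
def add_turn(messages: list, user_text: str, assistant_text: str) -> list:
    """Append one user/assistant exchange to a running message history (in place)."""
    messages.append({"role": "user", "content": user_text})
    messages.append({"role": "assistant", "content": assistant_text})
    return messages

history = [{"role": "system", "content": "You are a geography assistant."}]
add_turn(history, "What is the capital of France?", "Paris.")
```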
## Sessions

Pass `session_id` and `user_id` to `client.trace()` to group independent traces into a session. The SideSeat sessions view groups all traces that share the same `session_id`.
Each `client.trace()` produces its own trace with its own trace ID, but they are linked by the shared session:
```python
from sideseat import SideSeat, Frameworks
from openai import AzureOpenAI

client = SideSeat(framework=Frameworks.OpenAI)
azure = AzureOpenAI(
    api_key="your-api-key",
    api_version="2024-02-01",
    azure_endpoint="https://your-resource.openai.azure.com",
)

session_id = "sess-abc"
user_id = "user-123"

# Trace 1: Trip planning
with client.trace("trip-planning", session_id=session_id, user_id=user_id):
    messages = [
        {"role": "system", "content": "You are a travel advisor. Be concise."},
    ]
    messages.append({"role": "user", "content": "Plan a 5-day trip to Japan."})
    response = azure.chat.completions.create(model="gpt-5-mini", messages=messages)
    messages.append({"role": "assistant", "content": response.choices[0].message.content})

    messages.append({"role": "user", "content": "Tell me more about Kyoto."})
    response = azure.chat.completions.create(model="gpt-5-mini", messages=messages)

# Trace 2: Food recommendations (fresh conversation, same session)
with client.trace("food-recommendations", session_id=session_id, user_id=user_id):
    messages = [
        {"role": "system", "content": "You are a food expert. Be concise."},
    ]
    messages.append({"role": "user", "content": "What are the must-try dishes in Tokyo?"})
    response = azure.chat.completions.create(model="gpt-5-mini", messages=messages)
    messages.append({"role": "assistant", "content": response.choices[0].message.content})

    messages.append({"role": "user", "content": "What about street food in Osaka?"})
    response = azure.chat.completions.create(model="gpt-5-mini", messages=messages)
```

This produces two independent traces, each with its own span hierarchy:

```
Trace 1: trip-planning (session_id=sess-abc, user_id=user-123)
├── ChatCompletion (turn 1)
└── ChatCompletion (turn 2)

Trace 2: food-recommendations (session_id=sess-abc, user_id=user-123)
├── ChatCompletion (turn 1)
└── ChatCompletion (turn 2)
```

Each trace starts a fresh conversation with its own message history. The SideSeat sessions view groups them by `session_id`.
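The `session_id` is just a string that must match across traces. A common pattern (an assumption here, not a SideSeat requirement) is to mint one UUID-based ID per user conversation:

```python
import uuid

def new_session_id(prefix: str = "sess") -> str:
    """Mint an opaque session ID; a UUID avoids collisions across processes."""
    return f"{prefix}-{uuid.uuid4().hex[:12]}"

session_id = new_session_id()  # e.g. "sess-1a2b3c4d5e6f"
```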
## Streaming

Streaming responses are fully captured, including token counts aggregated from stream chunks:
```python
from sideseat import SideSeat, Frameworks
from openai import AzureOpenAI

client = SideSeat(framework=Frameworks.OpenAI)
azure = AzureOpenAI(
    api_key="your-api-key",
    api_version="2024-02-01",
    azure_endpoint="https://your-resource.openai.azure.com",
)

stream = azure.chat.completions.create(
    model="gpt-5-mini",
    messages=[
        {"role": "system", "content": "Answer in one sentence."},
        {"role": "user", "content": "What is the boiling point of water?"},
    ],
    stream=True,
)

for chunk in stream:
    delta = chunk.choices[0].delta
    if delta.content:
        print(delta.content, end="", flush=True)
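If you also need the full reply client-side (for example, to append it to `messages` for a follow-up turn), collect the deltas as they arrive. A minimal sketch, where the list stands in for a real stream's `chunk.choices[0].delta.content` values:

```python
def accumulate_deltas(deltas) -> str:
    """Join streamed content deltas into the full reply.

    Deltas may be None (e.g. role-only or final chunks), so filter those out.
    """
    return "".join(d for d in deltas if d)

# Stand-in for a real stream's delta.content values:
reply = accumulate_deltas(["Water boils", None, " at 100°C", " at sea level."])
print(reply)  # → Water boils at 100°C at sea level.
```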
## Tool Use

Tool definitions, tool call requests, and tool results are all captured:
```python
from sideseat import SideSeat, Frameworks
from openai import AzureOpenAI

client = SideSeat(framework=Frameworks.OpenAI)
azure = AzureOpenAI(
    api_key="your-api-key",
    api_version="2024-02-01",
    azure_endpoint="https://your-resource.openai.azure.com",
)

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string", "description": "City name"}
            },
            "required": ["location"],
        },
    },
}]

# Step 1: model requests a tool call
messages = [
    {"role": "system", "content": "Use tools when available."},
    {"role": "user", "content": "What's the weather in Paris?"},
]
response = azure.chat.completions.create(
    model="gpt-5-mini",
    messages=messages,
    tools=tools,
)
assistant_msg = response.choices[0].message
messages.append(assistant_msg)

# Step 2: return the tool result
tool_call = assistant_msg.tool_calls[0]
messages.append({
    "role": "tool",
    "tool_call_id": tool_call.id,
    "content": "Sunny, 22C",
})

# Step 3: model produces the final answer
response = azure.chat.completions.create(
    model="gpt-5-mini",
    messages=messages,
    tools=tools,
)
```

SideSeat captures all three steps as separate spans, each with full message details.
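Step 2 above hard-codes the tool result; in a real application you would execute the requested function locally. The API delivers arguments as a JSON string in `tool_call.function.arguments`, so dispatch looks roughly like this (the `get_weather` implementation and registry are hypothetical):

```python
import json

# Hypothetical local implementation of the get_weather tool
def get_weather(location: str) -> str:
    return f"Sunny, 22C in {location}"

TOOL_REGISTRY = {"get_weather": get_weather}

def run_tool_call(name: str, arguments_json: str) -> str:
    """Dispatch a tool call: parse the JSON argument string and call the local function."""
    args = json.loads(arguments_json)
    return TOOL_REGISTRY[name](**args)

print(run_tool_call("get_weather", '{"location": "Paris"}'))  # → Sunny, 22C in Paris
```

The returned string would replace the hard-coded `"content": "Sunny, 22C"` in the tool message.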
## Authentication

Azure OpenAI supports API keys and Managed Identity.
API key (via environment variables):

```shell
export AZURE_OPENAI_API_KEY=your-api-key
export AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com
export OPENAI_API_VERSION=2024-02-01
```

```python
from openai import AzureOpenAI

# Uses environment variables
azure = AzureOpenAI()
```

Managed Identity (for Azure-hosted applications):

```python
from azure.identity import DefaultAzureCredential
from openai import AzureOpenAI

credential = DefaultAzureCredential()
token = credential.get_token("https://cognitiveservices.azure.com/.default")

azure = AzureOpenAI(
    azure_ad_token=token.token,
    api_version="2024-02-01",
    azure_endpoint="https://your-resource.openai.azure.com",
)
```
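Note that a token fetched this way expires (Azure AD access tokens are typically valid for about an hour), so long-running processes should refresh it. The SDK also accepts an `azure_ad_token_provider` callable, and `azure.identity` ships `get_bearer_token_provider` for exactly this purpose. As an illustration of the idea, here is a dependency-free caching provider; `fetch_token` stands in for a call like `credential.get_token(...)`:

```python
import time

def make_cached_token_provider(fetch_token, refresh_margin_s: float = 300.0):
    """Return a zero-arg callable that caches a bearer token and refreshes it near expiry.

    fetch_token must return (token_string, expires_on_unix_timestamp).
    """
    cache = {"token": None, "expires_on": 0.0}

    def provider() -> str:
        # Refresh when within refresh_margin_s seconds of expiry.
        if time.time() >= cache["expires_on"] - refresh_margin_s:
            cache["token"], cache["expires_on"] = fetch_token()
        return cache["token"]

    return provider
```

A provider built this way can be passed as `azure_ad_token_provider=provider` in place of the static `azure_ad_token` above.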
## Next Steps

- OpenAI — standard OpenAI usage
- Python SDK — SDK configuration and API reference
- Overview — get started with SideSeat