

The whyops package gives Python services one integration surface for agent registration, OpenAI or Anthropic proxying, and direct runtime event emission. This page covers the basic setup path.

Proxy Helpers

Read this next if you want the exact API-key flow and what sdk.openai() or sdk.anthropic() changes.

Runtime Events

Add sync or async trace events for tools, thinking blocks, embeddings, and errors.

Advanced Patterns

Hybrid tracing, self-hosted overrides, prompt caching, and common mistakes.

Before you start

  • WHYOPS_API_KEY: authenticates agent init, manual events, and proxied model traffic to WhyOps
  • A stable agent_name: keeps proxy traffic and runtime events attached to the same agent identity
  • systemPrompt and tool metadata: lets WhyOps version the agent and show the correct configuration in the UI
  • Your provider key in the WhyOps dashboard: lets WhyOps authenticate upstream when it forwards OpenAI or Anthropic traffic
  • A stable session or trace ID: best for explicit continuity between proxied model calls and later tool or runtime events
In Python, the cleanest flow is: create the WhyOps client, call init_agent_sync() or await init_agent() during startup, then patch OpenAI or Anthropic, and only after that add manual runtime events.

1. Install the package

pip install whyops

2. Create the WhyOps client once

from whyops import WhyOps

sdk = WhyOps(
    api_key="YOUR_WHYOPS_API_KEY",
    agent_name="customer-support-agent",
    agent_metadata={
        "systemPrompt": "You are a precise customer support assistant.",
        "description": "Handles support, billing, and order status flows.",
        "tools": [
            {
                "name": "search_orders",
                "description": "Look up order state by ID",
                "inputSchema": "{\"type\":\"object\",\"properties\":{\"orderId\":{\"type\":\"string\"}},\"required\":[\"orderId\"]}",
                "outputSchema": "{\"type\":\"object\",\"properties\":{\"status\":{\"type\":\"string\"}}}",
            }
        ],
    },
)
If you include inputSchema or outputSchema, pass JSON strings. If the agent has no tools, set "tools": [] explicitly so the registered definition stays clear.
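
If you author schemas as Python dicts, you can serialize them yourself before registration. A minimal sketch using the standard library, mirroring the search_orders example above:

import json

# Author the tool schema as a plain dict, then serialize it into the JSON
# string form that agent_metadata expects.
order_input_schema = {
    "type": "object",
    "properties": {"orderId": {"type": "string"}},
    "required": ["orderId"],
}

search_orders_tool = {
    "name": "search_orders",
    "description": "Look up order state by ID",
    "inputSchema": json.dumps(order_input_schema),
}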

3. Initialize the agent during startup

sdk.init_agent_sync()
You do not have to call this manually, since trace methods auto-initialize, but early registration is the better default: configuration problems surface at startup instead of during live traffic.
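
In async services, run the same step inside your startup hook. A minimal sketch assuming the sdk object from step 2 and a FastAPI lifespan handler (the framework choice is illustrative; any async startup hook works):

from contextlib import asynccontextmanager

from fastapi import FastAPI

@asynccontextmanager
async def lifespan(app: FastAPI):
    # Register the agent before the service starts accepting traffic.
    await sdk.init_agent()
    yield

app = FastAPI(lifespan=lifespan)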

4. Make your first proxied model call

from openai import OpenAI

trace_id = "session-123"
client = sdk.openai(OpenAI(api_key="YOUR_WHYOPS_API_KEY"))
client.default_headers = {
    **(client.default_headers or {}),
    "X-Trace-ID": trace_id,
    "X-Thread-ID": trace_id,
}

completion = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Where is order 123?"}],
)
The helper mutates the provider client in place. Go to Python SDK Proxy Helpers for the exact key flow, header behavior, and sync versus async details.
The proxy can generate a trace automatically, but the backend checks X-Trace-ID and X-Thread-ID first. If your app later emits tool or runtime events, reusing the same explicit trace ID is the cleaner and more reliable setup.
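
The Anthropic path mirrors the OpenAI one. A minimal sketch assuming sdk.anthropic() wraps a pre-built client the same way sdk.openai() does, with the continuity headers passed at construction time (the model name is illustrative):

from anthropic import Anthropic

trace_id = "session-123"
client = sdk.anthropic(
    Anthropic(
        api_key="YOUR_WHYOPS_API_KEY",
        # Same continuity headers as the OpenAI example above.
        default_headers={"X-Trace-ID": trace_id, "X-Thread-ID": trace_id},
    )
)

message = client.messages.create(
    model="claude-sonnet-4-5",
    max_tokens=512,
    messages=[{"role": "user", "content": "Where is order 123?"}],
)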

5. Add runtime traces only where you need more visibility

Start with proxy-only instrumentation. Add trace() events when you need (see the sketch after this list):
  • tool execution latency and outputs
  • retries inside your framework
  • runtime failures after the model returns
  • prompt-caching-aware usage on manual llm_response() calls
  • exposed thinking blocks or orchestration milestones
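
As an illustration of where such an event sits relative to the proxied call, here is a hypothetical sketch of a tool-execution trace. The trace() method name comes from this page, but the event type and field names below are assumptions, not the documented signature; see Runtime Events for the real API:

import time

start = time.monotonic()
result = search_orders(order_id="123")  # your own tool implementation

# Hypothetical call shape: the keyword arguments are assumptions.
sdk.trace(
    event="tool_call",
    trace_id=trace_id,  # reuse the same trace ID as the proxied model call
    name="search_orders",
    input={"orderId": "123"},
    output=result,
    duration_ms=int((time.monotonic() - start) * 1000),
)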

Next pages

Proxy Helpers

Understand which API key goes where and what the helper changes on the provider client.

Runtime Events

Add manual event coverage for tools, thinking blocks, embeddings, and errors.

Advanced Patterns

Finish with hybrid tracing, self-hosting, prompt caching, and event IDs.