Documentation Index

Fetch the complete documentation index at: https://whyops.com/docs/llms.txt

Use this file to discover all available pages before exploring further.

Read this page once the Python quickstart, proxy helpers, and runtime events are already familiar.

Quickstart

Install the package, initialize the agent, and make the first proxied model call.

Proxy Helpers

Review the key flow and patched-client behavior.

Runtime Events

Review the event methods before composing hybrid flows.

Hybrid tracing

from openai import OpenAI  # assumes `sdk` is an initialized WhyOps client (see Quickstart)

trace_id = "checkout-8841"

# Share one trace_id between the proxied client and manual events so both
# land in the same trace.
client = sdk.openai(OpenAI(api_key="YOUR_WHYOPS_API_KEY"))
client.default_headers = {
    **(client.default_headers or {}),
    "X-Trace-ID": trace_id,
    "X-Thread-ID": trace_id,
}
trace = sdk.trace(trace_id)

# Record the tool request and keep the returned span_id.
span_id = trace.tool_call_request_sync(
    "charge_card",
    [{"name": "charge_card", "arguments": {"amount": 4999, "currency": "usd"}}],
)

result = charge_card()  # your own tool implementation

# Reuse the same span_id so the request and response pair into one span.
trace.tool_call_response_sync(
    "charge_card",
    span_id,
    [{"name": "charge_card", "arguments": {"amount": 4999, "currency": "usd"}}],
    result,
)
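Because forgetting to reuse span_id across the request and response events is an easy mistake, it can help to wrap the pair in a small context manager. This is an illustrative wrapper, not part of the WhyOps SDK; `traced_tool_call` and its `outcome` dict are names invented for this sketch:

```python
from contextlib import contextmanager

@contextmanager
def traced_tool_call(trace, name: str, arguments: dict):
    """Open a tool span, yield a dict for the result, then close the span.

    The span_id returned by the request event is reused automatically for
    the response event, even if the tool raises. Illustrative only.
    """
    calls = [{"name": name, "arguments": arguments}]
    span_id = trace.tool_call_request_sync(name, calls)
    outcome = {}
    try:
        yield outcome
    finally:
        # Fires on both success and failure; result is None if never set.
        trace.tool_call_response_sync(name, span_id, calls, outcome.get("result"))
```

Usage would look like `with traced_tool_call(trace, "charge_card", {"amount": 4999, "currency": "usd"}) as out: out["result"] = charge_card()`, so the response event is emitted even when the tool raises.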

Prompt caching-aware usage

await trace.llm_response(
    "anthropic/claude-sonnet-4-5",
    "anthropic",
    "Done.",
    usage={
        "promptTokens": 1200,
        "completionTokens": 240,
        "totalTokens": 9940,
        "cacheReadTokens": 8200,      # tokens served from cache
        "cacheCreationTokens": 300,   # tokens written into cache
    },
    latency_ms=860,
)
Use cacheReadTokens for tokens served from cache and cacheCreationTokens for tokens written into cache when your runtime exposes those values.
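These fields also let you sanity-check how much of the input was actually served from cache. The helper below is a sketch, not an SDK function, and it assumes promptTokens excludes cached input tokens; verify how your runtime reports these fields before relying on the ratio:

```python
def cache_read_ratio(usage: dict) -> float:
    """Fraction of input tokens served from cache (hypothetical helper).

    Assumes promptTokens counts only non-cached input, so total input is
    promptTokens + cacheReadTokens. Adjust if your runtime differs.
    """
    read = usage.get("cacheReadTokens", 0)
    total_input = usage.get("promptTokens", 0) + read
    return read / total_input if total_input else 0.0

# With the usage payload above: 8200 cached of 9400 input tokens.
ratio = cache_read_ratio({"promptTokens": 1200, "cacheReadTokens": 8200})
```

A ratio near zero on repeated calls with a large shared prefix usually means the cache is not being hit at all.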

Self-hosted overrides

sdk = WhyOps(
    api_key="YOUR_WHYOPS_API_KEY",
    agent_name="customer-support-agent",
    agent_metadata={"systemPrompt": "You are helpful.", "tools": []},
    proxy_base_url="https://proxy.internal.example.com",          # self-hosted proxy endpoint
    analyse_base_url="https://analyse.internal.example.com/api",  # self-hosted analysis API
)
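When the override URLs differ per environment, reading them once from environment variables avoids hard-coding them. A minimal sketch; the WHYOPS_* variable names and the helper are assumptions for illustration, not an SDK convention:

```python
import os

def whyops_kwargs() -> dict:
    """Build WhyOps constructor kwargs, adding base-URL overrides only
    when the (hypothetical) WHYOPS_* environment variables are set."""
    kwargs = {"api_key": os.environ["WHYOPS_API_KEY"]}
    if proxy := os.environ.get("WHYOPS_PROXY_BASE_URL"):
        kwargs["proxy_base_url"] = proxy
    if analyse := os.environ.get("WHYOPS_ANALYSE_BASE_URL"):
        kwargs["analyse_base_url"] = analyse
    return kwargs
```

Then construct the client once per process, e.g. `sdk = WhyOps(**whyops_kwargs(), agent_name="customer-support-agent")`, rather than overriding URLs request-by-request.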

Event IDs you should understand

Field             When to set it
span_id           Use the same value across tool_call_request() and tool_call_response()
step_id           Set it only if your framework already has stable ordered steps
parent_step_id    Use it when you want explicit tree structure instead of backend inference
idempotency_key   Set it when retries or queue replay could submit the same event twice
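For idempotency_key, one workable approach is to derive a deterministic key from the trace, span, and payload, so a retried or replayed submission produces the same key. The helper and key layout below are illustrative choices, not an SDK requirement:

```python
import hashlib
import json

def event_idempotency_key(trace_id: str, span_id: str, payload: dict) -> str:
    """Deterministic key: identical retries hash to the same 32-char value."""
    # sort_keys makes the key independent of dict insertion order.
    body = json.dumps(payload, sort_keys=True, separators=(",", ":"))
    digest = hashlib.sha256(f"{trace_id}:{span_id}:{body}".encode()).hexdigest()
    return digest[:32]
```

Since the key depends only on the event's content, a queue replay of the same event yields the same key, while any change to the payload yields a different one.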

Common mistakes

  • Calling trace() methods with unstable trace_id values, which splits one workflow across multiple traces.
  • Emitting a manual llm_response() event for a model turn the proxy already captured, producing two representations of the same turn.
  • Forgetting to reuse span_id across tool request and response events.
  • Overriding URLs request-by-request instead of setting proxy_base_url and analyse_base_url once on the WhyOps client.