Documentation Index

Fetch the complete documentation index at: https://whyops.com/docs/llms.txt

Use this file to discover all available pages before exploring further.

WhyOps acts as a drop-in API proxy for LLM providers like OpenAI and Anthropic. You do not need to rewrite your agent logic to start capturing traces.

1. Get your WhyOps API Key

  1. Log in to the WhyOps Dashboard.
  2. Create a new Project and Environment.
  3. Navigate to API Keys and generate a new key.
  4. Add your provider API keys (OpenAI, Anthropic) in the Providers section of the dashboard.

2. Register your agent

Before sending traffic, register the agent configuration you want WhyOps to track.

curl -X POST https://proxy.whyops.com/v1/agents/init \
  -H "Authorization: Bearer $WHYOPS_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "agentName": "my-customer-support-agent",
    "metadata": {
      "systemPrompt": "You are a helpful support assistant.",
      "description": "Production support agent",
      "tools": [
        {
          "name": "search_orders",
          "inputSchema": "{\"type\":\"object\",\"properties\":{\"orderId\":{\"type\":\"string\"}},\"required\":[\"orderId\"]}",
          "outputSchema": "{\"type\":\"object\",\"properties\":{\"status\":{\"type\":\"string\"}}}",
          "description": "Look up order status"
        }
      ]
    }
  }'

WhyOps creates or reuses a versioned agent definition from this payload.
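The same registration can also be issued from Node using the global fetch API. The sketch below mirrors the curl payload above; the helper names `buildAgentPayload` and `registerAgent` are illustrative, not part of a published WhyOps SDK.

```typescript
// Illustrative sketch: register an agent definition with the WhyOps proxy.
// Endpoint and payload shape are taken from the curl example; helper names
// are hypothetical.
interface ToolDefinition {
  name: string;
  description: string;
  inputSchema: string;  // JSON Schema, serialized as a string
  outputSchema: string; // JSON Schema, serialized as a string
}

function buildAgentPayload(
  agentName: string,
  systemPrompt: string,
  tools: ToolDefinition[],
) {
  return {
    agentName,
    metadata: {
      systemPrompt,
      description: 'Production support agent',
      tools,
    },
  };
}

async function registerAgent(apiKey: string): Promise<void> {
  const payload = buildAgentPayload(
    'my-customer-support-agent',
    'You are a helpful support assistant.',
    [
      {
        name: 'search_orders',
        description: 'Look up order status',
        inputSchema: JSON.stringify({
          type: 'object',
          properties: { orderId: { type: 'string' } },
          required: ['orderId'],
        }),
        outputSchema: JSON.stringify({
          type: 'object',
          properties: { status: { type: 'string' } },
        }),
      },
    ],
  );

  const res = await fetch('https://proxy.whyops.com/v1/agents/init', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${apiKey}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(payload),
  });
  if (!res.ok) throw new Error(`agent init failed: ${res.status}`);
}
```

Because the endpoint is idempotent with respect to the agent definition, re-running this at startup is safe: WhyOps versions the definition rather than duplicating it.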

3. Choose your integration path

TypeScript / JavaScript SDK

Use @whyops/sdk when you want a single package for agent init, proxy patching, and manual runtime events.

Python SDK

Use whyops for sync and async tracing, plus OpenAI and Anthropic proxy helpers.

Go SDK

Use the Go module for manual events and a proxy-aware http.Client transport.

If you want the fastest path with the published SDK packages, start at SDK Packages and then move into the TypeScript SDK, Python SDK, or Go SDK section in the sidebar.

4. Update your Base URL

Configure your existing OpenAI or Anthropic SDKs to point to the WhyOps proxy.

import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: process.env.WHYOPS_API_KEY,
  baseURL: 'https://proxy.whyops.com/v1',
  defaultHeaders: {
    'X-Agent-Name': 'my-customer-support-agent'
  }
});

// The proxy automatically routes the request to OpenAI 
// using the provider credentials stored in your WhyOps dashboard.
const completion = await openai.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'Hello!' }],
});

You must provide the X-Agent-Name header on every request to identify which agent is making the call.
If your framework exposes a natural request context, you can also send X-Trace-ID or X-Thread-ID. If you do not, WhyOps automatically uses Invisible Signatures to keep the trace stitched together.
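If you prefer not to pull in the OpenAI SDK, the proxy's OpenAI-compatible endpoint can be called directly with fetch while forwarding the headers described above. This is a sketch: `buildTraceHeaders` and `chat` are illustrative helpers, and the `/v1/chat/completions` path assumes the proxy mirrors the OpenAI REST layout implied by the baseURL above.

```typescript
// Illustrative helper: assemble the WhyOps identification headers.
// X-Agent-Name is required; X-Trace-ID and X-Thread-ID are optional.
function buildTraceHeaders(
  agentName: string,
  traceId?: string,
  threadId?: string,
): Record<string, string> {
  const headers: Record<string, string> = { 'X-Agent-Name': agentName };
  if (traceId) headers['X-Trace-ID'] = traceId;    // explicit trace id, if you have one
  if (threadId) headers['X-Thread-ID'] = threadId; // conversation/thread id, if you have one
  return headers;
}

// Hypothetical direct call to the proxy's OpenAI-compatible endpoint.
async function chat(content: string, traceId?: string, threadId?: string) {
  const res = await fetch('https://proxy.whyops.com/v1/chat/completions', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${process.env.WHYOPS_API_KEY}`,
      'Content-Type': 'application/json',
      ...buildTraceHeaders('my-customer-support-agent', traceId, threadId),
    },
    body: JSON.stringify({
      model: 'gpt-4o',
      messages: [{ role: 'user', content }],
    }),
  });
  if (!res.ok) throw new Error(`proxy request failed: ${res.status}`);
  return res.json();
}
```

Reusing the same X-Trace-ID across related calls groups them into one trace; omitting it falls back to the automatic Invisible Signature stitching described above.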

5. View your Traces

Once your agent makes a request, WhyOps automatically injects Invisible Signatures to tie follow-up tool calls and context windows into a single trace. Navigate to the Traces tab in the WhyOps Dashboard to view the real-time execution graphs.

6. Add manual runtime events when needed

The proxy captures LLM traffic automatically, but you can also send manual tool/runtime events to whyops-analyse for richer graphs and better static analysis. Use this when you want to log:
  • custom tool execution
  • framework retries
  • runtime errors
  • internal reasoning or orchestration milestones
See Manual Events.
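The bullets above could be reported with a sketch like the following. Both the `/v1/events` path and the event fields are assumptions for illustration; the authoritative schema is on the Manual Events page.

```typescript
// Hypothetical sketch of a manual runtime event. The endpoint path and the
// RuntimeEvent shape are illustrative assumptions, not the documented API.
interface RuntimeEvent {
  agentName: string;
  traceId: string;
  type: 'tool_execution' | 'retry' | 'error' | 'milestone';
  name: string;
  payload?: unknown;
  timestamp: string; // ISO 8601
}

// Build an event describing a custom tool execution.
function makeToolEvent(
  traceId: string,
  toolName: string,
  payload: unknown,
): RuntimeEvent {
  return {
    agentName: 'my-customer-support-agent',
    traceId,
    type: 'tool_execution',
    name: toolName,
    payload,
    timestamp: new Date().toISOString(),
  };
}

async function sendEvent(event: RuntimeEvent): Promise<void> {
  // Assumed endpoint; consult the Manual Events page for the real path.
  const res = await fetch('https://proxy.whyops.com/v1/events', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${process.env.WHYOPS_API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(event),
  });
  if (!res.ok) throw new Error(`event submission failed: ${res.status}`);
}
```

Sending the same traceId used for the proxied LLM calls lets whyops-analyse place these manual events on the same execution graph.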

Next Steps