@whyops/sdk is the main WhyOps package for Node.js, Bun, and modern server-side TypeScript apps. This page covers the basic setup path: install the package, register the agent, then route OpenAI or Anthropic traffic through WhyOps.
Start with this page to create the base WhyOps client, then use the companion wrapper on the Vercel AI SDK integration page.

Proxy Helpers

Read this next for the exact API-key flow and what whyops.openai() or whyops.anthropic() mutates on the provider client.

Runtime Events

Add tool spans, cache-aware token usage, thinking events, and runtime failures.

Advanced Patterns

Hybrid tracing, self-hosted overrides, prompt caching, and common mistakes.

Vercel AI SDK

Wrap generateText(), streamText(), and embedding helpers when your app uses the ai package.

Before you start

You need the following, and why:
  • WHYOPS_API_KEY: authenticates agent init, manual events, and proxied model traffic to WhyOps
  • A stable agentName: keeps proxy traffic and runtime events attached to the same agent identity
  • systemPrompt and tool metadata: lets WhyOps version the agent and show the right configuration in the UI
  • Your provider key in the WhyOps dashboard: lets WhyOps authenticate upstream when it forwards OpenAI or Anthropic traffic
  • A stable session or trace ID: gives explicit continuity between proxied model calls and later tool or runtime events
In TypeScript, the cleanest flow is: create the WhyOps client, call await whyops.initAgent() during startup, then patch OpenAI or Anthropic, and only after that add manual runtime events.

1. Install the package

npm install @whyops/sdk
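
If your project uses pnpm, Yarn, or Bun instead, the equivalent installs are:

pnpm add @whyops/sdk
yarn add @whyops/sdk
bun add @whyops/sdk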

2. Create the WhyOps client once

import { WhyOps } from '@whyops/sdk';

// One shared client for the whole app; import it wherever you route traffic or emit events.
export const whyops = new WhyOps({
  apiKey: process.env.WHYOPS_API_KEY!,
  agentName: 'customer-support-agent',
  agentMetadata: {
    systemPrompt: 'You are a precise customer support assistant.',
    description: 'Handles support, billing, and order status flows.',
    tools: [
      {
        name: 'search_orders',
        description: 'Look up order state by ID',
        inputSchema: JSON.stringify({
          type: 'object',
          properties: { orderId: { type: 'string' } },
          required: ['orderId'],
        }),
        outputSchema: JSON.stringify({
          type: 'object',
          properties: { status: { type: 'string' } },
        }),
      },
    ],
  },
});
If you include inputSchema or outputSchema, pass JSON strings with JSON.stringify(...). If the agent has no tools, set tools: [] explicitly so the registered definition stays clear.
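
For contrast, a minimal tool-less registration might look like the sketch below; the agent name and prompt are illustrative, and we assume description can simply be omitted.

export const whyops = new WhyOps({
  apiKey: process.env.WHYOPS_API_KEY!,
  agentName: 'faq-bot', // hypothetical agent name
  agentMetadata: {
    systemPrompt: 'You answer questions from the public FAQ.',
    tools: [], // explicit empty list, per the note above
  },
});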

3. Initialize the agent during startup

await whyops.initAgent();
You do not have to call this manually, because trace() auto-initializes on the first event, but calling it during boot is the better default: agent registration problems then fail early, before any traffic flows.
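
A minimal sketch of that fail-fast boot path, assuming a typical Node entry point and the shared client from step 2:

import { whyops } from './whyops';

async function start() {
  try {
    // Register the agent before any model traffic flows.
    await whyops.initAgent();
  } catch (err) {
    // A bad API key or a malformed tool schema surfaces here, not mid-request.
    console.error('WhyOps agent registration failed', err);
    process.exit(1);
  }
  // ...start your HTTP server or worker here.
}

start();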

4. Make your first proxied model call

import OpenAI from 'openai';
import { whyops } from './whyops';

const traceId = 'session-123';
// The WhyOps key authenticates to the proxy; your OpenAI key stays in the
// WhyOps dashboard and is attached upstream.
const openai = whyops.openai(
  new OpenAI({ apiKey: process.env.WHYOPS_API_KEY }),
);

// Pin both headers to one stable ID so later runtime events can join this trace.
(openai as any).defaultHeaders = {
  ...(openai as any).defaultHeaders,
  'X-Trace-ID': traceId,
  'X-Thread-ID': traceId,
};

const response = await openai.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'Where is order 123?' }],
});
The helper mutates the provider client in place. See Proxy Helpers for the exact key flow, header behavior, and OpenAI versus Anthropic details.
The proxy can generate a trace automatically, but the backend checks X-Trace-ID and X-Thread-ID first. If your app later emits tool or runtime events, reusing the same explicit trace ID is the cleaner and more reliable setup.
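
A minimal sketch of keeping one stable trace ID per session, assuming your app already has its own session identifier:

import { randomUUID } from 'node:crypto';

// One trace ID per app session, created on first use and reused afterwards.
const traceIds = new Map<string, string>();

function traceIdFor(sessionId: string): string {
  let id = traceIds.get(sessionId);
  if (!id) {
    id = `session-${randomUUID()}`; // any stable, unique string works
    traceIds.set(sessionId, id);
  }
  return id;
}

Set the returned value on both X-Trace-ID and X-Thread-ID exactly as in step 4.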

5. Add runtime traces only where you need more visibility

Start with proxy-only instrumentation. Add trace() events when you need any of the following (a timing sketch follows the list):
  • tool execution latency and outputs
  • retries inside your framework
  • runtime failures after the model returns
  • prompt-caching-aware usage on manual llmResponse() calls
  • exposed thinking blocks or orchestration milestones
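
As an orientation sketch only: searchOrders below is a hypothetical tool implementation, and the actual trace() event shape is documented on the Runtime Events page.

// Hypothetical tool; substitute your real implementation.
declare function searchOrders(args: { orderId: string }): Promise<{ status: string }>;

const startedAt = Date.now();
const result = await searchOrders({ orderId: '123' });
const latencyMs = Date.now() - startedAt;
console.log(`search_orders took ${latencyMs}ms`, result.status);

// Report the span via whyops.trace(...) with the same trace ID as the proxied
// call in step 4; see Runtime Events for the exact event payload.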

Next pages

Proxy Helpers

Understand which API key goes where and what the helper changes on the provider client.

Runtime Events

Add manual event coverage for tool spans, thinking blocks, embeddings, and errors.

Advanced Patterns

Finish with hybrid tracing, self-hosting, prompt caching, and event IDs.