Documentation Index
Fetch the complete documentation index at: https://whyops.com/docs/llms.txt
Use this file to discover all available pages before exploring further.
@whyops/sdk is the main WhyOps package for Node.js, Bun, and modern server-side TypeScript apps. This page is the basic setup path: install the package, register the agent, then route OpenAI or Anthropic traffic through WhyOps.
Start with this page to create the base WhyOps client, then use the companion wrapper on the Vercel AI SDK integration page.

- **Proxy Helpers**: read this next for the exact API-key flow and what `whyops.openai()` or `whyops.anthropic()` mutates.
- **Runtime Events**: add tool spans, cache-aware token usage, thinking events, and runtime failures.
- **Advanced Patterns**: hybrid tracing, self-hosted overrides, prompt caching, and common mistakes.
- **Vercel AI SDK**: wrap `generateText()`, `streamText()`, and the embedding helpers when your app uses `ai`.

## Before you start
| You need | Why |
|---|---|
| `WHYOPS_API_KEY` | Authenticates agent init, manual events, and proxied model traffic to WhyOps |
| A stable `agentName` | Keeps proxy traffic and runtime events attached to the same agent identity |
| `systemPrompt` and tool metadata | Lets WhyOps version the agent and show the right configuration in the UI |
| Your provider key in the WhyOps dashboard | Lets WhyOps authenticate upstream when it forwards OpenAI or Anthropic traffic |
| A stable session or trace ID | Gives explicit continuity between proxied model calls and later tool or runtime events |
In TypeScript, the cleanest flow is: create the WhyOps client, call `await whyops.initAgent()` during startup, then patch OpenAI or Anthropic, and only after that add manual runtime events.

## 1. Install the package
Install `@whyops/sdk` with npm, pnpm, yarn, or bun.
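Assuming the package name `@whyops/sdk` from above, the install command for each package manager is:

```shell
# Pick the line matching your package manager.
npm install @whyops/sdk
pnpm add @whyops/sdk
yarn add @whyops/sdk
bun add @whyops/sdk
```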
## 2. Create the WhyOps client once
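A minimal sketch of the registration payload. The option names here follow this page's terminology (`agentName`, `systemPrompt`, `tools`), but the constructor and exact option shape are assumptions; check the SDK's exported types for the real signature.

```typescript
// import { WhyOps } from "@whyops/sdk"; // export name assumed

const agentOptions = {
  agentName: "support-agent", // stable identity shared by proxy traffic and runtime events
  systemPrompt: "You answer customer support questions.",
  tools: [
    {
      name: "lookup_order",
      description: "Fetch an order by its ID",
      // Schemas are JSON *strings*: build them with JSON.stringify.
      inputSchema: JSON.stringify({
        type: "object",
        properties: { orderId: { type: "string" } },
        required: ["orderId"],
      }),
    },
  ],
};

// const whyops = new WhyOps({ apiKey: process.env.WHYOPS_API_KEY, ...agentOptions });
```

If the agent has no tools, pass `tools: []` rather than omitting the field.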
If you include `inputSchema` or `outputSchema`, pass JSON strings built with `JSON.stringify(...)`. If the agent has no tools, set `tools: []` explicitly so the registered definition stays clear.

## 3. Initialize the agent during startup
`trace()` auto-initializes the agent on its first event, but calling `initAgent()` during boot is the better default: agent registration problems then fail early, at startup, instead of surfacing mid-request.
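A sketch of failing fast at boot. Only `initAgent()` is taken from this page; the structural type below is a stand-in for the real client.

```typescript
// Register the agent during startup so registration problems surface early.
async function bootWhyOps(whyops: { initAgent(): Promise<void> }): Promise<void> {
  try {
    await whyops.initAgent();
  } catch (err) {
    console.error("WhyOps agent registration failed:", err);
    process.exit(1); // refuse to serve traffic with an unregistered agent
  }
}
```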
## 4. Make your first proxied model call
The flow is the same for OpenAI and Anthropic; pick whichever provider client your app already uses.
The helper mutates the provider client in place. Go to TypeScript SDK Proxy Helpers for the exact key flow, header behavior, and OpenAI versus Anthropic details.
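Putting it together for OpenAI (the Anthropic flow is symmetric via `whyops.anthropic()`). This sketch assumes the standard `openai` client and a live key, so treat it as illustrative rather than copy-paste exact; the precise key and header behavior is on the Proxy Helpers page.

```typescript
import OpenAI from "openai";

// `whyops` is the client created in step 2.
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Mutates the client in place so subsequent calls route through WhyOps.
whyops.openai(openai);

// From here, use the client exactly as before.
const completion = await openai.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Hello from WhyOps" }],
});
```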
The proxy can generate a trace automatically, but the backend checks the `X-Trace-ID` and `X-Thread-ID` headers first. If your app later emits tool or runtime events, reusing the same explicit trace ID is the cleaner and more reliable setup.

## 5. Add runtime traces only where you need more visibility
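One way to make that continuity explicit is to mint a single trace ID per user request and reuse it everywhere. The header names come from the note above; how you attach them to the provider client (for example, per-request headers) is covered on the Proxy Helpers page.

```typescript
import { randomUUID } from "node:crypto";

// One trace ID per user request; reuse it for every proxied call and
// every later trace() event belonging to the same request.
const traceId = randomUUID();

const whyopsHeaders = {
  "X-Trace-ID": traceId,      // checked first by the WhyOps backend
  "X-Thread-ID": "session-1", // stable session identity (value is illustrative)
};
```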
Start with proxy-only instrumentation. Add `trace()` events when you need:
- tool execution latency and outputs
- retries inside your framework
- runtime failures after the model returns
- prompt-caching-aware usage on manual `llmResponse()` calls
- exposed thinking blocks or orchestration milestones
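As a sketch of the tool-span case, here is a small wrapper that times a tool call and reports it via `trace()`. The event fields (`type`, `name`, `durationMs`) are illustrative assumptions; the Runtime Events page documents the real payload.

```typescript
// Times a tool call and emits a trace event; rethrows on failure so
// runtime errors are reported without being swallowed.
async function tracedTool<T>(
  whyops: { trace(event: Record<string, unknown>): void }, // structural stand-in for the real client
  name: string,
  fn: () => Promise<T>,
): Promise<T> {
  const start = Date.now();
  try {
    const output = await fn();
    whyops.trace({ type: "tool", name, durationMs: Date.now() - start, output });
    return output;
  } catch (err) {
    whyops.trace({ type: "error", name, durationMs: Date.now() - start, error: String(err) });
    throw err;
  }
}
```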
## Next pages

- **Proxy Helpers**: understand which API key goes where and what the helper changes on the provider client.
- **Runtime Events**: add manual event coverage for tool spans, thinking blocks, embeddings, and errors.
- **Advanced Patterns**: finish with hybrid tracing, self-hosting, prompt caching, and event IDs.