WhyOps ships three primary language SDKs plus companion integration packages for the Vercel AI SDK and LangChain.js, so you can instrument agents without stitching the proxy and events API together manually.
Documentation Index
Fetch the complete documentation index at: https://whyops.com/docs/llms.txt
Use this file to discover all available pages before exploring further.
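For example, pulling the index from any TypeScript script takes a plain fetch, no SDK needed:

```ts
// Pull the documentation index and print the page list.
const res = await fetch("https://whyops.com/docs/llms.txt");
console.log(await res.text());
```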
TypeScript / JavaScript
Start with the TypeScript quickstart, then move to proxy helpers, runtime events, and advanced patterns.
Python
Start with the Python quickstart, then move to proxy helpers, runtime events, and advanced sync or async patterns.
Go
Start with the Go quickstart, then move to proxy transport and runtime events.
Vercel AI SDK
Use this companion package when your TypeScript app is built on ai and you want wrapper-based traces for text generation, tools, streaming, and embeddings.
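A rough sketch of the wrapper pattern. The ai imports are the real Vercel AI SDK API; withWhyOps is a placeholder name for whatever wrapper @whyops/vercel-ai-sdk actually exports (see its page):

```ts
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
// Placeholder export name; check the Vercel AI SDK page for the
// actual wrapper shipped by @whyops/vercel-ai-sdk.
import { withWhyOps } from "@whyops/vercel-ai-sdk";

const { text } = await generateText({
  // Wrapping the model is one common shape for wrapper-based traces.
  model: withWhyOps(openai("gpt-4o-mini")),
  prompt: "Summarize today's incident reports.",
});
```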
LangChain JS (Beta)
Use this companion package when your TypeScript app is built on LangChain.js and you want automatic traces for any chain, agent, or tool.
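A rough sketch of the callback pattern. WhyOpsCallbackHandler is a placeholder name for the package's handler; passing callbacks through the invoke config is standard LangChain.js:

```ts
import { ChatOpenAI } from "@langchain/openai";
// Placeholder class name; see the LangChain JS page for the actual
// export from @whyops/langchain-js.
import { WhyOpsCallbackHandler } from "@whyops/langchain-js";

const handler = new WhyOpsCallbackHandler();
const model = new ChatOpenAI({ model: "gpt-4o-mini" });

// The handler rides LangChain's callback system, so the same pattern
// traces chains, agent loops, and tool executions too.
const reply = await model.invoke("Classify this support ticket.", {
  callbacks: [handler],
});
```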
The recommended order
1. Choose your language package
Pick the TypeScript, Python, or Go section in the sidebar based on the service you are instrumenting. If the app uses the Vercel AI SDK, add the companion package after the TypeScript client is in place.
2. Create a WhyOps client with stable agent metadata
Define agentName, systemPrompt, and tool metadata once so WhyOps can version the agent correctly.
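A minimal TypeScript sketch. The WhyOpsClient constructor and apiKey option are placeholders; agentName, systemPrompt, and the tool metadata are the parts these docs name:

```ts
// Constructor shape is illustrative; only agentName, systemPrompt,
// and tool metadata are documented field names.
import { WhyOpsClient } from "@whyops/sdk";

export const whyops = new WhyOpsClient({
  apiKey: process.env.WHYOPS_API_KEY!, // assumed option
  agentName: "support-triage-agent",
  systemPrompt: "You are a support triage assistant...",
  tools: [{ name: "lookup_ticket", description: "Fetch a ticket by id" }],
});
```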
3. Initialize the agent early
Call initAgent() (TypeScript), init_agent_sync() (Python), or InitAgent(ctx) (Go) during boot so registration errors surface before runtime traffic.
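In TypeScript, assuming initAgent() is a method on the client from step 2, boot-time initialization might look like:

```ts
import { whyops } from "./whyops"; // the client from step 2

// Run during boot, before serving traffic, so a bad API key or
// malformed agent metadata fails fast instead of at first request.
try {
  await whyops.initAgent();
} catch (err) {
  console.error("WhyOps agent registration failed:", err);
  process.exit(1);
}
```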
4. Route model calls through the proxy
Use the package proxy helper or transport so your OpenAI or Anthropic traffic goes through WhyOps.
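For orientation, routing the official OpenAI client through the hosted proxy amounts to overriding its base URL. The /v1 suffix and header name below are assumptions; in practice the package's proxy helper produces this configuration for you (see the proxy helpers page):

```ts
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  // Hosted proxy default from the table below; the /v1 suffix is an
  // assumption to mirror the OpenAI API shape.
  baseURL: "https://proxy.whyops.com/v1",
  // Header name is an assumption; the real proxy helper sets whatever
  // auth the proxy actually expects.
  defaultHeaders: { "x-whyops-key": process.env.WHYOPS_API_KEY ?? "" },
});

const completion = await client.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Hello from behind the proxy" }],
});
console.log(completion.choices[0].message.content);
```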
Primary language SDKs
| Capability | TypeScript | Python | Go |
|---|---|---|---|
| Hosted defaults for proxy + analyse URLs | Yes | Yes | Yes |
| Automatic agent init before first event | Yes | Yes | Yes |
| OpenAI proxy helper | Yes | Yes | Via ProxyHTTPClient() |
| Anthropic proxy helper | Yes | Yes | Via ProxyHTTPClient() |
| Manual runtime events | Yes | Yes | Yes |
| Prompt caching-aware token fields | Yes | Yes | Yes |
| Self-hosted URL overrides | Yes | Yes | Yes |
Companion packages
| Package | Requires | Best for | Captures |
|---|---|---|---|
| @whyops/vercel-ai-sdk | @whyops/sdk | Server-side TypeScript apps built on ai | generateText, streamText, multi-step tool calls, embed, embedMany |
| @whyops/langchain-js (Beta) | @whyops/sdk | TypeScript apps built on LangChain.js | Any chain, agent, LLM call, and tool execution via the LangChain callback system |
Install
- npm: @whyops/sdk (plus the @whyops/vercel-ai-sdk and @whyops/langchain-js companions)
- PyPI: the Python SDK
- Go Modules: the Go SDK
Hosted defaults used by the SDKs
All three packages use these defaults when you do not override them:

| Setting | Default |
|---|---|
| Proxy base URL | https://proxy.whyops.com |
| Analyse base URL | https://a.whyops.com/api |
| Agent init fallback path | /v1/agents/init |
| Manual events ingest path | /events/ingest |
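For self-hosted deployments, overriding both URLs at client construction might look like this in TypeScript. The option names proxyBaseUrl and analyseBaseUrl are placeholders; the settings themselves match the table above:

```ts
import { WhyOpsClient } from "@whyops/sdk";

// Option names are assumed; the two overridable settings (proxy base
// URL and analyse base URL) come from the defaults table above.
const whyops = new WhyOpsClient({
  apiKey: process.env.WHYOPS_API_KEY!,
  agentName: "support-triage-agent",
  proxyBaseUrl: "https://whyops-proxy.internal.example.com",
  analyseBaseUrl: "https://whyops.internal.example.com/api",
});
```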
The SDK event payloads already support cacheReadTokens and cacheCreationTokens inside usage metadata, so you can report cache-aware token usage when you emit manual llm_response events.
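A hedged TypeScript sketch of such an event. The method name trackEvent and the other payload fields are placeholders; llm_response, cacheReadTokens, and cacheCreationTokens are the names these docs give:

```ts
import { whyops } from "./whyops"; // client from earlier

await whyops.trackEvent({
  type: "llm_response",
  model: "claude-3-5-sonnet",
  usage: {
    inputTokens: 1200,      // field name assumed
    outputTokens: 350,      // field name assumed
    cacheReadTokens: 900,   // documented cache-aware usage field
    cacheCreationTokens: 0, // documented cache-aware usage field
  },
});
```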
Choose your section in the sidebar
TypeScript SDK
Best for Node.js, Bun, and modern server-side TypeScript services using OpenAI or Anthropic SDKs.
Python SDK
Best for synchronous APIs, async workers, and Python services that need one package for proxying and runtime traces.
Go SDK
Best for backend services that want a trace builder plus a proxy-aware http.Client transport.
Vercel AI SDK
Best for TypeScript services using the Vercel AI SDK that want automatic capture around text generation, tools, reasoning, streaming, and embeddings.
LangChain JS (Beta)
Best for TypeScript services using LangChain.js that want automatic capture across any chain, agent loop, or tool via the LangChain callback system.