
Documentation Index

Fetch the complete documentation index at: https://whyops.com/docs/llms.txt

Use this file to discover all available pages before exploring further.

WhyOps ships three primary language SDKs plus companion integration packages for Vercel AI SDK and LangChain.js, so you can instrument agents without stitching the proxy and events API together manually.

TypeScript / JavaScript

Start with the TypeScript quickstart, then move to proxy helpers, runtime events, and advanced patterns.

Python

Start with the Python quickstart, then move to proxy helpers, runtime events, and advanced sync or async patterns.

Go

Start with the Go quickstart, then move to proxy transport and runtime events.

Vercel AI SDK

Use this companion package when your TypeScript app is built on the ai package and you want wrapper-based traces for text generation, tools, streaming, and embeddings.

LangChain JS (Beta)

Use this companion package when your TypeScript app is built on LangChain.js and you want automatic traces for any chain, agent, or tool.

1. Choose your language package

Pick the TypeScript, Python, or Go section in the sidebar based on the service you are instrumenting. If the app uses the Vercel AI SDK, add the companion package after the TypeScript client is in place.

2. Create a WhyOps client with stable agent metadata

Define agentName, systemPrompt, and tool metadata once so WhyOps can version the agent correctly.
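A minimal sketch of what a shared metadata module could look like. Only agentName, systemPrompt, and the idea of tool metadata come from this page; the ToolMeta shape and the example values are assumptions for illustration.

```typescript
// Keep the agent definition in one module so every entry point registers
// the exact same metadata and WhyOps can version the agent consistently.
// Field names beyond agentName/systemPrompt are hypothetical.
interface ToolMeta {
  name: string;
  description: string;
}

interface AgentConfig {
  agentName: string;
  systemPrompt: string;
  tools: ToolMeta[];
}

const agentConfig: AgentConfig = {
  agentName: "support-bot", // example value, not a required name
  systemPrompt: "You are a helpful support assistant.",
  tools: [{ name: "lookupOrder", description: "Fetch an order by ID" }],
};
```

Centralizing the definition this way avoids the drift that occurs when each service entry point declares its own slightly different prompt or tool list.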

3. Initialize the agent early

Call initAgent() (TypeScript), init_agent_sync() (Python), or InitAgent(ctx) (Go) during boot so registration errors surface before runtime traffic.
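A control-flow sketch of failing fast at boot. The initAgent() stub below is a stand-in for the SDK call named in this step, not the real implementation; only the "register during boot so errors surface early" behavior comes from the page.

```typescript
// Stand-in for the SDK's registration call: the real one would register
// agentName, systemPrompt, and tools with WhyOps and throw on rejection.
async function initAgent(): Promise<void> {
  // real SDK work happens here
}

// Run registration before the server starts so a misconfigured agent
// never receives runtime traffic.
async function boot(): Promise<boolean> {
  try {
    await initAgent();
    return true; // safe to start serving
  } catch (err) {
    console.error("WhyOps agent registration failed:", err);
    return false; // caller should exit instead of serving unregistered
  }
}
```

Returning a boolean (rather than calling process.exit inside the helper) keeps the boot sequence easy to test and lets the caller decide how to shut down.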

4. Route model calls through the proxy

Use the package proxy helper or transport so your OpenAI or Anthropic traffic goes through WhyOps.
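The exact helper names belong to each SDK's own docs, but the general pattern is to point an OpenAI- or Anthropic-compatible client's base URL at the WhyOps proxy (hosted default https://proxy.whyops.com, per the table below) instead of the vendor API. A sketch of that URL rewrite, with the path layout assumed:

```typescript
// Hosted proxy default from this page; self-hosted deployments would
// override this value.
const PROXY_BASE = "https://proxy.whyops.com";

// With the official openai package, routing would look roughly like:
//   const client = new OpenAI({ baseURL: proxiedUrl("/v1"), apiKey });
// (a sketch -- confirm the real helper in the language-specific docs).
function proxiedUrl(path: string): string {
  return new URL(path, PROXY_BASE).toString();
}
```

Because the proxy sits on the request path, every completion request and response passes through WhyOps without any per-call instrumentation in your handlers.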

5. Add runtime events where the proxy cannot see enough

Use trace events for tool execution, retries, failures, prompt-caching-aware usage, and framework orchestration.
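A hedged sketch of what a manual tool-execution event might carry. Only llm_response and the usage cache fields are named elsewhere on this page; the tool_execution type and every field below are assumptions chosen to illustrate the idea.

```typescript
// Hypothetical shape for a manual runtime event covering a tool call
// the proxy cannot observe (it only sees model traffic).
interface ToolExecutionEvent {
  type: "tool_execution";
  agentName: string;
  timestamp: string; // ISO 8601
  data: { tool: string; ok: boolean; durationMs: number };
}

function toolEvent(
  agentName: string,
  tool: string,
  ok: boolean,
  durationMs: number,
): ToolExecutionEvent {
  return {
    type: "tool_execution",
    agentName,
    timestamp: new Date().toISOString(),
    data: { tool, ok, durationMs },
  };
}
```

Events like this fill the gaps between proxied model calls: retries, tool failures, and orchestration steps that never touch the proxy.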

Primary language SDKs

| Capability | TypeScript | Python | Go |
| --- | --- | --- | --- |
| Hosted defaults for proxy + analyse URLs | Yes | Yes | Yes |
| Automatic agent init before first event | Yes | Yes | Yes |
| OpenAI proxy helper | Yes | Yes | Via ProxyHTTPClient() |
| Anthropic proxy helper | Yes | Yes | Via ProxyHTTPClient() |
| Manual runtime events | Yes | Yes | Yes |
| Prompt-caching-aware token fields | Yes | Yes | Yes |
| Self-hosted URL overrides | Yes | Yes | Yes |

Companion packages

| Package | Requires | Best for | Captures |
| --- | --- | --- | --- |
| @whyops/vercel-ai-sdk | @whyops/sdk | Server-side TypeScript apps built on ai | generateText, streamText, multi-step tool calls, embed, embedMany |
| @whyops/langchain-js (Beta) | @whyops/sdk | TypeScript apps built on LangChain.js | Any chain, agent, LLM call, and tool execution via the LangChain callback system |

Install

npm install @whyops/sdk

Hosted defaults used by the SDKs

All three packages use these defaults when you do not override them:
| Setting | Default |
| --- | --- |
| Proxy base URL | https://proxy.whyops.com |
| Analyse base URL | https://a.whyops.com/api |
| Agent init fallback path | /v1/agents/init |
| Manual events ingest path | /events/ingest |
The SDK event payloads already support cacheReadTokens and cacheCreationTokens inside usage metadata, so you can report cache-aware token usage when you emit manual llm_response events.
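A sketch of usage metadata carrying those fields. cacheReadTokens and cacheCreationTokens come straight from this page; the surrounding field names (inputTokens, outputTokens) and the example numbers are assumptions.

```typescript
// Hypothetical usage block for a manual llm_response event; only the two
// cache fields are confirmed by the docs above.
interface Usage {
  inputTokens: number;
  outputTokens: number;
  cacheReadTokens?: number;     // tokens served from the prompt cache
  cacheCreationTokens?: number; // tokens written into the cache this call
}

const usage: Usage = {
  inputTokens: 1200,
  outputTokens: 180,
  cacheReadTokens: 900,
  cacheCreationTokens: 300,
};
```

Reporting the cache fields separately lets WhyOps distinguish genuinely new input tokens from cheaper cache reads when it aggregates usage.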

Choose your section in the sidebar

TypeScript SDK

Best for Node.js, Bun, and modern server-side TypeScript services using OpenAI or Anthropic SDKs.

Python SDK

Best for synchronous APIs, async workers, and Python services that need one package for proxying and runtime traces.

Go SDK

Best for backend services that want a trace builder plus a proxy-aware http.Client transport.

Vercel AI SDK

Best for TypeScript services using the Vercel AI SDK that want automatic capture around text generation, tools, reasoning, streaming, and embeddings.

LangChain JS (Beta)

Best for TypeScript services using LangChain.js that want automatic capture across any chain, agent loop, or tool via the LangChain callback system.