## Supported providers
| Provider | Client | Method intercepted |
|---|---|---|
| OpenAI | new OpenAI() | chat.completions.create |
| Anthropic | new Anthropic() | messages.create |
| Google | new GoogleGenerativeAI() | generateContent |
| Vercel AI | openai / anthropic / google | generateText / streamText |
## How it works
The SDK uses a JavaScript Proxy to intercept LLM client methods. When you call client.chat.completions.create(...), the proxy:
- Runs all pre-hooks on the request
- Calls the original LLM method, unless a pre-hook blocked the request
- Runs all post-hooks on the response
- Returns the (potentially modified) response
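The steps above can be sketched with a plain `Proxy` wrapping a method via its `apply` trap. This is a minimal illustration of the pattern, not the SDK's actual code: the hook shapes, the `intercept` helper, and the blocking convention (a pre-hook returning `null`) are all assumptions made for the example.

```typescript
// Hypothetical hook type: pre-hooks may rewrite the request or return
// null to block it; post-hooks may rewrite the response.
type Hook = (payload: any) => any | null;

// A stand-in for an LLM client method (not a real provider client).
const client = {
  chat: {
    completions: {
      create: (req: { prompt: string }) => ({ text: `echo: ${req.prompt}` }),
    },
  },
};

// Wrap a function in a Proxy whose `apply` trap runs the hook pipeline.
function intercept<T extends (...args: any[]) => any>(
  fn: T,
  preHooks: Hook[],
  postHooks: Hook[],
): T {
  return new Proxy(fn, {
    apply(target, thisArg, args) {
      let [req] = args;
      // 1. Run all pre-hooks on the request.
      for (const hook of preHooks) {
        const result = hook(req);
        // 2. A pre-hook returning null blocks the call.
        if (result === null) throw new Error("Request blocked by pre-hook");
        req = result;
      }
      // 3. Call the original LLM method.
      let res = Reflect.apply(target, thisArg, [req]);
      // 4. Run all post-hooks on the response.
      for (const hook of postHooks) {
        res = hook(res);
      }
      // 5. Return the (potentially modified) response.
      return res;
    },
  }) as T;
}

client.chat.completions.create = intercept(
  client.chat.completions.create,
  [(req) => ({ ...req, prompt: req.prompt.trim() })],
  [(res) => ({ ...res, text: res.text.toUpperCase() })],
);

console.log(client.chat.completions.create({ prompt: "  hi " }).text);
// → "ECHO: HI"
```

Because the wrapper is a transparent `Proxy` over the original function, callers keep using the client exactly as before; the hooks run invisibly on every call.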