# AI Billing
Middleware for the Vercel AI SDK that captures LLM usage and sends normalized billing events directly to billing platforms.

**Provider-aware cost calculation**
Understands the token pricing of each LLM provider so you don’t have to.
**Request-level tags**
Attach user, org, or any custom metadata to every billing event.
**Multiple destinations**
Forward events to Stripe, Polar, OpenMeter, Lago, or your own endpoint — all at once.
**Drop-in middleware**
Works with `wrapLanguageModel` — no changes to your existing `streamText` / `generateText` calls.

## Install

Install the provider package that matches your setup (see Supported providers below), e.g. `@ai-billing/openrouter`.
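Assuming the packages are published to npm under the names used in this README (the Vercel AI SDK itself ships as `ai`), installation would look something like:

```shell
# OpenRouter provider middleware plus the Polar destination, alongside the AI SDK
npm install ai @ai-billing/openrouter @ai-billing/polar
```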
## Quick start
### 1. Wrap your model
### 2. Add a destination and tags
Each request now sends a `BillingEvent` to Polar with cost, token counts, and your custom tags attached.
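The quick-start code itself is not included here, so the following is only a rough sketch of the pattern it describes: wrap a model call, read token usage from the result, and emit a normalized event with request-level tags. Every identifier below (`BillingEvent`, `withBilling`, `GenerateFn`) is illustrative, not the library's actual API.

```typescript
// Hypothetical shapes, for illustration only.
interface BillingEvent {
  model: string;
  inputTokens: number;
  outputTokens: number;
  costUsd: number;
  tags: Record<string, string>;
}

type GenerateFn = (prompt: string) => Promise<{
  text: string;
  usage: { inputTokens: number; outputTokens: number };
}>;

// Wrap a generate function so every call emits a billing event.
function withBilling(
  model: string,
  generate: GenerateFn,
  pricePerMTok: { input: number; output: number },
  onEvent: (e: BillingEvent) => void,
): GenerateFn {
  return async (prompt) => {
    const result = await generate(prompt);
    const { inputTokens, outputTokens } = result.usage;
    onEvent({
      model,
      inputTokens,
      outputTokens,
      // Cost from per-million-token prices.
      costUsd:
        (inputTokens * pricePerMTok.input + outputTokens * pricePerMTok.output) /
        1_000_000,
      tags: { userId: "user_123" }, // request-level tags travel with the event
    });
    return result;
  };
}
```

In the real library the wrapping happens through the AI SDK's `wrapLanguageModel`, so your `streamText` / `generateText` call sites stay unchanged.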
## Architecture

The library has two components that snap together:

**Provider middleware**
Specialized middleware for each AI SDK provider that understands `providerMetadata` shapes, extracts token usage, and calculates cost via a `PriceResolver`.

**Destinations**
Adapters that forward the normalized billing events to platforms such as Stripe, Polar, OpenMeter, and Lago.

## Supported providers
Text model support is the current priority.
Coming soon — Anthropic, Google Generative AI, Requesty, and Cloudflare AI Gateway are in active development. To request a provider, open an issue.
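The Architecture section above mentions that cost is calculated via a `PriceResolver`. The resolver interface is not shown in this README, so the following is a minimal sketch of the idea under an assumed shape: map a model id to per-million-token prices, then price the observed usage (the model name and dollar figures are example values only).

```typescript
interface ModelPrice {
  inputPerMTok: number; // USD per 1M input tokens
  outputPerMTok: number; // USD per 1M output tokens
}

// Assumed resolver shape: model id in, prices out (undefined if unknown).
type PriceResolver = (modelId: string) => ModelPrice | undefined;

// A static table; a provider-aware resolver could instead ship current
// provider price lists so callers don't have to maintain them.
const staticPrices: Record<string, ModelPrice> = {
  "openai/gpt-4o": { inputPerMTok: 2.5, outputPerMTok: 10 }, // example figures
};

const resolve: PriceResolver = (id) => staticPrices[id];

function costUsd(modelId: string, inputTokens: number, outputTokens: number): number {
  const price = resolve(modelId);
  if (!price) return 0; // unknown model: report zero rather than guess
  return (
    (inputTokens * price.inputPerMTok + outputTokens * price.outputPerMTok) /
    1_000_000
  );
}
```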
## Supported destinations
| Destination | Package |
|---|---|
| Polar.sh | @ai-billing/polar |
| Stripe | @ai-billing/stripe |
| OpenMeter (Kong) | @ai-billing/openmeter |
| Lago | @ai-billing/lago |
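The feature list promises forwarding to several destinations "all at once". The destination interface is not documented in this excerpt, so here is one way such a fan-out could be sketched, with an assumed `Destination` shape; the key design point is that one failing destination must not block delivery to the others, hence `Promise.allSettled` rather than `Promise.all`.

```typescript
interface BillingEvent {
  model: string;
  costUsd: number;
}

// Assumed shape of a destination adapter (Stripe, Polar, etc.).
interface Destination {
  name: string;
  send(event: BillingEvent): Promise<void>;
}

// Forward one event to every configured destination concurrently.
// Returns the names of destinations whose delivery failed.
async function fanOut(
  event: BillingEvent,
  destinations: Destination[],
): Promise<string[]> {
  const results = await Promise.allSettled(destinations.map((d) => d.send(event)));
  return results.flatMap((r, i) =>
    r.status === "rejected" ? [destinations[i].name] : [],
  );
}
```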
## Reference
**Core primitives**
Shared billing types, price resolvers, and destination helpers in `@ai-billing/core`.

**API reference**
Generated reference docs for every type, function, and class.