AI Billing

Middleware for the Vercel AI SDK that captures LLM usage and sends normalized billing events directly to billing platforms.

Provider-aware cost calculation

Understands the token pricing of each LLM provider so you don’t have to.

Request-level tags

Attach user, org, or any custom metadata to every billing event.

Multiple destinations

Forward events to Stripe, Polar, OpenMeter, Lago, or your own endpoint — all at once.

Drop-in middleware

Works with wrapLanguageModel — no changes to your existing streamText / generateText calls.

Install

npm install @ai-billing/core @ai-billing/openrouter
Replace @ai-billing/openrouter with the provider package that matches your setup (see Supported Providers below).

Quick start

1. Wrap your model

import { wrapLanguageModel } from 'ai';
import { createOpenRouter } from '@openrouter/ai-sdk-provider';
import { createOpenRouterV3Middleware } from '@ai-billing/openrouter';

const billingMiddleware = createOpenRouterV3Middleware({});

const model = wrapLanguageModel({
  model: createOpenRouter({
    apiKey: process.env.OPENROUTER_API_KEY,
  })('google/gemini-2.0-flash-001'),
  middleware: billingMiddleware,
});

2. Add a destination and tags

import { streamText, wrapLanguageModel } from 'ai';
import { createOpenRouter } from '@openrouter/ai-sdk-provider';
import { createOpenRouterV3Middleware } from '@ai-billing/openrouter';
import { createPolarDestination } from '@ai-billing/polar';

const billingMiddleware = createOpenRouterV3Middleware({
  destinations: [
    createPolarDestination({
      accessToken: process.env.POLAR_ACCESS_TOKEN,
      eventName: 'llm_usage',
    }),
  ],
});

const model = wrapLanguageModel({
  model: createOpenRouter({
    apiKey: process.env.OPENROUTER_API_KEY,
  })('google/gemini-2.0-flash-001'),
  middleware: billingMiddleware,
});

const { textStream } = await streamText({
  model,
  messages: [{ role: 'user', content: 'Quantify the value of metadata.' }],
  providerOptions: {
    'ai-billing-tags': { userId: 'usr_123', org: 'Acme' },
  },
});
Every completed request now emits a BillingEvent to Polar with cost, token counts, and your custom tags attached.
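To make that concrete, here is a minimal sketch of what a normalized billing event could look like. The field names below are illustrative assumptions, not the actual BillingEvent type exported by @ai-billing/core:

```typescript
// Illustrative sketch only -- the real BillingEvent shape in
// @ai-billing/core may differ.
interface BillingEvent {
  model: string;                 // model identifier the request ran against
  inputTokens: number;           // prompt tokens consumed
  outputTokens: number;          // completion tokens generated
  costUsd: number;               // cost computed by the provider middleware
  tags: Record<string, string>;  // request-level tags you attached
}

const example: BillingEvent = {
  model: 'google/gemini-2.0-flash-001',
  inputTokens: 12,
  outputTokens: 48,
  costUsd: 0.000021,
  tags: { userId: 'usr_123', org: 'Acme' },
};
```

Destinations receive one such event per completed request, so the tags you pass via providerOptions end up attached to the billing record downstream.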

Architecture

The library has two components that snap together:
1. Provider middleware

Specialized middleware for each AI SDK provider that understands providerMetadata shapes, extracts token usage, and calculates cost via a PriceResolver.
2. Destinations

Functions that receive a normalized BillingEvent and handle the API call to an external billing service. You can attach multiple destinations to a single middleware.
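Because a destination is just a function over events, writing your own is straightforward. The sketch below collects events in memory; it assumes the hypothetical BillingEvent shape described above, and the real destination signature expected by the middleware may differ:

```typescript
// Assumed event shape for illustration; see @ai-billing/core for the
// actual BillingEvent type.
type BillingEvent = {
  model: string;
  inputTokens: number;
  outputTokens: number;
  costUsd: number;
  tags: Record<string, string>;
};

// A hypothetical custom destination: instead of calling an external
// billing API, it appends each normalized event to an in-memory store.
// Useful for tests or local debugging.
function createMemoryDestination(store: BillingEvent[]) {
  return async (event: BillingEvent): Promise<void> => {
    store.push(event);
  };
}

// Usage: the middleware would invoke the returned function once per
// completed request.
const store: BillingEvent[] = [];
const destination = createMemoryDestination(store);
await destination({
  model: 'google/gemini-2.0-flash-001',
  inputTokens: 10,
  outputTokens: 20,
  costUsd: 0.00001,
  tags: { userId: 'usr_123' },
});
```

A destination swapped in this way sits alongside the built-in ones (Polar, Stripe, etc.) in the destinations array.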

Supported providers

Text model support is the current priority.

Coming soon: Anthropic, Google Generative AI, Requesty, and Cloudflare AI Gateway are in active development. To request a provider, open an issue.

Supported destinations

Destination          Package
Polar.sh             @ai-billing/polar
Stripe               @ai-billing/stripe
OpenMeter (Kong)     @ai-billing/openmeter
Lago                 @ai-billing/lago

Reference

Core primitives

Shared billing types, price resolvers, and destination helpers in @ai-billing/core.
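As a rough idea of what a price resolver does, the sketch below maps a model ID to per-million-token rates and computes a request's cost. Both the price figures and the function signature are made up for illustration; the real PriceResolver API and current provider pricing live in @ai-billing/core and the provider packages:

```typescript
// Hypothetical price table: USD per 1M tokens. These numbers are
// placeholders, not real provider pricing.
const pricesPerMillionTokens: Record<string, { input: number; output: number }> = {
  'google/gemini-2.0-flash-001': { input: 0.1, output: 0.4 },
};

// Resolve the cost of a single request from its token usage.
function resolveCostUsd(
  model: string,
  inputTokens: number,
  outputTokens: number,
): number {
  const price = pricesPerMillionTokens[model];
  if (!price) throw new Error(`No pricing entry for model: ${model}`);
  return (inputTokens * price.input + outputTokens * price.output) / 1_000_000;
}
```

The provider middleware performs this lookup per request, which is what lets it emit a costUsd on each billing event without any pricing configuration on your side.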

API reference

Generated reference docs for every type, function, and class.