Integration overview

Narev complements your observability, gateway, and FinOps tools

Narev is a router and experimentation layer that integrates with most providers, gateways, and tracing platforms. If you don't see an integration you need, let us know.

Available integrations

Providers

  • OpenAI - GPT-4 and other OpenAI models

Gateways

  • OpenRouter - Universal LLM gateway with load balancing
  • LiteLLM - Unified interface for 100+ LLMs
  • Portkey - AI gateway with routing and fallbacks
  • Helicone Gateway - AI gateway from the Helicone observability platform
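
These gateways generally expose an OpenAI-compatible API, so moving traffic between models (or between gateways) usually only means changing the base URL and model name. Below is a minimal sketch using the official OpenAI Python SDK against OpenRouter; the model names and the OPENROUTER_API_KEY environment variable are illustrative, not part of Narev's setup.

```python
# Sketch: calling two different models through one OpenAI-compatible gateway
# (OpenRouter here; LiteLLM, Portkey, and Helicone Gateway follow the same pattern).
# Model names and the OPENROUTER_API_KEY env var are illustrative.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",   # gateway endpoint instead of api.openai.com
    api_key=os.environ["OPENROUTER_API_KEY"],
)

for model in ("openai/gpt-4o-mini", "anthropic/claude-3.5-sonnet"):
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": "Summarize our refund policy in one sentence."}],
    )
    print(model, "->", response.choices[0].message.content)
```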

Observability

Narev is the missing experimentation layer

A common LLM stack has three layers:

| Layer | Tools | Purpose |
| --- | --- | --- |
| Provider | OpenAI, Claude, Groq | Generates responses |
| Narev | | Helps find optimal configuration |
| Gateway | OpenRouter, LiteLLM, Portkey | Provides common interface |
| Observability: LLM | Helicone, Langfuse, W&B | Shows LLM history |
| Observability: FinOps | Vantage, CloudZero, Finout | Tracks costs and spend |
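
In concrete terms, the experimentation layer works from a small slice of what each of the other layers produces: the provider and model that handled a request, the gateway it went through, and the latency and cost recorded by observability. The sketch below shows that kind of per-call record; the field names are hypothetical and are not Narev's schema.

```python
# Illustrative only: the kind of per-request record an experimentation layer works from.
# Field names are hypothetical and do not reflect Narev's actual schema.
from dataclasses import dataclass

@dataclass
class LLMCallRecord:
    provider: str        # e.g. "openai" (Provider layer)
    model: str           # e.g. "gpt-4o-mini"
    gateway: str         # e.g. "openrouter" (Gateway layer)
    prompt: str          # request sent to the model
    completion: str      # response the model generated
    latency_ms: float    # from LLM observability traces
    cost_usd: float      # from FinOps / cost tracking

# An experiment then amounts to replaying these records under a different
# provider or model and comparing completion quality, latency_ms, and cost_usd.
```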

How Narev complements existing tools

| Category | How Narev works together | What Narev adds |
| --- | --- | --- |
| Provider (OpenAI, Anthropic, Groq) | Tests providers side-by-side on your prompts | Identifies which provider delivers the best cost/quality/latency |
| Gateway (OpenRouter, LiteLLM, Portkey) | Tests routing strategies and fallback configurations | Determines optimal routing logic and failover paths |
| Observability: LLM (Helicone, Langfuse, W&B) | Imports production traces and runs what-if experiments | Tests configurations before and during production |
| Observability: FinOps (Vantage, CloudZero, Finout) | Connects cost data to quality-aware optimization tests | Turns spend tracking into actionable cost reduction |