# Integration overview
Narev complements your observability, gateways, and FinOps tools
Narev is a router and experimentation layer that integrates with most providers, gateways, and tracing platforms. If you don't see an integration you need, let us know.
## Available integrations
### Providers
- OpenAI - GPT-4 and other OpenAI models
### Gateways
- OpenRouter - Universal LLM gateway with load balancing
- LiteLLM - Unified interface for 100+ LLMs
- Portkey - AI gateway with routing and fallbacks
- Helicone Gateway - Helicone's AI gateway with routing and fallbacks
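Gateways like those above typically expose an OpenAI-compatible `/chat/completions` endpoint, so switching between them usually comes down to changing a base URL and API key. A minimal sketch (the base URLs below are illustrative assumptions; check each gateway's docs for the real values):

```python
import json

# Illustrative base URLs -- assumptions, not verified endpoints.
GATEWAY_BASE_URLS = {
    "openrouter": "https://openrouter.ai/api/v1",  # hosted gateway
    "litellm": "http://localhost:4000",            # self-hosted proxy
}


def chat_request(gateway: str, model: str, prompt: str) -> tuple[str, bytes]:
    """Build the URL and JSON body for an OpenAI-style chat completion call."""
    url = f"{GATEWAY_BASE_URLS[gateway]}/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return url, body


url, body = chat_request("openrouter", "openai/gpt-4o", "Hello")
```

Because the request shape stays the same across gateways, an experimentation layer can replay identical prompts through different routes and compare the results.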
### Observability
- Helicone - LLM observability and monitoring
- Langfuse - LLM engineering platform
- LangSmith - LangChain debugging and monitoring
- Weights and Biases - ML experiment tracking
## Narev is the missing experimentation layer
A common LLM stack has three layers:
| Layer | Tools | Purpose |
|---|---|---|
| Provider | OpenAI, Claude, Groq | Generates responses |
| Gateway | OpenRouter, LiteLLM, Portkey | Provides common interface |
| Observability (LLM) | Helicone, Langfuse, W&B | Shows LLM history |
| Observability (FinOps) | Vantage, CloudZero, Finout | Tracks costs and spend |

None of these layers tells you which configuration to run. Narev helps find the optimal configuration across all three.
## How Narev complements existing tools
| Category | How Narev works with it | What Narev adds |
|---|---|---|
| Provider (OpenAI, Anthropic, Groq) | Tests providers side-by-side on your prompts | Identifies which provider delivers the best cost/quality/latency |
| Gateway (OpenRouter, LiteLLM, Portkey) | Tests routing strategies and fallback configurations | Determines optimal routing logic and failover paths |
| Observability: LLM (Helicone, Langfuse, W&B) | Imports production traces and runs what-if experiments | Tests configurations before and during production |
| Observability: FinOps (Vantage, CloudZero, Finout) | Connects cost data to quality-aware optimization tests | Turns spend tracking into actionable cost reduction |
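The side-by-side comparison in the first row can be pictured as a scoring problem: given measured cost, quality, and latency per provider, normalize the metrics and pick a winner. This is a hypothetical sketch with illustrative weights and data, not Narev's actual algorithm:

```python
from dataclasses import dataclass


@dataclass
class Run:
    """One provider's measured results on a fixed prompt set (example data)."""
    provider: str
    cost_usd: float    # cost per 1K requests
    quality: float     # 0..1, e.g. eval pass rate
    latency_ms: float  # p50 latency


def best_provider(runs, w_cost=0.3, w_quality=0.5, w_latency=0.2):
    """Score each run: higher quality is better; cost and latency are
    normalized against the worst observed value and inverted."""
    max_cost = max(r.cost_usd for r in runs)
    max_lat = max(r.latency_ms for r in runs)

    def score(r):
        return (w_quality * r.quality
                + w_cost * (1 - r.cost_usd / max_cost)
                + w_latency * (1 - r.latency_ms / max_lat))

    return max(runs, key=score)


runs = [
    Run("openai", cost_usd=12.0, quality=0.92, latency_ms=800),
    Run("groq", cost_usd=2.5, quality=0.85, latency_ms=250),
]
winner = best_provider(runs)
```

With these example numbers the cheaper, faster provider wins despite slightly lower quality; shifting the weights toward quality flips the trade-off, which is exactly the kind of decision side-by-side testing is meant to inform.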