Arize

Using OpenRouter with Arize

Using Arize

Arize provides observability and tracing for LLM applications. Because OpenRouter follows the OpenAI API schema, you can use Arize's OpenInference auto-instrumentation for the OpenAI SDK to automatically trace and monitor your OpenRouter API calls.

Installation

Prerequisites

  • OpenRouter account and API key
  • Arize account with Space ID and API Key
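With those in place, install the OpenAI SDK and Arize's OpenInference instrumentation. The package names below are the ones published on PyPI; confirm the current set against Arize's docs:

```bash
pip install openai arize-otel openinference-instrumentation-openai
```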

Why OpenRouter Works with Arize

Arize’s OpenInference auto-instrumentation works with OpenRouter because:

  1. Fully OpenAI-compatible endpoint - OpenRouter's /v1 endpoint mirrors OpenAI's schema
  2. Official SDK reuse - point the OpenAI client's base_url at OpenRouter (see the snippet below)
  3. Automatic instrumentation - OpenInference hooks into OpenAI SDK calls with no further code changes
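Concretely, the only change from a stock OpenAI setup is how the client is constructed:

```python
from openai import OpenAI

# Same official SDK; only the endpoint and key change.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="<your OpenRouter API key>",
)
```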

Configuration

Set up your environment variables:
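The variable names below are illustrative; the OpenRouter key comes from your OpenRouter dashboard, and the Space ID and API key from your Arize account settings:

```bash
export OPENROUTER_API_KEY="sk-or-..."    # OpenRouter key, not an OpenAI key
export ARIZE_SPACE_ID="your-space-id"
export ARIZE_API_KEY="your-arize-api-key"
```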

Simple LLM Call

Initialize Arize and instrument your OpenAI client to automatically trace OpenRouter calls:
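The sketch below assumes the register helper from the arize-otel package and the OpenInference OpenAIInstrumentor; the project name and model ID are illustrative:

```python
import os

from arize.otel import register
from openinference.instrumentation.openai import OpenAIInstrumentor
from openai import OpenAI

# Register an OpenTelemetry tracer provider that exports spans to Arize.
tracer_provider = register(
    space_id=os.environ["ARIZE_SPACE_ID"],
    api_key=os.environ["ARIZE_API_KEY"],
    project_name="openrouter-demo",  # illustrative project name
)

# Hook OpenInference into all OpenAI SDK calls.
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)

# Point the official OpenAI client at OpenRouter's compatible endpoint.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)

# This call is traced automatically; no per-call tracing code is needed.
response = client.chat.completions.create(
    model="openai/gpt-4o",  # any model ID from OpenRouter's model list
    messages=[{"role": "user", "content": "Hello from OpenRouter!"}],
)
print(response.choices[0].message.content)
```

Once the instrumentor is in place, every chat completion made through this client produces a span in Arize without any further changes to your application code.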

What Gets Traced

All OpenRouter model calls are automatically traced and include:

  • Request/response data and timing
  • Model name and provider information
  • Token usage and cost data (when the underlying provider reports them)
  • Error handling and debugging information

JavaScript/TypeScript Support

OpenInference also provides instrumentation for the OpenAI JavaScript/TypeScript SDK, which works with OpenRouter. For setup and examples, please refer to the OpenInference JavaScript examples for OpenAI.

Common Issues

  • API Key: Use your OpenRouter API key, not OpenAI’s
  • Model Names: Use exact model names from OpenRouter’s model list
  • Rate Limits: Check your OpenRouter dashboard for usage limits

Learn More
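
  • Arize documentation: https://docs.arize.com
  • OpenInference instrumentation: https://github.com/Arize-ai/openinference
  • OpenRouter documentation: https://openrouter.ai/docs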