This integration method is maintained but no longer actively developed. For the best experience and latest features, use our new AI Gateway with unified API access to 100+ models.
Integrate Helicone with Vercel AI Gateway to add observability to your multi-provider AI infrastructure. This integration allows you to route Vercel AI Gateway requests through Helicone while maintaining all of Vercel’s routing, failover, and provider management capabilities.
Vercel AI Gateway provides a unified endpoint to access multiple AI providers, automatic retries, and spend monitoring. By integrating with Helicone, you add comprehensive logging, analytics, and monitoring on top of these features.
The @ai-sdk/gateway integration (recommended) supports all Vercel AI Gateway features including provider failover ordering. Use this SDK if you need gateway-specific functionality.

Quick Start

1. Set up environment variables

VERCEL_AI_GATEWAY_API_KEY=your-vercel-ai-gateway-api-key
HELICONE_API_KEY=your-helicone-api-key
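Both keys are required, so it can help to fail fast before constructing the client. A minimal sketch; `requireEnv` is a hypothetical helper, not part of `@ai-sdk/gateway` or Helicone:

```typescript
// Hypothetical helper (not part of @ai-sdk/gateway or Helicone): read a
// required environment variable or throw a descriptive error at startup,
// instead of failing later with an opaque 401 from the gateway.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (value === undefined || value === '') {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}
```

With this in place, `createGateway` can be called with `requireEnv('VERCEL_AI_GATEWAY_API_KEY')` and `requireEnv('HELICONE_API_KEY')` instead of reading `process.env` directly.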
2. Install the SDK

npm install @ai-sdk/gateway ai
3. Route requests through Helicone

import { createGateway } from '@ai-sdk/gateway';
import { generateText } from 'ai';

const gateway = createGateway({
  apiKey: process.env.VERCEL_AI_GATEWAY_API_KEY,
  baseURL: 'https://vercel.helicone.ai/v1/ai',
  headers: {
    'Helicone-Auth': `Bearer ${process.env.HELICONE_API_KEY}`,
  }
});

const result = await generateText({
  model: gateway('openai/gpt-4o-mini'),
  prompt: 'Explain quantum computing',
  // Gateway-specific features
  providerOptions: {
    gateway: {
      order: ['openai', 'anthropic', 'bedrock'] // Failover order
    }
  }
});
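Because the Helicone connection is just headers on the gateway client, you can also attach Helicone's `Helicone-Property-*` metadata headers to segment requests in the dashboard. A sketch of building that headers object; `buildHeliconeHeaders` is a hypothetical helper name, not part of either SDK:

```typescript
// Hypothetical helper: build the `headers` object passed to createGateway,
// combining Helicone auth with optional Helicone-Property-* metadata headers
// used to segment and filter requests in the Helicone dashboard.
function buildHeliconeHeaders(
  heliconeApiKey: string,
  properties: Record<string, string> = {}
): Record<string, string> {
  const headers: Record<string, string> = {
    'Helicone-Auth': `Bearer ${heliconeApiKey}`,
  };
  for (const [name, value] of Object.entries(properties)) {
    headers[`Helicone-Property-${name}`] = value;
  }
  return headers;
}
```

Usage would look like `headers: buildHeliconeHeaders(process.env.HELICONE_API_KEY!, { Environment: 'production' })` in the `createGateway` call above.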

How it Works

  1. Your application sends requests to Helicone’s gateway
  2. Helicone logs the request and forwards it to Vercel AI Gateway
  3. Vercel routes the request to the appropriate AI provider
  4. The response flows back through Helicone for logging
  5. You get complete observability for all requests
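The steps above are visible in the request the SDK constructs: everything targets Helicone's baseURL, carrying the Vercel AI Gateway key for routing and the Helicone key for logging. A rough sketch of that request; `buildGatewayRequest` is hypothetical, and the endpoint path and body shape are assumptions (the AI SDK handles the real wire format for you):

```typescript
// Hypothetical sketch of step 1 above: the app sends one request to
// Helicone's gateway URL, authenticating to Vercel AI Gateway via
// Authorization and to Helicone via Helicone-Auth. The endpoint path and
// the OpenAI-style body shape are assumptions for illustration only.
interface GatewayRequest {
  url: string;
  headers: Record<string, string>;
  body: { model: string; messages: { role: string; content: string }[] };
}

function buildGatewayRequest(
  vercelKey: string,
  heliconeKey: string,
  model: string,
  prompt: string
): GatewayRequest {
  return {
    url: 'https://vercel.helicone.ai/v1/ai/chat/completions',
    headers: {
      Authorization: `Bearer ${vercelKey}`,
      'Helicone-Auth': `Bearer ${heliconeKey}`,
      'Content-Type': 'application/json',
    },
    body: { model, messages: [{ role: 'user', content: prompt }] },
  };
}
```

Helicone records this request and response on the way through (steps 2 and 4), while Vercel's routing and failover logic (step 3) stays unchanged.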

Complete Example

import { createGateway } from '@ai-sdk/gateway';
import { generateText } from 'ai';

const gateway = createGateway({
  apiKey: process.env.VERCEL_AI_GATEWAY_API_KEY,
  baseURL: 'https://vercel.helicone.ai/v1/ai',
  headers: {
    'Helicone-Auth': `Bearer ${process.env.HELICONE_API_KEY}`,
  }
});

async function main() {
  const result = await generateText({
    model: gateway('openai/gpt-4o-mini'),
    prompt: 'What is the meaning of life?',
    temperature: 0.7,
    maxTokens: 100
  });

  return result.text;
}

main().then(console.log).catch(console.error);

Next Steps

- Explore Analytics: Learn about Helicone’s analytics capabilities
- Set Up Alerts: Configure alerts for your Vercel AI Gateway usage
- User Tracking: Track and analyze user behavior
- Cost Analysis: Monitor costs across all providers