Proxy Integration

1. Create an account and generate an API key

Log into Helicone or create an account. Once you have an account, you can generate an API key.

2. Set HELICONE_API_KEY as an environment variable

HELICONE_API_KEY=your-helicone-api-key
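
The client examples below read this key from process.env.HELICONE_API_KEY. A minimal startup guard, sketched here under the assumption of a Node.js runtime, that fails fast when the variable is missing:

// A sketch, assuming Node.js: fail fast if the Helicone key is not configured
export function requireHeliconeKey(): string {
  const key = process.env.HELICONE_API_KEY;
  if (!key) {
    throw new Error("HELICONE_API_KEY is not set");
  }
  return key;
}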

3. Modify the base URL and add a Helicone-Auth header

import { createOpenAI } from "@ai-sdk/openai";
import { streamText } from "ai";

const openai = createOpenAI({
  baseURL: "https://oai.helicone.ai/v1",
  headers: {
    "Helicone-Auth": `Bearer ${process.env.HELICONE_API_KEY}`,
  },
});

// Use openai to make API calls
const response = streamText({
  model: openai("gpt-4o"),
  prompt: "Hello world",
});
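
streamText returns immediately; the streamed tokens arrive through the result's textStream async iterable in the AI SDK. A short consumption sketch, assuming an async context such as top-level await:

// Print tokens as they stream in (run inside an async context)
for await (const chunk of response.textStream) {
  process.stdout.write(chunk);
}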

Configuring Helicone Features with Headers

Enable Helicone features through headers, which can be set when the client is initialized or on individual requests.

Configure Client

const openai = createOpenAI({
  baseURL: "https://oai.helicone.ai/v1",
  headers: {
    "Helicone-Auth": `Bearer ${process.env.HELICONE_API_KEY}`,
    "Helicone-Cache-Enabled": "true",
  },
});

Generate Text

const response = await generateText({
  model: openai("gpt-4o"),
  prompt: "Hello world",
  headers: {
    "Helicone-User-Id": "john@doe.com",
    "Helicone-Session-Id": "uuid",
    "Helicone-Session-Path": "/chat",
    "Helicone-Session-Name": "Chatbot",
  },
});
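
The awaited generateText result exposes the completion as response.text, along with token usage, for example:

// Read the completion and token usage from the awaited result
console.log(response.text);
console.log(response.usage);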

Stream Text

const response = streamText({
  model: openai("gpt-4o"),
  prompt: "Hello world",
  headers: {
    "Helicone-User-Id": "john@doe.com",
    "Helicone-Session-Id": "uuid",
    "Helicone-Session-Path": "/chat",
    "Helicone-Session-Name": "Chatbot",
  },
});
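
Session headers group related requests into a single trace in Helicone: requests that share a Helicone-Session-Id appear together, and Helicone-Session-Path positions each request within that session. A sketch of two calls sharing one session id; the randomUUID helper, prompts, and path values are illustrative assumptions:

import { generateText } from "ai";
import { randomUUID } from "node:crypto";

// One id shared by every request in the conversation (illustrative)
const sessionId = randomUUID();

// Uses the `openai` client configured above
const question = await generateText({
  model: openai("gpt-4o"),
  prompt: "What is Helicone?",
  headers: {
    "Helicone-Session-Id": sessionId,
    "Helicone-Session-Path": "/chat/question",
    "Helicone-Session-Name": "Chatbot",
  },
});

const followUp = await generateText({
  model: openai("gpt-4o"),
  prompt: "Summarize that in one sentence.",
  headers: {
    "Helicone-Session-Id": sessionId,
    "Helicone-Session-Path": "/chat/follow-up",
    "Helicone-Session-Name": "Chatbot",
  },
});

console.log(followUp.text);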

Using with Existing Custom Base URLs

If you’re already using a custom base URL for an OpenAI-compatible vendor, you can proxy your requests through Helicone by setting the Helicone-Target-URL header to your existing vendor’s endpoint.

Example with Custom Vendor

import { createOpenAI } from "@ai-sdk/openai";
import { streamText } from "ai";

const openai = createOpenAI({
  baseURL: "https://oai.helicone.ai/v1",
  headers: {
    "Helicone-Auth": `Bearer ${process.env.HELICONE_API_KEY}`,
    "Helicone-Target-URL": "https://your-vendor-api.com/v1", // Your existing vendor's endpoint
  },
});

// Use openai to make API calls - requests will be proxied to your vendor
const response = streamText({
  model: openai("gpt-4o"),
  prompt: "Hello world",
});
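
Because Helicone forwards the request to the target endpoint, the client still needs to authenticate with the vendor itself. A sketch passing the vendor's credential explicitly via apiKey; VENDOR_API_KEY is a placeholder for whatever your provider expects:

const vendorOpenAI = createOpenAI({
  baseURL: "https://oai.helicone.ai/v1",
  // Placeholder: the vendor's own key, sent in the standard Authorization header
  apiKey: process.env.VENDOR_API_KEY,
  headers: {
    "Helicone-Auth": `Bearer ${process.env.HELICONE_API_KEY}`,
    "Helicone-Target-URL": "https://your-vendor-api.com/v1",
  },
});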

Example with Multiple Vendors

You can also dynamically set the target URL per request:

const response = streamText({
  model: openai("gpt-4o"),
  prompt: "Hello world",
  headers: {
    "Helicone-Target-URL": "https://your-vendor-api.com/v1", // Override for this request
  },
});

This approach allows you to:

  • Keep your existing vendor integrations
  • Add Helicone monitoring and features
  • Switch between vendors without changing your base URL (see the helper sketch after this list)
  • Maintain compatibility with your current setup
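
One way to switch vendors per request, as referenced above, is a small helper that maps a vendor name to its Helicone-Target-URL header. This is a sketch; the endpoint values are placeholders:

// Hypothetical helper mapping vendor names to target endpoints
const VENDOR_ENDPOINTS = {
  openai: "https://api.openai.com/v1",
  custom: "https://your-vendor-api.com/v1", // placeholder endpoint
} as const;

function targetHeaders(vendor: keyof typeof VENDOR_ENDPOINTS) {
  return { "Helicone-Target-URL": VENDOR_ENDPOINTS[vendor] };
}

const customResponse = streamText({
  model: openai("gpt-4o"),
  prompt: "Hello world",
  headers: targetHeaders("custom"),
});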