This integration method is maintained but no longer actively developed. For the best experience and latest features, use our new AI Gateway with unified API access to 100+ models.
If you're already using a custom base URL for an OpenAI-compatible vendor, you can proxy your requests through Helicone: point your client's `baseURL` at Helicone's OpenAI proxy and set the `Helicone-Target-URL` header to your existing vendor's endpoint.
```typescript
import { createOpenAI } from "@ai-sdk/openai";
import { streamText } from "ai";

const openai = createOpenAI({
  baseURL: "https://oai.helicone.ai/v1",
  headers: {
    "Helicone-Auth": `Bearer ${process.env.HELICONE_API_KEY}`,
    // Your existing vendor's endpoint
    "Helicone-Target-URL": "https://your-vendor-api.com/v1",
  },
});

// Use openai to make API calls - requests will be proxied to your vendor
const response = streamText({
  model: openai("gpt-4o"),
  prompt: "Hello world",
});
```