This integration method is maintained but no longer actively developed. For the best experience and latest features, use our new AI Gateway with unified API access to 100+ models.
Set your HELICONE_API_KEY and AZURE_OPENAI_API_KEY as environment variables:
HELICONE_API_KEY=<YOUR_HELICONE_API_KEY>
AZURE_OPENAI_API_KEY=<YOUR_AZURE_OPENAI_API_KEY>
Create the AzureOpenAI client and add the Helicone headers. Make sure to include the api-version in all of your requests.
from openai import AzureOpenAI
from dotenv import load_dotenv
import os

load_dotenv()

helicone_api_key = os.getenv("HELICONE_API_KEY")
azure_openai_api_key = os.getenv("AZURE_OPENAI_API_KEY")

client = AzureOpenAI(
    api_version="[API_VERSION]",  # e.g. "2024-12-01-preview"
    azure_endpoint="https://oai.helicone.ai",  # route requests through the Helicone gateway so they are logged
    api_key=azure_openai_api_key,
    default_headers={
        "Helicone-Auth": f"Bearer {helicone_api_key}",
        "Helicone-OpenAI-Api-Base": "https://[AZURE_DOMAIN].openai.azure.com",
        "Helicone-Model-Override": "[MODEL_NAME]",
        "api-key": azure_openai_api_key
    }
)
Recommendation: Model Override. When using Azure, the model sometimes displays differently than expected. We have implemented logic to parse out the model, but if you want to guarantee that your model is reported consistently, we highly recommend using model override: Helicone-Model-Override: [MODEL_NAME]

Click here to learn more about model override
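
If you prefer to set the override per request instead of on the client, the OpenAI Python SDK also accepts an extra_headers argument on individual calls. A minimal sketch, assuming the client defined above and a hypothetical deployment name my-gpt-4o-deployment:

response = client.chat.completions.create(
    model="my-gpt-4o-deployment",  # hypothetical Azure deployment name
    messages=[{"role": "user", "content": "Hello"}],
    # Per-request header; takes precedence over the client-level default for this call only
    extra_headers={"Helicone-Model-Override": "gpt-4o"}
)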
Send a chat completion request using the client:
response = client.chat.completions.create(
  model="[MODEL_NAME]",
  messages=[{"role": "user", "content": "What is the meaning of life?"}]
)

print(response)
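
The full response object is verbose; to print just the assistant's reply, read the first choice's message content:

print(response.choices[0].message.content)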